A little while ago, I was trying to add one of my Office 365 email accounts back to Outlook and it wasn’t working. I forgot that I needed an app password because I have multi-factor authentication (MFA) enabled on my domain/tenant. I initially put in my “regular” password, expecting it to kick back and let me try again; however, it didn’t do that.
Adding an account to Outlook
This screenshot is what normally pops up when you add a new account. It prompts you for an email address and when you click Connect, you are prompted for credentials.
Here was where I should have put the App Password in, but I mistakenly put in my regular Office 365 password.
Instead of prompting me for credentials again, Outlook simply prompted me with this and the “Something went wrong” error that sent me in circles.
How to fix it – attempt #1
I read a bunch of posts in various forums and most of them said to go to Mail Setup in Control Panel and set up a new mail account there. In my case, that didn’t work either.
How to fix it – attempt #2
The second thing I saw was what worked for me, and that was going into Credential Manager in Control Panel and removing the references to the email address there.
In Credential Manager, select Windows Credentials. Then, look through the credentials listed, expanding entries to view details where necessary, and find the items referencing the email address you’re trying to add.
In my case, I removed the items rather than trying to change the credentials, but I suppose you could try editing a credential to see if it allows a password change.
Once I removed the two or three entries that referenced my credentials, I ran through the process once more (entering the App Password as I should have in the first place) and was able to add the account to Outlook successfully.
There’s no #TipTuesday today since I had website issues over the weekend and was unable to access it to prepare something. So, this will be a form of a tip in lieu of another topic!
There will be no mid-year tax update for Canadian Payroll for Dynamics GP this year, zippo, nada, zilch! Imagine that, not a single jurisdiction has a tax change this mid-year. That’s got to be a first!
There’s an expression “you can take the girl out of consulting but you can’t take the consulting out of the girl”. OK, so that’s not really how that goes, but it fits the situation!
As of June 1st (2019), I’m resuming my consulting business – Kuntz Consulting Inc. – and am back to being a Microsoft partner and independent consultant. This is so new that I haven’t even had a chance to update my website; so, depending on how quickly you read this post and click on that link, you may see the old message “no longer taking on clients”! Rest assured, I’m working on the updates as you read this.
Didn’t you *just* get a job?
Yes, it wasn’t that long ago that I decided to go back to full-time employment (January 2018). As much as I love who I work for and with, I simply was not ready to fit into the structure and routine of regular full-time employment. I gave it the old college try, so to speak, but I really missed the flexibility and variety of my consulting work. So, it was with some very mixed emotions that I decided to resign from that position and return to consulting.
What’s next?
Well, that’s hard to say as it’s still all pretty fresh right now. I am fortunate to keep a good relationship with my now-former employer and I will be continuing to work with them in a consulting role, albeit part-time. I very much look forward to continuing some of the fun projects we’ve got underway and planned for the future! That alone will keep me pretty busy for a while.
I’m heading to Microsoft Business Applications Summit next week in Atlanta, Georgia and it will be my first time at this event. I can’t wait to dig into more Power BI and Power Apps/Flow stuff while I’m there, and broaden my knowledge!
My summer is going to fly by between next week’s conference, a busy June schedule, and a big vacation in July. I’m pretty sure it’ll be August by the time I blink next. I doubt I’ll be making any big plans between now and end of July but will be considering if either GP Tech Conference or GPUG Summit fit into my budget and schedule. I haven’t committed to either yet and would like to do one of them if I can swing it.
Otherwise, I’ll be reaching out to former clients to see who still might have projects they would like me to help with and building out my schedule for the rest of 2019 and into 2020.
Today’s #TipTuesday isn’t a technical tip, it’s more of a business tip. I’ve written this to be specific to Dynamics GP, although what I describe may apply to other applications and software licensing. I have not worked with licensing on any other ERP packages to know if this might be a general rule of thumb or not.
If you’ve been using Dynamics GP for any length of time, it might be worth revisiting what you’re registered for and have paid for, module/functionality-wise. It’s probably worth taking a look at this every couple of years post-implementation, particularly if you paid for things up-front with the intention of phased implementations later. Every year is probably too frequent, but every couple of years gives you time to evaluate whether your master plans have changed and whether you still intend to implement what you planned to at the time of purchase.
The math works out that, on average, every 5 years you are re-purchasing your Dynamics GP software: at a rough average of 20% annual enhancement from ISVs, five years of enhancement fees equals the original purchase price. This doesn’t apply to subscription-based licensing, which is paid monthly, but my guess is the vast majority of Dynamics GP users are not on subscription licensing.
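As a sanity check on that five-year figure, here’s a quick back-of-the-envelope calculation. The $50,000 license cost is purely hypothetical; only the roughly 20% enhancement rate comes from the rule of thumb above.

```python
# Hypothetical numbers: a $50,000 license with a 20% annual enhancement fee.
license_cost = 50_000
annual_enhancement_rate = 0.20

# Years of enhancement fees it takes to equal the original purchase price.
years_to_repurchase = 1 / annual_enhancement_rate

# Total enhancement paid over five years.
five_year_enhancement = license_cost * annual_enhancement_rate * 5

print(years_to_repurchase)    # 5.0
print(five_year_enhancement)  # 50000.0 (equal to the original license cost)
```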
Look for the value
When I first started my business, one of my main focuses was helping clients evaluate what they had purchased, to help them get even more value out of their investment. I still think that same way, 10 years later. I hate seeing a client who paid for something they aren’t using. If you are in this situation, and continue to pay annual enhancement year after year but don’t plan to implement, ask yourself why you’re holding on to it at all. Remember the 5 year rule above.
I recently went through this with a client and we shaved quite a chunk of software off their annual enhancement costs. They had modules that were purchased during the original implementation that were no longer in the plans to implement. Businesses change, personnel change, and it’s quite possible that whatever the vision was during the original sales cycle has changed. It’s one thing to pay for something you’ve never used, but it’s quite another to continue paying for it year after year when you could cut your losses. (You won’t get your purchase price back, that’s a sunk cost, but you will get some relief in operating costs annually if you de-register things you don’t intend to implement.)
Important caveat: if you remove a product from your price list and change your mind, you would have to re-purchase the module at the current pricing. Annual enhancement on the majority of ISV products & Microsoft’s core Dynamics GP is based on a % of your list price at the time of purchase. Chances are the pricing has changed since you purchased it so you want to be sure you don’t need it before removing it.
Another important caveat: if some of the purchased software is installed, even if it’s not “used”, you may need to engage your partner to assist in uninstalling and removing the references to those modules or products. You’ll want to ensure this is done properly otherwise it could impact a future upgrade.
Where should I start?
I would look at your original purchase and make a list (if you don’t already have one) of all of the things you purchased at that time. Then I would look at your annual enhancement invoices. If those invoices are detailed, you should be able to trace through what you are paying for each year, which would typically be the same list of software you identified in your original invoice history search. If purchased software was registered at the time of your sale, it would generally trigger an annual enhancement invoice, so if you see a purchase but no corresponding credit or annual cost, ask your partner about it.
Generally speaking, your annual enhancement invoices for most ISV and Microsoft Dynamics GP modules should be the same year after year unless you have continued to purchase more user licenses or added other ISV products or GP modules. There can also be foreign currency swings depending on where you are and what the currency is on the ISV products you use, which might account for some variation. Generally though you would see the same things on the list year after year and you have an inventory of things to evaluate.
With some larger ISV suites of products, you may need more information than your typical enhancement invoice provides, and your partner can give you that. Take Key2Act as an example: most customers with their products own multiple pieces of the suite, yet you might only see a one-liner on your invoice for their enhancement. If your annual enhancement invoice just lists ISV names, ask for more detail if you don’t have a breakdown on an original contract somewhere.
It never hurts to ask for the details – get a list of the specific modules you are paying for annually. If you don’t know what they are for or what they do, talk to your partner. Lots of modules have funny names that don’t make it easy to identify exactly what it’s for, so ask those questions or have them show you what the functionality is.
There will be things on your price list that you might not be able to remove, due to how they’re licensed or possibly some interaction with another module that you do use. For example, if you’re on Business Ready licensing or newer with your core Microsoft software, you can’t “remove” SOP, POP, or Inventory just because you don’t use them; the core product is licensed per user, not per module. That’s a fairly simple example, but the reality is these are conversations to have with your partner/VAR.
Talk to your partner/VAR
The repeating message here is talk to your partner/VAR. They should know your implementation details to have that conversation with you and it’s in their best interests too, to see that you’re getting the most value you can out of your investment. If you find there are things that you think you’re not using, you’ll want to rely on their expertise to help you evaluate that or confirm that before making a decision. If there are things you can implement, they are also going to be interested in seeing how they can assist you in successfully doing that!
Today’s #TipTuesday is an Excel tip. Did you know there are some funky Paste Special options beyond the basics like Paste Values and Paste Formats? I find that if you ask most Excel users what Paste Special is, the majority will answer “pasting values instead of formulas”. While that’s definitely true, there are also Paste Operations.
There are random circumstances where I’ve found this to be useful in a pinch and today was one of them. I was working on some Fixed Assets stuff, and my data had asset cost and accumulated amortization both as positive values. I wanted to do some Pivot Table analysis with it, to validate some Net Book Value info, without fussing with formulas. Paste Special Operations to the rescue!
How does it work?
First, here’s some sample data as an example. In my simple dataset here, I want to flip the AA values to negatives.
What I specifically want to do is multiply each of these by -1, without doing it in another column and then copying/pasting the values over top of the originals. To do this with Paste Special Operations, I simply type -1 in any cell (it doesn’t need to stay there when you’re done). Then, I copy that cell.
Now, I highlight the cells I’d like to “paste” onto, i.e. the values to multiply. After highlighting my AA values, I opened Paste Special, selected Multiply in the Operation section, and clicked OK.
That’s it! Well, the formatting leaves a bit to be desired when you paste this way, but here is what it looks like after I re-formatted my numbers the way I wanted.
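For anyone who’d rather script the same flip outside Excel, the operation that Paste Special > Multiply performs is just an in-place multiply over the selected range. Here’s a minimal Python sketch with made-up accumulated amortization values:

```python
# Made-up accumulated amortization (AA) values, all positive as exported.
aa_values = [12500.00, 8300.50, 4100.25]

# Equivalent of copying a cell containing -1 and using
# Paste Special > Multiply over the highlighted range:
aa_values = [v * -1 for v in aa_values]

print(aa_values)  # [-12500.0, -8300.5, -4100.25]
```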
Do I use this every day? No, but from time to time it’s handy to know how to do it.
This is a belated #TipTuesday, posted a day later than usual! Recently, support for longer passwords in Office 365 was announced but oddly enough, in some places the password change dialog boxes still limit you to 16 characters. Here’s how I worked around this.
Standard “password reset” feature – 16 char limit
Here’s a screenshot of a password reset window in my Office 365 tenant, and it clearly states my password must be between 8 and 16 characters. Hmm. My assumption is this will be noticed and rectified fairly quickly as it’s inconsistent with the next part I’m going to show you.
Workaround: “Forgot my password”
The current workaround is to use the Forgot My Password link when you log into Office 365. If you use MFA/2FA like I do, simply cancel out of the dialog box that prompts to send a notification code and then click on Forgot my password.
Next, there may be one or two steps requiring you to verify with either an authenticator app code or text message or both depending on your organization’s MFA/2FA setup. I had to do both. Once I was past that part, the standard “choose a new password” section came up. Now I can select a password up to 256 characters.
That’s it! As I mention above, it’s very likely this will be corrected soon… but until then, this may help if you want to secure your account with a longer-than-16-character password.
Today’s #TipTuesday is a bit different from my normal type of post. In the course of monitoring some integrations, I wanted to change the emails I get when there are issues. What I inherited was email alerts with attachments, without any data I could use as an Outlook filter to tell me whether something was OK or not. I had to open the attachment to see if there were errors. I wanted to find something better.
What I’m writing about today is what I’ve pieced together from various searches and unfortunately, while I got the pieces from various other blogs, I’ve changed so many things and re-worked this so many times, I no longer even have the references of those that helped me get to the point of what I’m sharing today. I wish I could link to them to thank them.
Overview of my process
The basis for this is that I want to check something on a regular basis and, if my criteria are met, send myself an email with the info I need to respond or act on it. For example, in one case I want to get an email if there are any errors in my nightly integrations. The key for me was getting the errors in the body of the email; I don’t want to have to open an attachment or log into the server to read them. “There was an error” isn’t good enough for my liking!
So… these are the pieces of my process, with examples of my SQL code. It’s fairly rudimentary, but you can beef it up however you wish to make it fancier! I have all of the pieces together as a step in a SQL Agent job. I could probably create a stored procedure for them, which would be cleaner, and execute that from the job instead, but I haven’t ventured that far yet.
Part 1 – Set up variables for email details
This is optional but it might make the rest easier to have the pertinent details right at the top of the code instead of having to search through the code to find what email address something is going to.
-- declare and set variables for the email
DECLARE @subject varchar(50)
DECLARE @recipient varchar(50)
SET @subject = 'My SQL Alert'
SET @recipient = 'firstname.lastname@example.org'
Part 2 – Create a temp table for the details
In the situation I was creating this for, I was checking a few different things in SQL and any one of them might trigger an alert. Instead of having multiple jobs, I chose to create a temp table and one job that “inserts” data into it if there are issues that fit my criteria. This temp table holds the “list” of things that will become the body of the email.
This is a simple 3 column table and the contents will be put into a 3 column HTML table in the body of the email.
-- Create temp table for tracking details of whatever I want to see in the body of my email alert
IF OBJECT_ID('tempdb..##AlertContent') IS NOT NULL DROP TABLE ##AlertContent
CREATE TABLE ##AlertContent
(AlertType varchar(50),    -- column sizes here are illustrative
AlertTitle varchar(255),
AlertDesc varchar(255))
Part 3 – Insert some criteria
In this example – only to demonstrate something simple – I’m pretending I want to know if a new GP company was created and I want to see a user list to monitor new logins. I’ve got 2 insert statements that add things to my temp table.
NOTE: this ultimately is code I’m pasting as a step in a SQL Agent job, and I needed to fully qualify my table names (db.schema.table) since the actual send_db_mail procedure I’m calling is in msdb.
-- Example: find all companies created this year
INSERT INTO ##AlertContent
SELECT 'New Database' as AlertType,
('Database: ' + INTERID) as AlertTitle,
('Company Name: ' + CMPNYNAM) as AlertDesc
FROM DYNAMICS.dbo.SY01500
WHERE YEAR(CREATDDT) = 2019
-- Continue as needed if you want to have multiple alerts in the same email
-- This example is the user list with some fake criteria
INSERT INTO ##AlertContent
SELECT 'New Users' as AlertType,
('User: ' + USERID) as AlertTitle,
('Name: ' + USERNAME) as AlertDesc
FROM DYNAMICS.dbo.SY01400
WHERE LEFT(USERID,10) <> 'LESSONUSER'
Part 4 – Format an HTML table & send mail
The next step is code I mostly found elsewhere and modified over time to suit what I needed. (This is where I wish I could link back to where I found it as I didn’t create this part myself).
The first thing I am doing here is checking to see if the temp table has content or not. If it does not, the job will fire without sending an email which is what I want. I only want to see an email if there is an issue.
The end of my script is a command to drop the temp table.
-- Build HTML
DECLARE @bodyHTML nvarchar(max)
-- Check to see if there is anything in the temp table
-- If there is, build the HTML for a table in the body of the email and send it
IF (SELECT COUNT(*) FROM ##AlertContent) > 0
BEGIN
SET @bodyHTML =
N'<html><body><table border="1">' +
N'<tr><th>Alert Type</th><th>Alert Title</th><th>Alert Description</th></tr>' +
CAST((SELECT
td = AlertType, '',
td = AlertTitle, '',
td = AlertDesc
FROM ##AlertContent
FOR XML PATH('tr'),TYPE,ELEMENTS XSINIL
) AS NVARCHAR(MAX)) +
N'</table></body></html>'
-- Send email
EXEC msdb.dbo.sp_send_dbmail
@profile_name = 'DEFAULT',
@subject = @subject,
@recipients = @recipient,
@body = @bodyHTML,
@body_format = 'HTML'
END
-- Drop Temp table
DROP TABLE ##AlertContent
When I run this in a query window, this is what I saw. I didn’t include a NOCOUNT command, so I see “1 row affected” and “2 rows affected” because that’s what the two insert statements produced on my test data. The “Mail queued” message simply tells me the email worked.
… and here is what the email looked like with my simple example. The “from” address and name are based on my SQL database mail configuration, the subject is from the code above and the table is from the code above.
A real-life example
How I’m using this currently is to monitor a nightly set of integrations. I have inserts checking my staging tables for errors and inserting them into the temp table, then a final insert that appends the contents of a log table to the temp table. The result is that most nights I’m simply getting an email with last night’s logs because (knock wood) there are no errors most of the time.
I’ve tweaked what I have above slightly so that I alter the subject line based on whether I have errors or not. (I added another variable to hold an ErrorCount numeric value). If I have errors, the subject of my email is “Nightly Summary – Errors” so that I can use Outlook rules to highlight that in my inbox. If I don’t have errors the subject might simply be “Nightly Summary” and it just contains the logs. The Outlook rule is to file that but I know I have it if I want to reference something from a given night.
This may be overly simplistic but the framework for this works well for me. I got tired of getting emails with attachments and having to open it every day just to see if there was an error or not. This is much nicer!
If you want to see more columns of data in the table, you’ll need to modify the HTML structure to add the extra cells and headers. If you’re better at HTML formatting than I am, you can make the table look a lot nicer too!
I don’t know if this will help anyone but I thought I’d share it anyway!
A few months ago, I started re-evaluating the credit cards I was using, and checking into the benefits they offer. I was planning a big trip and have never really paid attention too much to the benefits other than the basic rewards they offer (cash back or travel points in various programs etc.). I knew friends who had used some of the warranty extension kinds of benefits, but that’s about it. It seemed like most cards advertise the same stuff – extending the warranty on certain purchases, concierge services, car rental insurance coverage, some insurances on trip cancellations and things like that. It all sounded the same to me.
I decided to make some changes, cancelling a basic card I had and looking for something with better rewards. Who knew that months later I would have gone through 3 different “new” credit cards to find the “right” one for me? I realize now how many things I never took into consideration that can make or break your experience with a credit card. Here is my story and some lessons learned that may help you too.
I’m Canadian and am working with Canadian financial institutions (FI for short, hereafter in this article). I can’t say for sure any of this is exactly the same in the US or elsewhere, but some things might translate well enough to other countries and credit card providers.
Credit Card #1 – website fail
The first card I applied for was the PC Financial World Elite Mastercard. On paper, it was an excellent value. It has no annual fees despite being one of the “premium” World Elite line of cards in the Mastercard world. I was looking forward to piling up the points (in this case the rewards program was PC Optimum points which I already use). The bonus points offered on the card tie in heavily to the stores I shop at regularly for all or most of my grocery and household purchases (Zehrs, Fortinos, Loblaws, Shoppers Drug Mart) plus some great points per purchase at Esso for gas. Win-win. It seemed like a slam dunk.
Initially the card was great, I was accumulating points quickly and all was good. Then, PC Financial redesigned their website and it went from good to crap (in my humble opinion). [Side note: I’m a Quicken user and have been for life. If an FI website doesn’t allow me to download my transactions to Quicken, I’m moving on to an FI that will.]
This website redesign took away the ability for users to download their info into Quicken, a feature that had existed and worked just fine for a few months before the change.
Note to software developers: NEVER remove a feature that customers use!
Their new website – to me – was borderline unusable. It might look pretty but if you can’t get at the most basic of info without going back to another screen, it’s poorly designed. I won’t waste my time getting into the details of things that frustrated me with it because it’s frankly too long of a list and that’s not what this post is about. I cancelled the card and moved on.
Credit Card #2 – getting closer
After the PC fiasco, the lesson learned was “make sure the transactions can be downloaded into Quicken”. I would never have thought it wouldn’t be possible before the experience above. Quicken has lists of which FI’s do support their product but as you’ll read, it’s not foolproof!
So, before looking for another card, I searched for “best credit cards in Canada” and found a good credit card comparison tool on Rate Hub. It appears to be an updated version of an old Money Sense magazine credit card comparison tool and allows you to input some basic spending habit info to show you what cards might fit your needs best, in terms of cost and rewards value.
When I tried it, one card immediately jumped out among the travel cash back cards that they suggested for me and that was with Meridian Credit Union. I am already a customer at that FI and already download my other banking transactions into Quicken, so once again, this seemed like a good option for me. (Narrator voice: this is foreshadowing…)
I chose the Meridian Visa Infinite Cash Back card and I was intrigued by the Visa Infinite benefits, which many other FI’s also offer on their own line of Visa Infinite cards. The annual fee was $99, less expensive than many other FI’s versions of the same level of card, so I was OK with that for the benefits. I think each “line” of credit cards has some common features: Amex has “Front of the Line” for first crack at event tickets, and the Visa Infinite line of cards has some neat features too.
I took advantage of one of their perks, the “Luxury Hotel Collection”, during a trip to the US in March, using that to book a night in downtown Seattle at a better hotel than I would have booked without it (but the price was still in range for what I was willing to pay). This booking included some bonuses – free breakfast for 2 (a $40-$50 USD value but I was travelling solo), a $25 USD food & beverage credit and a free room upgrade. The savings on the meals alone on that trip was close to $50 USD so paying $99/year CAD for the card would surely pay off – plus the first year the annual fee was waived. Sweet.
It was a short-lived experience. Right away I noticed I couldn’t download my transactions to Quicken, but I’ve used some credit cards where you can’t download until the statement is created, so I didn’t panic yet. However, when I got my first credit card statement, I learned that their credit card provider doesn’t allow downloads to Quicken. (Meridian works with a 3rd party to offer the credit card so it’s not within their control per se.) Doh. All my other Meridian accounts are downloadable so I just assumed the Visa would be too. “Assume” = yes, you know what that stands for, right?
On that trip, I found out another thing that was super weird about this card, which turned out to be a very good thing for me in the end. The card posted foreign currency exchange (FX) fees separately from the transactions themselves, which gave you much better visibility into the fact they exist. From an “employee filing US expenses” perspective, it was horrible to have them listed separately though. Every single purchase I made on a business trip to the US was listed first with the traditional “payee plus USD amount * FX rate” to show the CAD equivalent of the purchase. Fine, that’s normal. However, at the bottom of my statement was a listing of “Foreign Currency Transaction Fees”: one line per USD purchase for the markup they charge on the FX rate. WTF. Reconciling them and proving they were related to the original purchases for an expense report took a little explaining!
So, I decided that the lack of Quicken download and the FX fee handling were enough for me not to want to keep this card. There was no annual fee in year 1 so I didn’t lose anything on trying it out.
I was about to cancel it when I saw I had slightly more than $50 in cash back accumulated that I could redeem. Why waste that? The simplest thing to do seemed to be to transfer it as a statement credit instead of redeeming it for a gift card. Well, I redeemed this a month ago and it’s still not credited to my Visa! The fine print says it might take up to 45 days to do so! OMG. That’s just another sign that this isn’t the right card for me. “Cash back but you have to wait 6 weeks to get it” isn’t worth it. As soon as that credit hits my account, I’ll be cancelling this card!
Credit Card #3 – we have a winner!
I guess the saying “the 3rd time is the charm” has some truth to it. Back to the drawing board, I was still interested in finding a better card. At this point I figured I am going to stick with “a” Visa Infinite card, I was impressed with the additional benefits this line of cards offered. I ended up back at the other FI I deal with which is Scotiabank. I already have one of their Amex cards, I know they support Quicken, and I’ve given up on trying something new!
When I went to look at the details on their version of this card, the first thing I noticed was “no foreign transaction fees”. Hmm. I didn’t even realize that was a thing.
That is the deal-maker as far as I’m concerned. This FI charges $139/year for the annual fee but like the Seattle hotel example above, if I use the perks of the card, I should benefit by more than that easily to make that fee well worth it. Here are 2 examples of ways I’m going to easily make up for the $139 annual fee:
Example 1: savings on FX fees
I looked at my March US trip spending: I spent just over $2,500 USD on that trip, partially on the travel itself and partially on booking another conference in that calendar month (coincidentally). Both credit cards I used on that trip – the Meridian Visa and an existing Amex I already had – charge 2.5% in fees over and above the going FX rate on a given foreign currency transaction, calculated on the converted CAD amount. Here’s a real charge on my card, paying for a conference registration fee:
The conference registration fee was $1,145 USD, which at the standard Visa rate was $1,539.30 CAD. That works out to an exchange rate of 1.3444 (to buy $1 USD) which was right in line with the going rates that day/week.
The FX fee was $38.48 on top of the $1,539.30, which is 2.5% of the Canadian dollar amount. That pushed the effective exchange rate to 1.378: $0.0336, or 2.5%, above the posted rate!
On that March trip, with the $2,500-ish spending, I paid $76.29 in FX fees. With this new Scotia Visa Infinite card, I would have paid $0. I would have saved $76 in one trip. Any foreign currency purchase I make, travel or otherwise, I will be paying with this card to take full advantage of this. I typically go to 2-3 conferences a year, mostly in the US, so I could quite easily save more in FX fees than I pay as an annual fee on the card, even if I don’t use a single other benefit.
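The math on that registration charge can be double-checked in a few lines (Python here, just to make the arithmetic explicit; the dollar figures are the ones from my statement above):

```python
usd_amount = 1145.00    # conference registration, in USD
cad_charged = 1539.30   # CAD amount at the standard Visa rate

posted_rate = cad_charged / usd_amount                # the rate actually used
fx_fee = round(cad_charged * 0.025, 2)                # 2.5% of the CAD amount
effective_rate = (cad_charged + fx_fee) / usd_amount  # rate after the markup

print(round(posted_rate, 4))     # 1.3444
print(fx_fee)                    # 38.48
print(round(effective_rate, 4))  # 1.378
```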
Example 2: taking advantage of the Visa Infinite program
For a recent birthday, my better half and I booked a night in Toronto at the Royal York hotel as we had tickets for an event downtown. I booked it via the Visa Infinite luxury hotel collection site and once again, it was a tremendous deal: a $100 CAD F&B credit (the “typical” $25 F&B credit plus a bonus $75 credit because the hotel is under renovation, an incentive to stay anyway), free breakfast for two (the bill came to nearly $60 at the hotel!), a free room upgrade and complimentary late checkout. The F&B savings alone just paid for my annual fee, with one stay. That’s crazy. I’m a fan…
What did I learn?
I learned to shop around, and really pay attention to the details of the various credit card offerings. I’d never heard of “Visa Infinite” before, but once I checked it out, it seemed like something I would use (and now have). I also have an Amex and I do subscribe to the “Front of the Line” thing and have benefited from getting early access to some event tickets from time to time.
I learned there are different offerings around FX fees. I think of all the USD purchases and travel I’ve done over the last decade and think, wow, I could have saved a boatload in that time had I known a no-FX-fee card existed. The standard FX rates still apply, this isn’t getting USD at par, but you’re not paying an extra 2.5% on top of it.
Other perks. I’m reading through the fine print on some of the trip cancellation, rental car insurance and travel insurance things so I know what qualifies if I need to make a last-minute change on my upcoming summer trip or god forbid, need to make a claim. One card requires that 75% of the cost of my trip be charged to that card to be eligible for trip cancellation insurance – that’s important to know.
Many perks need registration. The Visa Infinite luxury hotel collection thing is a Visa site, not a bank site so it needs registration. Easy to do, but if you don’t know, you might miss out on a perk. The specific card I have with my FI has free lounge access – and you need to register to get the card that gets you into the lounge (it’s not your Visa that gets you in!). I now get 6 free lounge passes a year for travel at airports across the world. Sweet!
Bottom line: if you’re going to go for a card that has some rewards or premium perks to it, take advantage of it, or go to a no-fee card where you’re not paying for something you won’t ever use.
Of all that I’ve learned, I keep coming back to the FX markup piece and how much it could save even an average business that does some foreign currency spending. That kind of charge (or savings) can have a non-trivial business impact. My own business had regular charges in USD for things like website hosting year after year, not to mention travel costs.
Many companies have corporate credit cards; is a no-FX-fee option a feature they look for? I don’t know the answer (having never had the responsibility of setting up a corporate credit card program), but if my employees were travelling a lot or buying a lot of USD services or products, I would be seriously evaluating what a no-FX-fee card could save me in a year. Based on my own experience here, it would not be minor.
The flip side: if my employees travel a lot and use their own cards to pay and expense costs, at what point is it worth recommending they look at a no-FX-fee card, perhaps even letting them expense the annual fee as a trade-off? In the consulting world, there is a TON of travel, and likely a lot of it involves foreign currency spending.
The whole experience opened my eyes to the costs, benefits and opportunities that exist around credit cards. They aren’t all interchangeable, even when they all sound like the perks are pretty similar.
I don’t know if this helps anyone but I really wish I knew more about this a decade ago when I first started out in my own consulting business, that’s for sure!
Today’s #TipTuesday post is about troubleshooting an issue with the My Reports home page area/pane and how to remove SmartList favourites from there. The specific scenario here relates to favourites that were “stuck” in My Reports after an upgrade to GP 2013 a few years ago. I was reminded of this when I saw a post on the Microsoft Dynamics community forum, and someone asked me if I kept the scripts I used to fix this issue.
What’s the issue?
After a client’s GP 2013 upgrade, a couple of users noted that the SmartLists they had saved under the My Reports pane on their home page weren’t working. It turned out the underlying favourite was missing, and we never found out why. Ultimately, the important part was that we could not remove the reference in My Reports using any out-of-the-box functionality in Dynamics GP. If we clicked the Edit pencil in My Reports, the stuck items didn’t appear in the navigation list that came up, so we had no way to remove them through the normal functionality one might use to do so!
Specifically, if the user tried to use the shortcut, it would open SmartList to the right SmartList “object” (i.e., the “yellow folder” level) but not to the specific favourite it was linked to. From the user’s perspective, the link was broken; it only half worked.
What changed in GP 2013?
Starting with GP 2013, when a user creates a new SmartList favourite saved with “Visible To” set to User (i.e., themselves), it automatically appears in My Reports. That reference can be removed via the My Reports edit functionality, or by changing the Visible To on the SmartList favourite to something other than “User”.
However, for any items that were there BEFORE the upgrade to GP 2013, there was no option to remove them.
SQL to the rescue!
There are times when you have no choice but to resort to SQL to address an issue, and this is one of those times. The table that stores the contents of a user’s My Reports pane is SY70700 in the system database. It stores all types of reports, not just SmartLists. Here’s a simple query to view what’s in the table:
SELECT USERID, CMPANYID, MyReportName, DEX_ROW_ID FROM DYNAMICS..SY70700
This will return something like this, in my case, a regular GL trial balance report and 2 smartlists – one added as a favourite visible to User and one as a manually added MyReport.
If something is stuck, you can add a WHERE clause to the query to narrow it down by USERID, and then cross-check the exact “MyReportName” from the user’s screenshot against what’s in the table. My preference is then to use the DEX_ROW_ID as the filter for a DELETE script, which might look like this if I want to remove the “Accounts created this month” favourite from My Reports. (I also opt to include the USERID even though I shouldn’t need to, as DEX_ROW_ID is unique.)
DELETE DYNAMICS..SY70700 WHERE USERID = 'sa' and DEX_ROW_ID = 2
If you’re using this script for real, replace the USERID ‘sa’ with the user in your system and use one or more DEX_ROW_ID values based on the first SELECT script. My system database is the default DYNAMICS db; if yours is named something else, substitute that, since this table lives in the system database.
Of course, if you are doing this for real, my recommendation is to test your WHERE clause with the SELECT script first. If that returns only the results you want to see, then you should be safe.
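As a sketch, that test might look like this (‘sa’ here is just my placeholder; use the actual user ID from your system):

```sql
-- Same columns as the earlier SELECT, narrowed to one user.
-- Confirm the rows returned are exactly the ones you intend to delete
-- before running any DELETE with the same WHERE clause.
SELECT USERID, CMPANYID, MyReportName, DEX_ROW_ID
FROM DYNAMICS..SY70700
WHERE USERID = 'sa'
```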
One last suggestion: since this edits a table in the system database, and not all organizations have a separate standalone environment, please make a backup and/or use something like the following to make a quick “table” backup. That way, if you delete more than you expect or something doesn’t work, the data is there to script back in with an INSERT statement.
SELECT * INTO DYNAMICS..SY70700_BACKUP FROM DYNAMICS..SY70700
When you’re done, if all goes well, you can drop the backup table so you don’t have clutter in your database.
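For completeness, the cleanup step might look like this, using the backup table name from the script above (the note about DEX_ROW_ID is my own caveat, not something I tested here):

```sql
-- If you deleted too much, script the missing rows back from the backup
-- copy with an INSERT. Note that DEX_ROW_ID is an identity column on most
-- GP tables, so either exclude it from your column list when restoring or
-- use SET IDENTITY_INSERT around the restore.

-- Once everything checks out, drop the backup copy so it doesn't linger:
DROP TABLE DYNAMICS..SY70700_BACKUP
```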
Today’s #TipTuesday is a bit about the “Originating” fields in SmartList, what they are and what you may find in them. These fields are found under the Account Transactions SmartList (under Financial) and in certain circumstances, you will be able to extract *some* subledger information from your G/L to supplement your account analysis.
What are they?
Here are some of the fields and what they may contain. To be honest, there are some items I can’t actually tell you the purpose of!
Originating Company ID
For intercompany transactions
Originating DTA Series
Unsure, perhaps related to MDA (Multi-dimensional analysis)?
Originating Journal Entry
I expected this to be related to Copy or Correct functions but no matter what I tested, nothing populated here. So… I’m not sure anymore about what should be here! (I tested Copy, Back Out and Back Out & Correct entries).
Originating Debit & Credit amounts
When using multi-currency, these are going to be the transaction debits and credits in the original currency of the transaction. In comparison, the regular “Debit” and “Credit” fields will always be the functional currency amounts.
Originating Source
This is often the original batch number. There is a field in Account Transactions called Batch Number, but it’s usually empty after something is posted. In my Fabrikam data, Originating Source tended to be useless because the default posting settings are Post To the GL, not Post Through, so my “batch numbers” were actually audit trail codes, not the original batch numbers from the source itself. If posting settings are set to Post Through for a given Series/Origin, it would show the actual batch number in most cases. For GL transactions, it does show the batch number, which is often helpful later.
Originating Type
In my testing, from various transactions, the only thing this shows is “Normal” for a standard GL entry and “Clearing” for a clearing entry. Even on a reversing GL trx it shows “Normal”.
Originating Posted Date
This is “the date on which the original transaction was posted”, as in the system date.
Originating TRX Type
In my testing, the values here were useless. I believe in the GL tables this would contain the Trx Type integer value of the originating transaction; however, SmartList seems to translate that integer to the corresponding GL “word” for that Trx Type value. For example, my data would show “Standard” or “Reversing” on things where that made no sense, since the trx wasn’t originally a GL journal entry.
Originating TRX Source
This is the Audit Trail code for the original batch from the originating transaction.
Originating Control Number
This can be the original doc number, but not always, especially when the original module has both a doc number field and a sequential transaction number field (PM transactions, for example, have a Voucher Number, which is sequential and independent of the Document Number). So be careful if you are relying on this data for any reporting; the next field is more reliable if the Doc Number is what you are looking for.
In the Bank module, the value was an integer but meaningless as far as I can tell.
In SOP and RM, the value was the document number.
In POP (receiving), the value was the receipt number.
In PM (trx and payments), the value was the voucher or payment number.
In IV, the value was the transaction number.
Originating Document Number
In comparison to the above field, Originating Control Number, this field is pretty consistently the actual original Doc Number.
In Bank, it’s the receipt number, cheque number, or transfer number.
In RM and SOP, it’s the document number.
In POP, it’s the receipt number. ** This is a notable exception: it did not populate the Vendor Doc Number I put in the receiving transaction, perhaps because it isn’t always a required field (depending on whether you are entering a Shipment or a Shipment/Invoice).
In PM, it’s the document number (or cheque number).
In IV, it’s the transaction number.
Originating Master ID & Name
These are the single most useful fields in the set, IMHO!
For the Bank module, the ID is the chequebook and the Name is the name on the transaction (i.e., Paid to/Received from). On Bank Transfers, interestingly enough, the ID is “Transfer From:” plus the chequebook, and the Name is “Transfer To:” plus the chequebook.
In SOP and RM as well as POP and PM, it’s the customer or vendor ID and Name. I did not test payroll but if you post payroll in detail, this should also be the Employee ID and Name (at least I know it does this with Cdn Payroll if you are posting in detail). ** side note: this would be an important reason NOT to post Payroll in detail as the detail available to anyone with GL smartlist access would be far too much for most organizations!
In IV, the fields are empty.
Originating Sequence Number
This also contains a number of things, but primarily appears to be the line sequence number from the originating transaction (that’s my guess based on the numbering sequences I’m seeing; I did not correlate them with the subledger transactions to confirm 100%).
Here is a screenshot of an Excel extract for my testing transactions. On the left, I typed in the nature of the transaction. In many cases I deleted some rows of data just to fit the info into a small picture (i.e. only showing one side of a transaction instead of both debit and credit). I’ve hidden fields that would normally be visible, like Date, Debit/Credit, Account Number etc. because I simply wanted to show you some of the Originating fields on a sample of transactions.
Most of the data I’m describing as something that might be visible is based on the default account configuration and posting settings, namely:
Accounts set to “Detail” under “Level of Posting from Series”
Posting Settings set to “Create Journal Entry per Transaction”
Under these scenarios, you should see the same kinds of data in my example above, for those kinds of transactions. If you have Summary as Level of Posting, some fields will still populate but you may not see others, like Master ID and Name won’t populate, because the GL entry is a summary.
The Fixed Assets module and the Invoicing module (not RM and not SOP) appear to be the anomalies in the group, based on what I’m seeing. They must be coded differently, because both of those modules populate very few of the Originating fields at all – they do not post Master ID, Name, Source, Control Number or Document Number. I didn’t test Manufacturing… FWIW.
The other caveat of this post is I’m describing only what the out-of-the-box Account Transactions smartlist holds and how it translates some of the data. The table itself may contain things that don’t translate in the smartlist (such as the Originating Trx Type not making sense). I didn’t look at the tables at all as the intent of this tip is purely to give some information on what extra info you can get from a GL smartlist that you may not be aware of!
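That said, if you do want to peek at the underlying data yourself, the open-year posted GL detail lives in the GL20000 table in each company database. Here is a sketch of what that might look like; the column names are from my memory of the GP schema and “TWO” is the Fabrikam sample company database, so verify both against your own system before relying on this:

```sql
-- A peek at some of the Originating columns as stored in GL20000.
-- If any column name doesn't match your GP version, list them first with:
--   SELECT name FROM sys.columns
--   WHERE object_id = OBJECT_ID('GL20000') AND name LIKE 'OR%'
SELECT JRNENTRY, ORTRXSRC, ORCTRNUM, ORDOCNUM, ORMSTRID, ORMSTRNM
FROM TWO..GL20000
WHERE ORMSTRID <> ''
```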
That’s it for this tip… as always, I hope you found it useful!