The Dreaded Question: “Why Doesn’t This Metric in System A Exactly Match System B?”

Imagine, if you will, that a prospect hires you to design a new analytics implementation on a new tool. Everything is going great, and you're getting all the information you need from the stakeholder interviews you've conducted. You've identified all the systems in play, including the incumbent analytics tool that is going to be replaced. Your client continually tells you how error-prone and inaccurate it is.

You complete the initial release of the new analytics implementation and start working with your client on a maturity roadmap to continuously enhance it. You're making progress, but after a few months some users start to question why the numbers from the new implementation differ from those of the previous system. The new implementation has been tested thoroughly, and there are no major issues. You reassure your client that the numbers are accurate. Yet even though they called the former system error-prone, they insist on comparing the new to the old, and since the new system generates different numbers, it must be wrong.

This seems like the perfect episode of The Twilight Zone, yet it's a scenario that comes up frequently, with some variation. It's something I've experienced in the past and still experience today. No matter how much you confirm that the new analytics implementation is accurate, your client still defers to the previous one.

There are multiple reasons for this. The client used the old system for so long that they grew accustomed to the information coming from it. They made decisions based on that information. In their eyes, anything different must be wrong.

It’s a Trap!

Early in my career, I was involved in a few situations where I watched members of my team fall into the comparison-and-validation trap, only to end up with a client who was unhappier than at the beginning. Their goal was to ensure the client was satisfied, but the complete opposite happened.

In one situation, a client was comparing their web analytics tool to their backend financial tool of record. They questioned why revenue did not net between the two; it was always different. My colleague explained that the backend financial tool and the web analytics implementation collected data differently and served two different purposes. The goal of the web analytics tool was to measure customers' on-site behavior and to identify functionality and areas of the site that performed well and those that didn't. It wasn't meant to track data like an accounting system. While the client was unhappy, this is where my colleague should have stopped (easy to say in hindsight).

After a very detailed comparison of the two systems, the client was shown how the backend financial tool was aware of information the web analytics tool couldn't be, including orders that were cancelled, fraudulent orders, and returns. The client demanded that this information be sent to the web analytics system. A project was scoped, but it was cost-prohibitive. In the end, the client was left deeply dissatisfied, feeling that the analytics implementation they had been sold, and had invested time and money in, was incomplete, even though expectations about its purpose had been set from the start.
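To make the gap concrete, here's a minimal sketch, using entirely made-up order data and a hypothetical status field, of why the two totals should never be expected to net: web analytics records revenue at the moment of checkout, while the financial system of record later subtracts cancellations, fraud, and returns.

```python
# Hypothetical illustration: why web analytics revenue rarely nets
# to the financial system of record. All data, amounts, and field
# names here are invented for the example.

orders = [
    # (order_id, amount, final_status_in_financial_system)
    ("1001", 120.00, "settled"),
    ("1002", 80.00, "cancelled"),   # cancelled after checkout
    ("1003", 45.00, "fraud"),       # flagged as fraudulent later
    ("1004", 200.00, "returned"),   # returned for a refund
    ("1005", 60.00, "settled"),
]

# Web analytics sees every completed checkout, at purchase time.
web_analytics_revenue = sum(amount for _, amount, _ in orders)

# The financial system only counts orders that ultimately settle.
financial_revenue = sum(
    amount for _, amount, status in orders if status == "settled"
)

print(f"Web analytics revenue: ${web_analytics_revenue:,.2f}")  # $505.00
print(f"Financial revenue:     ${financial_revenue:,.2f}")      # $180.00
print(f"Difference:            ${web_analytics_revenue - financial_revenue:,.2f}")
```

Both totals are "correct" for their own purpose; the difference is a function of when and why each system records revenue, not a defect in either one.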

In another situation, a client insisted on knowing why there were slight differences between their analytics solution and their marketing attribution tool. After a deep review, the client came away unhappy, feeling that both systems were limited.

Less Is More

This can be such a sensitive conversation, and what I've found is that you can't address the client's concern too lightly, but you can't dive too deeply either. If you don't take their concerns seriously enough, they feel disregarded; if you dig too deep, you seriously risk introducing doubt about your own solution. This is where the rapport you developed with your client at the start of the engagement is critical.

I've found the best way to handle these situations when they come up is first to understand the tools being compared. If one is being replaced, subtly remind the client why they are replacing it. If they are two different solutions with two different purposes, the best approach is to reaffirm the expectations you set at the beginning about what problems your solution solves.

In many cases, less is most certainly more.


Published by Jim Driscoll

Jim is an Implementation Consultant on the 33 Sticks team who brings 15 years of solution design and technology implementation experience, with 9 years focused on web analytics and digital marketing technologies. He is a fitness enthusiast and avid sports fan. He also thinks he can play golf.