Everything You Know About Digital Measurement Is Wrong

Dashboard Disinformation – Best Practices It’s Best NOT to Follow

Here are a few simple, oft-repeated best practices in digital measurement dashboarding that you may already know:

  • Focus on a small set of site-wide actionable KPIs
  • Measure your overall Site Satisfaction and compare it to your industry
  • Trend your Net Promoter Scores to track your efficiency in creating brand advocates

What you probably don’t know is that every one of these “best practices” is wrong. Deeply, fundamentally, and completely wrong.

Your measurement department has almost certainly been preaching the conventional wisdom in digital measurement: don’t overload on numbers. The key to successful dashboarding and reporting is finding a small set of site KPIs that are understandable and immediately actionable. And they’ve probably delivered you exactly that – a small set of key metrics like Site Conversion Rate, Total Visits Trend, overall Site Satisfaction, and Net Promoter Scores, all laid out in big numbers with great fonts, pretty colors, big trend arrows and lots of Tufte-inspired whitespace.

Sadly, these reports deliver neither understanding nor actionability.

Suppose I walk into your office and tell you that your Site Conversion rate is up 5%. You’ll probably be delighted. Now suppose I walk into your office and tell you that your Site Search Engine traffic is down 20%. That’s bad, right? But would you realize that the two metrics are related and are telling you exactly the same story? As you drive less early-stage traffic to your site via search, your Conversion Rate will go up. By focusing on individual metrics, you’re missing the system behind the numbers.
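The arithmetic behind that story is worth seeing once. Here is a minimal sketch with made-up numbers (the segment sizes and per-segment rates are illustrative assumptions, not anyone’s real data): cut low-intent search traffic by 20% and the blended site-wide conversion rate rises, even though no visitor segment converts any better than before.

```python
# Illustrative numbers only: two traffic segments with very different intent.
SEARCH_VISITS, SEARCH_CONV = 10_000, 0.01   # early-stage search traffic: low intent
DIRECT_VISITS, DIRECT_CONV = 5_000, 0.05    # returning/direct traffic: high intent

def site_conversion(search_visits):
    """Blended site-wide conversion rate for a given volume of search traffic."""
    orders = search_visits * SEARCH_CONV + DIRECT_VISITS * DIRECT_CONV
    return orders / (search_visits + DIRECT_VISITS)

before = site_conversion(SEARCH_VISITS)        # full search traffic
after = site_conversion(SEARCH_VISITS * 0.8)   # search traffic down 20%

print(f"before: {before:.4f}  after: {after:.4f}")
# The blended rate goes up purely because the visitor mix changed --
# neither segment's own conversion rate moved at all.
```

Run it and the “good news” of a rising conversion rate turns out to be nothing but a shrinking share of early-stage visitors.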

In the real world, there are twenty or a hundred or a thousand different reasons why your Conversion Rate might be up or down. Given the simple bare fact of a change in your Conversion Rate there is simply no way to know which of those nearly infinite explanations might be true. No way to know if the news is really good or bad. And by carefully removing all but a few key metrics from your dashboard, your measurement team has thoughtfully made it impossible for you to ever find out.

Site-wide metrics, from Conversion Rate to Total Traffic, are nearly all worthless. To be meaningful, a metric needs to be placed in the context of “who” it’s about and “what” those customers were trying to accomplish. If your dashboarding and reporting lacks this context, it isn’t worth the paper it’s printed on.

If there is so little utility in site-wide metrics reported from Web analytics systems like Omniture or Google Analytics, you might be tempted to steer your interest toward another staple of digital measurement – the online intercept survey. Survey research has a rich tradition with deep intellectual roots. So it stands to reason that the numbers it delivers are likely to be better than those from Web analytics.

Except they aren’t. Online survey research is based on a sample of your Website visitors. Of course it is. But have you ever thought through the implications of this? Every time you launch a new marketing campaign, every time you improve (or worsen) your Search Engine Optimization (SEO), you effectively change the sample of visitors coming to your website and, therefore, the sample of visitors in your online survey. There is no analogue to this in the traditional survey world, where sampling and marketing were carefully separated.

The upshot is that site-wide trends in satisfaction or Net Promoter Scores (and those are probably the only numbers you ever see on a Management Dashboard) are completely useless. Instead of tracking true changes to customer satisfaction, you’re actually tracking changes to the site population caused by your marketing program. Even worse are benchmark satisfaction numbers against other sites. Comparing your bad sample to their bad sample doesn’t make for a good sample!
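The same mix-shift effect can be sketched in a few lines. The segment scores and shares below are invented for illustration: nobody’s satisfaction changes, yet the survey’s reported average falls after a campaign simply because the campaign changed who shows up to be surveyed.

```python
# Hypothetical figures: two visitor segments with fixed, unchanging satisfaction.
LOYAL_SAT = 8.0   # mean satisfaction of repeat visitors (0-10 scale)
NEW_SAT = 6.0     # mean satisfaction of first-time visitors

def surveyed_average(loyal_share):
    """Average score an intercept survey reports for a given visitor mix."""
    return loyal_share * LOYAL_SAT + (1 - loyal_share) * NEW_SAT

before_campaign = surveyed_average(0.60)  # 60% of surveyed visitors are loyal
after_campaign = surveyed_average(0.40)   # campaign floods site with newcomers

print(f"before: {before_campaign:.1f}  after: {after_campaign:.1f}")
# The trend line drops, yet every individual visitor is exactly as
# satisfied as before -- only the population being sampled has changed.
```

A dashboard trending that average would report “declining satisfaction,” when the only thing that declined was the share of loyal visitors in the sample.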

If you’re like most marketing executives interested in Digital, it’s time you realized that the pap served up by your agencies, high-priced Web analytics gurus and measurement teams is not fit for viewing, much less for deciding.

Do the people feeding you these numbers know better? Mostly, the answer is no. And yes, that’s the real problem.

People who never use the data to make decisions are all too prone to accept it without question and, odd as it may sound, your measurement professionals probably never actually use the data they dish out. Worse, the idea that senior decision makers need simple, clear numbers has become the guiding mantra of the measurement community. There’s nothing wrong with simple, clear numbers, of course. But when metrics are distilled down to a point where they lack meaning or context, misrepresent reality, and hide the truth, simplicity is no virtue. It’s up to you to demand more, and by banishing these common, easy metric crutches from your dashboards, you’ll be on the right path to getting the information you need.