Let’s run through why it’s so important that you know how to interpret analytics data, why it’s dangerous to assume that presented data is representative, 4 warning signs that you’re dealing with sweetened data, and how you can generally guard against trickery.
Why you need to be able to interpret analytics data
In the business world, presentations are important and extremely common. The larger a business becomes, the more likely it is to explore the possibility of acquiring (or investing in) other assets and companies, or even consider merging with a competitor for mutual benefit. With each option will come some form of pitch and an exchange of performance metrics.
It really doesn’t matter what exactly your professional circumstances are. If you aspire to great things in the business world, you must know how to parse analytics data and understand precisely how significant it really is.
Sometimes you’ll have someone to help you with it, but not always — and there are inherent risks to trusting consultants, no matter how solid their reputations may be. In the end, you’ll need to offer some kind of useful input. Data tools can do a lot, but they can’t spot all potential biases, especially contextual ones.
The dangers of assuming analytics data to be representative
In every kind of business negotiation, each party is trying to get the best possible deal for their side of the equation. That’s entirely understandable, but it leads to people taking liberties with the truth when the time comes to break out some performance data.
If you’re in a presentation and the presenter brings up a chart showing smooth upward progress, they might say that it’s a chart of their sales, but you won’t actually know unless you take a very close look.
What if you trust the presenter, though? Surely then you can just take their word for it? Unfortunately, even if they are as trustworthy as you believe them to be, their competence might not be on the same level.
Presenters don’t always create their own presentations, and even when they do, their ability to create presentations doesn’t guarantee that their work is even close to representative. They might genuinely feel that their chart shows incredible sales performance, all because they never read the fine print.
And when you simply shrug your shoulders and opt to believe the narrative you’re told, you’re assuming a great deal of risk. It could be that the business that looks so strong is actually a dud and will start dragging your operation down the moment you acquire it. Something like Google Data Studio can easily make the most nonsensical data set look professional, after all.
That aside, let’s go through 4 warning signs to look out for:
Warning sign #1: Suspiciously-round numbers
However round a business might make its pricing, genuine performance stats very rarely land on neat round numbers. If you’re reviewing some analytics and you spot that every number happens to be curiously round, that’s a major red flag and cause to challenge the data. It’s even considered a sign of likely malfeasance.
Of course, rounded numbers don’t always stem from efforts at deceit, or even inaccuracy in the source data — sometimes they’re even rounded down just to neaten things up (perhaps thinking that boasting 100k Twitter followers sounds more impactful than boasting 100,317).
But when you’re trying to make a decision about value, you need to know that you’re getting the data in a plain and unaltered state. It’s always better to stick to the legitimate figures.
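If you’d rather not eyeball every figure, a quick script can flag how many reported values are suspiciously round. This is a rough sketch in plain Python; the figures and the divisor thresholds are invented purely for illustration, not taken from any real pitch.

```python
def roundness(value, bases=(1000, 500, 100)):
    """Return the largest base that divides the value evenly, or None."""
    for base in bases:
        if value % base == 0:
            return base
    return None

# Hypothetical figures from an imaginary pitch deck, purely for illustration.
reported = {
    "Monthly revenue": 250_000,
    "Twitter followers": 100_000,
    "Email subscribers": 45_000,
    "Monthly orders": 3_000,
}

round_count = 0
for name, value in reported.items():
    base = roundness(value)
    if base:
        round_count += 1
        print(f"{name}: {value:,} is a clean multiple of {base:,}")

share = round_count / len(reported)
if share > 0.5:  # illustrative cut-off, not a standard
    print(f"Red flag: {share:.0%} of the figures are suspiciously round.")
```

If most of the numbers in a deck trip a check like this, that’s your cue to ask for the raw exports rather than the summary.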
Warning sign #2: Arbitrary metrics
Each business will have distinct KPIs — that’s perfectly normal, resulting from differences in goals, methods, and products and/or services. But unless a metric speaks for itself (as in the case of Net Profit, for instance), its presence must be clearly justified, or else give rise to the suspicion that it was included simply because it looked positive. You need to know which stats matter and why.
Using default Google Analytics metrics, for instance, I could likely assemble a positive-sounding full-page report about almost any website you care to mention, leaning heavily on myriad meaningless metrics that happen to sound good.
Take the highly competitive world of ecommerce platforms as another example. There are plenty of multi-channel ecommerce platforms available today that come with built-in social selling functions; social selling is both profitable and popular. But as more and more platforms surface and push into the market, how do you make an informed choice?
If you’re using case studies to research the best platform for you, be warned. Let’s say you’re presented with a case study of an ecommerce platform’s client who reported a 75% monthly increase in social sales. There’s a decent chance that it’s referring to a total of 7 sales from a single social network, up from 4 the previous month, in a wholly insignificant and arbitrary “improvement”.
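To make the arithmetic concrete, here’s a minimal sketch of how that headline figure works. The sales numbers mirror the hypothetical case study above, and the MIN_BASE threshold is an arbitrary illustration, not an industry standard.

```python
def percent_change(previous, current):
    """Month-on-month percentage change."""
    return (current - previous) / previous * 100

prev_sales, curr_sales = 4, 7  # the hypothetical case-study numbers

change = percent_change(prev_sales, curr_sales)
print(f"Headline: social sales up {change:.0f}%")         # "up 75%"
print(f"Reality: {curr_sales - prev_sales} extra sales")  # 3 extra sales

# A crude guard: treat percentage claims built on tiny bases with suspicion.
MIN_BASE = 100  # illustrative threshold only
if prev_sales < MIN_BASE:
    print("Warning: the base is too small for the percentage to mean much.")
```

The percentage is technically accurate, which is exactly why it’s worth asking for the absolute numbers behind any growth claim.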
Warning sign #3: Unexplained timeframes
Since performance is to be demonstrated over time, it’s entirely standard to chart year-on-year or month-on-month metrics. Done transparently with the right metrics, it’s a great way of honestly and quickly showcasing the overall performance of a business, but it isn’t always done in such a thoughtful way, and that’s where unexplained timeframes come in.
For instance, you might look at a report and see three highly-positive charts, only to look more closely and notice that they use radically different timeframes. One goes back two years month-on-month, but another dates back to the start of the year, while the third only covers the last 30 days.
This is a worrying sign that timeframes have been chosen specifically to avoid showing negative results — and if you only get to see the positive results, you’ll leave with a very inaccurate idea of the value of the business.
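A practical defence is to refuse to compare charts until every series has been recomputed over the same window. Here’s a minimal sketch, assuming you can get the underlying monthly figures yourself; the revenue numbers are invented to show how two windows on the same data can tell two very different stories.

```python
# Hypothetical monthly revenue keyed by (year, month); in practice you'd
# pull these from the source analytics, not from the presentation.
monthly_revenue = {
    (2023, m): v for m, v in enumerate(
        [80, 95, 90, 70, 65, 60, 58, 55, 50, 48, 52, 54], start=1)
}

def window(series, months):
    """Keep only the most recent `months` values, in date order."""
    keys = sorted(series)[-months:]
    return [series[k] for k in keys]

# The same data over two windows:
print("Last 3 months:", window(monthly_revenue, 3))   # looks like a recovery
print("Full year:    ", window(monthly_revenue, 12))  # shows a long slide
```

If a presenter can’t (or won’t) show you every key chart over one consistent window, assume the windows were chosen for a reason.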
Warning sign #4: Cherry-picked comparisons
To lend some additional context, it can be useful to provide industry comparisons in an analytics-heavy presentation.
After all, a conversion rate outside of any context doesn’t mean that much — if you can’t relate it to the previous conversion rate of that system, you can relate it to the industry average, showing superior performance and making it easier to gauge value.
But what if you’re reading a report and you see that the average post length of a blog is 2.5k words, far above the average of the industry-leading competitor? You might well wonder why that specific thing merited a mention, and further inspection might reveal that every other metric falls short relative to the competitor.
The presenter simply searched for a metric (any metric) that could allow a favorable comparison to a hit website, all in the hope that people would see it and draw wild conclusions about overall relative quality.
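The counter is simple: line up every metric you can obtain against the competitor, not just the one that made it into the deck. A rough sketch with entirely made-up numbers, just to show the shape of the check:

```python
# Hypothetical side-by-side metrics; in reality you'd request these directly.
ours = {"avg_post_length": 2500, "monthly_traffic": 12_000,
        "conversion_rate": 1.1, "returning_visitors": 800}
theirs = {"avg_post_length": 1400, "monthly_traffic": 450_000,
          "conversion_rate": 3.2, "returning_visitors": 60_000}

better = [m for m in ours if ours[m] > theirs[m]]
worse = [m for m in ours if ours[m] < theirs[m]]

print("Ahead on: ", better)   # ['avg_post_length']
print("Behind on:", worse)    # everything else
if len(better) < len(worse):
    print("Cherry-picking risk: only a minority of metrics compare favourably.")
```

One favourable metric out of many isn’t evidence of quality; it’s evidence of selective framing.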
Always factor in the bias of the presenter
We’ve been through why you need to know how to interpret data, how damaging it can be to assume that data is representative, and 4 warning signs to look out for when you’re reviewing a presentation. But your skepticism shouldn’t begin and end there.
It isn’t always fun to be doubtful of everything you read, but it’s essential that you find a way to do it. To that end, always factor in the bias of the presenter.
With a vested interest in the consequences of the presentation, they’ll be eager for the data to come across in a particular way, so anticipate that and be ready to ask follow-up questions if needed. If you have to directly request an unfiltered view of the source analytics, do so.
Keep in mind that it can be quite hard to be dispassionate about business, particularly when you’re talking about the value of something you’ve put a lot of work into.
So don’t hold it against someone if they’re a little too eager to dismiss the negatives. It’s possible to be polite, assume good intentions, and remain skeptical about everything presented to you.
In the end, being too trusting opens you up to manipulation and exploitation, so don’t leave yourself vulnerable. Know what data really means, and never be fooled again.