Why we use averages - and the impact on decision making


Actual Experience

Average load times. Average revenue. Average ROI. In business we work with averages a lot. But why? And how does that affect our business outcomes? In an increasingly data-driven world, it's important to understand how we consume data and what that means for how effectively we act on it and make decisions based on it.

A pattern as old as time

As human beings, we are predisposed to spot patterns. In fact, according to renowned tech thinker Ray Kurzweil, author of How to Create a Mind, the ability to see and understand patterns is the foundation of how the brain - and thus AI - works.

Patterns help us to understand the world. If someone says “2, 4, 6, 8…” you automatically follow up with “…10, 12, 14, 16.” If you continued “18, 20, 22, 35, 37” it would be very easy to spot that something had gone wrong.

The problem is that while simple patterns - and the anomalies within them - are easy to spot and therefore correct, the real world is not a simple place. The data we are presented with every minute of every day is complex beyond our ability to understand as quickly and easily as we often need to.

That’s why over the years we’ve worked hard to create our own patterns to simplify the world to a point where we can understand it. Enter statistics.

A lesson on statistics from Nicolas Cage

Statistics serve a very useful purpose, helping us extrapolate answers from manageable samples of data. But, as the saying goes, “there are three kinds of lies: lies, damned lies and statistics.”

Without the right context, you can make numbers say just about anything. Just look at Tyler Vigen, author of Spurious Correlations, who very neatly charts a correlation between the release of Nicolas Cage films and the number of people who drowned by falling into a pool.


[Chart: Nicolas Cage film appearances vs pool drownings, from Spurious Correlations]


Of course, we all know that correlation doesn’t automatically imply causation. But you can see how in a less frivolous context, charting two sets of data and making assumptions about how they are linked could lead to problems.

And then there is the problem of averages.

If it’s neat, it’s probably not accurate

The problem with averages is that by flattening out the inherent peaks and troughs in our data - as they’re designed to do - they can hide the sort of fluctuation that has a marked effect on the way a person interacts with the digital world - their human experience.

Let’s take a simple example: webpage load times. According to research by Google, if the load time of a web page goes from 1 second to 3 seconds, the probability of a bounce increases by 32%. If it goes from 1 second to 5 seconds, that probability increases by 90%.

Notice that the second two-second increase (3 to 5 seconds) drives far more people to give up than the first one (1 to 3 seconds). When you think about human behaviour, that makes complete sense: a fairly slow website is annoying but you’ll persevere if you’re committed, while a very slow website is just too frustrating and definitely not worth the wait.

The problem is that if the data shows the average load time of your website is around 3 seconds, you’re going to feel fairly confident that it’s doing ok. But what if those times are actually oscillating wildly between a very acceptable 1 second and a dangerously unacceptable 5 seconds?

Those dips where it takes 5 seconds will lose you far more visitors than you’d have lost if the site had held consistently at 3 seconds, even when you factor in the visitors who are happily getting 1-second load times and not disappearing.
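The arithmetic behind this is easy to sketch. In the snippet below, the 10% baseline bounce rate at a 1-second load is a purely hypothetical figure for illustration; the relative increases (+32% at 3 seconds, +90% at 5 seconds) are the Google figures quoted above.

```python
baseline_bounce = 0.10  # hypothetical bounce rate at a 1-second load (assumption)

def bounce_rate(load_seconds):
    """Bounce rate implied by the quoted relative increases (a rough step model)."""
    if load_seconds <= 1:
        return baseline_bounce
    elif load_seconds <= 3:
        return baseline_bounce * 1.32  # +32% going from 1 s to 3 s
    else:
        return baseline_bounce * 1.90  # +90% going from 1 s to 5 s

# Site A: every page load takes a steady 3 seconds.
steady = [3, 3, 3, 3]

# Site B: loads oscillate between 1 s and 5 s -- the same 3-second average.
oscillating = [1, 5, 1, 5]

for name, times in [("steady", steady), ("oscillating", oscillating)]:
    avg_load = sum(times) / len(times)
    avg_bounce = sum(bounce_rate(t) for t in times) / len(times)
    print(f"{name}: average load {avg_load:.1f}s, expected bounce {avg_bounce:.1%}")
# steady: average load 3.0s, expected bounce 13.2%
# oscillating: average load 3.0s, expected bounce 14.5%
```

Both sites report an identical 3-second average, but the oscillating one loses noticeably more visitors - exactly the fluctuation the average hides.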


The impact on the bottom line

This particular example could have massive implications for your brand reputation, as the Financial Times found out in 2016. It ran an experiment that divided subscribers into two equal groups: one was given access to the normal site, while the other was served a version that took an extra 5 seconds to load.

First of all, the second group engaged with less content. That engagement also dropped the longer the experiment went on (it lasted 28 days in total). Which makes sense when you stop thinking about numbers and again think about human behaviour - we’re pretty tolerant when something happens once or twice, but a lot less forgiving when it keeps happening.

And brand reputation is only one area where this is a problem. Internally, poor human experience can have a huge impact on your teams, causing poor productivity, disengagement, low morale and even staff turnover.

The point is that by using averages, you hide the true nature of the fluctuations in a system and can miss the most extreme cases. As a result, you remain unaware of the real problem you’re facing - in this case, a higher bounce rate or a bigger drop in engagement than your averages would suggest.

And the longer you don’t notice the problem, the greater the impact on the business.

Need to get more from your reporting? Download our 'Reporting vs Insight whitepaper' and find out why CIOs are using human experience to drive their digital strategy.