Too much love can kill you, sang Freddie Mercury. So can bad statistics…

Freddie Mercury sang “too much love will kill you”. That might be true, but as an accountant, I’ve got to say that love isn’t usually on anybody’s mind when they come to see me.

So I’m not qualified in the love department, but I do have stronger views on how businesses can end up doing crazy things just because people don’t understand basic statistics.

Now, I know what you’re thinking… “it’s easy for you, you play around with numbers every day so you’re used to them”. But I always struggled with statistics while doing my accounting exams, until, mercifully, something “clicked” shortly before my statistics exam and I managed to pass.

The problem I had with statistics was that, like most people, I was taught statistics as a series of techniques – how to do such-and-such a calculation – without being taught the broader concepts and underlying principles which made a particular technique work.

It wasn’t until I realised that the trick to making statistics easy was to do the thinking bit first, and only then get the calculator out to crunch the numbers, that it finally made sense.

Don’t get me wrong, there are still some complications in getting the correct answer, but the way statistics is usually taught, emphasising techniques over thinking, explains why so many people struggle to understand what it’s all about.

In just the last couple of weeks I’ve come across these three examples which hopefully illustrate the point.

“Consistently high quality”

A recent article on tes.com suggesting that leadership teams in further education colleges were consistently better than leadership teams in schools caught my eye at the weekend.

I’ve worked extensively with schools, colleges and universities and encountered lots of leadership teams across the education sector. And while it’s almost certainly true that a good leadership team in a college is better than an average leadership team in a school, that isn’t comparing like with like.

I’ve encountered appalling leadership teams in colleges and brilliant leadership teams in schools, and indeed vice versa. But on average, a good leadership team anywhere, inside or outside education, looks much like every other good leadership team in the world.

There’s nothing “consistently” good about leadership in colleges, although Further Education does have some exceptional leaders, including some I’ve been privileged to work with.

However the sector also has its share of leaders who have led their colleges to disaster.

The TES article is only one person’s perspective, admittedly, and the writer makes that clear. I’ve no reason to suppose it isn’t an accurate reflection of his time working in both schools and further education colleges.

But one person’s perspective, however valuable it might be for them, isn’t necessarily reflective of an entire sector with hundreds of schools and colleges and thousands of people in leadership teams across the country.

So don’t make the mistake of scaling up one person’s experience and imagine it’s reflective of some bigger sector or market. It’s very unlikely to be anything of the sort.

Using outliers to prove a point

An “outlier” is an out-of-the-ordinary result that’s so far away from the “average” that relying on it might not be wise.

You see this in marketing a lot – “Apple do this and it works, so we should do it too”. Apple…or indeed any other business darling…does a lot of things that nobody else makes work. Just because Apple makes a premium price offering work like gangbusters doesn’t mean you can 10x your prices and still expect to win any business.

HR people like this line of argument too – “Google organise their teams this way, so we should too.”

Some people even wrote a book about how Indian managers are unusually good, due, according to the authors, to a combination of the Indian education system and the competitive upbringing of children there. They even cite half a dozen or so global CEOs who do, in fact, happen to have an Indian background.

But it’s also true that most global CEOs do not have an Indian upbringing, and that many non-Indian CEOs achieve the same or better results than the Indian superstar managers used as case studies in the book.

India has approximately 17% of the world’s population, but considerably less than 17% of superstar global CEOs. India does admittedly have some business leaders who successfully steered global companies to considerable success. But, statistically speaking, you can’t infer that Indian managers are uniquely excellent just because a handful of them clearly are.

Here’s the simple truth. Hard-working, smart, competitive people in just about every country of the world are more likely to end up as global CEOs than their countryfolk who prefer an easier life. This is true whether you’re American, British, Bulgarian, German, French or any other nationality.

I’ve been to India several times and worked with Indian managers both there and outside India. My experience is that there are undoubtedly some exceptional managers amongst them, but average managers there are about the same as average managers in every other country, and a similar proportion to any other country in the world are truly appalling.

The percentages in each category are not materially different between India and any other nationality or ethnic grouping on the planet. Intelligence, a capacity for hard work, a competitive nature and many of the other personal qualities the book cites as evidence for Indian managers being so good are, as far as I can tell, approximately normally distributed across the whole human population.

Nobody would write a book about smart, hardworking kids who went to Eton and Oxford and ended up in a top job somewhere. A book about a Harvard graduate running a global business probably wouldn’t be a best-seller either. In both cases, that’s pretty much what we’d expect to happen – both are non-stories.

So it should be no surprise that, statistically, at least some of the population of India are world-beating international business executives. I’d be a lot more surprised if there weren’t any Indian executives leading global multinationals, given that 17% of all the people on the planet, 1.4 billion of them, live in India.

So don’t use outliers…management practices from Apple, Google, India or anywhere else for that matter…to evidence your arguments.

Outliers exist in every field, and I’ve no problem celebrating them where they exist. But be clear that the results of exceptional businesses or individuals are unlikely to be repeated if you try the same thing elsewhere.

By all means use outliers to inspire you to make improvements, but don’t imagine for a moment that the results outliers deliver are likely to be reproduced somewhere else by entirely different people, because you’ll almost certainly be disappointed.

Small groups


“Research shows…” is one of the most-overused expressions in management.

It’s often the magic key that someone hopes will unlock a budget or a strategic decision of some sort, but I always want to know exactly what sort of research has been carried out, and how rigorous it really was.

A tweet I saw recently is a great example. Even if you do research which is decent enough in itself, the results from any small group are, at best, no more than mildly comforting in terms of how an entire target market might view your business.

Sometimes people feel the need to draw statistical conclusions where none exist. Everybody likes to see some numbers and, as an accountant, I get to see more than most, to be fair.

However I also see enough numbers to get a feel for when I’m being fed a line.

If you’ve called together a focus group of 20 people and 58% of them say they like (or don’t like) something, in my book that means almost nothing.

58% of 20 people means about 12 people in the whole universe like what you’re doing. All it takes is a couple fewer saying “yes” and you’ve got a 50:50 deadlock.

Even if those 12 are theoretically representative of your entire customer base, which has about the same odds as looking out of your window and seeing unicorns dancing on top of rainbows, the realities of everyday life mean just about no meaningful conclusions can be drawn from those 12 people’s responses.
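To put a number on how little 12 out of 20 tells you, here’s a quick back-of-the-envelope sketch (Python, standard library only, illustrative figures of my own): if the wider market were actually split exactly 50:50, an exact binomial calculation shows how often a 20-person group would still return 12 or more “yes” answers purely by chance.

```python
from math import comb

def prob_at_least(k, n, p):
    """Exact binomial probability of k or more successes out of n trials."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# If the real customer base were split exactly 50:50, how often would a
# 20-person focus group still show 12 or more people (60%) saying "yes"?
print(f"{prob_at_least(12, 20, 0.5):.0%}")  # prints "25%"
```

In other words, roughly one focus group in four would hand you that “majority” even if the wider market were perfectly split, which is why a 12-out-of-20 result means so little on its own.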

Yet I’ve seen many a business case constructed on a similar premise. You might be surprised how few people seem able to explain their “research” in statistical terms.

Never take statistics at face value. Always find out how the research was done, and if the answer isn’t “we ran a statistically valid survey across a representative sample of a couple of thousand people”, the level of reliance you can place on the numbers meaning anything sensible is probably close to zero.

Even professional polling businesses know their 2,000-person surveys have a margin of error of a few percentage points either way. Your 20-person focus group will have a margin of error many times that, so you’d be unwise to make significant business decisions based on conclusions that could be out by 20-30% or more.
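The gap between those two error rates can be sketched with the standard margin-of-error formula for a sample proportion (a normal-approximation rule of thumb at 95% confidence; the sample sizes are the ones mentioned above, and the 50:50 split is the worst case):

```python
import math

def margin_of_error(p_hat, n, z=1.96):
    """95% margin of error for a sample proportion (normal approximation)."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# Worst case (a 50:50 split) for a professional poll versus a focus group
print(f"n=2000: +/- {margin_of_error(0.5, 2000):.1%}")  # prints "n=2000: +/- 2.2%"
print(f"n=20:   +/- {margin_of_error(0.5, 20):.1%}")    # prints "n=20:   +/- 21.9%"
```

The margin shrinks with the square root of the sample size, so you need roughly 100 times the respondents to be 10 times as precise.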

That doesn’t mean small group research hasn’t got any value. It most definitely does.

Individuals and small groups are ideal ways to get qualitative feedback about your business and its operations, or how people feel and experience your services.

Good qualitative research is under-appreciated in businesses because people are mostly looking to use numbers to support the case they’re making.

But most business cases which come my way have not been constructed using research methods rigorous enough to support the conclusion they propose.

Business cases tend to draw conclusions as if quantitative research methods had been used – that is, treating the numbers drawn from their research as if they were statistically valid – even though the data had no statistical validity at all.

That doesn’t mean heavy-duty quantitative research is necessary for every single business case, though.

If there isn’t a statistical basis for the conclusions, just say so, give me the verbatim feedback from a focus group to read through and let me understand what you’re proposing based on that.

Don’t present statistically-meaningless tables of numbers, graphs and charts to support your case when the underlying research isn’t rigorous enough to justify any conclusion, much less the one you’re putting forward.

Summary

The world would probably be a better place if we stopped allowing politicians to quote highly selective “statistics” (which is what they call their numbers, though they’re nearly always a completely partisan interpretation of numeric information that was never collected using valid statistical methods, quoted just because it happens to suit their case).

But much as we rail against politicians’ misuse of numbers and statistics, something very similar goes on in every business in the country on a more or less daily basis.

And a large proportion of those business cases end up on my desk “because we have to show the Finance Director some numbers to get his buy-in”.

I’d rather they didn’t bother. Either do the job properly and show me something that’s statistically-valid or stop pretending that some random numbers which were chosen because they happen to support your preferred outcome have any statistical meaning.

No Finance Director or CFO should be close-minded enough to rule out every initiative which doesn’t come accompanied by a bevy of numbers, charts and data tables.

We just want to know that you know your stats, because if you don’t, we’re unlikely to believe that anything else in your business case is going to be a solid foundation for decision-making either.

Next time someone presents you with a plethora of numbers to support a business case, try a quick “sense check” against the experiences above. You might be surprised by the insights you get from questioning people about how they’ve approached their research, and hopefully you’ll make better business decisions as a result.

(Photo by Stephen Dawson on Unsplash)