The late British Prime Minister Benjamin Disraeli said it best: “There are three kinds of lies — lies, damned lies and statistics.”
In theory, statistics are intended to make matters more understandable, but, in practice, they are often little more than a tool used to mislead the public into believing a particular falsehood.
And while there are certainly those who will argue that numbers never lie, any statistical survey can easily be manipulated through selective case studies, small or unrepresentative samples and a biased choice of which data to review.
Indeed, getting an objective, neutral analysis of statistics can be as slippery as trying to maneuver your way through an ice skating rink in a pair of roller skates.
So why do we hang on every new media poll that comes out on the U.S. elections, or stop eating potatoes because some medical report said that it might increase our cholesterol levels by as much as 8 percent?
To begin with, because we have been taught to believe — blindly — in what scientific research tells us.
But research, like all human endeavors, is in fact neither neutral nor objective.
Take, for example, India’s statistical claim last year that its national GDP growth had mysteriously skyrocketed from 4.7 percent to a whopping 6.9 percent in just three months.
After a number of national and international economic doubting Thomases began to question the Narendra Modi administration about this seemingly miraculous uptick, the government finally admitted that the new figure had been based on a “more discriminating” assessment of GDP using specific industries and geographies.
And then there was the case of China, which recently acknowledged that government reports claiming coal emissions were down by as much as 14 percent in 2015 might not have been quite accurate, since another official study suggested that coal consumption was actually up by 17 percent during the same period.
Whether measuring unemployment or taking the pulse of public opinion, statistics are unreliable gauges.
Statistical studies can be both erratic and deceptive.
The same study, framed in a particular light with a selective choice of data, can, in the hands of different interpreters, be turned on its head to “prove” an opposing theory.
And with a plethora of different studies to choose from, whoever is presenting the results of statistical research can cherry-pick whatever numbers best support their particular ideologies and biases.
It all boils down to what numbers are presented and in what context.
As the Huffington Post pointed out last July, Denmark, which was proclaimed in March as the most contented nation on Earth by the World Happiness Report, is also the second-largest per capita consumer of antidepressants.
The chief weakness of statistics is their subjective nature, which allows researchers to adjust their methodology and emphasis to suit their personal motivations.
When considering a series of statistical numbers, it is crucial to bear in mind the authorship of the data, i.e., who funded the research and why.
In the end, everyone has an agenda, and generating data that supports a particular agenda has become big business for politicians and corporations alike.
So, the next time you hear or read about statistical evidence that proves this or that hypothesis, consider the source and keep your skeptical guard up.
The data presented might just turn out to be another damned lie.
Thérèse Margolis can be reached at [email protected].