Mythbuster, Les Binet and Sarah Carter, DDB
Les Binet and Sarah Carter get a little bit angry about some of the nonsense they find around them… like the way that we ignore inconvenient data.
We have now written more than 30 of these Mythbuster columns. It's amusing to remember how, when embarking on the column, we were concerned that we wouldn't have a good enough supply of myths to bust. We needn't have worried. But we have often mused on quite why this should be so. Why, when so much time and money is spent on data collection and we have never had more data and innovative research available, do we find that people so often go with their invalid hunches and myths, rather than the evidence that contradicts them?
This problem is not unique to the advertising industry, of course. It's such a universal phenomenon that psychologists have a name for it: confirmation bias. It has its roots in a cluster of psychological biases, all of which blind us to those 'inconvenient truths' that threaten our pet theories.
First, we seek evidence that confirms our beliefs. Working in an agency, we see this every day. Planners rarely come to us looking for help to test their theories – they want evidence to support them. And if there is no obvious evidence, they rarely change their point of view – they just ask us to dig deeper.
Our desire for confirmation leads us to read articles and blogs that support our views and screen out contradictory stuff. And it makes us feel most comfortable when surrounded by like-minded people. In our ever more connected world, it's increasingly possible for us to live in a bubble, isolated from views other than our own. Other professions are no better – even scientists and medics, who really ought to be. And research suggests that training people in proper hypothesis testing doesn't help. It seems this sort of bias is hardwired into our brains.
The second problem is that even when evidence contradicts our beliefs, we tend to ignore it. Psychological experiments show we demand much higher standards of proof for things we would prefer not to believe, whereas corroborative evidence tends to go through on the nod. Brain-scanning shows that the cognitive dissonance we feel when confronted by contradictions makes us work very hard to make them go away.
Our tendency to seek confirmation and ignore or deny contradictions is a dangerous combination. It makes us blind to the flaws in our theories (Daniel Kahneman calls it 'theory-induced blindness') and it makes us see positive evidence where there is none.
Why do our brains work this way? Partly, it's a failure of our reasoning ability. Rigorous hypothesis testing can be hard work, even for data specialists. The ordinary heuristic thinking that all of us (even econometricians) use every day just isn't up to the job. Partly, too, it's because we hate to see our theories disproved and we indulge in wishful thinking.
Memory also plays a role. Research shows that people tend to remember fake information supporting their beliefs long after being shown the information is false. In fact, it's not uncommon for contradictory evidence to be remembered as supporting evidence. As a result, attempts to debunk myths often just entrench those myths further – a depressing thought for the authors of a column like this.
Finally, social factors are at play. If people find evidence contradicting their theories, they rarely broadcast it. In a world where most people surround themselves with like-minded people, it would put them in conflict with everyone around them. Better to sweep the evidence under the carpet.
So can anything be done about this tendency to ignore dissonant data? We have two suggestions. First, employ some people who are experts in hypothesis testing. Second, employ one or two heretics who will challenge the orthodox view. But then, we would say that, wouldn't we?
This article originally appeared in the July/August 2013 issue of Admap.