I’ve just read a great article in The New Yorker called ‘Why Facts Don’t Change Our Minds’.

It ties into topical political debates about ‘fake news’ and ‘alternative facts’, probing how people process information – and whether they even care about the truth. It provides good food for thought for anyone in the business of communicating and seeking to influence attitudes and behaviour, whether for social, political or commercial marketing ends.


Fascinating experiments

The first part of the article, which outlines two fascinating psychological experiments from the 1970s, struck me in particular.

In the first, two groups of people were shown a set of 25 suicide notes and asked to guess which were genuine and which were fake.

One group was told (falsely) they’d guessed almost all correctly, scoring 24 out of 25. The other group was told (falsely) they’d only got 10 out of 25 right. In fact, both groups had got roughly the same scores.

The deception was then revealed to the participants and the two groups were asked to estimate how many they’d actually got right and whether they thought they did better than the average person would.

Strikingly, the first group estimated they’d got far more right than the second, even though – since the deception had been revealed – they had absolutely no reason to think this.

It seems they’d each been given a deep-seated confidence or insecurity about their performance which didn’t shift when the truth was revealed. As the researchers noted, “Once formed, impressions are remarkably perseverant.”


‘Impressive’ failure of reason

In the second study, participants were handed information packs with a range of biographical details about a firefighter called Frank. One group’s packs described how Frank was a highly successful firefighter and noted that, on tests of his attitude to risk, he always chose the safest option. The other group’s packs also said he always chose the safest option, but that he was a poor performer in his job.

The researchers then revealed that all the information was entirely fictitious – and next asked the participants to say what attitude towards risk they thought would make the best firefighters.

Despite now knowing the information they’d read was completely made up, the participants who’d had the first set of information said the best firefighters would avoid risks. Those who’d had the second set said they’d embrace it.

The researchers concluded that, even after the “evidence for their beliefs had been refuted, people fail to make appropriate revisions in those beliefs.” And they noted that the participants’ failure of reason was “particularly impressive” since, rather than a broad set of data, they’d had only an anecdotal case to go on (a fictitious one, at that).


Important implications for communicators

Clearly the experiments suggest people are much less rational than we’d like to think when forming attitudes and beliefs.

The New Yorker goes on to speculate how our minds may have evolved to function like this. But this aside, three points jump out at me from the above experiments which have important implications for anyone in the business of communicating and influencing.

1. As the article’s title summarises, the studies suggest that, once people have firm ideas, they’re extremely difficult to shift.

Even armed with robust facts and arguments, we’ll be hard-pressed to change such people’s views – whether we’re trying to persuade them to stop smoking, to vote differently or to try a better product.

That’s not exactly encouraging. But it’s probably a reality we need to face. Understanding this will at least help set our expectations about how much time, effort and investment may be needed to influence people in such cases.

2. The experiments show, on the other hand, that when people don’t have strong preconceived ideas, they’re extremely open to being persuaded.

Look at how easily and firmly the participants formed views about their ability to spot genuine suicide notes and the best characteristics for firefighters.

It’s an encouraging indication of how readily open-minded people may take information on board.

3. The second experiment also shows how powerful anecdotes and personal stories (like Frank the firefighter’s) can be in making a compelling case – often more so than dry facts and figures which have greater statistical validity.

Although this isn’t really rational, it has a feel of common sense about it and has been documented elsewhere. Personal stories paint a mental picture with which we can empathise in a way we just can’t with facts and figures. And our beliefs are galvanised by our emotions, which are triggered more strongly by human interest stories than dry facts.


Truth, clarity & impact

So, for me, the most interesting points from the article are how tough it can be to change ingrained beliefs, but how readily people can be influenced when they’re open-minded – especially through human interest stories.

Of course, we need to use this understanding ethically – to convey truthful information with clarity and impact, not to peddle falsehoods or ‘alternative facts’.


Image credit: Barry Blitt for The New Yorker