There’s a cartoon I see shared with irritating regularity on my Twitter feed. You might have seen it too. There are two stalls on a street corner – one cheerful vendor is selling ‘comforting lies’, and the queue stretches out of frame with eager customers waiting to buy his wares. The other man sits despondent and bored, offering ‘unpleasant truths’. No one stands at his stall.
Those who chance across this cartoon eagerly share it, keenly imagining themselves as the long-suffering merchant of truth and accuracy, perpetually ignored but dedicated to realism and rationality.
Reverse-Google-Image-Search the cartoon and you can see the broad mix of people identifying with it – a group of scientists talking about animal research, some political activists, a deeply religious Christian website, and an atheist blog. Everybody has a little bit of arrogance in them, including me – when I first saw the cartoon I loved it, and shared it too.
This idea seems to be most readily taken up by people who are worse at reasoning and interrogating reality than the average citizen. You’re not seeing ‘wake up, sheeple’ links posted on Facebook by the people who wield the humility, skill, time and stamina required to interrupt sheeple slumber.
There’s something about this sentiment that everybody relates to, regardless of their religion, their profession or their political leaning. We love to imagine that we creep among a crowd of deluded, dead-eyed sheep, and that we have access to an unfiltered stream of truth and accuracy, and holy shit, isn’t it annoying that people love the comfort of lies, rather than the uncompromising realities that we bravely offer them.
Everyone is bad at reality
We’re terrible at diagnosing and pushing back against the operation of bias in our own minds, but we’re each totally convinced of our correctness.
This has a name: the ‘bias blind spot’, a term coined by American psychologist Emily Pronin. Research on the bias blind spot describes how we fail to recognise that we’re each personally compromised, and that none of us is able to fully inoculate ourselves against cognitive bias.
“Participants in a final study reported their peer’s self-serving attributions regarding test performance to be biased but their own similarly self-serving attributions to be free of bias”
Traits like intelligence, cognitive ability and decision-making skills don’t inoculate you against being supremely shit at spotting your own bias, and they may not even be related to it at all. In fact,
“People more prone to think they are less biased than others are less accurate at evaluating their abilities relative to the abilities of others”
The role of cognitive bias in the population is slowly becoming a thing. People talk about it a bit more, and you’ll hear the words more often. They’re thrown out during Twitter arguments with reckless abandon.
But we don’t often talk about the fact that even spotting this phenomenon at work in ourselves is really, really hard, and that fighting back against it is even harder.
That time I was bad at being rational
Alright, there wasn’t just one time. There have been plenty of times, I’m sure. But for the sake of this piece, I want to talk about one example.
You would’ve heard about the chap who got dragged off a United Airlines plane because the airline was demanding three seats be freed up so they could ferry staff to another job. Which is pretty shocking. The follow up was ugly and unforgivable – a stack of media outlets found out the name of the victim, tied it to a string and dragged it through some mud:
“Dao had surrendered his medical license in February 2005 after being convicted of drug-related offenses, according to documents filed with the Kentucky Board of Medical Licensure last June. Broadcast and print coverage of Dao’s arrest, conviction and sentencing made his name familiar to some Kentuckians”
That miff soon turned to delight as reports started emerging on Twitter that, in their haste, the Courier Journal, and all subsequent re-publishers, had pointed the finger at the wrong David Dao:
God, it was a good story. Really sweet karma. It’s always wonderful when people do something seriously wrong, and then immediately suffer justice. I eagerly retweeted what seemed like strong arguments that there’d been a serious error – but I should have waited.
Turns out media outlets had targeted the right victim of police brutality – they’d checked thoroughly to make sure they were sharing the criminal history of the correct airline passenger who’d had his face slammed into an armrest. American journalists proudly defended their actions and strongly criticised the flurry of misinformation that had spread on Twitter.
This is a representative example of a personal malfunction of rationality. Rushing to conclusions isn’t exactly rare on Twitter, but this is a good example of just how counter-productive it is – the rumour ended up distracting from the original wrongdoing of journalists dragging the name of a bashing victim, partly boosted by my own act of sharing it.
It’s also a good example of where I’d perceived a failure of rationality and evidence-based information in others, but failed to recognise my own haste and the fact that I don’t have the investigative skills required to verify that kind of stuff.
There’s basically no doubt that this has happened many, many times in my life and that I’ve simply not realised that it’s happening. It’s guaranteed – not because I’m a terrible human, but because I have the same model of brain (with variations, of course) that you do, and that everyone else does.
Autoskepticism is useful
Yeah, I just made up that word. Autoskepticism: saving your fiercest and most inquisitive skepticism for your own thoughts and words. You could describe this as being ‘humble’, but that doesn’t do it justice. This isn’t a soft virtue. It’s a hard-headed attempt to draw darker, thicker edges around the shape of reality. You need to interrogate yourself.
A lack of autoskepticism has become a serious issue in society. I was listening to a great episode of the Guardian’s Science Weekly podcast recently, which highlighted that most of us form conclusions based on instinct (as I did with the United Airlines story), and then use reasoning and logic skills to create ultimately false but socially acceptable justifications for our beliefs (see: empirical evidence within the senate crossbench).
It’s what drives people to reject education and science as acts of arrogance – ‘how dare you suggest I’m too stupid to figure this out myself’.
The simple consequence is that as soon as over-confidence overtakes autoskepticism, wrongness starts having consequences. Diseases come hurtling back as vaccination is rejected, the physical systems that sustain our existence are tipped into chaos by carbon emissions, cats and dogs living together, etc.
It’s not an easy message to sell. ‘We’re shit at reality because our brains evolved to find berries, not read Facebook’. Even if you succeed in convincing someone to distrust instinct, there’s a void left gaping by the removal of that fairly huge and serious thing you’ve just taken from them.
Fill it up with the tangible thrill of science: a method of determining the shape of reality that isn’t quite as flawed as guesswork and feelings. Science does use instinct to guide initial exploration – our humanity is a big part of how science works – but it follows up with deduction, which is where we have to admit we fucked up, or we get the glory of being right about stuff. It’s more human than it gets credit for. I think that’s beautiful.
You are nowhere near as good at detecting the truth as you think you are (in fact, the more brazen your confidence, the more likely it is that you suck – which explains the sheeple thing).
Admitting this is liberating: it makes your life more interesting, you understand the world with a clarity that can be confirmed, and people don’t die from diseases and sea-level rise, which is always nice.