Truth-rating will amplify our worst instincts and bury our best
Journalism is the material filter through which we soak up the outcomes of scientific venture. The way it works matters a lot.
Though trust in science media hasn’t been recently surveyed in Australia, a 2017 Pew study looked at the US and found that news outlets are the public’s dominant source of scientific information. It also found that 41% of respondents think the news media do a bad job covering science:
“When pressed to choose, nearly three-quarters of the public (73%) says the way the news media cover scientific research is a bigger problem than how researchers publish and share their findings (24%)”
Though a thin majority think science media do well, this isn’t exactly a glowing review. This distrust of US science reporting is reflected in declining trust in US media outlets more broadly.
This comes at a time when scientific issues are poisoned by intense partisanship, and as a consequence, straightforward facts are obscured, polluted and devalued in favour of tribal warfare.
Climate change, genetic modification, vaccination and energy technology have all been subject to this destructive force. The degradation of the material through which we soak up science is a problem, and the world’s greatest problem solver has come up with a really, really, really bad solution.
The concept
In response to a series of investigations published about worker safety at Tesla’s US factories, Musk tweeted a half-exasperated, half-sincere journalist / media outlet ratings and reputation site idea – Yelp for journalists, where readers assess the truthfulness of an article, and a ‘credibility score’ is assigned to that journalist as a result of these ratings:
Confusingly, Musk implied it was a joke, but he’d also registered ‘Pravda’ as a business (and had to nab ‘Pravduh.com’ instead of the already-nabbed pravda.com).
Even more confusingly, ‘Pravda’ was the official newspaper of the Russian communist party. Some took this as a sign it really was a dry joke; others took the name as an implication that existing media outlets are propaganda.
Musk has been tweeting with increasing frequency, including sharing a defense of his concept by retweeting an article linked to a dangerous cult and, weirdly, attacking Australian nanotechnology researcher Upulie Divisekera. He created a Twitter poll with some skewed options, in that same tone of half-serious, half-sardonic anxious exasperation:
There’s obviously an element of never-tweet-angry in this flurry. But I also suspect this site will become a real thing at some point. Musk is still tweeting about it, and has even implied that the site will use ‘the submission of concrete evidence’ and ‘a PageRank-like algorithm to recursively weigh voters based on their own reputation’ (PageRank is the algorithm behind Google’s search rankings, which scores a page based on the number and reputation of the pages linking to it).
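To make that phrase concrete, here’s a minimal sketch of what ‘recursively weighing voters based on their own reputation’ could look like. Everything here – the function name, the endorsement graph, the parameters – is my assumption; Musk has described no actual design.

```python
def reputation_scores(endorsements, damping=0.85, iterations=50):
    """Toy PageRank over a voter graph.

    endorsements: dict mapping each voter to the voters they endorse
    (e.g. by consistently upvoting their ratings). Purely hypothetical.
    """
    voters = list(endorsements)
    n = len(voters)
    scores = {v: 1.0 / n for v in voters}
    for _ in range(iterations):
        new = {v: (1 - damping) / n for v in voters}
        for voter, endorsed in endorsements.items():
            if endorsed:
                share = damping * scores[voter] / len(endorsed)
                for e in endorsed:
                    new[e] += share
            else:
                # a voter who endorses no one spreads their weight evenly
                for v in voters:
                    new[v] += damping * scores[voter] / n
        scores = new
    return scores

# 'b' is endorsed by three voters, so 'b' ends up with the most weight –
# and anyone 'b' endorses inherits some of that reputation in turn.
graph = {"a": ["b"], "b": ["c"], "c": ["b"], "d": ["b"]}
ranks = reputation_scores(graph)
```

Note what this buys and what it doesn’t: it dilutes the influence of throwaway accounts, but a coordinated bloc that endorses each other still bootstraps its own reputation – the exact failure mode the rest of this piece is about.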
It’s worth noting that this concept has been tried before. The systemic campaign of abuse, harassment and threats called ‘Gamergate’ spawned a site called Deep Freeze, in which gaming journalists are tagged with ratings alongside statements criticising their work, most of which relate to interpersonal relationships between the people on the site’s hit-list. It declares that it “strives, whenever possible, for maximum objectivity — supplying factual information so that readers can form an opinion on their own”.
The idea that you can create a set-and-forget public information rating service is testament to a very serious and significant misunderstanding of human nature. Gamergate’s shocking legacy of abuse is mirrored in Musk’s criticism of specific journalists, mostly women, who subsequently received a torrent of serious abuse, including threats of violence.
The problems of trust, information fakery and the strength of cultural and ideological sentiment in modern discourse are self-sustaining. Gamergate’s analogous effort to tag journalists who speak out of turn about misogyny and sexism in gaming says more about people, and our consumption of information, than it does about real issues with modern journalism.
This vengeful concept mirrors and amplifies the parts of human nature that nurture confirmation and eradicate contradiction.
Musk is right – the public does care about truth. But that doesn’t make clumps of us well suited to figuring out truth, which is hard. One of our best tools for countering the biases in our thinking is using the scientific method to counter our instincts with care, review and evidence.
Ironically, Pravduh will be particularly harmful for scientifically contentious issues, in which the impacts and consequences of inaction and misinformation are increasing, as trust in those sharing information declines.
Opt-in ratings only represent those with intense pre-existing feelings
Perhaps the most obvious problem with ‘Pravduh’ is that most people compelled to vote are people who’ve already made up their minds about the truthfulness of an article.
This happens with a lot of different opt-in rating systems. Consider the recent remake of Ghostbusters on the Internet Movie Database (IMDB):
5.3/10 might suggest it isn’t a particularly great film, but the distribution of the votes reveals something of the public maelstrom that surrounded the film:
Instead of an even distribution (compare it to Thor: Ragnarok), Ghostbusters was bombed with 1s, and later, a bunch of 10s in response. For phenomena that serve as a proxy for some long-running ideological war (in this case, white identity politics blended with real, raw misogyny), review sites are pseudo-democratic release valves and confirmation bias engines for groups of people with a stupid battle to wage.
Imagine a media article covering the science of solar power, or new studies on climate change. Musk’s review site would be a lightning rod for precisely the same culture wars, inflicted on a range of proxies. Climate change, in particular, has become a signifier of political allegiance.
Consider the IMDB ratings of ‘An Inconvenient Sequel: Truth to Power’, a follow-up to Al Gore’s 2006 documentary on climate change. Though the film isn’t a journalistic report or a scientific analysis, the skew is telling; nearly half of all ratings are 1s or 10s.
The important point here is that we can draw zero conclusions about the film’s quality from the skewed emotions of (mostly) men looking to defend the shape of their reality.
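The numbers below are invented, but they illustrate why a single summary score hides a review-bombing entirely: two vote distributions with nearly identical averages can describe completely different situations.

```python
# Invented numbers, for illustration: votes[star rating] = vote count.
contested = {1: 450, 5: 100, 10: 450}                       # review-bombed
ordinary = {1: 10, 4: 200, 5: 380, 6: 200, 7: 200, 10: 10}  # genuinely middling

def mean_rating(votes):
    """The single score a site like IMDB would display."""
    total = sum(votes.values())
    return sum(star * count for star, count in votes.items()) / total

def polarisation(votes):
    """Share of votes sitting at the extremes (1s and 10s)."""
    total = sum(votes.values())
    return (votes.get(1, 0) + votes.get(10, 0)) / total

# Nearly the same average (5.45 vs 5.41), wildly different shapes:
# 90% of 'contested' votes are 1s or 10s, versus 2% of 'ordinary'.
```

A truth-rating site that surfaces only the headline number would render a culture-war pile-on indistinguishable from a considered middling verdict.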
These single-click acts of vengeance, aimed at pieces of information that wound a belief, are valuable to the practitioners of ideological culture wars. It doesn’t matter whether people use their real name, or whether you protect against botnets. The harm caused by this isn’t related to anonymity or computerised fraud. It’s us.
Will Musk’s site protect against organised review bombing? If it does, does that mean he also doesn’t trust the public as arbiters of ‘truth’? Where will Pravduh draw the line between collective action from a community and the intensified expression of some greater cultural and political fury, channelled into some specific clipping of science media?
Who judges the evidence?
As mentioned above, Musk implied that the site wouldn’t be quite as simple as, say, IMDB – there’d be a process of gate-keeping of ratings, where evidence is needed to rate the truthfulness of an article.
Evidence needs careful assessment – plenty of people with access to reams of evidence will wield it to prove their point, rather than discover it. Arguably, this is the core of pseudoscience: not the rejection of evidence, but the misuse of it.
A 2017 PNAS study found that:
“..education, whether measured in terms of general educational attainment, science educational attainment, or science literacy scores, may increase rather than decrease polarization on issues linked to political or religious identity”
It gels with other research showing that individuals with higher levels of scientific literacy are, counter-intuitively, more likely to hold polarised views on scientific issues (though the inverse is true for young people).
This is something you’ll have experienced directly if you’ve quarrelled with a climate denier on social media. For every argument you bring up, they’ll shoot back with jargon, messy graphs, scientific terms and a plethora of hyperlinks. It’s all wrong, but the process of comprehensively disproving their Gish-Gallop rambling is lengthy and wasteful.
Who judges the evidence submitted to this site? Will there be teams of experts on hand, who all have the experience required to look past the mere proffering of a paper and judge context, nuance and broader scientific consensus? Anything less than a massive undertaking to carefully gauge submissions will result in Pravduh being a glowing red target for pseudoscientific campaigners.
What about when the science conflicts with the rating?
There will be a searing tension between the truth ratings of the public and the outputs of scientific inquiry. Imagine a media report accurately describing the safety and efficacy of vaccines, rated as false through the genuine beliefs of a large group of people who object to vaccination. Who’s to say those beliefs, and the subsequent ratings, aren’t genuinely held? What if a few passionate anti-vaccination activists vote, and no one else bothers?
Even if you were to get a truly random, representative sample of the population, there’s still a gap between scientific consensus and the general public on contentious and important issues, like climate change. Musk himself is keenly aware of the scientific consensus on this:
But in Australia and America, there is a large (albeit, slowly decreasing) gap between that consensus and the views of the public:
Even if we discount groups that exist to deny the science, what happens when people rate an article that is probably true as ‘probably wrong’? Do the site administrators step in and deny the public’s version of ‘truth’? Or do they leave it alone, allowing the rating itself to serve as further confirmation of a blatant falsehood?
Is the rating process transparent?
The hastily-deleted tweet in which Musk pointed his ~21m followers to a website run by the cult Nxivm is telling. ‘The Knife Media’ saw a big (and sustained) increase in relative Google search interest after Musk’s tweet:
“It’s a fake news site that is, itself, fake. It’s staffed by people that we could not prove existed in the first place,” Brock Wilbur wrote in Paste Magazine, one year ago, of its previous iteration, ‘The Knife of Aristotle’. The site mimics analytical language and the tone of investigative journalism (sort of like fanboys on Twitter mis-applying logical fallacies), but the methodology the site uses to ‘rate’ articles is a total mystery.
“We run each news story through an analysis process that measures bias and assigns objective ratings for Spin, Slant and Logic. Then we combine those ratings into a single score so you can see the integrity of each news outlet’s coverage”
That’s about as much information as you get. If Pravduh’s loaded to the gills with algorithms drawing in submissions, ratings, evidence and comments, and spitting out a reputation score, will that be transparent? It has to be, if it’s to be trustworthy. A mysterious computerised force pulling the levers of information delivery is very Facebook – a company Musk isn’t particularly keen on.
There’s still a decent chance this thing won’t materialise, or it materialises but fails to be used by anyone other than dedicated fans. But much of my long-running adoration of Elon Musk and his various businesses has been the combination of vision and reality. I think this’ll be a real thing.
In July last year, I wrote a strong defense of Tesla’s brand new battery system, installed in South Australia and attached to a wind farm. Much of the criticism of the battery consisted of catastrophic misunderstandings of the technology, but stemmed from an effort to understand new technology through the lens of political tribes – a problem that’s plagued Australia’s decarbonisation efforts for decades now. I can’t see how a truth-rating deathmatch would have done anything other than widen the gulf in that situation.
Much of America’s growing distrust of media is fuelled by political polarisation, with media distrust following party lines. It isn’t, as Musk contends, arrogance and elitism expressed by journalists. We might seem better off, in Australia, but our media landscape is far more concentrated and far less diverse, and we have plenty of very serious problems around the reporting of science in Australia. We have zero room for complacency.
There are plenty of ways to chip away at the problem. Rigorous fact-checking and slow, deliberative analysis help a lot, as does well-funded, publicly owned journalism like Australia’s ABC, which remains well trusted by voters on all sides of politics.
Facebook’s newsfeed algorithm has already demonstrated that the signals we send out can become a damaging feedback loop when those signals are amplified and repeated back to us. We curate our friendships and like pages and Facebook serves that inferred worldview back to us in the shape of something pretending to be an unbiased representation of reality. We nurture confirmation and eradicate contradiction – Facebook prefers us stuck in that feedback loop.
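The mechanics of that loop are simple enough to sketch. In this toy model (my construction, not Facebook’s actual algorithm), the feed always serves the topic with the strongest engagement signal, and each click strengthens that signal further:

```python
# A tiny initial lean – one extra click on politics – is all it takes.
topics = {"politics": 2, "science": 1, "sport": 1}

history = []
for _ in range(100):
    shown = max(topics, key=topics.get)  # serve the strongest signal
    history.append(shown)
    topics[shown] += 1                   # the click feeds back into the ranking

# After 100 rounds, every impression was politics: the loop locked in
# immediately, and science and sport never got shown again.
```

A real recommender is vastly more complicated, but the qualitative behaviour – amplifying whatever signal is already strongest, then feeding the result back in – is the same dynamic.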
Horrifyingly, Facebook’s resolution to its misinformation woes is, at its core, the same as Musk’s – a democratic assignment of credibility by the public. “We decided that having the community determine which sources are broadly trusted would be most objective,” wrote Zuck.
“It’s riggable. Like you could see people just flooding votes to different things, to sites. You could see it being abused. It’s one of these things like, ‘Let’s vote what’s good’,” said Recode’s Kara Swisher, in response. Doesn’t matter. It’s already started.
For scientific issues polluted by ideology, culture and politics, a journalist and media rating site would quickly become the same dangerous unstable signal amplifier; analogous to Facebook’s corrosive misunderstandings of human nature. Getting to the truth hurts – it’s never smooth, it’s jagged, slow, boring, painful and gradual. If we hope to cure the ills of science media distrust, we need patience, and we definitely don’t need Pravduh.