Governments all over the world are illegitimate corporations. The pandemic is an elaborate ploy to take control of our lives. COVID vaccines are untested and ineffective and more dangerous than the virus itself.
In recent years, this parade of false and unproven claims has done the rounds in online communities, prompting concern about misinformation and the movements it spawns.
Hardly a week goes by without a new survey showing that an alarming number of people believe in conspiracy theories on issues ranging from pedophilia and climate change to vaccines and the moon landing.
But according to political scientist Paul Kenny, a researcher at ACU’s Institute for Humanities and Social Sciences, such surveys can be unreliable and may inflate the problem of misinformation.
“When you think about some of the wilder conspiracy theories, you’re talking about a very small number of people who actually believe these myths,” says Professor Kenny, who recently wrote about misinformation for The Conversation.
Many surveys suffer from acquiescence bias — the tendency for respondents to agree with research statements or questions even when they don’t reflect their true position.
Quantitative research can be designed differently to reduce this problem, as political scientists Seth Hill and Molly Roberts have demonstrated, but it’s still an imperfect science that can be easily skewed.
Surveys commissioned or conducted by news organisations, moreover, tend to be designed to produce a more newsworthy result.
“You can tilt the balance to favour the response you want, and I suspect that is very commonly done by newspapers and websites to create a more ‘sexed-up’ finding,” says Professor Kenny, whose research focuses on populism and comparative politics.
“If you can get a result that shows that 25 per cent of people believe the moon landing was fake, it’ll get you a nice headline and many more views.”
News stories depicting crowds of demonstrators with a range of spurious grievances might also give the impression that large numbers of people hold conspiratorial beliefs. However, the views of these protestors are often disparate and not always based on false information, and official crowd estimates tend to be much lower than organisers claim.
Protests and surveys aside, Professor Kenny says it’s likely that harmful conspiracy theories are believed by smaller numbers than we’re often told.
So why, then, are authorities so concerned about the spread of misinformation?
Our recent history tells us that the threat doesn’t always come from the number of people who believe in these theories; rather, it’s their commitment to the cause.
“If they’re noisy enough, and if they’re committed enough, it might only take five per cent of the population to believe in myths and that can have great consequences,” Professor Kenny says.
He cites the United States insurgency of early 2021 as one example of the damage that can be done.
“On the political side, we might say, well, these are the people who believed the US election was stolen, and that led them to mount an armed assault on the Capitol. That’s extremely dangerous and something we’d like to avoid.”
Likewise, when it comes to public health, it only takes a small number of people opting out of routine measures to have some serious consequences.
Vaccination is a case in point. Despite widespread agreement in the scientific community that vaccines are safe, a reluctance to immunise had been on the rise even before the pandemic.
As a result, we saw the resurgence of measles in several parts of the world, and the authorities in some countries are still concerned about vaccine hesitancy and the pandemic of the unvaccinated.
While some of the more bizarre conspiracy theories like vaccine microchips and QAnon may seem like distinctly modern phenomena, misinformation actually has a long and colourful history.
Pamphleteers plastered their villages with false news well before the digital age, and even vaccine resistance can be traced as far back as the smallpox outbreaks of the late 18th century.
There’s also the John Birch Society, the ultra-conservative political movement that first arose in the late 1950s and is cited by some as the natural precursor to QAnon.
“They held some wild theories, like the belief that water fluoridation was a communist plot — à la Dr. Strangelove — and that Dwight Eisenhower was a communist agent,” Professor Kenny says.
So what role does the internet play in the spread of fake news? Professor Kenny says it works as an amplifier.
“Fake news has always been there — even in classical Greece and Rome, it was arguably just as much of a problem as it is now — so the idea that these are modern phenomena largely generated by the internet simply doesn’t hold up.”
What the internet does is provide a steroid boost to a small number of myths. While most stories stay local and fade away, a few become runaway successes.
So why do some people fall for false information and conspiracy theories? Are we simply too gullible?
Do a quick online search on the spread of misinformation and you’ll likely come across articles professing that gullibility is the main problem.
Ironically, the idea that people are easily duped might itself be a myth. As cognitive psychologist Hugo Mercier has argued, we’re usually quite skilled at figuring out who to trust and what to believe.
“If anything,” Mercier writes in his book, Not Born Yesterday, “we’re too hard rather than too easy to influence.”
According to this theory, which Professor Kenny agrees with, gullibility is a far less prevalent and important trait than many of us think.
Professor Kenny argues that irrationality plays some part in allowing misinformation to slip through, drawing on the work of Nobel Prize-winning economist Daniel Kahneman, who showed that cognitive errors and mental shortcuts lead to poor decision-making.
These errors of thinking are likely at play in phenomena like vaccine refusal. In what is known as “omission bias”, a person might downplay the risk of being unvaccinated, and at the same time overplay the much less significant risk of a serious adverse reaction to a vaccine.
“It’s very hard for people with low levels of cognitive reasoning to evaluate these kinds of things, and that makes them more susceptible to misinformation,” Professor Kenny says.
“They will struggle more to evaluate information directly themselves, so they’ll rely on group cues and other sources of information to make their decisions.”
And this, he adds, is perhaps the most important element in the spread of fake news. In the so-called “misinformation age”, a feeling of belonging is stronger than facts.
As Hugo Mercier has argued, even though we are rationally sceptical, we can sometimes be persuaded to believe a lie.
“People want to be part of a group, to belong, and that is so important that they’ll often follow the cues of their leader and the people they trust,” Professor Kenny says.
“This is something that Mercier really emphasises, that we’re a deeply social species and we’ll take these social cues because we want to be part of a group, even when we haven’t critically evaluated the information ourselves.”
This could apply equally to the devoted Republicans who swallowed the lie that the election was stolen, and the vaccine avoiders who trusted their wellness guru over the world’s leading scientists.
So what do we do if a friend or loved one goes down a deep hole of misinformation? Can we claw them back?
The good news is that the “backfire effect” (an influential theory which contends that confronting people with information contrary to their beliefs only sends them further down the hole) appears to be mostly a myth.
People can, and do, change their minds. That said, you should still choose your battles wisely.
“If it’s something trivial that you’ll argue about for hours without any chance of convincing the person they’re misinformed, you should probably ask yourself if it really matters,” Professor Kenny says.
“But when it comes to something important, something that can affect the person’s life and health and you really want to convince them, you might decide it’s worth the effort to persuade that person.”
In this case, you should most definitely avoid telling them they’re stupid.
“You don’t begin by berating and telling them that they’re wrong — you act with empathy and say, ‘I share your values, we are on the same team’,” he says.
“If people can trust you and see that you share similar values across other things, then you can eventually move to the issue you want to talk about and present them with clear, accurate information. Eventually, over time, even those who have been taken in really deep can change their views.”
Professor Paul Kenny is a researcher at the Institute for Humanities and Social Sciences, specialising in political economy and comparative politics. He is the author of two books: Populism and Patronage, which won the American Political Science Association’s 2018 Robert A. Dahl Award, and Populism in Southeast Asia. His latest book, Why Populism?, is due to be published in 2022.