Finding a Cure for a Pandemic of Misinformation
Stories that caught our attention
As the novel coronavirus continues spreading across the physical world, scientific misinformation about it is proliferating in the online world. With nearly six million cases of COVID-19 confirmed worldwide and still so little known about the virus, health experts are warning of the parallel dangers of online misinformation and conspiracy theories.
“The United Nations secretary-general has warned we’re living through ‘a pandemic of misinformation,’ and the head of the World Health Organization said it’s an ‘infodemic,’” Kaleigh Rogers reported in FiveThirtyEight.
With Americans’ internet and social media usage at all-time highs, it is easier than ever to find or be exposed to false content about the origins of the coronavirus and purported cures. “The odd thing about reporting on the coronavirus is that the nonexperts are supremely confident in their predictions, while epidemiologists keep telling me that they don’t really know much at all,” Nicholas Kristof wrote in the New York Times.
A recent Gallup / Knight Foundation survey revealed deep concern about misinformation – 78% of American adults think false or inaccurate information about the coronavirus is a major problem. Let’s take a look at what makes misinformation so contagious – and what can be done to stop it.
Viral Spread of Misinformation
Whenever there is a vacuum of information, misinformation will find a way to fill it. As Renée DiResta, technical research manager at the Stanford Internet Observatory, wrote in the Atlantic, “The posts that reach people on Facebook, YouTube, and other platforms aren’t those with the most reliable information; they’re the ones that have the most compelling memes, get the most likes, or are shared by influencers with large audiences.”
That means that controversial Facebook post by your relative who believes in a homemade remedy for COVID-19 may be reaching more eyeballs than the evidence-based post by your local public health department — simply because it sparks a passionate discussion online.
To make matters worse, people are hungry for information about COVID-19, but scientific research on any disease requires time, rigor, and patience. Even with experts around the world working at breakneck speeds to study the coronavirus, it may be months or years before we understand herd immunity thresholds or have a vaccine. Much of the preliminary research “is locked in journals, while bloggers produce search-engine-optimized, Pinterest-ready posts offering up their personal viewpoint as medical fact,” DiResta wrote.
A key example of this is the small but loud online community of antivaccine activists. A George Washington University study recently published in Nature compared anti- and pro-vaccine groups on Facebook and found that the former, though small, has a “worryingly effective and far-reaching” online communications strategy. In fact, the strategy is so effective that the researchers predict that, without intervention, support for anti-vaccination views will dominate in about a decade.
Until the expert institutions adapt to modern means of communication . . . a platform such as Facebook will have nothing compelling from them to show users.
—Renée DiResta, Stanford Internet Observatory
In an interview with NPR’s Ailsa Chang, the study’s lead author, Neil Johnson, a physics professor at George Washington University, said the anti-vaccination messengers make their ideas persuasive to people online by hitting a variety of hot-button topics. “They may be mixing distrust of vaccines with distrust of big pharma, or we don’t like government control, or we want freedom of choice for our kids . . . so that each one of these communities has its own nuanced flavor.” By amplifying diverse messages that appeal to many different concerns and beliefs, the antivaccine community manages to become highly entangled with undecided clusters of people.
In contrast, the pro-vaccine community remains largely peripheral and sticks to the singular message that vaccines work and save lives. It does not engage with undecided people the way that the antivaccine community does, nor does it tailor its message to the concerns of those who are undecided.
The Allure of Conspiracy Theories
A recent video posted to Facebook, YouTube, and other websites illustrates the efficiency with which the antivaccine community spreads its messages. “Plandemic,” a 26-minute video that espouses a false anti-vaccination conspiracy theory related to the coronavirus, was posted online in early May.
“Just over a week after ‘Plandemic’ was released, it had been viewed more than eight million times on YouTube, Facebook, Twitter, and Instagram, and had generated countless other posts,” the New York Times reported. (By contrast, the Pentagon posted video footage of unidentified flying objects around the same time and saw one million interactions in two weeks.)
The reason the video is so pernicious, according to Stephan Lewandowsky, PhD, a cognitive psychologist at the University of Bristol, is that “the absence of evidence is twisted to be seen as evidence for the theory. . . . That’s the opposite of rational thinking,” he told Marshall Allen in ProPublica.
Though the video has been banned by Facebook, YouTube, and Twitter on the grounds that it contains misinformation, it is still being uploaded to video-sharing sites and shared by the conspiracy theory group QAnon, as well as by social media influencers and regular users.
Just the Facts, Please
Social media is not the lone purveyor of misinformation. A recent survey conducted by Gallup / Knight Foundation asked American adults to name the two most common sources of misinformation about the coronavirus. A combined 68% of respondents named social media, 54% named the Trump administration, and 45% named mainstream national news.
As concerned citizens, how can we balance our needs for speed and accuracy when it comes to coronavirus updates?
Allen, an investigative health care reporter for ProPublica, keeps a checklist of questions to assess the credibility of sources and information:
- Is the presentation one-sided?
- Is there an independent pursuit of the truth?
- Is there a careful adherence to the facts?
- Are those accused allowed to respond?
- Are all sources named and cited, and if not, is the reason explained?
- Does the work claim some secret knowledge?
He used this checklist to assess “Plandemic” and concluded that “there were so many conspiratorial details stacked on top of each other in the film I couldn’t keep them straight.”
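Allen's checklist can be read as a simple screening procedure: each question has an answer that should raise a red flag. The sketch below expresses that idea in Python; the question keys, data structure, and yes/no framing are illustrative assumptions, not ProPublica's actual methodology.

```python
# Illustrative sketch of a credibility checklist as a screening function.
# Each entry: (key, question, bad_answer) -- bad_answer is the response
# that counts as a red flag. Keys and structure are hypothetical.
CHECKLIST = [
    ("one_sided", "Is the presentation one-sided?", True),
    ("independent_pursuit", "Is there an independent pursuit of the truth?", False),
    ("adheres_to_facts", "Is there a careful adherence to the facts?", False),
    ("accused_respond", "Are those accused allowed to respond?", False),
    ("sources_cited", "Are all sources named and cited, or absences explained?", False),
    ("secret_knowledge", "Does the work claim some secret knowledge?", True),
]

def red_flags(answers: dict) -> list:
    """Return the checklist questions whose answers raise a red flag."""
    flags = []
    for key, question, bad_answer in CHECKLIST:
        if answers.get(key) == bad_answer:
            flags.append(question)
    return flags

# Hypothetical answers a viewer might record after watching "Plandemic":
# one-sided, no independent pursuit of truth, loose with facts, the accused
# never respond, sources go uncited, and secret knowledge is claimed.
plandemic_answers = {
    "one_sided": True,
    "independent_pursuit": False,
    "adheres_to_facts": False,
    "accused_respond": False,
    "sources_cited": False,
    "secret_knowledge": True,
}
print(len(red_flags(plandemic_answers)))  # every question raises a flag
```

The point of the exercise is not automation but discipline: answering each question explicitly, rather than relying on a gut impression, makes it harder for a slick presentation to slip past.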
Once you’ve fact-checked information, you may find yourself in the awkward position of needing to gently correct family, friends, or acquaintances. This is tricky but not impossible.
How to Share Hard Evidence
First, approach the conversation with empathy and assume best intentions — people are sharing all types of information because they are scared, uncertain, or curious. “It’s always important to respond in a way that doesn’t suggest that the other [person] is foolish, naive, or gullible, as much as you think they may be,” psychologist Joshua Coleman, PhD, told Joe Pinsker in the Atlantic. “Rather than saying, ‘I can’t believe that you fall for this crap!’ better to say, ‘I have heard others talk about that as well. And I agree . . . there’s so much information out there, it can be hard to know what to believe.’”
Second, don’t repeat the misinformation — don’t share that article or video you’re trying to debunk! Doing so can actually reinforce the misinformation. “There’s research to suggest that the very act of seeing a headline, even if it’s notated as false by the platform or by a fact-checker, can still contribute to people believing its claim,” Miles Parks reported for NPR.
Finally, avoid lecturing or insulting the person. Ask them for the source(s) of the information and offer to share sources you trust in return. Doing so “can point out very gently that sources matter, and that their source might not be reliable, especially if they have to repeat out loud that their information came from something like ‘Conservative Eagle News Punch,’” Joseph Uscinski, PhD, an associate professor at the University of Miami who specializes in the study of conspiracy theories, told Pinsker.
Individuals cannot work alone to combat misinformation. Expert institutions have a key role — and responsibility — to circulate accurate information. This requires reaching people where they are spending their time. “Until the expert institutions adapt to modern means of communication — which they must do, and quickly, if they are to regain public confidence — a platform such as Facebook will have nothing compelling from them to show users,” DiResta wrote.