One Year On, COVID Has Been a Disinformation Nightmare

Screen time, isolation, and social media algorithms made for a toxic mix.

One year ago, in March 2020, Americans were faced with an entirely new situation -- the COVID-19 pandemic and lockdowns across the country. People faced a barrage of decisions affecting every facet of their lives, big and small. Many were figuring out for the first time how to work from home on Zoom or file for unemployment insurance, how to engage their kids with remote schooling or find an alternative to it, how to pay the bills at a shuttered small business or how to close it, or how to pay the rent or mortgage when income had disappeared. Questions abounded. Where to find a mask? Was it safe to see family or friends? Was it okay to hug someone? Was it safe to go to the grocery store?

Enter disinformation. The Internet paved the way for conspiracy theories that spread like wildfire in this new world. Here are four reasons why.

COVID-19 Was a New Phenomenon, Allowing Disinformation to Thrive

Virtually no one had any experience living through a global pandemic. Various ideas about how to cure COVID emerged, many of which were lies. It could be killed by sunlight (false). Drinking chloroquine could treat COVID (false). Cocaine and marijuana could treat COVID (false). These are just a few examples; the fact-checking website PolitiFact ran 800 fact-checks on the coronavirus, and over 60 percent of the claims checked were rated false or "pants on fire," meaning they had no truth to them at all.

While some of these were silly, others had disastrous real-world consequences. Social media posts falsely claimed that wearing masks was more harmful to your health than not wearing them. According to a July 2020 Gallup survey, 18 percent of Americans "never" or "rarely" wore a mask when outside the home, facilitating the spread of the coronavirus.

Disinformation Came From The Top

President Donald Trump was a big megaphone for denial about COVID as well as for many quack cures. After Trump hailed chloroquine as a potential "game changer" for treating COVID, a man died and his wife was hospitalized from ingesting a fish-tank additive containing chloroquine. Trump also suggested that injecting disinfectant could treat COVID, and Maryland's poison control center received hundreds of calls inquiring about bleach and other disinfectants. Trump repeatedly claimed publicly that COVID was no worse than the flu, contradicting public health experts, scientists, and even his own private statements. This rhetoric tapped into the powerful urge to believe that the disease was not as bad as it was. Optimism bias, the human tendency to believe that bad things won't happen to oneself, also helped denial spread.

People Were Looking at Screens a Lot

People were spending less time commuting, attending school, and socializing with friends, coworkers, and extended family. Average daily screen time increased by more than three hours, to over 13 hours, in March 2020, according to one Nielsen report. This increase gave conspiracy theories more eyeballs. According to U.K. researchers, users who got their information from social media platforms like Facebook and YouTube were more likely to believe conspiracy theories about COVID.

Social Media Platforms Facilitated the Spread of Conspiracy Theories

On May 4, 2020, a 26-minute video titled "Plandemic" appeared on YouTube, featuring a discredited virologist, Judy Mikovits, and advancing an array of conspiracy theories about the coronavirus, including attacks on Dr. Anthony Fauci. The video made a number of false claims, such as that masks were harmful, that the virus was "manipulated," and that the flu vaccine increased the odds of getting COVID. The video was slick -- it was professional quality, the speaker was a seemingly credible authority figure, and it fed people's need for answers about a confusing new reality. It racked up over 8 million views before YouTube took it down.

According to a study published in the Harvard Misinformation Review, "stories reinforcing conspiracy theories had a higher virality than neutral or debunking stories." Content moderation -- when platforms take down disinformation or throttle its virality -- had a "significant mitigating effect" on the spread of such conspiracy theories, but a "large number" of conspiracy theory posts remained unmoderated. Facebook's and YouTube's algorithms spread these conspiracy theories easily, and posts were often taken down only after they had caused harm.

What's Next?

The extraordinary disinformation campaigns about COVID laid the groundwork for vaccine skepticism. Over 20 percent of Americans say they are hesitant about getting a vaccine, and according to another poll, about half of Republican men have no plans to get one. Disinformation about COVID has had public health consequences that will outlast the pandemic itself.


ELSEWHERE IN THE UNITED STATES:

How Russia Got Americans to Do Its Dirty Work, The Atlantic

A Hacker Got All My Texts for $16, Vice

Amazon Is Pushing Readers Down A "Rabbit Hole" Of Conspiracy Theories About The Coronavirus, BuzzFeed

The Cop Who Said The Spa Shooter Had A "Bad Day" Previously Posted A Racist Shirt Blaming China For The Pandemic, BuzzFeed

Platforms vs. PhDs: How tech giants court and crush the people who study them, Protocol

MAGA voters discovered a new home online. But it isn't what it seems., Politico 

ELSEWHERE IN THE WORLD:

Non-English Editions of Wikipedia Have a Misinformation Problem, Slate

TikTok is repeating Facebook’s mistakes in Myanmar, Rest of World

Apple Bent the Rules for Russia—and Other Countries Will Take Note, Wired