Facebook on George Floyd, During the Riots and After the Verdict

The company has taken a harder stance on moderating content since Biden won the election.

Facebook's Oversight Board will soon announce a decision on whether to make former President Donald Trump's ban from the platform permanent.

During the riots that followed George Floyd's death, Trump posted incendiary racist language on Facebook, including the words, "when the looting starts, the shooting starts."

Facebook refused to either take down or moderate that post, while Twitter affixed a warning label to a similar one. Facebook's Oversight Board declined to review the decision to leave the post up, arguing that it was not yet set up to hear cases. Facebook employees staged a virtual walkout to protest the company's inaction. After the outcry, the company started labeling politicians' posts but left the "looting, shooting" post untouched; it remains live on the site. Following the Jan. 6 Capitol riot, Trump's Facebook and Instagram accounts were frozen.

Since Biden won the election, Facebook has taken a harder line on moderating content. Ahead of the verdict in the trial of former Minneapolis police officer Derek Chauvin, Facebook announced in a statement that it would moderate incendiary content. "As we have done in emergency situations in the past, we may also limit the spread of content that our systems predict is likely to violate our Community Standards in the areas of hate speech, graphic violence, and violence and incitement," wrote Monika Bickert, Vice President of Content Policy.

How effective was this? Two days after a jury found Chauvin guilty of Floyd's murder, I used Facebook's CrowdTangle to see which political posts were getting the most likes, reactions, and shares. (This method has limitations, as there is no way to measure how many people see certain posts.) As is often the case, right-wing pundits like Ben Shapiro got the most interactions. There were many posts harshly criticizing Rep. Maxine Waters (D-Calif.) and House Speaker Nancy Pelosi (D-Calif.) for their comments about George Floyd. However, none of these posts appeared to be content that Facebook said it would remove or limit the distribution of. These posts were squarely in the realm of opinion. It's impossible to say with certainty why Facebook looked much less incendiary after the verdict; however, its content moderation policies likely played a role.
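For readers curious about the mechanics, the ranking described above can be sketched in a few lines of Python. This is only an illustration: the CrowdTangle endpoint URL, query parameters, and response field names below are assumptions based on how the public API is commonly described, not verified details, and a real query requires a CrowdTangle account and API token.

```python
# Sketch of ranking posts by total interactions, in the spirit of the
# CrowdTangle check described above. API details are assumptions.
import json
import urllib.parse
import urllib.request

CT_POSTS_URL = "https://api.crowdtangle.com/posts"  # assumed endpoint

def total_interactions(post):
    """Sum likes/reactions, comments, and shares for one post dict.

    Assumes CrowdTangle-style counts under statistics -> actual.
    """
    stats = post.get("statistics", {}).get("actual", {})
    return sum(stats.get(k, 0) for k in ("likeCount", "commentCount", "shareCount"))

def rank_posts(posts):
    """Sort posts by total interactions, highest first."""
    return sorted(posts, key=total_interactions, reverse=True)

def fetch_top_posts(api_token, count=10):
    """Hypothetical API call; parameter names are assumptions."""
    query = urllib.parse.urlencode(
        {"token": api_token, "sortBy": "total_interactions", "count": count}
    )
    with urllib.request.urlopen(f"{CT_POSTS_URL}?{query}") as resp:
        return json.load(resp)["result"]["posts"]
```

The ranking step itself is just a sort on summed engagement counts; the caveat in the paragraph above still applies, since interactions say nothing about how many people merely saw a post.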

Studies have shown that content moderation can effectively throttle certain types of misinformation. However, it is a time-consuming process that users can sometimes evade.

Facebook's Oversight Board will decide soon whether to allow Trump back on the platform. Even if he is allowed back, Facebook would still have the power to moderate his posts, as it did with content after the Chauvin verdict. Now facing antitrust scrutiny from the Biden administration and multiple states, Facebook is likely to moderate content more aggressively than it did for most of 2020.



Elsewhere in the World:

Navalny Ends Prison Hunger Strike, The Moscow Times

How Twitter Became India's COVID E.R., Imaan Sheikh, The Juggernaut

As Outbreak Rages, India Orders Critical Social Media Posts to Be Taken Down, Karan Deep Singh and Paul Mozur, The New York Times

Elsewhere in the United States:

The Slander Industry, Aaron Krolik and Kashmir Hill, The New York Times

One America News Network Stays True to Trump, Rachel Abrams, The New York Times

Lofgren: Capitol Police official being investigated for directions to pursue only 'anti-Trump' protesters Jan. 6, Kyle Cheney, Politico

Top White House cyber official says action taken so far not enough to deter further Russia cyberattacks, Alex Marquardt, Zachary Cohen and Geneva Sands, CNN

How Big Tech got so big: Hundreds of acquisitions, Chris Alcantara, Kevin Schaul, Gerrit De Vynck and Reed Albergotti, The Washington Post