Is Facebook Why We're Polarized?

The platform's algorithms have come under fire for appearing to promote misinformation.

Do Facebook's algorithms promote misinformation, resulting in political polarization?

This was the central line of questioning by Democratic lawmakers at Thursday's House Energy and Commerce Committee hearing. In his opening remarks, Facebook CEO Mark Zuckerberg said that he didn't believe social networks were primarily to blame for polarization. He testified: "I believe that the division we see today is primarily the result of a political and media environment that drives Americans apart."

Democratic lawmakers did not like his answer. Chairman Frank Pallone (D-N.J.) said, "You definitely give the impression that you are not actively, in any way promoting this misinformation and extremism, and I totally disagree with that. You're not passive bystanders." 

So which is it? Is Facebook merely reflecting the intense division in America, or is it an active participant in polarization?

It's true that if Facebook ended tomorrow, misinformation would not vanish, and American politics would still be deeply polarized. However, studies show that Facebook algorithms promote misinformation, contributing to polarization.

Karen Hao at MIT Technology Review reported, "A former Facebook AI researcher who joined in 2018 says he and his team conducted 'study after study' confirming the same basic idea: models that maximize engagement increase polarization." The same researcher told Hao, "Over time they [users] measurably become more polarized."

Content moderation can mitigate the spread of misinformation. However, Hao reported that content-moderation models struggle to catch new forms of misinformation because it is constantly evolving: purveyors can evade detection by misspelling key words or using code words.

Consider an NYU study that Rep. Debbie Dingell (D-Mich.) cited in the hearing. It found that far-right misinformation sources outperformed non-misinformation sources in engagement (likes, shares, and reactions), with far-right misinformation drawing 65% more engagement than other right-wing pages. It also found that center and left partisan pages saw a measurable decline in engagement when they posted misinformation, while far-right pages did not.

Answering a question about the NYU study, Zuckerberg responded, "People don’t want to see misinformation or divisive content on our services. People don’t want to see clickbait and things like that," and repeated that misinformation wasn't good for business.

The NYU researchers said their findings were limited by the fact that they could only measure engagement. The researchers also said that there wasn't a way to look at reach — how many users actually see misinformation — which Facebook argues is a more accurate representation of the spread of different types of content.

Disclosing how many people see each post would be a good place to start to answer the question of how much Facebook contributes to polarization. It's often not clear whether misinformation is spreading within smaller groups or more broadly on the platform.


ELSEWHERE IN THE WORLD:

Report: seven members of Germany's Bundestag and 31 members of its state parliaments were targeted in a phishing attack allegedly originating from Russia's GRU, DW

Chernobyl, SpaceX, A Ukraine Lobbyist, And A Skyrocketing Stock Price, Todd Prince, Liubomyra Remazhevska, Georgiy Shabaev / RFE/RL

'He cut my underwear. Then he did what he did': They wanted democracy. Instead they say they were beaten and raped by police, Nick Paton Walsh, Sebastian Shukla, Christian Streib and Denis Lapin / CNN

ELSEWHERE IN THE UNITED STATES:

On Google Podcasts, a Buffet of Hate, Reggie Ugwu / The New York Times

Artists are tapping into NFT marketplaces, which could pay them thousands for their work, but are running into scams, environmental concerns, and crypto hype, Abby Ohlheiser / MIT Technology Review

Google said it had shut down a hacking operation in January but, sources say, did not disclose that it was an active counter-terrorism operation by a US ally, Patrick Howell O'Neill / MIT Technology Review

Analysis of Facebook, Apple, Google, Amazon US lobbying: $124M spent on lobbying and campaign donations during 2020 cycle; Facebook and Google are top spenders, Public Citizen

The mess at Medium, Casey Newton / The Verge