Facebook's Polarizing Effects Are Truly Global

The algorithm pushes divisive and angry content everywhere, and not just in the polarized United States.

Facebook has maintained that its platform has not been primarily to blame for political polarization. In March, Facebook CEO Mark Zuckerberg testified to Congress, "I believe the division we see today is primarily the result of a political and media environment that drives Americans apart."

Last week, a new series by the Wall Street Journal revealed that a 2018 algorithm change meant to prioritize what the company called "meaningful social interaction" has had the effect of prioritizing divisive and angry content. Facebook's algorithm was not merely reflecting the polarized stories of an angry populace; it placed polarizing stories at the top of users' feeds. Thanks to the new algorithm, inflammatory posts that drew scores of comments and shares rose to the top of users' News Feeds. According to documents released by the Journal, Zuckerberg resisted changes out of concern that they would hurt Facebook's other goal of keeping people on the platform longer, which drives profits.

Facebook employees warned that political parties across the globe had told them about the effects of the algorithm change. "Many parties, including those that have shifted to the negative, worry about the long term effects on democracy," read one internal Facebook document leaked to the Journal. Referring to an unnamed Polish political party, Facebook researchers wrote in 2019: "One party's social media management team estimates that they have shifted the proportion of their posts from 50/50 positive/negative to 80% negative, explicitly as a function of the change to the algorithm." Facebook researchers wrote of Spanish political parties, "They have learnt that harsh attacks on their opponents net the highest engagement...they claim that they 'try not to,' but ultimately 'you use what works.'" Facebook researchers heard similar complaints from parties in India and Taiwan.

The Journal's reporting is important for two reasons. First, it confirms that Facebook's algorithm polarizes debate in other countries, not just the United States. If the American political and media environment primarily drove polarization, as Zuckerberg claimed, then Facebook's algorithm might work differently in other countries. According to Facebook's internal research, it worked similarly in countries with different political systems and media environments across the globe. Second, it confirms that Facebook's algorithm indeed prioritizes divisive content.


Outside research has confirmed that posts attacking political opponents get more shares. In 2020, psychology researchers Steve Rathje, Sander van der Linden, and Jay Van Bavel, of Cambridge and New York University, analyzed 2.7 million Facebook and Twitter posts from news media and members of Congress. They found that the effect of posts using out-group language was "4.8 times as big as that of negative affect language and 6.7 times as big as that of moral-emotional language." In other words, posts dunking on the opposing party were more likely to be shared than posts that were simply negative or that appealed to emotions or morals.

Many of Facebook's own researchers, who have had access to data that outside researchers don't, realize that the platform needs to change. The Journal reported that higher-ups often ignored their complaints. Samidh Chakrabarti, who formerly led Facebook's Civic Engagement team, tweeted that feeds needed to be imbued with a "sense of morality." He added: "In the absence of an articulated set of values, engagement & growth concerns will win every single time because they are far easier to measure (and defend)."

On August 31, Facebook announced some limited changes to its algorithm. "We're gradually expanding some tests to put less emphasis on signals such as how likely someone is to comment on or share political content," wrote Aastha Gupta, a product management director at Facebook. However, it's unclear how much this will change the platform.

The goals of keeping people on Facebook and deprioritizing polarizing content are in conflict. Sheera Frenkel, co-author of "The Ugly Truth: Inside Facebook's Battle for Domination" told Public Sphere last month, "If they started to become very aggressive, let's say in showing people more accurate sources of information, their own experiments have found that's kind of boring." She added, "You may not engage with it as much, you might log out and go do something else. Maybe you go to TikTok, maybe you'll go to Twitter. That is really dangerous for Facebook."

The political stakes are high. Polarized countries like France, Hungary, and the Philippines have elections next year. Donald Trump's suspension from the platform could end as soon as 2023. Chakrabarti, the former Civic Engagement manager, tweeted that the company needs to explain its values publicly: "platforms need to have the courage to explain to the world what they consider bad AND what is good. Then they need to build out the metrics, set a threshold for these tradeoffs, and hold all their product launches accountable."
