Facebook's Self-Regulation Shows Its Limits
Lawmakers -- and the SEC -- have more power than the company's Oversight Board
On October 21, the Facebook Oversight Board released its quarterly transparency report. The board, funded by Facebook, rebuked the company for not being "fully forthcoming." At issue was the company's failure to inform the board of its Cross-Check system, which exempts almost 6 million high-profile users from full enforcement of the community standards that govern hate speech, bullying, and other abuses. When the board reviewed Donald Trump's suspension from the platform in June, Facebook did not disclose the Cross-Check system or how it had shielded the former president. In the report, the board called this omission "not acceptable." Words, however, are about all the board can muster -- it has no enforcement power over the company.
How did the Oversight Board learn about Cross-Check and Trump? From The Wall Street Journal. The board wrote last month, "We are grateful to the efforts of journalists who have shed greater light on issues that are relevant to the Board’s mission."
Despite Zuckerberg's 2019 commitment to "providing the board with the information and resources it needs to make informed decisions," Frances Haugen, the Facebook whistleblower who leaked internal documents to the Journal, said the Oversight Board was "as blind as the public." (Haugen is scheduled to brief the board soon.)

In her Senate testimony, Haugen offered some ideas for Congress to pick up where the board cannot. Her first: a regulatory agency with the power to request data from Facebook about how its algorithms work. She called for "full access to data for research not directed by Facebook." She added, "Right now, the only people in the world trained to analyze these experiences are people who grew up inside of Facebook or other social media companies...There needs to be a regulatory home where someone like me could do a tour of duty after working at a place like this."
Her second suggestion was replacing the engagement-based News Feed algorithm with a chronological feed. In Haugen's words: "Imagine we ordered our feeds by time, like on iMessage. There are other forms of social media that are chronologically based. They’re going to say, 'You’re going to get spammed. You’re not going to enjoy your feed.' " Spam demotion could be built into a chronological feed. She went on, "There are ways that we can make those experiences where computers don’t regulate what we see...but they don’t want us to have that conversation, because Facebook knows that when they pick out the content that we focus on using computers, we spend more time on their platform, they make more money."
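For readers who want the distinction made concrete, here is a minimal illustrative sketch of the two approaches -- not Facebook's actual code, and every name in it (Post, engagement_score, spam_score, the 0.8 threshold) is a hypothetical stand-in. It contrasts engagement-based ranking, which sorts by predicted reactions, with the chronological-plus-spam-demotion feed Haugen describes:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: float         # Unix epoch seconds when the post was published
    engagement_score: float  # hypothetical model prediction of likes/comments/reshares
    spam_score: float        # hypothetical: 0.0 (clean) to 1.0 (almost certainly spam)

def engagement_ranked_feed(posts):
    """Engagement-based ranking, the approach Haugen criticizes:
    posts predicted to draw the most reactions float to the top,
    regardless of when they were published."""
    return sorted(posts, key=lambda p: p.engagement_score, reverse=True)

def chronological_feed(posts, spam_threshold=0.8):
    """Chronological feed with simple spam demotion, the alternative
    Haugen describes: newest first, with likely spam filtered out
    rather than a model deciding what users see."""
    kept = [p for p in posts if p.spam_score < spam_threshold]
    return sorted(kept, key=lambda p: p.timestamp, reverse=True)
```

The point of the contrast is that the chronological version needs no prediction of what will keep a user scrolling; its only judgment call is the spam threshold.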
Her third idea for lawmakers was reforming Section 230, which shields tech companies from liability over user-generated content. Haugen favors keeping the liability shield for user-generated content but limiting it for algorithmic decisions. She said that while tech companies can't completely control their content, "They have 100 percent control over algorithms." Her approach sidesteps the thorny problem of content-moderation decisions (such as those over vaccine misinformation), which are inevitably politicized: the right wants less content moderation, the left wants more, and Facebook will never satisfy both sides. Limiting the liability shield on algorithms, however, could push platforms to be more careful about how they use engagement-based ranking and recommendations (such as QAnon posts being recommended to right-leaning users).

Haugen has turned over a set of internal documents to the Securities and Exchange Commission, asking it to investigate whether the company misled investors about the platform's problems with hate speech. Additional disclosures from the documents will be published this week, timed to coincide with the release of Facebook's quarterly earnings.
In a letter to Zuckerberg last week, Sen. Richard Blumenthal (D-Conn.) called on the Facebook CEO to testify himself about Instagram and teen self-harm, rather than sending the lieutenants who have recently appeared before congressional committees. That would be a good step. But it will take actual laws and regulations for Facebook to open itself up to the public and correct its problems.