Facebook is far more dangerous than it appears. However terrible things look on the surface, there is much worse below.
The Facebook Papers, a trove of internal documents leaked by whistleblower Frances Haugen and reviewed by 17 news organizations, offer a glimpse into the dark side of the social network. The resulting stories paint a vivid picture of a company broken beyond repair, one that, despite scandal after scandal, still retains the capacity to shock.
1. Facebook’s executives ignored their staff’s calls for change
According to the Atlantic, the documents show that some Facebook employees called out real-world harm caused by the platform, only to be ignored by higher-ups.
“How are we expected to ignore when leadership overrides research-based policy decisions to better serve people like the groups inciting violence today,” a Facebook staffer wrote in the fallout of the Jan. 6 attack on the U.S. Capitol.
2. While posing as a free speech advocate in the United States, Mark Zuckerberg personally authorized censoring anti-government postings abroad
Facebook CEO Mark Zuckerberg has stated that he does not want to be in the business of restricting political speech. And yet, according to the Washington Post, he personally did so when it served his company’s financial interests.
The Post published a particularly egregious example of the CEO’s deception in Vietnam, where, according to sources familiar with the decision, Zuckerberg himself made the call in 2020 to block anti-government posts on behalf of the ruling Communist Party.
Vietnam is a significant market for Facebook. According to Amnesty International, Facebook made around $1 billion in annual revenue from Vietnam in 2018.
3. The company’s own researchers were shocked by the algorithm’s recommendations
It is no secret that Facebook’s algorithm promotes divisive material. Even so, Facebook’s own researchers were shocked by just how sickening that material could be.
“On Feb. 4, 2019, a Facebook researcher created a new user account to see what it was like to experience the social media site as a person living in Kerala, India,” reports the New York Times. “For the next three weeks, the account operated by a simple rule: Follow all the recommendations generated by Facebook’s algorithms to join groups, watch videos and explore new pages on the site.”
The internal documents shed light on just how skewed Facebook’s recommendation algorithms are, according to the Times.
“Following this test user’s News Feed, I’ve seen more images of dead people in the past three weeks than I’ve seen in my entire life total,” wrote the Facebook researcher.
4. Facebook puts politics front and center when enforcing its own rules
According to reports, Zuckerberg was so concerned that Facebook’s conservative users would perceive the site as biased against them that he personally intervened on behalf of right-wing pundits and publishers. Facebook’s own researchers knew this and repeatedly criticized it internally, as shown in leaked documents highlighted by Politico.
“Facebook routinely makes exceptions for powerful actors when enforcing content policy,” wrote a Facebook data scientist in a 2020 internal presentation titled “Political Influences on Content Policy.” “The standard protocol for enforcement and policy involves consulting Public Policy on any significant changes, and their input regularly protects powerful constituencies.”
The Public Policy team that the researcher refers to, according to Politico, includes Facebook lobbyists.
Furthermore, according to Facebook researchers, Zuckerberg himself frequently weighed in on whether a given post should be kept or removed, implying a two-tiered enforcement system governed by unspoken standards.
As the researchers put it: “In multiple cases, the final judgment about whether a prominent post violates a certain written policy are made by senior executives, sometimes Mark Zuckerberg. If our decisions are intended to be an application of a written policy then it’s unclear why executives would be consulted. If instead there was an unwritten aspect to our policies, namely to protect sensitive constituencies, then it’s natural that we would like executives to have final decision-making power.”
5. After a threat from Apple, Facebook unleashed a full-court press against human trafficking
Human traffickers have used Facebook’s tools to their advantage, and according to news reports, a 2020 internal Facebook document shows the company was well aware of it.
“[Our] platform enables all three stages of the human exploitation lifecycle (recruitment, facilitation, exploitation) via complex real-world networks,” reads the internal Facebook report in part.
And yet, while human trafficking has long been explicitly banned on Facebook, it took Apple threatening to pull Facebook and Instagram from its App Store in 2019 for the company to muster the kind of response one might have expected much earlier.
“Removing our applications from Apple platforms would have had potentially severe consequences to the business, including depriving millions of users of access to IG & FB,” reads the document reviewed by CNN. “To mitigate against this risk, we formed part of a large working group operating around the clock to develop and implement our response strategy.”
Importantly, Apple was not the first to raise the issue with Facebook.
“Was this issue known to Facbeook [sic] before the BBC enquiry and Apple escalation?” the internal Facebook report asks. “Yes.”