The Facebook Papers: What We Know So Far — And What Else Is Coming

Thousands of leaked internal documents have been made available to the media, and more outlets are joining the consortium

Over the last few days, Facebook has been hit with a massive storm of negative stories about its business practices and platforms, and there are more to come.

So far, at least 97 articles have been published by a group of 17 media outlets invited by whistleblower Frances Haugen’s PR and legal team to review the Facebook Papers, thousands of pages of internal documents Haugen took with her when she left Facebook’s integrity team. And according to one consortium member, six more weeks of these stories are coming as more news outlets join the group.

From CNN and Bloomberg to The Verge and The Washington Post, the consortium outlets are all invited to download the redacted papers from Google Drive and attend briefings for additional context on them. There is also talk of releasing the documents in their entirety to the public, not just to a select group of journalists.

“The public deserves to read the documents, not just the few dozen journalists in the consortium,” Alex Kantrowitz, publisher of Big Technology, wrote. “Society distrusts institutions when a handful of gatekeepers withhold information that applies to their lives.”

Here are some of the major stories coming out of the Facebook Papers, many documenting how the company has prioritized profit and Western markets, and how it is losing the battle for younger users to competitors including TikTok. A number of journalists have reported on these threads over the years, but now, thanks to Haugen’s leaks, there is much greater insight into how Facebook makes decisions and how it views its role in the world.

Facebook has been aware of human trafficking issues for years

As detailed by CNN Business, the company has known about human trafficking on its platform since at least 2018. The documents show that women were trafficked on Facebook and subjected to physical and sexual abuse while being deprived of food and access to their travel documents. Facebook even has its own shorthand for it: “HEx,” for human exploitation. At one point, Apple threatened to pull Facebook and Instagram from its App Store over the issue. Recent reports suggest the company is still struggling to combat this type of content, with a January 2020 document saying “our platform enables all three stages of the human exploitation lifecycle (recruitment, facilitation, exploitation) via complex real-world networks.”

The content moderation problems are much worse around the world

What Facebook struggles with in North America in terms of hate speech, misinformation and violent content isn’t even half of it. Those problems are far worse in many parts of the developing world, especially in India. The Facebook Papers reveal that the company has studied its approach in other countries and was well aware of the lack of moderation resources in non-English-speaking countries, leaving the platform in those regions exposed to bad actors. A 2020 report documented that the majority (84%) of the company’s misinformation efforts were dedicated to the U.S., with just 16% going to the “rest of world.” Even though India is Facebook’s largest market, users there lack the critical protections that exist in English-speaking countries.

It’s losing the fight over teen users

Per The Verge, Facebook’s aging user base is causing the platform to lose ground with younger users. Snap and TikTok have increasingly become major competitors as Facebook came to be seen as a network for older people. According to internal reports, the number of teenage Facebook users in the U.S. has declined 13% since 2019 and is projected to drop 45% over the next two years. As The Wall Street Journal reported, the company’s research suggests that Instagram can have harmful effects on teen users, especially those struggling with eating disorders, suicidal thoughts and body image issues. Facebook’s daily users are declining overall as the company pivots to shore up its standing with users aged 18 to 29.

Facebook played a major role leading up to the Jan. 6 insurrection

Facebook had stepped up efforts to police violent content and misinformation leading up to the 2020 U.S. presidential election. But after the Nov. 3 election, the company rolled back those safeguards even as a Stop the Steal group surfaced and dozens of similar groups emerged. Inside Facebook, alarms sounded as user reports of “false news” hit almost 40,000 per hour, according to internal reports. As The Washington Post reported, by the time Facebook tried to restore these measures, it was already too late: the pro-Trump riot at the Capitol was underway.

Facebook experimented with removing its News Feed algorithm, but eventually gave up

As Alex Kantrowitz wrote in his newsletter, in 2018 a Facebook researcher shut off the News Feed ranking algorithm for 0.05% of users. An internal report on the experiment found that without the algorithm, engagement on Facebook dropped significantly: users hid 50% more posts while content from Facebook Groups rose, user sessions declined and social interactions between friends fell by 20%. Overall, turning the ranking algorithm off made for a worse experience on the platform, yet it actually made Facebook more money, because users saw more ads. The experiment also showed that doing away with the ranking algorithm entirely would surface other, larger integrity issues, including exploitation by bad actors.

Facebook ignored findings on anti-vaccination content because it was worried about profit

When Facebook was hit with a surge in anti-vaccination content in March, some employees studied altering how posts about vaccines were ranked in the News Feed. Researchers found that changing the ranking could help reduce the spread of misleading information about vaccines and instead surface posts from legitimate sources, including the World Health Organization. But the company ignored the study’s suggestions and delayed making some of the changes until April. Another researcher’s suggestion in March to disable comments on vaccine posts was also ignored. Instead, Mark Zuckerberg said the company would start labeling posts that said vaccines were safe, a move critics contend ensured that high engagement with misinformation would continue.
