Facebook revelations: what is in cache of internal documents?

Facebook has been at the centre of a wave of damaging revelations after a whistleblower released tens of thousands of internal documents and testified about the company’s inner workings to US senators.

Frances Haugen left Facebook in May with a cache of memos and research that have exposed the inner workings of the company and the impact its platforms have on users. The first stories based on those documents were published by the Wall Street Journal in September.


Haugen gave further evidence about Facebook’s failure to act on harmful content in testimony to US senators on 5 October, in which she accused the company of putting “astronomical profits before people”. She also testified to MPs and peers in the UK on Monday, as a fresh wave of stories based on the documents was published by a consortium of news organisations.

Facebook’s products – the eponymous platform, the Instagram photo-sharing app, Facebook Messenger and the WhatsApp messaging service – are used by 2.8 billion people a day and the company generated a net income – a US measure of profit – of $29bn (£21bn) last year.

Here is what we have learned from the documents, and Haugen, since the revelations first broke last month.

Teenage mental health

The most damaging revelations focused on Instagram’s impact on the mental health and wellbeing of teenage girls. One piece of internal research showed that, for teenage girls already having “hard moments”, one in three found Instagram made body image issues worse. A further slide showed that one in three people who found social media use problematic said Instagram made it worse, with one in four saying it made issues with social comparison worse.

Facebook described the WSJ’s September reporting on the research as a “mischaracterisation” of its internal work. Nonetheless, the Instagram research has galvanised politicians on both sides of the Atlantic seeking to rein in Facebook.

Violence in developing countries

Haugen has warned that Facebook is fanning ethnic violence in countries including Ethiopia and is not doing enough to stop it. She said that 87% of Facebook’s spending on combating misinformation goes to English-language content, even though only 9% of users are English speakers. According to the news site Politico on Monday, Facebook’s systems detected just 6% of Arabic-language hate content on Instagram before it reached users.

Haugen told Congress on 5 October that Facebook’s use of engagement-based ranking – where the platform ranks a piece of content, and decides whether to put it in front of users, based on how much interaction it attracts – was endangering lives. “Facebook … knows, they have admitted in public, that engagement-based ranking is dangerous without integrity and security systems, but then not rolled out those integrity and security systems to most of the languages in the world. And that’s what is causing things like ethnic violence in Ethiopia,” she said.

Divisive algorithm changes

In 2018 Facebook changed the way it tailored content for users of its news feed feature, a key part of people’s experience of the platform. The emphasis on boosting “meaningful social interactions” between friends and family meant the feed leant towards reshared material, which often contained misinformation and toxic content. “Misinformation, toxicity and violent content are inordinately prevalent among reshares,” said internal research. Facebook said it had an integrity team that was tackling the problematic content “as efficiently as possible”.

Tackling falsehoods about the US presidential election

The New York Times reported that internal research showed how, at one point after the US presidential election last year, 10% of all US views of political material on Facebook – a very high proportion for the platform – were of posts alleging that Joe Biden’s victory was fraudulent. One internal review criticised attempts to tackle “Stop the Steal” groups spreading claims that the election was rigged. “Enforcement was piecemeal,” said the research. The revelations have reignited concerns about Facebook’s role in the 6 January riots.

Facebook said: “The responsibility for the violence that occurred … lies with those who attacked our Capitol and those who encouraged them.” However, the WSJ has also reported that Facebook’s automated systems were taking down posts that generated only an estimated 3-5% of total views of hate speech on the platform.

Disgruntled Facebook staff

Within the files disclosed by Haugen are testimonies from dozens of Facebook employees frustrated by the company’s failure either to acknowledge the harms it generates or to properly support efforts to mitigate or prevent those harms. “We are FB, not some naive startup. With the unprecedented resources we have, we should do better,” wrote one employee quoted by Politico in the wake of the 6 January attack on the US Capitol.

“Never forget the day Trump rode down the escalator in 2015, called for a ban on Muslims entering the US, we determined that it violated our policies, and yet we explicitly overrode the policy and didn’t take the video down,” wrote another. “There is a straight line that can be drawn from that day to today, one of the darkest days in the history of democracy … History will not judge us kindly.”

Facebook is struggling to recruit young users

A section of a complaint filed by Haugen’s lawyers with the US financial watchdog refers to young users in “more developed economies” using Facebook less. This is a problem for a company that relies on advertising for its income because young users, with unformed spending habits, can be lucrative to marketers. The complaint quotes an internal document stating that Facebook’s daily teenage and young adult (18-24) users have “been in decline since 2012-13” and “only users 25 and above are increasing their use of Facebook”. Further research reveals “engagement is declining for teens in most western, and several non-western, countries”.

Haugen said engagement was a key metric for Facebook, because it meant users spent longer on the platform, which in turn appealed to advertisers who targeted users with adverts that accounted for $84bn (£62bn) of the company’s $86bn annual revenue. On Monday, Bloomberg said “time spent” for US teenagers on Facebook was down 16% year-on-year, and that young adults in the US were also spending 5% less time on the platform.

Facebook is built for divisive content

On Monday the NYT reported on an internal memo warning that Facebook’s “core product mechanics”, or its basic workings, had let hate speech and misinformation grow on the platform. The memo added that the basic functions of Facebook were “not neutral”. “We also have compelling evidence that our core product mechanics, such as virality, recommendations and optimising for engagement, are a significant part of why these types of speech flourish on the platform,” said the 2019 memo.

A Facebook spokesperson said: “At the heart of these stories is a premise which is false. Yes, we are a business and we make profit, but the idea that we do so at the expense of people’s safety or wellbeing misunderstands where our own commercial interests lie. The truth is we have invested $13bn and have over 40,000 people to do one job: keep people safe on Facebook.”

Apple threatened to pull Facebook products from its app store over human trafficking content

According to another document in the cache, Apple threatened to remove Facebook and Instagram from its app store two years ago over concerns the apps were being used to trade in domestic servants, a sector at high risk of abuse and human slavery.

The threat was dropped after Facebook shared details of its attempts to tackle the problem. One internal document showed that Facebook removed more than 1,000 accounts, operating largely out of Saudi Arabia, that were used to recruit workers, who reported abuse and sexual violence.

“In our investigation, domestic workers frequently complained to their recruitment agencies of being locked in their homes, starved, forced to extend their contracts indefinitely, unpaid, and repeatedly sold to other employers without their consent,” one Facebook document read. “In response, agencies commonly told them to be more agreeable.”

Facebook’s crackdown seems to have had a limited effect, according to the AP, which found that even today a quick search for “khadima”, or “maids” in Arabic, brings up accounts featuring posed photographs of Africans and South Asians with ages and prices listed next to their images.

In a statement to the AP, the company said it took the problem seriously, despite the continued spread of ads exploiting foreign workers in the Middle East.

Facebook avoids confrontations with US politicians and rightwing news organisations

A document seen by the Financial Times showed a Facebook employee claiming that the company’s public policy team blocked decisions to take down posts “when they see that they could harm powerful political actors”. The document said: “In multiple cases the final judgment about whether a prominent post violates a certain written policy are made by senior executives, sometimes Mark Zuckerberg.” The memo said moves to take down content posted by repeat offenders against Facebook’s guidelines, such as rightwing publishers, were often reversed because the publishers might retaliate.

The wave of stories on Monday was based on disclosures made to the Securities and Exchange Commission – the US financial watchdog – and provided to Congress in redacted form by Haugen’s legal counsel. The redacted versions were obtained by a consortium of news organisations including the NYT, Politico and Bloomberg.

Kari Paul contributed reporting