Facebook moderation system favours ‘business partners’, says oversight board

Photograph: Dado Ruvić/Reuters

A policy designed to protect high-profile Facebook and Instagram users from moderation was structured to satisfy the business interests of the platforms’ parent company, Meta, rather than to protect free speech and civil rights, the company’s oversight board, often described as its “supreme court”, has found.

The oversight board, which scrutinises moderation decisions on Facebook and Instagram, said the platforms’ “cross-check” system appeared to favour “business partners”, such as celebrities whose content generates money for the company, while journalists and civil society organisations had “less clear paths” to access the programme.

“While Meta told the board that cross-check aims to advance Meta’s human rights commitments, we found that the program appears more directly structured to satisfy business concerns,” said the board, adding that it had concerns about the “lack of transparency” around the programme.

It said cross-check grants certain users greater protection than others because content from users on the cross-check list is allowed to stay up while it is vetted by human moderators applying the “full range” of content policies. Meta described it as a “mistake-prevention strategy” that protected important users from erroneous content takedowns.

Ordinary users, by contrast, are much less likely to have their content reach reviewers who can apply the full range of Meta’s content guidelines.

The board said a user’s “celebrity or follower count” should not be the sole criterion for receiving the special protection offered by the programme. Meta admitted to the board that criteria for including “business partners” on the list included the amount of revenue they generated.

Meta also told the board that it exempts some content from takedowns altogether. The company described this system as “technical corrections” and said it carried out about 1,000 a day. The board recommended that Meta conduct audits of enforcement actions blocked under the system.

The board added that the technical corrections system is viewed as an “allow list” or “whitelist”. In September last year the Wall Street Journal, using documents disclosed by whistleblower Frances Haugen, reported that Brazilian footballer Neymar had responded to a rape accusation in 2019 by posting Facebook and Instagram videos defending himself, which included showing viewers his WhatsApp correspondence with his accuser. The clips from WhatsApp – also owned by Meta – included the accuser’s name and nude photos of her.

Moderators were blocked for more than a day from removing the video, according to the WSJ, while the normal punishment of disabling his accounts was not implemented. An internal document seen by the WSJ said Neymar’s accounts were left active after “escalating the case to leadership”. Neymar denied the rape allegation and no charges were filed against the footballer.

Citing the Neymar example, the board said that, despite Meta saying it had a system for prioritising content decisions, some content still remained online for “significant periods” while it awaited review.

“In the Neymar case, it is difficult to understand how non-consensual intimate imagery posted on an account with more than 100 million followers would not have risen to the front of the queue for rapid, high-level review if any system of prioritisation had been in place,” said the board.

The board went on to say that the cross-check “business partner” category includes users who are likely to generate money for the company, either through formal business relationships or by drawing users to Meta’s platforms. It said that, to avoid a “perception of censorship”, Meta preferred keeping such users’ content up rather than taking it down. The board said the business partner category was likely to include major companies, political parties and campaigns, and celebrities.

The board made 32 recommendations. They included: removing special protection for commercially important accounts that frequently break content rules; prioritising moderation of posts that are important for human rights; and removing or hiding “high severity” violating content from cross-check users while reviews are taking place.

The board said Meta viewed the risk of a content decision resulting in “escalation at the highest levels”, to an organisation’s chief executive or chief operating officer, as highly sensitive. Such content carries an “extremely high severity” tag under the cross-check system. It said Meta therefore seemed more focused on the business-related consequences of its decisions than on those related to human rights.

Meta’s president of global affairs, Nick Clegg, said the company would respond to the board’s recommendations within 90 days in order to “fully address” them.