Meta has responded to the dozens of recommendations from the Oversight Board regarding its controversial cross-check program, which shields high-profile users from the company’s automated content moderation systems. In its response, Meta agreed to adopt many of the board’s suggestions, but declined to implement changes that would have increased transparency around who is in the program.
Meta’s response comes after the board criticized the program for prioritizing “business concerns” over human rights. While the company had characterized the program as a “second layer of review” to help it avoid mistakes, the Oversight Board noted that cross-check cases are often so backlogged that harmful content is left up far longer than it otherwise would be.
In total, Meta agreed to adopt 26 of the 32 recommendations at least partially. These include changes around how cross-check cases are handled internally at the company, as well as promises to disclose more information to the Oversight Board about the program. The company also pledged to reduce the backlog of cases.
But, notably, Meta declined to take the Oversight Board up on its recommendation that it publicly disclose politicians, state actors, businesses and other public figures who benefit from the protections of cross-check. The company said publicly disclosing details about the program “could lead to myriad unintended consequences making it both unfeasible and unsustainable” and said that it would open cross-check to being “game(d)” by bad actors.
Likewise, the company declined, or didn’t commit to, recommendations that would alert people that they are subject to cross-check. Meta declined a recommendation that it require users who are part of cross-check to make “an additional, explicit, commitment” to follow the company’s rules. And Meta said it was “assessing the feasibility” of a recommendation that it allow people to opt out of cross-check (which would also, naturally, notify them that they are part of the program). “We will collaborate with our Human Rights and Civil Rights teams to assess options to address this issue, in an effort to enhance user autonomy regarding cross-check,” the company wrote.
While Meta’s response shows that the company is willing to make changes to one of its most controversial programs, it also underscores the company’s reluctance to make key details about cross-check public. That also aligns with the Oversight Board’s previous criticism, which last year accused the company of not being “fully forthcoming” about cross-check.