Meta's content moderator subcontractor model faces legal squeeze in Spain
A Barcelona-based company that works as a subcontractor for Meta, providing content moderation services for Facebook and Instagram, has been found by a court in Spain to be responsible for psychological damage suffered by a worker. The ruling, reported Thursday by local press, is the first time a court in Spain has found a content moderation company responsible for the mental disorders suffered by a worker.
A report in El Periódico Thursday said the ruling, which was handed down earlier this month, pertains to a challenge brought against Meta's local subcontractor, CCC Barcelona Digital Services, by a 26-year-old Brazilian who has been receiving psychiatric treatment for five years owing to exposure to extreme and violent content on Facebook and Instagram, such as murders, suicides, terrorism and torture.
The worker in question, who began moderating Facebook and Instagram content in 2018, is said to have suffered a range of psychological harms, including panic attacks, avoidance behaviors, excessive worry about suffering illnesses, disturbed sleep, difficulty swallowing and significant thanatophobia (anxiety due to fear of death), according to the newspaper's report.
The Barcelona court accepted that the mental problems suffered by the worker are not a common illness but a work accident, per the newspaper. Meta's subcontractor had processed his absence from work as a common ailment and sought to deny responsibility for any psychological harm suffered as a result of reviewing violent content uploaded to Facebook and Instagram.
In a social media post responding to the court ruling, the law firm representing the worker, Espacio Jurídico Feliu Fins, described the result as a major win for any workers suffering mental health issues as a result of the work they do.
"Meta and social media in general must recognize the magnitude of this problem, and must change their strategy," the law firm wrote in the post [in Spanish; this is a machine translation]. "Instead of pursuing a strategy of denying the problem, they must accept that this horrific reality, suffered by these workers, is as real as life itself.
"The day they take it on and face it, that day, everything will change. As long as this does not happen, we will see to it that this happens through the legal system. We will go step by step, without haste, but without hesitation. And above all, with total determination that we are going to win."
Meta's outsourcing of toxic content review to various third-party subcontractors, which supply scores of typically low-paid workers to act as human filters for extreme violence and other horrific material uploaded to its social networks, has been a source of disturbing stories for years. And yet the practice continues.
Back in May 2020 Meta agreed to pay $52 million to settle a U.S. class action lawsuit brought by content moderators working for third parties providing content review services for its social networks who had alleged that reviewing violent and graphic images had led to them developing post-traumatic stress disorder.
The company is also facing litigation in Africa where a moderator working for Sama, a Meta subcontractor in Kenya, is suing both companies over allegations that also include a failure to provide “adequate” mental health and psychosocial support.
Meta declined to comment on the ruling against its subcontractor in Spain. But the social networking giant provided some general information about its approach to outsourcing content moderation, saying its contracts with the third parties it works with on content review set out expectations that they will provide counseling, training and other worker support.
The tech giant also said its contracts require subcontractors to provide 24/7 on-site support with trained practitioners, in addition to offering on-call service and access to private healthcare from the first day of employment.
Meta also noted it provides technical solutions to subcontractors that are intended to let content reviewers limit their exposure to the graphic material they are being asked to moderate as much as possible. It said these tools can be customized by reviewers so that graphic content appears entirely blurred, in black and white, blurred for the first frame, played without sound, or opted out of auto-play.
However, the company's background remarks did not address the possibility of support services and screening tools being undermined by exacting productivity and performance quotas imposed on reviewers by subcontractors, which could, in practice, make it difficult for these workers to access adequate support while still performing at the rates demanded by their employers.
Back in October, the Barcelona-based newspaper, La Vanguardia, reported that around 20% of CCC Barcelona Digital Services' staff were off work as a result of psychological trauma from reviewing toxic content. In the article, the newspaper quotes a worker describing the support provided by their employer, and Meta's subcontractor, as "very insufficient."
Another report from the same month, in El Nacional, discusses a high "success rate" (98%) workers are told they must achieve -- meaning each moderator's decisions must match their co-workers' decisions, and the senior auditor's, the vast majority of the time, with the risk of being fired if their rate slips.
The use of screening tools that entirely or partially obscure content to be reviewed could clearly make it harder for reviewers to meet exacting performance targets. Workers may therefore see it as a risk to use tools that could reduce the accuracy of their assessments and leave them falling behind their peers, since that could jeopardize their continued employment -- effectively discouraging them from taking steps that might better protect them from exposure to psychologically harmful content.
Shift work routinely imposed on content moderation workers may also contribute to the development of mental health issues, as disturbances to sleep patterns are known to contribute to stress. Additionally, the routine use of young, low-paid workers in content moderation farms implies a high risk of burnout is baked into the model -- suggesting this is a closed-door industry that's configured around managing toxicity via high churn; or, essentially, outsourced burn-out-as-a-service.
Legal rulings that impose requirements on third-party content reviewers to take care of workers' mental health could put limits on the model, however.
A request for comment sent to Telus International, the Canadian company that's a subsidiary of Telus and which owns CCC Barcelona Digital Services, had not been responded to at press time.
Update: Telus International disputes the claim reported by La Vanguardia that around a fifth of CCC Barcelona Digital Services' staff were off work with mental health issues. "This statement is incorrect. Based on information from Telus International’s third-party health support provider, it is estimated that work-related mental health concerns are between 1% and 2% of total absences," a spokesperson said.
While, "on average", the company has an absenteeism rate of 20%, they told us this "encompasses all medical and sick leave absences".
"Telus International maintains a comprehensive approach to addressing employee wellbeing concerns around content moderation work and is constantly working to evolve its offerings to better support the needs of its workforce," they added, saying their "comprehensive well-being program" has helped reduce total absenteeism occurrences to 14% "at periods".
This report was updated with a minor correction: Our request for comment was sent to Telus International, a subsidiary of the Canadian company Telus which, per a company spokesperson, operates separately, rather than to Telus itself as we originally stated.
We also added a response from Telus International which disputes La Vanguardia's reporting about the rate of mental health absenteeism at CCC Barcelona Digital Services.