Spanish Court Rules Against Meta Content Moderation Subcontractor Over Worker’s Psychological Harm


A Barcelona-based company has been found responsible for psychological harm suffered by one of its workers, following a court ruling in Spain. The company is a subcontractor for Meta, providing content moderation services for Facebook and Instagram. This is the first time a Spanish court has held a content moderation company accountable for a mental disorder suffered by one of its employees.

“Meta and social media in general must recognize the magnitude of this problem, and must change their strategy,” the worker’s law firm, Espacio Jurídico Feliu Fins, wrote in a social media post. “Instead of pursuing a strategy of denying the problem, they must accept that this horrific reality, suffered by these workers, is as real as life itself.”

The ruling follows a case brought against Meta’s local subcontractor, CCC Barcelona Digital Services, by a 26-year-old Brazilian worker who has been receiving psychiatric treatment for five years. His condition is the result of exposure to extreme and violent content on Facebook and Instagram, including murders, suicides, terrorism, and torture. The court handed down its decision earlier this month.

According to reports in the local press, the worker has been suffering from a range of psychological harms, including panic attacks, avoidance behaviors, disturbed sleep, difficulty swallowing, and significant thanatophobia (anxiety driven by fear of death). The court recognized these mental health issues as a work accident rather than a common illness. Meta’s subcontractor had previously classified his absence from work as a common ailment, seeking to deny responsibility for any psychological damage caused by reviewing violent content uploaded to Facebook and Instagram.

The worker’s law firm regards the ruling as a major win for workers facing mental health issues caused by their work. The firm stated:

“The day they take it on and face it, that day, everything will change.” And they continue: “As long as this does not happen, we will see to it that this happens through the legal system. We will go step by step, without haste, but without hesitation. And above all, with total determination that we are going to win.”

The outsourcing of content moderation to third-party subcontractors by Meta has been the subject of disturbing stories for years. In May 2020, a US class action lawsuit was settled with Meta agreeing to pay $52 million to content moderators who developed post-traumatic stress disorder as a result of reviewing violent and graphic images.

Similar litigation is taking place in Africa, where a content moderator is suing Meta and its subcontractor in Kenya for failing to provide adequate mental health and psychosocial support.

In response to the ruling against its subcontractor in Spain, Meta declined to comment. However, the social networking giant provided general information on its approach to outsourcing content moderation. The company says its contracts with third parties include expectations for provisions such as counselling, training, and other worker support. It also requires subcontractors to provide 24/7 on-site support from trained practitioners and access to private healthcare from the first day of employment.

Meta also says it provides technical solutions to subcontractors to help content reviewers limit their exposure to graphic material. Reviewers can customize these tools to blur graphic content, display it in black and white, play it without sound, or opt out of auto-play.

However, the company does not address whether such support services and screening tools are undermined by the productivity and performance quotas subcontractors impose. These quotas can make it difficult for reviewers to access support while still meeting their employers’ demands.

In October, a Barcelona-based newspaper reported that around 20% of CCC Barcelona Digital Services’ staff were off work due to psychological trauma caused by reviewing toxic content. According to the article, workers have described the support provided by their employer, Meta’s subcontractor, as “very insufficient”. Another report from the same month details the high “success rate” demanded of workers: 98% of a content moderator’s decisions must match those of their coworkers and the senior auditor, with the risk of dismissal if their rate slips.

Screening tools that obscure content may also make it harder for reviewers to hit those performance targets. Workers may see the tools as a risk to their jobs and choose not to use them rather than fall behind their peers, which discourages reviewers from protecting themselves from psychologically harmful content.

Shift work, routinely imposed on content moderation workers, may also contribute to mental health problems, as disrupted sleep patterns are known to cause stress. The routine reliance on young, low-paid workers in content moderation farms carries a high risk of burnout, suggesting an industry configured to manage toxicity through high churn.

Legal rulings that require third-party content review firms to take responsibility for workers’ mental health could, however, put limits on the current model. Telus, the Canadian company that owns CCC Barcelona Digital Services, has not yet responded.

Max Chen

Max Chen is an AI expert and journalist with a focus on the ethical and societal implications of emerging technologies. He has a background in computer science and is known for his clear and concise writing on complex technical topics. He has also written extensively on the potential risks and benefits of AI, and is a frequent speaker on the subject at industry conferences and events.
