Campaigners accuse Meta of causing ‘lifelong trauma’ to Kenyan content moderators, more than 140 of whom have been diagnosed with PTSD. The claims form part of a lawsuit against Meta and Samasource Kenya. Roughly 81% of the moderators assessed reported severe PTSD symptoms, underscoring the mental health risks of content moderation work.
Campaigners have raised grave concerns regarding Facebook’s parent company, Meta, alleging that it has caused “potentially lifelong trauma” to content moderators in Kenya, with more than 140 individuals diagnosed with PTSD. The diagnoses were made by Dr. Ian Kanyanya, who leads mental health services at Kenyatta National Hospital in Nairobi, and were filed with Kenya’s Employment and Labour Relations Court. The claim, submitted by law firm Nzili and Sumbi Associates, forms part of an ongoing lawsuit against Meta and Samasource Kenya, an outsourcing firm contracted to moderate content for the tech company.
Content moderators filter disturbing material from social media platforms and are typically employed through third-party agencies in developing countries. Concerns have persisted for years about the toll the role takes on moderators’ mental health. In response to the accusations, Meta declined to comment on the specific medical reports but emphasized that it prioritizes the well-being of its content moderators, noting that its contracts with third-party firms set out expectations for support, including training and counseling.
The moderators assessed by Dr. Kanyanya described frequent exposure to distressing material, including graphic videos of violence and abuse. Of the 144 moderators who underwent evaluation, the vast majority, roughly 81%, were classified as experiencing severe PTSD. The class-action lawsuit follows a prior case from 2022, brought by a former moderator who alleged wrongful termination after protesting working conditions.
According to reports, all 260 moderators working at Samasource Kenya’s site were dismissed last year after raising concerns over their pay and treatment. Case documents show that the current claimants worked as moderators between 2019 and 2023. Medical records indicate that moderators reported severe psychological distress, including waking from nightmares and experiencing flashbacks related to their work. One former moderator described developing a fear of seeing certain patterns after exposure to graphic imagery.
Martha Dark, co-executive director of Foxglove, said that “moderating Facebook is dangerous, even deadly, work that inflicts lifelong PTSD on almost everyone who moderates it.” She added that if such diagnoses emerged in any other industry, those responsible would face serious legal repercussions.
The case fits a broader pattern: content moderators have previously sued various social media platforms, citing mental health deterioration resulting from their work in the industry.
The experiences of the Kenyan moderators illustrate the hidden toll of content moderation, particularly in developing countries, where exposure to graphic content compounds existing mental health risks. In recent years, claims of psychological injury have drawn growing scrutiny to social media companies and their responsibility for the welfare of the workers who moderate their platforms.
In conclusion, the allegations against Meta underscore a critical intersection of technology and mental health, and highlight the urgent need for reform in the employment practices surrounding content moderation. With severe PTSD diagnosed in a large majority of the assessed moderators, the situation calls for closer examination of mental health support within the tech industry. Legal accountability may drive the changes needed to protect vulnerable employees from harmful working conditions in the future.
Original Source: www.cnn.com