Meta faces accusations of inflicting severe psychological trauma on Kenyan content moderators, more than 140 of whom have been diagnosed with PTSD. The claims stem from evaluations conducted by Dr. Ian Kanyanya and form part of a lawsuit over working conditions at Samasource Kenya. The case raises critical questions about the mental health toll of content moderation work and has prompted criticism and legal challenges against major tech companies.
Campaigners have accused Meta, Facebook’s parent company, of causing significant psychological harm to content moderators in Kenya, with more than 140 individuals diagnosed with PTSD and other severe mental health conditions. These findings, presented by Dr. Ian Kanyanya, the head of mental health services at Kenyatta National Hospital, have been submitted to Kenya’s Employment and Labour Relations Court as part of an ongoing lawsuit involving Meta and Samasource Kenya, the outsourcing company contracted to perform moderation work.
Content moderators are tasked with filtering disturbing material from platforms like Facebook, work that is frequently outsourced to third-party firms in developing countries and that raises serious concerns about its mental health repercussions. Meta acknowledged the issue in general terms, saying it takes moderator support seriously and that its contracts with outsourcing partners require provisions for counseling, training, and fair pay. It declined to comment on the specific medical reports, citing the ongoing legal proceedings.
Dr. Kanyanya found that moderators exposed to extremely graphic content, including scenes of violence, sexual abuse, and self-harm, showed alarming levels of psychological distress; 81% of those assessed exhibited severe PTSD symptoms. The assessments grew out of an earlier lawsuit brought by a former moderator who claimed wrongful termination after protesting poor working conditions, which campaigners say reflects a pattern of neglect and of punishing workers who raise legitimate concerns. Foxglove, a UK non-profit supporting the moderators, reported that all employees at Samasource Kenya’s moderation hub were terminated in retaliation for complaints raised in 2022.
The current legal claims involve individuals who worked for Samasource Kenya between 2019 and 2023. One medical assessment described a moderator experiencing recurrent nightmares, cold sweats, and severe anxiety tied to their work; another described trauma that led to a specific phobia. Martha Dark, co-executive director of Foxglove, emphasized the gravity of the situation, asserting that moderating Facebook content leaves long-lasting psychological effects on those who do it.
This phenomenon is not isolated: content moderators have pursued similar legal action against other major social media companies, including TikTok, citing psychological trauma resulting from their roles. As this trend continues, it raises questions of liability and of the need for reform in how content moderation is handled across the industry.
The issue of content moderation and its psychological ramifications has gained attention in recent years, particularly as companies like Meta and TikTok face criticism over the working conditions of third-party moderators. These workers are routinely tasked with reviewing disturbing and graphic content, which has been shown to have debilitating effects on mental health. Reports of PTSD among moderators have sparked legal challenges, revealing systemic problems in the industry’s approach to mental health support for these employees. The current situation in Kenya highlights the urgent need for accountability and better psychosocial care for individuals performing this crucial but often traumatic work.
The alarming number of PTSD diagnoses among Kenyan content moderators points to a pressing concern about the mental well-being of workers in the digital content moderation sector. The lawsuit against Meta and Samasource underscores the need for corporate responsibility in safeguarding employees’ mental health, particularly given their exposure to inherently distressing content. It also brings to light the broader implications of content moderation practices, prompting calls for reform and stronger support mechanisms to protect these vulnerable workers from lifelong trauma.
Original Source: www.cnn.com