Facebook’s Content Moderators in Kenya Diagnosed with PTSD Amid Legal Action
A CNN report reveals that more than 140 content moderators in Kenya have been diagnosed with PTSD, according to assessments led by Dr. Ian Kanyanya. The allegations against Meta and Samasource Kenya stem from a lawsuit claiming that moderators suffered severe psychological trauma from exposure to graphic content; notably, 81% of those evaluated were found to have severe PTSD. The situation raises critical questions about tech companies' responsibilities for their workers' mental health.
Recent allegations accuse Facebook's parent company, Meta, of inflicting severe psychological trauma on its content moderators in Kenya. Campaigners report that more than 140 individuals have been diagnosed with post-traumatic stress disorder (PTSD) and other mental health conditions. The diagnoses were made by Dr. Ian Kanyanya, who oversees mental health services at Kenyatta National Hospital in Nairobi and who cited the extreme and graphic nature of the content the moderators encountered daily, including violent and explicit material.
The legal action, brought by the law firm Nzili and Sumbi Associates, targets Meta and the outsourcing firm Samasource Kenya, which was engaged to perform content moderation services. Critics have long highlighted the psychological toll of such work, particularly when it involves sustained exposure to distressing content. Dr. Kanyanya's assessments found that a staggering 81% of the evaluated moderators suffered from severe PTSD, prompting serious concern about their mental health.
Despite the gravity of the situation, Meta has declined to comment on the medical details, citing the ongoing litigation, but it emphasized its commitment to the support and well-being of content moderators through stipulations in its contracts with third-party firms. The company also noted that moderators can adjust its content review tools to reduce their exposure to graphic material.
The grievances also recall earlier legal challenges brought by former moderators against major tech firms. In 2021, a TikTok moderator sued after alleging that her work caused psychological harm, part of an alarming pattern across the content moderation sector, where employees face significant mental health risks.
Content moderation is a critical function for social media platforms, involving the review and removal of inappropriate or harmful posts. Because moderators often work for third-party firms based in developing countries, they face particular risks arising from the nature of the material they must process. In recent years, growing concern about the psychological impact of this work has led to legal action against major tech companies. The case in Kenya presents both a legal and an ethical challenge for Meta as it confronts allegations of neglecting the mental health of its contractors.
In conclusion, the alarming rate of PTSD diagnoses among content moderators in Kenya underscores the urgent need for stronger mental health protections. The legal proceedings against Meta and Samasource Kenya seek justice for the affected individuals and aim to raise awareness of the risks content moderators face worldwide. The case may prompt a reevaluation of industry practices and highlight the need for comprehensive mental health support for workers in similar roles.
Original Source: www.cnn.com