
Kenyan Workers Call for Probe into Disturbing Work Conditions in AI Content Moderation for OpenAI

The petitioners were regularly exposed to harmful content without adequate psychological support

Content moderators in Kenya who helped train OpenAI’s ChatGPT have petitioned the country’s National Assembly, calling for an investigation into the operations of companies like Samasource, which is registered in Kenya and to which big tech companies like Google, Meta, and OpenAI outsource their content moderation and AI work. The petition, shared with MediaNama by digital rights advocate Mercy Sumbi, sheds light on the working conditions of young Kenyan workers employed to label a wide range of internet content as toxic and harmful for ChatGPT’s training. Samasource, headquartered in San Francisco, employs workers to label and filter data and content for big tech companies.

What are the issues raised by Kenyan employees?

The petition reveals significant details about the work Kenyan content moderators have done to train OpenAI’s AI models since 2021, when the company partnered with Samasource Kenya. The petitioners were engaged on temporary contracts with Sama to train ChatGPT, work that involved “reading and viewing material that depicted sexual and graphic violence and categorizing it”. This meant the workers were regularly exposed to content including “acts of bestiality, necrophilia, incestuous sexual violence, rape, defilement of minors, self-harm (e.g. suicide), and murder”, among others.

The petitioners highlight that the nature of the job and the work they would undertake were not sufficiently described in their contracts. They were regularly exposed to harmful content without adequate psychological support, and many workers developed “severe mental illnesses including PTSD, paranoia, depression, anxiety, insomnia, sexual dysfunction”. When the contract between Sama and OpenAI abruptly ended, the workers were sent home without their pending dues and without any medical care for the toll the job had taken on their mental health.


An investigation by Time earlier this year revealed how OpenAI employed Kenyan workers to label tens of thousands of snippets of text from the “darkest recesses of the internet,” depicting violence, hate speech, and sexual abuse. These labeled samples were used to train ChatGPT’s models, helping the chatbot learn to identify and filter such content. The investigation also uncovered that the data labelers employed by Sama for OpenAI were paid low wages, ranging from around $1.32 to $2 per hour, depending on seniority and performance.


The petitioners emphasize that the outsourcing model employed by big tech companies from the US often undermines Kenyan citizens’ right to protection from exploitation and fails to provide safe employment conditions. They also complain that the workers are paid poorly and are mostly “disposed of at will”.

Why it matters:

The petition surfaces issues around the fast-paced deployment of AI that remain underexamined in narratives focused only on the benefits of algorithm-based tools and the harms to their end users. The working conditions highlighted by the Kenyan workers, and how their rights are affected in the process of developing AI, are central to the debate over holding AI developers and the companies that deploy their systems accountable. Whether the involvement is direct or indirect, one must also ask who ultimately benefits from such operations, and at what cost. As countries move to regulate AI and AI businesses through risk-based and rights-based approaches, the case put forth by the Kenyan workers points to the areas of intervention a comprehensive regulatory approach would need to cover.

What are petitioners asking for?

According to the petition reviewed by MediaNama, the petitioners have appealed for:

  1. An investigation into the nature of work and working conditions of Kenyan employees at companies like Samasource.
  2. An interrogation of the role of the Ministry of Labour in protecting Kenyan youth working for Sama or other companies on behalf of tech companies outside Kenya.
  3. Recommendations to prevent the exploitation of workers, including the withdrawal of licenses from companies that enable the exploitation of Kenyan employees.
  4. A law to regulate the outsourcing of “harmful and dangerous” tech work and to protect workers engaged in such work arrangements.
  5. An amendment to the country’s Employment Act 2007 to offer protection to workers engaged in outsourced work.
  6. A definition of exposure to harmful content as an “occupational hazard” in relevant laws.

Observations by Kenyan Courts in a Complaint against Meta:

In June this year, a Kenyan employment court ordered Meta to provide “proper medical, psychiatric and psychological care” to content moderators in Nairobi who screened content for Facebook, as per a report by the Guardian. While the case dealt with Facebook’s move to declare around 260 such screeners in Nairobi “redundant”, it reflects a growing discontent among workers who underwent traumatic experiences while screening toxic content under tight timelines without adequate psychological support. According to the report, a Kenyan court has also ruled that Meta, not Sama, was the primary or principal employer of the Nairobi workers, with Sama acting merely as an agent, since the work done by these moderators was provided by and ultimately served Meta.




Written By

Curious about the intersection of technology with education, caste and welfare rights. For story tips, please feel free to reach out at sarasvati@medianama.com


MediaNama is the premier source of information and analysis on Technology Policy in India. More about MediaNama, and contact information, here.

© 2008-2021 Mixed Bag Media Pvt. Ltd. Developed By PixelVJ
