Migrant data workers employed by third-party cos. for multinational tech firms allege rights abuse & "constant" exposure to traumatic content; incl. Meta, OpenAI & Remotasks comments
In May 2025, human rights organisation Equidem released a report exploring the impacts of exposure to “images of violence and exploitation” for data enrichment workers, including content moderators and data labellers.
The research investigated the experiences of 116 data workers in Colombia, Kenya and the Philippines, including workers employed by third-party business process outsourcing (BPO) companies serving multinational technology firms such as Meta, ByteDance, Remotasks and OpenAI.
The research found multiple labour rights violations experienced by these workers, including:
- Occupational health and safety violations: the report says workers endure “constant exposure” to traumatic content without the required health protections, which it argues violates human rights due diligence standards. Harms include psychological harms, such as severe adverse mental health impacts including suicidal ideation; sexual harms, including the failure to address workplace sexual harassment; and physical harms, including exhaustion, eye strain and insomnia.
- Unreasonable working hours, overwork and the denial of leave: Workers told Equidem they were required to work up to 20 hours a day and to respond to 1,000 images a day. In some cases, workers allegedly face pay cuts for not meeting targets, and are penalised for taking leave, including sick leave.
- Precarious employment: The report says workers are subjected to unfair wage deductions, low wages, no fixed salary, and unstable employment, including forced periods of employment without pay. Workers are also hired on a short-term or project-by-project basis for an hourly rate, in some cases without contracts.
- Barriers to accessing remedy: Supervisors allegedly faced retaliation for advocating on behalf of workers. Non-disclosure agreements also prevent workers from speaking out about their working conditions.
- Denial of freedom of association: This includes data workers in Colombia, Kenya and the Philippines saying they faced retaliation for organising.
“The labour of screening endless streams of violent, sexually explicit, and traumatic content isn’t just exposure to violence – it is a form of violence in itself.”
- Equidem, "Scroll. Click. Suffer: The Hidden Human Cost of Content Moderation and Data Labelling"
Equidem shared the report's findings with ByteDance, Meta, OpenAI, and Remotasks. ByteDance did not respond to Equidem. Meta said it takes “the support of content reviewers seriously” and that it “requires all the companies” it works with to “provide 24/7 on-site support with trained practitioners” and other safety measures. OpenAI said it conducted an investigation and found no evidence to support the claims. Remotasks also outlined safety measures in place, including a “24/7 anonymous hotline”, and pointed to “ongoing improvements” it says the firm has undertaken. The responses can be read in more detail in the report linked below.