28 May 2025
Scroll. Click. Suffer: The Hidden Human Cost of Content Moderation and Data Labelling
Summary
Date Reported: 28 May 2025
Location: Colombia
Companies
Meta (formerly Facebook) - Client , ByteDance - Client
Other
Not Reported ( Technology: Software & web-based/digital services ) - Employer
Affected
Total individuals affected: Number unknown
Migrant & immigrant workers: ( Number unknown - Location unknown , Technology: Software & web-based/digital services , Men , Unknown migration status )
Issues
Mental Health , Intimidation , Excessive production targets , Occupational Health & Safety , Rape & sexual abuse , Access to Non-Judicial Remedy , Reasonable Working Hours & Leisure Time , Denial of leave , Contract Substitution , Access to Information , Freedom of Expression
Response
Response sought: Yes, by Journalist
External link to response: (Find out more)
Action taken: Equidem shared the report's findings with ByteDance & Meta. Meta said it takes “the support of content reviewers seriously” and that it “requires all the companies” it works with to “provide 24/7 on-site support with trained practitioners” and other safety measures. The responses can be read in more detail in the report.
Source type: NGO
Summary
Date Reported: 28 May 2025
Location: Ghana
Companies
ByteDance - Client , Meta (formerly Facebook) - Client
Other
Not Reported ( Technology: Software & web-based/digital services ) - Employer
Affected
Total individuals affected: Number unknown
Migrant & immigrant workers: ( Number unknown - Location unknown , Technology: Software & web-based/digital services , Men , Unknown migration status )
Issues
Access to Non-Judicial Remedy , Mental Health , Excessive production targets , Denial of leave , Intimidation
Response
Response sought: Yes, by Journalist
External link to response: (Find out more)
Action taken: Equidem shared the report's findings with ByteDance & Meta. Meta said it takes “the support of content reviewers seriously” and that it “requires all the companies” it works with to “provide 24/7 on-site support with trained practitioners” and other safety measures. The responses can be read in more detail in the report.
Source type: NGO
Summary
Date Reported: 28 May 2025
Location: Kenya
Companies
Meta (formerly Facebook) - Client , OpenAI - Client
Other
Not Reported ( Technology: Software & web-based/digital services ) - Employer
Affected
Total individuals affected: Number unknown
Migrant & immigrant workers: ( Number unknown - Location unknown , Technology: Software & web-based/digital services , Gender not reported , Unknown migration status )
Issues
Freedom of Association , Right to Unionisation , Dismissal , Access to Information , Reasonable Working Hours & Leisure Time , Denial of leave
Response
Response sought: Yes
External link to response: https://equidem.org/wp-content/uploads/2025/05/Equidem-Data-Workers-Report-May-29.pdf
Action taken: Equidem shared the report's findings with Meta & OpenAI. Meta said it takes “the support of content reviewers seriously” and that it “requires all the companies” it works with to “provide 24/7 on-site support with trained practitioners” and other safety measures. OpenAI said it conducted an investigation and found no evidence to support the claims. The responses can be read in more detail in the report.
Source type: NGO
Summary
Date Reported: 28 May 2025
Location: Kenya
Companies
Remotasks (part of Scale AI) - Employer
Affected
Total individuals affected: Number unknown
Workers: ( Number unknown - Location unknown - Sector unknown , Gender not reported )
Issues
Excessive production targets , Irregular Work , Social Security , Access to Non-Judicial Remedy , Occupational Health & Safety
Response
Response sought: Yes, by Journalist
External link to response: (Find out more)
Action taken: Remotasks' response can be read in full in the report.
Source type: NGO
... Launched on May 28, 2025, ‘Scroll. Click. Suffer.’ is Equidem’s in-depth investigation into the hidden workforce behind AI and social media—data labellers and content moderators. Based on interviews with 113 workers across Colombia, Kenya, and the Philippines, the report exposes the extreme occupational, psychological, sexual, and economic harms faced by those moderating violent content and training AI models for platforms like Meta, TikTok, and ChatGPT.
From PTSD and substance dependency to union-busting and exploitative contracts, it reveals an industry architecture that normalises harm and conceals abuse. Tech giants outsource risk down opaque, informal supply chains, while workers are silenced by NDAs, punished for speaking out, and denied basic protections.
Grounded in worker testimonies and legal analysis, the report outlines a clear roadmap for reform—and calls on lead firms, investors, governments, and the ILO to centre the rights of digital workers...
[Equidem shared the report's findings with ByteDance, Meta, OpenAI, and Remotasks. ByteDance did not respond to Equidem. Meta said it takes “the support of content reviewers seriously” and that it “requires all the companies” it works with to “provide 24/7 on-site support with trained practitioners” and other safety measures. OpenAI said it conducted an investigation and found no evidence to support the claims. Remotasks also outlined safety measures in place, including a “24/7 anonymous hotline”, and noted “ongoing improvements” it says the firm has undertaken. The responses can be read in more detail in the report.]