
Report

28 May 2025

Author:
Equidem (UK)

Scroll. Click. Suffer: The Hidden Human Cost of Content Moderation and Data Labelling

... Launched on May 28, 2025, ‘Scroll. Click. Suffer.’ is Equidem’s in-depth investigation into the hidden workforce behind AI and social media—data labellers and content moderators. Based on interviews with 113 workers across Colombia, Kenya, and the Philippines, the report exposes the extreme occupational, psychological, sexual, and economic harms faced by those moderating violent content and training AI models for platforms like Meta, TikTok, and ChatGPT.

From PTSD and substance dependency to union-busting and exploitative contracts, it reveals an industry architecture that normalises harm and conceals abuse. Tech giants outsource risk down opaque, informal supply chains, while workers are silenced by NDAs, punished for speaking out, and denied basic protections.

Grounded in worker testimonies and legal analysis, the report outlines a clear roadmap for reform—and calls on lead firms, investors, governments, and the ILO to centre the rights of digital workers...

[Equidem shared the report's findings with ByteDance, Meta, OpenAI, and Remotasks. ByteDance did not respond to Equidem. Meta said it takes “the support of content reviewers seriously” and that it “requires all the companies” it works with to “provide 24/7 on-site support with trained practitioners” and other safety measures. OpenAI said it conducted an investigation and found no evidence to support the claims. Remotasks also outlined safety measures in place, including a “24/7 anonymous hotline”, and noted “ongoing improvements” it says the firm has undertaken. The responses can be read in more detail in the report.]
