

Opinion

15 Nov 2023

Authors: Sani Suleiman, Khadija El-Usman

Inside the hidden battles of Africa’s gig workers: The experience of content moderators in Kenya


In Africa, the gig economy is growing rapidly, offering workers greater flexibility in how and where they work. Although gig work is often characterised as unstable and contingent, more Africans have taken it on in recent years. According to the MasterCard Foundation's research on Digital Commerce and Youth Employment in Africa, the gig economy is growing at an average rate of 20% per year and is expected to reach 80 million workers by 2030.

Content moderation is one of the most popular forms of gig work and is vital in shaping a safe and productive digital environment. This rings especially true for content moderators in Kenya, with whom we had the privilege to converse. Our conversation unveiled the many challenges and unfair treatment they are subjected to, including misleading job descriptions, unrealistic metrics, inadequate support, micromanagement, confidentiality concerns, pay disparities and threats against those advocating for fair treatment. These critical human rights concerns demand urgent attention.

A content moderator's journey in Kenya often begins with a misleading job description. Potential candidates are typically lured in by advertisements for call centre agent positions, particularly roles covering additional local languages such as Zulu and Hausa. Once they accept the job, however, they find their responsibilities are far more demanding. Moderators are entrusted with the critical task of reviewing and flagging content that violates social media platforms' community guidelines. This content can range from graphic violence to hate speech and child sexual abuse material.

Once employed, moderators often grapple with stringent and, at times, unrealistic metrics. For instance, they were initially expected to review and moderate each piece of content within 70 seconds, a limit later reduced to 50 seconds. Within this window, moderators must read the content, identify violations, make decisions, and apply labels. Performance is measured through indicators such as Average Handling Time (AHT) or No Handling Time (NHT). These expectations apply to all content regardless of its length, pushing moderators to rush through their tasks and ultimately increasing the risk of errors.

Micromanagement and confidentiality concerns are prevalent in the lives of content moderators. Supervisors maintain close scrutiny over their work and productivity, creating a stressful and often hostile working environment. Moderators are also asked to sign non-disclosure agreements, and personal items such as phones are prohibited in the workplace. Additionally, there may be no transparent process for reporting labour law violations or workplace safety concerns. “Instances have been noted where supervisors or team leaders are asking moderators to come to the office when applying for sick leave,” said one of the Kenyan moderators we spoke with.

Furthermore, concerns regarding the confidentiality of communications with company counsellors add to the complexity. These counsellors are often inadequately trained to handle the emotional trauma experienced by moderators exposed to distressing content. For example, the Facebook contractor, US-based Sama, did little to provide post-traumatic professional counselling to moderators in its Nairobi office. This has left moderators seeking solace in alternative ways, such as immersing themselves in religion.

Content moderators in Africa are also paid significantly less than their counterparts in Europe and the US, even though they perform the same work and follow the same platform guidelines. Moderators who speak out for fairer pay often face threats from their employers, including blacklisting, job loss and, in some cases, legal action. Consider the case of Daniel Motaung, a content moderator in Kenya who in 2019 spearheaded a protest for equal pay and equal rights against Samasource, popularly known as Sama, a training data company focused on annotating and validating data for machine learning algorithms. The protest, later reported by Time Magazine, sought to address these disparities and initiate the formation of a union. The effort faced strong opposition, however, leading to Daniel's dismissal and threats against other advocates. The remaining moderators were given an ultimatum: sign an agreement distancing themselves from the protesters or face job loss.

In a further disheartening turn of events, content moderators at Sama, Meta's content moderation partner in Africa, were made redundant in January 2023. This led to the loss of over 200 content moderation jobs at the Nairobi office, particularly affecting those handling additional languages such as Hausa and Zulu. Forty-three of these moderators filed a lawsuit challenging their dismissal by Sama and Meta. While the legal battle was ongoing, Meta transitioned its moderation business to another outsourcing company, Majorel. The dismissed moderators found themselves blacklisted from seeking employment with Majorel, pushing them to pursue legal action to secure their right to work. At the time of writing, they were filing a constitutional petition in Kenya's Employment and Labour Relations Court against Facebook, Sama and Majorel, citing unlawful discrimination in retaliation against employees who sought better working conditions.

Given that the majority of these platforms are headquartered in the Global North, particularly in the United States and Europe, it is imperative for governments and businesses within these jurisdictions to collaborate to foster equitable labour practices in the Global South. Such collaboration should encompass investments in education, training initiatives and the financial backing of advocacy programmes designed for gig workers. Technical assistance should also be extended to help gig companies improve their working conditions. African governments, particularly those with established National Action Plans on Business and Human Rights (NAP-BHR), should incorporate gig labour into their frameworks. Jurisdictions without such plans should likewise address gig labour concerns as they develop their NAPs and other relevant national instruments. This comprehensive approach will play a pivotal role in mitigating the exploitative nature of gig work and preventing further injustices.

It is also crucial for civil society organisations to continue advocating for greater transparency and accountability from gig platforms. This could involve requiring platforms to disclose how they set pay rates and performance metrics, ensuring wages reflect the effort required and the economic realities gig workers face, and giving workers a way to appeal company decisions in cases of labour and human rights violations. This is especially important in the African gig sector, where most host governments lack policies that uphold international labour rights for gig workers.

By Sani Suleiman and Khadija El-Usman