Opinion

12 Jan 2021

Author:
Aubrey Calaway, Earth Refuge

Artificial intelligence and the fight against human trafficking

The phrase artificial intelligence (AI) may conjure images of futuristic technology and human-like robots. But AI is already working all around us, and it could become a key player in the fight against human trafficking.

AI and machine learning are essentially about patterns. By feeding large amounts of information into a program, the machine can then trace relationships between data points that are too complex for the human brain to identify. After “learning” from historic data, AI can make predictions about new, unseen cases. When it comes to anti-trafficking initiatives, this predictive capability promises both exciting innovation and potential pitfalls.
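To make that idea concrete, here is a minimal, purely illustrative sketch in Python: a classifier is fitted on a handful of invented, labeled historic records and then scores a new case. The feature values, labels, and their meanings are made up for demonstration and do not come from any real anti-trafficking dataset.

```python
# Illustrative only: a toy model that "learns" from labeled historic records
# and then estimates the risk of a new, unseen case.
from sklearn.ensemble import RandomForestClassifier

# Hypothetical historic cases: each row is numeric features describing a case
# (e.g. recruitment fees paid, weekly hours, wage-to-promised-wage ratio),
# and each label marks whether exploitation was later confirmed (1) or not (0).
historic_features = [
    [450, 78, 0.3],
    [0,   40, 1.0],
    [600, 90, 0.2],
    [50,  45, 0.9],
]
historic_labels = [1, 0, 1, 0]

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(historic_features, historic_labels)

# A new, unlabeled case: the model estimates how closely it resembles
# the historic exploitation cases it was trained on.
new_case = [[300, 70, 0.4]]
print(model.predict_proba(new_case)[0][1])
```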

Forced labor in South Asia’s brick industry presents one clear opportunity for an artificial intelligence-powered intervention. The Brick Belt, an area of land extending across Pakistan, northern India, Nepal and Bangladesh, is plagued by extreme labor exploitation among its estimated 55,387 brick kilns. One exploratory study conducted at the University of Nottingham demonstrates how machine learning could be used to map labor abuses in this vast, largely unregulated region more effectively.

Researchers trained their program to recognize the sizes, shapes and shadows associated with a typical brick kiln, as seen in the image below. As a result, it was able to identify 100% of kiln sites in a test area through a fully automated method. The technology still requires a human eye to correct inevitable mistakes, in this case non-kiln sites misidentified as kilns. But even without perfect accuracy, it could dramatically reduce the manual labor required to fully map a variety of industries associated with labor exploitation, from mines to mangrove fisheries.

Examples of training samples used in the brick kiln study. The top row shows images with brick kilns while the bottom shows non-kiln samples.
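As a rough illustration of the approach, and not the Nottingham team’s actual pipeline, the sketch below trains a binary classifier to separate “kiln” from “non-kiln” image tiles. The tiles here are random synthetic arrays standing in for labeled satellite patches, so the numbers it prints are meaningless; with real imagery, the same structure would report how often the model confuses the two classes.

```python
# Illustrative sketch only: a kiln vs. non-kiln tile classifier trained on
# stand-in data rather than real satellite imagery.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

# Stand-in data: 200 fake 32x32 greyscale tiles, flattened to feature vectors.
# In practice these would be labeled satellite image patches.
tiles = rng.random((200, 32 * 32))
labels = rng.integers(0, 2, size=200)  # 1 = kiln, 0 = non-kiln

X_train, X_test, y_train, y_test = train_test_split(
    tiles, labels, test_size=0.25, random_state=0
)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

# Human review would still be needed to catch false positives like the
# misidentified non-kiln sites described above.
print(classification_report(y_test, clf.predict(X_test)))
```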

The use of AI to predict labor exploitation risk is quickly moving from theory to reality. FRDM, a “social tech company” and project of anti-trafficking group Made in A Free World, helps its corporate clients monitor risk developments in their supply chains in real time. They do this, in part, by employing machine learning “crawlers” to scour the media for relevant news stories about labor abuses. Researchers speculate that AI could also help governments and NGOs predict changing patterns of exploitation.
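A heavily simplified sketch of that kind of media-monitoring step might look like the following. It is not FRDM’s system; the headlines, labels, and the new story being scored are invented for illustration.

```python
# Illustrative sketch: a simple text classifier that flags news snippets
# as relevant to labor abuse or not.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled headlines (1 = relevant to labor abuse, 0 = not).
headlines = [
    "Factory workers report withheld wages and confiscated passports",
    "Quarterly earnings beat analyst expectations",
    "Investigation finds child labor in cobalt supply chain",
    "New flagship phone launches next month",
]
labels = [1, 0, 1, 0]

classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(headlines, labels)

# A crawler could score each newly scraped article and surface the
# highest-risk stories to supply chain analysts.
new_story = ["Migrant workers allege forced overtime at garment supplier"]
print(classifier.predict_proba(new_story)[0][1])
```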

Those researchers include the team behind Apprise, a tool used to screen vulnerable workers for indicators of labor exploitation through their mobile phones. As perpetrators adjust their practices to avoid detection or increase profits, machine learning could allow Apprise and others to identify these changes in their data much earlier. While AI does not fix human error, such as poor-quality screening responses from workers on the ground, it could help anti-trafficking efforts stay on top of developing trends.
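One very simple way such a shift could be surfaced is to compare how often each indicator appears in a recent window of screening responses against a historical baseline. The sketch below uses invented screening data, made-up indicator names, and an arbitrary 15-point threshold purely to show the shape of that check.

```python
# Illustrative sketch only: flagging when the mix of exploitation indicators
# reported by screened workers shifts over time.
from collections import Counter

# Hypothetical screening responses: which indicator each worker reported.
baseline = ["withheld_wages"] * 60 + ["confiscated_documents"] * 30 + ["debt_bondage"] * 10
recent   = ["withheld_wages"] * 35 + ["confiscated_documents"] * 25 + ["debt_bondage"] * 40

def rates(responses):
    counts = Counter(responses)
    total = len(responses)
    return {indicator: count / total for indicator, count in counts.items()}

baseline_rates, recent_rates = rates(baseline), rates(recent)

# Flag any indicator whose share of reports moved by more than 15 points,
# a possible sign that perpetrators have changed tactics.
for indicator in set(baseline_rates) | set(recent_rates):
    shift = recent_rates.get(indicator, 0) - baseline_rates.get(indicator, 0)
    if abs(shift) > 0.15:
        print(f"Review trend: {indicator} shifted by {shift:+.0%}")
```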

Facial recognition technology, and its deployment against child sex trafficking, has generated significant buzz in the past few years. Demi Moore and Ashton Kutcher lent their celebrity status to this effort by founding Thorn, a tech-oriented organization dedicated to defending children from sexual exploitation. Their flagship program, Spotlight, uses Amazon’s proprietary facial recognition technology to help track missing children, in part by matching them with online sex ads on the dark web. According to Thorn, Spotlight has been used to identify about 10 juvenile victims per day and over 16,000 traffickers in total since its launch in 2014.
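For context on what a face-comparison call looks like in practice, the snippet below uses Amazon Rekognition’s face comparison API via the boto3 library. It is not Spotlight’s implementation: the image file names are placeholders, and the 90% similarity threshold is an arbitrary choice. Running it requires AWS credentials and appropriate permissions.

```python
# Illustration of Amazon Rekognition's CompareFaces API via boto3; not
# Thorn's Spotlight pipeline, just the kind of call such a system could
# build on.
import boto3

rekognition = boto3.client("rekognition")

# Placeholder file names for a reference photo and a candidate image.
with open("reference_photo.jpg", "rb") as source, open("candidate_image.jpg", "rb") as target:
    response = rekognition.compare_faces(
        SourceImage={"Bytes": source.read()},
        TargetImage={"Bytes": target.read()},
        SimilarityThreshold=90,  # only return matches above 90% similarity
    )

for match in response["FaceMatches"]:
    print(f"Possible match, similarity: {match['Similarity']:.1f}%")
```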

While these numbers are promising, the use of AI-powered facial recognition technology is hotly contested. In a 2018 test, the American Civil Liberties Union (ACLU) found that Amazon’s Rekognition software incorrectly matched 28 members of Congress to images of other individuals who had been arrested for a crime. Notably, it misidentified a disproportionate number of people of color.

Another study found that several commercial facial analysis systems misclassified darker-skinned women more often than any other group, with an error rate of 35.7%, compared to 0.8% for lighter-skinned men. This may be due to a strong bias in the types of faces that these technologies have historically been trained on. As a result, anti-trafficking initiatives must be cognizant that their AI-powered tools, if built on a database of largely white and male images, could be less effective at identifying female victims of color.
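Auditing for this kind of disparity is straightforward in principle: group a system’s predictions by demographic subgroup and compare error rates. The toy sketch below does exactly that with invented records and subgroup names, purely to show the calculation.

```python
# Illustrative sketch: computing a face analysis system's error rate per
# demographic subgroup. The records below are invented for demonstration.
from collections import defaultdict

records = [
    # (subgroup, prediction_correct)
    ("lighter_male", True), ("lighter_male", True), ("lighter_male", True),
    ("darker_female", True), ("darker_female", False), ("darker_female", False),
]

totals, errors = defaultdict(int), defaultdict(int)
for subgroup, correct in records:
    totals[subgroup] += 1
    if not correct:
        errors[subgroup] += 1

# A large gap between subgroups signals the kind of bias described above.
for subgroup in totals:
    print(f"{subgroup}: error rate {errors[subgroup] / totals[subgroup]:.1%}")
```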

The ACLU and others have also sounded the alarm about who has access to this technology and how they might employ it. While law enforcement agencies have had access to facial recognition tools for decades, their image databases have largely been restricted to government sources like mugshots and drivers’ licenses. But companies like the now-infamous Clearview AI have begun to provide federal and state law enforcement with multi-billion-image databases scraped from websites like Facebook and YouTube. This kind of big data could, in theory, help programs like Thorn’s Spotlight crawl the dark web more effectively to identify victims of trafficking for sexual exploitation. But safeguards against individual officers or entire agencies exploiting it for darker purposes, such as surveilling protesters or consensual sex workers, are lacking.

Artificial intelligence promises to be a powerful tool in the global fight against various forms of human trafficking. By mapping patterns and making predictions using huge amounts of data, machine learning could help activists, governments, and corporations more effectively tackle this ever-evolving issue. But AI itself is not a panacea, and its real-world implementation may give rise to a variety of ethical and practical problems. As this technology becomes more accessible, anti-trafficking efforts should remain on alert for the serious bias and privacy concerns that come with smarter tools.

Aubrey Calaway is a writer and researcher who has investigated issues of climate change, human trafficking, and community resilience. She currently works as a correspondent at Earth Refuge.