Article

28 Jan 2019

Author:
Joy Buolamwini, Medium

Commentary: Amazon should halt use of facial recognition technology for policing & govt. surveillance


In this article... I... address... criticisms... made by those with an interest in keeping the use, abuse, and technical immaturity of AI systems in the dark... AI services the company provides to law enforcement and other customers can be abused regardless of accuracy... The most concerning uses of facial analysis technology involve the bolstering of mass surveillance, the weaponization of AI, and harmful discrimination in law enforcement contexts... Because this powerful technology is being rapidly developed and adopted without oversight, the Algorithmic Justice League and the Center on Privacy & Technology launched the Safe Face Pledge. The pledge prohibits lethal use of any kind of facial analysis technology, including facial recognition, and aims to mitigate abuses.

As an expert on bias in facial analysis technology, I advise Amazon to:

1) immediately halt the use of facial recognition and any other kind of facial analysis technology in high-stakes contexts like policing and government surveillance

2) submit the company's models currently in use by customers to the National Institute of Standards and Technology (NIST) for benchmarking
