Article

19 Dec 2019

Author:
Drew Harwell, Washington Post

USA: Federal study confirms racial bias of many facial-recognition systems, casts doubt on their expanding use

Facial-recognition systems misidentified people of color more often than white people, a landmark federal study released... shows, casting new doubts on a rapidly expanding investigative technique widely used by law enforcement across the United States... Asian and African American people were up to 100 times as likely to be misidentified as white men... Native Americans had the highest false-positive rate of all ethnicities... The study could fundamentally shake one of American law enforcement's fastest-growing tools for identifying criminal suspects and witnesses, which privacy advocates have argued is ushering in a dangerous new wave of government surveillance tools.

... The federal report confirms previous studies from researchers who found similarly staggering error rates. Companies such as Amazon had criticized those studies, saying they reviewed outdated algorithms or used the systems improperly... [Researcher] Joy Buolamwini said the study was a “comprehensive rebuttal” to skeptics of what researchers call “algorithmic bias.”... "[This study is] a sobering reminder that facial recognition technology has consequential technical limitations alongside posing threats to civil rights and liberties.”