Story

5 Dec 2019

EU Fundamental Rights Agency looks at human rights implications of using facial recognition technology in law enforcement

Facial recognition technology (FRT) makes it possible to compare digital facial images to determine whether they are of the same person. Private companies and public authorities are increasingly using this technology, and several EU Member States are considering, testing or planning to use it for law enforcement purposes. While the accuracy of the technology is improving, a real risk of errors remains, particularly for minority groups. Moreover, people may not be aware that their images are being captured and processed, and thus cannot challenge possible misuses.

A new paper by the European Union Agency for Fundamental Rights outlines and analyses these and other fundamental rights challenges posed by the use of FRT for law enforcement purposes, and presents recommendations to help avoid rights violations.