Article

28 Apr 2020

Author:
Nicolas Kayser-Bril, AlgorithmWatch

Unchecked use of computer vision by police carries high risks of discrimination

Eleven local police forces in Europe use computer vision to automatically analyze images from surveillance cameras. The risks of discrimination run high, but authorities ignore them... This approach requires that software developers feed the system large numbers of scenes depicting normality, and others representing situations considered abnormal... The software... only draws inferences from the data it has been given... AlgorithmWatch revealed that Google Vision... classified a thermometer as a “tool” when held in a hand with a light skin tone, and as a “gun” when held in a dark-skinned one. (Google has since changed its system)...
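To make the point concrete, here is a minimal, purely illustrative sketch in Python, not any vendor's code: a toy classifier trained on scenes labelled “normal” or “abnormal” can only reproduce whatever patterns are present in the data it was given, including spurious correlations with attributes that have nothing to do with the behaviour being flagged. All feature names and numbers below are assumptions made up for the example.

```python
# Illustrative sketch only: a classifier learns whatever correlations
# exist in its training data, relevant or not.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000

# Hypothetical features: 'behaviour' is genuinely related to the event being
# flagged; 'proxy' is an irrelevant attribute (e.g. a stand-in for appearance)
# that happens to correlate with the "abnormal" label in the collected data.
behaviour = rng.normal(size=n)
proxy = rng.integers(0, 2, size=n)
label = ((behaviour + 1.5 * proxy + rng.normal(scale=0.5, size=n)) > 1.5).astype(int)

X = np.column_stack([behaviour, proxy])
model = LogisticRegression().fit(X, label)

# The learned weights show the model leaning on the proxy attribute as well,
# because that is what the training data contains.
print(dict(zip(["behaviour", "proxy"], model.coef_[0].round(2))))
```

In such a setup the model ends up assigning real weight to the proxy attribute even though it was never told to use it, which is why “we do not use skin tone as a variable” does not by itself rule out discriminatory outputs.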

BriefCam... stated in an email that, because the software did not use skin tone as a variable, it could not discriminate... BriefCam’s spokesperson added that they used “training datasets consisting of multi-gender, multi-age and multi-race samples without minority bias,” but declined to provide any evidence or details... A spokesperson for Fraunhofer IOSB, which powers the automated surveillance of Mannheim, Germany, claimed that their software could not be discriminatory because it relied on a three-dimensional modelling of body shapes. It analyzed movements, not images, and therefore did not use skin tone... Avigilon declined to comment. One Télécom, Two-I and Snef did not reply to numerous emails.

... How much automated surveillance impacts discrimination in policing is not known. None of the vendors or cities contacted by AlgorithmWatch had conducted audits to ensure that the output of their systems was the same for all citizens... [In response to the issue] Nicole Romain, spokesperson for the Agency for Fundamental Rights of the European Union, wrote that any institution deploying such technologies should conduct a “comprehensive fundamental rights impact assessment to identify potential biases”...
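For readers unfamiliar with what such an audit might involve, the following is a minimal sketch, assuming a held-out set of footage annotated for the audit; it is not any city's or vendor's actual procedure. The idea is simply to compare how often the system flags people from different demographic groups under comparable conditions; the group labels, function name and example values are all hypothetical.

```python
# A minimal disparity-audit sketch: compare the rate at which the system
# raises an "abnormal" flag across demographic groups on the same footage.
import numpy as np

def flag_rate_by_group(flags: np.ndarray, groups: np.ndarray) -> dict:
    """Share of flagged detections per demographic group."""
    return {g: float(flags[groups == g].mean()) for g in np.unique(groups)}

# Hypothetical evaluation data: one row per detected person, with the system's
# binary flag and a group label collected only for the purpose of the audit.
flags = np.array([1, 0, 0, 1, 1, 0, 0, 0, 1, 1])
groups = np.array(["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"])

print(flag_rate_by_group(flags, groups))
# A large gap between groups on comparable scenes would call for investigation.
```

Even this simple comparison requires deciding which groups to measure, collecting annotated evaluation footage, and committing to act on the results, which is presumably part of what the “comprehensive fundamental rights impact assessment” recommended above would entail.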