


19 Jun 2018

Jamie Condliffe, The New York Times

Amazon urged not to sell facial recognition software to police


A group of 19 socially responsible investors, including firms like Sustainvest Asset Management and the Social Equity Group, are applying pressure to Amazon over privacy concerns that they have about the technology... Amazon began marketing a facial recognition system, called Rekognition, to law enforcement agencies as a means of identifying suspects shortly after the tool was introduced in 2016... [R]ecently, Amazon came under criticism from the American Civil Liberties Union and a group of more than two dozen civil rights organizations for selling the technology to police authorities. The A.C.L.U.’s argument: The police could use such systems not just to track people committing crimes but also to identify citizens who are innocent, such as protesters... In a letter addressed to the company’s chief executive... a group of investors explained why they want a halt to Rekognition sales to the police... Amazon had no immediate comment on the letter. In a blog post published shortly after the initial call by the A.C.L.U. to ban the sale of Rekognition to the police, Matt Wood, general manager of artificial intelligence at Amazon Web Services, wrote: "We believe it is the wrong approach to impose a ban on promising new technologies because they might be used by bad actors for nefarious purposes in the future."