Shareholders & civil society groups urge Amazon to halt sale of facial recognition software to law enforcement agencies

All components of this story

Article
12 July 2018

Amazon workers oppose sale of the company's technology to US immigration

Author: Anthony Cuthbertson, The Independent

"AMAZON WORKERS 'REFUSE' TO BUILD TECH FOR US IMMIGRATION, WARNING JEFF BEZOS OF IBM'S NAZI LEGACY", 22 June 2018

[Amazon workers] have written to the company's CEO, Jeff Bezos, to protest the sale of facial recognition tools and other technology to police departments and government agencies. [They] cite the use of Amazon technology by the US Department of Homeland Security and the Immigration and Customs Enforcement (ICE) agency, which have been criticised for enforcing Donald Trump's "zero tolerance" policy, which has seen parents separated from their children at the US border... The letter comes days after 19 Amazon shareholders wrote to Mr Bezos urging him to halt the sale of facial recognition to police and government agencies... The American Civil Liberties Union revealed (...) how Amazon is marketing its powerful AWS Rekognition tool to law enforcement agencies, a practice the workers claim is making the firm complicit in alleged human rights abuses... In their letter to Mr Bezos, the Amazon workers said they would "refuse to build the platform" which powers ICE, or any technology used to violate human rights... Amazon refused to provide a comment on the record about the letter...

Read the full post here

Article
9 July 2018

Commentary: When the robot doesn't see dark skin

Author: Joy Buolamwini, The New York Times

When I was a college student using A.I.-powered facial detection software for a coding project, the robot I programmed couldn’t detect my dark-skinned face. I had to borrow my white roommate’s face to finish the assignment... My experience is a reminder that artificial intelligence, often heralded for its potential to change the world, can actually reinforce bias and exclusion... A.I. systems are shaped by the priorities and prejudices — conscious and unconscious — of the people who design them, a phenomenon that I refer to as “the coded gaze.” Research has shown that automated systems that are used to inform decisions about sentencing produce results that are biased against black people and that those used for selecting the targets of online advertising can discriminate based on race and gender.

... Canada has a federal statute governing the use of biometric data in the private sector. Companies like Facebook and Amazon must obtain informed consent to collect citizens’ unique face information. In the European Union, Article 9 of the General Data Protection Regulation requires express affirmative consent for collection of biometrics from E.U. citizens. Everyday people should support lawmakers, activists and public-interest technologists in demanding transparency, equity and accountability in the use of artificial intelligence that governs our lives.

Read the full post here

Article
22 June 2018

Amazon shareholders ask company to stop selling facial recognition technology to governments following NGO warnings

Author: Jeremy White, The Independent

"Amazon shareholders demand company stop selling facial recognition technology to governments", 19 June 2018

A group of Amazon shareholders is asking CEO Jeff Bezos to stop selling and marketing pattern-recognition technology to governments after civil liberties groups warned of the potential for abuse. Earlier this year, a group of advocacy organisations led by the American Civil Liberties Union (ACLU) published a report detailing how Amazon was marketing its Rekognition tool to American law enforcement agencies... A letter signed by 19 shareholders - and provided to The Independent by the ACLU - urges Mr Bezos to halt the tool's expansion until those concerns can be addressed... "While Rekognition may be intended to enhance some law enforcement activities, we are deeply concerned it may ultimately violate civil and human rights", the letter said. "We are concerned the technology would be used to unfairly and disproportionately target and surveil people of colour, immigrants, and civil society organisations"... Shareholders of major technology companies have increasingly been broadcasting their concerns about the financial and reputational damage of privacy violations.

Read the full post here

Article
20 June 2018

Amazon general manager of artificial intelligence highlights positive uses of Rekognition & acceptable use policy

Author: Dr. Matt Wood, Amazon Web Services Machine Learning Blog

"Some quick thoughts on the public discussion regarding facial recognition and Amazon Rekognition this past week," 1 June 2018

Amazon Rekognition... makes use of new technologies – such as deep learning – and puts them in the hands of developers in an easy-to-use, low-cost way... [W]e have seen customers use the image and video analysis capabilities of Amazon Rekognition in ways that materially benefit both society (e.g. preventing human trafficking, inhibiting child exploitation, reuniting missing children with their families...), and organizations (enhancing security through multi-factor authentication)... Amazon Web Services (AWS) is not the only provider of services like these, and we remain excited about how image and video analysis can be a driver for good in the world, including in the public sector and law enforcement. There has been no reported law enforcement abuse of Amazon Rekognition. We also have an Acceptable Use Policy (“AUP”) that prohibits the use of our services for “[a]ny activities that are illegal, that violate the rights of others, or that may be harmful to others.” This includes violating anybody’s Constitutional rights relating to the 4th, 5th, and 14th Amendments – essentially any kind of illegal discrimination or violation of due process or privacy right. Customers in violation of our AUP are prevented from using our services.

...There have always been and will always be risks with new technology capabilities... AWS takes its responsibilities seriously. But we believe it is the wrong approach to impose a ban on promising new technologies because they might be used by bad actors for nefarious purposes in the future.  
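
For context on what putting these capabilities "in the hands of developers" looks like in practice, below is a minimal sketch of a call to the Rekognition API via the AWS SDK for Python (boto3); the region, S3 bucket and image names are hypothetical placeholders, not details drawn from the coverage above.

# Minimal sketch: detecting faces with Amazon Rekognition via boto3.
# The region, bucket, and object names are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "crowd-photo.jpg"}},
    Attributes=["DEFAULT"],
)

# Print a bounding box and confidence score for each detected face.
for face in response["FaceDetails"]:
    print(face["BoundingBox"], round(face["Confidence"], 1))

The brevity of such a call is what the "easy-to-use, low-cost" framing in the post above refers to.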

Read the full post here

Article
19 June 2018

Amazon urged not to sell facial recognition software to police

Author: Jamie Condliffe, The New York Times

A group of 19 socially responsible investors, including firms like Sustainvest Asset Management and the Social Equity Group, are applying pressure to Amazon over privacy concerns that they have about the technology... Amazon began marketing a facial recognition system, called Rekognition, to law enforcement agencies as a means of identifying suspects shortly after the tool was introduced in 2016... [R]ecently, Amazon came under criticism from the American Civil Liberties Union and a group of more than two dozen civil rights organizations for selling the technology to police authorities. The A.C.L.U.’s argument: The police could use such systems not just to track people committing crimes but also to identify citizens who are innocent, such as protesters... In a letter addressed to the company’s chief executive... a group of investors explained why they want a halt to Rekognition sales to the police... Amazon had no immediate comment on the letter. In a blog post published shortly after the initial call by the A.C.L.U. to ban the sale of Rekognition to the police, Matt Wood, general manager of artificial intelligence at Amazon Web Services, wrote: "We believe it is the wrong approach to impose a ban on promising new technologies because they might be used by bad actors for nefarious purposes in the future."

Read the full post here

Article
18 June 2018

Letter from Nationwide Coalition to Amazon CEO Jeff Bezos regarding Rekognition

Author: American Civil Liberties Union (ACLU), American-Arab Anti-Discrimination Committee, Human Rights Watch, Witness & 63 other civil society groups

The undersigned coalition of organizations are dedicated to protecting civil rights and liberties and safeguarding communities. We write today to express our profound concerns about your company's facial recognition system, Rekognition. We demand that Amazon stop powering a government surveillance infrastructure that poses a grave threat to customers and communities across the country... As advertised, Rekognition is a powerful surveillance system readily available to violate rights and target communities of color... Amazon also encourages the use of Rekognition to monitor “people of interest,” raising the possibility that those labeled suspicious by governments—such as undocumented immigrants or Black activists—will be targeted for Rekognition surveillance.

... Amazon Rekognition is primed for abuse in the hands of governments. This product poses a grave threat to communities, including people of color and immigrants, and to the trust and respect Amazon has worked to build. Amazon must act swiftly to stand up for civil rights and civil liberties, including those of its own customers, and take Rekognition off the table for governments.

Read the full post here

Article
15 June 2018

Letter from shareholders to Amazon CEO Jeff Bezos regarding Rekognition

Author: As You Sow, Northwest Coalition for Responsible Investment, Walden Asset Management & 16 other investors

According to multiple news reports—confirmed by Amazon—our Company has developed and is selling to law enforcement agencies, marketed as part of Amazon Web Services (AWS), a facial recognition system called Rekognition. The undersigned Amazon shareholders are concerned such government surveillance infrastructure technology may not only pose a privacy threat to customers and other stakeholders across the country, but may also raise substantial risks for our Company, negatively impacting our company's stock valuation and increasing financial risk for shareholders. To date, we have seen no evidence of our Board of Directors conducting fiduciary oversight on how Rekognition may or may not, should or should not, be deployed. The recent experience and scrutiny of Facebook demonstrates the degree to which these new issues may undermine company value as the detrimental impacts on society become clear. While Rekognition may be intended to enhance some law enforcement activities, we are deeply concerned it may ultimately violate civil and human rights.

... [W]e are also concerned sales may be expanded to foreign governments, including authoritarian regimes. Without protective policies in place, it seems inevitable the application of these technologies will result in Amazon's Rekognition being used to identify and detain democracy advocates. We believe our Company needs to immediately halt the expansion, further development, and marketing of Rekognition, and any other surveillance technologies, to all governments and government agencies, until there is a clear demonstration our Board of Directors has undertaken appropriate fiduciary oversight and placed appropriate guidelines and policies in place to safeguard the rights of our customers, shareholders, other stakeholders and citizens.

Read the full post here

Company non-response
12 June 2018

Amazon non-response

Amazon was urged to disclose public bias testing data for its facial recognition software. We invited Amazon to respond; the company did not respond.

Article
12 June 2018

Amazon urged to disclose public bias testing data for its facial recognition software; co. didn't respond

Author: Russell Brandom, The Verge

"Amazon needs to come clean about racial bias in its algorithms", 23 May 2018

[A]mazon’s quiet Rekognition program became very public, as new documents obtained by the ACLU of Northern California showed the system partnering with the city of Orlando and police camera vendors like Motorola Solutions for an aggressive new real-time facial recognition service. Amazon insists that the service is a simple object-recognition tool and will only be used for legal purposes. But even if we take the company at its word, the project raises serious concerns, particularly around racial bias... Facial recognition systems have long struggled with higher error rates for women and people of color — error rates that can translate directly into more stops and arrests for marginalized groups. And while some companies have responded with public bias testing, Amazon hasn’t shared any data on the issue, if it’s collected data at all... ACLU-NC’s Matt Cagle [said]: “Face recognition is a biased technology. It doesn’t make communities safer. It just powers even greater discriminatory surveillance and policing.”... In the most basic terms... facial recognition systems pose an added threat of wrongful accusation and arrest for non-white people... I asked Amazon directly if the company has any data on bias testing for Rekognition, but so far, nothing has turned up...
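
The "public bias testing" mentioned above generally means disaggregated evaluation: measuring error rates separately for each demographic group instead of reporting a single aggregate number. A minimal sketch of that idea follows, using entirely hypothetical records rather than any real benchmark data.

# Minimal sketch of disaggregated bias testing for a face matching system.
# Records are hypothetical: (demographic_group, predicted_match, actual_match).
from collections import defaultdict

records = [
    ("group_a", False, False),
    ("group_a", True, False),   # false match
    ("group_a", False, False),
    ("group_b", True, False),   # false match
    ("group_b", True, False),   # false match
    ("group_b", False, False),
]

counts = defaultdict(lambda: {"false_matches": 0, "non_matches": 0})
for group, predicted, actual in records:
    if not actual:                      # only true non-matches can produce false matches
        counts[group]["non_matches"] += 1
        if predicted:
            counts[group]["false_matches"] += 1

# Report the false match rate per group; large gaps indicate demographic bias.
for group, c in sorted(counts.items()):
    rate = c["false_matches"] / c["non_matches"]
    print(f"{group}: false match rate {rate:.0%} ({c['false_matches']}/{c['non_matches']})")

A published audit would report figures like these on a standard benchmark, which is the kind of data the article notes Amazon had not shared.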

Read the full post here