Shareholders & civil society groups urge Amazon to halt sale of facial recognition software to law enforcement agencies

28 January 2019

Commentary: Amazon should halt use of facial recognition technology for policing & govt. surveillance

Author: Joy Buolamwini, Medium

In this article... I... address... criticisms... made by those with interest in keeping the use, abuse, and technical immaturity of AI systems in the dark... AI services the company provides to law enforcement and other customers can be abused regardless of accuracy... Among the most concerning uses of facial analysis technology are the bolstering of mass surveillance, the weaponization of AI, and harmful discrimination in law enforcement contexts... Because this powerful technology is being rapidly developed and adopted without oversight, the Algorithmic Justice League and the Center on Privacy & Technology launched the Safe Face Pledge. The pledge prohibits lethal use of any kind of facial analysis technology including facial recognition and aims to mitigate abuses.

As an expert on bias in facial analysis technology, I advise Amazon to

1) immediately halt the use of facial recognition and any other kinds of facial analysis technology in high-stakes contexts like policing and government surveillance

2) submit company models currently in use by customers to the National Institute of Standards and Technology benchmark

Read the full post here

28 January 2019

Commentary: Thoughts on recent research paper and associated article on Amazon Rekognition

Author: Dr. Matt Wood, AWS Machine Learning Blog

A research paper and associated article published yesterday made claims about the accuracy of Amazon Rekognition... this research paper and article are misleading and draw false conclusions... The research paper seeks to “expose performance vulnerabilities in commercial facial recognition products,” but uses facial analysis as a proxy... [F]acial analysis and facial recognition are two separate tools; it is not possible to use facial analysis to match faces in the same way as you would in facial recognition... The research paper states that Amazon Rekognition provides low quality facial analysis results. This does not reflect our own extensive testing... The research paper implies that Amazon Rekognition is not improving, and that AWS is not interested in discussing issues around facial recognition. This is false. We are now on our fourth significant version update of Amazon Rekognition.

... We know that facial recognition technology, when used irresponsibly, has risks... It’s also why we clearly recommend in our documentation that facial recognition results should only be used in law enforcement when the results have confidence levels of at least 99%, and even then, only as one artifact of many in a human-driven decision. But, we remain optimistic about the good this technology will provide in society.
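The distinction Wood draws between the two tools can be sketched in code. This is an illustrative sketch, not Amazon's code: the helper names are hypothetical, and the response dict layouts are assumed to follow the public Rekognition DetectFaces and CompareFaces API documentation.

```python
# Hypothetical helpers (not part of any AWS SDK) illustrating why
# facial analysis and facial recognition answer different questions.
# Response layouts assumed from the public Rekognition API docs.

def summarize_analysis(detect_faces_response):
    """Facial *analysis* (DetectFaces): returns per-face attributes,
    e.g. a predicted gender label -- it says nothing about whose
    face it is."""
    return [face["Gender"]["Value"]
            for face in detect_faces_response["FaceDetails"]]

def summarize_recognition(compare_faces_response, threshold=99.0):
    """Facial *recognition* (CompareFaces): returns similarity scores
    between a source face and candidate faces -- an identity question,
    which is why a 99% threshold is recommended here."""
    return [match["Similarity"]
            for match in compare_faces_response["FaceMatches"]
            if match["Similarity"] >= threshold]
```

One operation describes attributes of a face in an image; the other asserts that two faces may belong to the same person, which is the claim with law-enforcement consequences.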

Read the full post here

3 December 2018

Commentary: Facial recognition can be positive but companies need to ensure human rights are upheld

Author: Amira Dhalla, Medium

"Amazon's facial recognition technology scares me. You should be scared too."

When used appropriately, the broader society can... benefit from... positive uses of facial recognition... [F]acial recognition can also be harmful as it infringes on individuals’ human rights by gathering and using their data without consent... Government can also use these tools to surveil people... [which] poses a threat to our most vulnerable communities and unjustly targets particular individuals such as people of color... [T]here are currently no rules to govern how this tool is to be used by private or public sectors.

... Amazon has focused on the positive impact of Rekognition but what makes it different from other technology companies is the fact that since the product was conceived, they have been marketing and selling it to law enforcement as a way to track and identify criminal suspects... ACLU formed a coalition of civil rights groups calling on Amazon to stop selling the program to law enforcement... The future of facial recognition can be positive but we need the organizations creating these tools to lead the discussion on how we can create these tools while upholding the rights of all citizens... Amazon has a responsibility to do what it can to ensure that innovative tools are used in ethical ways... We need to pressure Amazon to protect the rights to privacy and freedom of citizens in the United States by suspending the sales of Rekognition to law enforcement and instead starting a dialogue on how we use tools like facial recognition in humane and secure ways.

Read the full post here

27 July 2018

Amazon recommends 99% or higher confidence match when using facial recognition for law enforcement

Author: Dr. Matt Wood, Amazon blog

"Thoughts on machine learning accuracy," 27 Jul 2018

This blog shares some brief thoughts on machine learning accuracy and bias...  Using Rekognition, the ACLU built a face database using 25,000 publicly available arrest photos and then performed facial similarity searches on that database using public photos of all current members of Congress. They found 28 incorrect matches out of 535... Some thoughts on their claims:

  • The default confidence threshold for facial recognition APIs in Rekognition is 80%, which is good for a broad set of general use cases... but it’s not the right setting for public safety use cases... We recommend 99% for use cases where highly accurate face similarity matches are important...
  • In real-world public safety and law enforcement scenarios, Amazon Rekognition is almost exclusively used to help narrow the field and allow humans to expeditiously review and consider options using their judgment..., where it can help find lost children, fight against human trafficking, or prevent crimes.

There’s a difference between using machine learning to identify a food object and using machine learning to determine whether a face match should warrant considering any law enforcement action. The latter is serious business and requires much higher confidence levels. We continue to recommend that customers do not use less than 99% confidence levels for law enforcement matches, and then to only use the matches as one input across others that make sense for each agency.
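Amazon's 99% guidance can be applied on both sides of a query. The following is a minimal sketch, not Amazon's code: `client` is assumed to be a boto3 `rekognition` client, the `FaceMatchThreshold` parameter name follows the public SearchFacesByImage API documentation, and the helper names are hypothetical.

```python
# Illustrative only: enforcing the 99% recommendation when searching a
# face collection. `client` is assumed to be a boto3 "rekognition"
# client; no AWS call is made until these helpers are invoked.

def search_with_threshold(client, collection_id, image_bytes, threshold=99.0):
    # FaceMatchThreshold raises the service-side cutoff from its
    # default of 80 to the level recommended for public safety use.
    return client.search_faces_by_image(
        CollectionId=collection_id,
        Image={"Bytes": image_bytes},
        FaceMatchThreshold=threshold,
    )

def high_confidence_matches(response, threshold=99.0):
    # Defensive client-side filter: keep only matches whose reported
    # Similarity meets the 99% bar, regardless of the request setting.
    return [m for m in response.get("FaceMatches", [])
            if m["Similarity"] >= threshold]
```

Even matches that clear the 99% bar are, per the post, meant to be one input among many in a human-driven decision, not an automatic trigger for action.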

Read the full post here

12 July 2018

Amazon workers oppose sale of the company's technology to US immigration

Author: Anthony Cuthbertson, The Independent
[Amazon workers] have written to the company's CEO, Jeff Bezos, to protest the sale of facial recognition tools and other technology to police departments and government agencies. [They] cite the use of Amazon technology by the US Department of Homeland Security and the Immigration and Customs Enforcement (ICE) agency, which have been criticised for enforcing Donald Trump's "zero tolerance" policy, which has seen parents separated from their children at the US border... The letter comes days after 19 Amazon shareholders wrote to Mr Bezos urging him to halt the sale of facial recognition to police and government agencies... The American Civil Liberties Union revealed (...) how Amazon is marketing its powerful AWS Rekognition tool to law enforcement agencies, a practice the workers claim is making the firm complicit in alleged human rights abuses... In their letter to Mr Bezos, the Amazon workers said they would "refuse to build the platform" which powers ICE, or any technology used to violate human rights... Amazon refused to provide a comment on the record about the letter...

Read the full post here

9 July 2018

Commentary: When the robot doesn't see dark skin

Author: Joy Buolamwini, The New York Times

When I was a college student using A.I.-powered facial detection software for a coding project, the robot I programmed couldn’t detect my dark-skinned face. I had to borrow my white roommate’s face to finish the assignment... My experience is a reminder that artificial intelligence, often heralded for its potential to change the world, can actually reinforce bias and exclusion... A.I. systems are shaped by the priorities and prejudices — conscious and unconscious — of the people who design them, a phenomenon that I refer to as “the coded gaze.” Research has shown that automated systems that are used to inform decisions about sentencing produce results that are biased against black people and that those used for selecting the targets of online advertising can discriminate based on race and gender.

... Canada has a federal statute governing the use of biometric data in the private sector. Companies like Facebook and Amazon must obtain informed consent to collect citizens’ unique face information. In the European Union, Article 9 of the General Data Protection Regulation requires express affirmative consent for collection of biometrics from E.U. citizens. Everyday people should support lawmakers, activists and public-interest technologists in demanding transparency, equity and accountability in the use of artificial intelligence that governs our lives.

Read the full post here

22 June 2018

Amazon shareholders ask company to stop selling facial recognition technology to governments following NGO warnings

Author: Jeremy White, The Independent

"Amazon shareholders demand company stop selling facial recognition technology to governments", 19 June 2018

A group of Amazon shareholders is asking CEO Jeff Bezos to stop selling and marketing pattern-recognition technology to governments after civil liberties groups warned of the potential for abuse. Earlier this year, a group of advocacy organisations led by the American Civil Liberties Union (ACLU) published a report detailing how Amazon was marketing its Rekognition tool to American law enforcement agencies... A letter signed by 19 shareholders - and provided to The Independent by the ACLU - urges Mr Bezos to halt the tool's expansion until those concerns can be addressed... "While Rekognition may be intended to enhance some law enforcement activities, we are deeply concerned it may ultimately violate civil and human rights", the letter said. "We are concerned the technology would be used to unfairly and disproportionately target and surveil people of colour, immigrants, and civil society organisations"... Shareholders of major technology companies have increasingly been broadcasting their concerns about the financial and reputational damage of privacy violations.

Read the full post here

20 June 2018

Amazon general manager of artificial intelligence highlights positive uses of Rekognition & acceptable use policy

Author: Dr. Matt Wood, Amazon Web Services Machine Learning Blog

"Some quick thoughts on the public discussion regarding facial recognition and Amazon Rekognition this past week," 1 June 2018

Amazon Rekognition... makes use of new technologies – such as deep learning – and puts them in the hands of developers in an easy-to-use, low-cost way... [W]e have seen customers use the image and video analysis capabilities of Amazon Rekognition in ways that materially benefit both society (e.g. preventing human trafficking, inhibiting child exploitation, reuniting missing children with their families...), and organizations (enhancing security through multi-factor authentication)... Amazon Web Services (AWS) is not the only provider of services like these, and we remain excited about how image and video analysis can be a driver for good in the world, including in the public sector and law enforcement. There has been no reported law enforcement abuse of Amazon Rekognition. We also have an Acceptable Use Policy (“AUP”) that prohibits the use of our services for “[a]ny activities that are illegal, that violate the rights of others, or that may be harmful to others.” This includes violating anybody’s Constitutional rights relating to the 4th, 5th, and 14th Amendments – essentially any kind of illegal discrimination or violation of due process or privacy right. Customers in violation of our AUP are prevented from using our services.

...There have always been and will always be risks with new technology capabilities... AWS takes its responsibilities seriously. But we believe it is the wrong approach to impose a ban on promising new technologies because they might be used by bad actors for nefarious purposes in the future.  

Read the full post here

19 June 2018

Amazon urged not to sell facial recognition software to police

Author: Jamie Condliffe, The New York Times

A group of 19 socially responsible investors, including firms like Sustainvest Asset Management and the Social Equity Group, are applying pressure to Amazon over privacy concerns that they have about the technology... Amazon began marketing a facial recognition system, called Rekognition, to law enforcement agencies as a means of identifying suspects shortly after the tool was introduced in 2016... [R]ecently, Amazon came under criticism from the American Civil Liberties Union and a group of more than two dozen civil rights organizations for selling the technology to police authorities. The A.C.L.U.’s argument: The police could use such systems not just to track people committing crimes but also to identify citizens who are innocent, such as protesters... In a letter addressed to the company’s chief executive... a group of investors explained why they want a halt to Rekognition sales to the police... Amazon had no immediate comment on the letter. In a blog post published shortly after the initial call by the A.C.L.U. to ban the sale of Rekognition to the police, Matt Wood, general manager of artificial intelligence at Amazon Web Services, wrote: "We believe it is the wrong approach to impose a ban on promising new technologies because they might be used by bad actors for nefarious purposes in the future."

Read the full post here

18 June 2018

Letter from Nationwide Coalition to Amazon CEO Jeff Bezos regarding Rekognition

Author: American Civil Liberties Union (ACLU), American-Arab Anti-Discrimination Committee, Human Rights Watch, Witness & 63 other civil society groups

The undersigned coalition of organizations are dedicated to protecting civil rights and liberties and safeguarding communities. We write today to express our profound concerns about your company's facial recognition system, Rekognition. We demand that Amazon stop powering a government surveillance infrastructure that poses a grave threat to customers and communities across the country... As advertised, Rekognition is a powerful surveillance system readily available to violate rights and target communities of color... Amazon also encourages the use of Rekognition to monitor “people of interest,” raising the possibility that those labeled suspicious by governments—such as undocumented immigrants or Black activists—will be targeted for Rekognition surveillance.

... Amazon Rekognition is primed for abuse in the hands of governments. This product poses a grave threat to communities, including people of color and immigrants, and to the trust and respect Amazon has worked to build. Amazon must act swiftly to stand up for civil rights and civil liberties, including those of its own customers, and take Rekognition off the table for governments.

Read the full post here