Article

31 Aug 2023

Author:
Mark Wilding, The Verge

IBM returns to facial recognition market while advocates claim incompatibility with the company's prior human rights commitments

"IBM promised to back off facial recognition — then it signed a $69.8 million contract to provide it", 31 August 2023

IBM has returned to the facial recognition market — just three years after announcing it was abandoning work on the technology due to concerns about racial profiling, mass surveillance, and other human rights violations.

In June 2020, as Black Lives Matter protests swept the US after George Floyd’s murder, IBM chief executive Arvind Krishna wrote a letter to Congress announcing that the company would no longer offer “general purpose” facial recognition technology. “The fight against racism is as urgent as ever,” he wrote. “IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values and Principles of Trust and Transparency.” Later that year, the company redoubled its commitment, calling for US export controls to address concerns that facial recognition could be used overseas “to suppress dissent, to infringe on the rights of minorities, or to erase basic expectations of privacy.”

Despite these announcements, last month, IBM signed a $69.8 million (£54.7 million) contract with the British government to develop a national biometrics platform that will offer a facial recognition function to immigration and law enforcement officials...

The Home Office Biometrics Matcher Platform includes “strategic” matching of photos in a database

The platform will allow photos of individuals to be matched against images stored on a database — what is sometimes known as a “one-to-many” matching system. In September 2020, IBM described such “one-to-many” matching systems as “the type of facial recognition technology most likely to be used for mass surveillance, racial profiling, or other violations of human rights.”

IBM spokesman Imtiaz Mufti denied that its work on the contract was in conflict with its 2020 commitments.

“The Home Office Biometrics Matcher Platform and associated Services contract is not used in mass surveillance. It supports police and immigration services in identifying suspects against a database of fingerprint and photo data....”

Human rights campaigners, however, said IBM’s work on the project is incompatible with its 2020 commitments. Kojo Kyerewaa of Black Lives Matter UK said: “IBM has shown itself willing to step over the body and memory of George Floyd to chase a Home Office contract. This won’t be forgotten.”

Matt Mahmoudi, PhD, tech researcher at Amnesty International, said: “The research across the globe is clear; there is no application of one-to-many facial recognition that is compatible with human rights law, and companies — including IBM — must therefore cease its sale, and honor their earlier statements to sunset these tools, even and especially in the context of law and immigration enforcement where the rights implications are compounding.”

Police use of facial recognition has been linked to wrongful arrests in the US and has been challenged in the UK courts.

Other tech firms have imposed partial bans on the use of their facial recognition services for law enforcement.

Amazon initially announced a one-year moratorium on police use of its Rekognition software in June 2020 and said it would be extending the ban “indefinitely” the following year. A spokeswoman for the company confirmed that the moratorium, which prohibits “use of Amazon Rekognition’s face comparison feature by police departments in connection with criminal investigations,” is still in place.

Microsoft said in June 2020 that it would not sell facial recognition software to US police departments until a national law is introduced governing use of the technology. When contacted by The Verge and Liberty Investigates, a spokeswoman for Microsoft referred to the company’s website, which states that use of the Azure AI Face service “by or for state or local police in the US is prohibited by Microsoft policy.” 

The UK Home Office did not respond to a request for comment.