

Article

China: Huawei reportedly tested facial recognition software that could identify minorities

"Huawei tested AI software that could recognize Uighur minorities and alert police, report says", 8 December 2020

The Chinese tech giant Huawei has tested facial recognition software that could send automated “Uighur alarms” to government authorities when its camera systems identify members of the oppressed minority group, according to an internal document that provides further details about China’s artificial-intelligence surveillance regime.

A document signed by Huawei representatives — discovered by the research organization IPVM and shared exclusively with The Washington Post — shows that the telecommunications firm worked in 2018 with the facial recognition start-up Megvii to test an artificial-intelligence camera system that could scan faces in a crowd and estimate each person’s age, sex and ethnicity.

If the system detected the face of a member of the mostly Muslim minority group, the test report said, it could trigger a “Uighur alarm” — potentially flagging them for police in China, where members of the group have been detained en masse as part of a brutal government crackdown. The document, which was found on Huawei’s website, was removed shortly after The Post and IPVM asked the companies for comment. [...]

Both companies have acknowledged the document is real. Shortly after this story published Tuesday morning, Huawei spokesman Glenn Schloss said the report “is simply a test and it has not seen real-world application. Huawei only supplies general-purpose products for this kind of testing. We do not provide custom algorithms or applications.”

Also after publication, a Megvii spokesman said the company’s systems are not designed to target or label ethnic groups. [...]

Part of the following stories

China: 83 major brands implicated in report on forced labour of ethnic minorities from Xinjiang assigned to factories across provinces; Includes company responses

China: Mounting concerns over forced labour in Xinjiang

China: Huawei's facial recognition software capable of identifying minorities stirs controversy