
Article

15 December 2020

Authors:
Drew Harwell and Eva Dou, Washington Post (USA)

China: Huawei reportedly tested facial recognition software that could identify minorities

"Huawei tested AI software that could recognize Uighur minorities and alert police, report says", 8 December 2020

The Chinese tech giant Huawei has tested facial recognition software that could send automated “Uighur alarms” to government authorities when its camera systems identify members of the oppressed minority group, according to an internal document that provides further details about China’s artificial-intelligence surveillance regime.

A document signed by Huawei representatives — discovered by the research organization IPVM and shared exclusively with The Washington Post — shows that the telecommunications firm worked in 2018 with the facial recognition start-up Megvii to test an artificial-intelligence camera system that could scan faces in a crowd and estimate each person’s age, sex and ethnicity.

If the system detected the face of a member of the mostly Muslim minority group, the test report said, it could trigger a “Uighur alarm” — potentially flagging them for police in China, where members of the group have been detained en masse as part of a brutal government crackdown. The document, which was found on Huawei’s website, was removed shortly after The Post and IPVM asked the companies for comment. [...]

Both companies have acknowledged the document is real. Shortly after this story published Tuesday morning, Huawei spokesman Glenn Schloss said the report “is simply a test and it has not seen real-world application. Huawei only supplies general-purpose products for this kind of testing. We do not provide custom algorithms or applications.”

Also after publication, a Megvii spokesman said the company’s systems are not designed to target or label ethnic groups. [...]

Part of the following timeline

China: 83 major brands implicated in report on forced labour of ethnic minorities from Xinjiang assigned to factories across provinces; Includes company responses


China: Mounting concerns over forced labour in Xinjiang

China: Huawei's facial recognition software capable of identifying minorities stirs controversy
