

Article

11 May 2022

Author:
M. Moon, Engadget

Zoom urged by 27 human rights groups to halt emotion tracking feature

"Human rights organizations ask Zoom to scrap its emotion tracking AI in open letter" 11 May 2022

Digital rights non-profit Fight for the Future and 27 human rights organizations have written an open letter to Zoom, asking the company not to continue exploring the use of AI that can analyze emotions in its video conferencing platform. The groups wrote the letter in response to a Protocol report that said Zoom is actively researching how to incorporate emotion AI into its product in the future. It's part of a larger piece examining how companies have started using artificial intelligence to detect the emotional state of a potential client during sales calls. [...]

Fight for the Future and the other human rights organizations hope their call will pressure Zoom to abandon its plans. They called the technology "discriminatory, manipulative, potentially dangerous and based on assumptions that all people use the same facial expressions, voice patterns, and body language."

The groups also pointed out that the technology is inherently biased and racist, just like facial recognition. By incorporating the feature, Zoom would be discriminating against certain ethnicities and people with disabilities, they said. In addition, it could be used to punish students or workers if they displayed the wrong emotion. In 2021, a project led by University of Cambridge professor Alexa Hagerty showed the limits of emotion recognition AIs and how easy it is to fool them. Previous studies also showed that emotion recognition programs fail the racial bias test and struggle to read Black faces.

The groups ended the letter by mentioning Zoom's decision to cancel the rollout of face-tracking features and calling this another opportunity to do right by its users. They are now asking Zoom to commit by May 20th, 2022 to not implementing emotion AI in its product. [...]
