

Article

15 Nov 2019

Author:
Julia Carrie Wong, The Guardian

Randstad allegedly instructs workers to target people of color & use deceptive tactics for Google facial recognition study

"Google reportedly targeted people with 'dark skin' to improve facial recognition", 3 October 2019

[S]ubcontracted workers were employed by staffing firm Randstad but directed by Google managers... to target people with “darker skin tones” and those who would be more likely to be enticed by the $5 gift card, including homeless people and college students... “They said to target homeless people because they’re the least likely to say anything to the media,” [said] a former contractor... Randstad did not immediately respond to a request for comment... [contractors] described using deceptive tactics to persuade subjects to agree to the face scans, including mischaracterizing the face scan as a “selfie game” or “survey”, pressuring people to sign a consent form without reading it, and concealing... that the phone the research subjects were handed to “play with” was taking video of their faces... “We’re taking these claims seriously and investigating them,” a Google spokesperson said in a statement. “The allegations regarding truthfulness and consent are in violation of our requirements for volunteer research studies and the training that we provided"... “This is totally unacceptable conduct from a Google contractor. It’s why the way AI is built today needs to change,” said Jake Snow, an attorney with the ACLU of Northern California. “The answer to algorithmic bias is not to target the most vulnerable.”
