Article

15 Nov 2019

Author:
Julia Carrie Wong, The Guardian

Randstad allegedly instructs workers to target people of color & use deceptive tactics for Google facial recognition study

"Google reportedly targeted people with 'dark skin' to improve facial recognition", 03 October, 2019

[S]ubcontracted workers were employed by staffing firm Randstad but directed by Google managers... to target people with “darker skin tones” and those who would be more likely to be enticed by the $5 gift card, including homeless people and college students... “They said to target homeless people because they’re the least likely to say anything to the media,” [said] a former contractor...

Randstad did not immediately respond to a request for comment... [contractors] described using deceptive tactics to persuade subjects to agree to the face scans, including mischaracterizing the face scan as a “selfie game” or “survey”, pressuring people to sign a consent form without reading it, and concealing... that the phone the research subjects were handed to “play with” was taking video of their faces...

“We’re taking these claims seriously and investigating them,” a Google spokesperson said in a statement. “The allegations regarding truthfulness and consent are in violation of our requirements for volunteer research studies and the training that we provided”...

“This is totally unacceptable conduct from a Google contractor. It’s why the way AI is built today needs to change,” said Jake Snow, an attorney with the ACLU of Northern California. “The answer to algorithmic bias is not to target the most vulnerable.”