

Article

15 November 2019

Author:
Julia Carrie Wong, The Guardian

Randstad allegedly instructs workers to target people of color & use deceptive tactics for Google facial recognition study

"Google reportedly targeted people with 'dark skin' to improve facial recognition", 03 October, 2019

[S]ubcontracted workers were employed by staffing firm Randstad but directed by Google managers... to target people with “darker skin tones” and those who would be more likely to be enticed by the $5 gift card, including homeless people and college students... “They said to target homeless people because they’re the least likely to say anything to the media,” [said] a former contractor... Randstad did not immediately respond to a request for comment... [contractors] described using deceptive tactics to persuade subjects to agree to the face scans, including mischaracterizing the face scan as a “selfie game” or “survey”, pressuring people to sign a consent form without reading it, and concealing... that the phone the research subjects were handed to “play with” was taking video of their faces... “We’re taking these claims seriously and investigating them,” a Google spokesperson said in a statement. “The allegations regarding truthfulness and consent are in violation of our requirements for volunteer research studies and the training that we provided"... “This is totally unacceptable conduct from a Google contractor. It’s why the way AI is built today needs to change,” said Jake Snow, an attorney with the ACLU of Northern California. “The answer to algorithmic bias is not to target the most vulnerable.”
