Article

7 Apr 2020

Author:
Nicolas Kayser-Bril, AlgorithmWatch

AlgorithmWatch identifies racial bias in Google Cloud Vision algorithm; Google apologises

"Google apologizes after its Vision AI produced racist results", 7 April 2020 

[Google Vision AI] labeled an image of a dark-skinned individual holding a thermometer as “gun”, while a similar image with a light-skinned individual was labeled “electronic device”... Google has since updated its algorithm... In a statement to AlgorithmWatch, Tracy Frey, director of Product Strategy and Operations at Google, wrote that “this result [was] unacceptable. The connection with this outcome and racism is important to recognize, and we are deeply sorry for any harm this may have caused”... “Our investigation found some objects were mis-labeled as firearms and these results existed across a range of skin tones. We have adjusted the confidence scores to more accurately return labels when a firearm is in a photograph.” Ms Frey added that Google had found “no evidence of systemic bias related to skin tone.” [also refers to Facebook]
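
For context, label-detection requests of the kind AlgorithmWatch ran can be reproduced against the public API. Below is a minimal sketch, assuming the google-cloud-vision Python client and application-default credentials; the image path is a hypothetical placeholder. It prints each returned label alongside its confidence score, the value Google says it adjusted for firearm-related labels.

```python
# Minimal sketch: request label annotations for a local image and print
# each label with its confidence score. Assumes the google-cloud-vision
# client library (pip install google-cloud-vision) and configured
# application-default credentials.

from google.cloud import vision


def detect_labels(path: str) -> None:
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)
    if response.error.message:
        raise RuntimeError(response.error.message)

    for label in response.label_annotations:
        # label.score is the model's confidence in the label, in [0, 1]
        print(f"{label.description}: {label.score:.2f}")


if __name__ == "__main__":
    detect_labels("hand_with_thermometer.jpg")  # hypothetical example image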