Article

1 May 2018

Author:
Bettina Büchel, IMD Business School, World Economic Forum

Data biases used to train AI can reinforce gender inequality

"AI could reinforce gender inequality", 8 Mar 2018

Women are not only under-represented in many spheres of economic life; technology could make this even worse...[T]he main reason is social bias...

...This is on the verge of being further reinforced by artificial intelligence, as the data currently used to train machines are often biased. With the rapid deployment of AI, these biased data will influence the predictions that machines make. Whenever you have a dataset of human decisions, it naturally includes bias...[T]his will be influenced by cultural, gender or racial biases...
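As a rough illustration of the mechanism described above, the sketch below trains a simple classifier on a synthetic "hiring" dataset in which past decisions penalised equally skilled women. Everything here is invented for illustration: the dataset, the variable names and the choice of model are assumptions, not drawn from any study cited in the article.

```python
# Hypothetical sketch: bias in historical decisions carries over
# into a model trained on them. All data below are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Gender (0 = male, 1 = female) and a skill score whose distribution
# is identical for both groups.
gender = rng.integers(0, 2, size=n)
skill = rng.normal(0.0, 1.0, size=n)

# Historical hiring labels: equally skilled women were hired less
# often, so the labels themselves encode a social bias.
logits = 1.5 * skill - 1.0 * gender
hired = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

X = np.column_stack([gender, skill])
model = LogisticRegression().fit(X, hired)

# Two candidates with the same skill (0.0) now get different
# predicted hiring probabilities depending only on gender.
print(model.predict_proba([[0, 0.0]])[0, 1])  # male candidate
print(model.predict_proba([[1, 0.0]])[0, 1])  # female candidate
```

The point of the sketch is that the model never needed to be written with discriminatory intent; it simply learned the pattern present in the decisions it was trained on.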

...The careers platform LinkedIn...had an issue where highly paid jobs were not displayed as frequently for searches by women as they were for men, because of the way its algorithms were written...One study found a similar issue with Google...Another study shows how images used to train image-recognition software can amplify gender biases...

...Training machines on data remains unproblematic only so long as it does not lead to discriminatory predictions...One way to test for biases is by stress-testing the system...[also refers to Facebook & Microsoft]
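The article does not spell out what such a stress test looks like. One plausible reading, sketched below, is a counterfactual check: flip the gender feature for the same candidates and count how often the model's decision changes. This reuses the hypothetical `model` and `X` from the previous sketch; it is an assumed technique for illustration, not an actual test used by LinkedIn, Google, Facebook or Microsoft.

```python
# Hedged sketch of a bias stress test: swap only the gender feature
# and measure how often the predicted decision flips. `model` and `X`
# come from the hypothetical example above.
import numpy as np

X_flipped = X.copy()
X_flipped[:, 0] = 1 - X_flipped[:, 0]  # swap gender, keep skill fixed

original = model.predict(X)
counterfactual = model.predict(X_flipped)

# Share of candidates whose outcome changes when only gender changes;
# for a model free of gender bias this would be near zero.
print("decisions flipped:", np.mean(original != counterfactual))
```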