Article

10 Oct 2023

Author:
Victoria Turk, Rest of World

Journalists reveal major pitfalls of generative AI's stereotyped defaults

"How AI reduces the world to stereotypes", 10 October 2023

...BuzzFeed posted a list of 195 images of Barbie dolls produced using Midjourney, the popular artificial intelligence image generator. Each doll was supposed to represent a different country: Afghanistan Barbie, Albania Barbie, Algeria Barbie, and so on. The depictions were clearly flawed: Several of the Asian Barbies were light-skinned; Thailand Barbie, Singapore Barbie, and the Philippines Barbie all had blonde hair. Lebanon Barbie posed standing on rubble; Germany Barbie wore military-style clothing. South Sudan Barbie carried a gun.

The article — to which BuzzFeed added a disclaimer before taking it down entirely — offered an unintentionally striking example of the biases and stereotypes that proliferate in images produced by the recent wave of generative AI text-to-image systems, such as Midjourney, Dall-E, and Stable Diffusion....

...Using Midjourney, we chose five prompts, based on the generic concepts of “a person,” “a woman,” “a house,” “a street,” and “a plate of food.” We then adapted them for different countries: China, India, Indonesia, Mexico, and Nigeria. We also included the U.S. in the survey for comparison, given that Midjourney (like most of the biggest generative AI companies) is based there.

For each prompt and country combination (e.g., “an Indian person,” “a house in Mexico,” “a plate of Nigerian food”), we generated 100 images, resulting in a data set of 3,000 images.
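The prompt grid described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' actual code: the demonym forms, template wording, and the simple article heuristic are all assumptions made for the example.

```python
def with_article(phrase: str) -> str:
    """Prefix 'a' or 'an' using a simple first-letter heuristic."""
    return ("an " if phrase[0].lower() in "aeiou" else "a ") + phrase

# Country names and demonyms (assumed forms, for illustration only).
countries = {
    "China": "Chinese",
    "India": "Indian",
    "Indonesia": "Indonesian",
    "Mexico": "Mexican",
    "Nigeria": "Nigerian",
    "the U.S.": "American",
}

# The five generic concepts, adapted per country as the article describes:
# demonym-based for person, woman, and food; place-based for house and street.
def prompts_for(country: str, demonym: str) -> list[str]:
    return [
        with_article(f"{demonym} person"),
        with_article(f"{demonym} woman"),
        with_article(f"house in {country}"),
        with_article(f"street in {country}"),
        with_article(f"plate of {demonym} food"),
    ]

IMAGES_PER_PROMPT = 100

all_prompts = [p for c, d in countries.items() for p in prompts_for(c, d)]
print(len(all_prompts))                      # 6 countries x 5 concepts = 30
print(len(all_prompts) * IMAGES_PER_PROMPT)  # 3,000 images in the data set
```

Each of the 30 prompts (e.g., "an Indian person", "a house in Mexico") would then be sent to the image generator 100 times to build the 3,000-image data set.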

“An Indian person” is almost always an old man with a beard. “A Mexican person” is usually a man in a sombrero. Most of New Delhi’s streets are polluted and littered. In Indonesia, food is served almost exclusively on banana leaves...

“Essentially what this is doing is flattening descriptions of, say, ‘an Indian person’ or ‘a Nigerian house’ into particular stereotypes which could be viewed in a negative light,” Amba Kak, executive director of the AI Now Institute, a U.S.-based policy research organization, told Rest of World. Even stereotypes that are not inherently negative, she said, are still stereotypes: They reflect a particular value judgment, and a winnowing of diversity. Midjourney did not respond to multiple requests for an interview or comment for this story...

...Researchers told Rest of World this could cause real harm. Image generators are being used for diverse applications, including in the advertising and creative industries, and even in tools designed to make forensic sketches of crime suspects...
