abusesaffiliationarrow-downarrow-leftarrow-rightarrow-upattack-typeblueskyburgerchevron-downchevron-leftchevron-rightchevron-upClock iconclosedeletedevelopment-povertydiscriminationdollardownloademailenvironmentexternal-linkfacebookfilterflaggenderglobeglobegroupshealthC4067174-3DD9-4B9E-AD64-284FDAAE6338@1xinformation-outlineinformationinstagraminvestment-trade-globalisationissueslabourlanguagesShapeCombined Shapeline, chart, up, arrow, graphLinkedInlocationmap-pinminusnewsorganisationotheroverviewpluspreviewArtboard 185profilerefreshIconnewssearchsecurityPathStock downStock steadyStock uptagticktooltiptwitteruniversalitywebwhatsappxIcons / Social / YouTube

Cette page n’est pas disponible en Français et est affichée en English

Article

10 oct 2023

Auteur:
Victoria Turk, Rest of World

Journalists reveal major pitfalls of generative AI's stereotyped defaults

"How AI reduces the world to stereotypes" 10 October 2023

...BuzzFeed posted a list of 195 images of Barbie dolls produced using Midjourney, the popular artificial intelligence image generator. Each doll was supposed to represent a different country: Afghanistan Barbie, Albania Barbie, Algeria Barbie, and so on. The depictions were clearly flawed: Several of the Asian Barbies were light-skinned; Thailand Barbie, Singapore Barbie, and the Philippines Barbie all had blonde hair. Lebanon Barbie posed standing on rubble; Germany Barbie wore military-style clothing. South Sudan Barbie carried a gun.

The article — to which BuzzFeed added a disclaimer before taking it down entirely — offered an unintentionally striking example of the biases and stereotypes that proliferate in images produced by the recent wave of generative AI text-to-image systems, such as Midjourney, Dall-E, and Stable Diffusion....

...Using Midjourney, we chose five prompts, based on the generic concepts of “a person,” “a woman,” “a house,” “a street,” and “a plate of food.” We then adapted them for different countries: China, India, Indonesia, Mexico, and Nigeria. We also included the U.S. in the survey for comparison, given Midjourney (like most of the biggest generative AI companies) is based in the country. 

For each prompt and country combination (e.g., “an Indian person,” “a house in Mexico,” “a plate of Nigerian food”), we generated 100 images, resulting in a data set of 3,000 images.

“An Indian person” is almost always an old man with a beard. “A Mexican person” is usually a man in a sombrero. Most of New Delhi’s streets are polluted and littered. In Indonesia, food is served almost exclusively on banana leaves...

“Essentially what this is doing is flattening descriptions of, say, ‘an Indian person’ or ‘a Nigerian house’ into particular stereotypes which could be viewed in a negative light,” Amba Kak, executive director of the AI Now Institute, a U.S.-based policy research organization, told Rest of World. Even stereotypes that are not inherently negative, she said, are still stereotypes: They reflect a particular value judgment, and a winnowing of diversity. Midjourney did not respond to multiple requests for an interview or comment for this story...

...Researchers told Rest of World this could cause real harm. Image generators are being used for diverse applications, including in the advertising and creative industries, and even in tools designed to make forensic sketches of crime suspects...

Informations sur la confidentialité

Ce site utilise des cookies et d'autres technologies de stockage web. Vous pouvez définir vos choix en matière de confidentialité ci-dessous. Les changements prendront effet immédiatement.

Pour plus d'informations sur notre utilisation du stockage web, veuillez vous référer à notre Politique en matière d'utilisation des données et de cookies

Strictly necessary storage

ON
OFF

Necessary storage enables core site functionality. This site cannot function without it, so it can only be disabled by changing settings in your browser.

Cookie analytique

ON
OFF

Lorsque vous accédez à notre site Web, nous utilisons Google Analytics pour collecter des informations sur votre visite. Autoriser ce cookie nous permettra de comprendre en plus de détails sur votre parcours et d'améliorer la façon dont nous diffusons les informations. Toutes les informations analytiques sont anonymes et nous ne les utilisons pas pour vous identifier. Outre la possibilité que vous avez de refuser des cookies, vous pouvez installer le module pour la désactivation de Google Analytics.

Cookies promotionels

ON
OFF

Nous partageons des nouvelles et des mises à jour sur les entreprises et les droits de l'homme via des plateformes tierces, y compris les médias sociaux et les moteurs de recherche. Ces cookies nous aident à comprendre les performances de ces items.

Vos choix en matière de confidentialité pour ce site

Ce site utilise des cookies et d'autres technologies de stockage web pour améliorer votre expérience au-delà des fonctionnalités de base nécessaires.