

Article

18 March 2021

Author:
Kashmir Hill, New York Times

Clearview AI scraped the internet to build a facial-recognition tool & blew open the future of privacy

"Your Face Is Not Your Own", 18 March 2021

... Few outside law enforcement knew of Clearview’s existence back then... The government often avoids tipping off would-be criminals to cutting-edge investigative techniques, and Clearview’s founders worried about the reaction to their product. Helping to catch sex abusers was clearly a worthy cause, but the company’s method of doing so — hoovering up the personal photos of millions of Americans — was unprecedented and shocking. Indeed, when the public found out about Clearview last year... an immense backlash ensued.

Facebook, LinkedIn, Venmo and Google sent cease-and-desist letters to the company, accusing it of violating their terms of service and demanding, to no avail, that it stop using their photos. BuzzFeed published a leaked list of Clearview users, which included not just law enforcement but major private organizations including Bank of America and the N.B.A. (Each says it only tested the technology and was never a client.)

... A.I. software can analyze countless photos of people’s faces and learn to make impressive predictions about which images are of the same person; the more faces it inspects, the better it gets. Clearview is deploying this approach using billions of photos from the public internet. By testing legal and ethical limits around the collection and use of those images, it has become the front-runner in the field.

... The legal threats to Clearview have begun to move through the courts, and Clearview is preparing a powerful response, invoking the First Amendment. Many civil-liberties advocates fear the company will prevail, and they are aghast at the potential consequences. One major concern is that facial-recognition technology might be too flawed for law enforcement to rely on. A federal agency called the National Institute of Standards and Technology (NIST) periodically tests the accuracy of facial-recognition algorithms voluntarily submitted by vendors; Clearview hasn’t participated. In 2019, the agency found that many algorithms were less accurate in identifying people of color, meaning their use could worsen systemic bias in the criminal-justice system. In the last year, three cases have been unearthed (none involving Clearview) in which police officers arrested and briefly jailed the wrong person based on a bad facial-recognition match. All three of the wrongfully arrested were Black men.

There’s also a broader reason that critics fear a court decision favoring Clearview: It could let companies track us as pervasively in the real world as they already do online.
