
Article

18 March 2021

Author:
Kashmir Hill, New York Times

Clearview AI scraped the internet to build a facial-recognition tool & blew open the future of privacy

"Your Face Is Not Your Own", 18 March 2021

... Few outside law enforcement knew of Clearview’s existence back then... The government often avoids tipping off would-be criminals to cutting-edge investigative techniques, and Clearview’s founders worried about the reaction to their product. Helping to catch sex abusers was clearly a worthy cause, but the company’s method of doing so — hoovering up the personal photos of millions of Americans — was unprecedented and shocking. Indeed, when the public found out about Clearview last year... an immense backlash ensued.

Facebook, LinkedIn, Venmo and Google sent cease-and-desist letters to the company, accusing it of violating their terms of service and demanding, to no avail, that it stop using their photos. BuzzFeed published a leaked list of Clearview users, which included not just law enforcement but major private organizations including Bank of America and the N.B.A. (Each says it only tested the technology and was never a client.)

... A.I. software can analyze countless photos of people’s faces and learn to make impressive predictions about which images are of the same person; the more faces it inspects, the better it gets. Clearview is deploying this approach using billions of photos from the public internet. By testing legal and ethical limits around the collection and use of those images, it has become the front-runner in the field.
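To make that approach concrete, the sketch below shows the generic matching step such systems rely on: each face photo is mapped to a numeric embedding by a trained model, and two photos are judged to show the same person when their embeddings are sufficiently similar. This is an illustrative assumption about how embedding-based face matching works in general; the placeholder model, threshold, and data are hypothetical and do not describe Clearview's actual system.

```python
# Illustrative sketch only: generic embedding-based face matching.
# The model stand-in, threshold, and data below are hypothetical.
import numpy as np

EMBEDDING_DIM = 512      # typical size for face-embedding models (assumption)
MATCH_THRESHOLD = 0.6    # hypothetical cosine-similarity cutoff

def embed_face(image: np.ndarray) -> np.ndarray:
    """Stand-in for a trained face-embedding model.

    A real system would run a neural network here; this placeholder
    derives a deterministic unit vector from the image bytes so the
    example stays runnable and self-contained.
    """
    seed = abs(hash(image.tobytes())) % (2**32)
    rng = np.random.default_rng(seed)
    vec = rng.normal(size=EMBEDDING_DIM)
    return vec / np.linalg.norm(vec)

def same_person(image_a: np.ndarray, image_b: np.ndarray) -> bool:
    """Compare two face photos via cosine similarity of their embeddings."""
    a, b = embed_face(image_a), embed_face(image_b)
    similarity = float(np.dot(a, b))   # both vectors are unit length
    return similarity >= MATCH_THRESHOLD

# Usage: compare a probe photo against a gallery of scraped photos.
probe = np.zeros((160, 160, 3), dtype=np.uint8)      # placeholder image
gallery = [np.ones((160, 160, 3), dtype=np.uint8)]   # placeholder gallery
matches = [i for i, img in enumerate(gallery) if same_person(probe, img)]
print(matches)
```

The larger the training corpus of faces, the better the learned embedding separates different people, which is why the scale of Clearview's scraping matters to the accuracy claims described above.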

... The legal threats to Clearview have begun to move through the courts, and Clearview is preparing a powerful response, invoking the First Amendment. Many civil-liberties advocates fear the company will prevail, and they are aghast at the potential consequences. One major concern is that facial-recognition technology might be too flawed for law enforcement to rely on. A federal agency called the National Institute of Standards and Technology (NIST) periodically tests the accuracy of facial-recognition algorithms voluntarily submitted by vendors; Clearview hasn’t participated. In 2019, the agency found that many algorithms were less accurate in identifying people of color, meaning their use could worsen systemic bias in the criminal-justice system. In the last year, three cases have been unearthed (none involving Clearview) in which police officers arrested and briefly jailed the wrong person based on a bad facial-recognition match. All three of the wrongfully arrested were Black men.

There’s also a broader reason that critics fear a court decision favoring Clearview: It could let companies track us as pervasively in the real world as they already do online.
