
Article

18 March 2021

Author:
Kashmir Hill, New York Times

Clearview AI scraped the internet to build a facial-recognition tool & blew open the future of privacy

"Your Face Is Not Your Own", 18 March 2021

... Few outside law enforcement knew of Clearview’s existence back then... The government often avoids tipping off would-be criminals to cutting-edge investigative techniques, and Clearview’s founders worried about the reaction to their product. Helping to catch sex abusers was clearly a worthy cause, but the company’s method of doing so — hoovering up the personal photos of millions of Americans — was unprecedented and shocking. Indeed, when the public found out about Clearview last year... an immense backlash ensued.

Facebook, LinkedIn, Venmo and Google sent cease-and-desist letters to the company, accusing it of violating their terms of service and demanding, to no avail, that it stop using their photos. BuzzFeed published a leaked list of Clearview users, which included not just law enforcement but major private organizations including Bank of America and the N.B.A. (Each says it only tested the technology and was never a client.)

... A.I. software can analyze countless photos of people’s faces and learn to make impressive predictions about which images are of the same person; the more faces it inspects, the better it gets. Clearview is deploying this approach using billions of photos from the public internet. By testing legal and ethical limits around the collection and use of those images, it has become the front-runner in the field.

... The legal threats to Clearview have begun to move through the courts, and Clearview is preparing a powerful response, invoking the First Amendment. Many civil-liberties advocates fear the company will prevail, and they are aghast at the potential consequences. One major concern is that facial-recognition technology might be too flawed for law enforcement to rely on. A federal agency called the National Institute of Standards and Technology (NIST) periodically tests the accuracy of facial-recognition algorithms voluntarily submitted by vendors; Clearview hasn’t participated. In 2019, the agency found that many algorithms were less accurate in identifying people of color, meaning their use could worsen systemic bias in the criminal-justice system. In the last year, three cases have been unearthed (none involving Clearview) in which police officers arrested and briefly jailed the wrong person based on a bad facial-recognition match. All three of the wrongfully arrested were Black men.

There’s also a broader reason that critics fear a court decision favoring Clearview: It could let companies track us as pervasively in the real world as they already do online.
