

Article

28 January 2019

Author:
Joy Buolamwini, Medium

Commentary: Amazon should halt use of facial recognition technology for policing & govt. surveillance

In this article... I... address... criticisms... made by those with an interest in keeping the use, abuse, and technical immaturity of AI systems in the dark... AI services the company provides to law enforcement and other customers can be abused regardless of accuracy... Among the most concerning uses of facial analysis technology are the bolstering of mass surveillance, the weaponization of AI, and harmful discrimination in law enforcement contexts... Because this powerful technology is being rapidly developed and adopted without oversight, the Algorithmic Justice League and the Center on Privacy & Technology launched the Safe Face Pledge. The pledge prohibits lethal use of any kind of facial analysis technology, including facial recognition, and aims to mitigate abuses.

As an expert on bias in facial analysis technology, I advise Amazon to:

1) immediately halt the use of facial recognition and any other kinds of facial analysis technology in high-stakes contexts like policing and government surveillance

2) submit company models currently in use by customers to the National Institute of Standards and Technology benchmark.

Part of the following timelines

Shareholders & civil society groups urge Amazon to halt sale of facial recognition software to law enforcement agencies

Facial analysis technology often recreates racial & gender bias, says expert
