

Report

2 September 2020

Author:
AI Now Institute

AI Now launches compendium exploring attempts to regulate biometric systems given human rights & other risks

Photo: Canva

"AI Now Launches 'Regulating Biometrics: Global Approaches and Open Questions,'" 2 September, 2020

Amid heightened public scrutiny, interest in regulating biometric technologies like face and voice recognition has grown significantly across the globe... Advocates continue to remind developers, profiteers, and those using and regulating these biometric systems that the future course of these technologies must, and will, be subject to greater democratic control. The next few years are poised to produce wide-ranging legal regulation in many parts of the world that could alter the future course of these technologies... AI Now worked with academics, advocates, and policy experts to publish a compendium of case studies on current attempts to regulate biometric systems, and reflect on the promise, and the limits, of the law.

... In recent years, privacy advocates have demanded regulatory tools that ensure transparency as early in the process as possible... In the EU, advocacy organizations like Access Now and Algorithm Watch have called for a mandatory disclosure scheme for all AI systems used in the public sector, in conjunction with a mandatory human rights or algorithmic impact assessment...
