
๊ธฐ์‚ฌ

2018๋…„ 6์›” 12์ผ

์ €์ž:
Russell Brandom, The Verge

Amazon urged to disclose public bias testing data for its facial recognition software; co. didn't respond

"Amazon needs to come clean about racial bias in its algorithms", 23 May 2018

[A]mazonโ€™s quiet Rekognition program became very public, as new documents obtained by the ACLU of Northern California showed the system partnering with the city of Orlando and police camera vendors like Motorola Solutions for an aggressive new real-time facial recognition service. Amazon insists that the service is a simple object-recognition tool and will only be used for legal purposes. But even if we take the company at its word, the project raises serious concerns, particularly around racial bias...Facial recognition systems have long struggled with higher error rates for women and people of color โ€” error rates that can translate directly into more stops and arrests for marginalized groups. And while some companies have responded with public bias testing, Amazon hasnโ€™t shared any data on the issue, if itโ€™s collected data at all...ACLU-NCโ€™s Matt Cagle [said]: "โ€œFace recognition is a biased technology. It doesnโ€™t make communities safer. It just powers even greater discriminatory surveillance and policing.โ€...In the most basic terms... facial recognition systems pose an added threat of wrongful accusation and arrest for non-white people...I asked Amazon directly if the company has any data on bias testing for Rekognition, but so far, nothing has turned up...

ํƒ€์ž„๋ผ์ธ

๊ฐœ์ธ์ •๋ณด

์ด ์›น์‚ฌ์ดํŠธ๋Š” ์ฟ ํ‚ค ๋ฐ ๊ธฐํƒ€ ์›น ์ €์žฅ ๊ธฐ์ˆ ์„ ์‚ฌ์šฉํ•ฉ๋‹ˆ๋‹ค. ์•„๋ž˜์—์„œ ๊ฐœ์ธ์ •๋ณด๋ณดํ˜ธ ์˜ต์…˜์„ ์„ค์ •ํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค. ๋ณ€๊ฒฝ ์‚ฌํ•ญ์€ ์ฆ‰์‹œ ์ ์šฉ๋ฉ๋‹ˆ๋‹ค.

์›น ์ €์žฅ์†Œ ์‚ฌ์šฉ์— ๋Œ€ํ•œ ์ž์„ธํ•œ ๋‚ด์šฉ์€ ๋‹ค์Œ์„ ์ฐธ์กฐํ•˜์„ธ์š” ๋ฐ์ดํ„ฐ ์‚ฌ์šฉ ๋ฐ ์ฟ ํ‚ค ์ •์ฑ…

Strictly necessary storage

ON
OFF

Necessary storage enables core site functionality. This site cannot function without it, so it can only be disabled by changing settings in your browser.

๋ถ„์„ ์ฟ ํ‚ค

ON
OFF

๊ท€ํ•˜๊ฐ€ ์šฐ๋ฆฌ ์›น์‚ฌ์ดํŠธ๋ฅผ ๋ฐฉ๋ฌธํ•˜๋ฉด Google Analytics๋ฅผ ์‚ฌ์šฉํ•˜์—ฌ ๊ท€ํ•˜์˜ ๋ฐฉ๋ฌธ ์ •๋ณด๋ฅผ ์ˆ˜์ง‘ํ•ฉ๋‹ˆ๋‹ค. ์ด ์ฟ ํ‚ค๋ฅผ ์ˆ˜๋ฝํ•˜๋ฉด ์ €ํฌ๊ฐ€ ๊ท€ํ•˜์˜ ๋ฐฉ๋ฌธ์— ๋Œ€ํ•œ ์ž์„ธํ•œ ๋‚ด์šฉ์„ ์ดํ•ดํ•˜๊ณ , ์ •๋ณด ํ‘œ์‹œ ๋ฐฉ๋ฒ•์„ ๊ฐœ์„ ํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค. ๋ชจ๋“  ๋ถ„์„ ์ •๋ณด๋Š” ์ต๋ช…์ด ๋ณด์žฅ๋˜๋ฉฐ ๊ท€ํ•˜๋ฅผ ์‹๋ณ„ํ•˜๋Š”๋ฐ ์‚ฌ์šฉํ•˜์ง€ ์•Š์Šต๋‹ˆ๋‹ค. Google์€ ๋ชจ๋“  ๋ธŒ๋ผ์šฐ์ €์— ๋Œ€ํ•ด Google Analytics ์„ ํƒ ํ•ด์ œ ์ถ”๊ฐ€ ๊ธฐ๋Šฅ์„ ์ œ๊ณตํ•ฉ๋‹ˆ๋‹ค.

ํ”„๋กœ๋ชจ์…˜ ์ฟ ํ‚ค

ON
OFF

์šฐ๋ฆฌ๋Š” ์†Œ์…œ๋ฏธ๋””์–ด์™€ ๊ฒ€์ƒ‰ ์—”์ง„์„ ํฌํ•จํ•œ ์ œ3์ž ํ”Œ๋žซํผ์„ ํ†ตํ•ด ๊ธฐ์—…๊ณผ ์ธ๊ถŒ์— ๋Œ€ํ•œ ๋‰ด์Šค์™€ ์—…๋ฐ์ดํŠธ๋ฅผ ์ œ๊ณตํ•ฉ๋‹ˆ๋‹ค. ์ด ์ฟ ํ‚ค๋Š” ์ด๋Ÿฌํ•œ ํ”„๋กœ๋ชจ์…˜์˜ ์„ฑ๊ณผ๋ฅผ ์ดํ•ดํ•˜๋Š”๋ฐ ๋„์›€์ด ๋ฉ๋‹ˆ๋‹ค.

์ด ์‚ฌ์ดํŠธ์— ๋Œ€ํ•œ ๊ฐœ์ธ์ •๋ณด ๊ณต๊ฐœ ๋ฒ”์œ„ ์„ ํƒ

์ด ์‚ฌ์ดํŠธ๋Š” ํ•„์š”ํ•œ ํ•ต์‹ฌ ๊ธฐ๋Šฅ ์ด์ƒ์œผ๋กœ ๊ท€ํ•˜์˜ ๊ฒฝํ—˜์„ ํ–ฅ์ƒ์‹œํ‚ค๊ธฐ ์œ„ํ•ด ์ฟ ํ‚ค ๋ฐ ๊ธฐํƒ€ ์›น ์ €์žฅ ๊ธฐ์ˆ ์„ ์‚ฌ์šฉํ•ฉ๋‹ˆ๋‹ค.