

Article

31 March 2022

Author:
Michael Keller, The New York Times

Meta's content moderation allegedly misclassifies some cases of child sexual abuse when ages are unclear; includes company comment

"Adults or Sexually Abused Minors? Getting It Right Vexes Facebook", 31 March 2022

Facebook is a leader among tech companies in detecting child sexual abuse content... But concerns about mistakenly accusing people of posting illegal imagery have resulted in a policy that could allow photos and videos of abuse to go unreported.

Meta... has instructed content moderators for its platforms to “err on the side of an adult” when they are uncertain about the age of a person in a photo or video, according to a corporate training document.

Antigone Davis, head of safety for Meta, confirmed the policy in an interview and said it stemmed from privacy concerns for those who post sexual imagery of adults... “The sexual abuse of children online is abhorrent,” Ms. Davis said, emphasizing that Meta employs a multilayered, rigorous review process that flags far more images than any other tech company. She said the consequences of erroneously flagging child sexual abuse could be “life-changing” for users.

While it is impossible to quantify the number of images that might be misclassified, child safety experts said the company was undoubtedly missing some minors.

The training document... was created for moderators working for Accenture, a consulting firm that has a contract to sort through Facebook’s noxious content and remove it from the site. Accenture declined to comment on the practice.

Legal and tech policy experts said that social media companies had a difficult path to navigate. If they fail to report suspected illicit imagery, they can be pursued by the authorities; if they report legal imagery as child sexual abuse material, they can be sued and accused of acting recklessly.
