Meta's content moderation allegedly misclassifies some cases of child sexual abuse material when ages are unclear; includes company comment
"Adults or Sexually Abused Minors? Getting It Right Vexes Facebook", 31 March 2022
Facebook is a leader among tech companies in detecting child sexual abuse content... But concerns about mistakenly accusing people of posting illegal imagery have resulted in a policy that could allow photos and videos of abuse to go unreported.
Meta... has instructed content moderators for its platforms to “err on the side of an adult” when they are uncertain about the age of a person in a photo or video, according to a corporate training document.
Antigone Davis, head of safety for Meta, confirmed the policy in an interview and said it stemmed from privacy concerns for those who post sexual imagery of adults... “The sexual abuse of children online is abhorrent,” Ms. Davis said, emphasizing that Meta employs a multilayered, rigorous review process that flags far more images than any other tech company. She said the consequences of erroneously flagging child sexual abuse could be “life-changing” for users.
While it is impossible to quantify the number of images that might be misclassified, child safety experts said the company was undoubtedly missing some minors.
The training document... was created for moderators working for Accenture, a consulting firm that has a contract to sort through Facebook’s noxious content and remove it from the site. Accenture declined to comment on the practice.
Legal and tech policy experts said that social media companies had a difficult path to navigate. If they fail to report suspected illicit imagery, they can be pursued by the authorities; if they report legal imagery as child sexual abuse material, they can be sued and accused of acting recklessly.