
Story

4 October 2021

"Facebook Files" investigation uncovers company research identifying platform’s harms and left unfixed; incl. co. comments

A major investigation by the Wall Street Journal first published in September 2021, "The Facebook Files," finds that Facebook knows of flaws in its platforms that cause harm but has not fixed them. The findings come from reviewing a set of internal Facebook documents that includes research reports, online employee discussions, and draft presentations to management. The documents suggest the company's researchers have repeatedly identified platform harms, and that the company has not corrected these issues.

Among the investigation's reports are:

  • The company exempts high-profile users from some of its rules, despite stating that its policies apply to everyone. This programme "shields millions of VIPs from the company’s normal enforcement, the documents show. Many abuse the privilege, posting material including harassment and incitement to violence that would typically lead to sanctions. Facebook says criticism of the program is fair, that it was designed for a good purpose and that the company is working to fix it."
  • Researchers at Facebook's subsidiary Instagram have repeatedly found that Instagram is harmful for a sizeable percentage of teenage girls. "In public, Facebook has consistently played down the app’s negative effects, including in comments to Congress, and hasn’t made its research public or available to academics or lawmakers who have asked for it. In response, Facebook says the negative effects aren’t widespread, that the mental-health research is valuable and that some of the harmful aspects aren’t easy to address."
  • Facebook employees flag drug cartels and human traffickers to little effect. Documents show "employees raising alarms about how its platforms are used in developing countries, where its user base is huge and expanding. Employees flagged that human traffickers in the Middle East used the site to lure women into abusive employment situations. They warned that armed groups in Ethiopia used the site to incite violence against ethnic minorities. They sent alerts to their bosses about organ selling, pornography and government action against political dissent, according to the documents. They also show the company’s response, which in many instances is inadequate or nothing at all. A Facebook spokesman said the company has deployed global teams, local partnerships and third-party fact checkers to keep users safe."

Along with statements on individual reports, and a rebuttal to the story on Instagram’s effects on teenage girls, Facebook published a response to the series acknowledging the investigation's topics are "some of the most difficult issues we grapple with as a company — from content moderation and vaccine misinformation, to algorithmic distribution and the well-being of teens" and that "it is absolutely legitimate for us to be held to account for how we deal with them." It continues, "[b]ut these stories have contained deliberate mischaracterizations of what we are trying to do, and conferred egregiously false motives to Facebook’s leadership and employees" and disputes the allegation that "Facebook conducts research and then systematically and willfully ignores it if the findings are inconvenient for the company."

In late September 2021, the company announced it was "pausing" plans to develop "Instagram Kids", a version of the photo app for children under 13 years old.

In October 2021, Frances Haugen, a former Facebook product manager, revealed herself as the whistleblower who leaked internal company documents to the Wall Street Journal, and accused the company of placing “profit over safety.” She has also filed complaints with US securities regulators alleging that Facebook misled investors.
