
Story

4 October 2021

"Facebook Files" investigation uncovers company research identifying platform’s harms and left unfixed; incl. co. comments

A major investigation by the Wall Street Journal, "The Facebook Files," first published in September 2021, finds that Facebook knows of flaws in its platforms that cause harm but has not fixed them. The findings come from a review of internal Facebook documents, including research reports, online employee discussions, and draft presentations to management. The documents suggest the company's researchers have repeatedly identified platform harms and that the company has not corrected these issues.

Among the investigation's reports are:

  • The company exempts high-profile users from some of its rules, despite stating company policies apply to everyone. This programme "shields millions of VIPs from the company’s normal enforcement, the documents show. Many abuse the privilege, posting material including harassment and incitement to violence that would typically lead to sanctions. Facebook says criticism of the program is fair, that it was designed for a good purpose and that the company is working to fix it."
  • Researchers at Facebook's subsidiary Instagram have repeatedly found that Instagram is harmful for a sizeable percentage of teenage girls. "In public, Facebook has consistently played down the app’s negative effects, including in comments to Congress, and hasn’t made its research public or available to academics or lawmakers who have asked for it. In response, Facebook says the negative effects aren’t widespread, that the mental-health research is valuable and that some of the harmful aspects aren’t easy to address."
  • Facebook employees flag drug cartels and human traffickers to little effect. Documents show "employees raising alarms about how its platforms are used in developing countries, where its user base is huge and expanding. Employees flagged that human traffickers in the Middle East used the site to lure women into abusive employment situations. They warned that armed groups in Ethiopia used the site to incite violence against ethnic minorities. They sent alerts to their bosses about organ selling, pornography and government action against political dissent, according to the documents. They also show the company’s response, which in many instances is inadequate or nothing at all. A Facebook spokesman said the company has deployed global teams, local partnerships and third-party fact checkers to keep users safe."

Along with statements on individual reports, and a rebuttal to the story on Instagram’s effects on teenage girls, Facebook published a response to the series acknowledging the investigation's topics are "some of the most difficult issues we grapple with as a company — from content moderation and vaccine misinformation, to algorithmic distribution and the well-being of teens" and that "it is absolutely legitimate for us to be held to account for how we deal with them." It continues, "[b]ut these stories have contained deliberate mischaracterizations of what we are trying to do, and conferred egregiously false motives to Facebook’s leadership and employees" and disputes the allegation that "Facebook conducts research and then systematically and willfully ignores it if the findings are inconvenient for the company."

In late September 2021, the company announced it was "pausing" plans to develop "Instagram Kids", a version of the photo app for children under 13 years old.

In October 2021, Frances Haugen, a former Facebook product manager, revealed herself as the whistleblower who leaked the internal company documents to the Wall Street Journal and accused the company of placing "profit over safety." She has also filed complaints with US securities regulators alleging that Facebook misled investors.

