

Story

4 October 2021

"Facebook Files" investigation uncovers company research identifying platform’s harms and left unfixed; incl. co. comments

A major investigation by the Wall Street Journal, "The Facebook Files," first published in September 2021, finds that Facebook knows of flaws in its platforms that cause harm but has not fixed them. The findings come from a review of internal Facebook documents, including research reports, online employee discussions, and draft presentations to management. The documents suggest the company's researchers have repeatedly identified platform harms and that the company has not corrected these issues.

Among the investigation's reports are:

  • The company exempts high-profile users from some of its rules, despite stating company policies apply to everyone. This programme "shields millions of VIPs from the company’s normal enforcement, the documents show. Many abuse the privilege, posting material including harassment and incitement to violence that would typically lead to sanctions. Facebook says criticism of the program is fair, that it was designed for a good purpose and that the company is working to fix it."
  • Researchers at Facebook's subsidiary Instagram have repeatedly found that Instagram is harmful for a sizeable percentage of teenage girls. "In public, Facebook has consistently played down the app’s negative effects, including in comments to Congress, and hasn’t made its research public or available to academics or lawmakers who have asked for it. In response, Facebook says the negative effects aren’t widespread, that the mental-health research is valuable and that some of the harmful aspects aren’t easy to address."
  • Facebook employees flag drug cartels and human traffickers to little effect. Documents show "employees raising alarms about how its platforms are used in developing countries, where its user base is huge and expanding. Employees flagged that human traffickers in the Middle East used the site to lure women into abusive employment situations. They warned that armed groups in Ethiopia used the site to incite violence against ethnic minorities. They sent alerts to their bosses about organ selling, pornography and government action against political dissent, according to the documents. They also show the company’s response, which in many instances is inadequate or nothing at all. A Facebook spokesman said the company has deployed global teams, local partnerships and third-party fact checkers to keep users safe."

Along with statements on individual reports, and a rebuttal to the story on Instagram’s effects on teenage girls, Facebook published a response to the series acknowledging the investigation's topics are "some of the most difficult issues we grapple with as a company — from content moderation and vaccine misinformation, to algorithmic distribution and the well-being of teens" and that "it is absolutely legitimate for us to be held to account for how we deal with them." It continues, "[b]ut these stories have contained deliberate mischaracterizations of what we are trying to do, and conferred egregiously false motives to Facebook’s leadership and employees" and disputes the allegation that "Facebook conducts research and then systematically and willfully ignores it if the findings are inconvenient for the company."

In late September 2021, the company announced it was "pausing" plans to develop "Instagram Kids", a version of the photo app for children under 13 years old.

In October 2021, Frances Haugen, a former Facebook product manager, revealed herself as the whistleblower who leaked internal company documents to the Wall Street Journal, and accused the company of placing "profit over safety." She has filed complaints with US securities regulators alleging that Facebook misled investors.

Timeline