
Story

4 October 2021

"Facebook Files" investigation uncovers company research identifying platform’s harms and left unfixed; incl. co. comments

A major investigation by the Wall Street Journal first published in September 2021, "The Facebook Files," finds that Facebook knows of flaws in its platforms that cause harm but has not fixed them. The findings come from reviewing a set of internal Facebook documents that includes research reports, online employee discussions, and draft presentations to management. The documents suggest the company's researchers have repeatedly identified platform harms, and that the company has not corrected these issues.

Among the investigation's reports are:

  • The company exempts high-profile users from some of its rules, despite stating company policies apply to everyone. This programme "shields millions of VIPs from the company’s normal enforcement, the documents show. Many abuse the privilege, posting material including harassment and incitement to violence that would typically lead to sanctions. Facebook says criticism of the program is fair, that it was designed for a good purpose and that the company is working to fix it."
  • Researchers at Facebook's subsidiary Instagram have repeatedly found that Instagram is harmful for a sizeable percentage of teenage girls. "In public, Facebook has consistently played down the app’s negative effects, including in comments to Congress, and hasn’t made its research public or available to academics or lawmakers who have asked for it. In response, Facebook says the negative effects aren’t widespread, that the mental-health research is valuable and that some of the harmful aspects aren’t easy to address."
  • Facebook employees flag drug cartels and human traffickers to little effect. Documents show "employees raising alarms about how its platforms are used in developing countries, where its user base is huge and expanding. Employees flagged that human traffickers in the Middle East used the site to lure women into abusive employment situations. They warned that armed groups in Ethiopia used the site to incite violence against ethnic minorities. They sent alerts to their bosses about organ selling, pornography and government action against political dissent, according to the documents. They also show the company’s response, which in many instances is inadequate or nothing at all. A Facebook spokesman said the company has deployed global teams, local partnerships and third-party fact checkers to keep users safe."

Along with statements on individual reports, and a rebuttal to the story on Instagram’s effects on teenage girls, Facebook published a response to the series acknowledging the investigation's topics are "some of the most difficult issues we grapple with as a company — from content moderation and vaccine misinformation, to algorithmic distribution and the well-being of teens" and that "it is absolutely legitimate for us to be held to account for how we deal with them." It continues, "[b]ut these stories have contained deliberate mischaracterizations of what we are trying to do, and conferred egregiously false motives to Facebook’s leadership and employees" and disputes the allegation that "Facebook conducts research and then systematically and willfully ignores it if the findings are inconvenient for the company."

In late September 2021, the company announced it was "pausing" plans to develop "Instagram Kids", a version of the photo app for children under 13 years old.

In October 2021, Frances Haugen, a former Facebook product manager, revealed herself as the whistleblower who leaked the internal company documents to the Wall Street Journal and accused the company of placing "profit over safety." She has filed complaints with US securities regulators alleging that Facebook misled investors.
