
This page is not available in German and is displayed in English

Article

16 Nov 2018

Author:
Mark Zuckerberg, Facebook

Letter from Facebook CEO regarding content governance & enforcement


"A Blueprint for Content Governance and Enforcement," 15 Nov 2018

[W]e have a responsibility to keep people safe on our services -- whether from terrorism, bullying, or other threats. We also have a broader social responsibility to help bring people closer together -- against polarization and extremism. The past two years have shown that without sufficient safeguards, people will misuse these tools to interfere in elections, spread misinformation, and incite violence.

... The single most important improvement in enforcing our policies is using artificial intelligence to proactively report potentially problematic content... Moving from reactive to proactive handling of content at scale has only started to become possible recently because of advances in artificial intelligence... In the past year, we have prioritized identifying people and content related to spreading hate in countries with crises like Myanmar. We were too slow to get started here, but in the third quarter of 2018, we proactively identified about 63% of the hate speech we removed in Myanmar, up from just 13% in the last quarter of 2017...

A fundamental question is how we can ensure that our systems are not biased in ways that treat people unfairly... In the next year, we're planning to create a new way for people to appeal content decisions to an independent body, whose decisions would be transparent and binding...

While creating independent oversight and transparency is necessary, I believe the right regulations will also be an important part of a full system of content governance and enforcement. At the end of the day, services must respect local content laws, and I think everyone would benefit from greater clarity on how local governments expect content moderation to work in their countries.