
13 Apr 2021

Tim De Chant, Ars Technica

Facebook users can now petition company's oversight board to remove content

Facebook is allowing its oversight board to rule on moderation decisions relating to content that remains on its Facebook and Instagram platforms. Previously, the only rulings the board could issue were to restore content that moderators had removed.

... To appeal a post, a person must have an active Facebook account and must have exhausted the company’s appeals process. At that point, the user can take their petition to the oversight board.

In the six months since it was founded, the board has issued eight decisions, basing its rulings on Facebook's policies and overruling the company in five of them. The board's charter says that its rulings are binding "unless implementation of a resolution could violate the law." Since Facebook is available in every country except those that officially ban it, such as China, it's unclear which countries' laws this clause might invoke.

... Neither Facebook nor the oversight board has disclosed how many removals have been appealed. But based on reported figures, the board has ruled on around 0.003 percent of posts moderated for violating hate speech policies... Facebook appears to be hoping that automation can extend the oversight board's decisions to other posts it deems similar. When that happens, the company says it will "take action by analyzing whether it is technically and operationally feasible to apply the board's decision to that content as well." It's unclear whether moderation that results from automated application of these decisions can itself be appealed to the oversight board.