Facebook establishes oversight board to review content moderation decisions
In November 2018, Facebook announced plans to create an oversight board for content decisions. Over the following year the board took shape as a body of external experts that reviews cases in which users have objected to the removal of content from the company's platforms and have exhausted the company's internal appeals process. The board's first 20 members, a mix of academics, lawyers, and human rights activists, were appointed in May 2020.
Since it began accepting cases in October 2020, the board has received tens of thousands of appeals, from which it selects a small number of contested decisions to rule on. Each chosen case is reviewed by a five-member panel of the board, whose decision is then put to the full board for majority approval. The board's rulings on specific posts are binding, meaning Facebook must restore content if instructed to do so. The board can also issue policy recommendations, which Facebook may choose whether or not to adopt.
The board's limited remit has drawn criticism, with researchers arguing that the company should grant the board broader scope to review practices concerning user data, advertising, algorithms, and controversial content that has been left up. Business for Social Responsibility (BSR) has conducted human rights reviews of the oversight board's governance and operations.
In January 2021, Facebook referred to the oversight board its decision to suspend Donald Trump's accounts after he used the company's platforms to incite the violence at the US Capitol.
The oversight board announced its first decisions in January 2021, overturning Facebook's removal of content in four of the five cases reviewed and issuing nine non-binding policy recommendations.