Story

10 Dec 2021

Santa Clara Principles present standards for tech platforms to provide transparency and accountability in content moderation

In May 2018, a coalition of organizations, advocates, and academics came together to create the Santa Clara Principles on Transparency and Accountability Around Content Moderation. The principles responded to growing concerns about internet platforms' lack of transparency and accountability in how they create and enforce their content moderation policies.

The principles recommend initial steps that companies engaged in content moderation should take to provide meaningful due process to impacted speakers and to better ensure that enforcement of their content guidelines is fair, unbiased, proportional, and respectful of users’ rights.

During the COVID-19 pandemic, many platforms signalled that they would increase their reliance on automated tools for content moderation purposes. Some services also announced that they would be suspending their appeals processes, thereby impeding users’ access to due process.

Because of these concerns, between 2020 and 2021 the Santa Clara Principles coalition issued an open call for comments from a broad range of global stakeholders, advocates, and academic experts to develop a second iteration of the Santa Clara Principles, which was launched in December 2021.
