Article

22 Mar 2018

Author:
New York University Stern Center for Business & Human Rights

New report calls on businesses to protect human rights on digital platforms

"Harmful Content: The Role of Internet Platform Companies in Fighting Terrorist Incitement and Politically Motivated Disinformation", Nov 2017

...[T]his white paper...focuses on two types of dangerous online content: terrorist incitement and politically motivated disinformation...Internet companies have resisted government content regulations. They justifiably worry that many states would seek to suppress dissenting views, undermining free speech online...

...In the absence of government regulation, however, it is incumbent on the major platforms to assume a more active self-governance role. Corporate leaders should take responsibility to vindicate core societal interests...while elevating journalistic reporting and civil discourse...

...We urge the companies to conduct across-the-board internal assessments of the threats posed by terrorist content and political disinformation. This risk analysis should call on engineering, product, sales, and public policy groups to identify problematic content, as well as the algorithmic and social pathways by which it is distributed...[T]he platform companies must continually refine...programmes to account for changing circumstances...Companies can adjust their user interfaces to include warnings, notifications, and other forms of friction between suspicious content and individuals...This kind of mild but informative friction discourages a user from sharing content or from clicking on fraudulent material...[I]nternet platforms will need to devote a significant number of people to monitoring and evaluating content... [refers to Google, Facebook, LinkedIn, Twitter & YouTube]