

Report

6 October 2021

Authors:
Eliska Pirkova, Eva Simon, Chloé Berthélémy, EDRi

Human rights experts warn that EU's Digital Services Act could repeat regulatory mistakes & negatively impact fundamental rights

"Warning: the EU’s Digital Services Act could repeat TERREG’s mistakes", 06 October 2021

On 30 September, the Committee on Legal Affairs (JURI) in the European Parliament approved its draft report on a Single Market for Digital Services (Digital Services Act) [or DSA]. We had expressed our concerns to Members of the European Parliament about the negative fundamental rights implications of some of the measures proposed in the report...

[In 2018, the European Commission] introduced the first text of the online terrorist content regulation, commonly referred to as ‘TERREG’ [or TCO]. The controversial legal proposal, full of short-sighted “solutions” to deeply complex societal issues such as terrorism and online radicalisation, drew negative criticism from international human rights bodies and civil society organisations...

[…]

In the last two years, there have been heated debates in the European Parliament over the TCO Regulation's content removal deadline and whether a one-hour time frame is feasible for providers to remove or block terrorist content from the internet... [The DSA] gives providers 24 hours to disable access to illegal content, such as content that can harm public policy...

[The requirement] to remove allegedly illegal content in such a short time frame, without the possibility of turning to a court, will ultimately hand platforms extra power without proper oversight...

[…]

...While the legislator missed the opportunity in the TCO Regulation to require independent judicial oversight of the removal of users’ content, the very same problem emerges from the DSA amendments submitted by the responsible IMCO committee. Competent authorities can and should support platforms when needed, but these authorities do not protect freedom of expression to the same degree as the judiciary...

[...]

...The DSA goes even further and requires platforms to proactively inform law enforcement if a “serious criminal offence is likely to take place”, even though platforms and moderators are not trained lawyers and cannot assist victims of crimes. Such a requirement will lead platforms to censor user content at the slightest chance of illegality in order to avoid any liability.
