The Digital Services Act

The final negotiating phase on the Digital Services Act (DSA), the so-called Trilogue between the European Commission, the European Parliament and the Council of the EU, has started. The DSA will establish a common set of obligations and accountability rules for providers of network infrastructure, hosting service providers and online (social media) platforms. Online platforms have become increasingly important in our daily lives, bringing new opportunities but also new risks. It is our duty to make sure that what is illegal offline is also illegal online. This is without a doubt a crucial moment for the future of online hate speech moderation. The International Network Against Cyber Hate therefore considers it important to issue a statement highlighting the issues that should not be overlooked by European lawmakers. 

We welcome the establishment of more user-friendly mechanisms for reporting content to international social media platforms. These mechanisms need to be well known to everyone, and especially accessible to children and young people. At present, the complexity of the reporting mechanisms and the lack of awareness of their existence discourage the reporting of online hate speech. 

The DSA mandates social media platforms to provide content creators with information and reasoning when their content is removed. However, it does not oblige them to offer the same transparency towards users who report content. It is important for users to understand the decisions a platform makes about reported content, since those decisions directly affect their lives, especially when the users are young. 

The DSA recognises the use of Trusted Flaggers, a system that has been well established for several years and draws on the experience of Civil Society Organizations (CSOs). It is very important to underline, however, that these Trusted Flaggers should come from a wide range of thematically diverse CSOs, not only those large enough to lobby for a spot. 

Technologies to voluntarily detect and remove certain forms of content (e.g. CSAM or terrorist material) have existed for years. These technologies should be applied to more areas, such as online hate speech: known illegal content could be identified using hash values, while Artificial Intelligence could flag potentially illegal content for human review. 
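The hash-matching idea above can be sketched in a few lines. This is only an illustration of the general technique, not any platform's actual system; the database contents and function names are hypothetical, and real deployments typically use perceptual rather than cryptographic hashes so that slightly modified media still match.

```python
import hashlib

# Hypothetical database of hashes of content already judged illegal.
# In practice this would be a large, curated industry database.
KNOWN_ILLEGAL_HASHES = {
    hashlib.sha256(b"previously removed content").hexdigest(),
}

def is_known_illegal(content: bytes) -> bool:
    """Return True if the content's SHA-256 digest matches a known entry."""
    digest = hashlib.sha256(content).hexdigest()
    return digest in KNOWN_ILLEGAL_HASHES

# Exact re-uploads of known content are caught immediately; novel or
# modified content falls through to AI flagging and human review.
print(is_known_illegal(b"previously removed content"))  # True
print(is_known_illegal(b"new, unseen content"))         # False
```

Because a hash lookup is cheap and deterministic, this step can run at upload time, leaving only previously unseen content for the slower, human-supervised review pipeline.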

Social media platforms should also create a point of contact for users in every member state, accessible in one of its official languages and providing proof of delivery. These points of contact should be meaningful and accessible. Finally, a growing number of new platforms have high user numbers but no assets or representatives accessible within the EU. The DSA should allow specific enforcement measures against such fundamentally non-compliant services. 

We are looking forward to a positive outcome of the Trilogue meeting!