
The platforms and search engines will also be subject to annual assessments and audits. Source: Tetiana Shyshkina / Unsplash

The Digital Services Act – the EU’s new leash on big platforms and search engines

The legislation will allow for unprecedented public oversight of online space

Yesterday, the European Commission introduced the new Digital Services Act (DSA), which aims to create more accountability among internet platforms. The DSA, which entered into force on 16 November, comes with a serious promise: to require big players in online services to take stronger measures against fake news.

It also claims to strengthen freedom of speech for ordinary citizens by opening platforms up to outside arbitration and litigation.

Furthermore, the DSA calls for the creation of Digital Services Coordinators in each Member State, who will cooperate with the Commission and the European Board for Digital Services. Their job will be to supervise platforms and search engines at the national level.

Digital services’ new role

According to a statement by the Commission, the DSA gives platforms new responsibilities: to limit the spread of illegal content and products online, to increase the protection of minors, and to give users more choice and better information.

However, the policy also scales these obligations with the number of people using a platform, so bigger players bear more responsibility towards their users. All relevant players will have to use a new flagging system for illegal content.

In addition, platforms and search engines with more than 45 million users will be subject to annual safety assessments and audits performed by the EU. These assessments will cover issues such as the spread of illegal goods and disinformation.

Public oversight for digital platforms

The DSA creates an unprecedented level of public oversight over large online platforms and search engines, at both the national and EU level. It also opens the platforms up to much more public scrutiny.

For example, platforms will be more constrained in how they moderate content, as users will have more ways to challenge their decisions, including decisions based on a platform’s terms and conditions. Users will now be able to complain directly to the platform, choose an out-of-court dispute settlement body or go to court.

The largest platforms and search engines will also have to make comprehensive risk assessments for people’s fundamental rights, including freedom of expression, personal data protection and child protection.

This article is part of Read Twice – an EU-funded project, coordinated by Euro Advance Association, that targets young people and aims to counter disinformation and fake news by enhancing their skills to critically assess information, identify vicious and harmful media content and distinguish between facts and opinions, thus improving their media literacy competences.

The contents of this publication are the sole responsibility of its author and do not necessarily reflect the opinion of the European Union.
