Content moderation is about making the right decision. Source: Depositphotos

Combating disinformation with content moderation? The modern ethical dilemma

The way information is treated in the public space varies wildly across the world

Content moderation in the online space presents a profound ethical dilemma, particularly when it comes to combating the spread of disinformation and misinformation. On one hand, there is a moral imperative to shield users from dangerous falsehoods that can undermine pillars of public life such as democracy, education, public health, and social cohesion.

On the other hand, the very act of moderating content raises complex ethical questions about freedom of speech, censorship, and the ever-growing power of tech companies, and by extension their trusted partners, to shape public discourse and popular opinion.

From election interference to public health crises and even the daily news, the consequences of unchecked, unmoderated content can range from mildly concerning to devastating. As such, there is a pressing need for platforms to take proactive measures to curb the spread of disinformation.

The imperfection of moderation

However, the task of content moderation is rife with ethical challenges. Determining what constitutes disinformation is often subjective and context-dependent, raising concerns about censorship and the stifling of legitimate discourse. Moreover, the sheer volume of content uploaded to online platforms every minute makes it virtually impossible to review every piece manually, necessitating the use of automated moderation systems.

These automated systems, while time-efficient, are prone to errors and biases, leading to the inadvertent removal of legitimate content or, at the other end of the spectrum, the failure to detect sophisticated disinformation campaigns. Additionally, the opaque nature of content moderation algorithms raises questions about accountability and transparency, as users are left in the dark about why their content was flagged or removed.

Furthermore, the concentration of moderation power in the hands of a few tech giants raises another set of concerns about censorship, this time regarding the potential erosion of democratic values. As private entities, these platforms have the authority to set their own moderation policies, often with little to no oversight or accountability.

Some proposed solutions

For content moderation to be effective, legislative measures should be put in place by a general council appointed specifically for this purpose. Such a council would consist of communication and journalism professionals, social science academics, and international law and policy specialists from around the globe. In this manner, the final consensus would take account of all perspectives and find proper common ground.

However, a single set of regulations applied across the entire planet would pose its own challenges, as its provisions might not align with the communication and free speech legislation in force in certain countries.

In addition, since many media companies are private entities, governments have little to no say in the policies they enforce within their own platforms, making it more difficult to reach a consensus on content moderation across all platforms and countries.

When navigating this ethical minefield, striking a balance between combating disinformation and upholding free speech rights is as difficult as it is essential. Doing so would require transparent moderation processes, robust oversight mechanisms, and meaningful engagement between stakeholders and public policymakers. It all needs to start with a common understanding of the issue, as well as a shared purpose to combat it.

This article is part of Read Twice, an EU-funded project coordinated by Euro Advance Association. The project targets young people and aims to counter disinformation and fake news by enhancing their skills to critically assess information, identify malicious and harmful media content, and distinguish between facts and opinions, thus improving their media literacy competences.

The contents of this publication are the sole responsibility of its author and do not necessarily reflect the opinion of the European Union or of TheMayor.EU.
