Going online will also involve looking inward from now on

Emotions and Perceptions: Disinformation exploits the things that make us human

Last Friday’s insightful webinar, part of the Read Twice project, revealed that we can’t take things for granted in the digital sphere anymore, but there’s hope

On 12 January, the Read Twice (R2) project held its international webinar “Beyond the Headlines: Tools and Skills to Detect Disinfo in the News”. The event was a great success, attracting an audience of over 150 participants keen to find out how experts deal with the pervasive phenomenon of false information in the online world.

In case you missed it, we’ve prepared a recap of the webinar so you can catch up on some of the eye-opening ideas that were discussed.

Two main themes emerged from the two panels and the workshop, which took place over five hours and brought together eminent experts from different European countries working in distinct spheres. These are themes that humankind needs to consider, adapt to and ultimately internalize if it wants to thrive in the context of disinformation: on the one hand, misleading stories work because they use tricks that play on our primal emotional strings; on the other, thanks to technological progress and AI, these false stories are now increasingly visual.

The emotional power play of disinfo tactics

Disinformation has always been with us and will remain so as long as human civilization exists. Its tactics evolve with societies, but what always stays the same is that its potency rests on the ability of its purveyors to trigger emotions, which can in turn trigger decisions and (re)actions. It can be used to shape political opinions, to make people buy something or simply to generate viewership online.

The realization that we are quite amenable to emotional manipulation can be depressing, but it doesn’t have to be. That’s why Andy Stoycheff from NTCenter offered his theory of the Brain as a Predictive Machine, based on his deep and expansive work on the cognitive layers of disinformation. The audience was enthralled by his spot-on analogies describing the multi-pronged ways in which disinformation and propaganda reach our consciousness. It often comes down to using the right adjectives, colours, sounds and symbols, which bypass rational thinking and tap straight into the limbic system to initiate a primal response.

This was very revealing indeed; however, we humans usually do a poor job of self-analysing and turning a critical lens on our own reactions and emotions. That’s why external help from a young age is necessary, and Dr Kari Kivinen (Faktabaari) took the stage to show why Finland has habitually taken the top spots in media literacy in Europe and the world. It is no coincidence, but the result of a well-planned policy of fostering children’s self-awareness about their relationship with information from the earliest ages.

What’s more, there is now a need to develop a new skill, AI literacy and awareness, which will determine our ability to be functioning citizens of an evolving society. Disinformation is common, but misinformation is just as rampant, even if it’s not intentional. Children will now have to be taught how to relate to AI and to harness its power while staying vigilant and healthily skeptical towards the information it provides.

On the other end of the spectrum, we have societies such as Bulgaria, where, due to a variety of complex factors, the issue of constantly updating the educational curriculum has not received sufficient attention in recent decades. The result is lower overall media literacy, which translates into higher vulnerability and susceptibility to external propaganda and conspiracy theories. Dr Keith Kiely from Gate Institute has been studying these effects in the Eastern European country for years and has identified several pain points that need to be remedied in order to increase media literacy.

Education is paramount but hardly enough when there is distrust in society towards its public institutions, such as politicians and traditional media outlets. Unfortunately, the transition from a communist regime to a democratic society has created deep-seated economic inequalities, with many people ending up on the losing side of this divide. This sense of distrust can then be, and has been, easily exploited by the Russian propaganda machine to promote Bulgarian political forces friendly to its objectives. A case in point was using the traditional distrust of poor Bulgarians towards their institutions to sow the idea that Ukrainian refugees in the country were being treated preferentially at the expense of native citizens in need.

Sara Ahnborg, from the Spokesperson’s Unit at the European Parliament, demonstrated that a lot of the above can also be a threat to more well-to-do countries in Europe. In fact, Moscow- and Beijing-produced disinformation is now considered the main threat to the upcoming European elections later this year.

Putin’s regime relies on instilling certain narratives that dissatisfied people across the continent can find resonant, and this time around, instead of likes and shares, it could actually translate into direct votes that would bring strong populist and anti-democratic parties to Brussels.

Some of the narratives are perennial populist classics that always find an audience, such as “the elites vs the people”, “threatened traditional values”, and “supposed loss of national sovereignty or personal identity”, but there are also new tactics employed. One such is the so-called “Hahaganda”, which is a way of dismissing serious issues and legislative decisions as mere nonsense or absurdity.

An example of that was the Russian Foreign Ministry’s reaction to the decision of the European Parliament to declare Russia a state sponsor of terrorism after the continuous attacks on the Ukrainian civilian population. The spokesperson of the Russian Foreign Ministry, Mariya Zakharova, posted on Telegram that she offered to “declare the European Parliament sponsor of idiotism”. This may bring a chuckle or two, but it does nothing to address the serious issues on the ground stemming from the Kremlin’s war policy.

What you see is not what you may get

Self-awareness about the way disinformation targets our emotions is crucial to develop in our constant quest to become responsible and enlightened citizens. However, with the rise of AI, that alone will likely not be sufficient.

The problem is that AI has ushered in a parallel phenomenon – deep fakes, or the ability to create realistic and authentic-looking photos, videos and sounds of things and events that have never happened.

Dr Federica Russo, Professor of Philosophy at Utrecht University, put things into a chilling perspective citing her research into that phenomenon through the SOLARIS project. In her view, we as a human society now not only face the question of “what is true and what is false” but also that of “what is wrong and what is right”.

She pointed out that we’ve grown accustomed to thinking of pictures and videos as “epistemic backstops”, that is, items that indisputably prove that something has taken place, or, as the common saying goes, a picture is worth a thousand words. This, however, no longer holds true, since pictures and videos can not only be manipulated but can be made up entirely from scratch, a product of fiction rather than something ever captured by a camera.

Not just information overload but now the possibility of disinformation overload in the digital space will inevitably lead to pervasive confusion and distrust. More than ever, critical self-awareness and education about one’s role in the network of social actors that participate in the creation and consumption of deep fakes need to be embedded in our educational standards. Citizens must become self-questioning, cautious and proactive in choosing their sources of information.

Both Dr Russo and Laura Bante, an Instagram and TikTok influencer with hundreds of thousands of followers, were nevertheless united in pointing out that creating video and image content on the Internet doesn’t always have to be linked to disinformation and nefarious purposes. In fact, both deep fakes and artistically edited true videos have been used for education and entertainment.

That being said, people like Laura Bante, who enjoy a platform and large audiences, do have an ethical responsibility to apply the above principles not only when consuming content but also when creating it. Laura shared a story of how she unintentionally spread misinformation about a beauty hack without considering its potential harmful effects. Her sense of ethical responsibility, however, led her to produce a follow-up video explaining why it had been wrong to offer such ill-considered beauty advice. It is this act that showed her own self-awareness of the role she plays in the (dis)information chain on social media.

The webinar ended on a practical note, offering useful advice to the audience. Tilman Wagner, from Deutsche Welle, demonstrated a digital application in development, called Spot, which helps professional journalists, but also anyone else, identify where a certain picture or video was taken. The innovation is quite revolutionary because it lets users describe what they see in a picture or a video and have the tool search for matching locations around the world through the OpenStreetMap database. The tool has the potential to turn geolocating, which is becoming an indispensable skill for modern journalists, into a task as easy as doing a simple search on Google.

What the webinar brought into perspective was that, more than ever, we need authoritative, trusted and reliable media sources. This is good news for traditional media outlets, which in recent years were considered to be on their way out, but it also carries a message of caution: the audience needs to identify and know their trusted information sources more intimately than ever before.

That message was emphasized by journalist Fernando Costa, from the Portuguese Publico, an authoritative outlet in the Iberian country. Precisely that status has caused disinformation-spreading actors to latch onto it and exploit it: the propaganda machine of the far-right Chega party resorted to copying the design layout of Publico’s website in order to create fake headlines for news the outlet never published.

The effect was both to “kidnap” some legitimacy for the conspiracy theories that Chega is peddling in Portugal and to sow seeds of doubt in the minds of Publico’s regular readers. Mr Costa advised participants to treat images and headlines seemingly from their trusted media sources, but shared through social media accounts, with caution and curiosity, and preferably to follow the news directly from the media’s official website.
