
Putin and Trump cast as champions of traditional values / Truth Talks: Estonia

Author: Karolina Zbytniewska

In Estonia, Russia is often depicted as a stronghold of patriarchal order, standing in stark contrast to the “morally degenerate” West, allegedly doomed to collapse under the weight of liberal values, equality, and solidarity, Maria Murumaa-Mengel of the University of Tartu told EURACTIV.pl’s Truth Talks podcast.

IN BRIEF: Disinformation in Estonia

Dominant narratives: the “failed state” narrative, narratives based on Soviet nostalgia, anti-NATO, anti-EU and anti-Western narratives, esoteric self-healing trends, “us vs. them” framing

Main disinformation spreaders: Russia, conspiracy narratives migrating from the US, people seeking profit (e.g. organisers of webinars, sellers of “magic” items), domestic pro-Russian or nationalist parties tied to international right-wing networks or financed by Hungarian right-wing and religious organisations

Most common false stories:
Estonia portrayed as not a truly distinct state or culture,
today’s youth portrayed as “snowflakes”,
claims that in the past society didn’t have “confusing things” such as LGBTQ+ identities or disabilities,
the West portrayed as a symbol of moral decay eroding the patriarchy and traditional family structures, while Russia is cast as a bastion of patriarchal order and the “morally corrupt” West as doomed by its liberal values,
Ukrainians accused of disregarding local culture and disrespecting significant monuments,
Ukrainians portrayed as uncivilised or even barbaric,
masculine leaders (e.g. Donald Trump or Vladimir Putin) portrayed as defenders of traditional values worldwide.

Combating disinformation: ridiculing false narratives and trends (e.g. the People Who Think AI-Generated Photos Are Real group on Facebook), enhancing media and digital literacy (cooperation with teachers, parents, community leaders), international collaborations (EDMO, BEDSID project), pre-bunking, high-quality journalism, push for better regulations, providing research and expertise

Today, we delve into the Estonian disinformation landscape with Dr Maria Murumaa-Mengel, Associate Professor of Media Studies at the Institute of Social Studies, University of Tartu.

Karolina Zbytniewska, EURACTIV.pl: As a digitally advanced nation on the EU’s border with Russia, Estonia has been a frequent target of disinformation campaigns, particularly from foreign actors—but perhaps not only. What are the dominant disinformation narratives circulating in Estonia today?

Maria Murumaa-Mengel: There are many, and not all of them are disinformation. A lot of information disorder comes from misinformation—people simply not understanding the whole picture, being scared, being angry, and sharing things they haven’t fact-checked. Just human things.

What we know from working with researchers and practitioners from different fields, and from analysing disinformation narratives, is that existing divides and social problems create a very fertile ground for chaos, panic, and distrust toward institutions. It’s the same in Estonia. Whatever people are already debating heatedly is being used.

Most European countries have faced similar issues—migration, the legalisation of same-sex marriage, the pandemic, rising prices, sustainability, and climate change. Whatever people are arguing about is systematically and strategically exploited.

In Estonia, we see some narratives that aren’t entirely unique but are similar to those spread about Ukraine—particularly the “failed state” narrative. This attempts to depict Estonia as a country that isn’t really a country, despite its long history and distinct linguistic identity.

Similar claims were made about Ukraine, often accompanied by Soviet nostalgia. On social media, we see many thematic forums and groups that are almost retro-utopian, reminiscing about the “good old Soviet times”—remembering them as brighter than they actually were.

Older generations are especially vulnerable to these soft-power, long-game manipulation tactics. The world is always changing, and people tend to look back on their childhood and youth as the best time.

Narratives emerge about how “we didn’t have phones, we could run outside, be strong and independent”, while today’s youth are portrayed as “snowflakes”. This extends to more problematic ideas, like the belief that in the past, society didn’t have “confusing things” such as LGBTQ+ identities or disabilities—deeply flawed memories that weren’t true but are strongly amplified in these groups.

And, of course, there’s the familiar narrative of the “big bad” NATO, the West, and the EU—portrayed as forces of moral decay, eroding the patriarchy and traditional family structures.

Recently, we’ve been working on gendered disinformation and noticing how hyper-masculinity plays a central role in these narratives. They often frame Russia as a place where patriarchal order still exists, portraying it as a good thing, while the “morally corrupt” West is doomed because of its commitment to liberal values, equality, and solidarity.

In general, these narratives remain consistent because they are so effective. They rely on an “us vs. them” mentality, demonisation, and basic psychological mechanisms that shape how our brains process information. That’s why we see similar narratives not only in Estonia but also in Ukraine, the US, and elsewhere.

Do you think this is primarily disinformation and misinformation, as you mentioned? Or is it often just an opinion—an interpretation that originates in fringe circles and then gradually gains mainstream traction?

Of course, it’s a good thing that people debate and think critically about the world and its changes. We don’t want a two-party or one-party system where everyone has to think the same way—that wouldn’t be healthy for democracy.

But we do become concerned when we see the same repeated elements appearing in grassroots posts. For example, in 2022, when Ukrainian refugees arrived in Estonia, a specific story started circulating—not just in Estonia, but across the Baltics. It appeared in small towns and villages in Estonia, Lithuania, and Latvia, where Ukrainians had settled and integrated into local communities.

The narrative claimed that these “bad Ukrainians” didn’t care about local culture and were disrespecting important monuments—sometimes even urinating on them. The details varied—different places, different monuments—but the core story remained the same.

This was a deliberate disinformation campaign originating from Russia, designed to foster disgust and portray Ukrainians as uncivilised, even barbaric. Stories like these are powerful manipulation techniques, so we keep a close watch on such patterns.

You mentioned Russia, and that’s exactly the direction I wanted to explore further. You’ve pointed out that disinformation and misinformation narratives are used strategically and systematically. Who are the actors behind them? Who is actively creating and spreading disinformation in Estonia?

I think there are several types of actors. Many people spread misinformation simply because they’re human—we all make mistakes, get tricked, or share things without thinking.

But when it comes to deliberate manipulation of the public, that’s a different story. Since I research this field, my way of categorising these actors may differ from how someone else would divide them up.

The first category I’d highlight are those who exploit information disorder for profit—taking advantage of fear and the general anxiety caused by the polycrisis we’re experiencing.

Some actors in the disinformation landscape are simply looking to profit—selling webinars, life coaching, “miracle” solutions, or esoteric protective devices.

But what we see in Estonia is a chaotic mix of different narratives. Esoteric self-healing trends—crystals, pseudo-beliefs, alternative spiritual systems—are closely linked to conspiracy theories. These narratives often originate in Russia or migrate from the United States, spreading through global conspiracy networks.

Certain individuals exploit this situation, presenting themselves as having “the solution”—for a price. “Give me your money, and I’ll make everything better.” This sector worries me deeply because it represents the worst of human behaviour.

Of course, some genuinely believe in what they preach—protective energy bubbles, speaking to angels. That falls under misinformation, as they aren’t deliberately deceiving people. But those who cynically exploit these beliefs as a money-making opportunity? I have no respect for them. They are a significant, amorphous part of our information disorder.

Then, there are politically motivated actors. In Estonia, we’ve seen the emergence of small pro-Russian parties. Some openly talk about rejoining Russia, though that’s not their main messaging.

Many Russian-speaking Estonians are loyal to Estonia and don’t support reintegration with Russia, so the rhetoric is often softened: “Let’s be good neighbours,” “Let’s not provoke Russia,” or “We support peaceful relations.”

On the other side, we have nationalist populist parties that are not pro-Russian but use strong national symbols—flags, national emblems, patriotic imagery—to appeal to voters. These groups often have connections to international right-wing networks.

Investigative journalism has revealed that some Estonian parties receive funding from Hungarian right-wing and religious organisations. This isn’t just about making money—it’s about advancing political ambitions.

It sounds like a kind of international disinformation.

Exactly. And everything is international. Many conspiracy theories and so-called “heroic saviours” are not local figures. The same strong, masculine leaders—Trump, Putin, and others—are portrayed as defenders of traditional values worldwide.

And, of course, we can’t ignore Russia’s massive disinformation operations. It would be naïve to think otherwise. Russia strategically deploys information manipulation tactics in every neighbouring country—and beyond. They employ individuals whose sole job is to comment online all day, using multiple accounts.

Their messages don’t even have to be explicitly pro-Russian. Often, the goal is simply to erode trust, create chaos, and flood the information space with mixed signals. Unfortunately, I think we’ve underestimated the scale of these actors—akin to China’s “50 Cent Army” or other known disinformation networks.

This is a topic I’m particularly passionate about. In the past, we advised people to withdraw from toxic information spaces for the sake of their mental health—to go on “media diets” and avoid disturbing news. But we no longer say that.

Now, we emphasise that staying informed and speaking out is a civic duty. Even if you fear online trolls or being dragged into the mud, it’s crucial to engage. Otherwise, the loudest voices will belong to those who are paid to spread misinformation or are deeply entrenched in it. Their content gets seen, shared, and archived.

That’s why experts in various fields must be present in online discussions. My expertise is media, so I try to provide credible sources and promote scientific literacy. Doctors should counter medical misinformation. Military experts should address defence-related disinformation.

We need real experts to be visible. Because when we engage in these debates, we’re not really arguing with the trolls. I don’t even care whether they change their minds. I care about the invisible masses who never leave a mark of their presence, because most social media users are lurkers.

Many people turn to social media just to observe—checking the conversation, getting a sense of public sentiment, and forming their opinions. They don’t actively engage or leave a trace, but they do need correct, factual, evidence-based information.

If that information isn’t available, and all they see are trolls arguing and spreading nonsense, their understanding of critical issues—like vaccines or international energy ties—will be fundamentally flawed.

In Poland, we recently saw a wave of AI-generated images portraying idyllic rural scenes, like a farmer who supposedly built a special barn but never got recognition. These posts would receive thousands of comments saying, “That’s beautiful,” or “We’d love to support you.” Do you see similar trends in Estonia?

Oh God, so much. One way to counter it is through humour and ridicule. I know that’s a slippery slope, but social sanctioning works on different levels. Making fun of these deceptive narratives can be an effective way to push back.

We do that, but humour tends to stay within certain bubbles.

True. But there’s a great Facebook group called People Who Think AI-Generated Photos Are Real, with around 200,000 members. If any of our listeners are on Facebook, I highly recommend joining—it’s full of both absurd examples and more serious discussions on detecting AI-generated content.

Many of those comments—“That’s so beautiful,” “Great job,” “I’d love to buy one”—are likely from bots. There are so many fake accounts generating artificial engagement.

I also think older generations are more vulnerable to this. We work a lot with young people, and while not all of them, the media-savvy ones tend to be almost paranoid about digital content—constantly questioning, “Wait, is this real? What’s the source? Can I trust this?” Of course, that kind of hyper-vigilance comes with its own challenges.

There are persistent narratives designed to make people want to believe something. Feel-good viral stories are a great example. But when it comes to harmful misinformation, it’s not usually about a single incorrect fact. It’s more insidious—opinion pieces, subtle messaging, slow narrative-building.

A kind of disinfo vibe.

Exactly. It’s about creating a backdrop—shaping what people see as normal, what’s acceptable, how the world order should function, and who is entitled to what. Disinformation often spreads through hints, memes, and jokes—subtle, seemingly lighthearted content that gradually influences people.

What is being done to fight disinformation in Estonia, and what more would you recommend?

I’m a media professional by training, with a background in journalism and communication. Universities played a crucial role, especially during the pandemic. Our alumni and professional networks reached out, saying, “We need to do something.” That’s how we became a hub for both formal and informal initiatives.

At the University of Tartu, we’ve been active on multiple fronts. We see media and digital literacy as essential—arguably more important than some traditional school subjects. We work with teachers, youth workers, librarians, parents, and community leaders, spreading the message: “Here’s how you verify information. Here’s how you question people’s agendas.”

We’ve developed educational materials for all age groups, starting from kindergarten, because even young children are independent media users and can be vulnerable to misinformation or even radicalisation. Our approach is playful, ensuring that media literacy education reflects real digital habits. Instead of lecturing kids about TV news—something they rarely watch—we talk about influencers, the attention economy, and the mechanics of online influence.

We’re also involved in international collaborations, particularly through the European Digital Media Observatory (EDMO) network. One key project is BEDSID—the Baltic Engagement Center for Combating Information Disorders, which helps us compare misinformation trends across different countries. Recognising both similarities and differences is crucial—for example, some narratives spread universally, while others are more localised.

A major part of our strategy is pre-bunking—inoculating people against misinformation before they encounter it. These disinformation threats aren’t going away; they’re an inherent part of the information ecosystem.

We also support professional journalism however we can—providing expertise, fact-checking, and amplifying quality reporting. Without independent, democratic journalism, democracy itself is at risk. We’ve seen this happen. That’s why we work to ensure journalists have access to resources, training, and multilingual content.

At the University of Tartu, we’ve launched a one-year master’s program in English, focused on building societal resilience to information disorders. The first group has just completed the program, and it’s been eye-opening. Countries as different as Estonia and Armenia face identical disinformation tactics aimed at sowing distrust in society. Sharing knowledge and best practices across borders is essential.

Beyond education, we also push for better regulations—advocating for stronger legal frameworks to counter disinformation. But Estonia, with just 1.3 million people, is a tiny market. Companies like Meta don’t prioritise moderating disinformation in small languages and cultures. That’s a major challenge.

That’s why we need to stand alongside others and lobby for better regulations on major online platforms—to put an end to this broligarchy that’s taking hold.

We’re contributing with the best research and knowledge available, ensuring that our advocacy is backed by solid evidence. And as I mentioned earlier—whoever is willing to listen, we urge them to take action.

Don’t just give up and bury yourself before you’re actually dead. Do something. Even if it’s small—just a minute of your day—make an effort. Leave a mark. Improve things, even in the tiniest way. We can’t let this be the end of us.


Source: EURACTIV.pl
