
Christina Elmer is Professor of Digital Journalism and Data Journalism at TU Dortmund University, an experienced data journalist, and co-coordinator of GADMO (the German-Austrian Digital Media Observatory).
By Merle van Berkum
This text is part of the PROMPT series.
In 2025, Germany was facing a crucial federal election in the midst of a tense political climate following the dissolution of the coalition government. The election campaign was increasingly characterised by targeted disinformation campaigns, often spread via social media and sometimes by political actors themselves. What are the mechanisms behind this? How can journalists counter it? And what responsibility do platforms and regulatory authorities bear?
In this interview, Christina Elmer, Professor of Digital Journalism and Data Journalism at TU Dortmund University, experienced data journalist and co-coordinator of GADMO (the German-Austrian Digital Media Observatory), speaks with Merle van Berkum about the challenges of the digital information landscape and the effectiveness of fact-checking initiatives and regulatory efforts at the European level. She highlights areas where progress has been made, and where the media, platforms, and policymakers need to take stronger countermeasures.
Merle van Berkum: What is your general assessment of the information situation ahead of the German federal election? What relevant developments do you see here?
Christina Elmer: Our impression is that the spread of misinformation is currently permeating the entire political landscape. We would expect political actors in particular to communicate truthfully and based on facts, especially in prepared speeches or statements. But apparently that is not, or is no longer, the case, which I personally find very depressing. We repeatedly observe cases in which false or misleading information is disseminated, which ultimately undermines credibility.
Before and during the federal election – and this topic is far from over – we noticed that public discourse online was repeatedly influenced by fake news and disinformation campaigns, which we were then able to identify and contextualise. On the one hand, these were narratives that recur before democratic elections, for example claims questioning the integrity of the vote, such as allegations that postal ballots were destroyed or that ballot boxes were not properly sealed. Here we were able to draw on the experience of the European network EDMO and flag these issues preventively. On the other hand, there were also targeted false reports about individual politicians, in particular Robert Habeck and Friedrich Merz. And we saw campaigns that were evidently co-directed from Russia, such as “Storm-1516”.
So we are also seeing an attempt at exerting influence from abroad. And what I find particularly worrying is that we are also seeing misinformation in the political discourse itself, in parliament, in talk shows and statements – from where it then often spreads particularly quickly.
From a journalistic perspective, what levers do you see for counteracting this?
A first approach would be to strengthen trust in one’s own work and to make journalistic working methods and procedures transparent. It is important to explain what distinguishes quality reporting from other sources. Many people have difficulty assessing the credibility of information. Reporting should therefore not only be accurate but also convey media literacy. Readers should be made aware of how disinformation works and encouraged to reflect critically on their own media consumption. Especially in emotionally charged moments, when you feel the impulse to share a message, you should pause and question the source. That would be a very concrete recommendation for avoiding contributing to the spread of misinformation in such moments.
In my view, however, teaching these skills is not just the job of the media. It would be desirable for media literacy to be promoted more in schools. As we know from research, young people are increasingly getting their information from social media, especially from platforms like TikTok, which they use in a similar way to a search engine. So the challenge is how best to reach these target groups.
How did you perceive the role of the platforms before the federal election? Were there regulations or efforts to counter disinformation?
There were certainly efforts to protect the integrity of the election. Various measures were implemented in the context of EU regulations. For example, the EU Commission organised a so-called rapid response system, in which we were also involved. This system made it possible to report particularly problematic postings directly to the platforms’ contact persons. The reaction was usually quick; some posts were deleted after an internal review. In some cases, posts were found not to have violated guidelines, but their visibility was still restricted to prevent mass distribution. TikTok also reacted quite quickly in such cases. It was very good to have this direct line and to be able to have high-risk posts checked.
In addition, there were supportive measures at the European and German level in the context of, or related to, regulation. For example, the Federal Network Agency conducted a stress test with the platforms to analyse possible scenarios and how they could be prevented or countered. Nevertheless, there is still a lot of room for improvement. Research by Correctiv has shown that content reported by ordinary users remained online on TikTok without any reaction. This shows that the current regulatory mechanisms are not yet sufficiently effective. We are also currently in the transition from the Code of Practice on Disinformation to a Code of Conduct, that is, from a voluntary commitment to a code of conduct under the Digital Services Act. In this process, differences between the platforms are already apparent, for example in how they work with fact-checkers or support researchers. So there are definitely ups and downs. My impression is that this is an issue all players only take seriously when it touches on a systemic risk.
What role did fact-checking initiatives play in the election campaign? Were they able to effectively counter disinformation?
Of course, we can’t measure exactly to what extent fact-checking has made a difference. Unfortunately, as I said earlier, some political actors have also participated in the spread of misinformation. This is particularly problematic when such statements are left uncommented and not supplemented with facts. In some cases, platforms have even encouraged disinformation by giving certain content more reach. One example of this is the interview between Alice Weidel and Elon Musk on X, which provided a large stage for potentially misleading statements. Those responsible should have realised in advance that this format carries a high risk of spreading misinformation that cannot be fully captured and effectively corrected after the fact.
If we compare the current development with past federal elections: has the problem of disinformation intensified?
Yes, I have that impression. A look at other countries shows that elections are always subject to attempts at influence. In Romania, the presidential election even had to be repeated due to suspected manipulation involving disinformation. Research shows that such campaigns are often systematically orchestrated. It is particularly worrying that some platforms are currently scaling back their efforts to combat disinformation. In the US, for example, Meta wants to end its collaboration with fact-checkers and rely on Community Notes instead. This also gives the topic of quality control a new dynamic. In the medium term, this could also have an impact on Europe. Furthermore, we are seeing that accusations of censorship are increasingly being directed against EU regulatory proposals and related measures in which media are also involved. This could further undermine trust in quality journalism.
What countermeasures exist? Can governments or platforms intervene effectively?
The EU is funding numerous projects in this area. The Digital Services Act (DSA), for example, could lead to stronger monitoring capabilities, enabling better analysis of disinformation. Nevertheless, it remains to be seen whether the platforms will participate in measures in the long term and implement them effectively.
It is particularly worrying that in the US, companies like Meta are scaling back their protective measures. If this trend spreads to Europe, it could further undermine the credibility of digital content and also intensify the accusation of censorship.
It would also be important to have attentive and constructive moderation on the platforms – and, of course, an appropriate climate in digital discourse, although I have my doubts about the extent to which such a climate can be established on platforms given their mechanisms.
What role do media organisations play in this process?
The media can create or improve spaces for discourse by providing trustworthy information, correcting false statements and providing the necessary context so that statements can be properly classified. Of course, the media will continue to try to reach as many people as possible with reliable information. Young target groups in particular, who consume traditional news offerings less and less, need to be addressed through other channels. This also applies to media literacy offerings, which should also be more strongly integrated into schools and teacher training. The challenge is to encourage and enable people to reflect on information and make fact-based decisions.
Is there a prognosis for the future of the digital space in terms of disinformation?
It is difficult to predict how the digital world will develop. The dynamics are rapid and premises can change quickly. For example, no one could have foreseen how much platform strategies would change in recent months. The next federal election will show whether regulations are having an effect or whether platforms continue to play a major role in the spread of disinformation. But we will, of course, continue to monitor this issue within GADMO and other projects at the Institute for Journalism Studies.
To conclude: What can we take away from the discourse surrounding the federal election?
During the election campaign, we observed numerous false reports about individual candidates, the electoral process, or polarising topics such as migration or the energy transition. These narratives were also reflected in political discourse. And a lot has been done in the context of current regulation to counteract this. However, public interest in the topic seems to be waning again after the election. This is probably also because the potential impact is no longer as drastic. At the same time, digital discourse often lacks factual accuracy. Many people also share information they know is not true because it fits their group identity or generates attention. This presents journalists with the challenge of preparing fact-based reporting in such a way that it also reaches those who tend to operate in echo chambers.
Another important aspect is that fact-checking alone is often not enough. To truly banish a false narrative from people’s minds, you need an alternative narrative that is just as emotionally appealing. This is a major challenge for journalism, but also an important task for the future. We should recognise such psychological factors and take them into account. People may feel very insecure and uncomfortable in the political system. Of course, fact-checks can be published and can make a valuable contribution, but it may be that only a few of the participants in the discussion will take note of them. We should therefore focus even more on people’s perspectives and needs and consider how we can prevent them from being taken in by misinformation. And what can we offer them that is equally appealing?
Further information can be found here:
https://www.tu-dortmund.de/nachrichtendetail/gegen-desinformation-vor-der-bundestagswahl-48861/
This article was first published on https://de.ejo-online.eu.
Opinions expressed on this website are those of the authors alone and do not necessarily reflect or represent the views, policies or positions of the EJO or the organisations with which they are affiliated.