The threat of election interference is increasing. A new report shows that disinformation and influence operations are becoming more common in Europe.
"The most important thing for most people is to be aware of what they are reading and what they take at face value," says Paul M. H. Buvarp at the Norwegian Defence Research Establishment. (Photo: Norwegian Defence Research Establishment)
Attempts at election interference were reported in Europe and around the world in 2023 and 2024.
Researchers are now proposing measures to counter election interference in Norway.
Paul Magnus Hjertvik Buvarp and Eskil Grendahl Sivertsen are researchers at the Norwegian Defence Research Establishment.
As part of a larger research project, they have examined how various actors have used social media and the internet to influence elections in the past few years.
Russia, China, and Iran
In the report, the two researchers write that Russia, China, and Iran are the most active election influencers.
Several attempts at influence from these countries have been exposed, but their motives vary.
While Russia focuses on justifying its warfare and undermining trust in Western media, China seeks to strengthen its global reputation and weaken criticism of its regime.
Eskil Grendahl Sivertsen is a special advisor at the Norwegian Defence Research Establishment. (Photo: Norwegian Defence Research Establishment)
Iran exploits existing conflicts to create division in, among other places, Israel, the USA, and Canada, the researchers write.
What these countries have in common is their use of disinformation, fake accounts, and manipulated news sources to achieve their goals.
Election interference in France
The report shows that disinformation and influence operations were more widespread in Europe in 2024 than in 2023, with Russia being the most active influencer.
Russia was behind four major influence operations against France in 2024, a year that included both the presidential election and the Olympic Games.
The best-known attempt is the so-called Doppelgänger operation. According to the report, it was carried out by Russian authorities and state-controlled media.
They created accounts and fake news websites that spread stories about migration and criticism of France's president.
They also spread rumours about corruption in the International Olympic Committee.
Other Russian operators ran French-language media channels to spread Russian propaganda and created AI-generated news with Russian messaging.
The impact of these operations was limited, according to the Norwegian researchers, but artificial intelligence could make influence operations more sophisticated in the future.
Thomas Bach, president of the International Olympic Committee, and France's President Emmanuel Macron during the Olympics in Paris in 2024. (Photo: Stian Lysberg Solum / NTB)
"Harder to detect and counteract"
In the report, the researchers write that generative AI will likely be used more in the future. This refers to AI that can create images, code, and text based on large amounts of data; tools like ChatGPT and Microsoft's Copilot are examples.
'This technology allows actors to scale and adapt influence operations in ways that can be harder to detect and counteract,' they write.
The researchers believe that this type of AI is not yet fully mastered by those conducting influence operations, but that this will change.
They also note that social media is exploited to spread manipulative content:
'Influence campaigns are spread across multiple platforms, making countermeasures more complex.'
According to Buvarp and Sivertsen, AI makes influence operations more sophisticated. (Photo: Shutterstock / NTB)
Tactics, techniques, and methods have become more complex, according to Buvarp and Sivertsen.
'Networks of users, fake websites, and specialised forums create a self-reinforcing ecosystem that gives influence actors an advantage,' they write.
This means, for example, that the same false information can be posted in many different places, making it easier for those who see it to believe it is true.
This also makes it harder to detect, label, and remove disinformation, according to the researchers.
AI: A risk and a resource
'A key part of the effort against influence operations lies in developing and utilising technology to identify and counter disinformation,' write Buvarp and Sivertsen.
They argue that generative AI is both a risk and a resource.
'While AI can produce realistic disinformation, it can also help strengthen fact-checking and quickly categorise false claims,' they write.
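One building block of automated fact-checking the researchers allude to is claim matching: comparing an incoming post against a database of claims that human fact-checkers have already assessed. The sketch below is not from the report; it is a toy illustration using simple string similarity, and every claim and verdict in it is invented.

```python
# Toy claim-matching sketch: look up an incoming claim against a small
# database of already fact-checked claims using fuzzy string similarity.
# Real systems use semantic embeddings; difflib stands in here to keep
# the example self-contained. All claims and verdicts are invented.
from difflib import SequenceMatcher

# Hypothetical database mapping known claims to fact-check verdicts.
fact_checked = {
    "the olympic committee is riddled with corruption": "false",
    "migrants receive higher benefits than pensioners": "false",
}

def match_claim(claim, database, threshold=0.6):
    """Return (known_claim, verdict) for the closest match above
    the similarity threshold, or None if nothing is close enough."""
    claim = claim.lower()
    best = max(database, key=lambda known: SequenceMatcher(None, claim, known).ratio())
    if SequenceMatcher(None, claim, best).ratio() >= threshold:
        return best, database[best]
    return None

# A slightly reworded version of a known claim still matches.
result = match_claim("The Olympic committee is riddled with corruption!", fact_checked)
print(result)
```

Because the same false story tends to be reposted in many variants across platforms, even this crude matching catches rewordings of known claims; production systems replace the string comparison with semantic similarity models.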
At the same time, it is important to consider how we evaluate sources of information.
"The most important thing for most people is to be aware of what they are reading and what they take at face value," Buvarp tells sciencenorway.no, adding:
"Think twice, especially if something provokes strong emotions or seems extreme."
Buvarp, P.M.H. & Sivertsen, E.G. Valgpåvirkning i 2023 og 2024 - en trendstudie (Election influence in 2023 and 2024 - a trend study), Norwegian Defence Research Establishment Report, 2025.