
Conditions of subversive reach: Comparing societal and strategic factors for Russian propaganda outlets’ reach among Western European fringe communities

Published online by Cambridge University Press:  12 December 2025

Christiern Santos Okholm*
Affiliation:
Social and Political Science, European University Institute, Fiesole, Italy

Abstract

What societal factors influence the reach of Russian propaganda outlets among fringe audiences? Recent debates within international relations and political communication have questioned the ability of Russia’s information warfare practices to persuade general public opinion in the West. Yet Russian propaganda outlets have historically focused on reaching Western fringe communities, while a growing literature on societal resilience argues that variance in specific societal factors influences the effect of information warfare. Here I study the degree to which various societal factors condition Russia’s ability to reach fringe audiences. I measure the reach of Russian propaganda outlets among online fringe communities in ten Western European countries in the three months before Russia’s full-scale invasion of Ukraine. I compare national measures of public service media, media trust, affective polarisation, and populism and find descriptive indications that the latter two are tied to the performance of Russian propaganda outlets in fringe communities. In addition, I find reach to be concentrated among regional great powers, highlighting the need to consider strategic risks when discussing societal resilience.

Information

Type
Research Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2025. Published by Cambridge University Press on behalf of The British International Studies Association.

Can information warfare be effective, and what conditions may influence such effectiveness? Much attention has been given to the subversive promise of information warfare, yet the debate has focused on outcomes instead of the conditions for outcomes. Alarmed by the discovery of Russia’s election interference, scholars initially rushed to warn of information warfare’s ability to subvert liberal democracy. Through the manipulation of electorates using disinformation, that is, intentionally misleading information, Russia was painted as an existential threat to liberal democracy.Footnote 1 But as more studies found Russian disinformation unable to persuade the average voter, scholars from political psychology,Footnote 2 political communication,Footnote 3 and international relationsFootnote 4 have argued these fears are overblown. Here I argue that this criticism may underestimate the non-persuasive goals of information warfare and be too dependent on single case studies in the United States and the Baltics to appreciate the role of societal factors enabling or limiting Russian propaganda outlets among their key audience.

Studies of information warfare by IR scholarsFootnote 5 indicate that, instead of seeking to persuade general audiences, it can opt for the less ambitious goal of exploiting existing opinions in ways that polarise and distort public debate. Previous work describes how Russian propaganda outlets adopt and boost the existing narratives of disenfranchised, extremist, and conspiratorial audiences, that is, fringe communities.Footnote 6 Through this targeting of fringe groups, it can be assumed that Russia has a specific goal of boosting societal groups’ democratic discontentFootnote 7 to direct their frustration against their own governments. Fully evaluating Russia’s information warfare capability therefore requires understanding the conditions under which it can project its narratives and reach these very specific audiences.

Meanwhile, a growing literature argues that information warfare’s performance is dependent on national contexts. Scholars have arguedFootnote 8 that specific societal factors help build societal resilience, understood as the societal ability to operate despite exposure to disinformation.Footnote 9 According to this societal resilience literature, countries with strong public service media with high trust levels, low levels of affective polarisation, and weak populist movements are likely to be unaffected by information warfare, as citizens are less drawn to propaganda outlets and far less swayed by them. Yet the literature has two shortcomings rooted in the difficulty of defining and measuring the impact of information warfare, which have left it relying mainly on conceptual works based on single and often extreme cases. Hence, little research has comparatively investigated how these societal factors empirically condition Russian propaganda outlets’ ability to project their content to their primary audience.Footnote 10 Nevertheless, the concept has become a key policy tool for countering foreign subversion and interference, for example NATO’s ‘strengthened resilience commitment’ and the EU’s ‘comprehensive resilience ecosystem’ model.Footnote 11

To challenge the premises of current criticism of Russian information warfare’s effect, I use Lanoszka’sFootnote 12 three arguments on limited effects due to source bias, entrenched beliefs, and domestic counter-measures as a guiding structure to engage with this debate. I counter-argue that information warfare should be evaluated based on its ability to reach primary audiences, that is, fringe communities. To explore how propaganda outlets’ ability may be conditioned by societal contexts and to empirically test the claims of the societal resilience framework, I then turn to the prominent components of societal resilience.Footnote 13 From this I derive three hypotheses on societal factors influencing the reach of Russian propaganda outlets among fringe communities: H1) the strength of public service media (PSM) in the media sector and levels of trust in media, H2) levels of affective polarisation, and H3) the prevalence of populism. As Russia is likely to use information warfare to support geopolitical goals, I further argue that Russia is likely to prioritise gaining reach among countries it considers regional rivals, introducing H4, which expects reach to be concentrated in regional great powers.

I use previous academic studies and reports to create a database of national indicators related to each hypothesis. I then compare this with a secondary dataset of the monthly frequency with which the Russian propaganda outlets RT and Sputnik appeared in online fringe communities in Austria, Belgium, Denmark, Finland, France, Germany, Italy, the Netherlands, Sweden, and Spain during the three months preceding Russia’s full-scale invasion of Ukraine.

Theory: Manipulating who and when?

The debate on information warfare’s effectiveness in manipulating foreign audiences often defines effective manipulation as the ability to persuade audiences, be it to adopt new identities,Footnote 14 voting preferences,Footnote 15 or state policies,Footnote 16 and assumes this persuasion is unaffected by the societal contexts in which it operates. But rather than changing behaviour through persuasion, information warfare also seeks to manipulate by boosting pre-existing beliefs that fit its objectives, an effect that escapes a binary understanding of audiences as either persuaded or not. The ability to exploit existing beliefs largely depends on one’s propaganda outlets’ ability to reach relevant communities in targeted societies. As societal traits may influence such communities’ information consumption patterns, for example, more polarised societies may encourage extremist communities to consume more polarising content, it is important to understand information warfare as conditioned by societal factors that are likely to affect propaganda outlets’ reach. Understanding these conditions is necessary to advance the current debate and develop a more dynamic understanding of information warfare operations.

To critically engage with this debate, I use Lanoszka’sFootnote 17 three arguments as they offer a useful roadmap to this recent trend of recalibrating our understanding of information warfare’s effect. As will be shown below, his mix of both realist theory from international relations (IR) and mechanisms from political communication and psychology provides the chance for a systematic engagement with key arguments. I argue his criticism may be premature as it overlooks the strategic intent of mobilising fringe communities and the role of societal resilience in mediating attempts to reach such audiences. Before the effects of information warfare can be concluded, we must better understand the conditions under which it operates, and this requires comparative studies of Russian propaganda outlets’ performance among primary audiences.

Building on the realist theory of the international anarchic systemFootnote 18 and how to navigate it,Footnote 19 Lanoszka’sFootnote 20 first argument relates to the natural scepticism audiences have toward adversaries when their country is in a state of international insecurity. As surviving international politics requires navigating foreign threats and adversaries, the fear of being manipulated enforces source biases among both elites and electorates. As this reduces the likelihood of any information, including disinformation, provided by Russia being believed or acted upon, Lanoszka argues information warfare is unlikely to affect decision-makers or foreign audiences. This finds some resonance in Feklyunina,Footnote 21 Hudson,Footnote 22 and Szostek’sFootnote 23 findings of Russia being unable to appeal to Ukrainian audiences due to historic legacies of imperialism and recent experiments by Nassetta and GrossFootnote 24 showing that American users’ acceptance of online claims is significantly reduced when attributed to Russia.

However, Russian information warfare practices take pains to avoid such biases, masking themselves via a network of ‘grey’ and ‘black propaganda’, that is, communication channels that obfuscate the true source from the audience.Footnote 25 Ranging from setting up fake NGOs and think tanksFootnote 26 and creating imposter websites of well-known Western mediaFootnote 27 to indirectly controlling ‘independent’ media organisations,Footnote 28 such as the state media RT and Sputnik,Footnote 29 Russia goes to great lengths to avoid activating source biases against it. According to Yablokov and Chatterjee-Doody,Footnote 30 this avoidance of biases includes the 2009 rebranding of the propaganda outlet ‘Russia Today’ as the more neutral ‘RT’. Moreover, Lanoszka’s assumption of a source bias against Russian sources may be too optimistic; for example, experimental studies by FisherFootnote 31 find that making audiences aware of a claim’s origin does not reduce belief in antagonistic disinformation claims. Furthermore, Wagnsson et al.’sFootnote 32 ethnographic study found that part of Sputnik’s appeal among Swedish audiences is precisely its identity as a Russian propaganda outlet that presents an alternative view of the world.

Lanoszka’s second argument builds on key insights from political communication and psychology that persuading individuals to accept opposing opinions is extremely difficult.Footnote 33 As people tend to reject any information that clashes with pre-existing perceptions, Russian information warfare is also unlikely to persuade foreign audiences to adopt new beliefs. This has been widely supported by studies of Russian botnet interactions with American online users that found these did not change users’ voting preferences.Footnote 34

However, this may wrongly equate information warfare’s objectives and effects with persuasion and ignore the potential of exploiting existing beliefs among fringe communities. While Erlich and GarnerFootnote 35 and RoozenbeekFootnote 36 found that despite years of being targeted by Russian information warfare efforts, Ukrainians would generally reject pro-Russian disinformation, they also found that Russia would concentrate its efforts on anti-Ukrainian claims and gained resonance when focusing on already accepted issues of corruption and economic stagnation. Similarly, the above-mentioned studies of Russian botnets found the exposure to be concentrated around a minority of American users. Other studies have found this minority to live in key swing states,Footnote 37 be on the fringes of the political spectrum, for example the far-right,Footnote 38 or be economically disenfranchised.Footnote 39 Hence, targeting fringes may be a strategic feature of Russian information warfare and not merely a bug produced by fringe demand. In their analysis of the same botnets as previously discussed, Dawson and Innes,Footnote 40 DiResta et al.,Footnote 41 and Kriel and PavliucFootnote 42 find these would continuously adapt claims to fringe audiences’ beliefs to radicalise their online discourse. A reason for this seems to be these communities’ feelings of disenfranchisement, distrust in authorities, and strong ideological convictions, which make them more likely to accept new conspiratorial and misleading claims.Footnote 43 These beliefs, experiences, and feelings create the basis for what Rhodes-Purdy et al.Footnote 44 describe as democratic discontent, referring to how a growing number of people are disappointed with liberal democracy to the degree that they reject its basic principles. By tapping into this discontent, for example by absorbing conspiracy theories,Footnote 45 Euroscepticism,Footnote 46 far-right issues,Footnote 47 and far-leftFootnote 48 frames into its propaganda, Russia can present itself as the counterweight to the failure of the Western order.Footnote 49 By exploiting such existing discontent, information warfare can subvert the cohesion of adversarial states.Footnote 50 By creating enough domestic problems for adversaries, Russia may gain more geopolitical wiggle room to achieve its foreign policy objectives.

Lanoszka’s last counter-argument to the effectiveness of Russian information warfare refers to the superiority of counter-measures that activate biases against foreign adversaries, such as authorities warning against campaigns and stopping channels distributing disinformation. However, such counter-measures may not be equally activated everywhere and are confined by societal contexts. As Lanoszka bases this on the post-2014 Baltic states, he does not include the political effects of previous Russian interference in the region,Footnote 51 such as the Russian orchestration of riots in Tallinn in 2007,Footnote 52 which led to these states’ particular attention to policies inoculating their societies against Russian information warfare.Footnote 53 Nevertheless, this speaks to the core focus of the societal resilience literature: the identification of societal factors that provide resilience against disinformation and misinformation, that is, simply misleading information.Footnote 54 A core claim of this literature is that vulnerability to disinformation and foreign information war is unevenly distributed among societies, due to national variance in specific societal factors, an argument I extend to Russian information warfare’s ability to project to its primary audiences.

As the previous point underscores, these audiences are fringe communities. I further argue that the performance of Russia’s propaganda outlets among fringe communities is conditioned by societal factors in the targeted country. While the previously discussed characteristics of fringe communities, for example strong feelings of exclusion and anti-elitism, may suggest that these audiences would be indifferent to surrounding society, I argue they remain part of broader society and thus receptive to socialisation from it.Footnote 55 While fringe communities may be in opposition to the establishment, they are still shaped by the society they grew up in. At the same time, investigating these factors’ ability to act as barriers among Russia’s primary audience can help improve our understanding of societal resilience as linked to fringe communities and highlight which factors do indeed act as barriers or enablers among the primary audience of information warfare.

However, it is important also to critically reflect on the state of the societal resilience literature. While the concept has had a substantial political impact and guided state policies to counter the spread of mis- and disinformation,Footnote 56 most of the literature is grounded in conceptual works and primarily draws its empirical findings from single case studies. For example, Bjola and PapadakisFootnote 57 and Kõuts-Klemm et al.Footnote 58 base their conceptualisations of media resilience on the Finnish and Baltic cases, without employing their models beyond these contexts. De ConningFootnote 59 and Sørensen and Bach NyemannFootnote 60 have relied on meta-studies of individual-level mechanisms to infer societal-level factors. The same can be found in MarchalFootnote 61 and Serrano-Puche’sFootnote 62 work on individual-level resilience, which is expanded into societal-level recommendations. To the knowledge of this author, only Humprecht et al.Footnote 63 have attempted a systematic comparative investigation. But like most first movers, their attempt suffers from limited data access and methodological issues. This mainly concerns their reliance on Newman et al.’sFootnote 64 national-level data on self-reported exposure to misleading information, which risks mistaking the issue’s salience in public consciousness for national-level exposure to disinformation. As both Newman et al.Footnote 65 note and a number of other studies show,Footnote 66 most respondents tend to conflate disinformation with information clashing with pre-existing beliefs, leaving the measurement unable to capture exposure to actual disinformation. As such, the societal resilience literature lacks thorough empirical and comparative analysis. This paper tries to partly fill this gap.

But what are these societal factors? According to societal resilience scholars, three main factors are relevant to countering dis- and misinformation: 1) the state of the media sector, 2) levels of affective polarisation, and 3) populist discourse.

State of the media sector

One of the most popular policy recommendations for stopping disinformation is to strengthen truthful and trusted information by making sure that independent and national media can compete with free and accessible propaganda outlets.Footnote 67 Based on McQuail’sFootnote 68 argument of media being the ‘carrier of news and former of opinions’, scholars like Benkler et al.Footnote 69 and Wardle and DerakhshanFootnote 70 have argued that media sectors dominated by financially strong and trusted news organisations are better equipped to counter disinformation. While some argue that propaganda outlets can be part of a varied diet, acting as a supplement to traditional media,Footnote 71 there is still competition for the limited attention of news consumers,Footnote 72 and studies on fact-checking have underscored that bringing news to audiences first is important for reducing belief in misperceptions.Footnote 73 Similarly, Altay et al.’sFootnote 74 survey found that increased news consumption lowered acceptance of misinformation, as it kept audiences informed. By ensuring that news media are well positioned to offer accessible and reliable information coverage, this factor is seen as key to acting against disinformation. Kõuts-Klemm et al.’s study of Baltic media resilience points to news media as key to societal resilience, which must include ‘media systems’ ability to survive the efflux of resources and loss of audience attention and trust, and as the capacity to support a reliable, transparent, and diverse information sphere’.Footnote 75 Echoing earlier work by Hallin and Mancini,Footnote 76 Horowitz et al.Footnote 77 and Humprecht et al.Footnote 78 proceed to highlight extensive state funding of PSM as a key to combating disinformation, as this would provide the necessary economic comfort for national news media to prioritise independent quality journalism and fast universal coverage. Smoleňová et al.’sFootnote 79 report describes how the struggling Eastern European media sector was unable to maintain international correspondents, which allowed the Russian propaganda outlet Sputnik to have an outsized reach by providing free content on international affairs. Hence, it can be assumed that media markets with large PSM shares can outproduce and outreach disinformation. But while making information more accessible may make quality journalism competitive with Russian propaganda outlets among average readers, it is uncertain whether the same mechanism can be found among fringe audiences and affect Russian information warfare’s performance. While fringe communities are likely still influenced by the society around them and are likely to be exposed to free PSM content, they feel isolated and disenfranchised by that same society and media and gravitate towards other fringe group members for feelings of community.

To test this, I propose H1, stating ‘Russian propaganda outlets have lower reach among fringe audiences, in societies with larger PSM market shares’.

Meanwhile, Hameleers et al.,Footnote 80 Kreft et al.,Footnote 81 Watts,Footnote 82 and Valenzuela et al.Footnote 83 point to the prerequisite of high societal trust in media, finding it correlates negatively with individual consumption of and belief in disinformation. As fringe communities may be more emboldened to share and consume dubious sources if general society distrusts media, it is likely that low societal trust in media will also allow for a wider reach of Russian propaganda outlets among fringe communities.

This I test in H1.1: ‘Russian propaganda outlets have lower reach among fringe audiences in societies with high trust in media’.

Affective polarisation

As previously mentioned, Russia’s exploitation and mobilisation of democratic discontent rests on the notion that individuals’ feelings of anger make them more receptive to disinformation about opponents.Footnote 84 This has led to policy recommendations that often emphasise the need to de-polarise and foster inclusion.Footnote 85 As discontent with existing democratic government includes anger towards anyone perceived as contributing to its failure or corrupting democratic governance,Footnote 86 societies with high levels of polarisation are likely to provide more emotional material to be exploited. Gallache and Heerdink’sFootnote 87 and Kriel and Pavliuc’sFootnote 88 studies of Russian botnets during the 2016 US presidential election find that activities focused on adapting to and intensifying the polarised and toxic discourse of online communities. Similarly, Freelon et al.Footnote 89 and Bradshaw, DiResta, and MillerFootnote 90 describe how Russian propaganda outlets would target and attract both sides of polarised debates. In a recent study JenkeFootnote 91 found that affective polarisation drives misinformation beliefs as it increases negativity bias against out-groups, while Osmundsen et al.Footnote 92 found partisan polarisation was the main driver of sharing misinformation for the same reason. Due to this link to political hostility and disinformation, Marchal,Footnote 93 Serrano-Puche,Footnote 94 and Humprecht et al.Footnote 95 have all argued that societies with high levels of affective polarisation are more vulnerable to disinformation. At the same time, as political hostility increases, individuals and parties from one or both sides of polarising issues may be incentivised to search for allies against the other and come to see former foreign adversarial states as counter-weights to the existing establishment or order. Following the logic of ‘the enemy of my enemy is my friend’, high polarisation risks trumping any source biases, and Russian propaganda outlets may thus find it easier to gain a larger reach in societies with higher levels of political polarisation.

This I test in H2: ‘Russian propaganda outlets have lower reach among fringe audiences in societies with lower affective polarisation’.

Populist parties

Populism and Russian information warfare efforts have a long history of collaboration, making populism a recurring factor in the societal resilience literature.Footnote 96 Among the clearest examples is Shekhovtsov’sFootnote 97 tracing of the extensive and historic ties between the Kremlin and European far-right populists. However, Russia still holds some inherited anti-American attraction among far-left populists, as pointed out by FoxhallFootnote 98 and, most recently, in Holesch et al.’sFootnote 99 study of radical left party alignment in the European Parliament after Russia’s full-scale invasion of Ukraine. As what ‘distinguishes the current Russian government from the erstwhile Soviet leaders familiar to the West is its rejection of ideological constraints’,Footnote 100 the Kremlin is ideologically opportunistic in its exploitation of ideological causes. A key argument by Yablokov and Chatterjee-DoodyFootnote 101 is that the anti-establishment rationale embedded within populism allows Russia to present itself as an ally of the people in its fight against an evil elite, be it based on ethnic or economic grounds. But when these movements of democratic discontent become more structured in the form of populist parties, they are likely to provide two opportunities for Russian propaganda outlets to further tap into fringe communities’ frustration: highlighting exploitable issues and providing natural established allies. As populist parties act as political entrepreneurs exploiting frustration with the democratic form of government, to challenge the existing order and often the perceived elites within that order, they create a political platform that structures grievances.Footnote 102 By campaigning on polarising issues and often contributing to the spread of disinformation,Footnote 103 populist political platforms and candidates make specific exploitable topics relevant to national and fringe discourse more visible to Russian propaganda outlets, which in turn can adapt to these. Meanwhile, strong populist parties may also reinforce the logic of ‘the enemy of my enemy is my friend’, as both populists and Russia try to position themselves as the moral opposition to an evil elite, be it states or political establishments.Footnote 104 Indeed this has frequently been the case, with Russia funding Eurosceptic populist parties and giving them favourable exposure on propaganda outlets;Footnote 105 in turn, populist parties may further echo and normalise Russian propaganda, as found by Beseler and Toepfl,Footnote 106 thereby providing a larger reach among key constituents, that is, fringe communities. By highlighting topics and offering alliances of convenience, a strong presence of populist parties is likely to increase Russian propaganda outlets’ reach.

This I test in H3: ‘Russian propaganda outlets have lower reach among fringe audiences in societies with a smaller representation of populist parties’.

Strategic priorities

The three factors above all refer to societal conditions that elevate countries’ risk profiles against disinformation regardless of origin. But since Russian information warfare is strategic by intent, countries’ and their fringe communities’ risk profiles are equally affected by Russian interest in targeting them. After all, Russia’s information warfare is a political tool for managing an international scene in ways that benefit it strategically, in which states are the main objective and fringe communities a means to an end. Hence, certain states are strategically irrelevant while others are not, which in turn influences the strategic effort to court their fringe communities. Such courtship of course includes prioritising in which languages to make one’s propaganda outlets available, but also prioritising what content resonates best with specific audiences. Even if fringe communities have global connections, reports on faraway foreign local politics are less relevant than scandals at home. This lack of appreciation of the strategic priorities of specific states, such as Russia, is partly due to the societal resilience literature’s main objective of limiting the consumption and effects of the wider societal problem of misinformation, and a tendency not to distinguish it from disinformation. This also includes the focus on general audiences instead of the more vulnerable fringe audience, leaving some of the claims of resilience against Russian information warfare in doubt.Footnote 107 Among IR scholars specialising in Russian foreign policy, it is often stressed that Russia views itself as a great power in competition or conflict with the West.Footnote 108 Giles refers to this perception as being obsessiveFootnote 109 and derived from the country’s realist belief that only great powers can be safe from foreign subjugation and must defend themselves from foreign threats; this is further echoed by SherrFootnote 110 and the Russian academic TreninFootnote 111 in the debate on Russian exceptionalism. According to GaleottiFootnote 112 and Skak,Footnote 113 this position means Russia has special rights, such as spheres of influence and being above international law, but must be in constant conflict with its foreign adversaries to achieve this. Though this mainly means competing with the United States, it also includes balancing and competing with lesser regional great powers such as the UK, Germany, France, and Italy, which can resist Russian coercion through either stronger economies or larger militaries. But while these may be more powerful, especially when unified through NATO, undermining the threat they pose by subverting their resolve makes information warfare more useful.Footnote 114 This means regional great-power countries are prioritised by Russian information warfare, which hones and adapts its narratives to the democratic discontent in such societies, and it is likely this will translate into a higher reach among them.

This provides us with H4: ‘Russian propaganda outlets have lower reach among fringe audiences in countries that are not regional great powers’.

Data and methods

To explain the methods used, this section first focuses on how the population (i.e., fringe communities) was sampled. It then explains how the dependent variable (i.e., Russian propaganda reach) was operationalised, before turning to the operationalisation of the independent variables (i.e., the resilience factors) and the choice of analytical approach.

I use a previously collected dataset from Santos Okholm et al.,Footnote 115 which maps the online meeting points of fringe communities in ten Western European countries and provides a glimpse into their consumption of online content, including Russian propaganda outlets (see more below). The dataset consists of the posting history of 493 public Facebook groups in Austria (96), Germany (139), France (77), Spain (78), Italy (37), the Netherlands (14), Denmark (2), Sweden (36), Belgium (1), and Finland (13) (see Figure 1).

Figure 1. Distribution of fringe communities across countries.

These groups were identified based on an extensive literature review and a subsequent automated snowball sampling method. The literature review consisted of studying reports on online media by academics, think tanks, investigative journalists, and state authorities and produced a country-by-country list of 202 fringe media outlets. I use the term fringe media, with mainly methodological intentions, as a label for media that continuously share content that can be labelled as misinformation, conspiracy theories, and extremist content,Footnote 116 as these feed fringe communities with content confirming their feelings of democratic discontent. Whenever reports referred to a site as alternative, misinforming, conspiratorial, or extremist, the site was visited to verify the categorisation and confirm it remained active, and was then listed. Once country lists were collected, they were sent for additional verification to national members of the International Fact-Checking Network. This procedure had two methodological consequences. As country media were mainly defined by linguistic barriers, British and American audiences could not be untangled, and so both cases were left out of the dataset. This has some consequences for H4’s expectations for regional great powers such as the UK. However, the dataset does include three regional powers: France, Germany, and Italy. The pro-Western bias further reduces variation in some societal factors, which could have been alleviated by the inclusion of Eastern Europe; however, due to a lack of data access to Eastern European fringe media, this was abandoned. But by focusing on what are often considered the most resilient countries, the dataset can provide a hard case for Russian information warfare. Furthermore, as the reviewed literature had a far-right, anti-vaccine, and conspiracy focus, the dataset holds a far-right bias and is limited in capturing far-left populism, with methodological consequences for H3. Yet as fringe communities have gravitated towards the far-right, where democratic discontent has been particularly strong,Footnote 117 the dataset still captures a large portion of what I argue is the primary audience of Russian information warfare. These fringe media outlets provided the list of URLs from which the automated snowball sampling could begin, using Meta’s now-terminated CrowdTangle platform. As fringe communities feel excluded and often persecuted by society, the dataset relies on a standard anthropological method for identifying hidden and stigmatised groups,Footnote 118 in which one group member is identified (fringe media) and, through its contacts (Facebook groups), a population is identified (fringe community). By identifying public Facebook groups in which URLs from the list of fringe media were posted at a rate of more than once per week throughout 2021, the dataset identified fringe groups in each country. To avoid exposing individuals within these groups, only groups with more than fifty members were included, and the majority of groups had memberships in the hundreds of thousands. This resulted in a dataset of fringe groups’ history with more than 5 million aggregated posts. These privacy concerns were central to the ethics approval given by the EUI ethics committee. Though fringe communities have recently moved to alternative platforms,Footnote 119 Facebook remains a favoured platform for Russian influence campaignsFootnote 120 and fringe communitiesFootnote 121 and is a key online platform for European audiences.Footnote 122
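To make the inclusion rule concrete, below is a minimal sketch in Python of the two filters described above, the weekly posting rate and the fifty-member privacy threshold. The domain names and data structures are hypothetical stand-ins, not the study’s actual pipeline (which ran on CrowdTangle exports).

```python
from urllib.parse import urlparse

# Hypothetical stand-ins for the verified fringe media list.
FRINGE_MEDIA_DOMAINS = {"example-fringe-news.com", "another-fringe-site.org"}

def is_fringe_group(member_count: int, posted_urls: list[str],
                    weeks: int = 52) -> bool:
    """Inclusion rule from the text: a public group counts as a fringe
    community if fringe-media URLs were posted more than once per week on
    average throughout 2021 and the group has more than fifty members."""
    if member_count <= 50:
        return False  # privacy threshold: small groups are excluded
    fringe_posts = sum(
        1 for url in posted_urls
        if urlparse(url).netloc.removeprefix("www.") in FRINGE_MEDIA_DOMAINS
    )
    return fringe_posts / weeks > 1.0
```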

To identify the reach of Russian propaganda outlets, I measure the frequency with which URLs from the two main Russian propaganda outlets in Europe, RT and Sputnik Europe,Footnote 123 appear in these fringe groups’ posting history. In total, 102 URLs were collected, covering each relevant European language version of RT and Sputnik, the English-language versions, their social media profiles, and ‘mirror sites’ (e.g., RT’s official English-language site is accessible via both ‘www.rt.com’ and ‘www.swentr.site’). Furthermore, by focusing on the three months before Russia’s full-scale invasion of Ukraine, I highlight a specific period where Russian reliance on information warfare was arguably at its highest. During this period the Kremlin sought to deter European opponents, coerce Ukraine to accept its pre-war demands, and justify its escalating policies, relying on information warfare practices to do so.Footnote 124
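As an illustration of how this measure can be computed, the sketch below assumes the posting history is available as a table with one row per posted URL; the domain list is a small illustrative subset of the 102 collected addresses, and all column names are assumptions.

```python
import pandas as pd

# Illustrative subset of the collected RT/Sputnik addresses, including a
# known mirror site; the full list of 102 URLs is not reproduced here.
RUSSIAN_DOMAINS = {"rt.com", "de.rt.com", "fr.sputniknews.com", "swentr.site"}

def monthly_reach(posts: pd.DataFrame) -> pd.DataFrame:
    """Count Russian propaganda URLs per group and month in the three months
    preceding the full-scale invasion. `posts` is assumed to have columns:
    group_id, country, timestamp (datetime), domain."""
    window = posts[(posts["timestamp"] >= "2021-11-24")
                   & (posts["timestamp"] < "2022-02-24")]
    hits = window[window["domain"].isin(RUSSIAN_DOMAINS)]
    return (hits
            .groupby(["country", "group_id",
                      pd.Grouper(key="timestamp", freq="ME")])  # pandas >= 2.2
            .size()
            .rename("russian_urls")
            .reset_index())
```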

While this allows for a unique glimpse into Russian propaganda outlets’ ability to reach fringe communities during a crisis, it is admittedly limited by Russian propaganda taking other forms that are difficult to measure quantitatively. Russian propagandists’ use of imagery and memes is difficult to categorise at scale, while their pervasive use of botnets to manipulate social media algorithms to boost the reach of content, for example by artificially posting or interacting with such posts, erodes the reliability of interaction measures.Footnote 125 However, the collected dataset has a reliability advantage when measuring genuine reach and not attempted or manipulated reach. As messages require the acceptance of group administrators to be posted, the dataset shows the ability of RT and Sputnik’s content to pass such gatekeepers and reach online safe spaces used by fringe communities.

To test the four hypotheses, I compile a new country-level dataset of measurable indicators associated with each hypothesis for the ten countries, using data from research reports and journal articles (see Table 1).

Table 1. Dataset for societal factors, measurable indicators, and sources.

* Due to issues with data availability, this does not include data for Belgium and the Netherlands.

** Due to issues with data availability, this does not include data for Denmark.

As H1 refers to the effects of PSM, I use Florian Saurwein et al.’sFootnote 132 national-level data on the market share of national news audiences regularly consuming PSM and complement it with Neff and Pickard’sFootnote 133 estimates of PSM audience share for Denmark, using the same operationalisation. Choosing market share has the benefit of avoiding scaling problems between different cost contexts, for example when looking only at funding schemes, and of being attentive to PSM’s actual reach among national audiences. Yet due to problems with data availability, this does not include Belgium or the Netherlands.

To test H1.1, expectations of trust in media, I use the Reuters Digital News Report’sFootnote 134 cross-national survey of self-reported trust in mainstream media, as the report series has become a standard within media and misinformation research.Footnote 135

To test H2’s expectations for affective polarisation, I rely on Bettarelli et al.’sFootnote 136 national estimates of polarisation levels for all our cases except Denmark. They use Wagner’s Spread-of-Scores, providing a national average on a scale of dislike between 0 and 10, with the cases relevant to this study ranging from 1.8 (the Netherlands) to 2.66 (Sweden).
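For reference, the spread-of-scores measure can be written as follows; this is a sketch of Wagner’s standard formulation as I read it, where $\text{like}_{ip}$ is respondent $i$’s 0–10 rating of party $p$, $v_p$ is party $p$’s vote share, and national estimates are obtained by averaging over respondents:

$$\text{Spread}_i = \sqrt{\sum_{p} v_p \left(\text{like}_{ip} - \overline{\text{like}}_i\right)^2}, \qquad \overline{\text{like}}_i = \sum_{p} v_p \, \text{like}_{ip}.$$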

To measure the effect of the societal factor of populist parties, I rely on the PopuList dataset by Rooduijn et al.Footnote 137 and its operationalisation of populist parties as the national percentage of parliamentary seats held by left- and right-wing populist parties. Despite becoming a standard indicator within populism studies,Footnote 138 the dataset struggles to measure populism as a movement across countries, due to differences in voting systems. While most Western European parliamentary elections rely on versions of proportional voting, the French National Assembly relies on a two-round first-past-the-post system, which is likely to favour large established parties and underestimate populist parties. While a more precise measurement would require extensive polling, the dataset has nevertheless been widely used as a convenient proxy for populism due to its expert-informed qualitative comparative classification methodology, which ensures a systematic comparison of populist parties.Footnote 139 Meanwhile, the mechanism behind H3 is not based on populism per se, but on the argument that political platforms can operate as enablers of Russian information warfare, by signalling useful mobilisable issues to adopt and serving as political allies that normalise Russian strategic narratives. Playing this role requires not only a fit between fringe ideology and Russian opportunism but also visibility in the political landscape and the electoral strength to enter parliament.

To measure the great power hypothesis, I rely on SIPRI data on military expenditure and World Bank data on the size of economies and populations to identify key regional players. For more, see Table A1 in the appendix.

To account for the hierarchical structure that merging a societal-level and group-level dataset would entail, with relatively few countries, I opt for a more conservative descriptive analysis, testing each hypothesis separately in a bivariate association.
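Continuing the sketch above, the descriptive approach amounts to merging the country-level indicators with the group-level reach measure and inspecting one bivariate relation at a time; the file and column names below are hypothetical.

```python
import pandas as pd

# Hypothetical file with one row per country and the indicator columns:
# country, psm_share, media_trust, affective_pol, populist_seats.
indicators = pd.read_csv("societal_factors.csv").set_index("country")

# Country-level mean frequency of Russian URLs, via monthly_reach() above.
country_reach = monthly_reach(posts).groupby("country")["russian_urls"].mean()
merged = indicators.join(country_reach)

# One bivariate association per hypothesis; with ~10 countries, the rank
# correlations are read descriptively alongside scatter plots, not as
# inferential tests.
for factor in ["psm_share", "media_trust", "affective_pol", "populist_seats"]:
    rho = merged["russian_urls"].corr(merged[factor], method="spearman")
    print(f"{factor}: Spearman rho = {rho:.2f}")
```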

Results

As Figure 2 shows, Russian propaganda outlets have a higher average reach in Italian, French, and German fringe communities, with a particularly high performance in Italy during the period of study. While this alludes to a confirmation of H4’s expectation that regional great powers are targeted by Russia and thus see greater reach, any analysis has to take the distribution of fringe groups within the dataset into account. Firstly, the high concentration of Russian propaganda outlet URLs in a minor regional great power is amplified by Italian fringe communities being only the fifth largest in the dataset, on par with Sweden (see Figure 1). Secondly, the relative performance in Italy and the other regional great powers is still unexplained. While the relative influence of these country-level averages should be kept in mind, deeper analysis is needed to answer the remaining hypotheses in full.

Figure 2. Country mean frequency of Russian propaganda URLs among fringe groups.

Turning to H1 in Figure 3 and H1.1 in Figure 4, we find some support for the expectation that national media provide resilience by being able to reach national audiences and by being trusted by the general population. Russian propaganda outlets perform poorly in the high-trust, strong-PSM contexts of Scandinavia and the Netherlands. Furthermore, this societal-level factor seems at first sight to seep into fringe communities. However, this should not be overstated, as such inferences are driven by the relative influence of the French and Italian communities. Once those are removed, the apparent resilience effect of national media weakens or disappears, as Russian performance remains notable in countries like Germany, which has relatively trusted media with a large market share. Hence, the ability of national media to act as a resilience factor among the primary audience of Russian propaganda may be questioned.

Figure 3. National levels of average frequency of Russian URLs in fringe groups and market share of public service media.

Figure 4. National levels of average frequency of Russian URLs in fringe groups and trust in media.

Looking into H2’s proposed relation between Russian information warfare and affective polarisation, we find more convincing confirmation in Figure 5. With a general trend of Russian propaganda outlets performing better in countries with high affective polarisation, like France and Italy, there is support for the argument that feelings of anger and hatred provide material for Russian propagandists. Instead of a source bias against geopolitical adversaries, this may indicate the existence of a threshold beyond which foreign adversaries are seen by fringes as lesser evils compared to domestic adversaries. This points to highly polarised societies being more at risk of having their fringe groups co-opted and manipulated by Russian information warfare.

Figure 5. National levels of average frequency of Russian URLs in fringe groups and affective polarisation. No data for Denmark.

In Figure 6, we see that Russian information warfare performance in France and Spain is accompanied by a larger presence of populist parties in their respective parliamentary assemblies. As far-right parties in France and Italy, for example Rassemblement National (previously Front National), Lega Nord, and Fratelli d’Italia, have had extensive relations with the Kremlin, Figure 6 indicates that such ties may improve the performance of Russian propaganda outlets among fringe communities. While the relatively lower parliamentary presence of Alternative für Deutschland is matched by a smaller reach, it is important to note the relative size of German fringe groups in Figure 1, which may deflate the measured performance of RT and Sputnik. This indicates that populist parties can play a dual role, as both Russian information warfare’s allies against liberal democratic states and political platforms signalling emotional material for Russian propaganda outlets to exploit, though this finding is still driven by the regional great powers.

Figure 6. National levels of average frequency of Russian URLs in fringe groups and percentage share of populist parties in national parliaments.

Finally, we return to strategic priorities and H4’s expectation that Russian information warfare is incentivised to target the fringe communities of regional great powers, where its propaganda outlets should therefore gain greater reach. In Figure 7, we see that fringe communities in regional great powers see a disproportionately higher reach of Russian propaganda outlets, with 40 per cent of their fringe groups sharing content from Russian propaganda outlets more than once a week on average. Meanwhile, the corresponding figure for smaller regional powers is around 10 per cent. As pointed out earlier in Figure 2, Russian propaganda outlets performed far better in great powers, that is, Germany, France, and Italy, supporting H4. While this strategic incentive is also reflected in RT and Sputnik being available in Italian, French, and German, it is noteworthy that the same availability in Spanish does not translate into reach for Russian propaganda. Acting as a proof of concept, Figure 2 shows that in the fringe community of this non-great power, RT and Sputnik have limited reach. As Spain is not a regional great power in Europe, the propaganda outlets’ content is focused more on appealing to the geopolitically important audiences of Latin America. Similarly, though Austria has the second-largest fringe community, it also has a vastly lower average reach than Italy, France, and Germany. As such, H4 is supported, underscoring the need to include strategic priorities as a risk factor when considering societal resilience against information warfare.

Figure 7. Share of fringe groups where Russian URLs appeared more than once per week on average, by great and small power status.
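As a final illustration, the Figure 7 statistic can be recovered from the same group-level counts; `all_groups` (an index of every fringe group) and the thirteen-week window length are assumptions of this sketch.

```python
GREAT_POWERS = {"France", "Germany", "Italy"}
WEEKS = 13  # approximate length of the three-month window

# Sum each group's Russian URL count over the window; reindexing against
# all fringe groups keeps zero-sharing groups in the denominator.
totals = (monthly_reach(posts)
          .groupby(["country", "group_id"])["russian_urls"].sum()
          .reindex(all_groups, fill_value=0))

frequent = (totals / WEEKS) > 1.0  # more than once per week on average
is_great = frequent.index.get_level_values("country").isin(GREAT_POWERS)
print(frequent.groupby(is_great).mean())  # paper: roughly 40% vs. ~10%
```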

Discussion and conclusion

The findings underscore the central argument that the effects of Russian information warfare are conditioned by the context of the society and audience it targets. The performance of its main enablers, that is, Russian propaganda outlets, varies across the societal and strategic contexts of the targeted fringe communities. Taking these factors into account underscores not only which factors enable subversive information warfare but also which countries are at high risk of their fringes being co-opted by Russia.

The findings indicate a more nuanced relationship between the societal factors than the societal resilience literature would expect. Because they are heavily influenced by the Italian cases, the findings do not conclusively support media strength as a limit on Russia’s ability to project into fringe communities. Without this outlier, it is questionable whether high trust and a large PSM market share do indeed limit Russia’s performance among a subset of audiences who reject media as an epistemic authority. One important data point supporting this scepticism is the relative strength of German media in this regard. More analysis of Russian performance in different media contexts is therefore necessary.

Meanwhile, the findings on affective polarisation and populism both supported the expectation that these act as enablers of Russian information warfare, underscoring how Russia needs emotional material, allies, and signals to reach its primary audiences. A similarly nuanced relationship appears for affective polarisation, which supplies Russian information warfare with exploitable sentiments and beliefs. As populist parties absorb fringe communities’ emotional material and provide political platforms with which Russian propaganda outlets can collaborate or to which they can adjust their content, these parties become more important for Russian propaganda outlets. Beyond echoing Russian propaganda outlets, stronger populist parties make it easier for Russian information warfare to reach fringe audiences by highlighting both which topics are most effective among fringe communities and how to frame them. The role of populist parties and affective polarisation as enablers of Russian information warfare further points to the alliance of convenience that may emerge between Russia and fringe communities. While this challenges Lanoszka’s source bias towards adversarial states, it also touches upon a deeper assumption that foreign adversaries are uniformly perceived as such within a country, as populist parties have collaborated with Russia and may perceive it as a lesser and necessary evil in combating the domestic political establishment. However, to fully untangle whether such a mediating interaction exists, more empirical research is necessary.

But while strong populist parties in, for example, Spain could be effective allies for Russian information warfare efforts, the strategic incentive to do so is not there. Instead, Russia focuses its efforts on fringe audiences who can influence regional rivals that otherwise cannot be coerced economically or militarily. As such, societies’ vulnerability to information warfare must also be seen in context of the political interest of the state waging it. In this sense, effects must take into account both the context in which states become vulnerable and the incentives for Russia to target its fringe audiences. However, it should be noted that priorities are not fixed and crises in smaller adversaries can be opportunistically exploited in the short term. One example is the recent role Russian propaganda outlets played in exacerbating tensions and anger in Spain following the 2024 flood in Valencia, in which it peddled anti-establishment conspiracy theories.Footnote 140

Nevertheless, sceptics of Russian information warfare may argue that fringe communities are by definition irrelevant to the formation of state policies, due to their small size. However, anecdotal evidence from the COVID-19 pandemic shows that fringe communities are growing in size, have voiced their disagreement through various large-scale demonstrations, and have elected representatives in Western governments.Footnote 141 By tapping into such fringe communities’ concerns, populist figures have gained political momentum and, despite their small numbers, been able to position themselves in domestic politics to have an outsized influence on state policies. Meanwhile, the findings may be influenced by studying the three months before Russia’s full-scale invasion of Ukraine, whose aggressiveness may since have triggered the source bias against Russia, making pro-Russian statements politically untenable for populist parties. However, further anecdotal evidence suggests that by re-framing support for Russia as opposition to Ukraine, Russian propaganda can tap into affective polarisation and populist movements and thereby influence Western policies of supporting Ukraine. Examples include the delay of military aid in the United States and large peace demonstrations in Germany opposing weapons deliveries ‘prolonging the war’, supported by both the far-right and far-left.Footnote 142

While this paper is a first step towards understanding the complex conditions for information warfare, more work is needed to untangle the interactions between these factors and their effect on Russian information warfare’s ability to project strategic narratives. The exploitation and mobilisation of fringe communities underscore an inherent challenge of measuring information warfare that does not persuade but boosts what already exists. Unlike tracing when individuals go from not believing to believing, measuring the boosting of opinions requires untangling the effects of Russian propaganda outlets from other correlated fringe beliefs and individuals’ pre-existing opinions. Meanwhile, this would require measuring resonance and behavioural change among fringe communities, which are unlikely to be approachable for scientific inquiry due to anti-elite biases. Nevertheless, what influences the Russian ability to project into and reach these communities is an important aspect of both societal resilience and the study of information warfare.

Christiern Santos Okholm is a PhD student at the European University Institute with a special focus on Russian information warfare.

Acknowledgements

None.

Appendix

Table A1. Statistical summary of the great power variable.

Note: Categorisation of countries was based on SIPRI data on military budget and World Bank data on population and GDP.


References

1 See Mark Galeotti, Russian Political War: Moving beyond the Hybrid (New York: Taylor & Francis, 2019); Herbert Lin, ‘The existential threat from cyber-enabled information warfare’, Bulletin of the Atomic Scientists, 75:4 (2019), pp. 187–96; Christopher Walker, Shanthi Kalathil, and Jessica Ludwig, ‘The cutting edge of sharp power’, Journal of Democracy, 31:1 (2020), pp. 124–37; Mikael Wigell, ‘Hybrid interference as a wedge strategy: A theory of external interference in liberal democracy’, International Affairs, 95:2 (2019), pp. 255–75.

2 See Gregory Eady, T. Paschalis, J. Zilinsky, et al., ‘Exposure to the Russian Internet Research Agency foreign influence campaign on Twitter in the 2016 US election and its relationship to attitudes and voting behavior’, Nature Communications, 14:62 (2023), pp. 1–11; Aaron Erlich and Calvin Garner, ‘Is pro-Kremlin disinformation effective? Evidence from Ukraine’, The International Journal of Press/Politics (2021), pp. 1–11; Nir Grinberg, Kenneth Joseph, Lisa Friedland, Briony Swire-Thompson, and David Lazer, ‘Fake news on Twitter during the 2016 U.S. presidential election’, Science, 363 (2019), pp. 374–8; Guess and Tucker in Nathaniel Persily and Joshua A. Tucker, Social Media and Democracy: The State of the Field, Prospects for Reform (Cambridge: Cambridge University Press, 2020).

3 See Johan Farkas and Jannick Schou, Post-Truth, Fake News and Democracy: Mapping the Politics of Falsehoods (New York: Taylor and Francis, 2020), p. 19; Jon Roozenbeek, Propaganda and Ideology in the Russian–Ukrainian War (Cambridge: Cambridge University Press, 2024).

4 See Alexander Lanoszka, ‘Disinformation in international politics’, European Journal of International Security, 4:2 (2019), pp. 227–48; Thrall and Armstrong in Christopher Whyte, A. Trevor Thrall, and Brian Mazanec, Information Warfare in the Age of Cyber Conflict (London: Routledge, 2020), pp. 73–87.

5 See Andreas Krieg, Subversion: The Strategic Weaponization of Narratives (Washington, DC: Georgetown University Press, 2023); Walker et al. (2020); Ilya Yablokov and Precious N. Chatterjee-Doody, Russia Today and Conspiracy Theories: People, Power and Politics on RT (New York: Routledge, 2021).

6 See Samantha Bradshaw, Renee DiResta, and Carly Miller, ‘Playing both sides: Russian state-backed media coverage of the #BlackLivesMatter movement’, The International Journal of Press/Politics (2022), pp. 1–27; Andrew Dawson and Martin Innes, ‘How Russia’s Internet Research Agency built its disinformation campaign’, The Political Quarterly, 90:2 (2019); Deen Freelon, Michael Bossetta, Chris Wells, Josephine Lukito, Yiping Xia, and Kirsten Adams, ‘Black Trolls Matter: Racial and ideological asymmetries in social media disinformation’, Social Science Computer Review (2020), pp. 1–19; John D. Gallache and Marc W. Heerdink, ‘Measuring the effect of Russian Internet Research Agency information operations in online conversations’, Defence Strategic Communications, 6:1 (2019), pp. 155–98; Charles Kriel and Alexa Pavliuc, ‘Reverse engineering Russian Internet Research Agency tactics through network analysis’, Defence Strategic Communications, 6 (2019), pp. 190–227.

7 See Matthew Rhodes-Purdy, Rachel Navarre, and Stephen Utych, The Age of Discontent: Populism, Extremism and Conspiracy Theories in Contemporary Democracies (Cambridge: Cambridge University Press, 2023).

8 See Yochai Benkler, Robert Faris, and Hal Roberts, Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics (Oxford: Oxford University Press, 2018); Edda Humprecht, Frank Esser, and Peter Van Aelst, ‘Resilience to online disinformation: A framework for cross-national comparative research’, The International Journal of Press/Politics (2020), pp. 1–24; Richard Rogers and Sabine Niederer, The Politics of Social Media Manipulation (Amsterdam: Amsterdam University Press, 2020); Ragne Kõuts-Klemm, Anda Rožukalne, and Deimantas Jastramskis, ‘Resilience of national media systems: Baltic media in the global network environment’, Journal of Baltic Studies, 53:4 (2022), pp. 543–64.

9 See Edward T. Hall, Beyond Culture (New York: Doubleday, 1977), p. 2.

10 See Humprecht et al. (2020).

11 See North Atlantic Treaty Organization (NATO), ‘Strengthened Resilience Commitment’ (14 June 2021) available at: {https://www.nato.int/cps/en/natohq/official_texts_185340.htm}, accessed 24 June 2024; R. Jungwirth, H. Smith, E. Willkomm, J. Savolainen, M. Alonso Villota, M. Lebrun, A. Aho, and G. Giannopoulos, Hybrid Threats: A Comprehensive Resilience Ecosystem (Publications Office of the European Union, 2023), available at: {https://publications.jrc.ec.europa.eu/repository/handle/JRC129019}, accessed 24 June 2025.

12 Alexander Lanoszka, ‘Disinformation in international politics’, European Journal of International Security, 4:2 (2019), pp. 227–48.

13 See Corneliu Bjola and Krysianna Papadakis, ‘Digital propaganda, counterpublics and the disruption of the public sphere: The Finnish approach to building digital resilience’, Cambridge Review of International Affairs, 33:5 (2020); Shelley Boulianne, Chris Tenove, and Jordan Buffie, ‘Complicating the resilience model: A four-country study about misinformation’, Media and Communication, 10:3 (2022), pp. 638–66; Spencer McKay and Chris Tenove, ‘Disinformation as a threat to deliberative democracy’, Political Research Quarterly (2020), pp. 1–15; Chris Tenove, ‘Protecting democracy from disinformation: Normative threats and policy responses’, The International Journal of Press/Politics, 25:3 (2020), pp. 517–37.

14 See Roozenbeek (2024).

15 See Christopher A. Bail, B. Guay, E. Maloney, A. Combs, D. S. Hillygus, F. Merhout, D. Freelon, and A. Volfovsky, ‘Assessing the Russian Internet Research Agency’s impact on the political attitudes and behaviors of American Twitter users in late 2017’, Proceedings of the National Academy of Sciences of the United States of America (2019), pp. 1–8.

16 See Lanoszka (2019).

17 See Lanoszka (2019).

18 See Kenneth Waltz, Theory of International Politics (Reading, MA: Addison-Wesley, 1979).

19 See Robert Jervis, Perception and Misperception in International Politics (Princeton, NJ: Princeton University Press, 1976).

20 See Lanoszka (2019).

21 See Valentina Feklyunina, ‘Soft power and identity: Russia, Ukraine and the “Russian world(s)”’, European Journal of International Relations, 22:4 (2016), pp. 773–96.

22 See Victoria Hudson, ‘“Forced to friendship”? Russian (mis-)understandings of soft power and the implications for audience attraction in Ukraine’, Politics, 35:3–4 (2015), pp. 330–46.

23 See Joanna Szostek, ‘The power and limits of Russia’s strategic narrative in Ukraine: The role of linkage’, Perspectives on Politics, 15:2 (2017), pp. 379–95.

24 See Jack Nassetta and Kimberly Gross, ‘State media warning labels can counteract the effects of foreign disinformation’, Harvard Kennedy School Misinformation Review (2020), pp. 1–11, available at: {https://doi.org/10.37016/mr-2020-45}.

25 See Garth S. Jowett and Victoria O’Donnell, Propaganda and Persuasion (London: Sage Publications, 2012), p. 18.

26 For an overview see Peter Pomerantsev and Michael Weiss, ‘The menace of unreality: How the Kremlin weaponizes information, culture and money’, The Institute of Modern Russia (2014), available at: {https://imrussia.org/media/pdf/Research/Michael_Weiss_and_Peter_Pomerantsev__The_Menace_of_Unreality.pdf}, accessed 27 May 2018.

27 See Alexandre Alaphilippe, Gary Machado, Raquel Miguel, and Francesco Poldi, ‘Doppelganger media clones serving Russian propaganda’, EU Disinfolab (27 September 2022), available at: {https://www.disinfo.eu/wp-content/uploads/2022/09/Doppelganger-1.pdf}, accessed 24 June 2025.

28 See Bradshaw, DiResta, and Miller (2022).

29 See Gordon Ramsay and Sam Robertshaw, ‘Weaponising news: RT, Sputnik and targeted disinformation’, King’s College London (January 2019), available at: {https://www.kcl.ac.uk/policy-institute/assets/weaponising-news.pdf}, accessed 24 June 2025.

30 See Yablokov and Chatterjee-Doody (2021), p. 26.

31 See Aleksandr Fisher, ‘Demonizing the enemy: The influence of Russian state-sponsored media on American audiences’, Post-Soviet Affairs, 36:4 (2020), pp. 281–96.

32 See Charlotte Wagnsson, Torsten Blad, and Aiden Hoyle, ‘“Keeping an eye on the other side”: RT, Sputnik, and their peculiar appeal in democratic societies’, The International Journal of Press/Politics, 29:4 (2024), pp. 1109–31.

33 See Daniel Kahneman, Thinking, Fast and Slow (New York: Farrar, Straus and Giroux, 2011); Daniel Kahneman, Paul Slovic, and Amos Tversky, Judgment under Uncertainty: Heuristics and Biases (Cambridge: Cambridge University Press, 1982); Hugo Mercier, Not Born Yesterday: The Science of Who We Trust and What We Believe (Princeton, NJ: Princeton University Press, 2020).

34 Bail et al. (2019).

35 Erlich and Garner (2021).

36 Roozenbeek (2024).

37 Philip N. Howard, Bence Kollanyi, Samantha Bradshaw, and Lisa Maria Neudert, ‘Social media, news and political information during the US election: Was polarizing content concentrated in swing states?’, The Project on Computational Propaganda at Oxford University (28 September 2017), available at: {https://demtech.oii.ox.ac.uk/wp-content/uploads/sites/12/2017/09/Polarizing-Content-and-Swing-States.pdf}, accessed 24 June 2025.

38 See Frederik Hjorth and Rebecca Adler-Nissen, ‘Ideological asymmetry in the reach of pro-Russian digital disinformation to United States audiences’, Journal of Communication, 69:2 (2019), pp. 168–92.

39 See Freelon et al. (2020).

40 Dawson and Innes (2019).

41 See Renee DiResta, Kris Shaffer, Becky Ruppel, David Sullivan, Robert Matney, Ryan Fox, Jonathan Albright, and Ben Johnson, ‘The Tactics & Tropes of the Internet Research Agency’, New Knowledge (1 December 2018), available at: {https://int.nyt.com/data/documenthelper/533-read-report-internet-research-agency/7871ea6d5b7bedafbf19/optimized/full.pdf}, accessed 24 June 2025.

42 Kriel and Pavliuc (2019).

43 See Afonso de Albuquerque, Thaiane M. Oliveira, Marcelo A. dos Santos Jr., Rodrigo Quinan, and Daniela Mazur, ‘Coronavirus meets the clash of civilizations’, Convergence: The International Journal of Research into New Media Technologies, 28:4 (2022), pp. 1198–213; Michael Bang Petersen, Mathias Osmundsen, and Kevin Arceneaux, ‘The “need for chaos” and motivations to share hostile political rumors’, American Political Science Review, 117:4 (2023), pp. 1486–505, available at: {http://doi.org/10.1017/S0003055422001447}; Edlira Palloshi Disha, Albulena Halili, and Agron Rustemi, ‘Vulnerability to disinformation in relation to political affiliation in North Macedonia’, Media and Communication, 11:2 (2023), pp. 42–52; Jennifer L. Hochschild and Katherine Levine Einstein, ‘Do facts matter? Information and misinformation in American politics’, Political Science Quarterly, 130:4 (2015), pp. 585–624; Brian McKernan, Patricia Rossini, and Jennifer Stromer-Galley, ‘Echo chambers, cognitive thinking styles, and mistrust? Examining the roles information sources and information processing play in conspiracist ideation’, International Journal of Communication, 17 (2023), pp. 1102–25; Anna K. Spälti, B. Lyons, F. Stoeckel, S. Stöckli, P. Szewach, V. Merola, C. Stednitz, P. Gonzales, and J. Reifler, ‘Partisanship and anti-elite worldviews as correlates of science and health beliefs in the multi-party system of Spain’, Public Understanding of Science (2023), pp. 1–20.

44 See Rhodes-Purdy et al. (2023).

45 See Ben Dubow, Edward Lucas, and Jake Morris, ‘Jabbed in the back: Mapping Russian and Chinese information operations during COVID-19’, Center for European Policy Analysis (2021), available at: {https://cepa.org/wp-content/uploads/2021/12/Jabbed-in-the-Back-12.2.21.pdf}, accessed 24 June 2025; Ilya Yablokov, ‘Conspiracy theories as a Russian public diplomacy tool: The case of Russia Today (RT)’, Politics, 35:3–4 (2015), pp. 301–15.

46 See Andrew Foxall, ‘From Evropa to Gayropa: A critical geopolitics of the European Union as seen from Russia’, Geopolitics, 24:1 (2019), pp. 174–93; James Headley, ‘Challenging the EU’s claim to moral authority: Russian talk of “double standards”’, Asia Europe Journal, 13:3 (2015), pp. 297–307.

47 See Vincent C. Keating and Katarzyna Kaczmarska, ‘Conservative soft power: Liberal soft power bias and the “hidden” attraction of Russia’, Journal of International Relations and Development (2017), pp. 1–27.

48 See Andrew Foxall, ‘Putin’s useful idiots: Britain’s left, right and Russia’, Russia Studies Centre at the Henry Jackson Society (2016), available at: {https://henryjacksonso.wpengine.com/wp-content/uploads/2016/10/Putins-Useful-Idiots.pdf}, accessed 27 May 2018.

49 See Yablokov and Chatterjee-Doody (2021).

50 See Krieg (2023).

51 See Mike Winnerstig, Gudrun Persson, Anna Bulakh, Julian Tupay, Karel Kaas, Emmet Tuohy, Kristiina Visnapuu, Juhan Kivirähk, Andis Kudors, and Nerijus Maliukevičius, ‘Tools of Destabilization: Russian Soft Power and Non-military Influence in the Baltic States’, Swedish Defence Research Agency (FOI) (2014), available at: {https://www.foi.se/rapportsammanfattning?reportNo=FOI-R–3990–SE}, accessed 4 March 2018.

52 See KAPO, ‘Annual Report 2007’, Estonian Internal Security Service (2007), available at: {https://kapo.ee/sites/default/files/content_page_attachments/Annual%20Review%202007.pdf}, accessed 24 June 2025.

53 See Kõuts-Klemm et al. (2022); Jean-Baptiste Jeangène Vilmer, ‘Effective state practices against disinformation: Four country case studies’, Hybrid CoE (2021), available at: {https://www.hybridcoe.fi/publications/hybrid-coe-research-report-2-effective-state-practices-against-disinformation-four-country-case-studies/}, accessed 24 June 2025.

54 See Bjola and Papadakis (2020); Shelley Boulianne and Edda Humprecht, ‘Perceived exposure to misinformation and trust in institutions in four countries before and during a pandemic’, International Journal of Communication, 17 (2023), pp. 2024–47; Edda Humprecht, Laia Castro Herrero, Sina Blassnig, Michael Brüggemann, and Sven Engesser, ‘Media systems in the digital age: An empirical comparison of 30 countries’, Journal of Communication, 72:2 (2022), pp. 145–64; Humprecht et al. (2020); McKay and Tenove (2020); Tenove (2020); Wigell (2019).

55 See Joan E. Grusec and Paul D. Hastings (eds), Handbook of Socialization: Theory and Research (London: Guilford Press, 2014).

56 An overview of such state policies can be found in Richard Rogers and Sabine Niederer, The Politics of Social Media Manipulation (Amsterdam: Amsterdam University Press, 2020), p. 53.

57 Bjola and Papadakis (2020).

58 Kõuts-Klemm et al. (2022).

59 See Cedric de Coning, ‘Strengthening the resilience and adaptive capacity of societies at risk from hybrid threats’, Hybrid CoE Working Paper 9 (2021).

60 See Heine Sørensen and Dorthe Bach Nyemann, ‘Going beyond resilience: A revitalized approach to countering hybrid threats’, Hybrid CoE Strategic Analysis, 13 (2018).

61 See Nahema Marchal, ‘“Be nice or leave me alone”: An intergroup perspective on affective polarization in online political discussions’, Communication Research (2021), pp. 1–23.

62 See Javier Serrano-Puche, ‘Digital disinformation and emotions: Exploring the social risks of affective polarization’, International Review of Sociology, 31:2 (2021), pp. 231–45.

63 See Humprecht et al. (2020).

64 See Nic Newman, Richard Fletcher, Antonis Kalogeropoulos, David A. L. Levy, and Rasmus Kleis Nielsen, ‘Reuters Institute Digital News Report 2018’, Reuters Institute for the Study of Journalism (2018), available at: {https://reutersinstitute.politics.ox.ac.uk/sites/default/files/digital-news-report-2018.pdf}, accessed 24 June 2024.

65 Newman et al. (2018), p. 19.

66 See Robert B. Michael and Brooke O. Breaux, ‘The relationship between political affiliation and beliefs about sources of “fake news”’, Cognitive Research: Principles and Implications, 6:1 (2021), pp. 1–5; Mathias Osmundsen, Alexander Bor, Peter Bjerregaard Vahlstrup, Anja Bechmann, and Michael Bang Petersen, ‘Partisan polarization is the primary psychological motivation behind political fake news sharing on Twitter’, American Political Science Review, 115:3 (2021), pp. 999–1015; Joseph E. Uscinski, Casey Klofstad, and Matthew D. Atkinson, ‘What drives conspiratorial beliefs? The role of informational cues and predispositions’, Political Research Quarterly, 69:1 (2016), pp. 57–71.

67 See European Commission (2023), p. 81; John Watts, ‘Whose truth? Sovereignty, disinformation, and winning the battle of trust’, Atlantic Council (September 2018), available at: {https://www.atlanticcouncil.org/wp-content/uploads/2018/09/Sovereign_Challenge_Report_091718_web.pdf}, accessed 22 May 2018.

68 See Denis McQuail, McQuail’s Mass Communication Theory (London: SAGE Publications, 2010).

69 See Benkler et al. (2018).

70 See Claire Wardle and Hossein Derakhshan, ‘Information Disorder: Toward an Interdisciplinary Framework for Research and Policy Making’, Council of Europe Report (2017), available at: {https://rm.coe.int/information-disorder-toward-an-interdisciplinary-framework-for-researc/168076277c}, accessed 24 June 2025.

71 See Rhys Crilley, M. Gillespie, B. Vidgen, and A. Willis, ‘Understanding RT’s audiences: Exposure not endorsement for Twitter followers of Russian state-sponsored media’, The International Journal of Press/Politics, 27:1 (2020), pp. 220–42; Charlotte Wagnsson, Torsten Blad, and Aiden Hoyle, ‘“Keeping an eye on the other side”: RT, Sputnik, and their peculiar appeal in democratic societies’, The International Journal of Press/Politics, 29:4 (2024).

72 See Vincent Hendricks and Mads Vestergaard, Reality Lost: Markets of Attention, Misinformation and Manipulation (Cham: Springer, 2018).

73 See Sander van der Linden and Jon Roozenbeek, The Psychology of Misinformation (Cambridge: Cambridge University Press, 2024).

74 Sacha Altay, Manon Berriche, Hendrik Heuer, Johan Farkas, and Steven Rathje, ‘A survey of expert views on misinformation: Definitions, determinants, solutions, and future of the field’, Harvard Kennedy School Misinformation Review (2023), available at: {https://doi.org/10.37016/mr-2020-119}.

75 See Kõuts-Klemm et al. (2022), p. 547.

76 See Daniel Hallin and Paolo Mancini, Comparing Media Systems: Three Models of Media and Politics (Cambridge: Cambridge University Press, 2004).

77 Minna Horowitz, Stephen Cushion, Marius Dragomir, Sergio Gutiérrez Manjón and Mervi Pantti, ‘A Framework for Assessing the Role of Public Service Media Organizations in Countering Disinformation’, Digital Journalism, 10:5 (2021), pp. 843–65.

78 Humprecht et al. (2022).

79 See Ivana Smoleňová, Barbora Chrzová, Iveta Várenyiov, Dušan Fischer, Dániel Bartha, András Deák, András Rácz, and Andrzej Turkowski, ‘United We Stand, Divided We Fall: The Kremlin’s Leverage in the Visegrad Countries’, Prague Security Studies Institute (November 2017), available at: {https://www.ceid.hu/wp-content/uploads/2017/11/Publication_United-We-Stand-Divided-We-Fall.pdf}, accessed 10 February 2018, p. 29.

80 See Michael Hameleers, Anna Brosius, and Claes H. de Vreese, ‘Whom to trust? Media exposure patterns of citizens with perceptions of misinformation and disinformation related to the news media’, European Journal of Communication, 37:3 (2022), pp. 237–68.

81 See Jan Kreft, Monika Boguszewicz-Kreft, and Daria Hliebova, ‘Under the fire of disinformation: Attitudes towards fake news in the Ukrainian frozen war’, Journalism Practice (2023), pp. 1–21.

82 See Watts (2018).

83 See Sebastián Valenzuela, Daniel Halpern, and Felipe Araneda, ‘A downward spiral? A panel study of misinformation and media trust in Chile’, The International Journal of Press/Politics (2021).

84 See Krieg (2023).

85 See the EU’s CORE model in European Commission (2023), p. 82.

86 See Rhodes-Purdy et al. (2023).

87 Gallacher and Heerdink (2019).

88 Kriel and Pavliuc (2019).

89 Freelon et al. (2020).

90 Bradshaw, DiResta, and Miller (2022).

91 Libby Jenke, ‘Affective polarization and misinformation belief’, Political Behavior, 46 (2024), pp. 825–84.

92 Osmundsen et al. (2021).

93 Marchal (2021).

94 Serrano-Puche (2021).

95 Humprecht et al. (2020).

96 See Humprecht et al. (2020); Boulianne and Humprecht (2023).

97 Anton Shekhovtsov, Russia and the Western Far Right: Tango Noir (London: Routledge, 2018).

98 Foxall (2016).

99 Adam Holesch, Piotr Zagórski, and Luis Ramiro, ‘European radical left foreign policy after the invasion of Ukraine: Shifts in assertiveness towards Russia’, Political Research Exchange, 6:1 (2024), pp. 1–22.

100 Mikhail Khodorkovsky, ‘Plan for life after Vladimir Putin’, Politico (2016), available at: {https://www.politico.eu/article/life-after-vladimir-putin-eu-russia-relations-sanctions-kremlin-moral-boundaries-mikhail-khodorkovsky/}, accessed 24 June 2025.

101 Yablokov and Chatterjee-Doody (2021).

102 See Rhodes-Purdy et al. (2023).

103 See Larisa Doroshenko, ‘Far-right parties in the European Union and media populism: A comparative analysis of 10 countries during European parliament elections’, International Journal of Communication, 12 (2018), pp. 3186–206; Jonathan Kennedy, ‘Populist politics and vaccine hesitancy in Western Europe: An analysis of national-level data’, European Journal of Public Health, 29:3 (2019), pp. 512–16; Rasmus Skytte, ‘Dimensions of elite partisan polarization: Disentangling the effects of incivility and issue polarization’, British Journal of Political Science, 51:4 (2020), pp. 1457–75.

104 See Cas Mudde, Populist Radical Right Parties in Europe (Cambridge: Cambridge University Press, 2007).

105 See Alliance for Securing Democracy, ‘Targeting Baerbock: Gendered Disinformation in Germany’s 2021 Federal Election’ (2021), available at: {https://securingdemocracy.gmfus.org/targeting-baerbock-gendered-disinformation-in-germanys-2021-federal-election/}, accessed 24 June 2025; Fredrik Wesslau, ‘Putin’s Friends in Europe’, European Council on Foreign Relations (2016), available at: {https://ecfr.eu/article/commentary_putins_friends_in_europe7153/}, accessed 4 April 2018.

106 See Arista Beseler and Florian Toepfl, ‘Conduits of the Kremlin’s informational influence abroad? How German-language alternative media outlets are connected to Russia’s ruling elites’, The International Journal of Press/Politics (2024), pp. 1–20.

107 See Mikael Wigell, ‘Democratic deterrence: How to dissuade hybrid interference’, The Washington Quarterly, 44:1 (2021), pp. 49–67.

108 See Dmitry Adamsky, The Russian Way of Deterrence: Strategic Culture, Coercion, and War (Stanford, CA: Stanford University Press, 2023); Tracey German, ‘Harnessing protest potential: Russian strategic culture and the colored revolutions’, Contemporary Security Policy, 41:4 (2020); Samuel A. Greene, Putin v. The People: The Perilous Politics of a Divided Russia (London: Yale University Press, 2019); Oscar Jonsson, The Russian Understanding of War: Blurring the Lines Between War and Peace (Washington, DC: Georgetown University Press, 2019); Martin Kragh, Det fallna imperiet: Ryssland och väst under Vladimir Putin [The Fallen Empire: Russia and the West under Vladimir Putin] (Stockholm: Fri Tanke, 2022).

109 See Keir Giles, Moscow Rules (Washington, DC: Brookings Institution Press, 2018), p. 13.

110 See James Sherr, Hard Diplomacy and Soft Coercion: Russia’s Influence Abroad (Brookings Institution Press, 2013).

111 See Dmitri Trenin, Russia (UK: Polity, 2019).

112 See Galeotti (2019); Mette Skak, ‘Russia’s new Monroe Doctrine’, in Roger E. Kanet (ed.), Russia’s Foreign Policy in the 21st Century (Palgrave Macmillan, 2011).

113 See Galeotti (2019); Mette Skak, ‘Russia’s new Monroe Doctrine’, in Roger E. Kanet (ed.), Russia’s Foreign Policy in the 21st Century (Palgrave Macmillan, 2011).

114 See Steve Abrams, ‘Beyond propaganda: Soviet active measures in Putin’s Russia’, Connections: The Quarterly Journal, 15:1 (2016); Jolanta Darczewska and Piotr Żochowski, Active Measures: Russia’s Key Export (Warsaw: Centre for Eastern Studies, 2017); Katri Pynnöniemi and Minna Jokele, ‘Perceptions of hybrid war in Russia: Means, targets and objectives identified in the Russian debate’, Cambridge Review of International Affairs, 33:6 (2020), pp. 828–45.

115 See Christiern Santos Okholm, Amir Ebrahimi Fard and Marijn ten Thij, ‘Blocking the information war? Testing the effectiveness of the EU’s censorship of Russian state propaganda among the fringe communities of Western Europe’, Internet Policy Review, 13:3 (2024), p. 7.

116 See Santos Okholm et al. (2024).

117 See Rhodes-Purdy et al. (2023).

118 See Kath Browne, ‘Snowball sampling: Using social networks to research non‐heterosexual women’, International Journal of Social Research Methodology, 8:1 (2005), pp. 47–60; Rebecca D. Petersen and Avelardo Valdez, ‘Using snowball-based methods in hidden populations to generate a randomized community sample of gang-affiliated adolescents’, Youth Violence and Juvenile Justice, 3:2 (2005), pp. 151–67.

119 See Pew Research Center, ‘The Role of Alternative Social Media in the News and Information Environment’ (6 October 2022), available at: {https://www.pewresearch.org/journalism/2022/10/06/the-role-of-alternative-social-media-in-the-news-and-information-environment/}, accessed 24 June 2025.

120 See Institute for Strategic Dialogue, ‘The Murky Origin Story of #IstandwithRussia: How Influencer Networks Proliferating across Social Media Platforms Spread Pro-Kremlin Narratives and Hashtags’ (2022), available at: {https://www.isdglobal.org/wp-content/uploads/2022/05/The-murky-origin-story-of-I-stand-with-russia.pdf}, accessed 22 June 2025.

121 See Jakob Guhl, Julia Ebner, and Jan Rau, ‘The Online Ecosystem of the German Far-Right’, Institute for Strategic Dialogue (2020), available at: {https://www.isdglobal.org/wp-content/uploads/2020/02/ISD-The-Online-Ecosystem-of-the-German-Far-Right-English-Draft-11.pdf}, accessed 24 June 2025.

122 See Statista, ‘Number of Users of Selected Social Media Platforms in Europe from 2017 to 2027, by Platform’ (2023), available at: {https://www.statista.com/forecasts/1334334/social-media-users-europe-by-platform}, accessed 15 March 2023.

123 See Gordon Ramsay and Sam Robertshaw, ‘Weaponising news: RT, Sputnik and targeted disinformation’, King’s College London (January 2019), available at: {https://www.kcl.ac.uk/policy-institute/assets/weaponising-news.pdf}, accessed 24 June 2025.

124 See Nika Aleksejeva and Andy Carvin, ‘Narrative Warfare – How the Kremlin and Russian News Outlets Justify a War of Aggression against Ukraine’, Atlantic Council (2023), available at: {https://www.atlanticcouncil.org/wp-content/uploads/2023/02/Narrative-Warfare-Final.pdf}, accessed 24 June 2025.

125 See Peter W. Singer and Emerson T. Brooking, LikeWar: The Weaponization of Social Media (New York: First Mariner Books, 2018).

126 See Florian Saurwein, Tobias Eberwein, and Matthias Karmasin, ‘Public service media in Europe: Exploring the relationship between funding and audience performance’, Journal of the European Institute for Communication and Culture, 26:3 (2019).

127 See Timothy Neff and Victor Pickard, ‘Funding democracy: Public media and democratic health in 33 countries’, The International Journal of Press/Politics (2021).

128 Newman et al. (2022).

129 See Luca Bettarelli, Andres Reiljan, and Emilie Van Haute, ‘A regional perspective to the study of affective polarization’, European Journal of Political Research, 62:2 (2022).

130 See Matthijs Rooduijn, Andrea L. P. Pirro, Daphne Halikiopoulou, Caterina Froio, Stijn van Kessel, Sarah de Lange, Cas Mudde, and Paul Taggart, ‘The PopuList 3.0: An Overview of Populist, Far-left and Far-right Parties in Europe’ (15 February 2024), available at: {www.popu-list.org}, accessed 24 June 2025.

131 The smallest of these (Italy) outperforms the next-largest power, Spain, by 48 per cent on GDP and 24 per cent on population (see World Bank Open Data, available at: {https://data.worldbank.org/}, accessed 24 June 2024), and by 85 per cent on military budget (see SIPRI, ‘SIPRI Military Expenditure Database’, available at: {https://milex.sipri.org/sipri}, accessed 24 June 2024).

132 Saurwein et al. (2019).

133 Neff and Pickard (2021).

134 Nic Newman, Richard Fletcher, Craig Robertson, Kirsten Eddy, and Rasmus Kleis Nielsen, ‘Reuters Institute Digital News Report 2022’, Reuters Institute for the Study of Journalism (2022), available at: {https://reutersinstitute.politics.ox.ac.uk/sites/default/files/2022-06/Digital_News-Report_2022.pdf}, accessed 24 June 2025.

135 See Hameleers et al. (2022); Horowitz et al. (2021); Humprecht et al. (2022).

136 Bettarelli et al. (2022).

137 See Rooduijn et al. (2024).

138 See Maurits J. Meijers and Andrej Zaslove, ‘Measuring populism in political parties: Appraisal of a new approach’, Comparative Political Studies, 54:2 (2021), pp. 372–407; Mudde, Populist Radical Right Parties in Europe; Rhodes-Purdy, Navarre, and Utych, The Age of Discontent; Tobias Widmann, ‘How emotional are populists really? Factors explaining emotional appeals in the communication of political parties’, Political Psychology (2020), pp. 1–9.

139 See Rooduijn et al. (2024).

140 See European Digital Media Observatory (EDMO), ‘How Russian Channels Spread and Amplified Hoaxes about the Spanish King’s and PM’s Entourage Visiting Valencia’ (2024), available at: {https://edmo.eu/publications/how-russian-channels-spread-and-amplified-hoaxes-about-the-spanish-kings-and-pms-entourage-visiting-valencia/}, accessed 24 June 2025.

141 See Washington Post, ‘How the anti-vaccine movement is gaining power in statehouses’ (2023), available at: {https://www.washingtonpost.com/health/2023/12/22/anti-vaccine-covid/}, accessed 24 June 2025.

142 See Washington Post, ‘Kremlin tries to build antiwar coalition in Germany, documents show’ (2023), available at: {https://www.washingtonpost.com/world/2023/04/21/germany-russia-interference-afd-wagenknecht/}, accessed 24 June 2025; Washington Post, ‘Top Republican warns pro-Russia messages are echoed “on the House floor”’ (2024), available at: {https://www.washingtonpost.com/politics/2024/04/07/russian-propaganda-republicans-congress/}, accessed 24 June 2025.

Figure 1. Distribution of fringe communities across countries.

Table 1. Dataset for societal factors, measurable indicators, and sources.

Figure 2. Country mean frequency of Russian propaganda URLs among fringe groups.

Figure 3. National levels of average frequency of Russian URLs in fringe groups and market share of public service media.

Figure 4. National levels of average frequency of Russian URLs in fringe groups and trust in media.

Figure 5. National levels of average frequency of Russian URLs in fringe groups and affective polarisation. No data for Denmark.

Figure 6. National levels of average frequency of Russian URLs in fringe groups and percentage share of populist parties in national parliaments.

Figure 7. Share of fringe groups where Russian URLs appeared more than once per week on average, by great and small power status.

Table A1. Statistical summary of the great power variable.