Despite initial hopes that advances in information technology would spread and deepen democracy around the world, new platforms for communicating have instead provided opportunities for the weakening of democracy. Social media, website hosting, messaging apps, and related technologies provide easy and cheap ways for micro-actors such as individuals and small groups (in addition to more traditional state and non-state actors) to wield soft power for antidemocratic purposes. Of course, the probability that any one soft power action on the part of a micro-actor will have a consequential effect in the world is minuscule. Yet cyber-enabled micro-actions by micro-actors can make a difference, often one that has a negative effect on democratic institutions. QAnon, the anti-vaxx movement, and the spread of racism, antisemitism, and Islamophobia online are just a few examples of the effects of aggregated micro exercises of soft power. Spreading mis- and disinformation as well as divisiveness and hateful messages through social media is a kind of malign soft power that undermines democracy.
Although soft power is generally thought to be the “good” kind of power, since it works by attraction rather than coercion,[1] it has a dark side (Marlin-Bennett, 2022) when the motive is to harm – and I include undermining democracy as a kind of harm. While adversarial state actors or other large and well-resourced non-state actors can easily deploy malign soft power, micro-actors can as well because the costs of doing so are low.[2] Exactly who is acting is often unknown, which complicates how to fight back. Furthermore, even when micro-actors are behind antidemocratic activities, such efforts may align with the interests of states or non-state adversaries that seek to subvert democracy and weaken democratic states. Despite the difficulties of identifying the actors who wield this power, we can observe the wielding[3] of it as flows of information move through social networks.
This chapter provides a framework for analyzing antidemocratic soft power that focuses on flows of antidemocratic messaging. I begin with a review of the initial hopes for information technologies’ contributions to a more democratic world and why those hopes were dashed. In the second section, I explain why it is reasonable to analyze these attacks on democracy by focusing first on the information as it flows rather than on the actors who are attacking. Actors, I argue, are emergent, which is especially relevant in the context of cyber-enabled technologies. In the third section, I focus on conceptualizing soft power, including its malign form that can be used to undermine democracy. In the fourth section, I examine the wielding of antidemocratic soft power through control of information flows (content, velocity, and access). The chapter ends with brief concluding comments.
Initial Hopes
Around 2004, the internet became truly interactive. Previously, accessing the internet meant users seeing the information that was provided to them and providing the information that was requested of them. Web 2.0, the web of social media, allowed users to post content they chose to share online and to do so in a way that made their posts visible to their friends or interested others. This was widely considered to be a very good thing. In 2006, Time magazine named “You” the person of the year, because “[y]ou control the Information Age” by providing content. Lev Grossman, then a technology writer for the magazine and also a novelist, praised internet users “for seizing the reins of global media, for founding and framing the new digital democracy, for working for nothing and beating the pros at their own game.” The tribute continues with a minor caveat, but ends with what in hindsight seems to be recklessly positive spin:
Sure, it’s a mistake to romanticize all this any more than is strictly necessary. Web 2.0 harnesses the stupidity of crowds as well as its wisdom. Some of the comments on YouTube make you weep for the future of humanity just for the spelling alone, never mind the obscenity and the naked hatred.
But that’s what makes all this interesting. Web 2.0 is a massive social experiment, and like any experiment worth trying, it could fail. … This is an opportunity to build a new kind of international understanding, not politician to politician, great man to great man, but citizen to citizen, person to person. It’s a chance for people to look at a computer screen and really, genuinely wonder who’s out there looking back at them. Go on. Tell us you’re not just a little bit curious.
Many others writing in the popular media and scholarly/scientific literature echoed this optimism, sometimes downplaying the obvious caveats. Canadian commentators Don Tapscott and Anthony D. Williams argued for a positive vision of the transformative nature of information technology-enabled interaction, focusing on the possibilities for collaboration to radically alter (in a good way) business and society (Tapscott & Williams, 2006). In later work they also claimed that collaborative governance – citizens being able to weigh in on domestic and transborder policy issues – would improve the democracy of governments and open the world to democratic participation (Tapscott & Williams, 2010). In an interview in CIO Insight, an information technology industry trade magazine, Williams lauded the potential of “governance webs” – interactive websites for policy deliberation and sharing of information. The new capacity for interactivity online “provides a mechanism for collaboration of public agencies, the private sector, community groups and citizens.” While Williams cautioned that Web 2.0 would not be a “silver bullet [delivering] world peace,” he still foresaw “a new golden age of democracy” (quoted in Klein, 2008, p. 36).
Many of the early scholarly publications on Web 2.0 and its broader social consequences also made optimistic claims about the exciting democratizing potential of e-democracy, though scholarly works tended to be more moderate than publications written for a broader audience. Many scholars acknowledged that various hiccoughs, such as a possible lack of citizen interest in participating online, could limit the democratizing potential of this new technology. For example, Kalnes (2009), writing on the 2007 elections in Norway, found that Web 2.0 allowed for greater ease of participation for citizens who wished to engage, even though its effect on pluralism was limited. (See also Anttiroiko, 2010; Boikos, Moutsoulas, & Tsekeris, 2014; Breindl & Francq, 2008; Costa Sánchez & Piñeiro Otero, 2011; Parycek & Sachs, 2010; Raddaoui, 2012; Reddick & Aikins, 2012.)
Yet other scholars provided strong warnings. In an early article on this topic, Cammaerts (2008, p. 359) notes the high hopes that analysts had for a more expansively pluralist society in which anyone could say what they wish (enabling the “radical plurality of the blogosphere”). More to the point, though, he identifies antidemocratic pressures, including those resulting from peer-to-peer intimidation leading to self-censorship, and “the existence of anti-publics, abusing the freedom of expression with the aim to weaken democracy and democratic values” (Cammaerts, 2008, p. 372).[4] (Also see concerns raised by, inter alia, Marlin-Bennett, 2011; Schradie, 2011; Van Dijck & Nieborg, 2009.) In the years since those earlier assessments, Cammaerts’ and other scholars’ pessimism has been validated. The failure of this massive social experiment (to use Grossman’s term) has had, and continues to have, correspondingly negative social repercussions.
Despite naive early expectations that social media and related forms of communication would serve only to encourage people to embrace democracy, these cyber-enabled technologies have also opened opportunities for actors to use soft power to undermine democracy.
Why Not Start by Figuring Out Who the Bad Guys Are?
The analysis of power usually starts with an assessment of who (or what) is acting on whom. However, the nature of actors on social media and related technologies is a moving target. All actors continue to change as a consequence of their interactions, and new and often surprising actors pop up (and often disappear). As Berenskötter (2018) suggests, actors’ ontologies and their motives are constructed through their interactions. In this section, I discuss the governing logics of cyberspace and the emergence of actors to buttress my claim that, in the analysis of antidemocratic soft power, it makes sense to focus first on the flows of information rather than on who is enacting them.[5]
Libertarian and Neoliberal Logics
Cyberspace, encompassing the interactions among users of websites, social media platforms, messaging, and related technologies, has been constituted through the libertarian and neoliberal logics that generate the technical and regulatory structures of internet development and policymaking. Cyberspace consequently usually permits anonymity, pseudonymity, and even spoofing.[6] Although early in the development of the internet many computer scientists and engineers held a soft socialist view that saw code as something to be shared and the internet as something of a public service, by the 1980s that had changed (in tandem with larger social shifts). Internet developers had adopted a Silicon Valley worldview: “‘Technolibertarianism’ became one of the central ideologies of the Internet” (Rosenzweig, 1998, p. 1550; see also Borsook, 1996). Decentralized, participatory decision-making procedures, which in the earlier period signaled equality and camaraderie, were repurposed to fit libertarian norms of limited government intervention and individualism. The emphasis on individual liberty fit with making a hidden identity the default setting, rendering anonymity and pseudonymity both permitted and warranted (see Berker, 2022, on deontic and fitting categories). Put differently, it would have been possible to engineer into internet systems a strong requirement, or at least an expectation, that users identify themselves truthfully in order to have an online presence. That did not happen.
The move to libertarian logics included a shift toward business and profit-making, prioritizing the commodification of information and the protection of intellectual property (Marlin-Bennett, 2004), all of which overlap with neoliberal logics. The Clinton administration designed support for neoliberal logics into its formative internet policies, as evinced by the decision to open the internet to commercial activities and by the creation of the Internet Corporation for Assigned Names and Numbers (ICANN) as a private, nonprofit, multistakeholder organization that would perform internet governance functions. In doing so, the United States maintained its hegemonic position and the power of corporations in cyberspace (Taggart & Abraham, 2023).[7] Requiring identification by default would have added friction to a system optimized for the fast, efficient transactions of a market. For users’ anonymity and pseudonymity, this meant there was no reason to change course and require truthful self-identification by default. Nor was there any requirement that a person (natural or legal) have only one cyber identity. In short, the libertarian and neoliberal underpinnings of the structure of the internet are constitutive of a cyberspace in which users generally may keep their (multiple) identities hidden.
Actors as Emergent
Furthermore, actors and agencies are emergent (Abraham, 2022; Chatterje-Doody & Crilley, 2019; Dunn Cavelty & Jaeger, 2015; Elder-Vass, 2008), regardless of whether the actors are identified, pseudonymous, or anonymous. As Karen Barad explains, actors “do not preexist their interactions; rather [actors] emerge through and as part of their entangled intra-relating.” Emergence continues as actors “are iteratively reconfigured through each intra-action” (Barad, 2007, p. ix). When actors’ identities are reasonably stable, efforts to identify bad actors open opportunities for deterring their actions in the future. That is often the case when actors are states or institutionally coherent non-state actors, though even these recognizable actors keep changing.
The emergence of surprising new micro-actors whose messages undermine democracy is not a novel phenomenon of the Information Age. Henry Ford (1863–1947), well known as the founder of Ford Motor Company, surprisingly became a leading proponent of antisemitic hate and disinformation. His hate-filled screed, The International Jew, was subsequently used for Nazi propaganda (Flink, 2000). The anti-Black and antisemitic Ku Klux Klan, created in 1865 by a group of Confederate Army veterans in Pulaski, Tennessee, is another example (Baudouin, 2011; Quarles, 1999).
However, cyber-enabled technologies, because of their affordability, make it easier for surprising new micro-actors to participate in spreading hate, divisiveness, and mis- and disinformation. Racism, antisemitism, Islamophobia, and other forms of intergroup hatred have resurged through cyber-enabled technologies, and a large proportion of people have been exposed to abusive content (ADL, 2023; Vidgen, Margetts, & Harris, 2019). Cyber-enabled technologies have allowed anti-vaxx groups to share anti-vaxx misinformation and promote vaccine hesitancy and rejection, eroding trust in public health agencies and causing a drop in vaccination rates, as these groups become a “political force” in democratic societies (Piper, 2023; see also Burki, 2020; Wilson & Wiysonge, 2020). Perhaps most surprising is the pseudonymous QAnon, which first emerged on the 4chan imageboard. QAnon adherents spread a bizarre conspiracy theory that combines alt-right divisiveness, disinformation about the outcome of the 2020 US presidential election, confabulation about the so-called deep state, and hatred for Jews, the LGBTQ+ community, and immigrants (QAnon | ADL, 2022). Many of the participants in the January 6, 2021, attack on the United States Capitol were QAnon followers (Lee et al., 2022).
In the next two sections, I focus upon conceptualizing malign soft power and exploring how it is wielded using cyber-enabled technologies.
Conceptualizing Malign Soft Power
Cyber-enabled technologies can be used to wield soft or hard power. Hard power cyberattacks, including sabotage of critical infrastructure[8] and ransomware, are deployed only in adversarial situations. Soft power, on the other hand, seems almost friendly.[9] The usual view is that soft power practices “contribute to a positive image that endears nations with soft power to other nations, which in turn enhances these soft power nations’ influence in world politics” (Gallarotti, 2011, p. 32, stress added).[10] Those who advocate for relying on soft power over hard claim that soft power “cultivates cooperation and compliance in a much more harmonious context” than hard power does (Gallarotti, 2022, p. 384, stress removed).[11] Other scholars have questioned this positive view. Successfully wielding soft power – even of the most pleasant sort – is forceful in the basic sense of getting others to do what they would not otherwise have done, which means that interests have been denied or manipulated (Bially Mattern, 2005; Hayden, 2012, 2017). Soft power can also have negative unintended consequences (Johnson, 2011; Siekmeier, 2014). And, to the point for understanding how soft power can negatively affect democracy: Soft power can be wielded in ways that undercut democracy.
As such, antidemocratic soft power is a kind of malign soft power, as opposed to good or neutral soft power. Malign soft power is a power of attraction used for harm (Marlin-Bennett, 2022), but discerning that a particular action was motivated by a wish to do harm is difficult. Nevertheless, even when we cannot see who is acting – as is often the case with cyber-enabled technologies – we can reasonably infer the motivation behind an action by drawing upon our own practical reasoning. Members of society routinely make such judgments, and inferring motivations from actions and their consequences is a normal part of social life. That otherwise well-meaning people may accidentally do something with a malign consequence, or be misconstrued as doing so, does not cancel out this quotidian way of adjudging actions. In addition, judgments of harm depend on the standpoint of whoever is making the determination,[12] and I acknowledge that my analysis comes from a pro-democracy position.[13] As I discuss in the next section, messages of divisiveness, hate, and mis- and disinformation become attractive (and therefore powerful) when they seduce, trick, or amuse those who are exposed to them into feeling that they share the sentiments.
Ironically, in Western democracies, wielding malign soft power is often legal, which further sets this kind of power apart from hard power. Democracy’s guarantee of freedom of speech makes punishing antidemocratic speech more difficult, though jurisdictions have different laws about the limits of protected speech. (European laws generally place more limits on speech than US laws do, but rights to freedom of expression still provide ample room for the lawful spread of politically divisive messages.) In some cases, state actors (e.g., Russia and China) initiate and/or amplify these messages; in other cases, the messages may simply be “home grown.” How these messages are received is important, too. Most people who see a message of hate on X (formerly Twitter) or a bit of disinformation on TikTok will probably not be susceptible to the soft power lure of the antidemocratic content and will simply move on to the next message, one that is unlikely to be similarly problematic. Some people, however, will be attracted to the underlying antidemocratic message, and the resulting normalization of antidemocratic language and images is itself problematic.[14]
Wielding Malign Soft Power: Controlling Flows of Antidemocratic Information
Wielding malign soft power, the action of powering the soft power, means controlling flows of politically divisive, hate-filled, and/or mis- or disinformative messages through social networks, exposing recipients, some of whom are attracted by the antidemocratic messages. The flows have the properties of content (the messages that are divisive, hate-filled, or mis- or disinformation), velocity (the direction and speed of the messages), and access to them (by choice of the person who is exposed, by chance, or by force).[15] Focusing on these properties allows patterns of actions, rather than actors, to be the subjects of analysis and rule in an interactional, social sense (Szabla & Blommaert, 2020).
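To make the framework concrete, the three flow properties could be coded as a simple record in a hypothetical analysis pipeline. The sketch below is illustrative only; the field names and category labels are my own shorthand for the properties just defined, not an established coding scheme.

```python
from dataclasses import dataclass
from enum import Enum

class AccessMode(Enum):
    """How a recipient comes to be exposed to a flow (see above)."""
    CHOICE = "choice"   # recipient subscribed to or sought out the content
    CHANCE = "chance"   # broad, untargeted dissemination
    FORCE = "force"     # pushed to the recipient, e.g., by a feed algorithm

@dataclass
class InformationFlow:
    """One observed flow of messaging through a social network."""
    content: str        # e.g., "divisive", "hate", "misinformation", "disinformation"
    direction: str      # "direct" (source to expectant recipient) or "circuitous"
    speed: float        # observed messages per hour for this flow
    access: AccessMode  # how recipients are exposed

# Coding a hypothetical observation: a fast, circuitous disinformation
# flow that reaches recipients by chance exposure.
flow = InformationFlow(
    content="disinformation",
    direction="circuitous",
    speed=120.0,
    access=AccessMode.CHANCE,
)
print(flow)
```

Coding flows this way keeps the unit of analysis on the flow itself, so that patterns across many records, rather than the identities behind them, become the object of study.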
Content
The property of content refers to the messages being transmitted and the emotions they convey. Message density matters as well: The more often a particular message is received, the more correct it seems, because repetition makes it feel like common knowledge (Unkelbach et al., 2019). I focus here on three commonly deployed modes by which content works: seduction, trickery, and amusement.[16] When malign soft power seduces, tricks, or amuses, it does so through a combination of the semantic and emotional content of messages.
Seduction links semantic content to emotion by invoking desire in the recipient. Online, as in person, we recognize how attracting through seduction[17] (of the sexual (e.g., Faisinet, 1808) and nonsexual (e.g., Bjerg, 2016) varieties) can cause harm. Content eliciting these desires spreads online in various ways, including through social media and via niche websites. Antidemocratic seduction convinces people to feel attachments that are inconsistent with democracy, such as hatred for political opponents rather than simple disagreement with them. The spread of White Christian nationalist identitarianism in North America and Europe, a movement that is profoundly antidemocratic (Zuquete & Marchi, 2023), is an example. Individual participants produce and reproduce seduction through identitarian practices. For example, sharing “dog whistles” that express hatred in coded language gives participants a seductive sense that they are in on the secret (Marlin-Bennett, 2022). Much of the content of the antisemitic feed of Stew Peters (@realstewpeters) on X (formerly Twitter) seeks to draw a White Christian audience into solidarity against Jews. Peters, who has a following on X of over 500,000 accounts (some of which may be automated), also engages in trickery about vaccines and other alt-right topics.
Trickery substitutes falsehoods for facts, claiming that lies (statements known to be false) and/or bullshit (statements disconnected from the truth) (Frankfurt, 2005) are actually true. This practice is in play when disinformation is spread as misinformation by gullible users. For example, the Stew Peters Network disseminated disinformation about the COVID vaccine in a video, Died Suddenly (Skow & Stumphauzer, 2022), on “Rumble, a moderation-averse video-streaming platform” (Tiffany, 2023). Peters also promoted the video on X, Gab, and other social media sites. The many likes his tweets have received suggest that his falsehoods have been received by and are attractive to other users. The many reposts suggest that others spread his disinformation as misinformation.[18] Peters’ seductive messaging and lies connect with similar views disseminated by other individuals, creating interconnected networks of people who share antisemitism, White Christian nationalism, and anti-vaxx views. The widespread campaign to convince people that Joe Biden stole the 2020 US presidential election works similarly, drawing on trickery and often blending with the seductiveness of White Christian nationalism. Its success can be seen in a March 2023 CNN/SSRS poll that found that 63 percent of Republicans continued to believe this claim (Durkee, 2023). Individuals who believe the falsehood have been tricked by antidemocratic mis- or disinformation.
When amusement is used for antidemocratic purposes, the content links humor or other pleasures to messages that in other contexts would be transgressive. A racist joke does not seem as bad to those who find it amusing. The mode of amusement allows funny or entertaining content to seem acceptable even when it is harmful (Apel, 2009; Gaut, 1998). Topinka (2018) examines the Reddit platform and specifically the r/ImGoingToHellForThis subreddit (now banned), in which members used humor to express racism, misogyny, antisemitism, and extreme anti-immigrant sentiments. Topinka provides a close analysis of how the redditors treated the famous, haunting picture of Alan Kurdi, the two-year-old Syrian refugee who drowned along with his mother and brother in the Mediterranean Sea while fleeing to the Greek island of Kos. The posts remixed the image of the dead toddler on the beach into jokes that were at once racist and anti-immigrant extremist. As soft power practices, “[t]he very ostentation on which this humor relies thus functions as a cloak concealing the networks of racist sentiment that the discourse sustains” (Topinka, 2018, p. 2066). This now defunct subreddit and other similar sites simultaneously rely on the democratic principle of free speech and eschew core democratic values. More generally, humor is often used on social media platforms to catch the eye of the user who is scrolling through feeds, be they on TikTok or Instagram or another app, for a quick laugh. Amusement is antidemocratic when it works to normalize hate or to convince people of the truthfulness of fallacious claims relevant to current politics. Jokes may intensify the connection between humor and hate (Askanius, 2021; Marlin-Bennett & Jackson, 2022).
Velocity and Access
Velocity (direction and speed) and access are the other two properties through which soft power is wielded. Cyber-enabled technologies afford actors at all scales the capacity to manipulate both, though no specific technique is relevant solely for antidemocratic purposes. If the content is antidemocratic, then increasing its velocity[19] – that is, increasing the speed at which the messages move and the spread of the messaging – is a means of wielding soft power to undermine democracy. Directions can be direct (going from a source to an expectant recipient) or circuitous. Speeds of message transmission range from fast to slow and from constant to intermittent. The metaphor of a unit of information (e.g., a meme) “going viral” means it is spreading quickly and in multiple directions. Messages can also jump from platform to platform and across technologies (DiResta, 2018).
Bots are usually designed to increase velocity. Bessi and Ferrara (2016) analyzed tweets about the 2016 US presidential election during a five-week period in the fall of 2016. They found that approximately one-fifth of these tweets were generated by bots. They conclude:
The presence of social bots in online political discussion can create three tangible issues: first, influence can be redistributed across suspicious accounts that may be operated with malicious purposes; second, the political conversation can become further polarized; third, the spreading of misinformation and unverified information can be enhanced.
Micro-actors can also increase velocity, which they do by reposting and commenting, as well as by moving posts to new platforms (Marlin-Bennett & Jackson, 2022). Wielding malign soft power could also work by decreasing the velocity of messages that support democracy.
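The velocity effect of automated amplification can be illustrated with a toy diffusion model. The sketch below is a minimal simulation under invented parameters (network size, repost probabilities, bot reach), not a calibration of Bessi and Ferrara’s findings; it shows only that a small share of high-activity accounts can change how far and how fast a cascade travels.

```python
import random

def simulate(n_users=10_000, reach=20, bot_share=0.0, bot_boost=10,
             repost_prob=0.02, max_steps=50, seed=1):
    """Toy cascade: each active account exposes a random audience each step.

    Bots (a `bot_share` fraction of accounts) always repost and reach
    `bot_boost` times as many accounts per step. All numbers are illustrative.
    """
    rng = random.Random(seed)
    bots = set(rng.sample(range(n_users), int(n_users * bot_share)))
    exposed, active = {0}, {0}           # account 0 originates the message
    for step in range(1, max_steps + 1):
        newly_active = set()
        for poster in active:
            audience_size = reach * (bot_boost if poster in bots else 1)
            for user in rng.sample(range(n_users), audience_size):
                if user not in exposed:
                    exposed.add(user)
                    # Bots always amplify; humans repost only occasionally.
                    if user in bots or rng.random() < repost_prob:
                        newly_active.add(user)
        if not newly_active:             # cascade has died out
            return step, len(exposed)
        active = newly_active
    return max_steps, len(exposed)

steps, n = simulate(bot_share=0.0)
print(f"no bots: cascade ended after {steps} steps, {n} users exposed")
steps, n = simulate(bot_share=0.05)
print(f"5% bots: cascade ran {steps} steps, {n} users exposed")
```

In typical runs of this sketch, the bot-free cascade dies out after a few steps while even a modest bot share lets the same message saturate the network: an aggregate velocity effect of exactly the kind described above.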
Controlling access to information is another means of wielding soft power. At stake is whether someone is exposed – or not exposed (denied access) – to information by choice, by chance, or by force. A choice approach means providing access to those who have requested the information, those who are already attracted to the messages. This soft power doubles down on existing attachments. Forums like the r/ImGoingToHellForThis subreddit work this way: Users choose to subscribe and, in doing so, build community among those who are attracted to the divisiveness, hate, and mis- and disinformation. An access-by-chance approach is not targeted at any specific actor but instead involves providing access widely and anticipating that some who happen upon the messages will be swayed by them. The actors behind the bot accounts uncovered by Bessi and Ferrara (2016) probably disseminated manipulated information on X using a chance strategy of sending out a lot of content quickly and widely. The effectiveness of a chance approach depends on whether the messages find a core group of people who are receptive to them. Hindman and Barash (2018) also find more real news than fake news on X in the months before and after the 2016 election in the United States, but they remain concerned about the dense networks of followers of popular fake news accounts: “[T]he popularity of these accounts, and heavy co-followership among top accounts, means that fake news stories that reach the core (or start there) are likely to spread widely” (p. 4). (See also Grinberg et al., 2019.) The movement of antidemocratic messaging thus accelerates when it reaches a community disposed to be attracted to it. Forced access involves exposing a user to information in a targeted way but without prior subscription or other confirmation of willingness to receive it. Algorithms that display increasingly extreme messages to users force access to antidemocratic information upon them.
An algorithm’s control of the flow of antidemocratic information is an instance of deploying malign soft power, even though the algorithm itself does not have motivation in the way a human does. Algorithms that determine what appears in social media newsfeeds are part of an assemblage (Bennett, 2005) determining access to information. DeCook and Forestal argue that “digital platforms not only curate and channel certain content to individual users but also facilitate a particular mode of collective thinking that [they] term undemocratic cognition” (2023, p. 631).[20] The practice of curating and channeling is clearly an instance of soft power in which individuals are subject to attraction. Motivations, I suggest, are written into algorithms that collect data, analyze what would attract users in a way that serves the motivations of the assemblage, and then produce a newsfeed that gives users access to certain messaging and withholds access to other messaging. (And because of the vagaries of coding, the outcomes of this process may or may not be wholly what the firm’s management expected.) While “make money!” is a compelling motivation for many social media firms, concerns have been raised about Chinese-owned social media like TikTok (The Economist, 2023). Kokas notes that China seeks to “manipulate messaging to key [Western] constituencies” for (implicitly) antidemocratic ends (2022, p. 95; see also Zhong, 2023). Additionally, a failure to actively protect against the undermining of democracy, by declining to prevent a biased stream of information from dominating one’s newsfeed, also suggests a motivation that undervalues the protection of democracy. This is perhaps a lesser kind of maliciousness, one of omission rather than commission.
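How forced access can emerge from an engagement-maximizing ranker, without any line of code that says “radicalize the user,” can be sketched in a few lines. The engagement model, the extremity scores, and the preference-drift rule below are all invented assumptions for illustration; no real platform’s ranking system is represented.

```python
from dataclasses import dataclass

@dataclass
class Item:
    name: str
    extremity: float  # hypothetical coding: 0.0 = benign, 1.0 = maximally extreme

def predicted_engagement(item: Item, affinity: float) -> float:
    # Assumption: engagement peaks when content is slightly more extreme
    # than what the user currently engages with.
    return 1.0 - abs(item.extremity - (affinity + 0.1))

def top_of_feed(catalog: list[Item], affinity: float) -> Item:
    # The ranker simply serves whatever maximizes predicted engagement.
    return max(catalog, key=lambda item: predicted_engagement(item, affinity))

catalog = [Item(f"post-{i}", extremity=i / 10) for i in range(11)]
affinity = 0.2  # user starts with mild tastes
for round_number in range(6):
    shown = top_of_feed(catalog, affinity)
    # Preference drift: engaging with what is shown shifts future tastes.
    affinity = 0.5 * affinity + 0.5 * shown.extremity
    print(f"round {round_number}: shown {shown.name} "
          f"(extremity {shown.extremity:.1f}), affinity -> {affinity:.2f}")
```

Each round the ranker serves content just beyond the user’s current tastes, and the user’s tastes follow; in this toy form, the loop resembles the “curating and channeling” that DeCook and Forestal describe, produced by an innocuous-looking optimization rather than an explicitly antidemocratic instruction.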
Controls over content, access, and velocity usually operate together. For example, information flows that flood social media quickly move a large volume of messages, in terms of the number of posts and/or the amount of messaging within posts (Cirone & Hobbs, 2023). Flows characterized by steady streams of the same untruth contribute over time to making false claims seem true, as a function of “truth by repetition” (Lewandowsky et al., 2012; Morgan & Cappella, 2023; Unkelbach et al., 2019).
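The truth-by-repetition dynamic can be stated as a simple saturating curve: each additional exposure nudges perceived truth upward, with diminishing returns. The functional form and numbers below are illustrative assumptions, not estimates from the cited studies.

```python
import math

def perceived_truth(exposures: int, prior=0.30, ceiling=0.85, rate=0.35) -> float:
    """Perceived probability that a repeated claim is true.

    Starts at `prior` and rises toward `ceiling` as exposures accumulate,
    mimicking the illusory-truth effect. All parameters are invented.
    """
    return ceiling - (ceiling - prior) * math.exp(-rate * exposures)

for n in (0, 1, 3, 5, 10, 20):
    print(f"{n:>2} exposures -> perceived truth {perceived_truth(n):.2f}")
```

Combined with the diffusion sketch above, this suggests why flooding works: high velocity multiplies exposures, and repeated exposures do the attitudinal work.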
As Figure 4.1 summarizes, someone or something wields malign soft power for antidemocratic motives by controlling the flows of information. In the background of this analysis are those for whom the information flow is a hard power attack. In the foreground are the targets of malign soft power, the recipients of flows of attractive hate, mis- and disinformation, and other content that undermines democratic institutions. The intended recipients are those likely to be swayed by the information presented to them.

Figure 4.1 Wielding malign soft power.
Concluding Comments
This chapter makes a simple point: By carefully analyzing antidemocratic efforts using cyber-enabled technologies as malign soft power, we can see that the power of attraction is neither necessarily harmless nor even less harmful than the power of coercion. The affordances of social media platforms and other participatory media make it easy for emergent actors to contribute to efforts undermining democracy. Antidemocratic content that is seductive, deceptive, or amusing flows easily and quickly through social networks. Some actors further disseminate information cognizant of the nature of the content, willing participants in its spread. Others are seduced, distracted by the fun they are having, or simply duped; these unwitting actors further spread hate and mis- and disinformation.
Who the bad actors are may or may not be easy to discern. The low cost of malign soft power resources means that both existing actors and surprising new ones can wield this kind of power. Actors are emergent: Wielding malign soft power dynamically constitutes actors’ identities. Moreover, that information flows, that it moves over time and space, is key. Manipulating content, velocity, and access and thereby making divisiveness, hate, and mis- and disinformation more available, more quickly, more widely, and to more users is harmful. The common interest is in discovering means of preventing or, if necessary, stopping these power flows and remediating existing harms. While governance can also be harmful, especially if it limits freedom of speech, limits rights to privacy, and is targeted toward vulnerable populations, good governance methods do not single out individuals but rather look at patterns of flows. A purpose of this framework for analyzing antidemocratic soft power is to uncover possible intervention points, the points in the flow of information at which defensive mechanisms can prevent or remediate the harms of malign soft power.
Each of the properties of information flows (content, velocity, and access) provides opportunities for countering antidemocratic challenges, but democracy-affirming efforts must be undertaken in a manner that preserves freedom of information. I am mindful of Friedrich Nietzsche’s warning:[21] “He who fights with monsters should be careful lest he thereby become a monster. And if [you] gaze long into an abyss, the abyss will also gaze into [you]” (2012, p. 83, sec. 146).
Cyber-enabled technologies have become an essential part of life in many ways and are necessary for democracies to function, but these technologies also afford the means to disrupt democracy. Understanding how antidemocratic soft power works and is wielded is just one tool for building resilient democracies.
