
4 - Cyber Challenges to Democracy and Soft Power’s Dark Side

from Part I - Challenges to Democratic Institutions

Published online by Cambridge University Press

Scott J. Shackelford
Affiliation:
Indiana University, Bloomington
Frédérick Douzet
Affiliation:
Paris 8 University
Christopher Ankersen
Affiliation:
New York University

Summary

Despite initial hopes that advances in information technology would spread and deepen democracy around the world, new platforms for communicating have instead provided opportunities for the weakening of democracy. Social media, website hosting, messaging apps, and related technologies provide easy and cheap ways for micro-actors such as individuals and small groups (in addition to more traditional state and non-state actors) to wield soft power for antidemocratic purposes. This chapter probes how the malign version of soft power works by attracting targets through flows of information that seduce and trick audiences with mis- and disinformation as well as with divisive and hateful messaging. Focusing on malign soft power and how it is wielded through control of information flows (content, velocity, and access) provides a framework for assessing how cyber-enabled antidemocratic efforts take form and how new actors emerge.

Information

Type: Chapter
In: Securing Democracies: Defending Against Cyber Attacks and Disinformation in the Digital Age, pp. 74–92
Publisher: Cambridge University Press
Print publication year: 2026
This content is Open Access and distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives licence (CC BY-NC-ND 4.0), https://creativecommons.org/licenses/by-nc-nd/4.0/

4 Cyber Challenges to Democracy and Soft Power’s Dark Side

Despite initial hopes that advances in information technology would spread and deepen democracy around the world, new platforms for communicating have instead provided opportunities for the weakening of democracy. Social media, website hosting, messaging apps, and related technologies provide easy and cheap ways for micro-actors such as individuals and small groups (in addition to more traditional state and non-state actors) to wield soft power for antidemocratic purposes. Of course, the probability that any one soft power action on the part of a micro-actor will have a consequential effect in the world is minuscule. Yet cyber-enabled micro-actions by micro-actors can make a difference, often one that has a negative effect on democratic institutions. QAnon, the anti-vaxx movement, and the spread of racism, antisemitism, and Islamophobia online are just a few examples of the effects of aggregated micro exercises of soft power. Spreading mis- and disinformation as well as divisiveness and hateful messages through social media is a kind of malign soft power that undermines democracy.

Although soft power is generally thought to be the “good” kind of power, since it works by attraction rather than coercion,Footnote 1 it has a dark side (Marlin-Bennett, Reference Marlin-Bennett2022) when the motive is to harm – and I include undermining democracy as a kind of harm. While adversarial state actors or other large and well-resourced non-state actors can easily deploy malign soft power, micro-actors can as well because the costs of doing so are low.Footnote 2 Exactly who is acting is often unknown, which complicates how to fight back. Furthermore, even when micro-actors are behind antidemocratic activities, such efforts may hook up with the interests of states or non-state adversaries that seek to subvert democracy and weaken democratic states. Despite the difficulties of identifying the actors who wield this power, we can observe the wieldingFootnote 3 of it as flows of information move through social networks.

This chapter provides a framework for analyzing antidemocratic soft power that focuses on flows of antidemocratic messaging. I begin with a review of the initial hopes for information technologies’ contributions to a more democratic world and why those hopes were dashed. In the second section, I explain why it is reasonable to analyze these attacks on democracy by focusing first on the information as it flows rather than on the actors who are attacking. Actors, I argue, are emergent, which is especially relevant in the context of cyber-enabled technologies. In the third section, I focus on conceptualizing soft power, including its malign form that can be used to undermine democracy. In the fourth section, I examine the wielding of antidemocratic soft power through control of information flows (content, velocity, and access). The chapter ends with brief concluding comments.

Initial Hopes

In 2004, the internet became truly interactive. Previously, accessing the internet meant users seeing the information that was provided to them and providing the information that was requested of them. Web 2.0, the web of social media, allowed users to post content they chose to share online and to do so in a way that allowed their posts to be visible to their friends or interested others. This was widely considered to be a very good thing. By 2006, Time Magazine named “You” the person of the year, because “[y]ou control the Information Age” by providing content. Lev Grossman, then a technology writer for the magazine, and also a novelist, praised internet users – “for seizing the reins of global media, for founding and framing the new digital democracy, for working for nothing and beating the pros at their own game.” The tribute continues with a minor caveat, but ends with what in hindsight seems to be recklessly positive spin:

Sure, it’s a mistake to romanticize all this any more than is strictly necessary. Web 2.0 harnesses the stupidity of crowds as well as its wisdom. Some of the comments on YouTube make you weep for the future of humanity just for the spelling alone, never mind the obscenity and the naked hatred.

But that’s what makes all this interesting. Web 2.0 is a massive social experiment, and like any experiment worth trying, it could fail. … This is an opportunity to build a new kind of international understanding, not politician to politician, great man to great man, but citizen to citizen, person to person. It’s a chance for people to look at a computer screen and really, genuinely wonder who’s out there looking back at them. Go on. Tell us you’re not just a little bit curious.

(Grossman, Reference Grossman2006, n.p.)

Many others writing in the popular media and scholarly/scientific literature echoed this optimism, sometimes downplaying the obvious caveats. Canadian commentators Don Tapscott and Anthony D. Williams argued for a positive vision of the transformative nature of information technology-enabled interaction, focusing on the possibilities for collaboration to radically alter (in a good way) business and society (Tapscott & Williams, Reference Tapscott and Williams2006). In later work they also claimed that collaborative governance – citizens being able to weigh in on domestic and transborder policy issues – would improve the democracy of governments and open the world to democratic participation (Tapscott & Williams, Reference Tapscott and Williams2010). In an interview in CIO Insight, an information technology industry trade magazine, Williams lauded the potential of “governance webs” – interactive websites for policy deliberation and sharing of information. The new capacity for interactivity online “provides a mechanism for collaboration of public agencies, the private sector, community groups and citizens.” While Williams cautioned that Web 2.0 would not be a “silver bullet [delivering] world peace,” he still foresaw “a new golden age of democracy” (quoted in Klein, Reference Klein2008, p. 36).

Many of the early scholarly publications on Web 2.0 and its broader social consequences also made optimistic claims about the exciting democratizing potential of e-democracy, though scholarly works tended to be more moderate than publications written for a broader audience. Many scholars acknowledged that various hiccoughs, such as the possible lack of citizen interest in participating online, could limit the democratizing potential of this new technology. For example, Kalnes (Reference Kalnes2009), writing on the 2007 elections in Norway, found that Web 2.0 allowed for greater ease of participation for citizens who wished to engage, even though its effect on pluralism was limited. (See also Anttiroiko, Reference Anttiroiko2010; Boikos, Moutsoulas, & Tsekeris, Reference Boikos, Moutsoulas and Tsekeris2014; Breindl & Francq, Reference Breindl and Francq2008; Costa Sánchez & Piñeiro Otero, Reference Costa Sánchez and Piñeiro Otero2011; Parycek & Sachs, Reference Parycek and Sachs2010; Raddaoui, Reference Raddaoui and Beldhuis2012; Reddick & Aikins, Reference Reddick, Aikins, Reddick and Aikins2012.)

Yet, other scholars provided strong warnings. In an early article on this topic, Cammaerts (Reference Cammaerts2008, p. 359) notes the high hopes that analysts had for a more expansively pluralist society in which anyone could say what they wish (enabling the “radical plurality of the blogosphere”). More to the point, though, he identifies antidemocratic pressures, including those resulting from peer-to-peer intimidation leading to self-censorship, and “the existence of anti-publics, abusing the freedom of expression with the aim to weaken democracy and democratic values” (Cammaerts, Reference Cammaerts2008, p. 372).Footnote 4 (Also see concerns raised by, inter alia, Marlin-Bennett, Reference Marlin-Bennett2011; Schradie, Reference Schradie2011; Van Dijck & Nieborg, Reference Van Dijck and Nieborg2009.) In the years that have passed since those earlier assessments, Cammaerts’ and other scholars’ pessimism has been validated. The failure of this massive social experiment (to use Grossman’s term) has had and continues to have correspondingly negative social repercussions.

Despite prior naive expectations that social media and related forms of communication would only encourage people to embrace democracy, these cyber-enabled technologies have also opened opportunities for actors to use soft power to undermine democracy.

Why Not Start by Figuring Out Who the Bad Guys Are?

The analysis of power usually starts with an assessment of who (or what) is acting on whom. However, the nature of actors on social media and using related technologies is a moving target. All actors continue to change as a consequence of their interactions, and new and often surprising actors pop up (and often disappear). As Berenskötter (Reference Berenskötter2018) suggests, actors’ ontologies and their motives are constructed through their interactions. In this section, I discuss the governing logics of cyberspace and the emergence of actors to buttress my claim that it makes sense to first focus on the flows of information rather than on who is enacting the flows in the analysis of antidemocratic soft power.Footnote 5

Libertarian and Neoliberal Logics

Cyberspace, encompassing the interactions among users of websites, social media platforms, messaging, and related technologies, has been constituted through the libertarian and neoliberal logics that generate the technical and regulatory structures of internet development and policymaking. Cyberspace consequently usually permits anonymity, pseudonymity, and even spoofing.Footnote 6 Although early in the development of the internet many computer scientists and engineers held a soft socialist view that saw code as something to be shared and the internet as something of a public service, by the 1980s that had changed (in tandem with larger social shifts). Internet developers had adopted a Silicon Valley worldview: “‘Technolibertarianism’ became one of the central ideologies of the Internet” (Rosenzweig, Reference Rosenzweig1998, p. 1550; see also Borsook, Reference Borsook1996). Decentralized, participatory decision-making procedures, which in the earlier period signaled equality and camaraderie, were repurposed to fit libertarian norms of limited government intervention and individualism. The emphasis on individual liberty fits with allowing the individual to hide their identity as the default setting, making anonymity and pseudonymity permitted and warranted (see Berker, Reference Berker2022, on deontic and fitting categories). Put differently, it would have been possible to engineer into internet systems a strong requirement or an expectation that users would have to identify themselves truthfully to have an online presence. That did not happen.

The move to libertarian logics included a shift toward business and profit-making, prioritizing the commodification of information and the protection of intellectual property (Marlin-Bennett, Reference Marlin-Bennett2004), all of which overlap with neoliberal logics. The Clinton administration designed support for neoliberal logics into its formative internet policies, as evinced by the decision to open the internet to commercial activities and by the creation of the Internet Corporation for Assigned Names and Numbers (ICANN) as a private, nonprofit, multistakeholder organization that would perform internet governance functions. In doing so, the United States maintained its hegemonic position and the power of corporations in cyberspace (Taggart & Abraham, Reference Taggart and Abraham2023).Footnote 7 To require identification by default would have added friction to the system, which was optimized for efficient, fast transactions of a market. What this meant for users’ anonymity and pseudonymity was that there was no reason to change course and require that users identify themselves truly by default. Nor was there any requirement that a person (natural or legal) could have only one cyber identity. In short, the libertarian and neoliberal underpinning of the structure of the internet is constitutive of cyberspace in which users generally may keep their (multiple) identities hidden.

Actors as Emergent

Furthermore, actors and agencies are emergent (Abraham, Reference Abraham2022; Chatterje-Doody & Crilley, Reference Chatterje-Doody, Crilley, Stengel, MacDonald and Nabers2019; Dunn Cavelty & Jaeger, Reference Dunn Cavelty and Jaeger2015; Elder-Vass, Reference Elder-Vass2008), regardless of whether the actors are identified, pseudonymous, or anonymous. As Karen Barad explains: Actors “do not preexist their interactions; rather [actors] emerge through and as part of their entangled intra-relating.” Emergence continues as actors “are iteratively reconfigured through each intra-action” (Barad, Reference Barad2007, p. ix). When actors’ identities are reasonably stable, efforts to identify bad actors open opportunities for deterring their actions in the future. That is often the case when actors are states or recognized as institutionally coherent non-state actors, though even if they do exist in a recognizable form, they are changing.

The emergence of surprising new micro-actors whose messages undermine democracy is not a novel phenomenon of the Information Age. Henry Ford (1863–1947), well-known as the founder of Ford Motor Company, surprisingly became a leading proponent of antisemitic hate and disinformation. His hate-filled screed, The International Jew, was subsequently used for Nazi propaganda (Flink, Reference Flink2000). The anti-Black and antisemitic Ku Klux Klan, which was created in 1865 by a group of Confederate Army veterans in Pulaski, Tennessee, is another example (Baudouin, Reference Baudouin2011; Quarles, Reference Quarles1999).

However, cyber-enabled technologies, because of their affordability, make it easier for surprising new micro-actors to participate in spreading hate, divisiveness, and mis- and disinformation. Racism, antisemitism, Islamophobia, and other forms of intergroup hatred have resurged as they spread through cyber-enabled technologies, and a large proportion of people have been exposed to abusive content (ADL, 2023; Vidgen, Margetts, & Harris, Reference Vidgen, Margetts and Harris2019). Cyber-enabled technologies have allowed anti-vaxx groups to share anti-vaxx misinformation and promote vaccine hesitancy and rejection, eroding trust in public health agencies and causing a drop in vaccination rates, as these groups become a “political force” in democratic societies (Piper, Reference Piper2023; see also Burki, Reference Burki2020; Wilson & Wiysonge, Reference Wilson and Wiysonge2020). Perhaps most surprising is the pseudonymous QAnon, which first emerged on the 4chan social media site. QAnon adherents spread a bizarre conspiracy theory that combines alt-right divisiveness, disinformation about the outcome of the 2020 US presidential election, confabulation about the so-called deep state, and hatred for Jews, the LGBTQ+ community, and immigrants (QAnon | ADL, 2022). Many of the participants in the January 6, 2021, attack on the United States Capitol were QAnon followers (Lee et al., Reference Lee, Merizalde, Colautti, An and Kwak2022).

In the next two sections, I focus upon conceptualizing malign soft power and exploring how it is wielded using cyber-enabled technologies.

Conceptualizing Malign Soft Power

Cyber-enabled technologies can be used to wield soft or hard power. Hard power cyberattacks, including sabotage of critical infrastructureFootnote 8 and ransomware, are deployed only in adversarial situations. Soft power, on the other hand, seems almost friendly.Footnote 9 The usual view is that soft power practices “contribute to a positive image that endears nations with soft power to other nations, which in turn enhances these soft power nations’ influence in world politics” (Gallarotti, Reference Gallarotti2011, p. 32, stress added).Footnote 10 Those who advocate for relying on soft power over hard claim that soft power “cultivates cooperation and compliance in a much more harmonious context” than hard power does (Gallarotti, Reference Gallarotti2022, p. 384, stress removed).Footnote 11 Other scholars have questioned this positive view. Successfully wielding soft power – even of the most pleasant sort – is forceful, in the basic sense of getting others to do what they would not have otherwise done, which means that interests have been denied or manipulated (Bially Mattern, Reference Bially Mattern2005; Hayden, Reference Hayden2012, Reference Hayden2017). Soft power can also have negative unintended consequences (Johnson, Reference Johnson2011; Siekmeier, Reference Siekmeier2014). And to the point for understanding how soft power can negatively impact democracy: Soft power can be wielded in ways that undercut democracy.

As such, antidemocratic soft power is a kind of malign soft power, as opposed to good or neutral soft power. Malign soft power is a power of attraction used for harm (Marlin-Bennett, Reference Marlin-Bennett2022), but discerning that a particular action was motivated by a wish to do harm is difficult. Nevertheless, even when we cannot see who is acting – as is often the case with cyber-enabled technologies – we can reasonably infer the motivation behind an action by drawing upon our own practical reasoning. Members of society routinely make such judgments, and inferring motivations from actions and their consequences is a normal part of social life. That otherwise well-meaning people may accidentally do something with a malign consequence, or may be misconstrued, does not cancel out the quotidian way people adjudge actions. In addition, judgments of harm depend on the standpoint of whoever is making the determination,Footnote 12 and I acknowledge that my analysis comes from a pro-democracy position.Footnote 13 As I discuss in the next section, messages of divisiveness, hate, and mis- and disinformation become attractive (and therefore powerful) when they seduce, trick, or amuse those who are exposed to them into feeling that they share the sentiments.

Ironically, in Western democracies, wielding malign soft power is often legal, which further sets this kind of power apart from hard power. Democracy’s guarantee of freedom of speech makes punishing antidemocratic speech more difficult, though jurisdictions have different laws about the limits of protected speech. (European laws generally place more limits on speech than US laws do, but rights to freedom of expression still provide ample room for the lawful spread of politically divisive messages.) In some cases, state actors (e.g., Russia and China) initiate and/or amplify these messages; in other cases, they may simply be “home grown.” How these messages are received is important, too. Most people who see a message of hate on X (formerly Twitter) or a bit of disinformation on TikTok will probably not be susceptible to the soft power lure of the antidemocratic content and will simply move on to the next message, one that is unlikely to be similarly problematic. However, some people will be affected in that they will be attracted to the underlying antidemocratic message, and the normalization of antidemocratic language and images is itself problematic.Footnote 14

Wielding Malign Soft Power: Controlling Flows of Antidemocratic Information

Wielding malign soft power – the act of putting the power into effect – means controlling flows of messages that are politically divisive, hate-filled, and/or mis- or disinformation through social networks, exposing recipients, some of whom are attracted by the antidemocratic messages. The flows have the properties of content (the messages that are divisive, hate-filled, or mis- or disinformation), velocity (direction and speed of the messages), and access to them (by choice of the person who is exposed, by chance, or by force).Footnote 15 Focusing on these properties allows patterns of actions, rather than actors, to be the subjects of analysis and rule in an interactional, social sense (Szabla & Blommaert, Reference Szabla and Blommaert2020).

Content

The property of content refers to messages being transmitted and the emotions they convey. Message density is important as well, as the more times a particular message is received, the more it seems to be correct because it is common knowledge (Unkelbach et al., Reference Unkelbach, Koch, Silva and Garcia-Marques2019). I focus here on three commonly deployed modes by which the content works: seduction, trickery, and amusement.Footnote 16 When malign soft power seduces, tricks, or amuses, it does so through a combination of semantic and emotional content of messages.

Seduction links semantic content to emotion by invoking desire in the recipient. Online, as in person, we recognize how attracting through seductionFootnote 17 (of the sexual (e.g., Faisinet, Reference Faisinet1808) and nonsexual varieties (e.g., Bjerg, Reference Bjerg2016)) can cause harm. Content eliciting these desires spreads online in various ways, including through social media and via niche websites. Antidemocratic seduction convinces people to feel attachments that are inconsistent with democracy, such as hatred for political opponents, rather than simple disagreement with them. The spread of White Christian nationalist identitarianism in North America and Europe, a movement that is profoundly antidemocratic (Zuquete & Marchi, Reference Zuquete and Marchi2023), is an example. Individual participants produce and reproduce seduction through identitarian practices. For example, sharing “dog whistles” that express hatred in coded language gives participants a seductive sense that they are in on the secret (Marlin-Bennett, Reference Marlin-Bennett2022). Much of the content of the antisemitic feed of Stew Peters (@realstewpeters) on X (formerly Twitter) seeks to draw a White Christian audience into solidarity against Jews. Peters, who has a following on X of over 500,000 accounts (some of which may be automated), also engages in trickery about vaccines and other alt-right topics.

Trickery substitutes falsehoods for facts, claiming that lies (known to be false) and/or bullshit (statements that are disconnected from the truth) (Frankfurt, Reference Frankfurt2005) are actually true. This practice is in play when disinformation is spread as misinformation by gullible users. For example, the Stew Peters Network disseminated disinformation about the COVID vaccine in a video, Died Suddenly (Skow & Stumphauzer, Reference Skow and Stumphauzer2022), on “Rumble, a moderation-averse video-streaming platform” (Tiffany, Reference Tiffany2023). Peters also promoted the video on X, Gab, and other social media sites. The many likes his tweets have received suggest that his falsehoods have been received by and are attractive to other users. The many reposts suggest that others spread his disinformation as misinformation.Footnote 18 Peters’ seductive messaging and lies connect with similar views disseminated by other individuals, creating interconnected networks of people who share antisemitism, White Christian nationalism, and anti-vaxx views. The widespread campaign to convince people that Joe Biden stole the 2020 US presidential election works similarly, drawing on trickery and often blending with the seductiveness of White Christian nationalism. Its success can be seen in a March 2023 CNN/SSRS poll that found that 63 percent of Republicans continued to believe it (Durkee, Reference Durkee2023). Individuals who believe in this falsehood have been tricked by antidemocratic mis- or disinformation.

When amusement is used for antidemocratic purposes, the content links humor or other pleasures to messages that in other contexts would be transgressive. A racist joke does not seem as bad to those who find it amusing. The mode of amusement allows funny or entertaining content to seem acceptable even when it is harmful (Apel, Reference Apel2009; Gaut, Reference Gaut1998). Topinka (Reference Topinka2018) examines the Reddit platform and specifically the r/ImGoingToHellForThis subreddit (now banned), in which members of the subreddit used humor to express racism, misogyny, antisemitism, and extreme anti-immigrant sentiments. Topinka provides a close analysis of how the redditors treated the famous, haunting picture of Alan Kurdi, the two-year-old Syrian refugee who drowned along with his mother and brother in the Mediterranean Sea while fleeing to the Greek isle of Cos. The posts remixed the image of the dead toddler on the beach into jokes that were at once racist and anti-immigrant extremist. As soft power practices, “[t]he very ostentation on which this humor relies thus functions as a cloak concealing the networks of racist sentiment that the discourse sustains” (Topinka, Reference Topinka2018, p. 2066). This now defunct subreddit and other similar sites simultaneously rely on the democratic principle of free speech and eschew core democratic values. More generally, humor is often used on social media platforms to catch the eye of the user who is scrolling through feeds, be they TikTok or Instagram or another app, to find a quick laugh. Amusement is antidemocratic when it works to normalize hate or convince people of the truthfulness of fallacious claims relevant to current politics. Jokes may intensify the connection between humor and hate (Askanius, Reference Askanius2021; Marlin-Bennett & Jackson, Reference Marlin-Bennett and Jackson2022).

Velocity and Access

Controlling velocity (direction and speed) and controlling access to messages are the other two means of acting on soft power. Cyber-enabled technologies afford actors at all scales the capacity to manipulate these, but it is not necessarily the case that any specific techniques are relevant solely for antidemocratic purposes. If the content is antidemocratic, then increasing its velocityFootnote 19 – that is, increasing the speed at which the messages move and the spread of the messaging – is a means of wielding soft power to undermine democracy. Directions can be direct (going from a source to an expectant recipient) or circuitous. Speeds of message transmission range from fast to slow, and from constant to intermittent. The metaphor of a unit of information (e.g., a meme) “going viral” means it is spreading quickly and in multiple directions. Messages can also jump from platform to platform and across technologies (DiResta, Reference DiResta2018).

Bots are usually designed to increase velocity. Bessi and Ferrara analyzed tweets about the 2016 presidential election during a five-week period in the fall of 2016. They found that approximately one-fifth of these tweets were generated by bots. They conclude:

The presence of social bots in online political discussion can create three tangible issues: first, influence can be redistributed across suspicious accounts that may be operated with malicious purposes; second, the political conversation can become further polarized; third, the spreading of misinformation and unverified information can be enhanced.

(Bessi & Ferrara, Reference Bessi and Ferrara2016, n.p.)

Micro-actors can also increase velocity, which they do by reposting and commenting, as well as by moving posts to new platforms (Marlin-Bennett & Jackson, Reference Marlin-Bennett and Jackson2022). Wielding malign soft power could also work by decreasing the velocity of messages that support democracy.

Controlling access to information is another means of wielding soft power. At stake is whether someone is exposed – or not exposed (denied access) – to information by choice, by chance, or by force. A choice approach means providing access to those who have requested the information, those who are already attracted to the messages. This soft power doubles down on existing attachments. Social media sites like r/ImGoingToHellForThis work this way. Users choose to subscribe and, in doing so, build community among those who are attracted to the divisiveness, hate, and mis- and disinformation. An access-by-chance approach is not targeted at any specific actor but instead involves providing access widely and anticipating that some who happen upon the messages will be swayed by them. The actors behind bot accounts uncovered by Bessi and Ferrara (Reference Bessi and Ferrara2016) probably disseminated manipulated information on X using a chance strategy of sending out a lot of content quickly and widely. The effectiveness of a chance approach depends on whether the messages find a core group of people who are receptive to it. Hindman and Barash (Reference Hindman and Barash2018) also find more real news than fake news on X in the months before and after the 2016 election in the United States, but they remain concerned about the dense networks of followers of popular fake news accounts: “[T]he popularity of these accounts, and heavy co-followership among top accounts, means that fake news stories that reach the core (or start there) are likely to spread widely” (p. 4). (See also Grinberg et al. Reference Grinberg, Joseph, Friedland, Swire-Thompson and Lazer2019.) The movement of antidemocratic messaging thus accelerates when it reaches a community who are disposed to be attracted to it. Forced access involves exposing a user to information in a targeted way but without prior subscription or other sort of confirmation of willingness to receive it. Algorithms that display increasingly extreme messages to users force access to antidemocratic information upon them.

An algorithm’s control of the flow of antidemocratic information is an instance of deploying malign soft power even though the algorithm itself does not have motivation in the same way a human does. Algorithms that determine what appears in social media newsfeeds are part of an assemblage (Bennett, Reference Bennett2005) determining access to information. DeCook and Forestal argue that “digital platforms not only curate and channel certain content to individual users but also facilitate a particular mode of collective thinking that [they] term undemocratic cognition” (2023, p. 631).Footnote 20 The practice of curating and channeling is clearly an instance of soft power in which individuals are subject to attraction. Motivations, I suggest, are written into algorithms that collect data, analyze what would attract users in a way that serves the motivations of the assemblage, and then produce a newsfeed that gives users access to certain messaging and withholds access to other messaging. (And because of the vagaries of coding, the outcomes of this process may or may not be wholly what the firm’s management had expected.) While “make money!” is a compelling motivation for many social media firms, concerns have been raised about Chinese-owned social media like TikTok (The Economist, 2023). Kokas notes that China seeks to “manipulate messaging to key [Western] constituencies” for (implicitly) antidemocratic ends (2022, p. 95; see also Zhong, Reference Zhong2023). Additionally, a failure to actively protect against undermining democracy by preventing access to a biased stream of information in one’s newsfeed also suggests a motivation that undervalues the protection of democracy. This is perhaps a lesser kind of maliciousness, one of omission rather than commission.

Controls over content, access, and velocity usually operate together. For example, information flows that flood social media quickly move a large volume of messages, in terms of the number of posts and/or the amount of content within posts (Cirone & Hobbs, 2023). Flows characterized by steady streams of the same untruth contribute over time to making those false claims seem true, a function of “truth by repetition” (Lewandowsky et al., 2012; Morgan & Cappella, 2023; Unkelbach et al., 2019).

As Figure 4.1 summarizes, someone or something wields malign soft power for antidemocratic motives by controlling the flows of information. In the background of this analysis are those for whom the information flow is a hard power attack. In the foreground are the targets of malign soft power, the recipients of flows of attractive hate, mis- and disinformation, and other content that undermines democratic institutions. The intended recipients are those likely to be swayed by the information presented to them.

Figure 4.1 Wielding malign soft power. Antidemocratic information wielded as malign soft power targets those who can be manipulated and are susceptible to coercion; the information flow is determined by content, velocity, and access.

Concluding Comments

This chapter makes a simple point: By carefully analyzing antidemocratic efforts that use cyber-enabled technologies as malign soft power, we can see that the power of attraction is not necessarily harmless, nor even necessarily less harmful than the power of coercion. The affordances of social media platforms and other participatory media make it easy for emergent actors to contribute to efforts to undermine democracy. Antidemocratic content that is seductive, deceptive, or amusing flows easily and quickly through social networks. Some new actors further disseminate information fully cognizant of the nature of the content, willing participants in its spread. Others are seduced, distracted by the fun they are having, or simply duped. These are the unwitting actors who further spread hate and mis- and disinformation.

Who the bad actors are may or may not be easy to discern. The low cost of malign soft power resources means that both existing actors and surprising new ones can wield this kind of power. Actors are emergent: Wielding malign soft power dynamically constitutes actors’ identities. Moreover, that information flows – that it moves over time and space – is key. Manipulating content, velocity, and access, and thereby making divisiveness, hate, and mis- and disinformation more available, more quickly, more widely, and to more users, is harmful. The common interest lies in discovering means of preventing or, if necessary, stopping these power flows and remediating existing harms. Governance can itself be harmful, especially if it limits freedom of speech, curtails rights to privacy, or targets vulnerable populations; good governance methods therefore do not single out individuals but rather look at patterns of flows. A purpose of this framework for analyzing antidemocratic soft power is to uncover possible intervention points: the points in the flow of information at which defensive mechanisms can prevent or remediate the harms of malign soft power.

Each of the properties of information flows – content, velocity, and access – provides opportunities for countering antidemocratic challenges, but democracy-affirming efforts must be undertaken in a manner that preserves freedom of information. I am mindful of Friedrich Nietzsche’s warning:[21] “He who fights with monsters should be careful lest he thereby become a monster. And if [you] gaze long into an abyss, the abyss will also gaze into [you]” (2012, p. 83, sec. 146).

Cyber-enabled technologies have become an essential part of life in many ways and are necessary for democracies to function, but these technologies also afford the means to disrupt democracy. Understanding how antidemocratic soft power works and is wielded is just one tool for building resilient democracies.

Footnotes

I presented an early draft of this chapter at the 2023 International Studies Association convention in Montreal. My thanks to fellow panelists Christopher Ankersen and John Bonilla, as well as the audience members, including Greg Olsen, for their critical engagement and helpful observations. Scott Shackelford and Frédérick Douzet provided extremely useful critique during a workshop for the present volume. I am also grateful to Nandini Dey, Thomas Risse, Richard Katz, Sebastian Schmidt, Susan T. Jackson, and Patrick Thaddeus Jackson for clarifying questions and constructive criticism on my larger project on malign soft power and information flows.

1 The term “soft power,” as the power of attraction rather than coercion, was coined by Joseph Nye (1990a, 1990b). The term encompasses public diplomacy efforts (Hayden, 2017; Nye, 2008) as well as the efforts of other actors who are not working at the direction of states (Zaharna, 2010). This chapter only briefly mentions antidemocratic hard power. Hacking voting machines to corrupt their data files would be an example of hard power and coercion rather than soft power and attraction. In cyberspace, as in the physical world, hard power actions often happen alongside soft power ones.

2 Nye himself notes the advent of twenty-first century information technology “means … that foreign policy will not be the sole province of governments. Both individuals and private organizations, here and abroad, will be empowered to play direct roles in world politics” (Nye, 2002, p. 61). The context here is that Nye foresees greater volatility in the kinds of soft power efforts that circulate via information technologies. Malign efforts are not excluded from the future as he imagines it in 2002, but his emphasis is on the positive aspects of soft power: advocacy for human rights and actions taken in the global public interest.

3 I use the gerund form, wielding, intentionally (albeit awkwardly) to indicate the action as it unfolds. Unknown actors wield power by controlling flows of information.

4 Cammaerts also identifies the commodification of the internet, unequal power leading to censorship by states and intimidation by employers, and the outsized ability of elites to further their own interests.

5 I do not mean to imply that efforts to demand accountability by identifying actors are worthless or counterproductive. Such efforts may be easier or more necessary in cases of hard power attacks. Governments and nongovernmental organizations do attempt to identify who is acting when malicious actions are detected (Mueller et al., 2019; The RAND Corporation, 2019). Herbert Lin (2016) discusses how to identify those who are responsible for hard power cyberattacks. He separates aspects of attribution into the machine or machines that originated the malicious action, the human who took action, and the “ultimate responsible party.” Such efforts are easier, I believe, in the context of specific hard power, such as ransomware attacks. The ubiquity of malign soft power makes tracking down culprits, including the misguided people who do not understand that they are sharing misinformation, a Sisyphean task.

6 Spoofing can be considered a form of pseudonymity.

7 Internet pioneers would have preferred privatization without the United States maintaining a good deal of control (Mueller, 1999).

8 For example, Iranian-backed hackers attacked an Israeli-made industrial control device used by water authorities in a number of United States systems in October of 2023 (Bajak & Levy, 2023).

9 A full-fledged review of the voluminous literature on soft power in International Relations and other fields of scholarship is beyond the scope of this chapter, but some additional IR works include: Gilboa (2008), Henne (2022), Wilson (2008), Goldsmith and Horiuchi (2012), Goldthau and Sitter (2015), Nye (2021), Keating and Kaczmarska (2019), and Tella (2023).

10 Nye’s quote from this early article refers to “countries” rather than the more general “actors” I have substituted here. Most assessments of soft power, including works by Nye (e.g., 2019), also hold that non-state actors, such as small groups and individuals, can wield soft power. Soft power is a strategy micro-actors can use to similarly endear themselves to other actors and become influential in world politics. Soft power can be generated by using resources that are relatively cheap and accessible, a much lower bar than micro-actors would face if they wished to deploy hard power. See also Zahran and Ramos (2010) and Nimmo et al. (2019).

11 Some efforts at wielding soft power do seem quite positive in effect, and it is hard to find fault in some cultural exchanges that promote democracy (e.g., Atkinson, 2010).

12 For example, patriotic Russians who support Putin and are suspicious of too much freedom will certainly have different views of what is harmful than will supporters of Western liberal democracy.

13 I also acknowledge the inescapable tension between democracy’s expansive protections of freedom of speech and the dangers of hateful speech and speech that spreads lies and bullshit (Frankfurt, 2005).

14 Soft power can be adversarial without being malign, though the determination depends on standpoint and context. State actors routinely use soft power actions to portray themselves as more attractive in an effort to bolster their own image and undercut adversary countries’ relations with others. For example, China’s Belt and Road Initiative has adversarial elements to it, but in practical terms, it seems to fall short of the threshold for being malign.

15 Common sense tells us that actors wield power through material means, through informational means, or through a combination. Examples are bullets (material), brainwashing (informational), and Foucault’s panopticon (combination). This chapter focuses on wielding power with information, but it is also possible to translate the material into the informational for the purpose of analysis. The bullet informs the wounded body of injury and triggers signals of pain (Marlin-Bennett, 2013).

16 Other modes are possible. Modes often work in combination.

17 I am referring here to a plain-language understanding of “seduction” and “seduce” such as one finds in an English language dictionary (e.g., “Seduce, v.,” 2023). For malign soft power, the OED Online’s second definition of seduce – “To lead (a person) astray in action, conduct, or belief; to draw (a person) away from the right or intended course of action to or into a wrong or misguided one; to entice or beguile (a person) to do something wrong, foolish, or unintended” – is particularly apt and helpfully covers both sexual and nonsexual kinds of conduct. For a more complex rendering, see Felman (2003). Baudrillard’s and Freud’s treatments of seduction are not useful in this context. Laura Sjoberg (2018) cautions that news accounts of women around the Islamic State frequently portray them as nonagentic and subject to being duped because of their inherent feminine characteristics. Her argument is a compelling reminder that all secondhand accounts are imperfect retellings of actual motivations, knowledge, and desires.

18 Likes and reposts are generally indications that a message has been received with approval – in other words, that the soft power worked. However, that is not always the case. Likes and reposts can be generated by bots or cyborgs, and people who disapprove of a message sometimes repost it in order to critique it.

19 Boichak et al. (2021), citing Yang and Counts (2010), capture velocity in their study of speed, scale, and range: “[S]peed […] reflects temporality of information events; scale […] speaks to the visibility of a message on a platform through popularity metrics, such as ‘likes’ and ‘retweets’; and range […] denotes the depth of diffusion once the message gets propagated through user networks, reaching new audiences in the process” (Yang & Counts, 2010, pp. 356–357).

20 Some studies (e.g., Brown et al., 2022; Chen et al., 2023) have found that YouTube algorithms do not radicalize users by sending them “down the rabbit hole” but that YouTube does make extremist content available. Other studies do find that algorithms can serve to radicalize. See Whittaker et al. (2021).

21 Antoinette Verhage (2009, p. 9), writing on similar conundrums of how to find the right balance between governance to limit harm and not impinging on rights, includes this quote.

References

Abraham, K. J. (2022). Midcentury modern: The emergence of stakeholders in democratic practice. American Political Science Review, 116(2), 631–644. https://doi.org/10.1017/S0003055421001106
Anttiroiko, A.-V. (2010). Innovation in democratic e-governance: Benefitting from Web 2.0 applications in the public sector. International Journal of Electronic Government Research, 6(2), 18–36. https://doi.org/10.4018/jegr.2010040102
Apel, D. (2009). Just joking? Chimps, Obama and racial stereotype. Journal of Visual Culture, 8(2), 134–142. https://doi.org/10.1177/14704129090080020203
Askanius, T. (2021). On frogs, monkeys, and execution memes: Exploring the humor-hate nexus at the intersection of Neo-Nazi and Alt-Right movements in Sweden. Television & New Media, 22(2), 147–165. https://doi.org/10.1177/1527476420982234
Atkinson, C. (2010). Does soft power matter? A comparative analysis of student exchange programs 1980–2006. Foreign Policy Analysis, 6(1), 1–22. https://doi.org/10.1111/j.1743-8594.2009.00099.x
Bajak, F., & Levy, M. (2023, December 2). Breaches by Iran-affiliated hackers spanned multiple U.S. states, federal agencies say. AP News. https://apnews.com/article/hackers-iran-israel-water-utilities-critical-infrastructure-cisa-554b2aa969c8220016ab2ef94bd7635b
Barad, K. (2007). Meeting the universe halfway: Quantum physics and the entanglement of matter and meaning. Duke University Press.
Baudouin, R. (Ed.). (2011). The Ku Klux Klan: A history of racism and violence (6th ed.). Southern Poverty Law Center. https://splcenter.org/sites/default/files/Ku-Klux-Klan-A-History-of-Racism.pdf
Bennett, J. (2005). The agency of assemblages and the North American blackout. Public Culture, 17(3), 445–465. https://doi.org/10.1215/08992363-17-3-445
Berenskötter, F. (2018). Deep theorizing in International Relations. European Journal of International Relations, 24(4), 814–840. https://doi.org/10.1177/1354066117739096
Berker, S. (2022). Fittingness: Essays in the philosophy of normativity (pp. 23–57). Oxford University Press. https://doi.org/10.1093/oso/9780192895882.001.0001
Bessi, A., & Ferrara, E. (2016). Social bots distort the 2016 U.S. Presidential election online discussion. First Monday. https://doi.org/10.5210/fm.v21i11.7090
Bially Mattern, J. (2005). Why soft power isn’t so soft: Representational force and the sociolinguistic construction of attraction in world politics. Millennium: Journal of International Studies, 33(3), 583–612. https://doi.org/10.1177/03058298050330031601
Bjerg, O. (2016). How is Bitcoin money? Theory, Culture & Society, 33(1), 53–72. https://doi.org/10.1177/0263276415619015
Boichak, O., Hemsley, J., Jackson, S., Tromble, R., & Tanupabrungsun, S. (2021). Not the bots you are looking for: Patterns and effects of orchestrated interventions in the U.S. and German elections. International Journal of Communication, 15, 814–839. https://ijoc.org/index.php/ijoc/article/view/14866
Boikos, C., Moutsoulas, K., & Tsekeris, C. (2014). The real of the virtual: Critical reflections on Web 2.0. tripleC: Communication, Capitalism & Critique, 12(1), 405–412. https://doi.org/10.31269/triplec.v12i1.566
Borsook, P. (1996, August). Cyberselfish. Mother Jones. https://motherjones.com/politics/1996/07/cyberselfish/
Breindl, Y., & Francq, P. (2008). Can Web 2.0 applications save e-democracy? A study of how new internet applications may enhance citizen participation in the political process online. International Journal of Electronic Democracy, 1(1), 14–31. https://doi.org/10.1504/IJED.2008.021276
Brown, M. A., Bisbee, J., Lai, A., Bonneau, R., Nagler, J., & Tucker, J. (2022). Echo chambers, rabbit holes, and algorithmic bias: How YouTube recommends content to real users (SSRN Scholarly Paper 4114905). https://doi.org/10.2139/ssrn.4114905
Burki, T. (2020). The online anti-vaccine movement in the age of COVID-19. The Lancet Digital Health, 2(10), e504–e505. https://doi.org/10.1016/S2589-7500(20)30227-2
Cammaerts, B. (2008). Critiques on the participatory potentials of Web 2.0. Communication, Culture and Critique, 1(4), 358–377. https://doi.org/10.1111/j.1753-9137.2008.00028.x
Chatterje-Doody, P., & Crilley, R. (2019). Populism and contemporary global media: Populist communication logics and the co-construction of transnational identities. In Stengel, F. A., MacDonald, D. B., & Nabers, D. (Eds.), Populism and world politics: Exploring inter- and transnational dimensions (pp. 73–99). Springer International Publishing. https://doi.org/10.1007/978-3-030-04621-7_4
Chen, A. Y., Nyhan, B., Reifler, J., Robertson, R. E., & Wilson, C. (2023). Subscriptions and external links help drive resentful users to alternative and extremist YouTube channels. Science Advances, 9(35). https://doi.org/10.1126/sciadv.add8080
Cirone, A., & Hobbs, W. (2023). Asymmetric flooding as a tool for foreign influence on social media. Political Science Research and Methods, 11(1), 160–171. https://doi.org/10.1017/psrm.2022.9
Costa Sánchez, C., & Piñeiro Otero, T. (2012). Social activism in the Web 2.0. Spanish 15m movement. Vivat Academia, 117, 1458–1467. https://doi.org/10.15178/va.2011.117E.1458-1467
DeCook, J., & Forestal, J. (2023). Of humans, machines, and extremism: The role of platforms in facilitating undemocratic cognition. American Behavioral Scientist, 67(5), 629–648. https://doi.org/10.1177/00027642221103186
DiResta, R. (2018, November 8). Of virality and viruses: The anti-vaccine movement and social media. Nautilus Institute. https://nautilus.org/napsnet/napsnet-special-reports/of-virality-and-viruses-the-anti-vaccine-movement-and-social-media/
Dunn Cavelty, M., & Jaeger, M. (2015). (In)visible ghosts in the machine and the powers that bind: The relational securitization of anonymous. International Political Sociology, 9(2), 176–194. https://doi.org/10.1111/ips.12090
Durkee, A. (2023, March 14). Republicans increasingly realize there’s no evidence of election fraud – but most still think 2020 election was stolen anyway, poll finds. Forbes. https://forbes.com/sites/alisondurkee/2023/03/14/republicans-increasingly-realize-theres-no-evidence-of-election-fraud-but-most-still-think-2020-election-was-stolen-anyway-poll-finds/
Elder-Vass, D. (2008). Searching for realism, structure and agency in actor network theory. The British Journal of Sociology, 59(3), 455–473. https://doi.org/10.1111/j.1468-4446.2008.00203.x
Faisinet, N. (1808). On seduction. The Lady’s monthly museum, or Polite repository of amusement and instruction: Being an assemblage of whatever can tend to please the fancy, interest the mind, or exalt the character of the British fair. / By a society of ladies, 1798–1828, 5, 286–289.
Felman, S. (2003). The scandal of the speaking body: Don Juan with J. L. Austin, or seduction in two languages. Stanford University Press.
Flink, J. (2000). Ford, Henry (1863–1947), automobile manufacturer. In American National Biography. Oxford University Press. https://anb-org.proxy1.library.jhu.edu/display/10.1093/anb/9780198606697.001.0001/anb-9780198606697-e-1000578
Frankfurt, H. (2005). On bullshit. Princeton University Press. https://doi.org/10.1515/9781400826537
Gallarotti, G. (2011). Soft power: What it is, why it’s important, and the conditions for its effective use. Journal of Political Power, 4(1), 25–47. https://doi.org/10.1080/2158379X.2011.557886
Gallarotti, G. (2022). Esteem and influence: Soft power in international politics. Journal of Political Power, 15(3), 383–396. https://doi.org/10.1080/2158379X.2022.2135303
Gaut, B. (1998). Just joking: The ethics and aesthetics of humor. Philosophy and Literature, 22(1), 51–68. https://doi.org/10.1353/phl.1998.0014
Gilboa, E. (2008). Searching for a theory of public diplomacy. The Annals of the American Academy of Political and Social Science, 616(1), 55–77. https://doi.org/10.1177/0002716207312142
Goldsmith, B. E., & Horiuchi, Y. (2012). In search of soft power: Does foreign public opinion matter for US foreign policy? World Politics, 64(3), 555–585. https://doi.org/10.1017/S0043887112000123
Goldthau, A., & Sitter, N. (2015). Soft power with a hard edge: EU policy tools and energy security. Review of International Political Economy, 22(5), 941–965. https://doi.org/10.1080/09692290.2015.1008547
Grinberg, N., Joseph, K., Friedland, L., Swire-Thompson, B., & Lazer, D. (2019). Fake news on Twitter during the 2016 U.S. presidential election. Science, 363(6425), 374–378. https://doi.org/10.1126/science.aau2706
Grossman, L. (2006). You, yes, you, are TIME’s person of the year. TIME Magazine, 168(26), 38–41.
Hayden, C. (2012). The rhetoric of soft power: Public diplomacy in global contexts. Lexington Books.
Hayden, C. (2017). Scope, mechanism, and outcome: Arguing soft power in the context of public diplomacy. Journal of International Relations and Development, 20(2), 331–357. https://doi.org/10.1057/jird.2015.8
Henne, P. (2022). What we talk about when we talk about soft power. International Studies Perspectives, 23(1), 94–111. https://doi.org/10.1093/isp/ekab007
Hindman, M., & Barash, V. (2018, October 4). Disinformation, “fake news” and influence campaigns on Twitter. Knight Foundation. https://knightfoundation.org/reports/disinformation-fake-news-and-influence-campaigns-on-twitter/
Johnson, C. G. (2011). The urban precariat, neoliberalization, and the soft power of humanitarian design. Journal of Developing Societies, 27(3–4), 445–475.
Kalnes, O. (2009). Norwegian parties and Web 2.0. Journal of Information Technology & Politics, 6(3–4), 251–266. https://doi.org/10.1080/19331680903041845
Keating, V., & Kaczmarska, K. (2019). Conservative soft power: Liberal soft power bias and the ‘hidden’ attraction of Russia. Journal of International Relations and Development, 22(1), 1–27. https://doi.org/10.1057/s41268-017-0100-6
Klein, P. (2008). Web 2.0: Reinventing democracy. CIO Insight, 92, 30–36.
Kokas, A. (2022). Trafficking data: How China is winning the battle for digital sovereignty. Oxford University Press. https://doi.org/10.1093/oso/9780197620502.001.0001
Lee, C., Merizalde, J., Colautti, J., An, J., & Kwak, H. (2022). Storm the Capitol: Linking offline political speech and online Twitter extra-representational participation on QAnon and the January 6 insurrection. Frontiers in Sociology, 7. https://doi.org/10.3389/fsoc.2022.876070
Lewandowsky, S., Ecker, U., Seifert, C., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106–131. https://doi.org/10.1177/1529100612451018
Lin, H. (2016, September 20). Attribution of malicious cyber incidents: From soup to nuts. Lawfare. https://lawfareblog.com/attribution-malicious-cyber-incidents-soup-nuts-0
Marlin-Bennett, R. (2004). Knowledge power: Intellectual property, information, and privacy. Lynne Rienner Publishers.
Marlin-Bennett, R. (2011). I hear America tweeting and other themes for a virtual polis: Rethinking democracy in the global infotech age. Journal of Information Technology & Politics, 8(2), 129–145. https://doi.org/10.1080/19331681.2011.53675
Marlin-Bennett, R. (2013). Embodied information, knowing bodies, and power. Millennium: Journal of International Studies, 41(3), 601–622. https://doi.org/10.1177/0305829813486413
Marlin-Bennett, R. (2022). Soft power’s dark side. Journal of Political Power, 15(3), 437–455. https://doi.org/10.1080/2158379X.2022.2128278
Marlin-Bennett, R., & Jackson, S. (2022). DIY cruelty: The global political micro-practices of hateful memes. Global Studies Quarterly, 2(2). https://doi.org/10.1093/isagsq/ksac002
Morgan, J., & Cappella, J. (2023). The effect of repetition on the perceived truth of tobacco-related health misinformation among U.S. adults. Journal of Health Communication, 28(3), 182–189. https://doi.org/10.1080/10810730.2023.2192013
Mueller, M. (1999). ICANN and Internet governance: Sorting through the debris of “self-regulation”. Info, 1(6), 497–520. https://doi.org/10.1108/14636699910801223
Mueller, M., Grindal, K., Kuerbis, B., & Badiei, F. (2019). Cyber attribution: Can a new institution achieve transnational credibility? The Cyber Defense Review, 4(1), 107–122.
Nietzsche, F. (2012). Beyond good and evil. Andrews UK. http://ebookcentral.proquest.com/lib/jhu/detail.action?docID=977667
Nimmo, B., Eib, C., & Tamora, L. (2019). Cross-platform spam network targeted Hong Kong protests: “Spamouflage dragon” used hijacked and fake accounts to amplify video content. Graphika. https://public-assets.graphika.com/reports/graphika_report_spamouflage.pdf
Nye, J. (1990a). Bound to lead: The changing nature of American power. Basic Books.
Nye, J. (1990b). Soft power. Foreign Policy, 80, 153–171. https://doi.org/10.2307/1148580
Nye, J. (2002). The information revolution and American soft power. Asia-Pacific Review, 9(1), 60–76. https://doi.org/10.1080/13439000220141596
Nye, J. (2008). Public diplomacy and soft power. The Annals of the American Academy of Political and Social Science, 616(1), 94–109. https://doi.org/10.1177/0002716207311699
Nye, J. (2019). Soft power and public diplomacy revisited. In Melissen, J. & Wang, J. (Eds.), Debating public diplomacy: Now and next (pp. 7–20). Brill Nijhoff. http://brill.com/view/book/edcoll/9789004410824/BP000006.xml
Nye, J. (2021). Soft power: The evolution of a concept. Journal of Political Power, 14(1), 196–208. https://doi.org/10.1080/2158379X.2021.1879572
Parycek, P., & Sachs, M. (2010). Open government – Information flow in Web 2.0. European Journal of ePractice, 9, 57–68.
Piper, J. (2023, September 24). Anti-vaxxers are now a modern political force. Politico. https://politico.com/news/2023/09/24/anti-vaxxers-political-power-00116527
Quarles, C. (1999). The Ku Klux Klan and related American racialist and antisemitic organizations: A history and analysis. McFarland.
Raddaoui, A. (2012). Democratization of knowledge and the promise of Web 2.0: A historical perspective. In Beldhuis, H. (Ed.), Proceedings of the 11th European Conference on E-Learning (pp. 435–441). Acad. Conferences Ltd. https://webofscience.com/wos/woscc/full-record/WOS:000321613000053
Reddick, C., & Aikins, S. (2012). Web 2.0 technologies and democratic governance. In Reddick, C. & Aikins, S. (Eds.), Web 2.0 technologies and democratic governance: Political, policy and management implications (pp. 1–7). Springer. https://doi.org/10.1007/978-1-4614-1448-3_1
Rosenzweig, R. (1998). Wizards, bureaucrats, warriors, and hackers: Writing the history of the internet. The American Historical Review, 103(5), 1530–1552. https://doi.org/10.2307/2649970
Schradie, J. (2011). The digital production gap: The digital divide and Web 2.0 collide. Poetics, 39(2), 145–168. https://doi.org/10.1016/j.poetic.2011.02.003
Seduce, v. (2023). OED online (3rd ed.). Oxford University Press. https://oed.com/view/Entry/174721
Siekmeier, J. (2014, June 6). Bolivia shows how Andean nations can be punished by US neoliberal soft power if they refuse to assist in the ‘war on drugs’. https://blogs.lse.ac.uk/usappblog/2014/06/06/bolivia-shows-how-andean-nations-can-be-punished-by-u-s-neoliberal-soft-power-if-they-refuse-to-assist-in-the-war-on-drugs/
Sjoberg, L. (2018). Jihadi brides and female volunteers: Reading the Islamic State’s war to see gender and agency in conflict dynamics. Conflict Management and Peace Science, 35(3), 296–311. https://doi.org/10.1177/0738894217695050
Szabla, M., & Blommaert, J. (2020). Does context really collapse in social media interaction? Applied Linguistics Review, 11(2), 251–279. https://doi.org/10.1515/applirev-2017-0119
Taggart, J., & Abraham, K. (2023). Norm dynamics in a post-hegemonic world: Multistakeholder global governance and the end of liberal international order. Review of International Political Economy, 1–28. https://doi.org/10.1080/09692290.2023.2213441
Tapscott, D., & Williams, A. (2006). Wikinomics: How mass collaboration changes everything. Portfolio.
Tapscott, D., & Williams, A. (2010). MacroWikinomics: Rebooting business and the world. Portfolio Penguin.
Tella, O. (2023). The diaspora’s soft power in an age of global anti-Nigerian sentiment. Commonwealth & Comparative Politics, 61(2), 177–196. https://doi.org/10.1080/14662043.2022.2127826
The Economist. (2023, March 30). Both America’s political camps agree that TikTok is troubling. https://economist.com/united-states/2023/03/30/both-americas-political-camps-agree-that-tiktok-is-troubling
The RAND Corporation. (2019). Accountability in cyberspace: The problem of attribution. https://youtube.com/watch?v=ca9xomGmZPc
Tiffany, K. (2023, January 24). Twitter has no answers for #DiedSuddenly. The Atlantic. https://theatlantic.com/technology/archive/2023/01/died-suddenly-documentary-covid-vaccine-conspiracy-theory/672819/
Topinka, R. (2018). Politically incorrect participatory media: Racist nationalism on r/ImGoingToHellForThis. New Media & Society, 20(5), 2050–2069. https://doi.org/10.1177/1461444817712516
Unkelbach, C., Koch, A., Silva, R., & Garcia-Marques, T. (2019). Truth by repetition: Explanations and implications. Current Directions in Psychological Science, 28(3), 247–253. https://doi.org/10.1177/0963721419827854
Van Dijck, J., & Nieborg, D. (2009). Wikinomics and its discontents: A critical analysis of Web 2.0 business manifestos. New Media & Society, 11(5), 855–874. https://doi.org/10.1177/1461444809105356
Verhage, A. (2009). Between the hammer and the anvil? The anti-money laundering complex and its interactions with the compliance industry. Crime, Law & Social Change, 52(1), 9–32. https://doi.org/10.1007/s10611-008-9174-9
Vidgen, B., Margetts, H., & Harris, A. (2019). How much online abuse is there? A systematic review of evidence for the UK (Policy Briefing, Hate Speech: Measures and Counter Measures). Alan Turing Institute. https://turing.ac.uk/sites/default/files/2019-11/online_abuse_prevalence_full_24.11.2019_-_formatted_0.pdf
Whittaker, J., Looney, S., Reed, A., & Votta, F. (2021). Recommender systems and the amplification of extremist content. Internet Policy Review, 10(2). https://doi.org/10.14763/2021.2.1565
Wilson, E. (2008). Hard power, soft power, smart power. The Annals of the American Academy of Political and Social Science, 616(1), 110–124. https://doi.org/10.1177/0002716207312618
Wilson, S., & Wiysonge, C. (2020). Social media and vaccine hesitancy. BMJ Global Health, 5(10). https://doi.org/10.1136/bmjgh-2020-004206
Yang, J., & Counts, S. (2010). Predicting the speed, scale, and range of information diffusion in Twitter. Proceedings of the International AAAI Conference on Web and Social Media, 4(1), 355–358. https://doi.org/10.1609/icwsm.v4i1.14039
Zaharna, R. (2010). The soft power differential: Mass communication and network communication. In Zaharna, R. (Ed.), Battles to bridges: U.S. strategic communication and public diplomacy after 9/11 (pp. 92114). Palgrave Macmillan. https://doi.org/10.1057/9780230277922_6CrossRefGoogle Scholar
Zahran, G., & Ramos, L. (2010). From hegemony to soft power: Implications of a conceptual change. In Soft power and US foreign policy (1st ed., pp. 1231). Routledge.Google Scholar
Zhong, W. (2023, May 3). Who gets the algorithm? The bigger TikTok danger. Lawfare. https://lawfaremedia.org/article/who-gets-the-algorithm-the-bigger-tiktok-dangerGoogle Scholar
Zuquete, J., & Marchi, R. (2023). Postscript. In Global identitarianism. Routledge.CrossRefGoogle Scholar
Figure 4.1 Wielding malign soft power.
