Contrary to the initial optimism regarding the democratizing potential of digital technologies, recent evidence suggests that authoritarian rulers have harnessed these technologies to strengthen their rule (Gunitsky 2015; Xiao 2019). They do so, however, not by intimidating the population through outright violent repression or indoctrination but through discreet control and manipulation of information (Guriev and Treisman 2019). Despite extensive research on the political-institutional impact of political control via digital technologies (Beraja et al. 2023; Gohdes 2020; Xu 2021), the political-psychological effects on individuals remain largely underexplored (Tirole 2021; Yang 2018). This study seeks to enrich current insights into digital authoritarianism by focusing on the less-researched aspects of emotion, attitude, and their linkage. Specifically, we aim to understand whether and how awareness of the repressive nature of state-led digital control shapes people’s emotional reactions and attitudes toward such practices and how these elicited emotions are translated into the formation and alteration of related political attitudes.
Studying the inner, emotional experiences of individual citizens who actually live with everyday practices of digital control not only sheds light on nonmaterial factors of authoritarian rule that become “subjectively felt” (Demmelhuber and Thies 2023; Greene and Robertson 2022); it also helps inform political scientists about the emotional micro-foundations (Pearlman 2013) of potential, often unintended sociopolitical consequences of digital authoritarianism that are increasingly occurring in politically restrictive contexts and beyond (Pearson 2024). As we found, emotional processes not only reflect how digital control is experienced subjectively but also capture how individual attitudes toward state digital practices change (or not) once people are made aware of the repressive nature of digital control.
Our study faced multiple challenges that are partly due to the authoritarian nature of the regime type of interest (Ahram and Goode 2016). Unlike in liberal democracies, citizens in digital autocracies are more likely to be subject to systematic government information control that keeps them from being fully informed of the repressive nature of government use of digital technologies. In such a situation, the costs for ordinary citizens of accessing information deemed unfavorable by the regime are much higher (Roberts 2018). Citizens in authoritarian regimes are also more prone to exposure to state propaganda justifying intrusive digital policies, leading many to believe official narratives (Nisbet, Kamenchuk, and Dal 2017; Shirikov 2024). Moreover, both emotions and attitudes may be caused by myriad other factors such as sociodemographic variation, preexisting beliefs, and prior experiences, making it difficult to determine the causal linkage between awareness, on the one hand, and emotional responses and attitudes, on the other. And, as in any survey, we also faced the issue of determining the genuineness of responses (Kuran 1995).
To address these challenges, we conducted an online survey experiment with information cues for eight digital control practices in mainland China (n = 4,507). The survey not only successfully isolated the effect of awareness of digital control on both emotions and attitudes but also, by design, incorporated questions that captured potential relative preference falsification in people’s attitudes. In addition, we conducted 50 semi-structured, in-depth interviews in 10 Chinese localities to contextualize and better capture the emotional mechanisms activated by exposure to digital control. Overall, our findings suggest that emotions act both as a direct outcome of awareness of the repressive nature of all types of state digital control and as a likely underlying mechanism that further worsens attitudes toward government digital practices.
Our study contributes to multiple streams of literature in political science. First, the results complement the emerging research on digital authoritarianism that has primarily focused on the macro-level institutional dimension (Keremoğlu and Weidmann 2020; Roberts and Oosterom 2024; Schlumberger et al. 2023). It reveals the nonmaterial, micro-level political-psychological aspect of state information control and manipulation through the expansive use of digital technologies and its impact on individuals in a controlled society. By embedding information showing the repressive nature of digital control in the experimental design, we show that, although this awareness slightly amplifies negative emotions, it more significantly diminishes positive emotions. Prompted awareness also worsens people’s attitudes, much more so when respondents are asked about hypothetical scenarios than about real-life ones. We offer two alternative explanations for the difference. One pertains to preference falsification in expressing attitudes, and the other centers on rationalization triggered by cognitive dissonance. Notably, the worsening of attitudes is largely captured by the reduction of positive emotions. Specifically, people between their late 20s and mid-30s who are better educated, frequently consume foreign news, and have had overseas experiences show the strongest adverse emotional and attitudinal shifts.
Second, our study extends research on emotion beyond conventional issues in democratic politics (Brader 2006; Marcus, Neuman, and MacKuen 2000) and contributes to the latest efforts to incorporate emotion and emotional sensitivity into authoritarian politics research (Demmelhuber and Thies 2023; Greene and Robertson 2022; Mattingly and Yao 2022; Pearlman 2013; 2023; Young 2019; 2023). We foreground the affective-emotional element (how individuals feel) and its interplay with varying cognitive prompts in influencing attitudes toward authoritarian digital policies. The results complement existing research on public perceptions of digital control that has focused on the cognitive aspect (how individuals think; Karpa and Rochlitz 2024; Kostka 2019; Xu, Kostka, and Cao 2022) and corroborate neuroscientific evidence of the interconnectedness of cognition and emotion in political judgment and decision making (Damasio 1994; Lodge and Taber 2005; McDermott 2004; Zajonc 1980).
Lastly, systematic work on digital technologies and authoritarianism suggests that a wide range of digital instruments are put to use simultaneously to achieve regime stability (Deibert 2015; Earl, Maher, and Pan 2022; Gohdes 2024). By integrating digital surveillance and digital censorship into our framework and operationalizing them through a two-way factorial design to better reflect their interlinked nature, our conceptual approach to digital control yields a more comprehensive understanding of the impact of digital repression. Our results show that awareness of digital surveillance has a more extensive and generally stronger effect on emotional and attitudinal reactions than does digital censorship. In addition, the impact of digital control is more pronounced when perceived in personalized rather than public terms, as often portrayed in state propaganda.
In the next section, we introduce the concept of digital control before delving into the role of emotions and their relationship with awareness of digital control and attitudes. We then present our research design, including an overview of the experimental setting and data collection, followed by our key findings and a critical discussion.
Digital Control: An Integrated Concept
Autocrats in the digital age, like their predecessors, aim to maintain power and regime stability. To achieve this goal, autocrats repress those who oppose them (Davenport 2007), co-opt the elite and the masses (Gandhi and Przeworski 2006; Knutsen and Rasmussen 2018), and legitimize their rule to create “legitimacy belief” among their subjects (Gerschewski 2013). In other words, they use all the means at their disposal to minimize threats to their staying in control (Gerschewski 2023; Svolik 2012). The advancement and proliferation of digital technologies have enriched the conventional toolkit of modern autocrats (Keremoğlu and Weidmann 2020; Schlumberger et al. 2023). In particular, networked infrastructure and applications such as the internet and social media are crucial for information dissemination, and digital devices such as smartphones, computers, and CCTV cameras are central to recording, storing, and retaining data.
A large body of literature has documented how the “digital” component is incorporated into and therefore expands the existing spectrum of political repression (Earl, Maher, and Pan 2022; Feldstein 2021; Gohdes 2020; 2024; Xu 2021). It has also shown how economic and technological development has transformed some brutal dictators into what Guriev and Treisman (2019; 2020) term “informational autocrats.” Rather than exercising overt, violent repression to remain in power, these rulers control and manipulate information in ways that are subtle, covert, and often below the public’s radar. Digital surveillance and digital censorship are the two—if not the only—crucial instruments that enable digital repression and sustain authoritarian control (Deibert 2015). We call state-led practices of using these digital instruments to maintain political power and regime stability digital control.
Our conceptualization of digital control builds on the work of Hassan, Mattingly, and Nugent (2022, 156) on political control, which they define as “tactics engineered by political leaders to ensure widespread compliance with state policies and to minimize political resistance.” Digital control is a distinct form of political control enabled by digital technologies, setting it apart from traditional control mechanisms. However, given the widespread application of digital technologies, the common tactics of political control, such as repression, coercive distribution, and indoctrination or infiltration (Hassan, Mattingly, and Nugent 2022), do have their digital counterparts (Schlumberger et al. 2023, 6–10).
Digital surveillance and digital censorship are specific instruments of digital control commonly used by autocrats. In the context of digital authoritarianism, digital surveillance broadly refers to practices of systematic monitoring of both online and offline communication, activities, and behaviors through digital technologies—the internet, CCTV cameras, mobile phones, GPS, biometric systems—to exert political control (Xu 2021). The “digital” component makes surveillance more efficient and comprehensive than surveillance based on human agents alone (Pei 2024; Xu 2021); a similar argument can be made for digital censorship. Drawing on Roberts (2018, 37), we understand digital censorship as the systematic restriction, if not complete banning, of public access to or the public expression of information that has the potential to undermine authority; it is achieved through digital techniques such as keyword filtering, website blocking, flooding, and throttling.
The two instruments are inextricably connected. By monitoring the population, digital surveillance provides information for digital censorship; digital censorship, in turn, limits the scope and scale of digital surveillance (Gohdes 2020). Although each method has been widely studied individually by social and political scientists, only a few studies have integrated them into one framework (Gohdes 2020; 2024; Stoycheff 2023; Stoycheff, Burgess, and Martucci 2020). Our concept of digital control follows this integrated approach to better reflect the interrelated nature of the digital tactics of modern autocracies.
Awareness, Emotions, and Attitudes under Digital Control
Although digital surveillance and digital censorship are largely used for repressive purposes of political control, autocrats exert great effort to conceal their repressive intentions and, where they cannot, legitimize the use of these technologies. Digital surveillance is promoted as an indispensable tool for antiterrorism and security efforts (Ollier-Malaterre 2023), and digital censorship is often depicted as a necessary measure to protect citizens from threats such as “extremism” (Guriev and Treisman 2019). The disguised nature of digital control implies that people living in an authoritarian regime have varying levels of awareness of its underlying aims. One cannot assume that everyone knows about every facet of digital control imposed by the state, but it is equally untenable to believe that people are completely ignorant of its effects. Yet, citizens in an already heavily censored environment may lack sufficient demand for uncensored information (Chen and Yang 2019). Under state propaganda, people may also form preexisting perceptions that normalize or even support digital control (Nisbet, Kamenchuk, and Dal 2017; Yang 2025). To understand the individual-level impacts of digital control, the key, as we argue later in this article, is to determine people’s awareness of the intrusive or repressive nature of digital control.
Previous studies have used information cues in experimental designs to manipulate awareness of state digital control practices. Their results suggest that prompted awareness of the potential harm of digital surveillance negatively affects public support for government digital policies (Karpa and Rochlitz 2024; Xu, Kostka, and Cao 2022). Similarly, priming people about the regime’s digital censorship practices worsens their evaluation of the government (Wong and Liang 2021). The awareness of censorship, as Roberts (2020) argued, is key to resisting such manipulation.
How then does awareness about digital control, particularly its intrusive and repressive nature, affect people’s attitudes toward government digital policies? In what follows, we argue that emotional experiences elicited by this awareness operate as an important yet understudied psychological mechanism that not only constitutes in and of itself an informative indicator for the political-psychological consequences of digital control but can also (re)shape people’s political evaluation and attitudes via affective and emotional processing.
Partly influenced by advances in psychology and neuroscience, the study of emotions has been reincorporated into many substantive research agendas in political science over the past two to three decades. Emotions, commonly defined by political psychologists as “reactions to [external or internal] signals about the significance that circumstances hold for an individual’s goals and well-being” (Gadarian and Brader 2023, 192), constitute noninstrumental, evaluative, and subjective experiences that carry physiological changes and action tendencies (Frijda 1986, cited in Pearlman 2013). This definition entails two nuanced understandings of emotions in politics. First, rather than considering emotions as irrational or inferior to rational reasoning and thus needing to be suppressed or eliminated, neuroscientists and social scientists now recognize emotions as crucial in shaping the way people think and behave: thus, they need to be better understood, along with reason and rationality (Damasio 1994; Elster 1999; McDermott 2004). Second, although political-psychological experimentation focuses on short, visceral emotional reactions to political stimuli, political emotions per se are not necessarily short and transient (Demertzis 2014, 227). Rather, they are fluid and contingent. Pearlman (2013) distinguishes between transient and enduring emotions and argues that a reflexive short-term emotion can convert into a long-lasting affective orientation when it recurs and continues to influence a person after it is experienced.
Moreover, even immediate emotions can play an important role, especially in periods of political tension, and may have a long-lasting impact on policy attitudes or political choice because they affect information processing and impression management (Demertzis 2014, 228–29; Loewenstein and Lerner 2003; McDermott 2004; Pearlman 2013).
Of the many roles that emotions play in politics, one role that is fundamental to our study is the emotional process that involves affective evaluation of a stimulus: it may not only influence emotional expressions but also shape political attitudes and behaviors (Brader, Marcus, and Miller 2011; Marcus 2000; Webster and Albertson 2022). Brader (2006, 55) summarized these as the detective and the directive functions of emotion, where the former assesses the relevance of the stimuli and the latter prepares the mind and body for responses. In electoral politics, both affective intelligence theory (AIT) and hot cognition theory suggest that emotions are automatic and preconscious reactions to political stimuli, such as the candidates, the campaign ads, and the issues: these reactions can have a subsequent impact on information seeking, opinion formation, and voting decisions (Brader 2006; Lodge and Taber 2013; Marcus, Neuman, and MacKuen 2000). Goodwin, Jasper, and Polletta (2001, 10), social movement scholars, noted, “Emotions are part of the ‘stuff’ connecting human beings to each other and the world around them, like an unseen lens that colors all our thoughts, actions, perceptions, and judgment.”
Importantly, these arguments for emotion’s role in shaping attitude and behavior in politics do not seek to replace but rather to complement existing models based on rational choice theories. Nor do they focus solely on the “good” side of emotion’s role. Instead, emotions offer a nuanced lens into the micro-foundation of people’s actions in specific temporal and spatial settings that neither instrumental nor value rationality can fully capture (Pearlman 2013; 2023). Such an emotion-focused micro-foundational basis can help sharpen existing explanations and predictions for political choice and action (McDermott 2004). Therefore, delving into the emotional experiences of individuals living under a digital autocracy provides a unique affective lens to understand how ordinary citizens receive and process information and navigate the political reality that is increasingly characterized by state digital control (Redlawsk 2006).
Emotional and Attitudinal Responses to Digital Control
What emotional reactions, if any, can awareness of digital control trigger? Rather than arising and existing in isolation, multiple emotions often co-occur when people face stimuli, as seen in experimental manipulations (Albertson and Gadarian 2017). We therefore argue that awareness of digital control practices can elicit multiple emotions across the spectrum simultaneously.
Substantial research has pointed to the chilling effects of digital surveillance and digital censorship (Büchi, Festic, and Latzer 2022; Roberts 2018; Stoycheff 2023; Stoycheff et al. 2019). In such contexts, fear emerges as a predominant psychological mechanism that curtails people’s political engagement and media consumption. However, fear only works when digital control is perceived to have a credible outcome, meaning that those who oppose digital control know that they will face punishment (Roberts 2018). In other cases, digital control may trigger backlash. Realizing that one or closely related others are targeted by state digital control can also incite feelings of anger or frustration (Zhang, Tandoc, and Han 2022; Zhu and Fu 2021) and lead to actions that attempt to circumvent that control (Chen and Yang 2019; Roberts 2020). Although negative emotional responses are prevalent, research also finds that the abrupt censorship of previously available information can quickly attract public attention, inducing a surge of curiosity about the censored content and sometimes a feeling of surprise (Jansen and Martin 2015). In other circumstances, it can motivate people, even those without any political or strategic aims, to learn ways to maintain their accustomed access simply out of habit, exposing them to once off-limits information (Hobbs and Roberts 2018). Awareness of the political nature of digital control may also change positive emotions.
Especially in contexts in which the motives behind government surveillance and censorship are either hidden from the public or deliberately framed in a positive way, realizing personal or collective exposure to digital control can quash positive feelings such as happiness and security. Together, these reactions underlie our first hypothesis:
Hypothesis 1. Awareness of state digital control practices, such as digital surveillance and digital censorship, will (a) amplify negative emotions and (b) elicit “neutral” emotions like surprise and curiosity while (c) diminishing positive emotions.
In a similar vein, in addition to triggering varied emotional responses, awareness of digital control, particularly of its intrusive and repressive nature, can also shape people’s attitudes toward state digital control practices, relevant actions, and policies. A field survey experiment in three Chinese regions suggests that making participants aware of the repressive potential of the country’s Social Credit System (SCS)—a nationwide, Big Data-driven surveillance infrastructure (Liang et al. 2018)—significantly reduces their support for the system (Xu, Kostka, and Cao 2022). Similarly, public support for the SCS declines when Chinese participants are exposed to Western media framing and informed about the system’s monitoring of their social behavior (Xu et al. 2023). Comparable findings from Russia, Germany, Turkey, and the United States demonstrate that awareness of the potential misuse of digital government tools reduces public support for them (Karpa and Rochlitz 2024). Other experimental studies point to similarly worsening attitudes among Chinese citizens toward the government, its performance, and its problem-solving ability after exposure to uncensored internet content despite systematic digital censorship practices, such as the Great Firewall (Chen and Yang 2019), as well as after detecting sporadic, ad-hoc online censorship activities (Wong and Liang 2021). This evidence allows us to expect the following:
Hypothesis 2. Attitudes toward state digital control practices as well as relevant state digital policies will worsen if people are exposed to information about the intrusive and repressive nature of digital surveillance and digital censorship.
To what extent, if any, are people’s attitudes shaped by emotional responses elicited by the awareness of digital control? A large body of literature on emotions’ role in politics has emphasized their expansive and deep-seated influence in shaping individual political judgment and action. The hot cognition hypothesis recognizes that people’s political choices are almost always guided by intuitive, feeling-based affective heuristics triggered automatically by environmental stimuli, rather than by more deliberate, information-based reasoning (Damasio 1994; Lodge and Taber 2005; 2013; Zajonc 1980); this is especially true “in situations with high uncertainty and limited information about consequences” (Dal, Nisbet, and Kamenchuk 2023, 649). AIT treats emotions in a high-risk environment as “immediate and preconscious reactions to stimuli that profoundly influence subsequent cognitions” (Mintz, Valentino, and Wayne 2022, 119; see also Marcus, Neuman, and MacKuen 2000). Thus, this affective process can color and alter the decision-making process and individual perceptions of political issues (Marcus, Neuman, and MacKuen 2000; Webster and Albertson 2022).
Negative feelings, such as anger and anxiety, can mediate the impact of external issues, information, or events on political beliefs related to populism and anti-immigration sentiment (Brader, Valentino, and Suhay 2008; Renshon, Lee, and Tingley 2015; Rhodes-Purdy, Navarre, and Utych 2021). Fear and anger also influence political behaviors in high-risk contexts like protests against regime repression (Nikolayenko 2022; Young 2019). Although dispiriting emotions like fear, sadness, and shame drive people away from protesting, emboldening emotions like anger and positive emotions such as joy and pride can move people toward it (Pearlman 2013). Positive emotions, including pride, hope, and trust, can also have the opposite effect, offering autocrats like Putin important sources for mobilizing and maintaining genuine popular support and thereby serving as nonmaterial means for autocrats to secure legitimacy (Greene and Robertson 2022). In the context of “networked authoritarianism” (MacKinnon 2011), recent research suggests that it is emotional reactions to risk signals, rather than the cognitive appraisal of risk, that determine people’s online political expressions (Dal and Nisbet 2022; Dal, Nisbet, and Kamenchuk 2023). Given this and building on our first two hypotheses, we expect that emotional reactions will mediate the effect of awareness of digital control on people’s attitudes, with decreasing positive emotions and increasing negative emotions leading to more negative attitudes.
Hypothesis 3. Awareness of the intrusive or repressive nature of state digital control practices will reduce positive emotions, increase negative emotions, and, through these emotional changes, further worsen people’s attitudes toward state digital control and relevant policies.
So far, we have outlined two direct pathways through which awareness of digital control can affect people’s emotions and attitudes and one indirect pathway indicating that awareness of digital control can influence attitudes through various emotional responses. In the next two sections, we pinpoint two additional factors that may shape the influence of awareness of digital control on emotional and attitudinal responses: (1) the type of digital control instruments and (2) the level of intrusiveness.
Type of Digital Control: Surveillance versus Censorship
Although the awareness of both types of digital control practices can lead to similar emotional and attitudinal responses (Hypotheses 1–3), we argue that awareness of digital surveillance will exert a stronger emotional and attitudinal effect than awareness of digital censorship, for the following reasons. First, in autocracies, state-led censorship of undesired speech is often a routine practice that is either indirectly or directly revealed to the public (Stoycheff 2023; Zhu and Fu 2021). Especially during crises, state-initiated censorship campaigns can affect a broad range of individual social media users through various techniques (King, Pan, and Roberts 2013), suggesting that many people may have personally experienced digital censorship (Hobbs and Roberts 2018; Jansen and Martin 2015). By contrast, digital surveillance typically operates more covertly, possibly leaving the majority uninformed of its use (Deibert 2015).
Second, and relatedly, digital surveillance enables more targeted regime violence against those who dissent, leaving the compliant population largely unaffected; conversely, digital censorship in the form of restricted internet access is associated with more indiscriminate campaigns of repression, making it known to a substantial share of the population (Gohdes 2020; 2024; Xu 2021). Therefore, knowing about the potential repressive consequences of digital surveillance creates a climate of uncertainty about the content and scope of monitoring and its legal and social repercussions, far more so than digital censorship does (Stoycheff 2023, 119). This leads us to our next hypothesis.
Hypothesis 4. Awareness of digital surveillance will elicit stronger emotional reactions and shifts in attitudes toward government use of digital technologies than awareness of digital censorship.
Perceived Level of Intrusiveness: Public versus Personalized Control
Another feature that can influence emotional and attitudinal reactions to digital control is whether individuals perceive themselves to be the target of state digital control. Accordingly, we can distinguish more targeted or personalized control from more generalized or public control. This distinction captures the perceived level of intrusiveness of state digital control practices, which can be shaped by the surrounding information environment. In regimes adept at information manipulation, state propaganda can bias public awareness of digital control by overemphasizing the benefits that more generalized digital control practices bring to the public, such as maintaining public order and stability (Nisbet, Kamenchuk, and Dal 2017) or enhancing online civility (Yang 2018), while remaining silent about the repressive potential embedded in targeted control. Consequently, digital control implemented in a more generalized manner may be seen as legitimate and may exist beyond the awareness of most of the population. A growing body of research reveals various mental strategies that Chinese citizens use to dissociate themselves from personal exposure to surveillance (Liu and Graham 2021; Ollier-Malaterre 2023). Such mental tactics convince people that digital surveillance targets others, not themselves.
The generalized/targeted or public/personalized divide plays an important role in public perceptions of digital control practices. A cross-national study involving China, Germany, the United Kingdom, and the United States found that privacy concerns have a stronger (negative) impact on public acceptance of facial recognition technology if people feel personally affected than if they see it as targeting others (Kostka, Steinacker, and Meckel Reference Kostka, Steinacker and Meckel2022). In an experiment-based cross-national study conducted in four European democracies, researchers found enhanced public support for extensive state surveillance when it targets potential criminals (Ziller and Helbling Reference Ziller and Helbling2021). A related study shows that people tend to speak up against internet censorship when their own posts or reposts are banned but often remain silent when encountering the deletion of posts or reposts from anonymous others (Zhu and Fu Reference Zhu and Fu2021). Given this distinction, emotional reactions and attitudes toward digital control likely depend on the perceived level of intrusiveness of the control mechanism. We thus expect the following:
Hypothesis 5. Perceiving both digital control instruments in targeted or personalized terms will have a stronger effect on people’s emotional reactions and attitudes toward government use of digital technologies than perceiving them in generalized or public terms.
Research Design and Data
Rationale for Case Selection
We selected China as our empirical case for two reasons. First, over the past decade, the rapid digitalization of Chinese society and economy—driven in part by the Chinese Communist Party’s strategic deployment of digital technologies across various sectors (State Council 2023)—has made China a compelling and valuable case study of digital autocracy. Most Chinese citizens, whether direct or indirect targets, must navigate this expansive nationwide digitalization program. Second, although Chinese political leaders view digitalization as a catalyst for propelling the country’s modernization and increasing its competitiveness (Cyberspace Administration of China 2022), the government-initiated deep fusion of digital technology into all aspects of social, economic, and political life in a nondemocratic, single-party regime has raised widespread domestic and international concerns about its potential consequences for individual and civil rights, justice, and liberty (Feldstein Reference Feldstein2021; Pei Reference Pei2024; Rothschild Reference Rothschild2024; Xu Reference Xu2021). With the increasing export of Chinese surveillance technologies to Africa, Latin America, and Southeast Asia, scholars, local civil groups and leaders, and policy makers have expressed concerns about the potential imposition of the Chinese model of digital authoritarianism in these regions (Polyakova and Meserole Reference Polyakova and Meserole2019). Thus, our case selection offers a unique lens for studying the implications of digital control for citizens.
Rationale for Our Survey Experiment
Our study aims to test the causal effect of awareness of the intrusive and repressive nature of digital control on people’s emotional and attitudinal responses. Previous studies have used various proxies for awareness of digital control—including completion of tertiary education (Guriev and Treisman Reference Guriev and Treisman2019) and access to other non-Chinese sources of information (Xu et al. Reference Xu, Krueger, Liang, Zhang, Hutchison and Chang2023; Xu, Kostka, and Cao Reference Xu, Kostka and Cao2022)—to investigate how it affects public attitudes in autocracies. The underlying idea is that the general public has less knowledge about digital control than well-informed elites. Therefore, unveiling the repressive nature of digital control might have a greater effect on the former. Our conceptualization of awareness, however, entails both knowing about different types of digital control practices and the perceived level of intrusiveness. Relying solely on education level or information sources thus cannot fully capture these dimensions.
A survey experiment offers a cost-efficient and straightforward way to manipulate the perceived level of intrusiveness of digital surveillance and digital censorship and to identify causal effects at the population level based on our sample. Schlumberger et al. (Reference Schlumberger, Edel, Maati and Saglam2023) pointed out that authoritarian regimes are typically keen to alter and control narratives and so keep their subjects oblivious to oppression. In our experimental setting, we integrate different versions of alternative or counternarratives that challenge the official ones the Chinese government promotes about digital control, thereby intentionally exposing people to its more repressive nature. In this way we can manipulate and contrast the influence of awareness about digital control on emotions and related attitudes.
Despite the merits of a survey experiment, studying individual sentiments and attitudes toward potentially sensitive topics in an authoritarian context, like the one our study focuses on, naturally comes with systematic challenges in obtaining and assessing valid responses (Tannenberg Reference Tannenberg2022). Responses to potentially sensitive questions may suffer from sensitivity bias due to fear of government retribution for publicly revealing dissatisfaction (Blair, Coppock, and Moor Reference Blair, Coppock and Moor2020; Kuran Reference Kuran1995) or the pressure to conform to social norms and cultural tradition (Nicholson and Huang Reference Nicholson and Huang2023). To tackle these effects, we carefully considered these practical, methodological, and ethical issues (Ahram and Goode Reference Ahram and Goode2016) and integrated them into our research design. In this and the following sections, we illustrate our empirical strategies to address the challenges posed at each stage of the research and present and discuss findings with a critical reflection on their limitations.
Data Collection
We contracted with Qualtrics, a US-based market research company, to administer an online survey experiment from June 2023 to July 2023 in mainland China. Participants were recruited via quota sampling from a pool of online panels in cooperation with Qualtrics. Appendix A.1 provides detailed information about the sample sources, recruitment procedures, and verification measures. The survey respondents were mainly recruited via offer walls, advertisements, and pop-ups in mobile or web applications. To ensure data quality, in addition to the initial validity checks against fraud, duplicates, and inattentive responses provided by the partner panels and services, we applied stringent screening mechanisms and excluded respondents who failed any of our criteria (see table A1.2). These screening processes excluded 2,424 responses.
Our final sample consisted of 4,507 participants (Guo and Kostka Reference Guo and Kostka2025). Overall, we achieved an incidence rate of 65%, meaning roughly two out of three contacted participants met our study’s predefined criteria and responded to the invitation. Our attrition rate, defined as the proportion of participants who started but did not complete the survey, was 33%.
Table A2.1 presents the sample’s summary statistics. Like many other online surveys conducted in China (Huang, Intawan, and Nicholson Reference Huang, Intawan and Nicholson2023; Jee and Zhang Reference Jee and Zhang2025; Nicholson and Huang Reference Nicholson and Huang2023), our sample somewhat overrepresented better-educated urban respondents. However, we implemented quotas based on age, gender, and region to ensure that our sample would resemble the national distribution at least in these sociodemographic aspects.
To contextualize and deepen our insights into the research questions, we also conducted 50 semi-structured in-depth interviews with 64 Chinese nationals living in 10 cities across the country from March 2023 to April 2023, before the survey experiment was conducted. Appendix A.3 offers detailed information about the fieldwork and interview set-up and a reflexive note about conducting fieldwork in a politically restrictive environment like China. The study received approval from our university’s Central Ethics Committee (ZEA no. 2023-008).
Experiment Setting and Stimuli
In the experimental section, we used a 3 (digital surveillance: no mention, public, personalized) by 3 (digital censorship: no mention, public, personalized) between-subjects factorial design. Each participant was randomly assigned to one of the nine experimental conditions, each of which presented a fictitious scenario reflecting a different type of digital control and a different level of perceived intrusiveness (table 1). Table B1 offers summary statistics for all conditions, which suggest that they were balanced across demographic and other covariates, confirming successful randomization. The condition with “no mention” of both digital surveillance and digital censorship served as the control group. All participants, whether in treatment or control conditions, were exposed to the same baseline text, as shown in figure 1.
Table 1 3 × 3 Factorial Design


Figure 1 Baseline Text for the Vignette
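The assignment logic of such a between-subjects factorial design can be sketched as follows. This is a minimal illustration with invented variable names and a single simulated covariate; it is not the authors’ actual assignment or balance-checking code, which covers the full set of demographics reported in table B1.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2023)

# Hypothetical sample with one covariate (age); names are illustrative only
n = 4507
df = pd.DataFrame({"age": rng.integers(18, 65, size=n)})

# 3 (surveillance) x 3 (censorship) between-subjects factorial:
# each participant independently draws one level of each factor
levels = ["no_mention", "public", "personalized"]
df["surveillance"] = rng.choice(levels, size=n)
df["censorship"] = rng.choice(levels, size=n)
df["condition"] = df["surveillance"] + "/" + df["censorship"]

# Crude balance check: covariate means should be similar across the 9 cells
cell_means = df.groupby("condition")["age"].mean()
```

Under successful randomization, any remaining differences in `cell_means` reflect sampling noise, which is what the balance statistics in table B1 assess more formally.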
Participants in treatment conditions received additional text as stimuli. Each stimulus describes a different level of perceived intrusiveness of one or both of the digital control instruments and is embedded in the baseline text (see table 2 for a description of stimuli and table B2 for an overview of all nine vignettes). For example, the stimulus for personalized digital surveillance depicts a fictitious character as the main target of government surveillance through the personal digital devices he or she uses daily. By contrast, the stimulus for public digital surveillance highlights the collective, “like everyone else,” as the target. This kind of digital surveillance occurs most obviously in, but is not limited to, public areas like train stations and roadways, mainly through surveillance cameras installed in these locations.
Table 2 Vignette Description by Factor and Level

Using a fictitious setting (and stimuli) in the treatment allowed us to circumvent the potential confounding effects arising from respondents’ preexisting knowledge and awareness of digital control. It also enabled us to manipulate awareness across different types of digital control and their respective level of perceived intrusiveness. To best determine whether awareness of any level of digital control instruments elicits emotional reactions and affects attitudes toward government use of digital technologies, it is important to foster relatability and reduce suspicion of the scenario among participants. Therefore, the vignette text uses a storytelling technique to introduce a character named Xiao Zhang and a brief description of his or her daily life. This person’s social status and daily life are carefully crafted to mirror the participants’ own experiences, and the scenarios are adaptations of instances drawn from myriad real-life cases of digital control cited in media reports, personal accounts, and academic research (including our own fieldwork). We contend that our stimuli, embedded in these text-based vignette stories, offer a comprehensive (though not exhaustive) reflection of the reality of governmental digital control implementation in China.
To ensure survey design and manipulation quality, we pretested the survey questionnaire, including its experimental manipulations, through expert reviews and cognitive pretesting techniques (N = 11) and a pilot test (N = 101). We subsequently refined our survey based on these insights before the final fielding. At the end of the survey, participants were invited to share their questions or points of clarification via an open-ended question and were then briefed about the experiment.
Outcome Variables
In the political science literature, emotions are commonly understood as “mental and physical reactions to identifiable stimuli deemed consequential for individual or group objectives” (Miller Reference Miller2011, 577). As posited by AIT, these transient reactions can be “triggered by the scantest of information delivered by our senses, and occur before our conscious mind becomes aware of a given stimulus in our environment” (Mintz, Valentino, and Wayne Reference Mintz, Valentino and Wayne2022, 119; see Marcus, Neuman, and MacKuen Reference Marcus, Neuman and MacKuen2000). We explored emotions directly incited by the “reception of information” related to digital control (surveillance/censorship) via our treatment stimuli. Immediately after the treatment exposure, participants were asked how they would feel if they found themselves in Xiao Zhang’s situation, as portrayed in the given scenario. Participants rated each of the 10 provided emotional responses on a Likert scale from 1 (“almost none”) to 5 (“extreme”). The emotional reactions included both those on the negative side of valence, such as sadness, anger, disgust, fear, and helplessness, and on the more positive side like security and happiness. We included curiosity and surprise to capture emotions in the mid-valence range and added the state of indifference (i.e., “do not care”) to gauge an absence of emotional response to the stimuli. We also formulated three indices to denote the positive, negative, and neutral valence of the emotional reactions using principal component analysis (see appendix C).
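The construction of valence indices via principal component analysis can be illustrated with a minimal sketch. The simulated ratings below are hypothetical, and the actual index construction is documented in appendix C; this shows only the generic technique of scoring respondents on the first principal component of standardized items.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-5 Likert ratings on five negative-valence emotions
# (sadness, anger, disgust, fear, helplessness) for 1,000 respondents;
# a shared latent factor makes the items correlate, as real scales do
latent = rng.normal(size=(1000, 1))
items = np.clip(np.round(3 + latent + 0.5 * rng.normal(size=(1000, 5))), 1, 5)

# First principal component of the standardized items as a valence index
z = (items - items.mean(axis=0)) / items.std(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(z, rowvar=False))
pc1 = eigvecs[:, -1]              # loadings of the largest eigenvalue
if pc1.sum() < 0:                 # fix the sign so higher = more negative affect
    pc1 = -pc1
negative_index = z @ pc1          # one score per respondent
```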
Our other focal outcome variable was people’s attitudes toward digital control and state digital practices. We approached this from two angles: attitudes toward digital control in the fictitious situation as depicted in the vignette text and attitudes toward government digital policies in the real-life situation. Each type of attitude comprised four elements: level of understanding, acceptance, satisfaction, and support.Footnote 7 We measured the attitudes toward the fictitious scenario as if the respondents themselves were the character in the given experimental treatment. By contrast, attitudes toward real-life situations were elicited by directly asking for participants’ opinions about state digital practices as affecting them in real life. Participants answered all questions related to attitudes using a Likert scale of 1 (“not at all”) to 5 (“fully”/“strongly”). We derived two indices for both types of attitudes by aggregating the scores across the four elements. The four-item attitude scales achieved a Cronbach’s α of 0.81 and 0.78 for attitudes toward fictitious scenarios and real-life situations, respectively, indicating good internal reliability.
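Cronbach’s α for a multi-item scale, such as the four-item attitude measures, can be computed as in the sketch below; the simulated scale is purely illustrative and does not reproduce the reported α values of 0.81 and 0.78.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) matrix of scores."""
    k = items.shape[1]
    sum_item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - sum_item_var / total_var)

# Hypothetical four-item attitude scale (1-5 Likert) with a common factor
rng = np.random.default_rng(42)
latent = rng.normal(size=(1000, 1))
scale = np.clip(np.round(3 + latent + 0.6 * rng.normal(size=(1000, 4))), 1, 5)
alpha = cronbach_alpha(scale)
```

Values above roughly 0.7 are conventionally read as acceptable internal reliability, which is the standard the reported α of 0.81 and 0.78 comfortably clears.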
Given the sensitive nature of our question and the context, we expected that respondents would be much more cautious when reporting their attitudes toward the real-life situation as compared to the hypothetical situation, even though we cannot exclude the possibility of some respondents exercising similar caution toward the hypothetical. In other words, directly asking respondents about their attitudes toward either scenario may trigger self-censorship among respondents (Kuran Reference Kuran1995). Despite this limitation, our main interest was to capture potential preference falsification elicited by the attitude question toward the real-life scenario relative to the hypothetical one. As expected, we found that attitudes toward the real-life situation were, on average, 4.9% more positive than those toward the fictitious situation, across all nine conditions, suggesting a moderate degree of self-censorship.
In addition to treatment and outcome variables, we asked participants about their demographic backgrounds: age, gender, education level, ethnicity, residence location, and income. We also gathered additional covariates in the pre-treatment section, including political ideology, personality traits, and personal values. Table B3 offers an overview of the measurement and coding scheme for all variables.
Results of the Survey Experiment
Decrease in Positive Emotions Was More Pronounced than the Increase in Negative Emotions
We conducted OLS regressions to estimate the effect of each treatment condition—the specific combinations of different types of digital control instruments—on emotional responses and public attitudes. The upper panel of figure 2 shows a significant decrease in positive emotions across all eight treatment groups compared with the control condition, which does not mention any type of digital control instrument. The group with treatment “personalized both,” which was exposed to information about both personalized surveillance and personalized censorship, exhibited the strongest decrease in positive emotions among all treatment groups (b = -0.55, p < 0.001), followed by the treatment group exposed to information solely on personalized surveillance (b = -0.52, p < 0.001). In contrast, the group receiving information only on public surveillance expressed the smallest decrease in positive emotions (b = -0.28, p < 0.01).

Figure 2 Average Treatment Effect of Digital Control on Emotions and Public Attitudes
Note: The left sides of both panels show the treatment effects of four treatment conditions exposed to a single treatment stimulus that contains information on one of the four types of digital control: personalized surveillance, public surveillance, personalized censorship, or public censorship. The right-side panels show treatment effects of four treatment conditions exposed to “compound” treatment stimuli, with a combination of any two of the four types of digital control. Error bars represent 95% confidence intervals.
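The estimation strategy, regressing an outcome index on a treatment indicator so that the coefficient measures the average difference from the control group, can be sketched with simulated data. The true effect below is set to the reported b = -0.55 purely for illustration; this is not the authors’ data or model specification.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data: a binary indicator for one treatment condition (e.g.,
# "personalized both" vs. control) and a positive-emotion index outcome
n = 2000
treated = rng.integers(0, 2, size=n)
y = 3.0 - 0.55 * treated + rng.normal(scale=0.8, size=n)  # true effect: -0.55

# OLS with an intercept and a treatment dummy; beta[1] estimates the
# average treatment effect relative to the control group
X = np.column_stack([np.ones(n), treated])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

With all nine conditions, the same logic extends to eight dummies against the omitted control condition, which is how each coefficient in figure 2 is read.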
Although there was an increase in negative emotions across all treatment groups, the change in these emotions was statistically significant only in three conditions: “personalized both” (b = 0.32, p < 0.05), “personalized surveillance” (b = 0.48, p < 0.01), and “personalized surveillance & public censorship” (b = 0.32, p < 0.05). Notably, these conditions all included digital surveillance and shared the personalized characteristic of digital control. A similar pattern was found in the increase in emotions of more “neutral” valence, represented by surprise, curiosity, and indifference, in which five treatment conditions showed significant differences from the control condition: “personalized both” (b = 0.17, p < 0.01), “personalized censorship” (b = 0.13, p < 0.05), “personalized surveillance” (b = 0.16, p < 0.05), “personalized censorship & public surveillance” (b = 0.18, p < 0.01), and “public censorship” (b = 0.18, p < 0.01). Overall, when comparing the general shift in positive emotions to that in negative and neutral emotions, the decline in positive emotions was much more pronounced than the increase in both negative and neutral sentiments. Moreover, on average, treatment groups receiving compound treatment stimuli (right panel) did not exhibit significantly larger effects on emotional reactions than treatment groups with single treatment stimuli (left panel). This suggests that in compound treatment conditions, the effect of one type of digital control was likely absorbed by the other.
Moving to the discrete emotional reactions (see figures D3.1–D3.3), we found a significant decline in happiness across all treatment conditions and a significant decrease in feelings of security in five of the eight treatment conditions. By contrast, discrete negative emotions rose to varying but generally moderate extents across the treatment conditions. Helplessness, disgust, and anger were the three leading negative emotions in most treatment conditions. Among the “neutral” emotions, the increase in surprise was the most pronounced. Thus, H1, which posits that awareness of state digital control practices such as digital surveillance and digital censorship will amplify negative emotions (a) and elicit “neutral” emotions like surprise and curiosity (b), while diminishing positive emotions (c), is supported.
When we examine the effect of each treatment on public attitudes, the bottom panel of figure 2 shows that exposure to nearly all types of treatment stimuli led to a worsening of attitudes in both fictitious and real-life situations. The only exception was observed under the “personalized censorship” condition, where the coefficient for attitudes toward real-life situations was positive but statistically nonsignificant. Interestingly, in all treatment conditions, people’s attitudes became much more negative when asked questions in a hypothetical sense (i.e., “if you were Xiao Zhang”) than when asked directly about their real-life experiences, for which most treatment effects remained statistically nonsignificant. This distinction possibly suggests a certain degree of preference falsification by respondents when directly asked about their personal opinions of government policies, a typical caveat when assessing individual opinions in autocracies (Kuran Reference Kuran1995); alternatively, it may imply certain mental tricks that respondents use to rationalize the state’s digital practices (Ollier-Malaterre Reference Ollier-Malaterre2023). We return to both points later in the discussion. Drawing on this, we conclude that H2, which states that attitudes toward state digital control practices and relevant state digital policies will worsen if people are exposed to information about the intrusive and repressive nature of digital surveillance and digital censorship, is largely supported.
Mediation Effect of Emotions
We used the causal mediation analysis framework proposed by Tingley et al. (Reference Tingley, Yamamoto, Hirose, Keele and Imai2014) to examine the extent to which positive and negative emotional reactions were further associated with related attitudes.
Figure 3 presents the average causal mediation effect (ACME), the average direct effect (ADE), and the total effects. ACME represents the effect of treatments—that is, awareness of various digital control practices—on people’s attitudes toward digital control in a fictitious scenario and toward state digital policies in real-life situations that runs through emotional changes. ADE indicates the effect of treatments on attitudes independent of emotions. We found that emotional changes played a significant role in mediating all types of treatments, such that both decreasing positive emotions and increasing negative emotions had a negative effect on people’s attitudes toward state digital control in both fictitious and real-life situations. This reinforces the idea that emotions “powerfully shape how individuals view issues” (Webster and Albertson Reference Webster and Albertson2022, 403). Specifically, a decrease in positive emotions (ACME in figure 3, upper panel) had a greater impact than an increase in negative sentiment (ACME in figure 3, lower panel) on shaping, or worsening, people’s attitudes toward state digital control practices and policies in both fictitious and real-life situations. In fact, table D4 suggests that 40% to 80% of the treatment effect of awareness of digital control was mediated through the decline in positive emotions, versus only 3% to 10% through the increase in negative emotions. In other words, experiencing a loss of happiness or a lack of security can worsen public perceptions of state digital policies substantially more than experiencing an increase in negative emotions. Therefore, H3, which posits that emotional reactions aroused by awareness of the intrusive or repressive nature of state digital control practices will further worsen people’s attitudes toward state digital control and relevant policies, is supported.

Figure 3 Mediating Effect of Emotions on Public Attitudes
Note: Statistical significance is determined at a p-value of 0.05, with 95% confidence intervals represented by the error bars. The left panel (top and bottom) illustrates the mediating effect on public attitudes in a hypothetical scenario, whereas the right panel (top and bottom) demonstrates the mediating effect on public attitudes in an actual scenario.
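For linear models without a treatment–mediator interaction, the ACME estimated in the Tingley et al. framework reduces to a product of regression coefficients, and the total effect decomposes additively into ACME + ADE. The sketch below illustrates this special case with simulated data and invented coefficients; it is not the authors’ estimation code, which uses the full causal mediation framework with proper uncertainty estimates.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 5000
t = rng.integers(0, 2, size=n)                          # treatment indicator
m = -0.5 * t + rng.normal(scale=0.5, size=n)            # mediator (e.g., positive emotions)
y = 0.6 * m - 0.1 * t + rng.normal(scale=0.5, size=n)   # outcome (e.g., attitudes)

def ols(X, y):
    """OLS coefficients with an intercept prepended."""
    Z = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(Z, y, rcond=None)[0]

a = ols(t, m)[1]                            # treatment -> mediator path
b, d = ols(np.column_stack([m, t]), y)[1:]  # mediator -> outcome path, direct effect
acme = a * b                                # average causal mediation effect
ade = d                                     # average direct effect
prop_mediated = acme / (acme + ade)         # share of the total effect via the mediator
```

In this simulation the mediator carries 75% of the total effect, the same kind of "proportion mediated" quantity reported in table D4.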
Surveillance Has a Stronger Effect than Censorship
We also tested the main effects of the two digital control instruments on emotions and public attitudes. The results of a series of linear regression models using the OLS estimator (table 3) suggest that both digital surveillance and digital censorship exerted similar effects in terms of the direction of emotional and attitudinal reactions: they both led to an increase in negative and neutral emotions and a decrease in positive emotions and attitudes. However, the effect of digital surveillance was, on average, greater in magnitude than that of digital censorship, particularly on the rise of negative sentiments and the deterioration of people’s attitudes. For both reduced positive sentiment and increased neutral sentiment, the censorship effect appeared to be more pronounced than the surveillance effect. However, in these two cases, the absolute magnitude of the main effects of both treatments was likely masked by their significant interaction effect. Hence, H4, which posits that compared with digital censorship, awareness of government digital surveillance has a stronger effect on people’s emotional reactions and attitudes toward the government’s use of digital technologies, receives partial support.
Table 3 Main Effects of Digital Surveillance and Digital Censorship on Emotions and Public Attitudes

Note: Estimates of OLS regression; N = 4,057 across all models. For both treatment variables—digital surveillance and digital censorship—the baseline level is “no mention.” Column (1) presents models without interaction between the two treatments, and column (2) includes interaction terms. Individual controls include age, gender, education, income, rural residence, ethnic minority, CCP membership, overseas experiences, political ideology, VPN familiarity, personal values, and personality traits. Robust standard errors are presented in parentheses. For linear models without controls, see table D2. * p < 0.05 ** p < 0.01 *** p < 0.001.
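The point that a significant interaction can mask the absolute magnitude of main effects can be illustrated with a small simulation. All coefficients below are invented for the illustration and do not correspond to the estimates in table 3.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3000
surv = rng.integers(0, 2, size=n)   # exposed to any surveillance stimulus
cens = rng.integers(0, 2, size=n)   # exposed to any censorship stimulus

# Invented coefficients: a positive interaction partly offsets the two
# negative main effects when both treatments are present
y = (3.0 - 0.4 * surv - 0.3 * cens + 0.25 * surv * cens
     + rng.normal(scale=0.7, size=n))

def fit(X, y):
    Z = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(Z, y, rcond=None)[0]

no_int = fit(np.column_stack([surv, cens]), y)                 # column (1)-style model
with_int = fit(np.column_stack([surv, cens, surv * cens]), y)  # column (2)-style model
# Omitting the interaction pulls each estimated main effect toward zero,
# because the offsetting interaction is absorbed into the main-effect terms
```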
Levels of Intrusiveness Matter
We also compared the effect of the perceived level of intrusiveness within each type of digital control instrument. Perceiving digital control instruments in personalized or targeted terms had, on average, a more pronounced effect on emotions and attitudes than perceiving them in public or generalized terms. This difference was consistent for digital surveillance across all types of outcomes and applied to digital censorship in attitudinal responses. However, in terms of emotional responses, the impact of personalized digital censorship seemed to be less pronounced than that of public digital censorship. As such, H5, which states that awareness of government digital control in personalized terms has a stronger effect on people’s emotional reactions and attitudes, is also only partially supported.
Insights from Our Fieldwork
We drew on our 50 qualitative interviews to contextualize the results of the experiment; this analysis focused on Hypotheses 4 and 5, which concern how awareness of two distinct yet interrelated aspects of digital control shapes emotional and attitudinal responses. To summarize, we found that, on average, digital surveillance has a more extensive and stronger effect on (increasing negative) emotional reactions and (worsening) public attitudes than digital censorship (H4). This supports previous research showing that online surveillance is more threatening to freedom of expression than overt censorship (Stoycheff Reference Stoycheff2023; Stoycheff, Burgess, and Martucci Reference Stoycheff, Burgess and Martucci2020). Within each type of digital control, personalized control—perceiving oneself as the target of governmental digital control—had a greater effect on emotions and attitudes than digital control perceived in generalized or public terms, that is, as targeting society as a whole (H5). This aligns with recent studies finding that individuals in politically restricted environments are more likely to express their opinions when they or their close associates are directly affected by censorship than when the broader community is targeted (Zhu and Fu Reference Zhu and Fu2021).
A notable exception, however, is our finding that public or generalized digital censorship had a stronger effect on diminishing positive emotional reactions than public digital surveillance. This suggests that people tend to adapt more readily to government surveillance projects than to the restrictions or elimination of data and information flows. One possible explanation for this phenomenon may be that public surveillance operates more “quietly” in the background than public censorship (Deibert Reference Deibert2015; Koops Reference Koops2021). Another explanation for weaker emotional responses toward public digital surveillance can be attributed to the effect of state propaganda, which portrays the large-scale implementation of surveillance measures as a necessary and effective means to prevent terrorism and crime (Impiombato, Lau, and Gyhn Reference Impiombato, Lau and Gyhn2023). Many interviewees indicated that they barely noticed the increased number of cameras installed in public spaces or did not find it disturbing, as shown in these responses:
There’s no difference between being watched by cameras and by other people on the street. (INTPS26).
With so many people walking around, it (surveillance) doesn’t really bother me. Usually, it doesn’t feel like being watched. (INTPS47).
Nevertheless, the interview data corroborated the survey experiment results: emotional reactions and attitudes toward personalized digital control instruments were almost all negative, whereas emotional expressions related to public digital control appeared more diverse. When interviewees talked about digital surveillance in more targeted or personalized terms, their emotions were intense and negative:
What makes me feel even more disgusted is that WeChat…really, is monitoring our private conversations. (INTPS01).
[Discovering that a conversation in a private group of one of our common friends is monitored] is really shocking to me. (INTPS02).
In a similar vein, when digital censorship was presented in personalized terms, the emotional expressions were also mostly negative, but their intensity seemed to wane over time:
Actually, in the beginning I was furious because it felt like you wanted to say something but couldn’t; it felt like you were being choked. Then you start to question, and to feel things getting strange. (INTPS07)
When digital control was perceived in public terms, emotions and attitudes began to diverge. For public digital surveillance, emotions ranged from indifferent (see quotes of INTPS26 and INTPS47) to supportive:
Surveillance…in general, it is definitely better. But not at home, for example, in a place where I sleep.…You can have video records replayed if your stuff is stolen, and it is more helpful to assist the police in tracking down the criminal.… In fact, only criminals will mind. We ordinary people did not do anything wrong, why would we worry? (INTPS55)
Emotional and attitudinal responses to public censorship were more intense and spread to both ends of the spectrum:
[Censorship] is everywhere.… There are certain things you just can’t say in certain countries, like in Muslim countries, right? If you say it, you will risk your life! I feel in every region there is a certain public consensus, you just need to follow. (INTPS05)
The first time I came across a website with a 404 error, I had a mélange of quite intense emotions: indignation, confusion, and anger, also not able to understand it at all…all these factors…I even tried really hard to think about how to solve it with all possibilities. (INTPS22)
Discussion
Drawing on empirical evidence from China, our study investigated the political-psychological impact of digital control in authoritarian regimes, focusing on the important yet understudied role of emotion. We identified and established significant causal effects of awareness of the intrusive and repressive nature of state digital control on people’s emotional responses and on their attitudes toward state digital policies and practices, and we assessed the mediating role that emotional experiences play in shaping individual attitudes. We also teased out how awareness of different types of digital control instruments, and their level of perceived intrusiveness, affected emotional and attitudinal responses differently. Supporting extant literature on emotions in politics, our findings highlight the added value that a systematic analysis of emotions can contribute to a broad range of subfields in political science, beyond the traditional empirical focus on voting behavior (Lodge and Taber Reference Lodge and Taber2013; Marcus, Neuman, and MacKuen Reference Marcus, Neuman and MacKuen2000) and political campaigns (Brader Reference Brader2006) in democracies.
Focusing on daily practices of state digital control through the lens of individual citizens in a digital autocracy, our analysis sheds new light on the nonmaterial, noninstitutional dimension of modern authoritarian rule that is captured by and reflected in the individual-level emotional micro-foundation of political attitudes and actions (Dal and Nisbet Reference Dal and Nisbet2022; Dal, Nisbet, and Kamenchuk Reference Dal, Nisbet and Kamenchuk2023; Greene and Robertson Reference Greene and Robertson2022; Pearlman Reference Pearlman2013, Reference Pearlman2023; Young Reference Young2019). Our study highlights emotions as both a consequence of an increasingly intrusive digitalized governance model and a mechanism for shaping public attitudes. In this regard, it enriches the emerging literature on informational autocracy and digital authoritarianism, in which political control becomes increasingly covert, unobtrusive, and digitalized (Guriev and Treisman Reference Guriev and Treisman2019; Reference Guriev and Treisman2020; Hassan, Mattingly, and Nugent Reference Hassan, Mattingly and Nugent2022; Pearson Reference Pearson2024). Our results indicate that one of emotion’s roles is to update public attitudes toward digital repression in modern autocracies. The mediating role of emotions in interpreting and channeling revealing information in our survey experiment speaks to the earlier political-psychology literature that emphasized the importance of affective heuristics in political opinion formation and decision making (Lodge and Taber Reference Lodge and Taber2005; Marcus, Neuman, and MacKuen Reference Marcus, Neuman and MacKuen2000; Webster and Albertson Reference Webster and Albertson2022). In what follows, we engage with alternative explanations of opinion and attitude formation before assessing the implications and limitations of our findings.
Alternative Explanations
Our study finds that prompted awareness of all types of digital control practices not only elicits negative emotions but diminishes positive feelings even more strongly (H1), and that these emotional shifts in both directions are significantly related to the worsening of people’s attitudes (H3). From the perspective of liberal democracy, this is not particularly surprising. However, digital control in autocracies is often subject to fewer regulatory constraints, and its repressive nature is increasingly obscured by government propaganda that emphasizes its practical functionalities over its potential risks. Moreover, rather than publicly expressing dissent, citizens under autocracy are more inclined to hide their true feelings while publicly complying with the authorities (Kuran Reference Kuran1995).
It would naturally follow that these structural constraints would lead to the opposite results. Indeed, recent empirical studies on public opinion indicate substantial public support for digital control instruments in China (Huang, Intawan, and Nicholson Reference Huang, Intawan and Nicholson2023; Kostka Reference Kostka2019; Su, Xu, and Cao Reference Su, Xu and Cao2022; Yang Reference Yang2024; Reference Yang2025). Although one may question the genuineness of the responses—people may have been simply too afraid to openly oppose the state—it is also plausible that autocrats’ popularity proves to be somewhat genuine as a result of information manipulation (Guriev and Treisman Reference Guriev and Treisman2019; Reference Guriev and Treisman2020; Tang Reference Tang2016). The high level of support thus can be attributed to the propaganda effect of extensive, nationwide digital control programs, which have led most people to believe that technologies are only used for fighting fraud or terrorism and for maintaining social stability.
In addition to structural constraints, cognitive psychological constraints can also lead to a result that runs counter to the one we found. In her extensive fieldwork on citizens’ experiences with digital surveillance in China, Ollier-Malaterre (Reference Ollier-Malaterre2023, 223–310) discovered that people used multiple mental tricks to rationalize government digital surveillance as a defensive mechanism: they brushed aside the associated risks, persuaded themselves that they were not targets, wore “blinders” as long as nothing happened to them personally, resorted to fatalism, and simply accepted intrusions into their privacy. Yang (Reference Yang2024) argued that such self-protective psychological mechanisms are partly established through the normalization of repressive apparatuses such as censorship. In other words, people become desensitized as authoritarian control practices gain more prominence in their daily lives.
Thus, previous research offers alternative explanations that seem to contradict the findings of our study. However, rather than viewing these results as conflicting with ours, we suggest that our study complements the emerging body of literature on the political-psychological consequences of digital authoritarianism by presenting a more nuanced story. We experimentally counter the official positive framing of digital control practices and zoom in on the emotional micro-foundation of individual responses, which operates in close concert with the cognitive information processing that constantly shapes individuals’ thoughts, beliefs, and behaviors (Damasio Reference Damasio1994; Lodge and Taber Reference Lodge and Taber2013; Zajonc Reference Zajonc1980). As our findings suggest, emotional processes, though immediate and brief, offer a relatively straightforward and thus authentic avenue for studying individual motivations, reasoning, and decision making in political contexts (Bakker, Rooduijn, and Schumacher Reference Bakker, Rooduijn and Schumacher2021; McDermott Reference McDermott2004).
At a general level, we find a significant amount of “emotional rationality” (McDermott Reference McDermott2004) and evidence for an “emotional microfoundation” at play (Pearlman Reference Pearlman2013), as indicated by H1–H3. Even though our study was conducted within an authoritarian regime with seemingly high levels of political trust and public support shaped by propaganda (Huang, Intawan, and Nicholson Reference Huang, Intawan and Nicholson2023; Reference Huang, Intawan and Nicholson2024), positive emotions declined significantly when respondents became aware of the repressive nature of digital control, much more than negative emotions increased. Simultaneously, attitudes toward state digital policies and practices also deteriorated, largely mediated through diminishing positive emotions such as happiness and a sense of security. Yet only changes in attitudes toward hypothetical situations were statistically significant, not those toward real-life situations, indicating some degree of self-censorship. At a more granular level, our analysis of treatment effect heterogeneity (see appendix D.5) indicates wide variation, across all types of digital control scenarios, in people’s emotional and attitudinal responses by education level, age, overseas experience, and consumption of foreign sources, but less variation by party membership and VPN savviness. Overall, millennials, the better educated, and active consumers of foreign information were the most critical of state digital control.
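The mediation pattern summarized above—awareness lowering positive emotions, which in turn worsen attitudes—can be illustrated with a minimal product-of-coefficients sketch on simulated data. The variable names, effect sizes, and the simple linear decomposition below are illustrative assumptions for exposition, not the estimation procedure used in the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Simulated data: treatment = awareness prompt (0/1),
# mediator = positive-emotion index, outcome = attitude score.
treat = rng.integers(0, 2, n)
pos_emotion = 0.5 - 0.4 * treat + rng.normal(0, 1, n)          # prompt lowers positive emotion
attitude = 0.6 * pos_emotion - 0.1 * treat + rng.normal(0, 1, n)

def ols(predictors, y):
    """Least-squares coefficients, with an intercept prepended."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols([treat], pos_emotion)[1]                # treatment -> mediator
coefs = ols([treat, pos_emotion], attitude)
direct, b = coefs[1], coefs[2]                  # direct effect; mediator -> outcome

indirect = a * b                                # effect transmitted through emotion
total = ols([treat], attitude)[1]               # equals direct + indirect for linear OLS
print(f"indirect={indirect:.2f}, direct={direct:.2f}, total={total:.2f}")
```

In this linear setting the decomposition is exact: the total effect of the prompt on attitudes equals the direct effect plus the emotion-mediated component, which is why a shrinking positive-emotion channel shows up as a worsening of overall attitudes.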
Implications
These results have important implications for authoritarian regime stability, because research has found that positive emotions such as pride, hope, and trust play a vital role in establishing popular support for authoritarian rulers (Greene and Robertson Reference Greene and Robertson2022). As positive emotions disappear, the emotional foundation for authoritarian support may also be weakened. Moreover, modern autocracies depend on an engaged citizenry to provide feedback and thereby enable greater responsiveness (Chen and Xu Reference Chen and Xu2017). Dissatisfied and less secure individuals, even those who trust the government, are harder to mobilize (Young Reference Young2019). In an information system characterized as “porous” (Roberts Reference Roberts2018), our results point to the costs an authoritarian regime may have to pay to maintain governance transparency and efficiency alongside increasingly pervasive and intrusive digital control practices.
Nevertheless, one may doubt how long-lasting the impact of emotional changes elicited by external stimuli may be on political attitudes and behaviors. Yet if emotion is understood as fluid and contingent, even immediate, individual-level emotions can have a long-term impact by updating or reinforcing politically relevant attitudes (Demertzis Reference Demertzis, Nesbitt-Larking, Kinnvall, Capelos and Dekker2014; McDermott Reference McDermott2004). In addition, fleeting emotional experiences may be transformed into longer-lasting effects, just as affective orientations guide individual political behaviors (Pearlman Reference Pearlman2013). Both can influence political attitudes and shape opinions in profound ways. Realizing the intrusive and repressive nature of state digital control, even when it occurs randomly as in our experiment, can have an adverse impact on emotional reactions. On many occasions these reactions may vanish, but in specific circumstances, such as times of heightened social and political tension, they may also persist, fluctuate, and accumulate, accompanying and influencing the way people think and act at any stage, consciously or unconsciously. They may then translate into various forms of political expression or action, or the absence of them.
The blank-paper or A4-paper movement in China is perhaps the most illustrative example of this phenomenon. During most of the COVID-19 lockdown, people complied with the state’s restrictive policies. Then lockdown fatigue took hold, and compliance was replaced by anger and opposition. Citizens in several major Chinese cities went to the streets to protest the government’s restrictive policies, despite strict digital surveillance and prevalent digital censorship that people had to deal with every day (Thornton Reference Thornton2023). Similarly, during the Arab Spring, emotional experiences of anger, joy, and pride, rather than strategic calculation or ideology, emboldened the people in Egypt to engage in resistance (Pearlman Reference Pearlman2013).
Limitations of this Study
Unlike in open democratic settings, measuring opinions about sensitive issues in politically closed contexts through online surveys, like the one in our study, faces fundamental challenges, among which the issue of response validity deserves critical reflection. Although the factorial experiment design allowed us to isolate and identify the causal effects of awareness of digital control on individual emotional and attitudinal responses, gauging these responses through direct questioning, as implemented in our survey, may suffer from sensitivity bias. Framing the attitudinal questions about government digital control practices in a hypothetical context tied to the fictive persona in the experimental treatment was designed to mitigate sensitivity by shifting the subject of the attitudes away from the respondents. However, even in a hypothetical scenario, direct questioning may still elicit wariness among some respondents, generating bias in their responses. Ultimately, we cannot entirely rule out the possibility of self-censorship or, worse, preference falsification (Ahram and Goode Reference Ahram and Goode2016; Kuran Reference Kuran1995; Robinson and Tannenberg Reference Robinson and Tannenberg2019). By comparing the differences between the two sets of attitudinal questions—one framed in a hypothetical context and the other in a real-life setting—we were able to partially capture the magnitude of the potential bias. Our results show that public attitudes toward digital control practices measured in the real-life context are, on average, 4.9% more positive than those measured in hypothetical scenarios across all treatment conditions, suggesting a moderate degree of self-censorship in the responses. However, with our current research design, we were unable to estimate the absolute level of bias caused by sensitivity.
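The comparison described here amounts to a simple relative difference in average attitude scores between the two question framings. The following sketch computes such a gap on simulated responses; the 5-point scale, sample size, and upward shift are hypothetical stand-ins, not our actual instrument or data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated 5-point attitude scores from the same respondents:
# one item framed hypothetically, one framed in a real-life setting.
hypothetical = rng.integers(1, 6, 500).astype(float)
# Assume a mild upward (more positive) shift for real-life framing,
# bounded by the top of the scale.
real_life = np.clip(hypothetical + rng.binomial(1, 0.25, 500), 1, 5)

# Average relative gap: how much more positive real-life answers are.
gap = (real_life.mean() - hypothetical.mean()) / hypothetical.mean()
print(f"real-life answers are {gap:.1%} more positive on average")
```

A positive gap of this kind is consistent with respondents answering more cautiously, that is, more favorably toward the state, when the question refers to their own real-life situation rather than a fictive persona.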
The literature using indirect questioning techniques such as list experiments to estimate sensitivity bias suggests overreporting of regime support or trust in government in the range of 4% to 28% (Li, Shi, and Zhu Reference Li, Shi and Zhu2018; Nicholson and Huang Reference Nicholson and Huang2023; Robinson and Tannenberg Reference Robinson and Tannenberg2019; Tang Reference Tang2016). Given this, the treatment effects in our study are likely underestimated. In other words, in the absence of self-censorship, we would have expected the impact of knowing about digital control on emotions and attitudes to be even stronger. To reduce sensitivity bias, future research should consider using indirect questioning techniques designed for sensitive topics, such as the list experiment, the endorsement experiment, and randomized response techniques (Blair, Coppock, and Moor Reference Blair, Coppock and Moor2020; Rosenfeld, Imai, and Shapiro Reference Rosenfeld, Imai and Shapiro2016).Footnote 8
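To illustrate the logic of the indirect techniques mentioned here, the sketch below implements the standard difference-in-means estimator for a list experiment on simulated data. The item counts, the sensitive item, and the 30% prevalence are invented for illustration only:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000

# Each respondent reports only HOW MANY items on a list apply to them.
# The treatment group's list adds one sensitive item (e.g., a critical
# attitude toward digital control); the control group sees only the
# three innocuous items.
baseline_items = rng.binomial(3, 0.5, n)        # innocuous-item count
holds_sensitive = rng.binomial(1, 0.30, n)      # true prevalence: 30%
treated = rng.integers(0, 2, n).astype(bool)

counts = baseline_items + np.where(treated, holds_sensitive, 0)

# The difference in mean counts estimates the sensitive item's
# prevalence without any individual respondent revealing it.
estimate = counts[treated].mean() - counts[~treated].mean()
print(f"estimated prevalence of the sensitive attitude: {estimate:.2f}")
```

Because no single answer discloses whether the sensitive item applies, respondents face weaker incentives to falsify, which is why such designs can benchmark the bias in direct questioning.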
Relatedly, the effectiveness of using an anonymous, self-administered online questionnaire in reducing sensitivity bias may be limited in authoritarian regimes that engage in extensive surveillance of online activities (Li, Shi, and Zhu Reference Li, Shi and Zhu2018). The validity of our results may be further compromised by responses fabricated with generative AI, survey fraud, and professional survey takers. Although we closely monitored our data quality with automated and manual scrutiny (see appendix A.1), we are aware that there is no silver bullet that fully resolves the issue, which may pose a threat to our results.
Interpreting the findings of our study thus warrants extra caution. We urge prudence in collecting, handling, and interpreting online opinion data in politically closed contexts, and we recommend enhancing their robustness through cross-checking techniques such as triangulation, as we did in our interview analyses. Nevertheless, sensitivity bias may also be present during in-person interviews. Following Ahram and Goode (Reference Ahram and Goode2016) and Ollier-Malaterre et al. (Reference Ollier-Malaterre, Szwajnoch, Trauth-Goik, Bernot, Liang and Poon2025), we provide a detailed reflexive note in appendix A.3.
Moreover, our one-shot experimental design has inherent limitations in exploring the long-term impact of elicited emotions on people’s attitudes. Although our results suggest that emotions can effectively translate into worsening attitudes toward digital control, and many studies argue for the profound impact of immediate emotional experiences on political choices and actions, our study does not provide empirical evidence that emotions shape attitudes in the long term. This is an important aspect for future research.
Conclusion
Over the past three decades, the Chinese state has systematically expanded its nationwide digital surveillance and censorship programs (Roberts Reference Roberts2018; Xu Reference Xu2021). A prominent feature of these digital control instruments is their unobtrusiveness: their operation often goes unnoticed by the public. When digital control does attract public attention, it is often justified as a necessary means to maintain public security and social order. This presents an empirical challenge to studying how individuals living under increasing digital control respond to it. Assumptions built on different cognitive behavioral theories offer various, sometimes contradictory, predictions on how such individuals will think and act. Leveraging a survey experiment and drawing on in-depth interviews, our focus on the role of emotions sheds light on the affective aspect of the formation of and change in public attitudes under emerging digital authoritarianism; it thereby enriches the existing literature on the cognitive dimension (Dal, Nisbet, and Kamenchuk Reference Dal, Nisbet and Kamenchuk2023; Huang, Intawan, and Nicholson Reference Huang, Intawan and Nicholson2023; Kostka Reference Kostka2019; Ollier-Malaterre Reference Ollier-Malaterre2023; Su, Xu, and Cao Reference Su, Xu and Cao2022).Footnote 9 Our findings suggest that when exposed to information revealing the intrusive nature of state-led digital control practices, citizens exhibit emotional reactions and subsequent attitudes that do not differ from those of their counterparts in an open society: both experience a decrease in positive emotions and a rise in negative ones, and these emotions further worsen public attitudes toward government digital practices.
Although our empirical case focuses on China, the results have implications for studying the political-psychological dimensions of digital repression, information control, and regime legitimacy in other modern autocracies and regimes facing autocratization. As new generations of digital technologies are rapidly incorporated into the governance toolkit by governments around the world, the risks of misusing or abusing these digital tools also increase. Worldwide, informational autocracy has become a trend (Guriev and Treisman Reference Guriev and Treisman2019), digital authoritarianism is on the rise (Pearson Reference Pearson2024), and digital repression is no longer unique to nondemocracies (Earl, Maher, and Pan Reference Earl, Maher and Pan2022). Our study highlights the potential unintended consequences resulting from the awareness of digital control, which first unfold in people’s emotions. Future research should explore the variation in emotional experiences across countries, cultures, and political regimes.
Supplementary material
To view supplementary material for this article, please visit http://doi.org/10.1017/S1537592725103551.
Data replication
Data replication sets are available in Harvard Dataverse at: https://doi.org/10.7910/DVN/SQRZR8.
Acknowledgments
We would like to thank the journal’s editors and reviewers for their constructive feedback. We also extend our gratitude to our colleagues Anton Bogs, Ming Ma, Hao Zhang, and Xiao Ma for their helpful comments on the earlier drafts of this article, as well as to the participants in APSA 2023; the 2023 Volkswagen scoping workshop, “The Digital Reach of the Chinese State: Emerging Research Fields,” in Hanover; the 2024 “Comparative Studies of Digital Repression” workshop at the Hertie School in Berlin; and the online lecture series “Digital Governance in China” of the Berlin Contemporary China Network held in 2023–24. Last, but not least, we are deeply grateful for those taking part in our survey and fieldwork, who made this research possible and from whom we learned a lot.
We received funding support from the European Research Council, Starting Grant No. 852169.

