
Part III - Policy Responses

Published online by Cambridge University Press:  19 December 2025

Scott J. Shackelford, Indiana University, Bloomington
Frédérick Douzet, Paris 8 University
Christopher Ankersen, New York University

Information

Type: Chapter
In: Securing Democracies: Defending Against Cyber Attacks and Disinformation in the Digital Age, pp. 319–370
Publisher: Cambridge University Press
Print publication year: 2026
This content is Open Access and distributed under the terms of the Creative Commons Attribution licence CC BY-NC-ND 4.0 (https://creativecommons.org/cclicenses/).


14 Some Truths About Lies: Misinformation and Inequality in Policymaking and Politics

How do the dual trends of increased misinformation in politics and increased socioeconomic inequality contribute to an erosion of trust and confidence in democratic institutions? In an era of massive misinformation, voters bear the burden of separating truth from lies as they determine where they stand on important issues and which candidates to support. When candidates engage in misinformation, it uncouples the already weak link among vote intentions, candidate choice, and policy outputs. At the same time, high levels of economic inequality and social stratification may contribute to lower levels of institutional trust, and the correspondingly more insular socioeconomic groups may experience misinformation differently. Social policy, as a policy area intentionally designed to alleviate risk and redistribute resources, thus becomes a special case where the effects of misinformation and socioeconomic inequality may be cross-cutting and heightened.

We examine the potential for this amplification by drawing on the Institutional Analysis and Development (IAD) framework. This framework, a product of policy analysis and public sector performance evaluation scholarship undertaken at the Ostrom Workshop, offers a metatheoretical approach to understanding diverse institutions in which the normative value of democratic self-governance is paramount. Information and, more generally, language are among the core features of the framework, as are individual decision-makers’ (officials and ordinary participants alike) capacities (e.g., legal, social, economic, positions, and capabilities) and the distribution of outcomes (e.g., costs and benefits) from the decisions under study. While the framework makes no assumptions about the veracity of information in use or the homogeneity of capacities or equality of disbursed costs or benefits, the normative goals of democratic self-governance emphasize the necessity of warrantable information claims and the adverse effects of fixed inequalities, whether concerning decision-making authority or distributions of a policy’s costs and benefits. Regarding authority, fixed inequalities suggest an unlimited capacity to dominate; fixed inequalities in the distribution of costs or benefits may indicate normative harms – injustice – and policy failures.

We start by examining the place of information and inequality in the IAD framework. We then discuss the differences between policy framing and more aggressive forms of misinformation and disinformation, and how they relate to trust. Next, we review the negative relationship between socioeconomic inequality and trust, identifying a number of potential mechanisms linking the two. We conclude by suggesting that mis- and disinformation exacerbate the challenges of democratic self-government not only on their own but also through their interactions with inequality and (dis)trust.

IAD Framework: Information and Inequality

In 1982, Larry Kiser and Elinor Ostrom described what they called a “metatheoretical synthesis of institutional approaches.” Their picture of “three worlds of action,” the arenas of operational, collective, and constitutional choices, offered a foundational frame for understanding these distinct levels of analysis, each characterized by a set of “working parts” generally found in any theoretical explanation: attributes of (1) individual decision-makers (e.g., participants of varying positions and associated capacities to act in a given situation); (2) events (e.g., goods, services, or collective acts to be produced or consumed); (3) communities affected by the decision; (4) institutions (e.g., rules and norms) guiding individual choices; and (5) the decision situation in which choices are made (Kiser & Ostrom, 1982, p. 58). The authors suggested that these broad domains of the kinds of variables contained in institutional theories could be used to explain two main phenomena: the actions and strategies of decision-makers and the aggregated results of their activities. The picture of cause and effect was “meta” in that any theory might be described in this way, since the framework covered the general elements found in any and all specific theories of diverse institutions. In the same year, Vincent Ostrom wrote “Institutional Analysis, Policy Analysis, and Performance Evaluation” as an extension of his earlier work on the normative contexts and limitations of institutions. These working papers became the basis of the IAD framework, a widely used approach to policy analysis.

Information and the IAD Framework

Vincent Ostrom’s scholarship, particularly, offered a critique and correction to traditional policy analysis through an institutional analysis that turned to constitutional choice and the broader problem of language, itself understood as an institution, as a basis for rule-ordered relations. Indeed, Kiser and Ostrom (1982) started their discussion of a meta-framework of institutional analysis by amplifying Vincent Ostrom’s (1982) insights about the uncertainties produced by our reliance on language to create and implement policies that are mere instructions, often with little guidance or metrics for goal achievement.

The ever-present problem of imprecise language is one among many communication failures at the heart of social dilemmas. Language, as Vincent Ostrom pointed out, using the terminology of John Searle (1969), is an “institutional fact.” Language depends not only on shared understandings but also on the choices we make in constituting and using (humanly created) rules for articulating our individual experiences. The creation of shared understanding presumes a common desire to come to mutually beneficial conceptions of cause and effect. While language and knowledge may be imperfect, the common aim of fallible humans presumably involves a choice to pursue more, rather than less, accurate representations of “brute facts” – the facts of our physical existence. At every step in the effort to articulate what an individual makes of experiencing “brute facts,” and what the audience takes from this articulation, lie the many traps of misunderstanding, ranging from mere mistakes to intentionally false representations. Nevertheless, the assumption made in the IAD approach to policy analysis is that a self-governing people have a stake in correcting errors rather than amplifying the consequences of their mistakes.

This assumption is neither naive nor ignorant of the fact that individuals may gain advantage not only by acceptable strategies of contextualizing information – framing – but also by misrepresentation. In appraising the vulnerabilities of democracy, V. Ostrom (1997) detailed the uses of Orwellian rhetorical strategies of “doublespeak” and deceptive “glittering generalities” (V. Ostrom, 1982), staples in political discourses. The many studies of self-managed commons (e.g., E. Ostrom, 1990) document the capacities of communities to come to a shared understanding in facing collective choice dilemmas – and the consequences of failing to do so. Effective, self-managed commons generally use highly participatory, egalitarian strategies for reaching these understandings. Democratic self-governance requires striving for accurate representations of cause and effect. Or, put another way, assuming the opposite – that we strive to dissemble and deceive – is a nonstarter for self-governance. By assuming fallibility and a capacity for learning, the question becomes one of the institutional arrangements that promote a drive for accuracy, including corrections to misinformation, and policies that foster trust in constitutional and collective choices and outcomes.

In the IAD framework, information, as well as its availability and accuracy, reflects the foundational dilemmas posed by human fallibility, particularly our limitations when it comes to producing shared understandings, given the institutional fact of language. The IAD framework sets out information as a feature of policy decisions in two ways. First, the constraints bounding and motivating choice and action focus on the characteristics of an action situation governed by eight types of rules, including “information rules, that specify channels of communication among actors and what information must, may, or must not be shared” (E. Ostrom, 2010, p. 810). Second, information about the decision at hand, the participants and their capacities, the expected effects of a decision, and the measurement and evaluation of these effects influences actions in the arena of choice designated as the “action situation,” which is the core of the IAD framework.

Prior to these worlds of action is a world of ideas; Vincent Ostrom argued that constitutional choice is preceded by the public articulation of values and beliefs that may be called “epistemic choice” (V. Ostrom, 1993). These shared understandings include, for example, an agreed-upon framework of inquiry; orientations to our ways of evaluating choices and their effects; values such as respect, reciprocity, and fairness; and, germane to this chapter, beliefs about the degree of equality, inequality, liberty, and so forth. Misinformation, disinformation, and inequality affect not only our desire to coordinate, cooperate – or even compete – in a shared Action Arena but also our willingness to continue to believe in shared standards of value, or even conceptions of biophysical existence.

The concept of an action situation allows researchers to identify the variables generic to policymaking about any type of good or service. Empirical investigations using the framework revealed design principles conducive to well-functioning, sustainable self-managed communities (E. Ostrom, 1990). More broadly, the framework can help us compare institutional arrangements, with a focus on a diversity of variables (Kiser & Ostrom, 1982; E. Ostrom, 2010). In the generic framework, “the amount of information available to a participant” is among the relevant structural elements of the action situation. While not explicitly stated, the term “amount” necessarily exceeds “quantity,” covering the quality, scope, scale, and accuracy of information as well.

Likewise, the framework does not assume perfect information, but it necessarily assumes a desire to improve the accuracy of information. The motivation to improve (minimally, to abandon behaviors that do not produce desirable outcomes) can be seen in Ostrom’s (2005) discussion of the action situation:

Within a particular situation, individuals can attempt to choose only in light of their beliefs about the opportunities and constraints of that situation. In an open society, individuals may be able eventually to affect the structure of action situations in which they repeatedly find themselves by changing the rule configurations affecting the structure of these situations. To do so, they move to deeper analytical levels (collective-choice or constitutional-choice action situations) where the outcomes generated are changes in the rules of other action situations.

(p. 33)

We underscore the variability that language and, more generally, ideation bring to the framework and the implications for taking action to move to “deeper analytical levels” to effect beneficial changes. The framework assumes that an open society can pursue accurate information, and it requires that we ask about the motivation and effectiveness of that pursuit. Framing, misinformation, and disinformation emerge as specific concerns in treating information in the IAD framework. Differences of many kinds may lead individuals to exist in diverse circumstances, but the basis of collective action assumes capacities to enter arenas of action characterized by at least a minimal degree of common ground. In a context of disinformation, heterogeneities, including inequalities, may inhibit the comity required for constitutional and collective choice and action and, more fundamentally, the epistemic choices preceding the three arenas of action.

Inequality and the IAD Framework

Research has focused on the heterogeneity of motives and resources in collective action situations as these diversities characterize the positions and consequent actions of decision-makers in the IAD framework. Early theoretical work considered the potential for highly motivated individuals with greater resources to provide common goods, even if they contributed more than those with fewer resources (Olson, 1965). Heterogeneity often characterizes the endowments of resource users, yielding several types of inequalities, including social, political, and economic inequalities (Andersson & Agrawal, 2006). Most of the research engaging the IAD framework and heterogeneities of actors concerns the governance of common-pool resources. In studies of collective actions to secure natural resources (or produce their protection through allocation decisions), there is some support for the hypothesis that participants with greater resources may be willing to contribute a larger share of the initial costs of a management regime in exchange for a greater share of the benefits (Baland & Platteau, 1999). More generally, economic and status inequalities have also been associated with unequal political power (Neupane, 2003), lower levels of trust in decision-making methods as well as the decisions themselves (Ruttan, 2006), lower levels of commitment, and decreased capacities for monitoring implemented policies (Schlager & Blomquist, 1998), all of which undermine cooperation and collective action.
Experiments in common-pool resource management show that communication is vital to producing efficient resource use, but heterogeneities among actors (e.g., differences in resources enabling contributions to collective action) impede self-governance (Hackett, Schlager, & Walker, 1994); inequalities likewise impede communication across groups, often a necessary condition for collective action (E. Ostrom, 2005).

Heterogeneities are ubiquitous and may be found in each element of the IAD framework, including rules governing contributions (who pays) and allocation rules (who benefits). In addition to helping researchers create a taxonomy of solutions to commons dilemmas created by myriad heterogeneities, the IAD framework alerts us to information asymmetries (including the effects of misinformation) and their relationship to economic, social, and political inequalities. Heterogeneities, asymmetries in information, and limited communication also impede trust among groups of different statuses. Related to the emphasis on open communication and warrantable claims is the overall necessity of credible commitments that lead to cooperative actions over several iterations. As people see the outcome of their choices and know the choices and actions of others, trust becomes increasingly possible. It turns out that trust is a major component in surmounting the obstacles that heterogeneities raise (e.g., E. Ostrom, 2005).

Information Framing, Misinformation, Disinformation, and Trust

Information “framing” most generally means putting information into a context and has long been accepted as a legitimate rhetorical strategy: Credible contextualization and analysis are staples of journalism. Aristotle’s “rhetorical triangle” of pathos (frames engaging belief, value, experience, imagination, and feelings), logos (appeals to reason and logic), and ethos (appeals using authoritative, credible sources) marks normatively acceptable strategies of persuasive speech. Framing and appeals to feelings and imagination may be acceptable when linked to the other sides of the metaphorical triangle. But framing, in this logic, has generally referred to accurate, truthful information. Disinformation and misinformation bring us to a different place in the deployment of affective rhetorical strategies.

Framing and Framing Effects

Over the fifty years of its evolution as a concept crossing a broad array of scientific disciplines, “framing” has come to indicate many different, albeit occasionally related, phenomena. In communications theory, political psychology, media studies, and related disciplines, the term “framing” has been used to indicate ways of organizing information and social realities (Cowart, Blackstone, & Riley, 2022; Goffman, 1974), types of narrative and their effects (Bennett, 2016; Iyengar, 1991; Iyengar & Kinder, 1987), and efforts or means of influencing opinion and choices by directing attention toward some consequences, risks, benefits, or values and away from other plausible concerns (Chong, 2000; Chong & Druckman, 2007a; Druckman, 2001, 2004; Tversky & Kahneman, 1981).

Framing effects occur when “logically equivalent (but not transparently equivalent) statements of a problem lead decision makers to choose different options” (Rabin, 1998, p. 36). In initial research on framing effects, for example, Tversky and Kahneman (1981, 1986) showed that framing the expected consequences of a choice in terms of a chance of loss rather than the logically equivalent chance of gain caused participants in their experiments to take the riskier of two options; framed as a chance for gain, the safer bet was the dominant choice. In political communication, “comparability” rather than “logical equivalence” may better describe the various considerations influencing choices (Chong & Druckman, 2007b): Framing effects occur when the contextualization of information elevates some considerations – including principles and values – over others. For example, Druckman (2001) finds either of two valued principles, freedom or security, may be highlighted by framing and used as the basis of a policy choice. Experiments on the effects of news narrative styles show how framing influences attributions of responsibility for public problems as either the fault of the government or of the person who is featured as an example of the social concern (Iyengar, 1991; Iyengar & Kinder, 1987); news broadcasters overwhelmingly hew to the latter narrative style (Allen et al., 2023; Stevens et al., 2006).
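The logical equivalence at the heart of these framing effects can be checked arithmetically. The sketch below (our illustration of Tversky and Kahneman’s classic gain/loss vignette, in which a disease is expected to kill 600 people, not the authors’ own materials) confirms that the gain-framed and loss-framed options describe identical lotteries:

```python
# Sketch of the logical equivalence behind Tversky and Kahneman's (1981)
# gain/loss framing experiment: a disease expected to kill 600 people.
from fractions import Fraction  # exact arithmetic, avoids 1/3 rounding

def expected_lives_saved(lottery):
    """Expected value over (probability, lives saved) pairs."""
    return sum(p * saved for p, saved in lottery)

# Gain frame: "200 will be saved" vs. "1/3 chance that all 600 are saved."
sure_gain = [(Fraction(1), 200)]
risky_gain = [(Fraction(1, 3), 600), (Fraction(2, 3), 0)]

# Loss frame: "400 will die" vs. "2/3 chance that all 600 die,"
# restated here as lives saved out of 600 to expose the equivalence.
sure_loss = [(Fraction(1), 600 - 400)]
risky_loss = [(Fraction(1, 3), 600 - 0), (Fraction(2, 3), 600 - 600)]

# Every option saves 200 lives in expectation, yet subjects favored the
# risky option under the loss frame and the sure option under the gain frame.
assert expected_lives_saved(sure_gain) == expected_lives_saved(sure_loss) == 200
assert expected_lives_saved(risky_gain) == expected_lives_saved(risky_loss) == 200
```

The framing effect, then, cannot be attributed to any difference in the options themselves; only the wording of the context differs.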

Framing also makes an appeal to our emotional intelligence, but absent logic or source credibility, the resulting appeals cross a line into propaganda and may be perceived as unsound. Indeed, weak frames (generally those lacking value or logical relevance) demonstrably fail to influence decisions and in some cases increase the efficacy of strong frames (Chong & Druckman, 2007a). Source credibility can constrain the use of frames to manipulate public opinion (Druckman, 2001), but the credibility of the source matters most when message recipients lack either prior knowledge or preexisting attitudes toward the message topic (Kumkale, Albarracín, & Seignourel, 2010). Early work in framing effects broadly assumed the veracity of the information to be framed and investigated the effects of placing truthful information into various contexts. In an online context, “message credibility” is a result of evaluations of source credibility, the message frame, and social endorsement (Borah & Xiao, 2018). This research suggests that the veracity of information is perhaps only one of many factors influencing perceptions of message credibility, although truthfulness remains a critical dimension in distinguishing framing from misinformation and disinformation, as well as in our analysis of the effects of each rhetorical style.

Misinformation and Disinformation Definitions and Dimensions

It might seem that the definitions of misinformation and disinformation are self-evident, but our excursion into the meaning and effects of framing signals that matters are more complex. We start our discussion of information by focusing on the well-researched topic of news representations and the conception of “fake news,” suggesting that much of the theorizing in this area of communication studies can be applied to discussion of “information” more broadly. Fake news is a term that has recently been appropriated by office holders in the United States to mean simply information with which the official disagrees. Here, we consider the attributes described by Newman et al. (2017, p. 20), writing for the annual Reuters Digital News Report: information invented for the purpose of discrediting others or for monetary gain (e.g., “clickbait”), or news with a basis in fact that is spun to fit a particular agenda. Martens et al. (2018) discuss information that is suspect because of its source, its dissemination method, and its intention, while the European Union (EU) Commission for Communications Networks, Content and Technology (2018) similarly describes fake news as disinformation: “false, inaccurate, or misleading information that is designed, presented, and promoted” for profit or to cause intentional public harm (p. 3). In contrast to the characteristics of fake news, Newman et al. (2017) measure consumer perceptions of four qualities sought in online news: information of high accuracy and reliability, enabling better understanding of complex issues, communicating strong viewpoints and opinions, and providing entertaining content.

Unpacking these definitions reveals several dimensions to be used in evaluating information, with two that stand out in much of the literature: purpose and truthfulness. Wardle and Derakhshan (2017) focus on these dimensions to define three terms: misinformation as false information where no harm is meant; disinformation as false information knowingly shared to cause harm; and mal-information as factual information shared to cause harm (e.g., doxing, disclosing private information in public forums to enable harm). The purpose of distributing the information is central to these definitions, in which intentions to deceive, harm, or profit signal disinformation, exempting errors (which news outlets make and correct), satire, parody, and partisan commentary from this category. The idea of combining mistakes, parody, and partisan news spin in one category deserves greater scrutiny.
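The two dimensions Wardle and Derakhshan cross – truthfulness and intent to harm – can be sketched as a simple classifier. This is a minimal illustration of the taxonomy as we read it, not the authors’ own formalism:

```python
# Minimal sketch of Wardle and Derakhshan's (2017) two-dimensional
# taxonomy of information disorder: falsity crossed with intent to harm.
def classify(is_false: bool, intends_harm: bool) -> str:
    if is_false and not intends_harm:
        return "misinformation"   # false information, but no harm is meant
    if is_false and intends_harm:
        return "disinformation"   # false information knowingly shared to harm
    if intends_harm:
        return "mal-information"  # factual information shared to harm (e.g., doxing)
    return "information"          # factual and benign: outside the taxonomy

assert classify(is_false=True, intends_harm=False) == "misinformation"
assert classify(is_false=True, intends_harm=True) == "disinformation"
assert classify(is_false=False, intends_harm=True) == "mal-information"
```

Laying the categories out this way makes the chapter’s concern concrete: errors, satire, and partisan spin all land in the broad “misinformation” cell despite their very different purposes.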

The broader definition of misinformation allows for framing, “spin,” and ideological slant as a matter of consumer trust in a news brand. The Reuters definition of quality news insists that news outlets meet consumers’ expectations of a given editorial slant and partisan perspective, framing, and filtering of information in reporting. Filtering and slanting news to establish a commercial position in a news market is expected, and consumers want to be presented with the topics, types, slants, and agendas – partisan, ideological, and so on – that they intended to buy when tuning in or clicking on (Martens et al., 2018). Following this broad view of news products, we can imagine misinformation and disinformation on these continua (see Figure 14.1).

Figure 14.1 Misinformation continua – purpose and accuracy. (Two classification charts: types of misinformation – slanting/filtering and verifiably false – and purposes – editorial alignment with ideology and normative expectations, satire and humor, and harm.)

In contrast, the narrow view confines the definition of misinformation to the misrepresentation of facts, such as omissions and additions that mislead, and outright lies. Recent work on US voters’ responses to political advertising (Allen & Stevens, 2019; Stevens et al., 2008) and news coverage of elections (Allen et al., 2007) shows that inaccurate claims in political advertising, along with news framing, influence voter attitudes toward candidates and voter participation in elections (Stevens et al., 2006). Inaccurate political advertising claims not only result in inaccurate understandings of candidates’ issue positions but also lead to more general confusion about policies, candidates’ partisanship and ideologies, and the issues that are targets of the false claims (Allen & Stevens, 2019). Distrust of politics and news media increases the likelihood that news consumers will believe online mis- or disinformation (Zimmermann & Kohring, 2020).

Critics of fact-checking (e.g., Park et al., 2021) show how evidence that is open to interpretation, or claims that seem to elude evidentiary evaluation, encourage information spin on the part of fact-checkers themselves. Building on research studying partisan perceptions of ad claim accuracy and fairness (Allen & Stevens, 2015; Allen et al., 2016; Stevens et al., 2008), Allen and Stevens (2019) offered two measures of claim accuracy: an absolute measure of the accuracy of each ad claim (accurate or not) and an evaluation of the degree of accuracy, specifying truthfulness on a scale ranging from a wholly accurate representation of facts, through a majority-accurate representation of facts and/or inferences drawn logically from facts, mildly misleading claims with a factual basis, omissions and misleading representations of facts, and major distortions with little clear basis in fact, to gross untruths (i.e., lies).

Efforts to combat lies in political communication do not correct misinformation (Allen et al., 2016; Allen & Stevens, 2019; Nyhan & Reifler, 2010) and may in fact cause decision-makers to double down and hew to errant information that comports with a preexisting bias (Nyhan & Reifler, 2010). In the United States, motivated information processing driven mainly by partisanship led significantly more Republicans than Democrats to reject reports of independent fact-checkers stating that the advertising claims of their favored candidate were false (Allen et al., 2016; Stevens et al., 2008). It may be that these partisan differences are associated with diminished trust in news media, particularly fact-checkers, in an increasingly polarized political climate.

Trust in News Media, Institutions, and Support for Democracy

Trust in news media, worldwide, has indeed declined from 2015 to 2023 (Newman et al., 2023). In the United States, trust in news media declined significantly between 2018 and 2022 and, while recovering to 2015 levels, remains in the bottom half of forty-six democratic countries whose citizens were surveyed in YouGov polling. Worldwide, around 40 percent of respondents say they “trust most news most of the time” (p. 10). Social media information sources are viewed as generally less trustworthy than traditional news platforms; news consumers have grown increasingly skeptical of algorithm-based information selection. Although news consumers generally trust social media less than traditional news media as an information source, trust in social media has been gaining on traditional news media, especially among younger generations (Liedke & Gottfried, 2022). Data from this period document the rising anxiety caused by misinformation (Newman & Fletcher, 2017), a trend that has continued along with record-high distrust of algorithm-driven news in the post-pandemic media environment (Newman et al., 2023). While news consumers say they value reliable branding (e.g., ideological orientations) in news analysis, “bias, spin, and agendas” are the main causes of distrust in news media (Newman & Fletcher, 2017, p. 5).

Distrust of social media as an information source appears to influence attitudes toward news media in general. In a twenty-six-country survey conducted between 2016 and 2019, Park et al. (2020) show that increased access to news through social media is linked to a decline in trust in media. Other spillover effects of social media use are also well documented, with evidence that internet use increases voter uncertainty (Sudulich, Wall, & Baccini, 2015) and partisan polarization (Bail et al., 2018; Newman et al., 2023), as well as eroding trust in institutions and politics (Kiratli, 2023).

Recent work indicates that support for democratic norms and prodemocracy candidates for office is in decline in Europe and the United States (Bartels, Reference Bartels2020; Claassen & Magalhães, Reference Claassen and Magalhães2023; Drutman, Goldman, & Diamond, Reference Drutman, Goldman and Diamond2020; Foa & Mounk, Reference Foa and Mounk2017). In the United States, this erosion of support for democracy is shown in a greater than 20 percent drop in those agreeing that democracy “is better than any other form of government,” over the thirteen years from 2006 to 2019, and a decline from 75 percent to 62 percent between 1995 and 2017 of those rejecting a political system headed by a “strong leader who does not have to bother with Congress and elections” (Claassen & Magalhães, Reference Claassen and Magalhães2023). As survey research shows, support for democracy appears to be in decline, while electoral support for illiberal candidates for office appears on the assent (e.g., Cohen et al., Reference Cohen, Smith, Moseley and Layton2022), not, perhaps, as a move against democratic norms, per se, but simply because voters’ policy preferences converge with those stated by antidemocratic candidates (Lewandowsky & Jankowski, Reference Lewandowsky and Jankowski2022). In experimental research (Lewandowsky & Jankowski, Reference Lewandowsky and Jankowski2022) and survey research (Rydgren, Reference Rydgren2008) in Germany, these preferences converged on the topic of immigration, an issue that has been the focus of considerable disinformation on social media (Newman & Fletcher, Reference Newman and Fletcher2017). 
In the United States, partisanship is associated with differential support for democratic norms: Republicans show a greater belief than Democrats in the illegitimacy of the 2020 election results, and a higher percentage of Republicans than Democrats support the former president refusing to vacate the office based on these claims (Drutman, Goldman, & Diamond, Reference Drutman, Goldman and Diamond2020).

The weakening attachment to democratic norms and institutions is strongest among younger citizens (e.g., Foa & Mounk, Reference Foa and Mounk2017). Research suggests this trend is not a function of life cycle effects in which youthfulness accounts for a cohort’s skepticism of democracy, with the expectation that attachments to this form of government grow with age and experience. Rather, younger generations seem less inclined to support democratic norms, institutions, or candidates (Claassen & Magalhães, Reference Claassen and Magalhães2023). In the United States, this decline occurs across each age cohort since World War II.

Younger cohorts are also more likely to avoid news altogether (Eddy, Reference Eddy2022). Reasons for avoiding news consumption include not only disinterest but also concerns for mental well-being and the avoidance of caustic, divisive political debates and of perspectives with which the consumer disagrees (Newman et al., Reference Newman, Fletcher, Kalogeropoulos, Levy and Nielsen2023). News avoidance also follows partisan preferences. In the United States, 70 percent of right-leaning news consumers avoid social justice news and 64 percent avoid climate change and environmental news, for example, while avoidance of these topics among left-leaning news media users is 22 percent and 12 percent, respectively. Reported avoidance of crime and personal security news is 30 percent among left-leaning consumers, compared to 14 percent of right-leaning news users, and 25 percent of left-leaning respondents report avoiding news of business, financial, and economic activities, compared to 9 percent of right-leaning respondents (Newman et al., Reference Newman, Fletcher, Kalogeropoulos, Levy and Nielsen2023). The online media landscape has also changed news consumption patterns, particularly in younger age cohorts. Instead of accessing news through platforms such as Facebook or Twitter (now known as X), where mainstream journalists lead news conversations, video-led networks such as TikTok have emerged as an information medium of choice for consumers under thirty-five years old. Coupled with the declining voice of mainstream news organizations on X, the “new, new media” of TikTok, Instagram, and Snapchat is the domain of influencers (Newman et al., Reference Newman, Fletcher, Kalogeropoulos, Levy and Nielsen2023). There, the views of celebrities and sports personalities on policies, wars, and current events vie with those of news analysts for attention and credibility.

No definitive statement of cause can be taken from these patterns, but the trends on the eve of the 2024 US general elections and EU parliamentary elections can be summarized: Increased misinformation, disinformation, and anxiety about discerning fact from fiction accompany decreased trust in news organizations, institutions, and government; decreased support for democracy and prodemocracy candidates; and decreased engagement with news, particularly among partisans seeking to avoid topics and viewpoints with which they disagree and, more broadly, among younger cohorts. These trends are associated with increased polarization of viewpoints in the United States, as is the rise of misinformation and disinformation generally.

We began this discussion by observing that no one thinks that the “information” influencing policy choices is perfect or that all participants in policymaking will adhere to norms of truth, including making warrantable claims. Nevertheless, we suggested the importance, if not necessity, of assuming a general orientation to truth-seeking rather than lie-mongering for the survival of democratic processes. Our look into the practice, prevalence, and consequences of lies supports this conjecture. We now turn to a second trend in politics, increasing inequality, which also hampers policymaking efforts in democracies.

Inequalities and Trust: Sustaining Democracy

Socioeconomic inequality is one of the primary sources of heterogeneity in actors’ resources and motivations, making rising income and wealth inequality an important attribute of actors in the IAD framework – and a concern for the framework’s underlying social theory. Just as some level of truth-seeking is necessary for the survival of democratic processes, so is trust in those processes. Growing levels of inequality may lead to lower levels of social and political trust (e.g., Bienstman, Hense, & Gangl, Reference Bienstman, Hense and Gangl2024; Bobzien, Reference Bobzien2023), jeopardizing such core IAD principles as claim warrantability and truth-seeking. If neither participants nor processes can be trusted, an Orwellian world of “doublespeak” (2003 [1949]) signals a critical vulnerability of democracies (V. Ostrom, Reference Ostrom1997).

Trust can be placed in many politically relevant objects and constructs, with political trust and social trust serving as two primary dimensions of analysis. Political trust captures the trust that people have in institutions such as governments, politicians, the legal system, and parties. Goubin and Hooghe (Reference Goubin and Hooghe2020) view political trust as capturing the latent concept of political legitimacy, a necessary condition for democratic governments that rely on voluntary adherence to rules and shared norms to maintain social order. Social trust denotes the belief that “most people” (i.e., the broad group of people not personally known in a society) can be trusted. The concept can extend to related beliefs about whether most people try to be fair or try to be helpful most of the time. Drawing from the vast literature on political and social trust (Uslaner, Reference Uslaner2018), we focus specifically on the links between inequality and each dimension of trust.

Inequality also exists along multiple dimensions, such as income, wealth, or social status. These dimensions can be gauged objectively, using measures like the Gini coefficient, or subjectively, through perceptions of inequality or normative beliefs about its fairness. Economically rich democracies have witnessed remarkable changes in income and wealth inequality over the past three decades. While income inequality dominated the discourse in the second half of the twentieth century, wealth inequality increasingly takes center stage today. Housing equity is a primary determinant of cross-national differences in wealth inequality (Pfeffer & Waitkus, Reference Pfeffer and Waitkus2021). While wealth comes in many forms for the richest, for most people it comes in the form of housing, where wealth is increasingly concentrated among high-income households (Dewilde & Flynn, Reference Dewilde and Flynn2021). A global real estate firm recently put the scope of housing wealth into perspective: The value of global real estate is larger than that of global equities and debt securities combined, and three-quarters of that value is in residential real estate (Savills News, Reference Allen, Stevens, Fox-Arnold, Holtey, Vincent and Woollen2023). The relevance of housing to wealth inequality extends particularly to younger generations, who are locked out of housing markets and who increasingly rely more on their parents than on governments for resources (Flynn, Reference Flynn2020; Flynn & Schwartz, Reference Flynn and Schwartz2017). Such re-familialization has important policy implications, especially given the role of governments in redistributing resources and risk through social policy.
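The Gini coefficient invoked throughout this literature can be made concrete with a short computational sketch. This is an illustrative example only – the data and function are hypothetical and not drawn from the studies cited – showing the standard rank-based formula, where 0 indicates perfect equality and values approach 1 as income concentrates in fewer hands.

```python
def gini(incomes):
    """Gini coefficient of a list of non-negative incomes.

    0.0 means perfect equality; (n - 1) / n is the maximum for a sample
    of size n (all income held by one person). Uses the standard formula
    based on the ascending-sorted (ranked) distribution.
    """
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    if n == 0 or total == 0:
        return 0.0
    # G = (2 * sum_i i * x_i) / (n * total) - (n + 1) / n, with ranks i = 1..n
    weighted = sum(i * x for i, x in enumerate(xs, start=1))
    return (2 * weighted) / (n * total) - (n + 1) / n

# An equal distribution yields 0.0; concentrating all income in one
# of four people yields 0.75, the maximum for n = 4.
print(gini([25_000, 25_000, 25_000, 25_000]))  # 0.0
print(gini([0, 0, 0, 100_000]))                # 0.75
```

Age-specific Gini coefficients of the kind Stephany uses (discussed below) simply apply the same computation to incomes within each age cohort.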

Inequality can also exist at various levels: individual, regional, national, or even supranational. Inequalities in resource distribution in any of these often-nested arenas may affect social or political trust. For researchers studying the relationship between either dimension of trust and inequality, these different levels of experience produce diverse forms of the inequality–trust nexus, and diverse scholarly views of the mechanisms driving the identified relationships.

Inequality and Political Trust

Income inequality is commonly measured at the national level, and scholars focused on the United States or Europe have found that a relationship between income inequality and political trust exists across countries and over time (Bienstman, Reference Bienstman2023; Bienstman, Hense, & Gangl, Reference Bienstman, Hense and Gangl2024; Bobzien, Reference Bobzien2023). Studies that fail to find a relationship between income inequality and political trust often instead find a relationship between social inequality and political trust (e.g., Kim et al., Reference Kim, Sommet, Na and Spini2022). These results point to the importance of adopting a broad understanding of both forms of inequality.

People are situated in smaller regional subunits and larger supranational units, and inequality at these levels is also associated with political trust; because these levels are nested, assessments of trust in the relevant institutions are interrelated. Researchers find that both income inequality and regional wealth inequality affect trust in national institutions and in the EU. At the micro-level, lower levels of inequality correspond to higher levels of trust in national institutions; higher levels of inequality similarly depress trust in the EU, a so-called extrapolation effect. At the macro-level, countries with high inequality have lower levels of national trust, a situation that can prompt individuals to place their trust outside national institutions, a so-called compensation effect that corresponds to higher levels of EU trust (Lipps & Schraff, Reference Lipps and Schraff2021). Regional wealth differences may have an additional impact. Economically vibrant urban areas may see more visible EU infrastructure development, prompting more trust there than in poorer areas where people consider themselves left behind (Lipps & Schraff, Reference Lipps and Schraff2021). This relationship may not be linear: Regions in the middle of the wealth distribution are less trusting of the EU than poorer or richer regions (Vasilopoulou & Talving, Reference Vasilopoulou and Talving2023).

What drives such relationships between socioeconomic inequality and political trust? Important factors can exist at the individual level – individual income or individual characteristics like social status – or can be more sociotropic in nature (Bienstman, Reference Bienstman2023; Kim et al., Reference Kim, Sommet, Na and Spini2022). The latter refers to the context in which a person lives (including social norms concerning relationships and autonomy), which can shape the effects of inequality on trust independently of any individual characteristics. People can respond to inequality in either a rational, evaluative manner or through a more affective social–psychological response (or some degree of each cognitive style) (Bienstman, Hense, & Gangl, Reference Bienstman, Hense and Gangl2024; Goubin & Hooghe, Reference Goubin and Hooghe2020; Greenwood-Hau, Reference Greenwood-Hau2021). When evaluating degrees of inequality through either channel, a person’s perception of inequality, rather than the actual distribution of inequality, may more accurately predict trust levels (Bobzien, Reference Bobzien2023; Scheidegger & Staerklé, Reference Scheidegger and Staerklé2011). Lower levels of political trust are associated with larger gaps between inequality perceptions and preferences (Bobzien, Reference Bobzien2023). Scheidegger and Staerklé (Reference Scheidegger and Staerklé2011) find that it is not objective markers like income but subjective markers like the perception of being at material risk that affect perceived threats to the social order (including the threat that inequality could bring), which in turn affect political trust.

At first glance, the logic for an evaluative mechanism emphasizing individual socioeconomic characteristics seems straightforward. Whether inequality is a matter of perception or a matter of facts (i.e., Gini index based), people evaluate whether the current system works for them, and those who evaluate the system more positively will be more likely to trust the institutions that create the system. This reasoning implies that high-income individuals – who stand to benefit the most from inequality – are the most trusting. A closer look shows several potential caveats to this formulation, given the diverse criteria that define the system that “works for them.”

Greenwood-Hau (Reference Greenwood-Hau2021) emphasizes that people can attribute inequality to structural reasons such as government policy, or to individual reasons such as hard work, positing that those who emphasize reasons like hard work are more likely to have system-justifying beliefs. The study finds that people who attribute inequality to government policies (that benefit high-income workers) have lower levels of political trust. Bienstman et al. (Reference Bienstman2023) similarly find that citizens’ trust in government is predicated on their evaluation of government policy and performance, with inequality as one outcome used in that evaluation. The study also finds support for a “process-based evaluation” whereby inequality affects people’s belief that they can influence the political process, and this sense of external (in)efficacy influences trust.

Sociotropic explanations of the inequality–trust link contend that the criteria for evaluating a given context include values and beliefs, for instance, regarding the moral acceptability of high inequality (Goubin & Hooghe, Reference Goubin and Hooghe2020). Social–psychological explanations maintain that the more visible status differences of a high-inequality society heighten individual attributes such as status anxiety, and may even erode social trust, thereby lowering trust in institutions.

These individual and sociotropic explanations all advance an evaluative logic. Each account also helps explain the otherwise puzzling finding that someone in an economically advantageous position in a high-inequality society nevertheless has lower levels of political trust. In this case, a person may employ a notion of social justice, judging the morality of high inequality in their society and adjusting their trust in government accordingly (Goubin & Hooghe, Reference Goubin and Hooghe2020). Bienstman (Reference Bienstman2023) finds support for both sociotropic explanations. Individuals living in countries with greater inequality have lower trust in democratic institutions, regardless of their own socioeconomic status. Although social–psychological characteristics affect levels of political trust, this research finds that evaluation-based processes (e.g., a performance evaluation based on the individual’s expectations and preferences and the regime’s economic performance) better explain the link between inequality and political trust.

The link between socioeconomic inequality and political trust is confirmed in many studies and is empirically supported through plausible accounts featuring either individual characteristics or sociotropic orientations as the main explanation. The association is complex, with perceived institutional performance (expectations and evaluations), values, and beliefs about equality and social justice mediating the relationship. Examining the contextual and social–psychological sides of the relationship encourages us to pinpoint exactly what kinds of information and beliefs people have in mind when they think about inequality.

Inequality and Social Trust

Researchers largely accept the near-consensus finding of a relationship between inequality and political trust and have moved on to consider explanatory mechanisms for that link. In contrast, scholars still debate the nature, scope, and scale of the inequality–social trust relationship. Much of the social trust research emphasizes differences between analyses conducted at the regional versus the national level, and between objective measures of inequality, such as the Gini index, and subjective ones.

Scholars regularly measure social trust (also called interpersonal trust or generalized trust) with either a one-item scale asking whether most people can be trusted, or a three-item scale that additionally asks whether most people try to be fair and whether people try to be helpful most of the time (Hastings, Reference Hastings2018; Kanitsar, Reference Kanitsar2022; Kim et al., Reference Kim, Sommet, Na and Spini2022; Knell & Stix, Reference Knell and Stix2021; Olivera, Reference Olivera2015; Stephany, Reference Stephany2017). Using these measures, researchers have found that lower levels of social trust correlate with higher income inequality, especially when comparing across countries (see Buttrick & Oishi, Reference Buttrick and Oishi2017 for a review).

As with the nexus of inequality and political trust, the relationship between social trust and inequality is complex. Fairbrother and Martin (Reference Fairbrother and Martin2013) show that between 1970 and 2002 inequality increased while social trust declined in all US states, but states (and counties) with the greatest increases in inequality have not shown significantly greater losses of social trust. Similarly, in a study using panel data from thirty-two EU countries, Olivera (Reference Olivera2015) confirms the conventional relationship between social trust and inequality, but also shows that country-specific factors including institutions and culture, discrimination, and ethnic and linguistic fractionalization may play a bigger role than growing income inequality in explaining declining social trust.

Testing a claim put forward in a review by Wilkinson and Pickett (Reference Wilkinson and Pickett2009), Kanitsar (Reference Kanitsar2022) confirms that the relationship between income inequality and social trust exists at the cross-national level but not at the regional level within countries. Using US tax return data at the state level, Hastings (Reference Hastings2018) similarly finds no evidence for the state-level relationship, but some evidence that an increase in inequality over time correlates with a decline in trust. Using large, cross-national surveys, Kim et al. (Reference Kim, Sommet, Na and Spini2022) find that social class is a better predictor of social trust than income inequality at the regional level. These studies all use Gini coefficients as their measure of income inequality. Studies emphasizing perceptions of inequality more consistently find a relationship between inequality and social trust (Gallego, Reference Gallego2016; Larsen, Reference Larsen2013; Loveless, Reference Loveless2013).

Larsen (Reference Larsen2013) finds that social trust increased in the social democratic welfare states of Denmark and Sweden in the three decades leading up to 2010, but declined in the liberal welfare states of the United Kingdom and the United States. Larsen attributes these changes to cognitive perceptions of inequality. In the United Kingdom and the United States, people began to perceive others as more likely to be at the “bottom” of society, with untrustworthy, undeserving, or dangerous personas. Conversely, people in Denmark and Sweden came to perceive others as in the “middle” of society, with trusting and deserving personas.

Perceptions of inequality have also continued to play an important role in theory building. Knell and Stix (Reference Knell and Stix2021) argue that when responding to survey questions about “most people” (the wording of social trust questions), respondents use a reference group that may be biased in some way and that can only be captured by direct measures of perceptions of inequality, as opposed to calculations based on the actual income distribution. To capture perceptions of inequality, Stephany (Reference Stephany2017) uses age-specific Gini coefficients and finds that the relationship between income inequality and trust does extend to within-country regions, where it can be attenuated or intensified depending on the spread of the age-specific indicators. This finding offers a promising line of study connecting objective and subjective measures of inequality. In sum, the research on inequality and social trust generally points to a relationship, but the boundaries of that relationship are still under review. Differences in findings are often explained by measurement differences, with many studies emphasizing the importance of perceptions. This review brings us full circle, pointing to the key role that information – shared understandings, perceptions, beliefs, and framing – plays in policymaking and democratic self-governance.

Information and Inequality as Amplifying Factors

Growing inequality and growing misinformation are two major trends in contemporary democracies. Both act as depressive forces on trust and confidence in democratic institutions. Research on the misinformation–trust link and on the inequality–trust link proceeds in separate fields, but there are reasons to expect that misinformation might amplify the corrosive effects of inequality on trust. We suggest two such amplification channels, linking each back to the IAD framework and noting their policy relevance with respect to social policy as an important equalizing tool.

First, high levels of inequality are thought to lead to less social mixing, more insular groups, and more cursory interactions with those beyond a given socioeconomic circle. As the logic goes, fewer and more cursory interactions lead to decreased trust. Second, perceptions and beliefs around inequality are just as important as the actual distribution of inequality.

The studies that demonstrate these two paths depart from Robert Putnam’s (Reference Putnam2000) thesis that social trust is conditioned on civic associations and volunteering; Rothstein and Uslaner (Reference Rothstein and Uslaner2005) argue, in contrast, that inequality is the important factor. Indeed, in her sociological study of a Louisiana community plagued by the effects of fracking yet distrustful of regulators, political institutions in general, and mainstream news media, Hochschild (Reference Hochschild2016) finds highly participatory, yet insular, groups. Her interviews show individuals attuned to media brands and political leaders that support their worldview, reinforcing their distrust and detachment from other social groups.

This is a kind of activism and engagement that democratic theorist Tocqueville describes as collective individualism (Allen, Reference Allen2005). As Vasilopoulou and Talving (Reference Vasilopoulou and Talving2023) point out, people living in place-based communities share experiences and take informational cues from one another. Income and especially wealth inequality likewise have important spatial dimensions. Growing income and wealth inequality, especially through their housing- and community-based dimensions, lead to increased insularity. The increased potential for insularity – as much as the conviviality that place-based groups are assumed to have – again points to the importance of communities and information, as highlighted in the IAD framework. Misinformation may reinforce or amplify this insularity: People may not only have fewer interactions with outsiders but also receive messaging with spin, ideological slant, or even verifiably false information about policies, ideologies, and the issues targeted by false claims.

As more people hold incomplete or incorrect information, their perceptions of inequality levels may shift, as may their tolerance, or intolerance, for inequality. Such effects may extend to more innocuous forms of framing. Initial evidence of the latter can be found in the Yellow Vest movement in France, where the historic collective framing of inequality as at odds with fundamental values of solidarity and equality appears to drive the movement (Jetten, Mols, & Selvanathan, Reference Jetten, Mols and Selvanathan2020). Experimental evidence indicates that citizens trust institutions less when they are led to believe that there is greater inequality (Guinjoan & Rico, Reference Guinjoan and Rico2018). This research, too, points to the importance of information in potential processes of amplification.

These two potential amplification channels, along with the effects that both inequality and misinformation have on trust in democratic institutions, have important implications for social policy and the welfare state. As some have noted (e.g., Gärtner & Prado, Reference Gärtner and Prado2016; Habibov, Cheung, & Auchynnikava, Reference Habibov, Cheung and Auchynnikava2018), inequality and trust shape support for social policies, and inequality is associated with a decline in prosocial attitudes among the poor, an important target group of social policies (Gallego, Reference Gallego2016). In other words, support for the very policies designed to redistribute risk and promote greater equality may hinge on levels of inequality and trust, which are themselves at risk because of increased misinformation. People who do not interact, who receive their news from different sources, and who lack social and political trust may be less likely not only to support one another in interactions that necessitate collective action but also to support policies that would lead to greater risk sharing.

Conclusion

We have argued that the IAD framework alerts us to problems of distrust and disaffection from democracy that go far beyond the correction of misinformation and disinformation. Heterogeneities and inequalities, teamed with disinformation, may today prevent us from reaching collective agreement on the methods of discerning warrantable information: We may no longer share common understandings that enable a collective epistemic choice, we may now live in distinct epistemes, or both.

The methodologies of productive contestation and, broadly, individual and collective inquiry, deliberation, reflection, and choice as well as other aspects of the iterative learning and amendment process (action-outcome-observation-institutional development-action) found in the IAD framework are imperiled by disinformation and the erosion of trust in facts, social and political institutions, and fellow community members. What is to be done to defend democracy in an inequality-centric, digital age?

When it comes to information, in several US states and in the EU, various campaigns aim to educate children and adults in ways to spot spin, reporter error, and demonstrably false claims. In the EU and the United Kingdom, slander and libel laws put the onus on those making claims to warrant their validity, enabling regulations prohibiting false statements in, for example, election advertising, while in the United States, constitutionally protected political speech presently includes misleading and even false claims in political advertising (Allen & Stevens, Reference Allen and Stevens2019). The prevailing belief in the United States is that truth will prevail over falsehoods in an unregulated arena of public speech. Yet, the evidence is decidedly against this conventional view (Allen & Stevens, Reference Allen and Stevens2019).

Structural interventions in the United States will inevitably face First Amendment challenges to the regulation of political speech. Yet we suggest that regulation is necessary, particularly as artificial intelligence (AI)-generated images and manipulated speech emerge in social media and in mainstream media commentary on this novel form of misinformation. EU high commissions have begun such policy investigations; in the United States, regulation is left to social media and cable news platforms.

Inequality, in turn, is both place-based and perception-based. This points to two fronts that need to be addressed. Policies in Europe and the United States have enabled and indeed encouraged the marketization and re-familialization of risk, which reinforce growing inequality trends. Such trends occur in communities through both employment sectors and housing markets (or in other words, both income and wealth), two policy areas where governments have introduced greater precarity instead of security. Increased precarity extends beyond low-income households and other socially marginalized groups who have always experienced it, to also include younger generations who have lived their entire lives during an era of state retreat. Policies must be redesigned to open more opportunities for locked-out groups, which almost certainly requires a stronger state role in re-pooling risk.

Given that cognitive perceptions of deservingness, trustworthiness, and fairness vary across countries, some countries will likely meet greater resistance in this endeavor and have further to go in rebuilding trust. This points to the importance of supranational channels in encouraging risk sharing. The EU has indicated some willingness to move in this direction, including through the European Pillar of Social Rights and by further embedding social policy into the European Semester. However, such tools could be used to greater effect. In the United States, the idea of Franklin D. Roosevelt’s Second Bill of Rights – emphasizing the importance of social rights and risk pooling at the societal level – seems a far cry from current political rhetoric. The first step might be to consider the degree to which communities really are operating in distinct epistemes and, if so, at which action level trust-rebuilding initiatives might have the best chance of taking hold.

15 Getting a Grip on Disinformation: From Distrust to Trust within Learning Communities

This chapter examines how citizens – individually and as part of their community – can be empowered to fight disinformation from the ground up. An experimental pilot project was carried out with three separate regional learning communities in the northern part of the Netherlands (hereafter the Frisian Area) between September 2022 and September 2023 and comprised citizens from different backgrounds and age groups who jointly engaged in investigating and exposing disinformation. These three communities operated as independent entities, with the participants developing their own rules and roles for investigation and information sharing.

The goal of the project, called “De Pit,” was to experiment with a methodology for empowering communities to recognize and fight disinformation. Our main hypothesis for this experiment was that there is no “silver bullet” or panacea (a common lament of Elinor Ostrom’s): By comparing the three communities, we have learned that citizens develop their own, different kinds of solutions. Following this approach, the central question of our research was how participants in learning communities develop their own ground rules, roles, and agreements to critically collect, analyze, understand, and report on the information surrounding them in online and offline spaces.

To answer this question, we present a multiple (experimental) case study to discuss and compare the aforementioned aspects of the three learning communities. Our aim was to provide as much autonomy as possible to these communities so that they could create their own solutions. Considering the participants’ different backgrounds, levels of education, ages, and geographical locations enabled us to learn whether people create similar or different solutions to fight disinformation.

Additionally, Ostrom's (2005) Institutional Analysis and Development (IAD) framework has supported us in understanding these learning communities and how the participants developed a research approach enabling them to interpret the facts behind information circulating in the public sphere. More importantly, key aspects are identified and discussed on the basis of monitoring and mutual comparison of how the three learning communities develop, interact, and collaborate.

Finally, as part of this chapter, we present a “roadmap” with conditions and recommendations to implement a successful learning community to inspire and support others in setting up similar initiatives.

The chapter is structured as follows. First, we briefly introduce the project context in connection with recent theoretical developments in disinformation and citizen science studies. Second, we present Ostrom’s IAD framework that supported us as a lens to investigate the three learning communities. Third, we present our methodological approach. A multiple (experimental) case study supported us in collecting and analyzing data from (participants of) the three learning communities. Fourth, we discuss the results of our multiple case study through the lens of Ostrom’s IAD framework to identify and compare key aspects of the three learning communities. Finally, and based on the identified key aspects, we present a “roadmap” with conditions and recommendations to build a successful community to support future citizen-driven initiatives in unmasking disinformation on their own level.

Disinformation as a Driver for Learning Communities and Digital Citizenship

Digitization gives us easy access to an (over)abundance of information that we increasingly seem to distrust. In the socio-political discourse on the role of information in society, concerns about the negative effects of fake news, filter bubbles, and troll armies appear to predominate. Information (sharing) plays a crucial role in our society. However, information is not something that "just" happens to us: the role and responsibility of citizens who send and receive information through various media channels is pivotal. Citizens, for example, can unintentionally spread disinformation, which in turn becomes misinformation.

Based on Benkler, Faris, and Roberts (2018, p. 6), we define disinformation as "the intentional manipulation of beliefs" and misinformation as "the unintentional spread of false beliefs." According to the Dutch National Coordinator for Counterterrorism and Security (NCTV) (2023), disinformation can be used to disrupt democratic processes such as elections or to question the political and administrative integrity of parliament and the judiciary. In addition, disinformation can be used to pursue an economic agenda and to spread messages that can lead to unrest. In general, the literature on dis- and misinformation focuses mainly on analyzing the threat (e.g., Bastos, Mercea, & Goveia, 2021; Benkler, Faris, & Roberts, 2018) or on responses to it (e.g., Chan et al., 2017; Stieglitz et al., 2022). Very few academics (Hassain, 2022; Heinrich, 2019) focus on how to empower citizens through communities to become more resilient to disinformation. In our project we have focused on disinformation, keeping in mind that misinformation may arise when citizens unknowingly pass on false information circulating online and offline. Our intent is therefore to support citizens in learning how to identify and investigate disinformation and become more resilient to its threat. As a result, the aim of the Pit project is also to "equip" participants to prevent misinformation. With these developments in mind, we set up three learning communities aiming to encourage and equip citizens to investigate the facts behind (local, regional, or international) news and information circulating in the public, online, and offline domains.
Participants in these learning communities work together as citizen journalists to strengthen digital citizenship as part of an applied research community. In that sense, digital citizenship refers to a person who uses the internet regularly and effectively (Mossberger, Tolbert, & McNeal, 2007). However, given recent developments such as online polarization and the massive growth in the spread of dis- and misinformation in both online spaces (e.g., communities on Facebook or Reddit) and offline spaces (e.g., face-to-face conversations), this definition seems inadequate to cover the whole spectrum of the problem. Hintz, Dencik, and Wahl-Jorgensen (2017, p. 731) discuss digital citizenship as something that is "… typically defined through people's actions, rather than by their formal status of belonging to a nation-state and the rights and responsibilities that come with it. It denotes citizens creating and performing their role in society." Citizen journalism, according to Bowman and Willis (2003), is based on citizens taking an active role in the process of collecting, reporting, analyzing, and disseminating news and information. According to Papacharissi (2015), citizen journalism can be seen as an alternative to the news organizations that dominate the circulation of information. This means that citizen journalism can be a powerful approach to enhance digital citizenship and the critical thinking of citizens in an innovative way. According to Koulolias et al. (2018, p. 22), "… governments need to take collaborative action with stakeholders and invest in innovative ways to deal with misinformation." The authors suggest that citizens should be empowered to fight misinformation by "… creating a trusted environment for citizens with the adequate educational instruments." The main aim of our experimental project was to enable participants to navigate and understand "complex information landscapes" through problem-solving approaches as part of a central community (Glassman & Kang, 2012). Moreover, participants act as qualitative researchers of their personal experiences, analyzing identities, events, and cultural phenomena (Markham, 2019). Based on these insights, we decided to work in three learning communities, built on trustful relationships and with the possibility of using tools that empower participants to unmask or fight disinformation.

Ostrom’s IAD Framework as a Lens to Identify Key Aspects of Learning Communities

Through our experimental research, we aim to disseminate firsthand experiences from three projects to inspire others to establish similar initiatives within the Netherlands and beyond. Additionally, we will provide a comprehensive roadmap of necessary conditions and recommendations to facilitate the establishment of successful learning communities. These communities will serve to empower citizens in their efforts to combat disinformation while enabling them to develop the necessary skills to investigate and effectively interpret information. The results of our research can serve as a valuable guideline or implementation plan, offering easily achievable requirements and conditions for citizens to collaboratively address disinformation within their local communities.

To achieve this goal, we use the IAD framework to identify how the participants in the three learning communities work together, communicate, carry out research, and jointly come up with community rules and roles. In line with Ostrom et al. (2014, p. 68), we research how "… individuals may legally self-organize in voluntary associations and craft their own rules of interaction." Figure 15.1 presents the IAD framework as presented by Ostrom, Gardner, and Walker (1994).

A block diagram of the IAD framework: physical or material conditions, attributes of the community, and rules-in-use shape the action arena of actors and action situations, which produces patterns of interactions and outcomes that are judged against evaluative criteria.

Figure 15.1 A framework for institutional analysis.

Source: Adapted from E. Ostrom, Gardner, and Walker (1994, p. 37).

This research looked into "face-to-face discussions" related to the "rules-in-use" area of the IAD framework, as well as the "Action Arena" of each project, investigating the various "Action Situations" in order to explain "regularities in human actions and results." To understand each of the learning communities, we applied a common set of seven variables of Action Situations: (1) the set of participants; (2) the specific positions to be filled by participants; (3) the set of allowable actions and their linkage to outcomes; (4) the potential outcomes that are linked to individual sequences of actions; (5) the level of control each participant has over choice; (6) the information available to participants about the structure of the action situation; and (7) the costs and benefits – which serve as incentives and deterrents – assigned to actions and outcomes (Ostrom, 2007, pp. 29–30).

Based on our specific "Pit Action Situations," we added additional conceptual questions (see Table 15.1) to support us in analyzing and understanding each community, inspired by those of Ostrom (2007, pp. 29–30) in the context of overharvesting from a common-pool resource situation.

Table 15.1 Additional conceptual questions for analyzing the Action Arenas of the three communities (conceptual questions based on Ostrom, 2007, pp. 29–30)

1. The set of participants: Who are the participants (e.g., demographics, background, motivation to participate)?

2. The positions: What positions (or roles) exist, and how are these appointed as part of the group process (e.g., which roles are developed and how are they assigned by the participants)?

3. The set of allowable actions and their linkage to outcomes: Which types of research technologies and tools are used? What topics are important for the participants (e.g., are they using online research tools or offline approaches; what kind of training is used or needed; how do participants make sure they are safe while researching online or offline information; what topics are chosen by the participants)?

4. The potential outcomes: How do participants collect and share the outcomes of their research? What are the important lessons about sharing outcomes?

5. The level of control over choice: Do participants share their outcomes, and if so, how and with whom? What agreements are made within and outside of the community (e.g., before starting research, when choosing a topic, and while conducting research; are there internal or external threats in choosing a topic)?

6. The information available: How do participants use, engage with, and collect informational resources (e.g., how is a research topic chosen by the group, how is data collected, and how do participants bring their results together)?

7. The costs and benefits of actions and outcomes: What are important elements for participants to participate (e.g., what is the added value for participants of joining and being part of the community; how often do the participants meet; what are their concerns in researching online and offline information; what did they learn)?
Methodology

The Digital Citizenship Lab at NHL Stenden University of Applied Sciences conducted research by monitoring this project. The Lab is the result of a crossover partnership between two professorships at NHL Stenden University of Applied Sciences: Organisations and Social Media, and Cybersafety.

Following replication logic, we based our research design on multiple cases (Yin, 2009), allowing us to observe different approaches in the development of the three research communities in order to understand "… values, norms, rules and structures that constrain and enable behavior of human actors" (Groenewegen, 2011, p. 16). For this multiple (experimental) case study, we chose to conduct research in different settings (a local library, higher education, and vocational education) and with participants from different backgrounds and age groups. Three projects were set up to gain insights into the development of each community based on different rules, roles, and actions. These three learning communities were situated in different parts of the Frisian Area: the Library of Drachten (located in the city of Drachten in the Smallingerland region); Firda School for Vocational Education (also located in Drachten); and NHL Stenden University of Applied Sciences (located in the city of Leeuwarden). More detailed information about the three organizations and an additional supporting organization can be found in Appendix 1.

Figure 15.2 illustrates the start of each community, the number of participants, their ages and backgrounds, who supported them during the project, and who was part of the research project group.

A classification chart titled "De Pit" of the three local Frisian learning communities, with their respective age groups, professions, educator roles, and initiatives. See long description.

Figure 15.2 Overview of the three learning communities

Figure 15.2 Long description

The three local learning communities are: de Bibliotheek, a public library for citizens from the region, with 4 to 14 adult participants aged 20 to 60, including professionals, retirees, and educators (start date March 2022; supported by 1 project leader); NHL Stenden, for students and lecturers in higher applied education, with 6 to 8 participants aged 23 to 27 from multiple disciplines (start dates September 2022 and February 2023; supported by 2 supporting leaders and 2 student assistants, one per semester); and Firda, a vocational school for students and lecturers, with 9 teenage participants aged 16 to 18 in media studies (start date March 2023; supported by 1 supporting lecturer and 2 digital citizenship experts).

Our data collection was based on various data sources such as information shared by and between participants (e.g., emails, announcements, meeting minutes, and presentation sheets). Additionally, we conducted individual semi-structured interviews in May 2023 with five lecturers/experts (marked as C1–C5) who coordinated or supported the communities, and three separate focus group sessions in June and July 2023 with participants (n = 14, marked as P1–P14) and three coordinators (C1, C2, C4) of each community to gain insight into how they developed, implemented, and evaluated each project. Although two research communities were organized at NHL Stenden, the decision was made to organize one combined focus group with students who had participated in the first and/or second semester, due to participant dropout.

Throughout the project, at least one coordinator (lecturer or digital citizenship expert) from each community joined the research project group meetings to exchange thoughts on the progress of each community and to ensure clear communication concerning the project aim and practical issues, such as data collection. Two researchers from NHL Stenden coordinated the project, supported by student assistants. Additionally, the coordinators and researchers received support and advice from local experts in the area of digital citizenship (from "Fers," see Appendix 1).

Participants from each learning community carried out research by collecting information through surveys, interviews, and evaluations with internal and external stakeholders. Because of the experimental nature of this project, we used the focus group meetings and individual interviews with coordinators to identify how research was carried out in practice and how the participants experienced the process of community building, data collection, analysis, and sharing.

In July 2023, we also organized a mini-conference, inviting participants from the three communities to exchange thoughts and experiences. As a part of the program, the participants briefly presented their Pit project and shared the essential learning gained.

All participants received research consent forms via the group coordinators related to data collection and assurance of anonymity. Finally, all collected data were stored in a central and secured digital research environment at NHL Stenden University of Applied Sciences, accessible only by the authors.

Findings

We will present our findings regarding the three Pit learning communities, beginning with a general introduction to this experimental project and the three separate learning community cases. Next, we will present the three cases by describing their Action Arena, research topics, working approach, communication, research approach, and experiences along with a set of learning outcomes of the participants of each community. To illustrate the three cases, we utilize the IAD framework. Specifically, we delve into the participants’ “Action Arena” and examine the “patterns of interaction,” such as the roles and agreements that evolved over time. Lastly, we will provide a concise summary of the collective key insights shared by attendees at the mini-conference held in July 2023.

Introduction to ‘De Pit’

In March 2022, a local library in the Frisian Area started a pilot project to help citizens find their way in a society where dis- and misinformation circulate. Following a limited recruitment campaign, a learning community was formed, comprising fourteen participants from the same regional area but with different backgrounds and educational levels. Essentially, during this pilot, participants were free to decide which subject they wanted to investigate, be it gossip in a neighborhood watch WhatsApp group, conspiracy theories, or (regional) news. This first learning community was called the "Pit," after the Dutch "Plaatselijke Informatie Toetsing" (local information evaluation).

Following this pilot project, two additional Pit projects were set up in September 2022 by engaging with students, lecturers, and researchers from secondary vocational education and higher professional education. Each group was able to investigate one or more self-chosen research topic(s), based on different research approaches. For example, students from NHL Stenden University of Applied Sciences applied hands-on tools (such as reverse image search) within various online channels and platforms to analyze and validate online information, additionally building a Discord community to evaluate their findings. The participants of the secondary vocational Pit at the Firda School chose applied field research, such as interviews with internal and external stakeholders, to gain an understanding of topics closely related to their daily life.
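The reverse image search the students used relies, at its core, on perceptual fingerprinting: two images that look alike should yield similar fingerprints even after re-compression or small edits. The sketch below is illustrative only and is not part of the De Pit toolbox or any specific OSINT tool; it implements the simple "average hash" variant in pure Python on a grayscale pixel grid (production services use far more robust features), just to show why a slightly altered copy of a photo can still be traced back to its original.

```python
# Illustrative sketch: the "average hash" idea behind near-duplicate image
# detection in reverse image search. Operates on a plain grayscale grid
# (list of lists, values 0-255), so no external imaging library is needed.

def average_hash(pixels, size=8):
    """Downscale a grayscale grid to size x size cells, then emit one bit
    per cell: 1 if the cell is brighter than the overall mean, else 0."""
    h, w = len(pixels), len(pixels[0])
    cells = []
    for r in range(size):
        for c in range(size):
            # Average the block of source pixels that maps to this cell.
            r0 = r * h // size
            r1 = max((r + 1) * h // size, r0 + 1)
            c0 = c * w // size
            c1 = max((c + 1) * w // size, c0 + 1)
            block = [pixels[i][j] for i in range(r0, r1) for j in range(c0, c1)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return ''.join('1' if v > mean else '0' for v in cells)

def hamming(h1, h2):
    """Number of differing bits; a small distance suggests a near-duplicate."""
    return sum(a != b for a, b in zip(h1, h2))

# A toy 16x16 "image" with a bright square, plus a slightly brightened copy
# (as if someone re-uploaded it with adjusted exposure).
img = [[200 if 4 <= r < 12 and 4 <= c < 12 else 20 for c in range(16)]
       for r in range(16)]
tweaked = [[min(255, p + 10) for p in row] for row in img]

d = hamming(average_hash(img), average_hash(tweaked))
print(d)  # prints 0: the brightened copy has an identical 64-bit fingerprint
```

Because the hash compares each cell to the image's own mean brightness, a uniform brightness shift leaves the bit pattern unchanged, which is exactly the robustness property that lets reverse image search match doctored copies of a photo to its source.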

Case 1: The Pit at the Local Library in Drachten
The Action Arena of the Library Drachten Pit

The Pit in the Library of Drachten was set up in March 2022 as a pilot initiative with the aim of developing a larger project with different communities in the Frisian Area. This citizen initiative was started by C1, an expert in innovation in information science at this very library. Participants were recruited by placing an announcement in the local newspaper inviting interested parties to the very first Pit meeting on March 16, 2022 (see Figure 15.3).

A local newspaper announcement of the first meeting of the Pit. The text details the date and agenda of the meeting and invites volunteers to join the team. See long description.

Figure 15.3 Announcement for the first meeting of the Pit in March 2022 (translated from Dutch).

Figure 15.3 Long description

The text of the announcement reads: "Are you a curious or thorough, investigative, philosophical, eloquent interviewer, digital resource researcher, and resident of Smallingerland? Then my question is: do you want to volunteer in a new special team? The library is at the heart of society and belongs to and is for every resident. For a special project, we are looking for volunteers who want to think about this. What is happening all around us – in your direct surroundings, neighborhood, village, or street? We are looking for people who want to think about this together. Perhaps you feel addressed, but you still have a few questions and want more information first. That is possible. There will be an information evening on Wednesday evening, 16 March 2022. On this evening, you will get an impression of the project and its goals. The code name of the project is the PIT. If you have any questions in advance, you can contact [email address]. To participate in the information evening, you can register via [link], [location], [time]."

Fourteen citizens (seven women and seven men) from various backgrounds (e.g., a retired journalist, teachers with various expertise, a housewife, a company director, and an ICT specialist), aged between thirty and sixty, responded to this call. All participants live in the city of Drachten and exhibit a deep sense of engagement with their immediate living environment. The number of individuals involved varied between four and twelve, with a core group of four participants maintaining regular contact.

The Drachten Pit can be characterized as a local research community focused not only on gaining insights into digital disinformation but also on promoting social cohesion in the neighborhood by discussing quality of life, social well-being, and social and physical safety with local residents. The working language is Dutch.

Participants’ Roles

Participants find it difficult to define their role within the group; as one put it, "There is no clear division of roles." Participants see themselves primarily as (field) "researchers," while C1 is mainly the driving force and project leader (see Figure 15.2). According to the focus group participants, one member has the role of digital researcher, specifically to double-check or cross-reference information gathered from the neighborhood and share these insights with the group.

Research Topics

According to C1, the first project was based on a local rumor about young men causing problems by recklessly driving motorboats on a village canal system. Participants had heard different versions of the story and decided to investigate by gathering information through online research, interviews, and by visiting the neighborhood. They discovered that there had only been one incident involving two young men, whereas the rumors circulating in the neighborhood and media had blown it out of proportion. They decided to inform the people in the direct neighborhood and not the press to eliminate polarization and rumors.

The choice of research topics is primarily based on the interests of the participants, often around a specific topic or situation in their direct environment, with the aim of discovering what precisely is going on and to what extent the resident(s) in question are causing a nuisance. One example concerned a local resident's messy front garden, full of clutter and waste, in the neighborhood where P2 lives. P2 decided to conduct a small neighborhood survey. The survey showed that clutter is a subjective concept and that those living in the direct vicinity do not experience any nuisance. Although a man with mental health issues lives in the house in question, residents accept him as he is. P2 acknowledged that her research was "not unbiased." However, by testing her own ideas and thoughts, she adjusted her initial impression of a resident who she thought had been intentionally causing problems in her neighborhood.

Another approach in choosing research themes is by going into the neighborhood and asking passersby what they think of their living environment, what they are concerned about, and what they would like to change. One topic that recently emerged is loneliness among residents of a recently renovated apartment building. In this complex, the communal meeting space has been removed to make way for new residential units. This means there is no longer an opportunity for apartment residents to meet each other easily and undertake activities together. The Pit staff are now investigating how an alternative communal meeting space can be created. They have also decided to keep in touch with residents on a regular basis.

Interaction Approach

The Pit meets approximately once every four weeks in the Drachten library. Participation is on a voluntary basis. Although participants repeatedly indicate that their working method is “still work in progress,” it usually goes as follows:

  1. Go into the neighborhood to collect data and make a report, written or otherwise.

  2. Discuss and give feedback on findings during Pit meetings.

  3. Discuss with other Pit participants whether and how the findings can be followed up. Can we do anything with this? Is further (online) research necessary? Have both sides of the story been heard?

  4. Deepen or nuance the current research, or start new research.

Communication

Communication is mainly analogue among the participants. During the physical meetings, the participants discuss their current and future investigations. A recurring point of discussion is whether to publish research findings. Publishing could be a way to bring certain neighborhood problems to the attention of a broader audience. Publishing could also ensure that the problems are placed on the agenda of policymakers or other relevant actors. Nonetheless, the decision not to publish is made again and again. The reason given is that it could potentially damage the bond of trust with residents who have acted as a source of information. However, there is an explicit desire to share the results of the neighborhood research and additional fact-checking through online research with local residents, particularly those who raised the questions. According to the focus group, residents who have participated can then decide for themselves what they wish to do with the findings. More importantly, this allows them to control the way in which the problems they experience are tackled. Finally, the decision not to publish did lead to the departure of two participants: They did not agree with this decision and considered participation in the Pit no longer relevant.

Outside the meetings, there is incidental contact between participants (telephone, email). This usually concerns contact between an individual participant and the coordinator of the community (C1). During the Pit meetings, C1 clearly has a leading role. He also has the final say when it comes to decisions, such as removing a participant from the group (should a participant push topics that serve only personal benefit and not the local community). He also decides on the themes being investigated and on whether to publish. From the focus group meeting, we therefore learned that participants rely strongly on C1 to make final decisions.

Research Approach

Almost all research activities are analogue. Participants go out alone or sometimes in pairs. There is no clear research strategy: "we go into the neighborhood looking for information." The motto is an open, listening attitude to "find out the truth." A training course on effective interview skills was initially planned; however, participants feel that this is no longer necessary. They have noticed – sometimes to their own surprise – that people respond openly to their questions. They had also expected to encounter opposition or aggression, but so far this has not happened. The explanation they give is that they may not be seen as "officials" (pastor, social worker, or local police officer); they are "just" involved fellow citizens who are genuinely interested in people's stories. P1 pointed out: "We don't have a cap (in Dutch 'pet'), we have the Pit."

Little to no information is recorded (and certainly not systematically) about findings gained from street interviews and door-to-door investigations. However, participants point out that their experience and impressions are always discussed during a following Pit meeting.

The idea of enriching the street interviews with other research, for example, by also asking questions to people from the municipality, police or welfare work, has not really gotten off the ground yet. The idea of feeding a “completed” study back to the stakeholders, in this case the problem holder(s), has yet to be implemented.

Participants’ Experiences

Participants find it fun and educational to be part of the Pit. They feel they are contributing to enhancing connections between people in the neighborhood. People’s openness is mentioned as a positive experience. The participants pointed out that residents like to be able to tell their story, and they enjoy being able to offer a listening ear and thus get a more nuanced picture of what is going on and what people are concerned about.

Participants’ Learnings
  • The Pit community is local, open, independent, and socially relevant.

  • Equality among participants is crucial; everyone must contribute and engage critically.

  • Exploration beyond the secure environment of the Pit is encouraged.

  • The common goal is fostering connectivity within the neighborhood.

Case 2: The Pit at NHL Stenden
The Action Arena of the NHL Stenden Pit

At NHL Stenden, the Pit project was organized twice. Students were recruited via an online intranet announcement and through short guest lectures in various courses to promote the project. The first community ran from September 2022 to December 2022 with six students. The second ran from February 2023 to July 2023 with eight students. On both occasions, the students were supported by student assistants. Both groups received hands-on training sessions in different areas, such as the use of open-source intelligence (OSINT) tools. Students also received an OSINT toolbox (developed by the student assistants) with links and examples. According to C2, participants got to know each other via an initial online research exercise: "looking up" one of the team members on the internet and finding as much information about that person as possible. Later, based on the collected information, they introduced the chosen participant to the group without mentioning a name. Besides being an informal introduction, the exercise served as an eye-opener, showing participants what kind of information circulates about them in the public domain. The participants of the first group decided to investigate disinformation in the Russian–Ukrainian war and anti-Western propaganda. Additionally, they developed ground rules for the group based on a group discussion held at the beginning of the project (Vissia, 2022):

  • The group’s cooperation is key. The results of the research and working with OSINT come second.

  • To maintain momentum, a fixed pattern in meetings is important. We meet once a week for 1.5 to 2 hours.

  • Respecting each other’s opinions is important in good collaboration.

  • A negative atmosphere and undesirable behavior are the two most important indicators of an unsuccessful research group.

  • Roles are not designated up front but emerge gradually within the collaboration. Decision-making is democratic.

  • The group is autonomous and makes decisions together, but there is an umbrella organization that they can fall back on if necessary.

  • Contact with each other is via Microsoft Teams and WhatsApp.

Based on these ground rules and the observations of the participants during the first semester, six important outcomes were brought forward by one of our student assistants (van der Hooft, 2022):

  1. Working together: Create a pleasant ambience between participants. Openness and respect for each other’s opinions and ideas are key.

  2. Leadership: There is an overarching organization that the group can fall back on. The group is autonomous and makes mutual decisions.

  3. (How to) research: Directly address issues brought up by the group members. Choose subjects that can be researched with OSINT tools.

  4. Motivation: Be aware of the influence of intrinsic versus extrinsic motivation to join the project, because some students receive credits while others join voluntarily.

  5. Apply a structured approach: A permanent structure is important for motivation and continuity. Divide research into phases to keep it structured.

  6. Knowledge: Sufficient knowledge about the research methods and the dynamics of the group is a must. Every group is different! Therefore, ask for feedback at every meeting.

Based on these outcomes, we set up another learning community in the second semester, running from February 2023 to the end of the study year in July 2023. Because this Pit community consisted mainly of students who had volunteered, it unfortunately did not continue into the new study year 2023/2024 due to study-related time constraints, despite the enthusiasm felt by the students.

All participants were bachelor’s students aged between twenty-three and twenty-seven. During the project they focused on the war in Ukraine and aimed to investigate how and by whom war information (or war propaganda) is shared and to what extent this information is reliable.

From the focus group meeting, we learned that participants had a shared interest that gave direction to their research. In addition, participants pointed to the “diversity” of their community, as it consisted of students from different higher education courses (see Figure 15.1) and with different cultural backgrounds (Indo-Chinese, German, Costa Rican, Dutch, Surinamese Javanese). The working language was English.

Roles of Participants

During the focus group meeting, participants reflected on their role during the project. One participant (P5) called himself an “investigator” but noted his experience acting as a project manager – a role he had never held up to that point, but one which he would like to adopt later. Other participants (P6 and P4) characterized their role as “participant.” Both felt they had done little research up to that point as “the Pit is still in its early stages.” This is also due to other activities at the university. However, the participants indicated a desire to continue with the Pit in the future. One participant (P4) believed that the allocation of roles should be determined democratically if a new setup was to come in September 2023. A student assistant (C2) took on the role of “supervisor” and “expert” to coach and support the community.

Research Topics

To share initial ideas and democratically choose a research topic, the participants used the interactive online tool Mural, a digital whiteboard for brainstorming. Participants were unanimous in their preference for a research topic: the Russian–Ukrainian war. Subsequently, subthemes were introduced: (1) Force on Force – military activities and developments at the front and (2) Humanitarian Impact – consequences of the war for the civilian population. One participant preferred to work individually and focused on disinformation and war propaganda.

Another participant (P5) said he followed international news closely. Russia’s invasion and the subsequent developments had his full attention. He stated the war was a major problem: “this is a mess and this is going to last a long time.” According to this participant, the Pit’s role was to maintain attention on the war and its impact on civilians. “I see that the news about the war is disappearing from the front page and shifting to pages 2, 3, 4 …, but I want to continue following it even if it gets obscured.”

Participant P4 was particularly interested in the Russians’ use of Western technology, such as the participation of Dutch companies in the construction of the controversial bridge between Russia and Crimea. P5 indicated his personal interest in “modern warfare”: “It raises a lot of questions – and for unanswered questions there is a deep dive needed.”

Finally, P4 recognized the war as a significant conflict. He saw the advantage of this current topic and the many sources available. According to him, this made it relatively easy to conduct research. P5 agreed: “it is world news!”

Interaction Approach

The NHL Stenden Pit met approximately every week. Apart from a few in-person meetings, the meetings were held online. Discord, an online community platform, was chosen as a meeting space to communicate and share information. Given the other activities (study, work) of the participants, two meetings took place every Friday, and participants could decide which timeslot to join to discuss their research, questions, and findings.

Our student assistant and supervisor (C2) started each meeting with a workshop or mini-lecture on, among other things, conducting safe online research, the use of OSINT, and legal and ethical frameworks. P5 indicated that he appreciated this very much: “I am not an IVK student (= safety science student) nor an IT professional. So, I needed extra help. For example, when installing and working with a TOR browser. I have also learned to handle it carefully: don’t do it alone.” Through the workshops, P4 learned more about security and the use of Proton Mail, VPNs, TOR, and virtual machines. He said: “there are enough hackers.” Furthermore, the participants indicated that they also needed help using OSINT. There was therefore a clear need for guidance and explanation in doing online research, which C2 provided with lectures and an OSINT toolbox.

Communication

Initially, Microsoft Teams was chosen for communication, alongside Discord among the participants. Occasionally, email was used, for example, to share central announcements or reminders about upcoming meetings. Informal contact took place via the app Signal. One participant also indicated that physical meetings were essential for the community – “that works better” (P4).

Research Approach

Participants indicated that, so far, little research had been conducted. Except for C2, participants had little to no experience of doing online research. They indicated that it was important to first have a basis: the dos and don’ts, OSINT usage, and safety issues regarding online investigations. This learning process progressed over several months, with participants noting during the focus group meeting that they now felt they knew more and wanted to get started.

P5 indicated that he saw many parallels between the current war and the situation in 1917. According to him, Russia is unstable. He therefore aimed to find out how citizens talk about the war and what sentiments are at play to get a better picture of the (in)stability in Russia. P5 started studying online comments on VKontakte (VK), the largest Russian social media platform – comparable to Twitter (now X). He used a translation tool to analyze the comments on VK.
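The comment-scanning workflow P5 describes – machine-translating VK comments and then skimming them for sentiment about the war – can be made reproducible with even a very simple tally. The sketch below is purely illustrative and not taken from the project: the keyword lists, the `sentiment_tally` function, and the sample comments are invented assumptions, and the translation step is assumed to have already been done with an external tool.

```python
# Illustrative sketch only: a crude keyword-based tally over comments that
# have already been machine-translated into English. The keyword sets and
# sample comments are invented for demonstration, not from the project.

POSITIVE = {"support", "proud", "victory"}
NEGATIVE = {"unstable", "protest", "afraid", "against"}

def sentiment_tally(comments):
    """Count comments containing at least one negative or positive keyword."""
    counts = {"positive": 0, "negative": 0, "neutral": 0}
    for comment in comments:
        words = set(comment.lower().split())
        if words & NEGATIVE:          # negative keywords take precedence
            counts["negative"] += 1
        elif words & POSITIVE:
            counts["positive"] += 1
        else:
            counts["neutral"] += 1
    return counts

comments = [
    "The economy is unstable and people are afraid",
    "We are proud of our soldiers",
    "No opinion here",
]
print(sentiment_tally(comments))  # → {'positive': 1, 'negative': 1, 'neutral': 1}
```

A real analysis would of course need far richer handling (negation, sarcasm, bot and troll-farm filtering), but even a crude tally like this turns manual skimming into something the group can reproduce and discuss.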

No agreements were made about when a study was completed or when certain findings would be suitable for publication. One participant explicitly indicated that he wanted to publish.

Participants’ Experiences

Participants explained that they were intrinsically motivated, which ensured involvement and bonding. The participants also wanted to continue with the Pit. P4 indicated that they learned to work together in a group – not because they happened to be in the same class, but on a voluntary basis. P4 went on to mention that “working with others and making new contacts” was an important experience. Because the Pit is not part of the curriculum, collaboration was more difficult to organize: everyone had their own agenda, which meant that not everyone could always be there at the same time.

Everyone indicated that it was fun to work together on a subject that really interested them. In addition, learning to work with OSINT was a plus for everyone. Participants indicated that a lot of information about the war could be found online, but that it was not always clear what had been posted by whom and to what extent so-called troll farms were active. One Pit experience was learning to look more critically at a “news fact” by consulting multiple sources (the Dutch news broadcaster NOS, Al Jazeera, etc.) and to search more actively for different perspectives on the news. Finally, C2 – a student himself – mentioned supervising a group of students as a learning experience.

Participants’ Learnings
  • Facilitate a strong foundation with comprehensive knowledge and advanced skills in online research, security, ethics, and OSINT.

  • Exhibit decisive leadership and provide clear guidance to the group.

  • Establish focused research themes early.

  • Foster genuine interest and motivation.

  • Promote collegiality and familiarity.

  • Maintain consistent communication through scheduled meetings.

  • Encourage collaboration and proactive initiative.

  • Aim for an optimal number of participants.

  • Set goals and interim deadlines.

Case 3: The Pit at Firda School
The Action Arena of the Firda School Pit

All participants were vocational education students aged between sixteen and eighteen years old who were very interested in what was going on at their school and motivated to investigate which topics were important to fellow students and teachers.

The Firda Pit was started with twelve participants consisting of first-year editorial media students and a coordinator/teacher. Ultimately there were nine active participants, four boys and five girls. The working language was Dutch. Due to time constraints, the Pit was integrated into the regular curriculum with limited contact hours.

The project began in March 2023 with an interactive session in the social sciences program for Firda students hosted by a youth organization. During this session, the topics the students would choose for their Pit project were determined through interaction. Additionally, according to C5, an expert on digital citizenship and project leader on the topic at Firda School and Fers, the students also took part in an awareness workshop about “online filter bubbles” and “online group polarization” provided by an external expert.

Participants’ Roles

A characteristic of the Firda Pit was that participants had a clear shared interest in which they adopted multiple roles and had “respect for each other.”

P9 called herself a “researcher” and conducted research on the theme of poverty. To this end, she created questions for interviews and a podcast. P10 indicated that she had taken on a creative role, for example, creating posters for the podcast. P7 chose an editorial function, for example, contacting guests for the podcast, but also working creatively, recording an intro for the podcast. P9 and P12 both indicated contributing to the theme “discrimination and prejudice” and worked together on the podcast. Regarding the topic, P13 specifically studied a case on a Dutch food brand that developed an anti-poverty campaign by examining various websites and blogs on this case. P11 also chose a dual role as a researcher and creative participant. P14 took on the role of project leader because her fellow students elected her and she was also motivated to take on this job.

Research Topics

To choose a research topic, the aforementioned session was held with an external agency that has experience in brainstorming with young people. Step-by-step and through a joint voting process, the students ultimately chose a main topic, namely, inequality of opportunity. There were three subthemes linked to this: poverty, social prejudice, and discrimination. Poverty was ultimately the most important topic for the students. This was partly because students in the group had personal experience of poverty. P12: “… because I was talking about the influences of what poverty does to children and then you know what. Yes, it wasn’t really poverty for us, but we were tight on cash.” This topic was then divided into specific subthemes: menstrual poverty, money, and life. The students eventually conducted interviews with experts and professionals at school and processed these topics into a podcast with the aim of putting these issues on the school’s policy agenda.

Interaction Approach

The participants worked on the project in close consultation with each other. There were four to five physical meetings during the project.

The coordinator/teacher (C4) supported the students where necessary and, as a coach, also offered space to work on the project. He also helped the group make decisions at crucial moments, for example, creating the podcasts to bring attention to the issues. During the interview, C4 claimed that working together intensively for a short period and keeping the flow going had several advantages over working irregularly over a longer period with only occasional contact. The Pit at Firda was partially integrated into regular education, and therefore the participants could work on Pit activities without distractions. This resulted in high productivity with concrete output, that is, a podcast and special cabinets for sanitary towels at school. Furthermore, the intensive collaboration fostered effective communication among the participants. Every participant was actively engaged and dedicated to completing their mission within the limited time frame.

Communication

The students communicated via WhatsApp, Microsoft Teams, and Snapchat. The students had different types of app groups: study-oriented, project-related, or more fun- and leisure-related. They also met during class and individually contacted C4 for advice. In the context of communication, the students also indicated how important it was to create ground rules with each other and to stick to them.

Research Approach

The students opted for interviews with experts and professionals. They had prior knowledge of research skills, specifically interviewing, due to previously completed assignments in this area with provided literature. In addition, the students also engaged in desk research using search engines such as Google. Examples of keywords they used related to the research themes were “poverty,” as well as the name of the retail brand linked to promoting school breakfasts for underprivileged children to fight hunger. Websites and blogs were also consulted. All students claimed to have “double-checked” information during the research. P13: “Check if something is reported differently somewhere else. Double check. Just look it up separately and then see where the source comes from and whether it is reliable.” The students indicated that they were taught this at school and were also reminded by teachers to check their sources. The students also kept notes of their collected sources, for example, website URLs.
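The “double check” habit P13 describes – only trusting a claim once it appears in more than one independent source – can be expressed as a simple corroboration rule. The following Python sketch is our own illustration, not part of the students’ toolkit; the source names and claims are invented examples.

```python
# Hedged illustration of cross-source corroboration: a claim counts as
# "double-checked" only when at least `threshold` independent sources
# report it. Source names and claims below are invented examples.

def corroborated(claim, sources, threshold=2):
    """Return True if `claim` is reported by at least `threshold` sources."""
    hits = [name for name, claims in sources.items() if claim in claims]
    return len(hits) >= threshold

sources = {
    "NOS": {"food bank use rising", "school breakfast campaign launched"},
    "Al Jazeera": {"food bank use rising"},
    "blog": {"school breakfast campaign launched"},
}
print(corroborated("food bank use rising", sources))   # True: two sources
print(corroborated("only one outlet says this", sources))  # False
```

The design choice here mirrors the students’ practice: independence of sources matters more than the raw number of hits, so in a real workflow one would also record where each source got its information, as the students did by noting URLs.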

The information collected by the students was shared directly during the podcast conversations. The students had agreed on this, the aim being that sharing the information would have a surprising effect, as P9 stated: “… we did it this way, because we had all agreed not to share information for the podcast, because then it would be a surprise and then you can really see the reactions to it.”

Agreements were made to complete the research before the end of the academic year and to share the results via the podcast in the autumn of 2023. It is not yet clear with whom the podcasts were shared, internally or externally. The students themselves felt that the results/podcast should be made public.

Experiences

Participants were intrinsically motivated, which ensured involvement and bonding. Some of the participants were also experts through their personal experience with poverty and indicated that the research had not specifically generated new knowledge but had helped to make the topic of poverty more visible at their school.

They indicated that it was fun to work together on a subject that really interested everyone. Another advantage was that as first-year editorial media students, they could learn and apply skills related to their studies, such as making a podcast and developing other creative media products such as posters. Besides this, they were also trained in fact-checking.

The participants were less satisfied with the communication at the beginning of the project and would have liked more clarity about the goals of the project. Ultimately, a topic was chosen that was not directly related to disinformation. This was partly due to the lack of clear communication about the project at the outset and about the expectations toward the students.

The students were very proud that their research, particularly the conversations with experts, such as the location manager of Firda, ensured that the topic of menstrual poverty was put on the map and that menstrual products are now available free of charge on every floor of their school building. P13 said about this result: “But I actually think that is the best result. It really had a purpose, so I think that’s what we achieved with it.” They were also proud of the way they worked together and how, as first-year students, they supported each other to just try things out, even though it seemed difficult to them at first, for example, organizing and conducting interviews and making podcasts. The coordinator/teacher C4 also emphasized the importance of a close-knit team, in which members trust each other and work toward a common goal. In this case, although the participants had known each other for quite some time, additional investments in team building were made at the start of the project.

Finally, the students were happy to have worked on a project related to their future work, with room for everyone’s opinion and open discussions, as P12 stated: “It is also part of our profession, what we will do later, if we join this profession, we also have to experience it. So, I just thought it was nice to see as a class that we quickly agree with each other. And if we didn’t, that we could tell both sides, well different sides, stories and then make a choice.”

Participants’ Learnings
  • Start sessions with comprehensive information about online research and disinformation.

  • Foster a professional atmosphere for meetings. Establish communication agreements.

  • Encourage mutual respect and diverse perspectives.

  • Foster an inclusive environment.

  • Encourage teamwork and problem-solving.

  • Define shared objectives.

  • Promote willingness to explore new approaches.

  • Prioritize thorough research and seek assistance when needed.

Joint Learning from the Three Learning Communities

As described earlier, in July 2023 we organized a mini-conference, inviting the coordinators and participants from all three learning communities to get to know each other and share their experiences. During interdisciplinary sessions, we asked the participants to share the important joint learnings they had gained for future Pit projects. Figure 15.4 provides an overview of these essential learnings.


Figure 15.4 Important joint learnings for future Pit projects.

Figure 15.4 Long description

The chart gives a structured list of guidelines for creating effective learning communities.

  1. Safe Working Environment

    • Build a safe working environment by creating clear agreements.

    • Provide training to safeguard online researchers’ activities.

  2. Educational Integration

    • In education, link the project with an interesting curriculum, including credits.

  3. Communication Protocols

    • Make clear agreements about communication: online or offline, channels, and apps.

    • Agree on fixed meeting times and consultation moments.

  4. Group Composition

    • Build stable groups with 8–12 participants per community.

  5. Role Allocation

    • Establish a good division of roles.

    • Consider participants’ strengths and preferences.

  6. Expectation Management

    • Address mismatches between expectations and outcomes to prevent dropouts.

    • Create a roadmap with a joint vision while respecting differences.

  7. Research Framework

    • Define clear research frameworks and joint topics.

    • Provide OSINT/interviewing skills with expert collaboration.

    • Set investigation completion agreements.

    • Evaluate feasibility/desirability through reflection.

  8. Time Management and Expectations

    • Make agreements on when an investigation will be finalized.

    • Evaluate, reflect, and map out feasibility and desirability.

Identifying the Similarities and Differences in the Three Learning Communities

Following the presentation of the three experimental cases and essential outcomes from the coordinators and participants, we present the similarities and differences in the three learning communities based on our additional conceptual questions (see Table 15.1). Figure 15.5 provides a visual overview of the key aspects we have identified in each case. These key aspects and the joint essential outcomes from participants (see Figure 15.4) will also provide the basis for our “roadmap” for future community projects and initiatives to fight disinformation.


Figure 15.5 Comparing the three learning communities; visual overview of the key aspects identified in each case.

Figure 15.5 Long description

A comparative analysis of the three cases. It involves the set of participants, the positions, the set of actions, the potential outcomes, the level of control over choice, the information available, and the costs and benefits of the actions and the outcomes.

Case 1 is de Bibliotheek:

  • The set of participants are between the ages 30 and 60, of various educational levels, from different professions, both retired and working, and are those who have investigated local neighborhood topics.

  • The positions include researcher, participant, driving force, and project leader.

    Self-choice is available except for the role of project leader.

  • The set of actions are as follows.

    • Neighborhood surveys and door-to-door street interviews. An open, listening attitude to “find out the truth.”

    • Interview skills; no training provided.

    • The meso theme addresses local issues and promotes cohesion among residents. Themes include impoverishment, street lighting, unsafe traffic, and lonely elderly people.

  • The potential outcomes are as follows.

    Data is collected offline from the direct neighborhood. Findings are reported and discussed in group sessions. Outcomes are shared among participants during meetings.

  • The level of control over choice is as follows.

    The option to share findings with the media is closed; sharing with the citizens who were interviewed or who brought up the investigated topics is open. When there are different options in the group, the project leader decides.

  • The information available is as follows.

    Topic choice is via investigating sentiment and questions in the neighborhood, discussing the topics, or testing one’s own viewpoints. Participants are free to choose a topic individually or in teams. It is a stepwise process.

  • The costs and benefits of actions and outcomes are as follows.

    • Monthly library meetings in the evening + email.

    • Voluntary participation.

    • Choose a role that matches interests and skills.

    • Time constraints for researching topics.

    • Learn how to engage with local residents and collect or solve problems, or visit presentations.

    • Try new research approaches with group support.

Case 2 is NHL Stenden:

  • The set of participants are aged 23 to 27 years, BA students, investigating international topics.

  • The positions are investigator, participant, supervisor, moderator, and expert, who acts as the project leader. Self-choice and switching roles are possible.

  • The set of actions are as follows.

    • Online research workshops on conducting secure research, using open-source intelligence (OSINT) tools, and understanding legal and ethical frameworks.

    • Support from teachers and researchers.

    • Training on research safety using safe browsers.

    • The macro theme is research on the Russian–Ukrainian war, with subthemes on military activities, humanitarian impact, and war propaganda.

  • The potential outcomes are as follows.

    Data is collected via various online channels, communities, and platforms. Findings are reported and discussed in group sessions. Outcomes are shared among participants during meetings.

  • The level of control over choice is as follows.

    Option is open on how to share the findings.

  • The information available consists of differing opinions, and whether or not to share them has to be discussed. Topic choice is via a collaborative interactive process using Mural: a stepwise process in which ideas are funneled and voted on to reach the final topics.

  • The costs and benefits of actions and outcomes include the following.

    • Weekly school meetings on Discord, with two time slots.

    • Voluntary participation (for some, with credits).

    • Choose a role based on interests and skills.

    • Limited time due to course obligations.

    • Students like meetings with peers from different courses or faculties.

    • Learn OSINT tools safely.

Case 3 is Firda:

  • The set of participants are aged 16 to 18 years, media editorial students at vocational level, investigating school topics.

  • The positions include researcher, creative, project leader, coach. Self-choice and combining roles are possible.

  • The set of actions includes expert interviews and desk research, learning to double-check information from online sources, support from an external expert and a teacher, and developing research skills in the curriculum via workshops on filter bubbles and group polarization.

    The micro theme explores what is going on at the school, translating results into media products and solutions.

  • The main theme is social inequality. Subthemes include poverty, prejudice, and discrimination, and finding a solution for period poverty in school.

  • The potential outcomes are as follows. Data is collected via interviews with stakeholders and via online desk research. Findings are reported and discussed during group sessions. Outcomes are shared among participants and translated into media products such as podcasts to share with fellow students and stakeholders at school.

  • The level of control over choice is as follows. The option is open on how to share findings.

  • The information available includes the following.

    Differing opinions require discussing sharing or not sharing. Topic choice is via co-session with professional youth communication organizations and with the support of the coach of the group through a stepwise process.

    Ideas are funneled and voted on to reach the final topics.

  • The costs and benefits of the actions and the outcomes are as follows.

    • Flexible meeting hours based on planned lessons with the coach at school and contact via WhatsApp and e-mail.

    • Mandatory participation.

    • Choosing and combining a role that matches interests and skills.

    • Learning to work closely in a group with respect.

    • Trying new approaches supported by the group in a secure environment.

The participants in this study represented diverse age groups. Participants primarily identified themselves as researchers; however, they were also capable of assuming multiple roles and responsibilities within the community. The participants were free to select their preferred role, while a project leader was appointed to provide guidance and direction. The techniques utilized for investigation varied, with global subjects being addressed through online data investigation and local subjects through direct inquiry of stakeholders. Nevertheless, an array of interview techniques was utilized throughout the research process by all groups. Extensive discussions were held on research topics and the dissemination of findings. Furthermore, the selection of research topics was a collaborative process, with both online and offline approaches being employed to facilitate decision-making. Meetings were scheduled regularly, either weekly or monthly, or in a flexible manner to accommodate the availability of all participants.

Participation was voluntary but could be connected to earning credits or fulfilling course requirements. The participants enjoyed working as a research and learning community, as it allowed for the exchange of ideas and sharing of findings. Acquiring new skills from experts or by learning by doing was experienced positively. Mutual encouragement was fostered among the participants, inspiring them to explore novel avenues while also promoting respectful engagement with all members of the community.

A Roadmap for (Future) Learning Communities

After conducting a comprehensive analysis of the three learning communities, we have developed a roadmap (see Figure 15.6) encompassing multiple factors and providing recommendations for establishing a highly effective learning community. This roadmap will serve as a valuable resource for both new and existing projects that seek to empower individuals within their respective (local) communities to fight disinformation at their own level.


Figure 15.6 A roadmap for (future) learning communities to fight disinformation at their own level.

Figure 15.6 Long description

A roadmap of the learning community includes a stepwise goal structure for community building, topic choice, research skills, doing research, and sharing findings.

Community Building includes the following.

  • The goal is to build a trustful community.

  • Invest time in building your learning community.

  • Participants need to get to know each other.

  • Introduce new participants.

  • Think of ways to facilitate the above processes.

  • Work with 8 to 12 participants.

  • Choose a role that interests you.

  • Appoint a coordinator for important decisions.

  • Create and operate within a trustful environment where everyone feels safe.

Topic Choice includes the following.

  • The goal is to choose a topic close to your heart.

  • Think of a process to choose a joint topic that matches with the background of your group.

  • For example, using online brainstorm environments or arranging an in-person interactive session with the group.

  • Develop democratic voting mechanisms to facilitate this decision-making process.

Research Skills include the following.

  • The goal is to provide research training.

  • Decide on research methods in line with your project aims. For example, neighborhood interviews or online investigation.

  • Get the support of experts to train research skills that are in line with your approach.

  • Discuss researcher safety and, if needed, provide training and support for it.

Doing Research includes the following.

  • The goal is to collect and analyze data, safely, with an open mind.

  • Determine what you want to find out and discuss if the approach is ethical and feasible given the time and availability of your team.

  • Support an open dialogue about collaboration with each other and others, and safety during data collection.

  • Analyze and discuss findings from different viewpoints and double-check with other relevant and valid sources.

Sharing Findings includes the following.

  • The goal is to decide on a joint sharing strategy.

  • To share or not to share must be discussed right from the start of the project.

  • Decide jointly with whom you want to share your findings, such as direct stakeholders, large audience, or the media.

  • Think of creative ways to share your findings, matching the needs of your chosen audience.

  • Engage in critical discussions about the consequences of sharing your findings.

Conclusion and Discussion

In this case study, we have analyzed three experimental learning communities, each of which has a central purpose: to become more resilient to the threat of disinformation by jointly exploring information within an offline, citizen-driven learning community.

Through this experimental approach, we learned that the groups did not focus exclusively on analyzing disinformation in online environments; they also turned to offline field research and to topics close to their hearts. The decision-making process was open in each group, with no interference from the authors in the choice of topic. This made it very interesting to observe how the group process shaped the way participants decided to collect, analyze, and discuss their findings in each of the learning communities.

Throughout, all community participants engaged in informed discussions on whether to share their findings internally and/or externally, and on the risks this could entail: creating online and offline polarization, endangering participants, or eroding trust in their community as they collected and analyzed information circulating in the public sphere. Additionally, participants from one learning community were trained in using OSINT tools, and another community was trained in fact-checking all information collected via desk research. One community succeeded in connecting with local citizens by openly communicating as “fellow citizens” with their neighbors, despite having no prior interview training. A central motivation for each community was its choice of shared subjects of concern under the guidance of a project leader or teacher.

Interestingly, each group chose to share its findings differently, either internally with other students and teachers or with the citizens whom they had interviewed during the research. Additionally, students from Firda School shared their findings with internal stakeholders by developing a podcast on the topic of inclusivity. Negative friction in group dynamics was noticeable in all communities in the later stages of their investigations, caused by deliberations over whether or not to share findings. One conclusion that can be drawn from this is that future communities should discuss and decide on these issues at the outset of a project.

Even though the three communities had different backgrounds, topics, and approaches, we identified key aspects important for empowering citizens to engage in learning communities to fight disinformation at various levels: locally, by supporting their direct neighborhood; at school, by improving their direct surroundings; or globally, by understanding the larger coherence of events. In these learning communities, critical thinking, learning skills, and tools are essential. However, we noticed that it is even more important that participants develop a deeper understanding of the impact of information sharing on online and offline spaces. By investigating topics that are close to their hearts, they also learn to reflect on their own viewpoints and behavior. This supports them in engaging with ethical considerations, promoting more conscious and mindful participation in a world where online and offline communication are closely intertwined. It also helps them to become more resilient to disinformation, stops them from drawing premature conclusions, and motivates them to respectfully challenge and question perspectives, identifying biases and assumptions.

We also learned the importance of building communities in an open setting with clear agreements on the organization of space and time and supported by a coordinator (preferably chosen by the participants). For digital investigation team participants, it is of utmost importance to provide a joint training module on online investigation and safety.

Finally, most of the participants indicated that building and becoming a community is not something that comes naturally. Participants should take time to get to know each other, set clear goals for research (under the guidance of a coordinator), and democratically decide on the choice of topic and on whether and how their findings will be shared. Creating a trusted environment is key for each learning community, as with trust comes the resilience and empowerment to engage in meaningful online and offline investigations that benefit both individuals and society.

Current policy developments by the European Commission have established ambitious goals for ensuring that 80 percent of the population acquires fundamental digital competencies by the year 2030 (European Commission, 2023). This means more innovative approaches are needed to support local governments and citizens in developing the ability to form and articulate well-informed opinions and to actively participate in societal discourse through open dialogue. Our research demonstrates that learning communities, supported by local public institutions such as libraries, (vocational) schools, and universities, can serve as a reliable and safe space for acquiring and practicing these skills in a real-life setting. Ultimately, this can contribute to both online and offline depolarization efforts, effectively engaging individuals from diverse backgrounds and educational levels. Our findings also show that learning communities are not about “sending information” or teaching about disinformation in front of a class or group of citizens. Their success is rooted in providing a secure environment where individuals can govern themselves with experts’ guidance. In closing, as our experimental research shows, defending democracy may start at the heart of offline communities, contributing to an open debate about the local as well as global issues we are facing right now.

Footnotes

14 Some Truths About Lies: Misinformation and Inequality in Policymaking and Politics

This work was supported by the Luxembourg National Research Fund [14345912].

1 Ostrom’s design principles (1990) are clear group boundaries, rules of governance match local conditions, those affected by the rule can participate in rule amendment procedures, self-organizing capacities recognized by outside authorities, local monitoring procedures, graduated sanctions for rule violators, accessible low-cost means of dispute resolution, and nested enterprises.

15 Getting a Grip on Disinformation: From Distrust to Trust within Learning Communities

References


Allen, B. (2005). Tocqueville on covenant and the democratic revolution: Harmonizing earth with heaven. Lexington Books.
Allen, B., Lawrence, E., Stevens, D., & Sullivan, J. (2016). Partisanship and perceptions of fairness: Ignoring the facts. Journal of Experimental Political Science, 3(1), 32–43. https://doi.org/10.1017/XPS.2015.6
Allen, B., & Stevens, D. (2015). What is negative about negative ads? In Nai, A. & Walter, A. S. (Eds.), New perspectives on negative campaigning: Why attack politics matters (pp. 47–61). ECPR Press.
Allen, B., & Stevens, D. (2019). Truth in advertising? Lies in political advertising and how they affect the electorate. Lexington Books.
Allen, B., Stevens, D., Fox-Arnold, D., Holtey, B., Vincent, M., & Woollen, B. (2023). Local news in a social capital sixteen years later. Paper presented at the 2023 Annual Conference of the Elections, Public Opinion & Political Parties (EPOP), Southampton University, Southampton, UK, 8–9 September 2023.
Allen, B., Stevens, D., Marfleet, G., Sullivan, J., & Alger, D. (2007). Local news and perceptions of the rhetoric in political advertising. American Politics Research, 35(4), 506–540. https://doi.org/10.1177/1532673X06298717
Andersson, K. P., & Agrawal, A. (2006). Equity, institutions, and the environment: Socioeconomic aspects of local forest governance. Paper presented at the 11th Conference of the International Association for the Study of Common Property, Bali, Indonesia, June 19–23.
Bail, C. A., Argyle, L. P., Brown, T. W., Bumpus, J. P., Chen, H., Hunzaker, M. B. F., et al. (2018). Exposure to opposing views on social media can increase political polarization. Proceedings of the National Academy of Sciences of the United States of America, 115, 9216–9221. https://doi.org/10.1073/pnas.1804840115
Baland, J., & Platteau, J. P. (1999). The ambiguous impact of inequality on local resource management. World Development, 27(5), 773–788. https://doi.org/10.1016/S0305-750X(99)00026-1
Bartels, L. M. (2020). Ethnic antagonism erodes Republicans’ commitment to democracy. Proceedings of the National Academy of Sciences of the United States of America, 117, 22752–22759. https://doi.org/10.1073/pnas.2007747117
Bennett, W. L. (2016). News: The politics of illusion (10th ed.). University of Chicago Press. https://doi.org/10.7208/chicago/9780226345055.001.0001
Bienstman, S. (2023). Does inequality erode social trust? Frontiers in Political Science, 5, 1–19. https://doi.org/10.3389/fpos.2023.1197317
Bienstman, S., Hense, S., & Gangl, M. (2024). Explaining the “democratic malaise” in unequal societies: Inequality, external efficacy and political trust. European Journal of Political Research, 63, 172–191. https://doi.org/10.1111/1475-6765.12611
Bobzien, L. (2023). Income inequality and political trust: Do fairness perceptions matter? Social Indicators Research, 169(1–2), 505–528. https://doi.org/10.1007/s11205-023-03168-9
Borah, P., & Xiao, X. (2018). The importance of ‘likes’: The interplay of message framing, source, and social endorsement on credibility perceptions of health information on Facebook. Journal of Health Communication, 23(4), 399–411. https://doi.org/10.1080/10810730.2018.1455770
Buttrick, N. R., & Oishi, S. (2017). The psychological consequences of income inequality. Social and Personality Psychology Compass, 11(3). https://doi.org/10.1111/spc3.12304
Chong, D. (2000). Rational lives: Norms and values in politics and society. University of Chicago Press. https://doi.org/10.7208/chicago/9780226104379.001.0001
Chong, D., & Druckman, J. (2007a). Framing public opinion in competitive democracies. American Political Science Review, 101(4), 637–655.
Chong, D., & Druckman, J. (2007b). Framing theory. Annual Review of Political Science, 10, 103–126. https://doi.org/10.1146/annurev.polisci.10.072805.103054
Claassen, C., & Magalhães, P. C. (2023). Public support for democracy in the United States has declined generationally. Public Opinion Quarterly, 87(3), 719–732. https://doi.org/10.1093/poq/nfad039
Cohen, M. J., Smith, A. E., Moseley, M. W., & Layton, M. L. (2022). Winners’ consent? Citizen commitment to democracy when illiberal candidates win elections. American Journal of Political Science, 67(2), 261–276. https://doi.org/10.1111/ajps.12690
Cowart, H. S., Blackstone, G. E., & Riley, J. K. (2022). Framing a movement: Media portrayals of the George Floyd protests on Twitter. Journalism and Mass Communication Quarterly, 99(3), 676–695. https://doi.org/10.1177/10776990221109232
Dewilde, C., & Flynn, L. B. (2021). Post-crisis developments in young adults’ housing wealth. Journal of European Social Policy, 31(5), 580–596. https://doi.org/10.1177/09589287211040443
Druckman, J. N. (2001). On the limits of framing effects: Who can frame? Journal of Politics, 63(4), 1041–1066. https://doi.org/10.1111/0022-3816.00100
Druckman, J. N. (2004). Political preference formation: Competition, deliberation, and the (ir)relevance of framing effects. American Political Science Review, 98, 671–686. https://doi.org/10.1017/S0003055404041413
Drutman, L., Goldman, J., & Diamond, L. (2020, June). Democracy maybe: Attitudes on authoritarianism in America. Voter Study Group. https://voterstudygroup.org/publication/democracy-maybe
Eddy, K. (2022, June 15). The changing news habits and attitudes of younger audiences. Reuters Institute for the Study of Journalism. https://reutersinstitute.politics.ox.ac.uk/digital-news-report/2022/young-audiences-news-media
European Commission. (2018). A multi-dimensional approach to disinformation: Report of the independent High Level Group on fake news and online disinformation. Publications Office of the European Union. https://digital-strategy.ec.europa.eu/en/library/final-report-high-level-expert-group-fake-news-and-online-disinformation
Fairbrother, M., & Martin, I. W. (2013). Does inequality erode social trust? Results from multilevel models of US states and counties. Social Science Research, 42(2), 347–360. https://doi.org/10.1016/j.ssresearch.2012.09.008
Flynn, L. B. (2020). The young and the restless: Housing access in the critical years. West European Politics, 43(2), 321–343. https://doi.org/10.1080/01402382.2019.1603679
Flynn, L. B., & Schwartz, H. M. (2017). No exit: Social reproduction in an era of rising income inequality. Politics & Society, 45(4), 471–503. https://doi.org/10.1177/0032329217732314
Foa, R. S., & Mounk, Y. (2017). The signs of deconsolidation. Journal of Democracy, 28(1), 5–15. https://doi.org/10.1353/jod.2017.0000
Gallego, A. (2016). Inequality and the erosion of trust among the poor: Experimental evidence. Socio-Economic Review, 14(3), 443–460. https://doi.org/10.1093/ser/mww010
Gärtner, S., & Prado, S. (2016). Unlocking the social trap: Inequality, trust and the Scandinavian welfare state. Social Science History, 40(1), 33–62. https://doi.org/10.1017/ssh.2015.80
Goffman, E. (1974). Frame analysis: An essay on the organization of experience. Harper & Row.
Goubin, S., & Hooghe, M. (2020). The effect of inequality on the relation between socioeconomic stratification and political trust in Europe. Social Justice Research, 33(2), 219–247. https://doi.org/10.1007/s11211-020-00350-z
Greenwood-Hau, J. (2021). The system works fine: The positive relationship between emphasis on individual explanations for inequality and external political efficacy. Frontiers in Political Science, 3, 1–17. https://doi.org/10.3389/fpos.2021.643165
Guinjoan, M., & Rico, G. (2018). How perceptions of inequality between countries diminish trust in the European Union: Experimental and observational evidence. Political Psychology, 39(6), 1289–1303. https://doi.org/10.1111/pops.12541
Habibov, N., Cheung, A., & Auchynnikava, A. (2018). Does institutional trust increase willingness to pay more taxes to support the welfare state? Sociological Spectrum, 38(1), 51–68. https://doi.org/10.1080/02732173.2017.1409146
Hackett, S., Schlager, E., & Walker, J. (1994). The role of communication in resolving commons dilemmas: Experimental evidence with heterogeneous appropriators. Journal of Environmental Economics and Management, 27(2), 99–126. https://doi.org/10.1006/jeem.1994.1029
Hastings, O. P. (2018). Less equal, less trusting? Longitudinal and cross-sectional effects of income inequality on trust in U.S. states, 1973–2012. Social Science Research, 74, 77–95. https://doi.org/10.1016/j.ssresearch.2018.04.005
Hochschild, A. (2016). Strangers in their own land: Anger and mourning on the American right. The New Press.
Iyengar, S. (1991). Is anyone responsible?: How television frames political issues. University of Chicago Press. https://doi.org/10.7208/chicago/9780226388533.001.0001
Iyengar, S., & Kinder, D. (1987). News that matters. University of Chicago Press.
Jetten, J., Mols, F., & Selvanathan, H. P. (2020). How economic inequality fuels the rise and persistence of the Yellow Vest movement. International Review of Social Psychology, 33(1), 2. https://doi.org/10.5334/irsp.356
Kanitsar, G. (2022). The inequality-trust nexus revisited: At what level of aggregation does income inequality matter for social trust? Social Indicators Research, 163(1), 171–195. https://doi.org/10.1007/s11205-022-02894-w
Kim, Y., Sommet, N., Na, J., & Spini, D. (2022). Social class – not income inequality – predicts social and institutional trust. Social Psychological and Personality Science, 13(1), 186–198. https://doi.org/10.1177/1948550621999272
Kiratli, O. (2023). Social media effects on public trust in the European Union. Public Opinion Quarterly, 87(3), 749–763. https://doi.org/10.1093/poq/nfad029
Kiser, L., & Ostrom, E. (1982). The three worlds of action: A metatheoretical synthesis of institutional approaches. In Ostrom, E. (Ed.), Strategies of political inquiry (pp. 179–222). Sage.
Knell, M., & Stix, H. (2021). Inequality, perception biases and trust. The Journal of Economic Inequality, 19(4), 801–824. https://doi.org/10.1007/s10888-021-09490-x
Kumkale, G. T., Albarracín, D., & Seignourel, P. J. (2010). The effects of source credibility in the presence or absence of prior attitudes: Implications for the design of persuasive communication campaigns. Journal of Applied Social Psychology, 40(6), 1325–1356. https://doi.org/10.1111/j.1559-1816.2010.00620.x
Larsen, C. A. (2013). The rise and fall of social cohesion: The construction and de-construction of social trust in the US, UK, Sweden and Denmark. Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199681846.001.0001
Lewandowsky, M., & Jankowski, M. (2022). Sympathy for the devil? Voter support for illiberal politicians. European Political Science Review, 15, 39–56. https://doi.org/10.1017/S175577392200042X
Liedke, J., & Gottfried, J. (2022, October 27). U.S. adults under 30 now trust information from social media almost as much as from national news outlets. PEW Research Center. https://pewrsr.ch/3DF4dn1
Lipps, J., & Schraff, D. (2021). Regional inequality and institutional trust in Europe. European Journal of Political Research, 60(4), 892–913. https://doi.org/10.1111/1475-6765.12430
Loveless, M. (2013). The deterioration of democratic political culture: Consequences of the perception of inequality. Social Justice Research, 26(4), 471–491. https://doi.org/10.1007/s11211-013-0198-7
Martens, B., Aguiar, L., Gomez-Herrera, E., & Mueller-Langer, F. (2018, April). The digital transformation of news media and the rise of disinformation and fake news (JRC Digital Economy Working Paper 2018-02). Joint Research Centre, the European Commission’s In-house Science Service. https://joint-research-centre.ec.europa.eu/document/download/0843265e-f418-4b6e-94f7-61d2ba1cba1e_en?filename=jrc111529.pdf; https://doi.org/10.2139/ssrn.3164170
Neupane, H. (2003). Contested impact of community forestry on equity: Some evidence from Nepal. Journal of Forest and Livelihood, 2(2), 55–61. https://doi.org/10.3126/jfl.v2i2.59725
Newman, N., & Fletcher, R. (2017). Bias, bullshit and lies: Audience perspectives on low trust in the media. Reuters Institute Digital News Project. https://reutersinstitute.politics.ox.ac.uk/sites/default/files/2017-11/Nic%20Newman%20and%20Richard%20Fletcher%20-%20Bias%2C%20Bullshit%20and%20Lies%20-%20Report.pdf
Newman, N., Fletcher, R., Kalogeropoulos, A., Levy, D. A. L., & Nielsen, R. K. (2017). Reuters Institute digital news report 2017. Reuters Institute for the Study of Journalism. https://reutersinstitute.politics.ox.ac.uk/sites/default/files/Digital%20News%20Report%202017%20web_0.pdf
Newman, N., Fletcher, R., Kalogeropoulos, A., Levy, D. A. L., & Nielsen, R. K. (2023). Reuters Institute digital news report 2023. Reuters Institute for the Study of Journalism. https://reutersinstitute.politics.ox.ac.uk/sites/default/files/2023-06/Digital_News_Report_2023.pdf
Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32, 303–330. https://doi.org/10.1007/s11109-010-9112-2
Olivera, J. (2015). Changes in inequality and generalized trust in Europe. Social Indicators Research, 124(1), 21–41. https://doi.org/10.1007/s11205-014-0777-5
Olson, M. (1965). The logic of collective action: Public goods and the theory of groups. Harvard University Press.
Ostrom, E. (1990). Governing the commons. Cambridge University Press.
Ostrom, E. (2005). Understanding institutional diversity. Princeton University Press.
Ostrom, E. (2010). The institutional analysis and development framework and the commons response. Cornell Law Review, 95(3), 807–815.
Ostrom, V. (1982). Institutional analysis, policy analysis, and performance evaluation. Paper presented at the Workshop for the Study of Interorganizational Arrangements in the Public Sector, International Institute of Management, Berlin, Germany, 26–28 July 1982.
Ostrom, V. (1993). Epistemic choice and public choice. Public Choice, 77(1), 163–176. https://jstor.org/stable/30027217; https://doi.org/10.1007/BF01049230
Ostrom, V. (1997). The meaning of democracy and the vulnerability of democracies. University of Michigan Press. https://doi.org/10.3998/mpub.15021
Park, S., Fisher, C., Flew, T., & Dulleck, U. (2020). Global mistrust in news: The impact of social media on trust. International Journal on Media Management, 22(2), 83–96. https://doi.org/10.1080/14241277.2020.1799794
Park, S., Park, J. Y., Kang, J.-H., & Cha, M. (2021). The presence of an unexpected bias in online fact checking. Harvard Kennedy School Misinformation Review, 2(1). https://doi.org/10.37016/mr-2020-53
Pfeffer, F. T., & Waitkus, N. (2021, July 30). The wealth inequality of nations. American Sociological Review, 86(4), 567–602. https://doi.org/10.1177/00031224211027800
Putnam, R. (2000). Bowling alone. Simon & Schuster.
Rabin, M. (1998). Psychology and economics. Journal of Economic Literature, 36(1), 11–46.
Rothstein, B., & Uslaner, E. M. (2005). All for all: Equality, corruption, and social trust. World Politics, 58(1), 41–72. https://doi.org/10.1353/wp.2006.0022
Ruttan, L. M. (2006). Sociocultural heterogeneity and the commons. Current Anthropology, 47(5), 843–853.
Rydgren, J. (2008). Immigration sceptics, xenophobes or racists? Radical right‐wing voting in six West European countries. European Journal of Political Research, 47(6), 737–765. https://doi.org/10.1111/j.1475-6765.2008.00784.x
Savills News. (2023, September 25). Total global value of real estate estimated at $379.7 trillion – almost four times the value of global GDP. Savills News. https://savills.com/insight-and-opinion/savills-news/352068/total-global-value-of-real-estate-estimated-at-$379.7-trillion---almost-four-times-the-value-of-global-gdp
Scheidegger, R., & Staerklé, C. (2011). Political trust and distrust in Switzerland: A normative analysis. Swiss Political Science Review, 17(2), 164–187. https://doi.org/10.1111/j.1662-6370.2011.02010.x
Schlager, E., & Blomquist, W. (1998). Resolving common pool resource dilemmas and heterogeneities among resource users. Paper presented at the 7th Biennial Conference of the International Association for the Study of Common Property, Vancouver, BC, June 10–14.
Searle, J. (1969). Speech acts. Cambridge University Press. https://doi.org/10.1017/CBO9781139173438
Stephany, F. (2017). Who are your Joneses? Socio-specific income inequality and trust. Social Indicators Research, 134(3), 877–898. https://doi.org/10.1007/s11205-016-1460-9
Stevens, D., Alger, D., Allen, B., & Sullivan, J. L. (2006). Local news coverage in a social capital capital: Election 2000 on Minnesota’s local news stations. Political Communication, 23(1), 61–83. https://doi.org/10.1080/10584600500477062
Stevens, D., Sullivan, J., Allen, B., & Alger, D. (2008). What’s good for the goose is bad for the gander: Negative political advertising, partisanship, and turnout. Journal of Politics, 70(2), 527–541. https://doi.org/10.1017/S0022381608080481
Sudulich, L., Wall, M., & Baccini, L. (2015). Wired voters: The effects of internet use on voters’ electoral uncertainty. British Journal of Political Science, 45, 853–881. https://doi.org/10.1017/S0007123413000513
Tversky, A., & Kahneman, D. (1981). The framing of decisions and the psychology of choice. Science, 211(4481), 453–458. https://doi.org/10.1126/science.7455683
Tversky, A., & Kahneman, D. (1986). Rational choice and the framing of decisions. Journal of Business, 59(4), S251–S278. https://doi.org/10.1086/296365
Uslaner, E. M. (Ed.). (2018). Oxford handbook of social and political trust. Oxford University Press. https://doi.org/10.1093/oxfordhb/9780190274801.001.0001
Vasilopoulou, S., & Talving, L. (2023). Euroscepticism as a syndrome of stagnation? Regional inequality and trust in the EU. Journal of European Public Policy, 31(6), 1494–1515. https://doi.org/10.1080/13501763.2023.2264891
Wardle, C., & Derakhshan, H. (2017). Information disorder: Toward an interdisciplinary framework for research and policy making. Report for The Council of Europe. https://rm.coe.int/information-disorder-toward-an-interdisciplinary-framework-for-researc/168076277c
Wilkinson, R. G., & Pickett, K. E. (2009). Income inequality and social dysfunction. Annual Review of Sociology, 35(1), 493–511. https://doi.org/10.1146/annurev-soc-070308-115926
Zimmermann, F., & Kohring, M. (2020). Mistrust, disinforming news, and vote choice: A panel survey on the origins and consequences of believing disinformation in the 2017 German Parliamentary Election. Political Communication, 37(2), 215–237. https://doi.org/10.1080/10584609.2019.1686095

References

Bastos, M., Mercea, D., & Goveia, F. (2021). Guy next door and implausibly attractive young women: The visual frames of social media propaganda. New Media & Society, 25(8). https://doi.org/10.1177/14614448211026580
Benkler, Y., Faris, R., & Roberts, H. (2018). Network propaganda: Manipulation, disinformation, and radicalization in American politics. Oxford University Press. https://doi.org/10.1093/oso/9780190923624.001.0001
Bowman, S., & Willis, C. (2003). We Media: How audiences are shaping the future of news and information. The Media Center at The American Press Institute. https://ict4peace.org/wp-content/uploads/2007/05/we_media.pdf
Chan, M. S., Jones, C. R., Jamieson, K. H., & Albarracín, D. (2017). Debunking: A meta-analysis of the psychological efficacy of messages countering misinformation. Psychological Science, 28(11), 1531–1546. https://doi.org/10.1177/0956797617714579
Glassman, M., & Kang, M. J. (2012). Intelligence in the internet age: The emergence and evolution of Open Source Intelligence (OSINT). Computers in Human Behavior, 28(2), 673–682. https://doi.org/10.1016/j.chb.2011.11.014
Groenewegen, J. (2011). The Bloomington School and American Institutionalism. The Good Society, 20(1), 15–36. https://doi.org/10.5325/goodsociety.20.1.0015
Hassain, J. (2022). Disinformation in democracies: Improving societal resilience to disinformation. NATO Strategic Communications Centre of Excellence. https://stratcomcoe.org/publications/disinformation-in-democracies-improving-societal-resilience-to-disinformation/241
Heinrich, A. (2019). How to build resilient news infrastructures? Reflections on information provision in times of “fake news.” In I. Linkov, L. Roslycky, & B. D. Trump (Eds.), Resilience and hybrid threats: Security and integrity for the digital world (pp. 174–187). IOS Press. https://doi.org/10.3233/NICSP190031
Hintz, A., Dencik, L., & Wahl-Jorgensen, K. (2017). Digital citizenship and surveillance society: Introduction. International Journal of Communication (IJOC), 11, 731–739. https://ijoc.org/index.php/ijoc/article/view/5521/1929
Koulolias, V., Jonathan, G. M., Fernandez, M., & Sotirchos, D. (2018). Combating misinformation: An ecosystem in co-creation. Organization for Economic Co-operation and Development (OECD).
Markham, A. N. (2019). Critical pedagogy as a response to datafication. Qualitative Inquiry, 25(8), 754–760. https://doi.org/10.1177/1077800418809470
Mossberger, K., Tolbert, C. J., & McNeal, R. S. (2007). Digital citizenship: The internet, society, and participation. The MIT Press. https://doi.org/10.7551/mitpress/7428.001.0001
Nationaal Coördinator Terrorismebestrijding en Veiligheid (NCTV). (2023, February 14). Desinformatie [Disinformation]. https://nctv.nl/onderwerpen/desinformatie
Ostrom, E. (2005). Understanding institutional diversity. Princeton University Press.
Ostrom, E. (2007). Institutional rational choice: An assessment of the Institutional Analysis and Development Framework. In Theories of the policy process (2nd ed.). Routledge.
Ostrom, E., Gardner, R., & Walker, J. (1994). Rules, games, and common-pool resources. University of Michigan Press. https://doi.org/10.3998/mpub.9739
Ostrom, E., Gibson, C., Shivakumar, S., & Andersson, K. (2014). An institutional analysis of development cooperation. In Barrett, S., Mäler, K.-G., & Maskin, E. S. (Eds.), Environment and development economics (pp. 117–141). Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199677856.003.0004
Papacharissi, Z. (2015). Affective publics: Sentiment, technology, and politics. Oxford University Press.
Stieglitz, S., Hofeditz, L., Brünker, F., Ehnis, C., Mirbabaie, M., & Ross, B. (2022). Design principles for conversational agents to support Emergency Management Agencies. International Journal of Information Management, 63, 1–11. https://doi.org/10.1016/j.ijinfomgt.2021.102469
van der Hooft, K. (2022). Framework PIT-Project for students. NHL Stenden University of Applied Sciences.
Vissia, J. (2022). Kaders gesteld door de leden van de onderzoeksgroep de Pit, semester 1 [Frameworks set by the members of the research group the Pit, semester 1]. NHL Stenden University of Applied Sciences.
Yin, R. K. (2009). Case study research: Design and methods (Bickman, L., & Rog, D. J., Eds.; Vol. 5). Sage.
Figure 14.1 Misinformation continua – purpose and accuracy

Figure 15.1 A framework for institutional analysis.
Source: Adapted from E. Ostrom, Gardner, and Walker (1994, p. 37).

Table 15.1 Additional conceptual questions for analyzing the Action Arenas of the three communities

Figure 15.2 Overview of the three learning communities.

Figure 15.3 Announcement for the first meeting of the Pit in March 2022 (translated from Dutch).

Figure 15.4 Important joint learnings for future Pit projects.

Figure 15.5 Comparing the three learning communities; visual overview of the key aspects identified in each case.

Figure 15.6 A roadmap for (future) learning communities to fight disinformation at their own level.
