
International Relations Scholars, the Media, and the Dilemma of Consensus

Published online by Cambridge University Press:  09 September 2025


Abstract

Over the last 15 years, scholars, universities, and foundations have promoted numerous efforts to link the scholarly and policy communities of international relations. Increasing evidence suggests that scholars are succeeding in getting their ideas and findings into the press, and their success bodes well for their ability to influence public and elite opinion. Despite these strides, we know little about when journalists may pick up on academic ideas and evidence or how they will report them in their stories. We seek to fill this gap. To explore the role of the media as a conduit for academic knowledge, we surveyed more than 1,000 foreign policy journalists about their views on IR experts and expertise. We asked when, how, and how often respondents seek out IR scholars and scholarship in the course of their reporting. We also asked about the barriers to consuming peer-reviewed, scholarly research, if and how journalists interact with IR scholars on social media, and how IR scholars’ influence compares to that of scholars in other disciplines. Finally, we asked whether respondents cover a story differently if there is consensus among experts than if there is little agreement. In addition to providing empirical answers to these questions, we used our first-of-its-kind survey of foreign policy journalists to test several arguments from the literature on the media and experts, including that journalists rely heavily on experts and expertise in developing and writing their stories, that they rely more heavily on social science experts than other specialists, and that they tend to inaccurately portray the level of consensus among the relevant experts. Our findings largely support these claims. First, foreign policy journalists often seek out IR experts and expertise for use in their stories, suggesting that the media acts as an important conveyor belt for academic knowledge. These journalists use academic expertise at several key stages, especially when researching background information. Second, foreign policy journalists, like journalists more generally, favor social science experts and expertise over experts from other disciplines. Finally, foreign policy journalists are no different from journalists overall in their tendency to create “false balance”: they underrepresent the degree of consensus among experts and oversample dissenters when scholars overwhelmingly favor a particular policy or interpretation of events.

Information

Type
Article
Creative Commons
Creative Commons License - CC BY
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2025. Published by Cambridge University Press on behalf of American Political Science Association

Democracy is premised on the idea that public opinion informs and constrains policymakers and policy. Nowhere are these effects more tenuous than in the foreign and security policy realm, where information about leaders’ policy choices is not readily and directly available to voters. For this reason, the media plays an especially important role in both informing the public and channeling public opinion to policy elites. The media, however, is not simply a transmission belt for sending information to the public and public opinion to elites. Instead, in deciding what to report and how to report it, journalists influence the public’s and policy practitioners’ understanding of international affairs (Holsti Reference Holsti2004; McCombs and Shaw Reference McCombs and Shaw1972). This is particularly true in international affairs, where the public, international relations (IR) scholars, and even many policymakers and practitioners outside the upper reaches of the executive branch often have limited first-hand information about events abroad. Even when they have an informational advantage over journalists and the public, policymakers are often limited in their ability to set the media agenda. In such an environment, journalists often “produce the news, rather than simply report” it (Merkley Reference Merkley2020).

Scholars have long recognized that the interactive relationship among public opinion, the media, and policy elites is an important topic of research. More recently, these experts also have begun to appreciate that the media’s dual roles as conveyor belt and independent actor make it an effective mechanism for communicating academic ideas and findings to practitioners and the public. Over the last 15 years, foundations, universities, and scholars have supported numerous efforts to directly link the scholarly IR community with journalists and reshape academic research to be more amenable to distribution via mass media outlets. The Carnegie Corporation of New York, for example, funded the Bridging the Gap (BtG) Initiative to promote academic contributions to public debate and decision making on international and foreign policy issues. BtG has offered media training workshops to help IR scholars make their work more accessible to the public, policymakers, and journalists by writing op-eds, blog posts, and policy briefs and by teaching young scholars to interact with the press (Carnegie 2023).

Mounting evidence suggests that scholars in many disciplines are succeeding in getting their ideas and findings into the press, and their success bodes well for their ability to influence public and elite opinion (Lynch Reference Lynch2016). It has become commonplace to note a decline in respect for experts and expertise (Nichols Reference Nichols2017), but journalists continue to consult and cite experts in their stories. Albӕk, Christiansen, and Togeby (Reference Albӕk, Christiansen and Togeby2003) argue that journalists’ use of academics, especially social scientists, as sources has increased dramatically in recent years due to changes within society, the media, and the academy. As Camille Limoges (quoted in Boyce Reference Boyce2006) writes, “We believe less and less in experts … (but) we use them more and more.”

These appeals to expertise may pay off. A growing body of work in IR (e.g., Guisinger and Saunders Reference Guisinger and Saunders2017; Maliniak et al. Reference Maliniak, Parajon, Peterson and Powers2024) and climate change (Bolsen and Druckman Reference Bolsen and Druckman2015; Maliniak et al. Reference Maliniak, Parajon and Powers2020; Malka et al. Reference Malka, Krosnick, Debell, Pasek and Schneider2009; van der Linden Reference Van der Linden2015) shows that expert cues, including messages from IR scholars, can increase public awareness of scientific knowledge and shape policy preferences. When consensus among experts is robust, these professional authorities can have a particularly strong influence on public attitudes (Kerr and van der Linden Reference Kerr and van der Linden2021; van der Linden Reference Van der Linden2015).

The potential to influence public and elite opinion gives IR faculty a strong rationale for trying to communicate their work to a general public, but we know little about when journalists facilitate those attempts. To answer this question, in 2019 we surveyed journalists covering international affairs and U.S. foreign policy about their views on IR experts and expertise. We asked respondents when, how, and how often they seek out IR scholars and scholarship in the course of their reporting. We also asked about the barriers to consuming research produced by scholars, if and how journalists interact with IR scholars on social media, and how IR scholars’ influence compares to that of scholars in other disciplines. Finally, we used a survey experiment to explore whether foreign policy journalists cover a story differently if there is consensus among the experts than if there is little agreement.

Our goal in this article is twofold. First, we empirically examine the relationship between IR scholars and the media, a topic that has yet to be systematically addressed by IR scholars seeking to bridge the gap between theory and policy. The results of our first-of-its-kind survey provide insight into how foreign policy journalists do their work, the extent to which that work is affected by scholarly research on international affairs, and whether the strategies that scholars now commonly employ to increase their impact are, in fact, effective. Second, we use evidence from foreign policy journalists to test several arguments from existing literature on the media and experts: that journalists rely heavily on experts and expertise; that they rely more heavily on social science experts than other specialists; and that they tend to inaccurately portray the level of consensus among the relevant experts.

Our findings largely support these claims. First, foreign policy journalists often seek out IR experts and expertise for use in their stories, suggesting that the media acts as an important conveyor belt for academic knowledge. We find, moreover, that foreign policy journalists use academic expertise at several key stages or in certain scenarios, including when seeking additional information for a story, seeking to quote or otherwise cite an expert for a story, and especially when researching background information. Although some existing literature suggests, and many academics assume, that journalists cite experts and expertise because academics communicate their own research and suggest a story, we find no evidence to support this. Second, foreign policy journalists, like journalists more generally, favor social science experts and expertise over experts from other disciplines. Finally, unfortunately, foreign policy journalists are no different from journalists overall in their tendency to create “false balance”: they underrepresent the degree of expert consensus and oversample dissenters when scholars overwhelmingly favor a particular policy or interpretation of events.

The remainder of the article is divided into four sections. We briefly explore existing literature on the role of the media and the media’s use of expert sources. We then describe our survey methods and the demographic characteristics of survey respondents, present our findings, and summarize our results, reflecting on their implications for several debates on the role of the media and expertise in foreign policy and for efforts by IR scholars to bridge the gap between the academic and policy communities of IR.

Experts and the Media

In a democratic society, the media attunes national leaders to public opinion and provides information that allows citizens to hold leaders accountable. In the process, students of political science and communications agree that the media shapes public opinion by determining which issues receive the most attention and how those issues are framed (e.g., Entman Reference Entman2007; McCombs and Shaw Reference McCombs and Shaw1972). It is not surprising, then, that journalists are a prime target for IR scholars and others who want to influence U.S. foreign policy. The media can convey academic arguments and evidence to both the public and policymakers, and it can help shape public and elite opinion based on those arguments and evidence. Nevertheless, although there is a large literature on the role of the media in the making of U.S. foreign policy (see Baum and Potter Reference Baum and Potter2008 for a review), political science and IR have had little to say about the role of academic experts and expertise in foreign policy or the relationship between academic experts and journalists.Footnote 1 The communications and media studies literature, in contrast, has explored two key aspects of the relationship between journalists and scholars: the media’s use of experts and expertise and its portrayal of expert consensus.

There is widespread agreement within the journalism and communications literature that journalists cite experts more frequently than in the past. This increase in journalists’ use of experts and expertise goes back at least 50 years. Use of expert sources in television network news stories increased 300% from 1978 to 1988, and the number of network uses of expert sources in their campaign coverage was seven times as high in 1988 as in any of the four previous presidential elections (Chan Reference Chan1998). A Danish longitudinal study (Albӕk, Christiansen, and Togeby Reference Albӕk, Christiansen and Togeby2003) similarly found an “explosive increase in references to experts in the mass media in the 1990s,” although the rate of increase appears to have slowed in the 1990s and 2000s (Albӕk et al. 2011). Media stories, moreover, increasingly balance their references to experts with non-experts who challenge expertise and often are used to frame stories (Boyce Reference Boyce2006). Still, there is general agreement that, even in an age of skepticism toward experts and expertise (e.g., Nichols Reference Nichols2017), “the presence of academic and other experts in news journalism has increased notably over the last two decades” (Niemi and Pitkänen Reference Niemi and Pitkänen2016; also see Barnhurst Reference Barnhurst2003; Bauer and Gregory Reference Bauer, Gregory, Bauer and Bucchi2007).

Increasingly, the expert sources that journalists most often consult are academic researchers. Between 1985 and 1988 the number of expert sources on Nightline—most of whom were academics, doctors, or lawyers—exceeded the number of policy practitioners, and these same groups of experts comprised almost 60% of live appearances and occupied nearly 50% of airtime on Nightline and MacNeil/Lehrer NewsHour during a six-month period of study in 1989 (Chan Reference Chan1998). More recently, Albӕk et al. (Reference Albæk2011) found that researchers appear as experts in news articles on election campaigns far more frequently than do members of all other expert groups—including politicians and other practitioners, political commentators, think-tank representatives, and others with specialized knowledge in a given field—combined.

Why such a dramatic increase over the last several decades in the media’s use of experts and expertise, including academic sources? Certainly, we have seen an increase in scholars’ attempts to communicate their work to a broader audience (Lynch Reference Lynch2016). In part, this trend within IR and political science has been driven by an understanding of the role of the social scientist within society, and in part it is a response to larger social and political forces pushing individual academics and universities to justify their relevance (Albӕk, Christiansen, and Togeby Reference Albӕk, Christiansen and Togeby2003).

Albӕk, Christiansen, and Togeby (Reference Albӕk, Christiansen and Togeby2003) note two additional explanations for the change: the increasingly technical and specialized nature of knowledge and policy debates as well as the professionalization of the media, including the move toward greater objectivity in reporting. With the rise of television reporting and increasing skepticism of policymakers’ statements and actions came a turn from event-centered and descriptive news stories to what has been called contextual, interpretative, or explanatory reporting—that is, lengthier, more interpretive, and more analytical stories (Albӕk 2011; Barnhurst and Mutz Reference Barnhurst and Mutz1997; Fink and Schudson Reference Fink and Schudson2014; Hopmann and Strömbäck Reference Hopmann and Strömbäck2010; Patterson Reference Patterson2013). Decades ago, Stephen Hess (Reference Hess1981) called this “social science journalism.” This shift required journalists to provide context for and analysis of social issues on which they may have little expertise, leading them to call on external experts. Citing experts can help journalists to strategically “craft stories that appear objective, accurate, and unbiased and thus protect journalists from outside criticism” (Fleerackers Reference Fleerackers2023). It also can serve a normative and even performative role for journalists: “The selection, citation, and attribution of expert sources is one ritual for performing objectivity. Attributing claims or statements to trusted sources allows journalists to signal that the information they present is credible and that they, themselves, are authoritative enough to access these elite sources” (Fleerackers Reference Fleerackers2023; also see Carlson Reference Carlson2020).

Social scientists, in particular, are cited more often by journalists, and the rate at which they are cited is continuing to grow more quickly than that of other experts (Albӕk, Christiansen, and Togeby Reference Albӕk, Christiansen and Togeby2003; Albaek et al. Reference Albæk, Elmelund-Præstekær, Hopmann and Klemmensen2011; Wien Reference Wien2014). This might seem obvious in a study of foreign policy journalists, but it is true across a range of issues. Jonker and Vanlee (Reference Jonker and Vanlee2023) find, for example, that compared to their representation among university personnel, social scientists are the most highly overrepresented academics in terms of all mentions in the written press. Albӕk et al. (Reference Albæk, Elmelund-Præstekær, Hopmann and Klemmensen2011) remind us that, even though their research focuses on an inherently political topic—elections—social scientists are not necessarily experts on all important topics that journalists might cover during an election campaign, such as climate change or health care.

This suggests at least two reasons why members of the media might turn to social scientists more often than other types of experts. First, journalists may have greater training in and awareness of social science knowledge and methods. Most leading journalism schools in the United States started as social science departments at universities, “which brought social science approaches into the training of journalists” (Barnhurst and Mutz Reference Barnhurst and Mutz1997). Wihbey’s (2017) survey of science journalists finds, moreover, that journalists with specialization in a particular area are more likely to consult and cite experts in their stories. Second, regardless of the topic, but particularly in an area like international relations and foreign policy, most stories are about policy. Even stories about science issues or health care are generally about political and policy issues surrounding the technical or scientific problem. In such cases, journalists generally provide their own commentary on the non-political side of the stories but rely on experts to comment on political issues. It is not surprising, then, that social scientists, and political scientists in particular, are the most cited experts (Reinemann and Wilke 2003, as cited in Hopmann and Strömbäck Reference Hopmann and Strömbäck2010). Nor is it surprising that expert remarks in the media have changed from a focus on communicating the results of academic research to commenting on political issues (Albӕk, Christiansen, and Togeby 2003; Albӕk et al. 2011; Wien Reference Wien2014).

It matters how journalists portray the experts and expertise they use in and for their stories. Page, Shapiro, and Dempsey (Reference Page, Shapiro and Dempsey1987) find that experts exert a strong influence on public opinion, a finding that is confirmed by more recent literature on the effect of expert cues on public opinion (Maliniak et al. Reference Maliniak, Parajon, Peterson and Powers2024; Johnston and Ballard Reference Johnston and Ballard2016; Lewandowsky et al. Reference Lewandowsky, Ullrich, Seifert, Schwarz and Cook2012; Van der Linden Reference Van der Linden2015).

Unfortunately, another major insight of the journalism and communications literature suggests that the media generally does not accurately portray the degree of consensus among experts on a given issue. Rather, journalists prefer to depict controversy or a balance of opinion among specialists on an issue. For example, despite persistent and overwhelming consensus among IR scholars against the 2003 war in Iraq, and increasingly public representation of that consensus as the war progressed, major U.S. daily newspapers overrepresented pro-war arguments by academic experts throughout the duration of the conflict (Long et al. Reference Long, Maliniak, Peterson and Tierney2015). On climate change, similarly, the media often highlights the views of so-called experts who deny that climate change is real and human-induced (Boykoff and Roberts Reference Boykoff and Roberts2007; Antilla Reference Antilla2005; Boykoff and Boykoff Reference Boykoff and Boykoff2004). “Media critics have expressed concern that the journalistic standard of balance is now so ingrained in reportage that it is reflexively applied even to issues for which the weight of evidence overwhelmingly supports one ‘side,’ a phenomenon that has been labeled false balance” (Koehler Reference Koehler2016; also see Boykoff and Boykoff Reference Boykoff and Boykoff2004).

The literature suggests at least two possible sets of reasons for this media bias. First, in presenting a false sense of balance in expert opinion, journalists cater to media consumers, who prefer novelty, controversy, and balance to fact-based reporting and expert consensus. “[T]he increased commercialization of the media creates incentives for journalists to focus on those aspects with which the largest audience can be reached” (Patterson 1993; also see Bennett Reference Bennett2007; Iyengar Reference Iyengar1991; Merkley Reference Merkley2020; Nelkin Reference Nelkin1996). Unfortunately, consensus is boring; controversy is not. Commercial incentives create “a perverse situation where poorer quality research can garner more news coverage than robust research … as the poorer research is more likely to yield surprising and newsworthy results” (Dempster 2022). Moreover, the media system produces incentives to respond to the public’s preference for “bothsideism.” According to a recent survey by the Pew Research Center, 76% of Americans believe that journalists should always try to give all sides of an issue equal coverage (Forman-Katz and Jurkowitz Reference Forman-Katz and Jurkowitz2022). In short, according to demand-side arguments, false balance is the product of a commercial press that must cater to its market.

Second, the media’s tendency to portray false balance may result less from the demands of consumers in the media market than from journalists’ own preferences. As we saw earlier, journalists may strategically use experts and expertise to support and illustrate an idea to which they already are committed, although our findings provide little support for this argument. But journalists also have other strategic and normative reasons for pursuing balance in expert opinion in their stories. The professionalization of the media and its normative commitment to objectivity (e.g., Schudson Reference Schudson2001) increased journalists’ dependence on experts, as noted earlier, but it also contributed to the development of a norm of balance. Balance requires neutrality and fairness; it “requires that reporters present the views of legitimate spokespersons of the conflicting sides in any significant dispute, and provide both sides with roughly equivalent attention” (Entman Reference Entman1990; also see Durham Reference Durham1998; Muñoz-Torres Reference Muñoz-Torres2012). Ironically, in other words, this bias toward false balance was born in part from the need to avoid the appearance of bias. False balance also, finally, may be the deliberate result of attempts by journalists to guard against the fact that the experts, even when they agree, may be wrong. As Conrad (Reference Conrad1999) argues, “[Q]uotes are used to balance enthusiasm for a new finding with a ‘cautionary note’” more often than they are used to provide balance on a controversial issue.

The effects of false balance may be profound. Numerous studies suggest that it may distort the public’s views—causing confusion, uncertainty, and conflict and leading media consumers to believe that the experts are divided—on issues about which there is little or no controversy among the specialists (Koehler Reference Koehler2016; Rietdijk and Archer Reference Rietdijk, Archer, Snow and Vaccarezza2021). We see such effects especially in the areas of climate change (e.g., Corbett and Durfee Reference Corbett and Durfee2004) and the supposed link between vaccines and autism (e.g., Dixon and Clarke Reference Dixon and Clarke2013).

In the rest of this paper we examine how well some of the claims of the journalism and communications literature apply to the role of IR scholars in the foreign policy media. Specifically, we ask:

  1. Do foreign policy journalists use IR scholars and scholarship in their stories? How heavily do they rely on these experts, and why?

  2. Do foreign policy journalists depend more heavily on social scientists than on experts from other disciplines?

  3. Do foreign policy journalists seek to accurately depict the level of consensus among IR scholars on particular issues, or do they seek balance in their coverage?

We turn next to a description of the survey we use to answer these questions.

About the Survey

The Teaching, Research, and International Policy (TRIP) Project at the Global Research Institute at William & Mary launched the TRIP Foreign Policy Journalist Survey on April 9, 2019, and the survey remained open until June 8, 2019 (Entringer et al. Reference Entringer García Blanes, Gillooly, Peterson, Powers and Tierney2025). The survey was programmed and hosted on the online survey platform Qualtrics. In this section, we discuss the sample of respondents to whom the survey was sent, methods for collecting this sample, timing of the survey, the response rate, and demographic characteristics of our respondents.

We sent the survey to a broad range of foreign policy journalists. We gathered the sample via a three-pronged strategy. First, we identified foreign policy journalists who worked on topics related to international affairs and were employed by news outlets identified in the 2018 edition of the Pew Research Center’s “State of the News Media” report. These outlets included the most widely read newspapers and magazines (e.g., The New York Times, Wall Street Journal, The New Yorker, The Atlantic), digital-only news sources (e.g., Vox, Slate, Buzzfeed, Axios), radio and podcasting networks (e.g., NPR, Voice of America, Public Radio International), and broadcast and cable television news networks (e.g., NBC News, CNN, Fox News). Second, we supplemented our sample by incorporating journalists who were included in the sample of the 2019 Chicago Council on Foreign Relations survey of foreign policy opinion leaders. Third, we used the Leadership Library database to identify individuals with expertise in “international affairs/foreign affairs” or “defense” who were employed by news media organizations. We removed any duplicates across sources and then used Leadership Library, LinkedIn, personal websites, and other online resources to secure contact information.

These efforts yielded a sample of 1,059 foreign policy journalists. We invited these journalists to participate in the survey via e-mail. Our invitation was endorsed by two prominent foreign policy journalists, David Sanger and Eric Schmitt, but we offered no inducements to encourage participation. A total of 183 journalists responded, resulting in a response rate of 17.3%. The sample of respondents is representative of our larger population in terms of age, gender, and education. Table 1 displays the demographic characteristics of respondents to the survey.

Table 1 Demographic characteristics of respondents

Results

We divide our findings into two parts. In the first section, we report on foreign policy journalists’ use of IR experts and research in media stories. Next, we explore the ways these journalists represent the level of expert consensus in their work: do they accurately reflect expert consensus when it is present, or do journalists instead seek to represent a balance of expert opinion, even when no such balance exists among the experts?

Media Use of IR Experts and Expertise

As we noted earlier, there are good theoretical and empirical reasons to believe that journalists’ reliance on experts and expertise is strong and growing. To further explore this issue, we asked foreign policy journalists how and how often they consult IR experts, at what stages and for what reasons they consult experts, and what, if any, obstacles exist to using academic knowledge. Using a single survey we cannot test whether journalists’ reliance on experts is increasing over time, as the literature suggests, and we do not directly test whether journalists cite experts in the social sciences at higher rates than experts in other fields. We can speak, however, to the media’s overall reliance on experts, make some initial comparisons between the social sciences and other areas, and explore their reasons for citing experts, including greater communication from academics.

Overall, we find a high level of engagement between foreign policy journalists and IR experts, consistent with the more general claim that journalists frequently cite experts and expertise. Again, consistent with the literature, foreign policy journalists favor social scientific sources and knowledge above others. Interestingly, however, we find no evidence that they cite academic experts because those experts are adept at communicating their research to the media.

Frequency of engaging experts. We began by asking foreign policy journalists how often they consult academic expertise and, more specifically, social science research, and their answers support prevailing arguments about the media’s use of experts and expertise. In response to the question, “How often do you, or did you, relate the arguments and evidence made in social science research to your work?” 93.4% of the foreign policy journalists reported using such research at least a few times a year. This includes 36.8% who said they use social science research daily or a few times a week. The results were very similar when we asked, “How often do you utilize the arguments and evidence from scholarly research on foreign policy or international politics topics in the work you do as a journalist?” Responses to the two formulations differed in two small ways: the percentage of journalists who reported using social scientific research on a daily basis was more than 6 percentage points higher than the percentage who reported using IR research daily, while the percentage who reported using IR research a few times a week was nearly 5 points higher than the percentage who reported using social science scholarship that often. Our findings across the questions on the utility of research are consistent with those from surveys of U.S.-based journalists (Wihbey Reference Wihbey2017), journalism professors (Ordway 2020), and health and science journalists (Gesualdo et al. Reference Gesualdo, Weber and Yanovitsky2020) (see Figure 1).

Figure 1 Frequency of engagement with IR knowledge, policymakers and journalists

Note: “TRIP 2017 Policymaker and 2019 Journalist Surveys”

Social science experts. At first glance, it might seem that a group of foreign policy journalists, who presumably have greater knowledge of the social sciences than of non-social-science fields, would not need to rely heavily on social science experts. Indeed, foreign policy journalists are more likely to be trained in fields like political science or international affairs than in science or business. A substantial share (13.7%) of the journalists we surveyed report that they hold a degree in political science, and 16.9% have a degree in international affairs. These may not seem like particularly large percentages until we consider that the only categories with higher percentages are journalism (25%) and all other, non-listed fields combined (18.5%). By comparison, only 0.8% of respondents say they have a degree in the sciences, and none hold a business degree.

Our findings are consistent with the consensus within the media literature that journalists look to social scientists more often than to other types of experts. The literature suggests that members of the media cite social scientists most often because journalists are most likely to consult experts in their own area of expertise (Wihbey 2017) and because they most often seek commentary on political issues (Reinemann and Wilke 2003, as cited in Hopmann and Strömbäck Reference Hopmann and Strömbäck2010), perhaps to avoid the appearance of bias. We asked respondents, “How useful are the arguments and evidence used in the following disciplines?” and we provided a list of twelve different fields of study. As Figure 2 shows, journalists report that most disciplines generally considered part of the social sciences are useful to their work.Footnote 2 Sociology and anthropology are the exceptions: only 68.5% of journalists find the former and only 50.8% find the latter to be useful. In contrast, between 86.9% and 96.1% find the other social sciences to be helpful. Cognate disciplines, like history and law, also fare well, as do area studies, which draw from both the humanities and the social sciences. This is in stark contrast to the natural sciences (53.6%) and business (57.6%). Even psychology, although in part a social science, performs poorly, with only 52.4% of journalists saying it is useful. This is true despite the fact that psychology is increasingly conceived of and practiced as a hard science field.

Figure 2 How useful are the arguments and evidence used in the following disciplines?

Note: “TRIP 2019 Journalist Surveys”

At the same time, we see variation in journalists’ use of experts from different fields depending on the type of news outlet in which journalists’ work primarily appears. Journalists at magazines and radio, although they only comprise 11% and 4.5% of our sample respectively, value sociology, anthropology, and business more highly than do other journalists. Magazine journalists also find psychology and the physical sciences to be more useful than do their colleagues at other types of news outlets.

Stage of writing. As another measure of the media’s use of IR experts and expertise, we asked foreign policy journalists, “At which stages of the writing process are you likely to seek out academic research?” Much of the literature focuses only on the direct citation of or commentary by experts in stories, although several existing surveys of journalists explore the myriad ways journalists can make use of academic knowledge (e.g., Albӕk 2011; Wihbey Reference Wihbey2017).

Our findings, displayed in Figure 3, are largely consistent with the findings of these studies. First, we find that foreign policy journalists make use of academic expertise in many ways. Majorities of survey respondents reported that they pursue expert knowledge when “seeking supplemental information for a story prior to publication,” “seeking to quote or otherwise cite an expert for a story,” and “researching background information for a story.” A sizable minority (28.1%) also report that they consult expert knowledge “when looking for topics for a story.” Wihbey (Reference Wihbey2017) similarly finds that majorities of journalists find academic research “very helpful” in providing story context, improving accuracy, countering misleading claims, improving the framing of a story, and sparking new story ideas. Second, our results show that journalists are most likely to seek out experts and expertise when they conduct background research for a story. In that case, journalists may or may not also directly cite the academic expert in the story. In Wihbey’s (Reference Wihbey2017) study, 74% of journalists report that academic research is very helpful. Finally, we do not find evidence that journalists cite experts and expertise because academics are reaching out to members of the media to communicate their own research and suggest or drive a media story, as Albӕk et al. (Reference Albӕk, Christiansen and Togeby2003) suggest might be true.

Figure 3 Responses to “At which stage of the writing process are you likely to seek out academic research”

Note: “TRIP 2019 Journalist Surveys”

Reasons for using expertise. To further understand when and why journalists use academic knowledge, we asked survey respondents about some specific scenarios in which they might look to experts and expertise. These scenarios included: “when I need external support or validation for content I am creating,” “when I need an ‘outside perspective’ on content I am creating,” “when I need more substantive expertise,” and “when I need to fact check claims made by those who hold public office.” Figure 4 presents the results for this question.

Figure 4 Responses to “How often do you seek out the results of academic research in each of the following situations?”

Note: “TRIP 2019 Journalist Surveys”

Again, a few findings stand out. First, foreign policy journalists turn to academic scholarship relatively often in all four scenarios; between 76.8% and 96.3% of respondents reported that they sometimes, often, or very often sought out academic expertise in these situations. Second, by far the circumstance under which journalists most often look to academic research is when they need substantive expertise. Third, we find only limited evidence that foreign policy journalists pursue academic research to provide external support for their own arguments. Although 39.6% of respondents said they often or very often use academic expertise in this way, they do so in this situation at the lowest rate of any of the scenarios provided. This finding suggests that, contrary to some of the journalism literature (e.g., Weiss and Singer Reference Weiss and Singer1988; Rich Reference Rich2001), validating one’s own conclusions in an already planned story does not appear to be an especially strong reason that journalists look to academic experts and expertise.

Obstacles to using academic research. The accessibility of academic scholarship provides a potentially important constraint on journalists’ ability to use this expertise in their stories. In recent years, numerous IR scholars and other commentators have explored the real and perceived obstacles to sharing academic research with broader audiences, including journalists. Most of these observers blame what they see as a growing gap between the academic and policy communities of IR on an academy that fosters a “cult of irrelevance” within IR and political science more generally (Desch Reference Desch2019). Such a cult produces research that is too abstract and theoretical to be of significant use to policy practitioners (e.g., Gallucci Reference Gallucci2012; Nye Reference Nye2009; Jentleson Reference Jentleson2002). Critics have targeted much of their ire at what John Mearsheimer calls the “mathematization” of the discipline (Miller Reference Miller2001).

Many of these critiques are borne out by the small number of previous surveys of journalists that address the obstacles to journalists’ use of academic research. One such study conducted in 2021 found that 60% of journalists said that paywalls were an obstacle to finding and using such research, while 58% reported that academic jargon limited their access to academic work. In addition, 31% said they did not have time to look for this research, while 54% said they did not have time to read lengthy academic articles (Ordway Reference Ordway2022). A recent review of how journalists use social science research also points to “generally low levels of numeracy and statistical knowledge among journalists” as a significant obstacle.

To fully understand the media’s use of IR scholarship, we asked foreign policy journalists about their views on various obstacles to using academic research in their work. The question asked respondents to rate each obstacle as “very significant, significant, slightly significant, or not significant at all.” We present the results in Figure 5, grouped by either significant (somewhat or very significant) or not significant (slightly significant, not significant at all). Two features of these results are worth highlighting. First, there remain significant obstacles to using research, especially with respect to the timeliness and availability of research, the time needed for journalists to keep up with new academic work, and the high level of abstraction at which academic knowledge is often generated. Second, in contrast to the findings of the journalism literature and critiques of the IR discipline, the least significant obstacle reported by journalists was that “academic work is too quantitative.” This finding is less surprising when viewed in the context of the increasing prevalence of “data journalism,” an emerging computational journalism field that involves the analysis and visualization of large datasets, often originally assembled by scholars, to tell media stories (e.g., Fink and Anderson Reference Fink and Anderson2015; Gray, Chambers, and Bounegru Reference Gray, Bounegru and Chambers2012).

Figure 5 For your colleagues, how significant are the following potential obstacles to using academic knowledge in their work? (Grouped by significant and not significant)

Note: “TRIP 2019 Journalist Surveys”

At the same time, however, the foreign policy journalists we surveyed seemed troubled by the tendency of academics to write for each other, in a language and with methods that are not easily understood by those outside the profession. Although the use of quantitative analyses in academic research was the least significant of the obstacles mentioned, 39.4% of journalists still said it was a significant obstacle to using experts and expertise in their stories. A large share of respondents (65.9%) also agreed that “too jargony” academic terminology impedes their use of research. Even if they are not overly worried about quantitative data and analysis, in short, foreign policy journalists see the cliquish nature of academic disciplines as a real barrier to using academic experts and expertise in media stories.

Engagement on social media. Citing academic knowledge in stories is not the only way that journalists may use experts and expertise. Correspondents and other journalists also regularly follow academics on social media platforms, for example, as a means of tracking commentary on international issues and events and learning about new IR research. A whopping 91.4% of respondents said that they follow scholars who write about foreign affairs or international relations on social media platforms such as TwitterFootnote 3 or Facebook. Of the journalists who follow IR scholars on social media, 63.8% reported that they view content from IR scholars on social media “daily,” and a further 25.2% noted that they view such content “a few times a week.” Only 3.2% reported that they “never” view such content or view it only “a few times a year.” These findings suggest that a primary—if not the main—way in which foreign policy journalists interact with IR scholars is via social media platforms.

Sources of expertise. Our final measure of foreign policy journalists’ use of expertise examines the sources of information for their stories. Conventional wisdom suggests that journalists, policymakers, and the public are more likely to garner information on foreign affairs from blog posts, op-eds, and policy reports than from peer-reviewed research generated by members of the academy, since the former are more available and accessible to a wider audience than the more esoteric, academic publications (Avey et al. Reference Avey, Desch, Petrova and Wilson2021b; Tama et al Reference Tama, Barma, Goldgeier and Jentelson2023). To test the veracity of this claim, we asked journalists how important a variety of different information sources are to their work. The results are displayed in Figure 6. Survey respondents indicated that the most important sources were newspapers and news magazines, think tank and NGO reports, policy journal articles, and interviews with academic experts. The least important source—and the only source that a majority of respondents indicated was not important to their work—was television and radio. Just under 80% of the journalists reported that social media is important to their work.

Figure 6 Responses to “How important are the following sources of information to the work you do as a journalist?”

Note: “TRIP 2019 Journalist Surveys”

Although a single survey does not allow us to test whether the media’s use of experts and expertise is increasing over time, we find ample evidence that foreign policy journalists frequently use academics and academic expertise in their stories despite continuing obstacles to doing so. Our results also show that the journalists surveyed cite social science sources at a greater rate than expert sources in other fields. At the same time, our respondents did not report that they cite experts and expertise because academic experts are sharing their research with members of the media. In short, we find supporting evidence for most arguments from the literature on media and expertise. We turn now to a test of whether journalists accurately depict the level of consensus among experts on a given topic.

The Dilemma of Consensus

As we noted earlier, recent literature in IR explores the impact of expert cues on public opinion, while related literature in journalism studies focuses on the influence of different levels of expert consensus on the way in which journalists portray that consensus in the media. The journalism literature suggests that members of the media have little incentive to accurately portray expert consensus. Rather, to attract large audiences for their work, appear objective, and guard against potentially bad research, journalists generally seek balance in their coverage, even when the experts are not balanced in their views (e.g., Merkley Reference Merkley2020; Bennett Reference Bennett2007; Boykoff and Boykoff Reference Boykoff and Boykoff2004; Iyengar Reference Iyengar1991). To assess these arguments, we explore whether foreign policy journalists are likely to cover a topic differently when scholars agree on the best course of action compared to when the experts are divided.

We study these questions using a survey experiment. In the experiment, we informed respondents about the level of support among scholars for a proposed international treaty and then asked the respondents what percentage of the expert sources they would cite or quote in their reporting as being in favor of the agreement. In short, how might they balance supporting and opposing views? Because the treatments were randomly assigned, any differences in responses across treatment conditions can be interpreted as the average causal effect of each treatment on our respondents’ reported sourcing behavior.

All respondents saw a question about IR scholars’ views on a proposed arms control agreement and a question about economists’ views on a proposed trade agreement. This also allowed us to explore whether foreign policy journalists find IR scholars’ views to be more or less credible or convincing than the views of other social scientists. In both cases, we randomly and independently varied the level of consensus among scholars about the wisdom of the policy under discussion. In the low consensus treatment, respondents learned that 52% of experts on the issue supported the agreement, while in the high consensus treatment, respondents learned that 93% of experts on the issue supported the agreement. Following treatment, we asked our respondents what percentage of the expert sources they would quote or cite as being in favor of the agreement. Would they cite and quote sources on both sides of a policy question equally, regardless of the balance of opinion within the academy, or would they shift their sourcing depending on the actual level of consensus among the experts?
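To make the randomization concrete, the following is a minimal sketch of how the two consensus treatments (52% versus 93% stated support) could be assigned independently for each vignette; the function and variable names are illustrative only and are not drawn from the actual survey instrument.

```python
import random

# The two stated levels of expert support: 52% (low consensus) and 93% (high consensus).
CONSENSUS_LEVELS = [52, 93]

def assign_treatments(respondent_ids, seed=2019):
    """Independently randomize the IR (arms control) and economics (trade) vignettes."""
    rng = random.Random(seed)
    assignments = {}
    for rid in respondent_ids:
        assignments[rid] = {
            "ir_consensus": rng.choice(CONSENSUS_LEVELS),    # arms control vignette
            "econ_consensus": rng.choice(CONSENSUS_LEVELS),  # trade agreement vignette
        }
    return assignments

# Example: assign treatments to a handful of hypothetical respondent IDs.
# Because the two draws are independent, some respondents see the same level
# of support in both vignettes while others see different levels.
print(assign_treatments(range(6)))
```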

All respondents saw the question about IR scholars first and the question about economists second. The two questions were separated by a short battery of questions on issues unrelated to the views of experts. The level of support varied randomly and independently in both questions, so that some respondents saw the same levels of support among both sets of scholars, while others saw different levels. The exact wording of the two questions was as follows (randomized treatments appear in brackets):

Suppose you learned that [52/93] percent of scholars of international relations supported an international arms control agreement. If you were choosing expert sources to cite or quote in an article on this issue, what percentage of those sources do you estimate would favor such an agreement?

Suppose you learned that [52/93] percent of scholars of economics supported an international trade agreement. If you were choosing expert sources to cite or quote in an article on this issue, what percentage of those sources do you estimate would favor such an agreement?

Looking only at the average treatment effect, the results are consistent with a strong preference for accurately reflecting consensus within the academy. Respondents used a slider to indicate their response, which allowed whole number inputs from 0 to 100 percent. The results are displayed in Figure 7. For the question on IR scholars, respondents said that about 54.5% (95% CI: 50.7, 58.3) of the sources they cited or quoted would be in favor of the agreement when they learned that 52% of IR scholars favored the agreement. In contrast, respondents said that they would recruit sources so that about 74.9% (95% CI: 69.45, 80.32) of them favored the agreement when they were told that 93% of IR scholars favored the agreement. This difference of 20.4 percentage points is large and statistically significant, but it also is smaller than we might expect if journalists had an unconditional preference for reflecting the actual distribution of opinion among scholars.
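To illustrate the estimation behind figures of this kind, the sketch below computes treatment-group means, normal-approximation 95% confidence intervals, and the difference in means between the high- and low-consensus groups. It is a minimal sketch assuming responses are stored as simple lists of percentages; the data are hypothetical and will not reproduce the survey’s actual estimates.

```python
import numpy as np

def group_mean_ci(values, z=1.96):
    """Mean and normal-approximation 95% CI for one treatment group."""
    values = np.asarray(values, dtype=float)
    mean = values.mean()
    se = values.std(ddof=1) / np.sqrt(len(values))
    return mean, (mean - z * se, mean + z * se)

def average_treatment_effect(low_group, high_group, z=1.96):
    """Difference in means (high minus low consensus) with a 95% CI."""
    low = np.asarray(low_group, dtype=float)
    high = np.asarray(high_group, dtype=float)
    diff = high.mean() - low.mean()
    se = np.sqrt(low.var(ddof=1) / len(low) + high.var(ddof=1) / len(high))
    return diff, (diff - z * se, diff + z * se)

# Hypothetical responses: percent of cited sources favoring the agreement.
low_consensus = [50, 55, 60, 52, 48, 58]    # told 52% of scholars support it
high_consensus = [70, 80, 90, 65, 75, 85]   # told 93% of scholars support it

print(group_mean_ci(low_consensus))
print(group_mean_ci(high_consensus))
print(average_treatment_effect(low_consensus, high_consensus))
```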

Figure 7 Effect of high consensus treatment split by scholar type

Note: “TRIP 2019 Journalist Surveys”

Plotting the distribution of responses by treatment group instead of just the means (see Figure 8) yields the striking result that, while there is significant consensus among journalists on the distribution of sources to cite or quote when scholars are split, there is significant disagreement when scholars overwhelmingly favor a particular policy. Some respondents indicated that they would want the distribution of responses to reflect scholarly opinion, while many others indicated a willingness to, in effect, oversample dissenters.

Figure 8 Note: “TRIP 2019 Journalist Surveys”

We see similar results for the question about the views of economists on an international trade agreement. When respondents were told that 52% of economists favored an international trade agreement, they indicated that they would recruit sources so that about 55.2% (95% CI: 51.8, 58.6) of them favored the agreement. When told that 93% of economists favored the agreement, respondents indicated that they would recruit sources such that 73.3% (95% CI: 68.3, 78.3) of them were also in favor. Again, this difference is large and statistically significant but smaller than what we might expect if journalists preferred to reflect the actual distribution of scholars’ opinions on a subject on which scholarly consensus is high. As before, we also see consensus among our respondents on the distribution of expert sources when scholarly opinion is split but significant disagreement when scholars overwhelmingly favor a policy.

What do these results mean for the role of expert views in media coverage of foreign policy issues? It appears that foreign policy journalists have a preference for reflecting the level of consensus when scholars are split, and they are responsive to changes in the level of consensus, but there is still a large market among foreign policy journalists for sources that hold contrarian views. Members of the media, it seems, are willing to highlight potential critics of a policy even if those critics are very much in the minority among academic experts. These findings support the journalism literature’s conclusions on media coverage of expert consensus. They also support the findings of the climate change literature, which documents a tendency among members of the media to highlight the views of those who disagree with the scientific consensus that climate change is real and caused by humans (Antilla Reference Antilla2005; Boykoff and Boykoff Reference Boykoff and Boykoff2004; Boykoff and Roberts Reference Boykoff and Roberts2007). This tendency biases the public’s understanding of the scientific community’s views and, consequently, undermines support for policies meant to address climate change (Malka et al. Reference Malka, Krosnick, Debell, Pasek and Schneider2009).

Read narrowly, the effect we observe appears to be bad news. Even when scholars come to a consensus on an issue, their collective ability to shape public opinion, policy debates, and the views of policymakers through the media may be limited. One high-profile example is the debate surrounding the invasion of Iraq in 2003. IR scholars overwhelmingly opposed the war, but very little media coverage throughout that conflict actually reflected scholarly consensus. Rather, most stories included a balance of pro- and anti-invasion views (Long et al. Reference Long, Maliniak, Peterson and Tierney2015).

If we take a broader view, however, the willingness of foreign policy journalists to present contrarian views could be read as evidence of a preference to provide a hedge against the possibility of faulty science or premature consensus by exposing readers to arguments and evidence from both sides of an issue. In his book, The Ideas Industry, Drezner (Reference Drezner2017) argues that it is the role of scholars acting as true “public intellectuals” to hold accountable “thought leaders,” who are more “evangelists” than scholars, by pointing out when “the emperor has no clothes.” Journalists could be playing a similar role by oversampling dissenting or contrarian views.

It is also possible that our analogy to the climate debate may be strained. The vignettes that we displayed to respondents asked about sourcing for a story about a specific policy, not underlying scientific facts. This may seem a subtle distinction, but it is an important one. IR scholars, for example, are generally united in their view that international cooperation is a good thing for society, but they might disagree in productive (and sometimes unproductive) ways about the particulars of a proposed policy—like an arms control or trade agreement—that is meant to promote that kind of cooperation. Foreign policy journalists who highlight contrarian views might lower the probability that a policy is adopted, but they might also force changes to the proposed policy that are Pareto-improving.

All that said, we read our results as representing a dilemma of consensus in which the scholarly community has a hard time communicating its collective views via the media when there is broadly shared agreement among them. Ironically, under these conditions, the best course of action may be for scholars to take their case directly to readers via op-eds, social media, and blog posts in prominent outlets. In the extreme, scholars might feel the need to pay-to-play by purchasing ads in major newspapers as a number of prominent political scientists did in the lead up to the Iraq War and others did to register their disapproval of the Trump administration’s rejection of the U.S.-led multilateral liberal order.

We also note that our respondents provided similar answers in response to variation in the level of consensus among scholars, whether we labeled them as scholars of international relations or economics. This suggests that concerns that journalists and others may take economists “more seriously” than social scientists with different training or substantive expertise (Drezner Reference Drezner2017; Fourcade, Ollion, and Algan Reference Fourcade, Ollion and Algan2015) may be overblown. Although this is encouraging for IR scholars, the question of differential credibility is distinct from whether or not economists exercise more influence in policy debates that play out in both the media and government.

Conclusion

Despite the importance to democratic societies of the foreign policy press and the increasing recognition among IR scholars of the media's role in communicating academic ideas and evidence to foreign policy practitioners, little attention has been given to whether, when, how, and why foreign policy journalists engage IR scholars and scholarship. Academics have made significant efforts to widen their relationship with the media through initiatives like Bridging the Gap, but they have not turned the tools of their trade to systematically studying the relationship between the academy and the media. To address this omission, in 2019 we surveyed journalists covering IR and U.S. foreign policy about their views on IR experts and expertise. Our findings provide support for more general trends in and arguments about journalists' use of experts and expertise, the extent to which they rely on social science experts above others, and the media's portrayal of expert consensus and dissensus. Foreign policy journalists do often seek out IR experts and expertise for their stories and coverage, as the media literature suggests. This finding implies that the media acts as an important conveyor belt for academic knowledge, making this area of research vital for understanding how the scholarly community can communicate and share its work effectively. Engagement between journalists and experts is, in fact, robust. Foreign policy journalists also favor social science experts and expertise, as the literature on media and experts suggests they should. This finding is inconsistent, however, with what some scholars see as a "cult of irrelevance" fostered by IR and political science research (Desch 2019).

Journalists do seek out expert opinion in a variety of contexts, particularly when they need background information or want to cite an expert in a story, but there are significant obstacles to using academic work. In particular, journalists find it challenging to keep up with forthcoming academic research, to cut through academic jargon, and to find research that is timely. Finally, we find that foreign policy journalists underrepresent the degree of consensus among experts and oversample experts who hold a minority, dissenting view when scholars decisively favor a particular policy. The scholarly community, in short, may have a hard time communicating its collective views via the media when there is broadly shared agreement among the experts, something we call the dilemma of consensus.

Overall, the increased tendency among foreign policy journalists to cite experts and expert knowledge could result in more informed news stories. Journalists often turn to academics when they need more substantive expertise in a story. They point, however, to a number of obstacles to effectively using expert knowledge, including the timeliness and availability of research, the time commitment required to keep up with new and forthcoming academic work, and the high level of abstraction at which academic work is often written.

These obstacles must be reduced if foreign policy journalists are to portray experts and expert knowledge on IR effectively and accurately. Public-facing scholarship by academic experts, such as op-eds or blog posts about academic research and more applied or empirical work, can help lower barriers for the media. Indeed, producing such public-facing work has become the dominant approach to promoting engagement between the academic and policy communities of IR and a method taught by the Bridging the Gap Initiative to faculty who seek to articulate and disseminate their research to policy practitioners. We have seen that journalists do not generally use academic research when scholars reach out to suggest a story, so lowering barriers for journalists by providing accessible, timely versions of academic research may be a more effective way for scholars to communicate their ideas and findings to members of the media.

Unfortunately, things can go wrong when expert opinion arrives in the public square; IR scholars' attempts to produce public-facing scholarship may shape journalists' use of academic knowledge in ways the scholars who produce the research never intended. In particular, we find that foreign policy journalists often underrepresent the degree of consensus among experts and oversample those who hold a contrary view when there is overwhelming agreement about a particular policy or theme. We did not directly ask respondents how many experts they tend to cite or the extent to which those experts reflect consensus versus dissenting views, but the results of our survey experiment clearly demonstrate that the media prefers balance over consensus. Whether this is a conscious decision on the part of journalists, who may deliberately rely on and present the work of a small number of scholars rather than report on the consensus among IR scholars, or whether journalists themselves do not realize the degree of consensus among the experts, the end result is the same: the public receives the impression that the field is divided on an issue when it is not. Moreover, presenting balance in the media when there is none among experts can lead to the dissemination of "bad science" to members of the public and policy practitioners, particularly around issues like climate change denial or vaccine misinformation.

The news is not all bad. The media's tendency toward false balance could provide an important check on incorrect or premature consensus on international issues. Our findings suggest, moreover, that foreign policy journalists are responsive to changes in the level of academic consensus; that is, they will change the way they report on a topic if expert consensus solidifies or splits, reflecting changes in expert opinion more accurately. Nevertheless, there remains a large market among foreign policy journalists for sources that hold contrarian views, even if those contrarian experts are few. In short, it can be difficult for IR scholars to convey collective knowledge.

The survey results we report here contribute to TRIP's broader research program, which examines how academics can influence policy and policymakers and the mechanisms by which scholars can communicate their ideas and data to policymakers and the public. Researchers explore how academic knowledge is received by the public (Maliniak, Parajon, and Powers 2020) and how scholarly ideas and data can be communicated directly and effectively to policymakers (Avey et al. 2021a; Maliniak, Parajon, and Powers 2020). Ongoing research also examines how non-governmental organizations and think tanks serve as conduits for academic knowledge to enter the public and policy spheres, particularly on international cooperation and global governance issues. Our research further explores how academic knowledge reaches policymakers and how they use it: previous work examines how practitioners use or are influenced by academic ideas and data (Avey and Desch 2014; Avey et al. 2021a), the policy relevance of publications across academic journals (Hoagland et al. 2020), whether and to what extent faculty members see their work as policy-relevant (Maliniak et al. 2011; 2018), and how policy ideas in IR travel (Maliniak et al. 2018).

Ongoing TRIP research also explores how academic ideas may prematurely "escape" from the confines of the academy and become endemic in policy circles. The dilemma of consensus means that expert opinion is often misrepresented in the press and, by extension, to the public, increasing the possibility that poorly vetted or refuted ideas may find their way into policy circles. This research stream seeks to better understand how scholars might communicate the uncertainty or scope conditions surrounding particular theories and findings in order to prevent such "lab leaks" (Musgrave 2021).

Finally, the TRIP research agenda examines the IR discipline itself to address the larger questions of what knowledge IR scholars try to communicate to public and policy audiences and how they communicate those ideas and data. These questions, in turn, compel us to ask who constitutes the IR discipline. To answer them, TRIP researchers and other scholars explore the politics of citations and citation imbalance in the discipline (Dion, Sumner, and Mitchell 2018; Maliniak, Powers, and Walter 2013), whose work is "seen" both in and out of the academy (Hardt et al. 2019; Smith, Gillooly, and Hardt 2022), and who self-selects into different subfields, methodologies, or subject areas (Smith et al. 2022). Because racial and gender biases influence how expert knowledge is formed and disseminated within academic and policy circles (Gillooly, Hardt, and Smith 2021; Rublee et al. 2019; Zvobgo and Loken 2020), researchers have explored how expertise is developed, understood, and shaped by the lived realities of the experts who produce it. In short, this research asks who is an expert, whose work is seen by the field, and whose research is disseminated to the public through traditional conduits like the media.

The broad TRIP research program addresses how the IR field is practiced and how its knowledge is disseminated. We ask whether and how expert knowledge affects public opinion and policymakers. Building on that program, this article explores the relationship between the media and expert knowledge: whether, why, and how foreign policy journalists engage with IR scholars and their work. The media is an important transmission belt for academic knowledge, and understanding how expert knowledge is used when it enters the public sphere has important implications for the study and practice of international relations.

Data Replication

Data replication sets are available in Harvard Dataverse at: https://doi.org/10.7910/DVN/DFY9AK.

Acknowledgments

The authors would like to thank Eric Parajon and Emily Jackson for their constructive feedback on an early version of this paper and for helping us organize a conference at William & Mary’s Global Research Institute in November 2019. They would also like to thank the Carnegie Corporation of New York for supporting this research.

Footnotes

1 For an important early exception, see Weiss and Singer 1988.

2 We combine the response options, “very useful” and “somewhat useful,” and report these combined responses in figure 2 as “useful.” Similarly, we combine “not very useful” and “not useful at all,” and we report those responses as “not useful.”

3 The survey was conducted before Twitter was rebranded as “X.”

References

Albæk, Erik. 2011. "The Interaction between Experts and Journalists in News Journalism." Journalism 12(3): 335–48.
Albæk, Erik, Christiansen, Peter Munk, and Togeby, Lise. 2003. "Experts in the Mass Media: Researchers as Sources in Danish Daily Newspapers, 1961–2001." Journalism & Mass Communication Quarterly 80(4): 937–48.
Albæk, Erik, Elmelund-Præstekær, Christian, Hopmann, David Nicolas, and Klemmensen, Robert. 2011. "Experts in Election News Coverage: Process or Substance?" Nordicom Review 32(1): 45–58.
Antilla, Liisa. 2005. "Climate of Skepticism: US Newspaper Coverage of the Science of Climate Change." Global Environmental Change 15(4): 338–52.
Avey, Paul C., and Desch, Michael C. 2014. "What Do Policymakers Want from Us? Results of a Survey of Current and Former Senior National Security Decision Makers." International Studies Quarterly 58(2): 227–46.
Avey, Paul C., Desch, Michael C., Parajon, Eric, Peterson, Susan, Powers, Ryan, and Tierney, Michael J. 2021a. "Does Social Science Inform Foreign Policy? Evidence from a Survey of US National Security, Trade, and Development Officials." International Studies Quarterly 66(1). https://doi.org/10.1093/isq/sqab057
Avey, Paul C., Desch, Michael, Petrova, Ana, and Wilson, Steven. 2021b. "Narrowing the Academic-Policy Divide: Will New Media Bridge the Gap?" Political Science Quarterly 136(4): 607–39.
Barnhurst, Kevin G. 2003. "Queer Political News: Election-Year Coverage of the Lesbian and Gay Communities on National Public Radio, 1992–2000." Journalism 4(1): 5–28.
Barnhurst, Kevin G., and Mutz, Diana. 1997. "American Journalism and the Decline in Event-Centered Reporting." Journal of Communication 47(4): 27–53.
Bauer, Martin W., and Gregory, Jane. 2007. "From Journalism to Corporate Communication in Post-War Britain." In Journalism, Science and Society: Science Communication between News and Public Relations, ed. Bauer, M.W., and Bucchi, M., 33–52. Oxford, UK: Routledge.
Baum, Matthew, and Potter, Philip. 2008. "The Relationship between Mass Media, Public Opinion, and Foreign Policy: Toward a Theoretical Synthesis." Annual Review of Political Science 11: 39–65.
Bennett, W.L. 2007. News: The Politics of Illusion. 7th ed. New York: Longman.
Bolsen, Toby, and Druckman, James N. 2015. "Counteracting the Politicization of Science." Journal of Communication 65: 745–69.
Boyce, Tammy. 2006. "Journalism and Expertise." Journalism Studies 7(6): 889–906.
Boykoff, Maxwell T., and Boykoff, Jules M. 2004. "Balance as Bias: Global Warming and the US Prestige Press." Global Environmental Change 14(2): 125–36.
Boykoff, Maxwell T., and Roberts, J. Timmons. 2007. "Media Coverage of Climate Change: Current Trends, Strengths, Weaknesses." United Nations Development Programme, Human Development Report 2007/2008, Background Paper.
Carlson, Matt. 2020. "Journalistic Epistemology and Digital News Circulation: Infrastructure, Circulation Practices, and Epistemic Contests." New Media & Society 22(2): 230–46.
Carnegie Corporation of New York. 2023. "Scholarship & Policy." Retrieved July 27 (https://www.carnegie.org/our-work/category/peace-security/tag/scholarship-policy/).
Chan, Sophia. 1998. "Media Use of Expert Sources and Its Effects on Public Opinion." PhD dissertation, University of Wisconsin-Madison.
Conrad, Peter. 1999. "Use of Expertise: Sources, Quotes, and Voice in the Reporting of Genetics in the News." Public Understanding of Science 8: 285–302.
Corbett, Julia B., and Durfee, Jessica L. 2004. "Testing Public (Un)Certainty of Science: Media Representations of Global Warming." Science Communication 26(2): 129–51.
Desch, Michael C. 2019. Cult of the Irrelevant. Princeton, NJ: Princeton University Press.
Dion, Michelle L., Sumner, Jane Lawrence, and Mitchell, Sara McLaughlin. 2018. "Gendered Citation Patterns across Political Science and Social Science Methodology Fields." Political Analysis 26(3): 312–27.
Dixon, Graham N., and Clarke, Christopher E. 2013. "Heightening Uncertainty around Certain Science: Media Coverage, False Balance, and the Autism–Vaccine Controversy." Science Communication 35(3): 358–82.
Drezner, Daniel W. 2017. The Ideas Industry. New York: Oxford University Press.
Durham, Meenakshi. 1998. "On the Relevance of Standpoint Epistemology to the Practice of Journalism: The Case for 'Strong Objectivity'." Communication Theory 8: 117–40.
Entman, Robert M. 1990. Democracy without Citizens: Media and the Decay of American Politics. New York: Oxford University Press.
Entman, Robert M. 2007. "Framing Bias: Media in the Distribution of Power." Journal of Communication 57(1): 163–73.
Entringer García Blanes, Irene, Gillooly, Shauna N., Peterson, Susan, Powers, Ryan, and Tierney, Michael J. 2025. "Replication Data for: International Relations Scholars, the Media, and the Dilemma of Consensus." Harvard Dataverse. https://doi.org/10.7910/DVN/DFY9AK
Fink, Katherine, and Anderson, C.W. 2015. "Data Journalism in the United States: Beyond the 'Usual Suspects.'" Journalism Studies 16(4): 467–81.
Fink, Katherine, and Schudson, Michael. 2014. "The Rise of Contextual Journalism, 1950s–2000s." Journalism 15(1): 3–20.
Fleerackers, Alice. 2023. "Why and How Journalists Report on Research: A Review." Medium, December 20. Retrieved July 2, 2024 (https://medium.com/@alicefleerackers/why-and-how-journalists-report-on-research-a-review-51c28facb13f).
Forman-Katz, Naomi, and Jurkowitz, Mark. 2022. "U.S. Journalists Differ from the Public in Their Views of 'Bothsidesism' in Journalism." Pew Research Center, July 13. Retrieved June 30, 2024 (https://www.pewresearch.org/short-reads/2022/07/13/u-s-journalists-differ-from-the-public-in-their-views-of-bothsidesism-in-journalism/).
Fourcade, Marion, Ollion, Etienne, and Algan, Yann. 2015. "The Superiority of Economists." Journal of Economic Perspectives 29(1): 89–114.
Gallucci, Robert L. 2012. "How Scholars Can Improve International Relations." Chronicle of Higher Education, November 26.
Gesualdo, Nicole, Weber, Matthew S., and Yanovitsky, Itzhak. 2020. "Journalists as Knowledge Brokers." Journalism Studies 21(1): 127–43.
Gillooly, Shauna N., Hardt, Heidi, and Smith, Amy Erica. 2021. "Having Female Role Models Correlates with PhD Students' Attitudes toward Their Own Academic Success." PLOS ONE 16(8): e0255095.
Gray, Jonathan, Bounegru, Liliana, and Chambers, Lucy. 2012. The Data Journalism Handbook: How Journalists Can Use Data to Improve the News. Sebastopol, CA: O'Reilly Media.
Guisinger, Alexandra, and Saunders, Elizabeth N. 2017. "Mapping the Boundaries of Elite Cues: How Elites Shape Mass Opinion across International Issues." International Studies Quarterly 61(2): 425–41.
Hardt, Heidi, Kim, Hannah June, Smith, Amy Erica, and Meister, Philippe. 2019. "The Gender Readings Gap in Political Science Graduate Training." Journal of Politics 81(4): 1528–32.
Hess, Stephen. 1981. "The Washington Reporters and Their World." The Brookings Bulletin 17(3): 15–18.
Hoagland, Jack, Oakes, Amy, Parajon, Eric, and Peterson, Susan. 2020. "The Blind Men and the Elephant: Comparing the Study of International Security across Journals." Security Studies 29(3): 393–433.
Holsti, O.R. 2004. Public Opinion and American Foreign Policy. Ann Arbor, MI: University of Michigan Press.
Hopmann, David Nicholas, and Strömbäck, Jesper. 2010. "The Rise of the Media Punditocracy? Journalists and Media Pundits in Danish Election News 1994–2007." Media, Culture & Society 32(6): 943–60.
Iyengar, Shanto. 1991. Is Anyone Responsible? How Television Frames Political Issues. Chicago: University of Chicago Press.
Jentleson, B. 2002. "The Need for Praxis: Bringing Policy Relevance Back In." International Security 26(4): 169–83.
Johnston, Christopher D., and Ballard, Andrew O. 2016. "Economists and Public Opinion: Expert Consensus and Economic Policy Judgments." Journal of Politics 78(2): 443–56.
Jonker, Hans, and Vanlee, Florian. 2023. "A First Snapshot of Academics' Media Mentions and Policy Citations in Flanders, Belgium." Proceedings of the 27th International Conference on Science, Technology and Innovation Indicators, Leiden, September 27–29.
Kerr, John R., and van der Linden, Sander. 2021. "Communicating Expert Consensus Increases Personal Support for COVID-19 Mitigation Policies." Journal of Applied Social Psychology 52: 15–29.
Koehler, Derek J. 2016. "Can Journalistic 'False Balance' Distort Public Perception of Consensus in Expert Opinion?" Journal of Experimental Psychology: Applied 22(1): 24.
Lewandowsky, Stephan, Ecker, Ullrich K.H., Seifert, Colleen M., Schwarz, Norbert, and Cook, John. 2012. "Misinformation and Its Correction: Continued Influence and Successful Debiasing." Psychological Science in the Public Interest 13(3): 106–31.
Long, James D., Maliniak, Daniel, Peterson, Susan M., and Tierney, Michael J. 2015. "Knowledge without Power: International Relations Scholars and the US War in Iraq." International Politics 52(1): 20–44.
Lynch, Marc. 2016. "Political Science in Real Time: Engaging the Middle East Policy Public." Perspectives on Politics 14(1): 121–31.
Maliniak, Daniel, Oakes, Amy, Peterson, Susan, and Tierney, Michael J. 2011. "International Relations in the US Academy." International Studies Quarterly 55(2): 437–64.
Maliniak, Daniel, Parajon, Eric, Peterson, Susan, and Powers, Ryan. 2024. "Knowledge Experts, Political Leaders, and Public Support for International Cooperation." Unpublished manuscript, University of North Carolina (https://ericparajon.com/files/papers/Experts_Public_Support_International_Cooperation.pdf).
Maliniak, Daniel, Parajon, Eric, and Powers, Ryan. 2020. "Epistemic Communities and Public Support for the Paris Agreement on Climate Change." Political Research Quarterly 74(4): 866–81.
Maliniak, Daniel, Peterson, Susan, Powers, Ryan, and Tierney, Michael J. 2018. "Is International Relations a Global Discipline? Hegemony, Insularity, and Diversity in the Field." Security Studies 27(3): 448–84.
Maliniak, Daniel, Peterson, Susan, Powers, Ryan, and Tierney, Michael J. 2020. "Explaining the Theory-Practice Divide in International Relations." In Bridging the Theory-Practice Divide in International Relations, ed. Maliniak, D., Peterson, S., Powers, R., and Tierney, M.J., 1–16. Washington, DC: Georgetown University Press.
Maliniak, Daniel, Powers, Ryan, and Walter, Barbara F. 2013. "The Gender Citation Gap in International Relations." International Organization 67(4): 889–922.
Malka, Ariel, Krosnick, Jon A., Debell, Matthew, Pasek, Josh, and Schneider, Daniel. 2009. "Featuring Skeptics in News Media Stories about Global Warming Reduces Public Beliefs in the Seriousness of Global Warming." Woods Institute for the Environment, Stanford University, Technical Paper (http://woods.stanford.edu/research/global-warming-skeptics.html).
McCombs, Maxwell E., and Shaw, Donald L. 1972. "The Agenda-Setting Function of the Mass Media." Public Opinion Quarterly 36(2): 176–87.
Merkley, Eric. 2020. "Are Experts (News) Worthy? Balance, Conflict, and Mass Media Coverage of Expert Consensus." Political Communication 37(4): 530–49.
Miller, D.W. 2001. "Storming the Palace in Political Science." Chronicle of Higher Education, September 21.
Muñoz-Torres, Juan Ramón. 2012. "Truth and Objectivity in Journalism: Anatomy of an Endless Misunderstanding." Journalism Studies 13(4): 566–82.
Musgrave, Paul. 2021. "Political Science Has Its Own Lab Leaks." Foreign Policy, July 3.
Nelkin, D. 1996. "An Uneasy Relationship: The Tensions between Medicine and the Media." Lancet 347(9015): 1600–3.
Nichols, Tom. 2017. The Death of Expertise: The Campaign against Established Knowledge and Why It Matters. London: Oxford University Press.
Niemi, M., and Pitkänen, V. 2016. "Gendered Use of Experts in the Media: Analysis of the Gender Gap in Finnish News Journalism." Public Understanding of Science 26(3): 355–68.
Nye, Joseph. 2009. "Scholars on the Sidelines." Washington Post, April 13.
Ordway, Denise-Marie. 2022. "1 in 4 Journalists Surveyed Rarely or Never Seek Out Peer-Reviewed Research to Learn about Beat Topics." The Journalist's Resource, February 9. Retrieved July 1, 2024 (https://journalistsresource.org/home/user-survey-journalists-research-habits/).
Page, Benjamin I., Shapiro, Robert Y., and Dempsey, Glenn R. 1987. "What Moves Public Opinion?" American Political Science Review 81(1): 23–44.
Patterson, Thomas E. 2013. Informing the News: The Need for Knowledge-Based Journalism. New York: Vintage.
Rich, Andrew. 2001. "The Politics of Expertise in Congress and the News Media." Social Science Quarterly 82(3): 583–601.
Rietdijk, Natascha, and Archer, Alfred. 2021. "Post-Truth, False Balance and Virtuous Gatekeeping." In Virtues, Democracy, and Online Media: Ethical and Epistemic Issues, ed. Snow, Nancy, and Vaccarezza, Maria Silvia. Abingdon-on-Thames, UK: Routledge.
Rublee, Maria Rost, Jackson, Emily, Parajon, Eric, Peterson, Susan, and Duncombe, Constance. 2019. "Do You Feel Welcome? Gendered Experiences in International Security Studies." Journal of Global Security Studies 5(1): 216–26.
Schudson, Michael. 2001. "The Objectivity Norm in American Journalism." Journalism 2(2): 149–70.
Smith, Amy Erica, Gillooly, Shauna N., and Hardt, Heidi. 2022. "Assessing Racial/Ethnic and Gender Gaps in Political Science PhD Students' Methodological Self-Efficacy." PS: Political Science & Politics 55(1): 165–70.
Tama, Jordan, Barma, Naaz, Goldgeier, James, and Jentleson, Bruce. 2023. "Bridging the Gap in a Changing World: New Opportunities and Challenges for Engaging Practitioners and the Public." International Studies Perspectives 24(3): 285–307.
Van der Linden, Sander. 2015. "The Social-Psychological Determinants of Climate Change Risk Perceptions: Towards a Comprehensive Model." Journal of Environmental Psychology 41: 112–24.
Weiss, Carol H., and Singer, Eleanor. 1988. Reporting of Social Science in the National Media. New York: Russell Sage Foundation.
Wien, C. 2014. "Commentators on Daily News or Communicators of Scholarly Achievements? The Role of Researchers in the Danish Media." Journalism 15(4): 427–45.
Wihbey, John. 2017. "Journalists' Use of Knowledge in an Online World: Examining Reporting Habits, Sourcing Practices and Institutional Norms." Journalism Practice 11(10). Retrieved July 2, 2024 (https://ssrn.com/abstract=3066829).
Zvobgo, Kelebogile, and Loken, Meredith. 2020. "Why Race Matters in International Relations." Foreign Policy, June 19.
Table 1. Demographic characteristics of respondents.

Figure 1. Frequency of engagement with IR knowledge, policymakers and journalists. Note: TRIP 2017 Policymaker and 2019 Journalist Surveys.

Figure 2. How useful are the arguments and evidence used in the following disciplines? Note: TRIP 2019 Journalist Survey.

Figure 3. Responses to "At which stage of the writing process are you likely to seek out academic research?" Note: TRIP 2019 Journalist Survey.

Figure 4. Responses to "How often do you seek out the results of academic research in each of the following situations?" Note: TRIP 2019 Journalist Survey.

Figure 5. For your colleagues, how significant are the following potential obstacles to using academic knowledge in their work? (Grouped by significant and not significant.) Note: TRIP 2019 Journalist Survey.

Figure 6. Responses to "How important are the following sources of information to the work you do as a journalist?" Note: TRIP 2019 Journalist Survey.

Figure 7. Effect of high consensus treatment split by scholar type. Note: TRIP 2019 Journalist Survey.

Figure 8. Note: TRIP 2019 Journalist Survey.

Supplementary material: Entringer García Blanes et al. Dataset.