
Not Getting the Message on Climate? Attention as a Key Barrier to Mass-Marketing Experimentally-Validated Messages

Published online by Cambridge University Press: 11 August 2025

Nicholas Carnes*
Sanford School of Public Policy, Duke University, Durham, NC, USA

Geoffrey L. Henderson
School for Environment and Sustainability, University of Michigan, Ann Arbor, MI, USA

*Corresponding author: Nicholas Carnes; Email: nicholas.carnes@duke.edu

Abstract

Scholars often use survey experiments to evaluate political messages’ persuasive effects, but messages developed in the lab do not always persuade in real-world campaigns. In this research note, we report three experiments on one central obstacle in lab-to-field messaging applications: getting people’s attention. We first analyze a large-scale direct mail campaign run by an established non-profit that promotes conservative solutions to climate change. In this experiment, postcards with messages based on extant survey-experimental research did not cause changes in key climate attitudes. In a follow-up survey experiment, identical postcards induced attitude change, but only when participants were required to pay attention to them. A final field experiment highlights the difficulty of inducing attention; in another real-world campaign, postcards with eye-catching scratch-off panels performed no better than standard postcards. These findings illustrate the crucial role of attention and the complexity of translating messages developed in survey experiments into effective real-world campaigns.

Information

Type
Letter
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2025. Published by Cambridge University Press

Why do messages that have large persuasive effects in lab and survey experiments often have smaller effects in field experiments and real-world messaging campaigns?

This question has serious implications for both research and practice. Scholars have shown that lab and survey experiments tend to yield larger effects than field experiments testing similar messages (Barabas and Jerit 2010; Jerit, Barabas and Clifford 2013; Coppock and Green 2015). Even so, campaigns and activist organizations routinely use lab-based studies to identify messages that will resonate with voters. In the 2024 U.S. presidential election, for instance, Kamala Harris’s campaign invested heavily in poll-tested messages (Schleifer and Goldmacher 2024), but surveys in the weeks leading up to the election suggested that many of those messages may not have had the desired effects on targeted voters (Bolton 2024).

In this letter, we argue that one explanation for these kinds of lab-to-field disconnects is the simple difference in attention across the different settings.[1] In a lab or survey experiment, participants are usually required or strongly encouraged to pay attention to the messages the researchers are testing; in the real world, getting people’s attention can be immensely challenging. People must receive a message before they can accept it and incorporate it into their own political views (Zaller 1992). But in a crowded and competitive information marketplace, even a message that seems to persuade sceptics in the lab may have virtually no impact if most target participants miss it, ignore it, or habitually tune out unsolicited political communications. This fundamental difference may help explain why lab- or survey-based experiments often report larger effects than field experiments or real-world messaging campaigns.

To study the role of attention in the lab-to-field disconnect, we focus on messaging campaigns around one of the gravest threats to humanity: climate change.[2] The literature on climate messaging has largely been optimistic; in survey- and lab-based experiments, researchers regularly find that people who are disengaged, doubtful, or dismissive tend to update their views when exposed to well-designed messages. When participants are told how the greenhouse effect works (Ranney and Clark 2016) or that 97 per cent of scientists agree that climate change is happening and human-caused (Van der Linden et al. 2015), they are more likely to report that they believe that climate change is happening, human-caused, or concerning. There are even indications that well-designed messages can promote support for government interventions to reduce greenhouse gas pollution or promote clean energy (Myers et al. 2012; Stokes and Warshaw 2017; Hart and Feldman 2021). While some survey experiments have produced ambiguous or null results (McCright et al. 2016; for a review, see Hart and Feldman 2021), the climate communication literature largely argues that the right messages can change people’s minds about climate change.

However, there is almost no research on whether climate messages developed ‘in the lab’ (in survey- or lab-based experiments with captive audiences) have the same persuasive effects in real-world climate messaging campaigns. We know of just one published study that has reported a successful climate messaging field experiment: Goldberg et al. (2021) find that a month-long online advertisement campaign delivered to 1,600 moderate and conservative respondents through Facebook, YouTube, and web-based ads caused five- to ten-percentage-point increases in the shares of respondents who gave expected responses to survey items about climate change. However, even this well-done study had natural limitations: as the authors note, the study was confined to two congressional districts, the randomization and data collection occurred at the level of zip codes (not individuals), and the study could only measure short-term impacts.

In this letter, we investigate whether lab-tested climate messages work in the field, then return to a survey-experimental format, where we can control attention, to determine whether attention mediates message effects. As a final extension, we return to the field to assess whether attention gaps can be closed with standard real-world techniques for attracting attention. As Goldberg et al. (2021, 573) note, ‘[survey] experiments are conducted within a controlled laboratory setting, where the respondents are asked to devote their full attention and are aware of the artificial nature of the situation. The real world is messier.’ In that mess, do messages developed in the lab actually get people’s attention?

Study 1: From the Lab to the Field

We first conducted a large-scale field experiment to estimate how a conservative organization’s messages affected right-leaning Americans’ support for climate action. This organization drew on survey-experimental findings in the climate communication literature to develop a large-scale direct mail campaign tailored to a conservative audience. We know of no prior published research that has used individual-level randomization or a within-subjects design to measure the real-world impacts of a large-scale climate messaging campaign.

For this study, we partnered with republicEn, a non-profit organization that promotes conservative solutions to climate change. The organization was ideal because it is a conservative group headed by a former Republican member of Congress, Representative Bob Inglis from South Carolina. Moreover, Inglis’s own messaging strategy has previously performed well in survey experiments. The successful Goldberg et al. (2021) field experiment used messages developed by Inglis, and in another survey experiment analyzing recorded messages from a dozen climate activists, Inglis’s recording had the largest impact (Commercon et al. 2021).

In late spring 2022, republicEn worked with experts in political marketing to develop four postcards rooted in climate messaging strategies supported by past survey-experimental research:

(a) Shared conservative identity emphasized how shared conservative values support action on climate change;

(b) Adversarial conservative identity emphasized shared conservative values and a concern that the political left would ‘get it wrong’ on climate change;

(c) Scientific consensus emphasized that 97 per cent of scientists agree that climate change is happening and human-caused; and

(d) Health effects emphasized the negative health effects associated with climate change.

Each of the four main postcards used different images, fonts, and text. To separate the effects of formatting from message effects, republicEn also developed three variants of the shared conservative identity, scientific consensus, and health effects postcards that mirrored the fonts, images, and colours from the adversarial conservative identity postcard. To separate message effects from messenger effects, they also developed two variants of the scientific consensus and health effects postcards that removed any reference to shared conservative values and changed the institutional affiliation of the sender from republicEn to the George Mason Energy and Enterprise Initiative. (For postcard images, see Appendix A.)

We then sent letters[3] to 62,885 Republican voters in a conservative Southern US congressional district inviting recipients to participate in an ostensibly unrelated online survey; 4,222 people did so. A month after our survey invitation, republicEn sent postcards to the people who had opted to participate in our survey, following the Broockman, Kalla and Sekhon (2017) framework for individual-level field experiments with survey outcomes. Each participant was randomly assigned to one of the nine postcard designs (with the same card sent three times within one week, per the advice of republicEn’s advertising firm) or to a control group that received no postcard. A few days after the last postcard, we sent an ostensibly unrelated second survey to measure how recipients’ attitudes had changed; 2,353 completed the second-wave survey (56 per cent, an acceptable response rate by the standards of similar studies; Broockman, Kalla and Sekhon 2017). (For additional technical details about each of our experiments, see Appendix C.) Using data from republicEn, we also studied the rates at which respondents engaged in the actions republicEn hoped to elicit: following the QR code link on the postcard to watch climate-related videos and signing up for the organization’s email list.
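To make the design concrete, the following is a minimal sketch of the individual-level assignment procedure described above; the arm labels, the equal-probability allocation, and the code itself are illustrative assumptions, not the production scripts used in the study.

```python
# Minimal sketch of the Study 1 individual-level random assignment.
# Illustrative only: arm labels, equal-probability allocation, and the
# data structure are assumptions, not the actual fielding code.
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=2022)  # fixed seed for reproducibility

ARMS = [
    "control",                      # no postcard
    "shared_identity", "adversarial_identity", "consensus", "health",
    "shared_identity_reformat", "consensus_reformat", "health_reformat",
    "consensus_gmu", "health_gmu",  # messenger variants
]  # nine postcard designs plus a no-postcard control group

participants = pd.DataFrame({"id": range(4222)})  # first-wave completers
participants["arm"] = rng.choice(ARMS, size=len(participants))
print(participants["arm"].value_counts())  # roughly 420 per arm
```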

We expected voters who received these postcards to be more likely to report four of the beliefs that have been most often studied in the literature on climate communication, namely, the views that climate change is (1) happening, (2) human-caused, (3) concerning, and (4) the most important problem facing the country. In Study 1, we also tested the hypotheses that recipients would be more likely to (5) think that climate change was concerning to other Republicans (for question wording, see Appendix C) and (6) engage with republicEn (by signing up for the organization’s email list).[4]

However, none of the hypothesized effects were clearly supported by the results of the study. We first analyzed data from republicEn on engagement. Each postcard invited recipients to help republicEn by using a QR code, which directed users to a website where they could rate three videos on climate change. The partner organization supplied us with data on QR code use and on whether research subjects took the low-stakes but meaningful step of signing up to receive emails from republicEn (thereby becoming members of the organization). Of the 4,222 people who completed the first-wave survey, 3,797 received one of the nine versions of the postcard (425 were in the control group). However, in the two weeks after the first postcard arrived, the QR codes on the postcards were used to access the republicEn landing page just ninety-six times (at most, 2.5 per cent of recipients, assuming that no recipient visited the site more than once), and only 14 of the 3,797 postcard recipients signed up for the republicEn listserv (0.4 per cent), a difference relative to the control group that was not statistically significant even at p < 0.20. That is, registrations for republicEn’s email list were so few that, in a statistical sense, we could not reject the hypothesis that the postcards failed to cause anyone to sign up (although, obviously, fourteen participants did in fact sign up).
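To see why fourteen sign-ups cannot be statistically distinguished from zero, consider a simple two-by-two test. The sketch below assumes zero control-group sign-ups; the text above reports only the treatment-group figure, so that control count is our assumption.

```python
# Rough illustration of the sign-up comparison. Assumed counts: 14 of
# 3,797 postcard recipients signed up; the 0-of-425 control count is a
# hypothetical placeholder, since only the treated figure is reported.
from scipy.stats import fisher_exact

table = [[14, 3797 - 14],  # treatment: signed up, did not
         [0, 425]]         # control (assumed): signed up, did not
odds_ratio, p_value = fisher_exact(table, alternative="greater")
print(f"one-sided p = {p_value:.2f}")  # roughly 0.2-0.3, well above 0.05
```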

Of course, postcards that fail to move people to action might still nudge them to update their views. However, comparing first- and second-wave surveys did not suggest that meaningful attitude change occurred. Figure 1 graphs the average effect of each postcard (relative to the control group) on each of the six survey items (Happening is shaded black, Human-caused is shaded dark grey, and so on). To facilitate comparison, we rescaled each survey item to range between 0 and 1, so the effects in the figure represent the percentage of the maximum effect a postcard could have had. All of the estimates were calculated so that higher values were in line with our expectations; for example, higher values in the graph indicate that the average participant was more likely to believe that climate change is happening, human-caused, or concerning after receiving the postcard. (Complete regression model results are listed in Appendix Table A1.)
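The estimation behind Figure 1 is standard; as a minimal sketch (with hypothetical file and column names), each outcome can be rescaled to the unit interval and regressed on indicators for the assigned condition:

```python
# Sketch of the estimation behind Figure 1 (file and column names are
# hypothetical). Each outcome is rescaled to [0, 1], so coefficients can
# be read as shares of the maximum possible effect of a postcard.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("wave2_merged.csv")  # hypothetical merged survey file

# Rescale a Likert-type 'climate change is happening' item to [0, 1].
lo, hi = df["happening_w2"].min(), df["happening_w2"].max()
df["happening_01"] = (df["happening_w2"] - lo) / (hi - lo)

# 'arm' is the assigned condition; with the control group as the baseline,
# each postcard coefficient estimates that card's average effect.
model = smf.ols("happening_01 ~ C(arm, Treatment('control'))", data=df).fit()
print(model.summary())
```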

Figure 1. Postcards Had Little Consistent Effect on Recipient Attitudes.

Notes: Each outcome variable is rescaled to range between zero and one. N = 2,353. Regression models are listed in Appendix Table A1.

For the vast majority of postcards and survey measures, there were no effects: the estimated associations were small (usually around 2 percentage points on the average participant’s score on that measure) and not statistically significant. There were just five combinations of survey measures and postcard types that generated statistically significant differences (we would expect three by chance alone), and no clear patterns were evident that would indicate that one postcard performed better than the others. The estimated differences associated with all nine postcards (the bottom group of bars in the graph) were around 2 percentage points on most survey items and were not statistically significant. In other words, the average postcard did not have an effect, and there was no ‘standout’ postcard that consistently performed well. Most estimates were positive; on most of our six survey measures, the average participant was slightly (but not significantly) more likely to give the expected response. Overall, however, a professionally designed postcard campaign rooted in extant survey-experimental findings and fielded by a conservative organization with a track record of success in past survey experiments ultimately had little short-term impact in this real-world direct mail campaign.[5]
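The ‘three by chance alone’ figure is straightforward arithmetic: nine postcards crossed with six outcomes yield fifty-four tests, and at the 0.05 level roughly 54 × 0.05 ≈ 2.7 significant results are expected even if every true effect is zero. A quick check, treating the tests as independent (an approximation):

```python
# Expected false positives across the Study 1 postcard-by-outcome tests.
# Approximation only: the 54 tests are not strictly independent.
from scipy.stats import binom

n_tests, alpha = 9 * 6, 0.05                  # nine postcards x six outcomes
expected = n_tests * alpha                    # about 2.7 by chance alone
p_five_or_more = binom.sf(4, n_tests, alpha)  # P(at least 5 'hits')
print(f"expected = {expected:.1f}, P(>=5) = {p_five_or_more:.2f}")  # ~0.13
```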

Study 2: Back to the Lab Again

Could attention be the problem with the postcards in Study 1? To find out, we conducted a survey experiment to determine whether our non-findings in the first study arose because the messages themselves were unpersuasive or because recipients simply did not pay attention to the postcards. To facilitate comparisons, we used the four main postcards from Study 1 and made only minor revisions to our survey questions about climate attitudes. In effect, we asked whether we could use the same materials but experimentally manipulate attention to reproduce both the non-findings of our field experiment and the positive findings common in survey experiments.

Using the Cint online platform, we surveyed 1,982 self-identified Republicans across the USA in December 2022 and January 2023. At the outset, we randomly assigned each respondent to a control group, a low-attention group, a medium-attention group, or a high-attention group. The control group was not shown a republicEn postcard but still answered all the core survey questions, including those about their climate views. The low-attention group was shown one of the four main republicEn postcards (randomizing which card) and answered an additional question about whether they had received a similar political mailer in the past; this item simulated a real-world setting in which people are free to glance briefly at postcards and move on. The medium- and high-attention groups were shown one of the four postcards, required to remain on that screen for a full minute, and then asked two factual questions about the postcard’s topic (climate change) and messenger (republicEn.org). The high-attention group was also asked whether the postcard was persuasive and to write a brief open-ended evaluation of it.[6] Survey experiments might generate larger effects than field experiments because they require participants to receive the message in question: requiring receipt induces attention to the messages and/or causes inattentive people to drop out of the study at higher rates (either because the study asks them to do something they do not wish to do, namely pause and pay attention to the study materials, or because pre- or post-manipulation attention checks screen them out), leaving behind only the more attentive people who have received the message.
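In survey-flow terms, the manipulation reduces to a few condition parameters. The sketch below is a schematic of the four conditions just described, with assumed field names; it is not the actual survey programming.

```python
# Schematic of the Study 2 attention conditions (a simplification of the
# actual survey programming; field names and structure are assumptions).
import random

CONDITIONS = {
    # condition: (sees postcard, forced seconds on screen, follow-up items)
    "control": (False, 0, []),
    "low":     (True, 0, ["received_similar_mailer"]),
    "medium":  (True, 60, ["factual_topic", "factual_messenger"]),
    "high":    (True, 60, ["factual_topic", "factual_messenger",
                           "persuasive_rating", "open_ended_eval"]),
}
POSTCARDS = ["shared_identity", "adversarial_identity", "consensus", "health"]

def assign(respondent_id: int) -> dict:
    """Randomly assign a respondent to a condition and (if shown) a card."""
    condition = random.choice(list(CONDITIONS))
    shown, dwell, items = CONDITIONS[condition]
    return {
        "id": respondent_id,
        "condition": condition,
        "postcard": random.choice(POSTCARDS) if shown else None,
        "min_dwell_seconds": dwell,
        "follow_ups": items,
    }

print(assign(1))
```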

Figure 2 plots the effects of viewing a postcard; as in Figure 1, the outcome variables are scaled to range between zero and one. The topmost group of four bars compares the participants in the low-attention condition (who had the option to view the postcard only momentarily) to the control group (who did not view any postcards). The second group of bars compares the control group to respondents in the medium-attention group (who were required to view the postcard for one minute and answer two factual questions about it), and the bottom group of bars compares the control group to the high-attention group (who were required to view the postcard for a full minute, answer two factual questions, and then answer two evaluation questions).

Figure 2. Attention Was a Key Mediator in Postcard Effects.

Notes: Each outcome variable is rescaled to range between zero and one. Control group n = 441. Regression model results are listed in Appendix Table A2.

In line with the results of Study 1, respondents in the low-attention group were just 3 to 5 percentage points more likely to report that they believed that climate change was happening, human-caused, and concerning, differences that were not statistically significant. That is, respondents who were free to ignore the message did not exhibit measurable short-term changes in their climate attitudes. However, when respondents were forced to view the postcards for just one minute, they exhibited statistically significant differences in their climate attitudes, more in line with past survey-experimental research. Respondents in the medium-attention condition, who were required to view the postcards for one minute, were 7 to 14 percentage points more likely to say that climate change is happening, human-caused, and concerning (relative to people in the control group who did not see postcards), and the differences were statistically significant. Respondents in the high-attention group were 5 to 7 percentage points more likely to report that climate change was happening and concerning, and the differences were statistically significant (except on the item about whether climate change is human-caused).

Our findings were similar in robustness tests that added controls, estimated two-stage models using attention check questions, separately compared participants who were and were not primary election voters, and analyzed the four postcard designs separately (see Appendix C). We found little evidence of differential attrition in this context (see footnote 6); we believe the differences noted here reflect differences in induced attention. When we allowed people to ignore the postcards (in a setting that mimicked the real world of large-scale messaging campaigns), our findings mirrored the null results of the field experiment in Study 1. When we compelled respondents to pay attention to the postcards (in a setting that mimicked recent survey experiments), our findings mirrored those experiments’ encouraging results. Attention, then, would seem to be one key to understanding why climate messages that work in the lab do not necessarily generate large effects in the field.
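For the two-stage robustness checks mentioned above, random assignment can serve as an instrument for measured attention. The following is a minimal sketch of that logic, using manual two-stage least squares with hypothetical column names; note that the second-stage standard errors below are not corrected, so a dedicated instrumental-variables routine should be used for actual inference.

```python
# Sketch of the two-stage logic: random assignment to a forced-attention
# condition instruments for actually attending to the postcard.
# Column names are hypothetical; second-stage SEs here are NOT corrected.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("study2.csv")  # hypothetical replication file
# 'forced' = assigned to a medium/high-attention condition;
# 'attended' = passed the postcard attention checks;
# 'happening_01' = rescaled climate-attitude outcome.

first = smf.ols("attended ~ forced", data=df).fit()
df["attended_hat"] = first.fittedvalues

second = smf.ols("happening_01 ~ attended_hat", data=df).fit()
print(second.params["attended_hat"])  # effect of induced attention
```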

Study 3: Can Direct Mail Induce Attention?

If attention is a key obstacle to translating messages developed in the lab into a direct mail campaign, can activists simply generate more attention in the field? To test one way of doing so, we worked with republicEn to conduct a third experiment. In December 2023, republicEn selected a congressional district for a direct mail campaign, and in January 2024 we sent emails (rather than letters, as in Study 1) inviting 50,000 registered Republicans in that district to participate in an ostensibly unrelated survey; 538 completed the initial survey (a far lower response rate than in Study 1, likely owing to our switch from mailed invitations to emailed ones).

In February 2024, republicEn sent each of our 538 survey participants one of two postcards (there was no control group): the Scientific Consensus postcard from Study 1 or a modified version designed to be more attention-grabbing (for an image, see Appendix C). Instead of opening with ‘More than 97 per cent of climate scientists have concluded …,’ the alternative postcard began ‘What % of Climate Scientists agree, based on evidence, that humans are causing climate changes?’ and then featured three brightly-coloured scratch-off panels with the numbers 38 per cent, 63 per cent, and 97 per cent printed on them. If participants scratched off either of the lower numbers, they revealed text that read ‘Higher!’ A few days after the last postcard arrived, we sent a second wave of ostensibly unrelated surveys, offering a $5 Amazon gift card to those who completed them; 308 participants did so (a 57 per cent re-interview rate, similar to Study 1).[7]

However, we found no differences in effects between the original and more attention-grabbing postcards. Relative to participants who received the standard postcard, those who received the scratch-off version were about half a percentage point less likely to say that climate change was happening, 6 percentage points more likely to say that it was human-caused, 1 percentage point higher on the Six-Americas concern scale, and half a percentage point less likely to say that they were concerned about climate change on the Pew scale (the first four measures in Figure 1). None of the differences were statistically significant. No respondent in the post-survey indicated that climate change was one of the most important problems facing the country (the fifth measure in Figure 1). (Regression results are listed in Table A7.) Attention seems to be a key obstacle to translating lab-based climate messages into large-scale campaigns, and overcoming that obstacle (at least in the context of a direct mail campaign) is no small task.

Attention, Climate Messaging, and the Lab-to-Field Disconnect

Our findings suggest that the climate messages tested in survey- and lab-based experiments may be difficult for advocates to use effectively in real-world settings. Moreover, our findings highlight one reason why: attention. As Study 1 shows, lab-tested messages do not translate readily into a standard direct mail campaign. However, as Study 2 shows, attention seems to be a key driver: people who are forced to pay attention exhibit large effects, while those who are free to quickly move past the messages (like people in the real world) do not. Unfortunately, getting people’s attention is challenging in practice, as Study 3 shows. If the first step in political attitude formation and change is that people must receive a message (Zaller 1992), that is, must actually notice it, then attention should be front-and-centre in academic and practitioner conversations about lab- and survey-based messaging interventions.

Lab and survey experiments can identify effective messages, but if activists and practitioners wish to put those messages to work, they must overcome the messy realities of real-world political persuasion, including the simple fact that it is hard to get people to pay attention to messages they don’t want to receive (a challenge well known to advertisers, politicians, and scholars of persuasion). In that sense, this study joins a growing body of academic research finding that large-scale political messaging campaigns usually do not work (Cardy 2005; Bailey, Hopkins and Rogers 2016; for a meta-analysis of forty-nine field experiments, see Kalla and Broockman 2018) and that political persuasion is generally challenging (for example, Cohen, Aaronson and Steele 2000; Iyengar and Hahn 2009; Sigelman and Sigelman 1984; Zaller 1992; McCright et al. 2016).

Of course, caveats abound. We focus here on direct mail, but many other media are suitable for large-scale political marketing (although canvassing and phone banks seem to perform little better; see Kalla and Broockman 2018). More intense multi-media campaigns might be more effective (Goldberg et al. 2021 used a multi-platform online messaging strategy over the course of a month); so might well-designed door-to-door campaigns (for example, Broockman and Kalla 2016). Moreover, researchers and activists may be able to devise more attention-grabbing direct mail materials than we have studied here, though this possibility seems unlikely to us, given that our partner organization tried ten different postcard designs in all. Our point is not that direct-mail or other large-scale messaging campaigns cannot be effective; it is that they are not nearly as effective as a naïve reading of the optimistic results of recent survey experiments on climate communication might suggest. Large-scale messaging campaigns about climate, or any other topic, need to be sensitive to the reality that attention is easy to generate in a lab or survey and hard to generate in the real world.

We would also note that this study uses three different experiments with slightly different approaches and samples. This is partly by design; to understand differences between lab- and survey-based experiments, Studies 1 and 3 use the within-subjects approach often used in field experiments, and Study 2 uses the between-subjects design that is more common in lab- and survey-based studies. By necessity, moreover, the populations we sampled from could not be identical in each study: Study 3 could not be carried out using the same population as Study 1 (because repeating the experiment in the same state would undoubtedly alter the findings), and Study 2 was conducted using an online survey panel. We have done our best in all three cases to identify relevant participants (Republicans), but of course these differences in samples and experimental designs create some risks relative to studies that can be carried out using essentially identical participant populations and experimental frameworks.

A final word of caution: this study has focused on the role of attention, not on other obstacles in lab-to-field messaging, including the problem of durability. Direct mail and other mass-media campaigns fight an uphill battle to win the attention of targeted recipients; they fight yet another to durably change opinions (Broockman and Kalla 2016). Our experiments suggest that a well-designed postcard campaign did not move attitudes in the short term; moving them in the long term would surely pose further difficulties.

Of course, researchers and practitioners who care about climate change (or any issue, party, or candidate) should continue developing persuasive messages in survey- and lab-based experiments. But there is still a great deal of work to be done to ensure that real-world messaging campaigns actually get people’s attention, and ultimately persuade them.

Supplementary material

To view supplementary material for this article, please visit https://doi.org/10.1017/S000712342510063X

Data availability statement

Replication data for this article can be found in Harvard Dataverse at: https://doi.org/10.7910/DVN/ZCPZGF.

Acknowledgements

The authors are grateful for advice and feedback from Ed Maibach, John Kotcher, Megan Mullin, David Broockman, and Bob Inglis.

Financial support

This material is based on work supported by the National Science Foundation under Grant No. 2139557. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Competing interests

None

Ethical standards

These studies were approved by the Duke University Campus IRB.

Footnotes

[1] Another is durability: even when practitioners get audiences’ attention, they may struggle to influence people more than fleetingly given the barrage of persuasive communications people encounter on a daily basis (see, for instance, Jerit, Barabas and Clifford 2013).

[2] Without dramatic reductions in greenhouse gas pollution, the Earth is poised to enter a period of profound environmental catastrophe characterized by intensifying hurricanes, droughts, floods, wildfires, famine, ecological collapse, and social and political destabilization (IPCC 2018, 2021).

[3] Study 1 was designed and carried out by the first author, Study 2 was designed and carried out by both, and Study 3 was designed by both but carried out by the first author. For simplicity, we use the first-person plural when discussing all three.

[4] The study was pre-registered through Open Science Framework at https://osf.io/962kr/.

[5] The results were similar when we examined several subgroups separately, including people who identified more or less strongly as Republicans or as more or less conservative, who had more or less formal education, who consumed more or less political media, who fell into different categories of the ‘Six Americas’ framework in the first wave, and who were younger or older.

[6] The study was pre-registered through Open Science Framework at https://osf.io/qvchw/. We conducted one set of non-pre-registered auxiliary tests to check for differential attrition of ‘skimmers’, people who try to move quickly through surveys and who might drop out at higher rates when required to pause for a full minute on a question. First, we examined the number of people who stopped completing the survey during the treatment questions (which required participants in the medium- and high-attention conditions to stop for one minute); of the 2,230 respondents who reached the postcard section of the survey, only five dropped out at that point (that is, did not answer the question immediately after the postcard section), three in the medium- or high-attention conditions and two in the low-attention or control conditions. Second, we found that 90.6 per cent of people in the medium- and high-attention groups correctly answered the post-intervention attention check, almost the same as the 89.0 per cent among participants in the low-attention and control groups. Third, we found that the average participant in the medium- and high-attention conditions completed the survey in approximately 10 minutes and 18 seconds, about 2 minutes longer than participants in the low-attention and control groups, who took about 8 minutes and 4 seconds on average; this is about what we would expect, given that the medium- and high-attention conditions required participants to spend a full minute viewing the postcard and then to answer a few additional factual and opinion questions. (These results exclude the small number of participants who took more than 30 minutes to complete the survey, whom we suspect may have stopped partway through and then resumed work later.) Together, these findings suggest that our results do not reflect differential attrition (inattentive participants dropping out at higher rates in the medium- and high-attention conditions) but rather differences in induced behaviour (attention to the postcards).

[7] The study was pre-registered through Open Science Framework at https://osf.io/gq43f/.

References

Bailey, MA, Hopkins, DJ and Rogers, T (2016) Unresponsive and unpersuaded: The unintended consequences of a voter persuasion effort. Political Behavior 38, 713–746. doi:10.1007/s11109-016-9338-8
Barabas, J and Jerit, J (2010) Are survey experiments externally valid? American Political Science Review 104, 226–242. doi:10.1017/S0003055410000092
Bolton, A (2024) Democrats alarmed Harris’s economic message isn’t breaking through. Yahoo! News, October 29, 2024.
Broockman, D and Kalla, J (2016) Durably reducing transphobia: A field experiment on door-to-door canvassing. Science 352, 220–224. doi:10.1126/science.aad9713
Broockman, D, Kalla, J and Sekhon, JS (2017) The design of field experiments with survey outcomes: A framework for selecting more efficient, robust, and ethical designs. Political Analysis 25, 435–464. doi:10.1017/pan.2017.27
Cardy, EA (2005) An experimental field study of the GOTV and persuasion effects of partisan direct mail and phone calls. The Annals of the American Academy of Political and Social Science 601, 28–40. doi:10.1177/0002716205278051
Carnes, N (2025) Replication data for ‘Not getting the message on climate? Attention as a key barrier to mass-marketing experimentally-validated messages’. Harvard Dataverse, V1. doi:10.7910/DVN/ZCPZGF
Cohen, GL, Aaronson, J and Steele, CM (2000) When beliefs yield to evidence: Reducing biased evaluation by affirming the self. Personality and Social Psychology Bulletin 26, 1151–1164. doi:10.1177/01461672002611011
Commercon, F, Goldberg, M, Rosenthal, S and Leiserowitz, A (2021) Radio stories increase conservatives’ beliefs that Republicans are worried about climate change. New Haven, CT: Yale Program on Climate Change Communication. Available from https://climatecommunication.yale.edu/publications/radio-stories-increase-conservatives-
Coppock, A and Green, DP (2015) Assessing the correspondence between experimental results obtained in the lab and field: A review of recent social science research. Political Science Research and Methods 3, 113–131. doi:10.1017/psrm.2014.10
Goldberg, MH, Gustafson, A, Rosenthal, SA and Leiserowitz, A (2021) Shifting Republican views on climate change through targeted advertising. Nature Climate Change 11, 573–577. doi:10.1038/s41558-021-01070-1
Hart, PS and Feldman, L (2021) The benefit of focusing on air pollution instead of climate change: How discussing power plant emissions in the context of air pollution, rather than climate change, influences perceived benefits, costs, and political action for policies to limit emissions. Science Communication 43, 199–224. doi:10.1177/1075547020980443
Intergovernmental Panel on Climate Change (2018) Special report: Global warming of 1.5 °C: Summary for policymakers. Intergovernmental Panel on Climate Change. Available from https://www.ipcc.ch/sr15/chapter/spm/
Iyengar, S and Hahn, KS (2009) Red media, blue media: Evidence of ideological selectivity in media use. Journal of Communication 59, 19–39. doi:10.1111/j.1460-2466.2008.01402.x
Jerit, J, Barabas, J and Clifford, S (2013) Comparing contemporaneous laboratory and field experiments on media effects. Public Opinion Quarterly 77, 256–282. doi:10.1093/poq/nft005
Kalla, JL and Broockman, DE (2018) The minimal persuasive effects of campaign contact in general elections: Evidence from 49 field experiments. American Political Science Review 112, 148–166. doi:10.1017/S0003055417000363
Layzer, J (2012) Open for Business: Conservatives’ Opposition to Environmental Regulation. Cambridge: MIT Press. doi:10.7551/mitpress/8550.001.0001
McCright, AM, Charters, M, Dentzman, K and Dietz, T (2016) Examining the effectiveness of climate change frames in the face of a climate change denial counter-frame. Topics in Cognitive Science 8, 76–97. doi:10.1111/tops.12171
Myers, TA, Nisbet, MC, Maibach, EW and Leiserowitz, AW (2012) A public health frame arouses hopeful emotions about climate change. Climatic Change 113, 1105–1112. doi:10.1007/s10584-012-0513-6
Ranney, MY and Clark, D (2016) Climate change conceptual change: Scientific information can transform attitudes. Topics in Cognitive Science 8, 49–75. doi:10.1111/tops.12187
Schleifer, T and Goldmacher, S (2024) Inside the secretive $700 million ad-testing factory for Kamala Harris. The New York Times, October 17.
Sigelman, L and Sigelman, CK (1984) Judgments of the Carter-Reagan debate: The eyes of the beholders. Public Opinion Quarterly 48, 624–628. doi:10.1086/268863
Stokes, LC and Warshaw, C (2017) Renewable energy policy design and framing influence public support in the United States. Nature Energy 2, 1–6. doi:10.1038/nenergy.2017.107
Van der Linden, SL, Leiserowitz, AA, Feinberg, GD and Maibach, EW (2015) The scientific consensus on climate change as a gateway belief: Experimental evidence. PLoS ONE 10, e0118489. doi:10.1371/journal.pone.0118489
Zaller, JR (1992) The Nature and Origins of Mass Opinion. New York: Cambridge University Press. doi:10.1017/CBO9780511818691
