
AI and generational amnesia: An ecological approach to ‘new memory’ regimes

Published online by Cambridge University Press:  20 October 2025

Harlan Morehouse*
Affiliation:
Department of Geography and Geosciences, University of Vermont , Burlington, VT, USA

Abstract

This article argues that the environmental contexts of memory are vulnerable to Artificial Intelligence (AI)-generated distortions. By addressing the broader ecological implications for AI’s integration into society, this article looks beyond a sociotechnical dimension to explore the potential for AI to complicate environmental memory and its role in shaping human–environment relations. First, I address how the manipulation and falsification of memory risks undermining intergenerational transmission of environmental knowledge. Second, I examine how AI-generated blurring of boundaries between real and unreal can lead to collective inaction on environmental challenges. By identifying memory’s central role in addressing environmental crisis, this article places emerging debates on memory in the AI era in direct conversation with environmental discourse and scholarship.

Information

Type
Short Research Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2025. Published by Cambridge University Press

“The result of a consistent and total substitution of lies for factual truth is not that the lie will now be accepted as truth and truth be defamed as a lie, but that the sense by which we take our bearings in the real world—and the category of truth versus falsehood is among the mental means to this end—is being destroyed.” — Hannah Arendt, “Truth and Politics” (Reference Arendt1967, 78).

Introduction

On October 11, 2024, a group of United States House Democrats from hurricane-impacted states wrote a letter to seven social media and technology companies urging executives to do more to combat disinformation surrounding natural disasters. Their requests were as follows: increase the monitoring and rapid removal of misinformation and disinformation; enhance fact-checking; strengthen algorithms to flag and prevent the spread of harmful conspiracy theories; and implement stronger safeguards against scams (Ross et al. Reference Ross, Castor, Williams and Nickel2024). In the wake of hurricanes Helene and Milton, affected communities encountered a wave of disinformation that hindered relief efforts. This included claims from conspiracy theorist Alex Jones that Hurricane Helene was aimed at North Carolina to clear space for lithium mining, Artificial Intelligence (AI)-generated videos of storm surges overwhelming oceanfront properties, and a widely shared AI-generated photo of a traumatized child in a life jacket clutching a sodden puppy.

It did not require much to debunk this content as manipulative and conspiratorial. However, once it circulated on social media, it proved challenging to retract and correct the record – a testament to Mark Twain’s quip, ‘How easy it is to make people believe a lie, and how hard it is to undo that work again’ (Twain Reference Twain2013, 302). With the general public having been subjected to disinformation campaigns for years, many people were primed for conspiratorial narratives around catastrophic events. AI-generated images reinforced some of those conspiracies. During relief efforts, this culminated in a small group of armed people harassing US Federal Emergency Management Agency (FEMA) relief workers in rural Tennessee, forcing the agency to make operational changes to keep its relief workers safe.

This example is but one of many that confirms deep confusion around environmental dynamics in a rapidly evolving technological age characterized by disinformation and conspiracy. When seeking to explain pollution, wildfires, or climate change, evidence-based environmental knowledge must contend with a growing volume of information that has little to no empirical basis. This confusion is becoming more widespread with the integration of AI into social, political, and informational domains. AI’s ability to generate misleading texts and visual media is beginning to influence how society understands and experiences its relationship to environmental change.

This article takes as its starting point a contemporary condition wherein memory, AI, and the environment are bewilderingly entangled. It takes a different path from emerging critical environmental scholarship that addresses AI’s environmental impact, spotlighting the material inputs required to sustain the vast infrastructures that facilitate AI’s integration and evolution (Strubell et al. Reference Strubell, Ganesh and McCallum2020; Crawford Reference Crawford2021; De Vries Reference De Vries2023; Luers et al. Reference Luers, Koomey, Masanet, Gaffney, Creutzig, Lavista Ferres and Horvitz2024). While it is important to further develop political ecological approaches to AI, this article, instead, weaves together epistemological, aesthetic, technological, and political concerns.

Drawing on Andrew Hoskins’s observation that memory can take on uncanny qualities when confronted with ‘that which seems strangely familiar yet unreal’ (Hoskins Reference Hoskins2024, 3), this article explores how AI produces a terrain of memory that collapses distinctions between real and unreal. Specifically, this article addresses the distortion of environmental memory under conditions shaped by opaque technological systems. It argues that the ecological contexts of memory – the where of memory – are vulnerable to AI-generated distortions. By extending Hoskins’s theorization of memory into environmental domains, this article highlights the ecological stakes of AI’s integration into society to investigate how it reshapes environmental memory and human–environment relations.

This article proceeds as follows. First, to foreground environmental concerns, it engages ideas centred on environmental memory. Second, it examines the concepts ‘shifting baseline syndrome’ (SBS) and ‘environmental generational amnesia’ (EGA), which describe a gradual change in the accepted norms of environmental conditions ‘due to lack of past information or lack of experience of past conditions’ (Soga and Gaston Reference Soga and Gaston2018, 222). Third, it reviews emerging scholarship on AI and memory, drawing on recent work published in this journal and beyond. Fourth, this article explores AI’s impact on environmental memory through mnemonic, aesthetic, and political perspectives to uncover its broader implications for human–environment relations.

Memory and the environment

Reflecting on a longstanding tendency to reduce the environment to a backdrop of human affairs, Serres observes, ‘Our culture abhors the world’ (Serres Reference Serres1995, 3). This repugnance stems from an anthropocentric conceit that simultaneously refuses the idea that nonhuman natures possess forms of intelligence and rejects the notion that culture is possible only by virtue of the material conditions that sustain it. It is, in other words, predicated on a deep – and willful – misunderstanding of the constitutive, albeit nondeterministic, role the environment plays in the production and evolution of culture (Castree Reference Castree, Castree and Braun2001; Latour Reference Latour2004; Whatmore Reference Whatmore2006).

Of the many attempts to recentre the environment as a historical actor on a grand scale, some have identified memory as a key point at which ecology and culture intersect (Till Reference Till, Agnew, Mitchell and Toal2003). Gülüm et al. invite us to understand ‘memory and environment as embodied co-constitutive and co-constructed’ (Gülüm et al. Reference Gülüm, Leworthy, Tabaszewska and Teichler2024, 9). ‘Memory’, writes Kenny, ‘needs a place, a context’ (Kenny Reference Kenny1999, 421). Whether individual or collective, memory coheres through the reciprocal relations fostered with the world over time. In the absence of that world, memory is impossible. This carries two implications. First, there is an expansive ecological dimension of memory that exceeds the experiencing subject. This dimension contains not only the elemental conditions – for example, water, air, sunlight, and so forth – that sustain life, but also subtler characteristics, like the contours, sights, smells, tastes, sounds, and so forth of familiar landscapes. This is memory’s where, the broader ecological contexts of lived experience that make memory possible.

Second, the notion of environmental memory decentres the human as the sole carrier of memory and recentres human–environment relations at the locus of meaning-making. Recalling Serres, rather than a mute backdrop, the environment is the dynamic stage upon which human dramas unfold co-evolutionarily. ‘Thought, like memory’, writes Macfarlane, ‘inhabits external things as much as the inner regions of the human brain’ (Macfarlane Reference Macfarlane2007, 100). In externalizing memory and situating it as a product of human–environment relations, it follows that as environments undergo change, so too do the stories we tell of them and the memories we hold of them. Macfarlane continues, ‘When the physical correspondents of thought disappear, then thought, or its possibility, is also lost. When woods and trees are destroyed…imagination and memory go with them’ (Macfarlane Reference Macfarlane2007, 100). In this sense, the environmental crisis is also a crisis for memory and culture.

SBS and EGA

Environmental memory’s vulnerability is not, however, a mere theoretical abstraction. There is an empirical basis that shows how ecology and memory are connected. Towards this end, engaging with two concepts – SBS and EGA – will make these connections legible.

SBS and EGA help us understand ecological degradation and the erosion of environmental memory. Both phenomena describe how perceptions of environmental norms change over time, often resulting in diminished awareness of ecological loss and reduced incentive for conservation. Coined by Pauly (Reference Pauly1995), ‘SBS’ refers to the phenomenon in which each generation perceives the environment it encounters in its youth as ‘normal’, thereby failing to recognize the extent of ecological degradation that occurred in previous generations. Writing about marine fisheries, Pauly observed that each generation of fisheries scientists accepts as a baseline the stock size and species composition that occurred at the beginning of their careers, leading to a ‘creeping disappearance of resource species’ (Pauly Reference Pauly1995, 430). As degraded baselines are normalized, conservation targets become softer, masking the true scale of biodiversity loss.

Complementing Pauly’s conservation-based perspective, Soga and Gaston (Reference Soga and Gaston2018) review broader causes and consequences of SBS, loosely categorizing them into psychological, sociocultural, and informational domains. They note that urbanization, reduced direct contact with the environment, and reliance on short-term datasets contribute to shifting perceptions of what constitutes a ‘healthy’ environment. Moreover, the authors warn that SBS not only hampers biodiversity conservation but also undermines public support for environmental policy by eroding experiential knowledge and understanding (2018, 228).

Closely related to SBS is ‘EGA’, which Kahn describes as the tendency for people to accept the degraded environmental conditions they grow up with as normal (Kahn Reference Kahn2002, 113). Kahn explores how children’s experiences with nature influence cognitive and emotional development. As environments degrade, children form emotional bonds with increasingly artificial or contaminated settings, perpetuating an unconscious acceptance of environmental loss. This process contributes to baseline shift at individual and collective levels, wherein the full extent of what is lost is forgotten, further reinforcing cycles of ecological neglect. Similarly, Papworth et al. (Reference Papworth, Rist, Coad and Milner‐Gulland2009) demonstrate how generational shifts in expectations can undermine conservation outcomes. Their study shows that younger individuals often have less accurate perceptions of wildlife abundance compared to older individuals, illustrating how memory and lived experience shape ecological knowledge (see also Pyle Reference Pyle1993; Miller Reference Miller2005; Glassberg Reference Glassberg2014; Craps Reference Craps2024).

This research reveals how SBS and EGA are driven by experiential limits, cultural discontinuities, and informational gaps. By drawing attention to risks posed by ecological forgetting and the erasure of environmental knowledge, this research underscores the need to preserve historical memory and re-establish experiential connections. Further, SBS and EGA identify two related pathways for forgetting. First, they show how environmental change influences stories told about the places we live. Second, they show how the failure to transmit information about environmental change over time can lead to forgetting. This second pathway involves communication strategies and technologies that facilitate, or thwart, the passage of information and the cohesion of memory. This is an increasingly confounding matter to which I now turn.

AI and memory

Scrutinizing the relationships between memory and technology is essential for understanding the rapidly shifting terrain of social experience. AI’s integration into everyday life has sparked a surge of scholarly attention, particularly around its socio-economic and ethical implications. While extensive research has explored AI’s impact on labour, algorithmic governance, and digital equity (Frey and Osborne Reference Frey and Osborne2017; Noble Reference Noble2018; Amoore Reference Amoore2020), less examined – but increasingly vital – are AI’s effects on the architecture of memory. Recent scholarship has begun to investigate how AI mediates and co-produces memory, reshaping how individuals and societies remember.

Central to this discussion is Hoskins’s (Reference Hoskins2024) conceptualization of a ‘third way of memory’, which involves a ‘mixing of the machinic and human in new ways’ (Hoskins Reference Hoskins2024, 3). For Hoskins, AI systems do more than store information. They can generate synthetic pasts, reframe histories, and foster new modes of memory. There is promise here for novel forms of expression and creativity. Yet, there is also danger in ‘the potential of AI to consort with, challenge, and also replace the agency of human remembering and forgetting’ (Hoskins Reference Hoskins2024, 3). Memory cannot be reduced to mental or social phenomena but rather is increasingly and confusingly co-constructed through human–machine entanglements.

AI systems working in tandem with powerful sorting algorithms mediate how people engage with their pasts. Platforms such as Facebook, Apple Photos, and Google Memories surface content via automated prompts, often without user consent. These algorithmically generated memories prioritize machine-learned patterns over personal significance, presenting users with emotionally charged moments filtered through encoded principles of engagement rather than recollection (Henriksen Reference Henriksen2024). Remembering becomes less an act of reflection and more a response to predictive systems designed to maximize user engagement. Under AI, the line between sorting and distorting is blurred. The result is a form of algorithmic recall wherein users are confronted with memories they neither chose nor expected to revisit. The automated resurfacing of past events represents a shift in memory from active recollection to passive encounter. The emotional contexts of this shift are illuminated by Jacobsen and Beer’s (Reference Jacobsen and Beer2021) concept of ‘quantified nostalgia’. In their analysis, the personal past is made visible through metrics – for example, likes, shares, and comments – that influence which memories are elevated and which recede.

Novel changes in memory also manifest in shared experiences. Consider Hoskins’s (Reference Hoskins2024) notion of the ‘conversational past’, which describes how AI enables a co-constructed digital memory space where humans and machines produce narratives in real time. While this may foster new forms of social participation and engagement with the past, it also introduces instability, as algorithmically generated reconstructions of the past are vulnerable to manipulation and falsification and may lack grounding in shared experience. Pilkington (Reference Pilkington2024) echoes this concern through the concept of ‘myopic memory’, which identifies the erasure of critical and historical understanding under AI and platform capitalism. Algorithmic emphasis on personalization fragments collective experience, narrowing exposure to alternative viewpoints and reducing memory to a placeholder for emotionally resonant but politically dispassionate content. In this configuration, collective memory is not grounded in critical reflection or dissent but in algorithmic replication.

These concerns are elaborated in Smit et al.’s (Reference Smit, Jacobsen and Annabell2024) framing of ‘platformed remembering’. They argue digital platforms have become dominant memory infrastructures, organizing public and private pasts through opaque systems governed by commercial interests. This framing resonates with Hoskins’s notion of ‘grey memory’ that addresses how ‘contemporary technologies push out of individual human reach a conscious, active, willed memory, through obscuring the risks of the ownership, use, access, costs, and finitude of digital data’ (Hoskins and Halstead Reference Hoskins and Halstead2021; Hoskins Reference Hoskins2024, 13). The implications are profound: what societies remember – and what they forget – is increasingly shaped by systems designed for data extraction, behavioural prediction, and profit maximization.

This research identifies a crucial transformation: in the age of AI, memory must be understood not as an inner archive or shared tradition, but as an entangled process shaped by machine learning, platform architectures, and algorithmic influence. While the externalization of memory via technological prostheses has long been the case – from stone tablets to personal digital assistants – the introduction of AI has accelerated the degree to which technologies mediate, filter, and produce memory according to logics external to the remembering subject (Hoskins Reference Hoskins, Prescott and Wiggins2023).

AI and environmental memory

Previous pages have outlined two theoretical perspectives on memory: environmental memory and AI-mediated memory. The remaining aim is to identify their points of contact and articulate a deeper sense of the epistemological, aesthetic, and political challenges environmental memory confronts in the age of AI. If it is the case that sociocultural memories are increasingly manipulated, fragmented, and falsified, it would not be outrageous to suggest that environmental memory – how communities remember and experience environmental change over time – faces similar threats.

As SBS and EGA suggest, the intergenerational transmission of environmental memory is difficult even under ideal conditions. Environmental change typically unfolds at a pace that exceeds lived human perception. Although gradual change may be disrupted by acute disasters like hurricanes, wildfires, and floods, the broader challenge remains: the slow subtlety of environmental transformation complicates how experiences are remembered and passed on. Therefore, the conveyance of environmental memory is critical for understanding how environments change. Stories of fisheries populations a generation before, of glaciers gaining mass, of the richness of a buzzing summertime meadow, provide a depth of environmental understanding not easily captured by quantitative environmental data alone (Macfarlane Reference Macfarlane2019; Farrier Reference Farrier2020; Morehouse and Cigliano Reference Morehouse and Cigliano2020). As Haraway notes, ‘It matters what stories make worlds, what worlds make stories’ (Haraway Reference Haraway2016, 12). Stories are important not only because they reflect the world but also because they produce it.

However, a loss of confidence in memory’s where is not just a matter of how we relate to past environments. More than fostering an understanding of where we were, sharing environmental memories clarifies where we are and where we might be heading. If the past recedes because ecological contexts have changed or because AI is generating hallucinations by invoking non-existent events, the dislocation of environmental memory from its constitutive where undermines the capacity to mobilize memory to avert future catastrophe, a point developed below. This empties memory of its transformative potential in a manner that aligns with Pilkington’s ‘myopic memory’ (Pilkington Reference Pilkington2024). It also locks us into what Crary refers to as a ‘shallow present’ marked by a systematic erasure of the past and a withdrawal of the future (Crary Reference Crary2013, 41).

AI and uncanny environments

Beyond reconfiguring stories that emerge through human–environment interactions, AI presents an aesthetic challenge in its manipulation of environmental images. AI tools like ChatGPT and Midjourney are becoming more adept at generating hyperrealist media that pass as genuine models of reality. Granted, there is often an uncanny tell that evokes Fisher’s sense of ‘the eerie’, which ‘occurs either when there is something present where there should be nothing, or there is nothing present when there should be something’ (Fisher Reference Fisher2016, 61). An AI-generated group photo might feature an errant arm. An AI-generated sugar maple leaf image might have four lobes instead of five. Sunlight in an AI-generated forest scene might land in a peculiar manner. I asked ChatGPT to create a ‘nostalgic image of a tree fort’, and the resulting structure was precariously nestled in the crotch of a tree, bathed in comforting light.

Despite its imperfections, a hyperrealist AI aesthetic is often sufficient for conjuring a ‘good enough’ version of reality. Admittedly, a single AI-generated image of a tree fort is unlikely to supplant my own memories of the tree forts of my youth. However, the amassing of AI-generated images at scale is more likely to distort shared perceptions of reality and influence lived experiences and memories. Algorithms play a critical role in this distortion. ‘The past,’ writes Hoskins, ‘is caught up in this algorithmic narrowing of information, knowledge, life’ (Hoskins Reference Hoskins, Prescott and Wiggins2023, 12). The way algorithms aid in the sorting and proliferation of AI-generated media risks polluting memory, lending it an uncanny quality.

Hypothetically, this algorithmically fueled capture of the real can conjure aesthetic versions of the past that dislocate memory from its environmental contexts. It is not so much the memory that is targeted, but the memory’s where. As Paglen observes, AI systems influence our perceptions of the world towards particular ideological ends: ‘These influencing machines generate hallucinations, uncanny ways of seeing, that make you see something in a particular way or believe something in a particular way’ (Paglen Reference Paglen and Downey2024, 128). While it is possible to generate innumerable variations of AI environments, there are three potentially influential versions that come to mind. First, AI can situate the environment in ahistorical and idealized terms, invoking an Edenic version of nature that always has been and always will be. Second, AI can construct environmental pasts that suggest ecological change is not occurring. AI-generated images of glacier stability in Greenland, for example, could reinforce climate denialism. Third, AI can produce catastrophic versions of previous environments, implying that environmental crisis was a thing of the past, and that current conditions are comparatively better.

This is an inherently speculative line of thought. Yet, given the nascent phase of AI’s integration into everyday life, sometimes speculation is the only thing to lean on. Still, a distinction can be made between prophecy and prognosis. As AI becomes more adept at creating hyperrealist images algorithmically distributed at scale, AI-generated images may colonize environmental memory and shape past, present, and future environmental relations.

Information, scepticism, and the future

Thus far, I have explored how the integration of AI complicates environmental memory, focusing on cultural narratives and hyperrealist aesthetics. This section examines AI’s political dimensions, focusing on how the transmission of environmental memory is complicated by the nature of information in the digital age. A wide range of scholarship has examined challenges posed by disinformation and conspiracy, emphasizing their outsized role in undermining sociopolitical stability (Bennett and Livingston Reference Bennett and Livingston2020; Kuo and Marwick Reference Kuo and Marwick2021), public health (Neylan et al. Reference Neylan, Patel and Erickson2022), and climate change (Treen et al. Reference Treen, Williams and O’Neill2020; Lewandowsky Reference Lewandowsky2021). Just as memory’s untethering from its where compromises future pathways, when experiential knowledge is untethered from reality, it becomes vulnerable to manipulation and weaponization.

There are numerous examples of disinformation and conspiracy pertaining to environmental change. Recall US Representative Marjorie Taylor Greene’s insistence that 2024’s Hurricane Helene was deliberately created by Democrats to improve election outcomes, posting: ‘Yes, they can control the weather. It’s ridiculous for anyone to lie and say it can’t be done’ (Marjorie Taylor Greene 2024). Climate catastrophes have been submitted as evidence of global political conspiracy: ‘The Patriot Voice’, for example, advanced a theory that the 2025 Los Angeles wildfires were started to clear land so officials could make way for a ‘fully operational AI-based Smart City’ (The Patriot Voice Reference Voice2025). FEMA has had to make public statements denouncing allegations that post-catastrophe outreach is a deep state objective to establish concentration camps for imprisoning and exterminating citizens in order to impose a New World Order (FEMA 2024). It is tempting to dismiss these as fringe theories, but it is important to note that: (a) such theories no longer dwell at the margins of political thought but are now at its centre; (b) these conspiracies often cohere around anti-science discourses targeting methods that generate awareness of environmental change; and (c) the aim of many of these theories is engagement, which is increasingly determined by an algorithmic logic more focused on virality than intelligibility.

It is worth exploring this last point. Information that promotes environmental awareness is polluted by algorithmic processes optimized for user engagement. Studies suggest that an effective strategy for increasing user engagement is to appeal to resentment, secrecy, and conspiracy rather than reason (Ledwich and Zaitsev Reference Ledwich and Zaitsev2020; Carroll et al. Reference Carroll, Chan, Ashton and Krueger2023; Rodilosso Reference Rodilosso2024; Milli et al. Reference Milli, Carroll, Wang, Pandey, Zhao and Dragan2025). Worldviews are increasingly shaped by falsehoods propagated online, which yield a splintered reality overwhelmed by informational contradictions. Granted, the problem of disinformation is hardly a new one. Powerful state actors, working in conjunction with media, have long sought control over the form, content, and distribution of information to distort individual and collective memory. Often, these efforts focus on alienating people from their material conditions and cultivating identities in keeping with dominant forms of power. However, at present, it is unclear whether recent iterations of this longstanding tendency represent a difference in degree or kind. Algorithms, for example, permit a level of informational specificity that allows for the tailoring of propaganda at mass and individual scales, like the company GoLaxy, which can ‘deploy humanlike bot networks and psychological profiling to target individuals’ (Goldstein and Benson Reference Goldstein and Benson2025). Such granular degrees of information manipulation will likely have significant consequences for how individuals, groups, and societies construct relationships with the world. There is, thus, a novel cause for concern around disinformation and its implications for social, political, and environmental stability.

To combat this, many advocate for data literacy (Gummer and Mandinach Reference Gummer and Mandinach2015; Wolff et al. Reference Wolff, Gooch, Montaner, Rashid and Kortuem2016; Laupichler et al. Reference Laupichler, Aster, Schirch and Raupach2022). This strategy assumes that with sufficient instruction, people will be better able to interpret data and pivot to truthfulness. Calls for increased literacy, however, often overlook strong cultural tendencies towards disavowal, which take the form of ‘I know well, but all the same…’ (Mannoni Reference Mannoni, Rothenberg, Foster and Zizek2003; Zupanačič Reference Zupanačič2024, 2). People sometimes enjoy inhabiting worlds with a tenuous connection to reality. This is the appeal of conspiracies: by providing a convenient scapegoat, they make the world seem less chaotic and more controllable (Lewandowsky Reference Lewandowsky2021).

Disinformation alone is a challenging issue. Adding elements like AI-generated memories and environments makes it a monstrous problem, spawning a form of scepticism that is either unable or unwilling to differentiate between real and unreal. This boundary is becoming destabilized in a manner commensurate with Hoskins’s assertion, ‘the recent rapid development and accessibility of AI and related technologies and services, heralds a new battleground between humans and computers in the shaping of reality’ (Hoskins Reference Hoskins2024, 1). Recalling the L.A. wildfire conspiracy, the operative causal issue concerns worsening conditions under climate change. Yet, this is obscured and reconfigured as evidence of a global political plot. In making room for political delusions, such conspiracies effectively ward off the real, relegating it to an illusory background so that ideological machinations take centre stage.

As suggested in this article’s epigraph quoting Arendt, the most destructive consequence of organized disinformation is not that ‘the lie will…be accepted as truth and truth be defamed as a lie’ (Arendt Reference Arendt1967, 78). Rather, it is that a relentless flow of disinformation provokes a fundamental shift in our relationship with the real world. Arendt’s position is all too relevant for current social, political, environmental, and technological conditions. As more people become aware that AI’s generative capacities are increasingly realistic and persuasive, false claims that real content is AI-generated will become more persuasive, too. This phenomenon, referred to as ‘the liar’s dividend’ (Chesney and Citron Reference Chesney and Citron2019; Schiff et al. Reference Schiff, Schiff and Bueno2025), has implications for social and political stability, an issue already witnessed in deep fakes of prominent politicians.

Here, a question emerges as to whether visual documentation of ecological degradation is sufficient for rallying people around an environmental cause if evidence can be easily dismissed as AI-generated slop. This has far-reaching repercussions for the mobilization for better environmental futures. As discussed, intergenerational transmission of environmental memory is crucial for developing strategies that link past ecologies with present and future ones. This is a challenge in ideal circumstances and an even greater one in view of memory’s fragmentation. If memory is dislocated from its where, it is no longer certain how it can function in the world. Indeed, building consensus around environmental issues is unthinkable if neither the crisis nor reality is recognized as real.

Conclusion

Pessimism is an understandable reaction to current sociotechnical and environmental conditions, but it is a poor destination. As this article suggests, AI can distort memory, experience, and knowledge to such an extent that it destabilizes how we relate to the world. In facing these unreal circumstances, we might think creatively about pursuing new kinds of human-technological relationships in the spirit of Haraway’s cyborgism, which refuses a distinction between organism and machine (2013, 149). As some posthumanism scholars suggest, sensors, codes, and algorithms can offer novel opportunities to think and sense with, cultivating new intimacies with the world (Gabrys 2016; Richardson and Zolkos 2022; Turnbull et al. 2023; Richardson 2024). However, pursuing the path of hybridity without accounting for the epistemological, aesthetic, and political concerns raised in this essay carries risk.

At the very least, this discussion highlights the need for robust AI regulation in public and private sectors. There is too much at stake in our collective fate to entrust AI development to a corporatocratic model content to ‘move fast and break things’. This raises the question of what can be done to shield environmental memory from AI’s hallucinatory influences. Calls for re-establishing connections with the environment offer strategies for combating environmental forgetting. However, there are great disparities in environmental access, and the privilege of remembering ought not to be an exclusive one. In attempting to restore relationships with the real world so that we might strategise for better futures, perhaps it is best to identify what survival requires in the most elemental sense: attunement. Starting here might allow us to address the pervasive silence around environmental loss and recognize that environmental memory is a matter of meaning-making, intergenerational storytelling, and mourning. How else are we to resist forgetting than to allow ourselves to be wholly, vulnerably, and imperfectly human amidst increasingly inhuman circumstances?

Acknowledgements

The author would like to extend gratitude to the students in his Spring 2025 Political Media Ecologies course, who helped foster a deeper understanding of technology’s impacts on memory. The author thanks the two anonymous reviewers of the article for their constructive comments and suggestions. All errors and omissions are the author’s alone.

Funding statement

The author received no financial support for the research, authorship, and/or publication of this article.

Competing interests

The author declares none.

Harlan Morehouse is an assistant professor of Geography and Geosciences and co-director of Environmental Studies at the University of Vermont. His research interests lie at the intersection of political ecology and environmental philosophy, with a current focus on disinformation and its broad social, political, and environmental impacts.

References

Amoore, L (2020) Cloud Ethics: Algorithms and the Attributes of Ourselves and Others. Durham, NC: Duke University Press.
Arendt, H (1967) Truth and politics. The New Yorker, February 25, 49–88.
Bennett, WL and Livingston, S (2020) The Disinformation Age: Politics, Technology, and Disruptive Communication in the United States. Cambridge: Cambridge University Press.
Carroll, M, Chan, A, Ashton, H and Krueger, D (2023) Characterizing Manipulation from AI Systems. Proceedings of the 3rd ACM Conference on Equity and Access in Algorithms, Mechanisms, and Optimization, Boston, MA, USA. Available at https://doi.org/10.1145/3617694.3623226.
Castree, N (2001) Socializing nature: Theory, practice, and politics. In Castree, N and Braun, B (eds), Social Nature: Theory, Practice, and Politics. Hoboken, NJ: Wiley-Blackwell, pp. 1–21.
Chesney, R and Citron, D (2019) Deepfakes and the new disinformation war: The coming age of post-truth geopolitics. Foreign Affairs 98, 147.
Craps, S (2024) Lost words and lost worlds: Combatting environmental generational amnesia. Memory Studies Review 1 (aop), 1–20.
Crary, J (2013) 24/7: Late Capitalism and the Ends of Sleep. London: Verso.
Crawford, K (2021) Atlas of AI. New Haven, CT: Yale University Press.
De Vries, A (2023) The growing energy footprint of artificial intelligence. Joule 7(10), 2191–2194.
Farrier, D (2020) Footprints: In Search of Future Fossils. New York: Farrar, Straus and Giroux.
FEMA (2024) Rumor: FEMA established secure sites near disaster-affected areas to operate as “FEMA camps” to detain people or to prioritize our responders over the needs of survivors. FEMA. Available at https://www.fema.gov/node/rumor-responder-lodging.
Fisher, M (2016) The Weird and the Eerie. London: Repeater Books.
Frey, CB and Osborne, MA (2017) The future of employment: How susceptible are jobs to computerisation? Technological Forecasting and Social Change 114, 254–280.
Gabrys, J (2016) Program Earth: Environmental Sensing Technology and the Making of a Computational Planet. Minneapolis, MN: University of Minnesota Press.
Glassberg, D (2014) Place, memory, and climate change. The Public Historian 36(3), 17–30.
Goldstein, BJ and Benson, BV (2025) The era of A.I. propaganda has arrived, and America must act. The New York Times, August 5.
Gülüm, E, Leworthy, P, Tabaszewska, J and Teichler, H (2024) Memory and environment. Memory Studies Review 1(1), 3–15.
Gummer, ES and Mandinach, EB (2015) Building a conceptual framework for data literacy. Teachers College Record 117(4), 1–22.
Haraway, DJ (2013) A cyborg manifesto: Science, technology, and socialist-feminism in the late twentieth century. In Simians, Cyborgs and Women: The Reinvention of Nature. New York: Routledge, pp. 149–181.
Haraway, DJ (2016) Staying with the Trouble: Making Kin in the Chthulucene. Durham, NC: Duke University Press.
Henriksen, EE (2024) Algorithmically generated memories: Automated remembrance through appropriated perception. Memory, Mind & Media 3, e11.
Hoskins, A (2023) New memory and the archive. In Prescott, A and Wiggins, A (eds), Archives: Power, Truth, and Fiction. Oxford: Oxford University Press.
Hoskins, A (2024) AI and memory. Memory, Mind & Media 3, e18. https://doi.org/10.1017/mem.2024.16.
Hoskins, A and Halstead, H (2021) The new grey of memory: Andrew Hoskins in conversation with Huw Halstead. Memory Studies 14(3), 675–685.
Jacobsen, BN and Beer, D (2021) Quantified nostalgia: Social media, metrics, and memory. Social Media + Society 7(2), 20563051211008822.
Kahn, PH (2002) Children’s affiliations with nature: Structure, development, and the problem of environmental generational amnesia. In Children and Nature: Psychological, Sociocultural, and Evolutionary Investigations, pp. 93–116.
Kenny, MG (1999) A place for memory: The interface between individual and collective history. Comparative Studies in Society and History 41(3), 420–437.
Kuo, R and Marwick, A (2021) Critical disinformation studies: History, power, and politics. Harvard Kennedy School Misinformation Review 2(4), 1–11.
Latour, B (2004) Politics of Nature: How to Bring the Sciences into Democracy. Cambridge, MA: Harvard University Press.
Laupichler, MC, Aster, A, Schirch, J and Raupach, T (2022) Artificial intelligence literacy in higher and adult education: A scoping literature review. Computers and Education: Artificial Intelligence 3, 100101.
Ledwich, M and Zaitsev, A (2020) Algorithmic extremism: Examining YouTube’s rabbit hole of radicalization. First Monday 25(3).
Lewandowsky, S (2021) Conspiracist cognition: Chaos, convenience, and cause for concern. Journal for Cultural Research 25(1), 12–35.
Lewandowsky, S (2021) Climate change disinformation and how to combat it. Annual Review of Public Health 42(1), 1–21.
Luers, A, Koomey, J, Masanet, E, Gaffney, O, Creutzig, F, Lavista Ferres, J and Horvitz, E (2024) Will AI accelerate or delay the race to net-zero emissions? Nature 628(8009), 718–720.
Macfarlane, R (2007) The Wild Places. London: Penguin.
Macfarlane, R (2019) Underland: A Deep Time Journey. New York: W. W. Norton & Company.
Mannoni, O (2003) I know well, but all the same…. In Rothenberg, MA, Foster, DA and Zizek, S (eds), Perversion and the Social Relation. Durham, NC: Duke University Press.
Miller, JR (2005) Biodiversity conservation and the extinction of experience. Trends in Ecology & Evolution 20(8), 430–434.
Milli, S, Carroll, M, Wang, Y, Pandey, S, Zhao, S and Dragan, AD (2025) Engagement, user satisfaction, and the amplification of divisive content on social media. PNAS Nexus 4(3), pgaf062.
Morehouse, H and Cigliano, M (2020) Cultures and concepts of ice: Listening for other narratives in the Anthropocene. Annals of the American Association of Geographers, 1–8.
mtgreenee [@mtgreenee] (2024) Yes they can control the weather. It’s ridiculous for anyone to lie and say it can’t be done. 3 October 2024. Available at https://x.com/mtgreenee/status/1842039774359462324.
Neylan, JH, Patel, SS and Erickson, TB (2022) Strategies to counter disinformation for healthcare practitioners and policymakers. World Medical & Health Policy 14(2), 428–436.
Noble, SU (2018) Algorithms of Oppression: How Search Engines Reinforce Racism. New York: New York University Press.
Paglen, T (2024) Influencing machines. In Downey, A (ed), Adversarially Evolved Hallucinations. London: Sternberg Press, pp. 114–137.
Papworth, SK, Rist, J, Coad, L and Milner-Gulland, EJ (2009) Evidence for shifting baseline syndrome in conservation. Conservation Letters 2(2), 93–100.
Pauly, D (1995) Anecdotes and the shifting baseline syndrome of fisheries. Trends in Ecology and Evolution 10(10), 430.
Pilkington, D (2024) Myopic memory: Capitalism’s new continuity in the age of AI. Memory, Mind & Media 3, e24.
Pyle, RM (1993) The Thunder Tree: Lessons from an Urban Wildland. Corvallis, OR: Oregon State University Press.
Richardson, M (2024) Nonhuman Witnessing: War, Data, and Ecology after the End of the World. Durham, NC: Duke University Press.
Richardson, M and Zolkos, M (2022) Witnessing after the human. Angelaki 27(2), 3–16.
Rodilosso, E (2024) Filter bubbles and the unfeeling: How AI for social media can foster extremism and polarization. Philosophy & Technology 37(2), 71.
Schiff, KJ, Schiff, DS and Bueno, NS (2025) The liar’s dividend: Can politicians claim misinformation to evade accountability? American Political Science Review 119(1), 71–90.
Serres, M (1995) The Natural Contract. Ann Arbor, MI: University of Michigan Press.
Smit, R, Jacobsen, B and Annabell, T (2024) The multiplicities of platformed remembering. Memory, Mind & Media 3, e3.
Soga, M and Gaston, KJ (2018) Shifting baseline syndrome: Causes, consequences, and implications. Frontiers in Ecology and the Environment 16(4), 222–230.
Strubell, E, Ganesh, A and McCallum, A (2020) Energy and policy considerations for modern deep learning research. Proceedings of the AAAI Conference on Artificial Intelligence 34(09), 13693–13696.
Till, KE (2003) Place and memory. In Agnew, J, Mitchell, K and Toal, G (eds), A Companion to Political Geography. Malden, MA: Blackwell Publishing, pp. 289–301.
Treen, KMI, Williams, HT and O’Neill, SJ (2020) Online misinformation about climate change. Wiley Interdisciplinary Reviews: Climate Change 11(5), e665.
Turnbull, J, Searle, A, Hartman Davies, O, Dodsworth, J, Chasseray-Peraldi, P, von Essen, E and Anderson-Elliott, H (2023) Digital ecologies: Materialities, encounters, governance. Progress in Environmental Geography 2(1–2), 3–32.
Twain, M (2013) Autobiography of Mark Twain, Volume 2, 1st edn. Mark Twain Papers. Berkeley: University of California Press.
Voice, TP [@TPV_John] (2025) Los Angeles Pacific Palisades is totally engulfed in flames and very interestingly, LA is positioned to be an AI Smart City by 2028. 8 January 2025. Available at https://x.com/TPV_John/status/1876881137772990953.
Whatmore, S (2006) Materialist returns: Practising cultural geography in and for a more-than-human world. Cultural Geographies 13(4), 600–609.
Wolff, A, Gooch, D, Montaner, JJC, Rashid, U and Kortuem, G (2016) Creating an understanding of data literacy for a data-driven society. The Journal of Community Informatics 12(3), 9–26.
Zupančič, A (2024) Disavowal. Cambridge: Polity.