Low-lying atoll islands are among the world’s most vulnerable coastal environments to sea-level rise (SLR). Global application of coastal flooding models suggests that centennial flood events may become annual events by 2050 in tropical regions. This article addresses this claim by modelling an island flooding event that occurred in the Maldives on 1 July 2022 as a result of a distant-swell event coinciding with an extra high spring tide. Hydrodynamic data collected after the event on one of the affected islands were used to calibrate and validate a one-dimensional non-hydrostatic XBeach model. The model overpredicted wave setup and underpredicted the water motion at frequencies <0.05 Hz, but the wave run-up elevation was predicted reasonably well. The 1 July flood event was considered in a decadal context using modelled wave data and measured tide data. It was concluded that the 1 July event represents a c. 1:25-year flooding event, but, due to SLR, such flooding could occur every few years by 2050. This prediction ignores natural or anthropogenic adjustments to the island morphology. The expected increase in frequency of coastal flooding in the Maldives requires atoll and island authorities to act swiftly in adapting to future flood risk.
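The decadal return-period framing used above can be illustrated with a simple empirical plotting-position calculation. The sketch below is not the study's method or data: the water levels are invented, and the Weibull formula is just one common choice of plotting position.

```python
import numpy as np

def return_periods(annual_maxima):
    """Empirical return periods (Weibull plotting position) for annual maxima."""
    x = np.sort(np.asarray(annual_maxima, dtype=float))[::-1]  # descending
    n = len(x)
    ranks = np.arange(1, n + 1)
    T = (n + 1) / ranks  # return period (years) of each sorted level
    return x, T

# Illustrative annual-maximum water levels (m), NOT the study's data
levels = [1.2, 0.9, 1.0, 1.5, 0.8, 1.1, 0.95, 1.3, 1.05, 0.85]
x, T = return_periods(levels)
# Sea-level rise shifts the whole distribution upward, so a fixed flood
# threshold is reached by smaller (hence more frequent) extremes: comparing
# the threshold against x + SLR shows the shortened return period.
```

With ten annual maxima, the largest observed level is assigned an 11-year return period; a 1:25-year event, as in the abstract, requires extrapolation beyond the record, which is why the authors combine modelled waves with measured tides.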
The study of the shape of droplets on surfaces is an important problem in the physics of fluids and has applications in multiple industries, from agrichemical spraying to microfluidic devices. Motivated by these real-world applications, computational predictions for droplet shapes on complex substrates – rough and chemically heterogeneous surfaces – are desired. Grid-based discretisations in axisymmetric coordinates form the basis of well-established numerical solution methods in this area, but when the problem is not axisymmetric, the shape of the contact line and the distribution of the contact angle around it are unknown. Recently, particle methods, such as pairwise-force smoothed particle hydrodynamics (PF-SPH), have been used to conveniently forego explicit enforcement of the contact angle. The pairwise-force model, however, is far from mature, and there is no consensus in the literature on the choice of pairwise-force profile. We propose a new pair of polynomial force profiles with a simple motivation and validate the PF-SPH model in both static and dynamic tests. We demonstrate its capabilities by computing droplet shapes on a physically structured surface, a surface with a hydrophilic stripe and a virtual wheat leaf with both micro-scale roughness and variable wettability. We anticipate that this model can be extended to dynamic scenarios, such as droplet spreading or impaction, in the future.
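For readers unfamiliar with PF-SPH, a common baseline in this literature is the cosine-shaped pairwise force of Tartakovsky and Meakin; it is not the polynomial pair proposed in this article, but it illustrates the idea of short-range repulsion and longer-range attraction that vanishes beyond the interaction radius:

```python
import numpy as np

def pairwise_force(r, s, h):
    """Cosine-shaped pairwise interaction force used in PF-SPH models:
    repulsive (positive) for r < h/3, attractive (negative) for
    h/3 < r < h, and zero beyond the interaction radius h."""
    r = np.asarray(r, dtype=float)
    f = s * np.cos(1.5 * np.pi * r / h)
    return np.where(r <= h, f, 0.0)

# Sample the profile across the three regimes
f = pairwise_force(np.array([0.1, 0.5, 1.2]), s=1.0, h=1.0)
```

Summing such forces over neighbouring particles of like and unlike phases produces an effective surface tension and contact angle, which is why PF-SPH can forego explicit enforcement of the contact angle at the contact line.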
Tree-afflicting pests, such as insects and pathogens, could change forests in ways promoting invasions by non-native plants. After tree death associated with the fungal pathogen oak wilt (Bretziella fagacearum) and its attempted containment (severing root connectivity and sanitation removal of infected trees), we examined change in cover of the non-native liana Oriental bittersweet (Celastrus orbiculatus Thunb.; hereafter Celastrus) at 28 sites in temperate black oak (Quercus velutina Lam.) forests, Ohio, USA. During our 5-yr study spanning 2020 to 2024, Celastrus cover increased significantly (P < 0.05) through time at oak wilt sites but not in untreated reference forest sites without evidence of oak wilt. Celastrus cover increased by an order of magnitude, up to an average of 32 times among oak wilt treatments up to 10 yr old. By 2024, Celastrus cover ranged from 6% to 22% on average in 5- to 10-yr-old oak wilt treatments, compared with 1% cover in reference forest. Results indicate that non-native plant invasion accelerated following disturbance associated with a fungal pathogen and its attempted containment and, more generally, suggest that tree-afflicting pests can promote invasive plants in forests. Co-management of tree-afflicting pests and non-native plants may become increasingly important to ensure forests recovering from tree mortality are dominated by native plants.
Recent reports suggest the ON and OFF pathways are differentially susceptible to selective vision loss in glaucoma. Thus, perimetric assessment of ON- and OFF-pathway function may serve as a useful diagnostic. However, this necessitates a developed understanding of normal ON/OFF pathway function around the visual field and as a function of input intensity. Here, using electroencephalography, we measured ON- and OFF-pathway biased contrast response functions in the upper and lower visual fields. Using the steady-state visually evoked potential paradigm, we flickered achromatic luminance probes according to a saw-tooth waveform, the fast phase of which biased responses towards the ON or OFF pathways. Neural responses from the upper and lower visual fields were simultaneously measured using frequency tagging: probes in the upper visual field were modulated at 3.75 Hz, while those in the lower visual field were modulated at 3 Hz. We find that responses to OFF/decrements are larger than ON/increments, especially in the lower visual field. In the lower visual field, both ON and OFF responses were well described by a sigmoidal non-linearity. In the upper visual field, the ON pathway function was very similar to that of the lower, but the OFF pathway function showed reduced saturation and more cross-subject variability. Overall, this demonstrates that the relationship between the ON and OFF pathways depends on the visual field location and contrast level, potentially reflective of natural scene statistics.
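The frequency-tagging logic described above can be sketched numerically: two saw-tooth probes at different tag frequencies are mixed into one signal, then separated by reading the spectrum at each tag frequency. The sample rate, duration, and mixing weights below are illustrative, not the study's parameters.

```python
import numpy as np

fs = 1000               # sample rate (Hz); illustrative
dur = 8                 # seconds, an integer number of cycles of both tags
t = np.arange(0, dur, 1 / fs)

def sawtooth(freq, t, bias="off"):
    """Unit saw-tooth luminance profile. A slow rise with an abrupt fall
    (rapid decrement) biases OFF responses; the time-reversed ramp
    (abrupt rise, slow fall) biases ON responses."""
    phase = (t * freq) % 1.0
    return phase if bias == "off" else 1.0 - phase

upper = sawtooth(3.75, t)   # upper-field probe tagged at 3.75 Hz
lower = sawtooth(3.0, t)    # lower-field probe tagged at 3 Hz

# A simulated recording mixing both probes; frequency tagging recovers
# each field's contribution from the FFT amplitude at its tag frequency.
signal = 2.0 * upper + 1.0 * lower
spec = np.abs(np.fft.rfft(signal)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1 / fs)
amp_375 = spec[np.argmin(np.abs(freqs - 3.75))]
amp_300 = spec[np.argmin(np.abs(freqs - 3.0))]
```

Because 3 Hz and 3.75 Hz share no low-order harmonics, the two tagged responses do not contaminate each other's fundamental frequency bins.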
Accounting for 53% of U.S. peanuts (Arachis hypogaea L.), Georgia is the top peanut-producing state, with approximately 1.42 billion kg produced in 2023. Peanut producers often use the acetolactate synthase (ALS)-inhibiting imidazolinone herbicide imazapic, but reduced yellow nutsedge (Cyperus esculentus L.) control was reported in Georgia peanuts after 4 yr of continuous imazapic use. This study aimed to determine the level of resistance (LD50, I50, and GR50) and potential cross-resistance for the suspected resistant population and to identify the associated genetic mutations conferring resistance. A susceptible biotype was treated with 0, 0.0088, 0.0175, 0.035, 0.07, 0.14, 0.28, and 0.56 kg ai ha−1 of imazapic, and a resistant biotype was sprayed with 0, 0.07, 0.14, 0.28, 0.56, 1.13, 2.26, and 4.5 kg ai ha−1. To determine whether the suspected resistant biotype was cross-resistant to halosulfuron-methyl, an ALS herbicide used to control Cyperus spp., both biotypes were treated with 0, 0.0117, 0.0233, 0.0466, 0.0933, 0.187, 0.373, and 0.746 kg ai ha−1 of halosulfuron-methyl. Plants were rated for injury at 7, 14, and 28 d after treatment (DAT), and aboveground biomass was harvested at 28 DAT. For imazapic, the LD50 was 0.041 and 1.503 kg ai ha−1 and the GR50 was estimated to be 0.0128 and 1.853 kg ai ha−1 for the susceptible (Sus) and resistant (Res) biotypes, respectively, indicating 36- and 145-fold increases in resistance of the Res biotype for LD50 and GR50, respectively. Both biotypes responded similarly to applications of halosulfuron-methyl, with biomass reduction at rates greater than 0.023 kg ai ha−1. Transcriptome profiles revealed a mutation in the target-site gene of the resistant biotype causing an amino acid substitution from alanine to valine at position 205 (Ala-205-Val). Growers should continue to rotate chemistries and implement integrated weed management approaches for control of C. esculentus, as the use of imazapic over consecutive years has led to resistance in C. esculentus.
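Quantities like GR50 are typically obtained by fitting a log-logistic dose-response curve to biomass data (in R, usually with the `drc` package). The sketch below shows the same fit in Python with invented biomass values, not the study's measurements; the resistance index is then simply the ratio of GR50 estimates.

```python
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(dose, upper, gr50, slope):
    """Three-parameter log-logistic dose-response with lower limit fixed
    at 0; GR50 is the dose giving 50% growth reduction."""
    return upper / (1.0 + (dose / gr50) ** slope)

# Illustrative biomass data (% of untreated), NOT the study's measurements
doses = np.array([0.004, 0.0175, 0.035, 0.07, 0.14, 0.28, 0.56])
biomass = np.array([98, 90, 70, 45, 25, 12, 5], dtype=float)

popt, _ = curve_fit(log_logistic, doses, biomass, p0=[100, 0.06, 1.0])
upper, gr50, slope = popt

# Resistance index = GR50(resistant) / GR50(susceptible); with the
# abstract's estimates, 1.853 / 0.0128 gives the reported ~145-fold.
resistance_index = 1.853 / 0.0128
```

Untreated controls (dose 0) are normally handled by fixing the upper asymptote or using the four-parameter form; they are omitted here to keep the sketch minimal.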
Tree-of-heaven [Ailanthus altissima (Mill.) Swingle] readily exploits disturbances, grows quickly into dense monocultures, and suppresses native plant species. The vascular wilt pathogen, Verticillium nonalfalfae, native to the eastern United States, has been proposed as a biocontrol agent for the invasive A. altissima. Studies consistently demonstrate the safety and efficacy of the bioherbicide, but they also note that the selective nature of the fungus does not preclude other invasive plants that commonly co-occur with A. altissima from occupying the site. We quantified the standing plant community and seedbank at several sites across Virginia 5 yr after inoculation with V. nonalfalfae to understand which species are present or being naturally recruited. Ailanthus altissima remained dominant in untreated areas but was nearly eradicated from the treatment plots. Other non-native species made up a large portion of the plant community and seedbank across all study areas, with no differences in their respective cover and count between treatments. While variability in plant community composition is high and site-specific context is important for establishing effective management strategies, planting native species and mitigating other invasives will be crucial to ensuring native species successfully establish in bioherbicide-treated areas.
What has allowed inequalities in material resources to mount in advanced democracies? This chapter considers the role of media reporting on the economy in weakening accountability mechanisms that might otherwise have incentivized governments to pursue more equal outcomes. Building on prior work on the United States, we investigate how journalistic depictions of the economy relate to real distributional developments across OECD countries. Using sentiment analysis of economic news content, we demonstrate that the evaluative content of the economic news strongly and disproportionately tracks the fortunes of the very rich and that good (bad) economic news is more common in periods of rising (falling) income shares at the top. We then propose and test an explanation in which pro-rich biases in news tone arise from a journalistic focus on the performance of the economy in the aggregate, while aggregate growth is itself positively correlated with relative gains for the rich. The chapter’s findings suggest that the democratic politics of inequality may be shaped in important ways by the skewed nature of the informational environment within which citizens form economic evaluations.
The GINI project investigates the dynamics of inequality among populations over the long term by synthesising global archaeological housing data. This project brings archaeologists together from around the world to assess hypotheses concerning the causes and consequences of inequality that are of relevance to contemporary societies globally.
The IntCal family of radiocarbon (14C) calibration curves is based on research spanning more than three decades. The IntCal group have collated the 14C and calendar age data (mostly derived from primary publications with other types of data and meta-data) and, since 2010, made them available for other sorts of analysis through an open-access database. This has ensured transparency in terms of the data used in the construction of the ratified calibration curves. As the IntCal database expands, work is underway to facilitate best practice for new data submissions, make more of the associated metadata available in a structured form, and help those wishing to process the data with programming languages such as R, Python, and MATLAB. The data and metadata are complex because of the range of different types of archives. A restructured interface, based on the “IntChron” open-access data model, includes tools which allow the data to be plotted and compared without the need for export. The intention is to include complementary information which can be used alongside the main 14C series to provide new insights into the global carbon cycle, as well as facilitating access to the data for other research applications. Overall, this work aims to streamline the generation of new calibration curves.
Objective:
To support school food programmes by evaluating the relationship between nutritional quality, cost, student consumption and the environmental impacts of menus.
Design:
Using linear programming and data from previously served menu items, the relationships between the nutritional quality, cost, student consumption and the environmental impacts of lunch menus were investigated. Optimised lunch menus with the maximum potential student consumption and nutritional quality and lowest costs and environmental impacts were developed and compared with previously served menus (baseline).
Setting:
Boston Public Schools (BPS), Boston Massachusetts, USA.
Participants:
Menu items served on the 2018–2019 BPS lunch menu (n 142).
Results:
Using single-objective models, trade-offs were observed between most interests, but the use of multi-objective models minimised these trade-offs. Compared with the current weekly menus offered, multi-objective models increased potential caloric intake by up to 27 % and Healthy Eating Index scores by up to 19 % and reduced costs and environmental impacts by up to 13 % and 71 %, respectively. Improvements were made by reducing the frequency of beef and cheese entrées and increasing the frequency of fish and legume entrées on weekly menus.
Conclusions:
This work can be extrapolated to monthly menus to provide further direction for school districts, and the methods can be employed with different recipes and constraints. Future research should test the implementation of optimised menus in schools and consider the broader implications of implementation.
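The multi-objective optimisation described above can be sketched as a weighted-sum linear programme. The entrée categories, per-serving costs, emissions and quality scores below are invented for illustration; the study's actual models use real menu-item data and additional nutrition constraints.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical per-serving data for four entrées: [beef, cheese, fish, legume]
cost = np.array([2.50, 1.80, 2.20, 1.10])   # $ per serving
ghg  = np.array([6.0, 3.5, 1.8, 0.4])       # kg CO2e per serving
hei  = np.array([55, 60, 80, 85])           # diet-quality proxy score

# Weighted-sum scalarisation: minimise cost and emissions, maximise quality
w_cost, w_ghg, w_hei = 1.0, 0.5, 0.05
c = w_cost * cost + w_ghg * ghg - w_hei * hei

# Servings per 5-day week sum to 5; each entrée appears at most 3 times
res = linprog(c,
              A_eq=[[1, 1, 1, 1]], b_eq=[5],
              bounds=[(0, 3)] * 4, method="highs")
servings = res.x  # continuous relaxation of the weekly menu plan
```

As in the abstract's findings, the optimum shifts weekly servings away from beef and cheese toward fish and legume entrées; varying the weights traces out the trade-off frontier that single-objective models cannot capture.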
Climate change is resulting in global changes to sea level and wave climates, which in many locations significantly increase the probability of erosion, flooding and damage to coastal infrastructure and ecosystems. Therefore, there is a pressing societal need to be able to forecast the morphological evolution of our coastlines over a broad range of timescales, spanning days-to-decades, facilitating more focused, appropriate and cost-effective management interventions and data-informed planning to support the development of coastal environments. A wide range of modelling approaches have been used with varying degrees of success to assess both the detailed morphological evolution and/or simplified indicators of coastal erosion/accretion. This paper presents an overview of these modelling approaches, covering the full range of the complexity spectrum and summarising the advantages and disadvantages of each method. A focus is given to reduced-complexity modelling approaches, including models based on equilibrium concepts, which have emerged as a particularly promising methodology for the prediction of coastal change over multi-decadal timescales. The advantages of stable, computationally-efficient, reduced-complexity models must be balanced against the requirement for good generality and skill in diverse and complex coastal settings. Significant obstacles are also identified, limiting the generic application of models at regional and global scales. Challenges include the accurate long-term prediction of model forcing time-series in a changing climate, and accounting for processes that can largely be ignored in the short term but grow in importance over the long term. Further complications include headland bypassing, complex structures and geology, mixed grain sizes, limited sediment supply, and sediment sources and sinks.
It is concluded that, given present computational resources, data-availability limitations and process-knowledge gaps, reduced-complexity modelling approaches currently offer the most promising solution for modelling shoreline evolution on daily-to-decadal timescales.
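Equilibrium-concept shoreline models of the kind highlighted above are often variants of the Yates et al. (2009) formulation, in which the shoreline relaxes toward an energy-dependent equilibrium. The sketch below is a minimal, simplified version (it omits the square-root energy factor of the original, and all coefficients are invented):

```python
def shoreline_step(S, E, a, b, C_accrete, C_erode, dt):
    """One explicit step of a Yates-type equilibrium shoreline model:
    dS/dt = C * dE, with disequilibrium dE = E - E_eq(S) and a linear
    equilibrium energy E_eq(S) = a*S + b (a < 0). Separate rate
    coefficients for erosion (dE > 0) and accretion (dE < 0)."""
    dE = E - (a * S + b)
    C = C_erode if dE > 0 else C_accrete
    return S + C * dE * dt

# Under constant wave energy, the shoreline relaxes toward equilibrium
S, E = 0.0, 1.0
for _ in range(1000):
    S = shoreline_step(S, E, a=-0.01, b=1.1,
                       C_accrete=-0.2, C_erode=-0.5, dt=0.1)
```

With these illustrative coefficients the equilibrium position is S = 10, and the loop shows the characteristic exponential relaxation toward it; such low-cost behaviour is exactly what makes reduced-complexity models attractive for multi-decadal prediction.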
From 2014 to 2020, we compiled radiocarbon ages from the lower 48 states, creating a database of more than 100,000 archaeological, geological, and paleontological ages that will be freely available to researchers through the Canadian Archaeological Radiocarbon Database. Here, we discuss the process used to compile ages, general characteristics of the database, and lessons learned from this exercise in “big data” compilation.
The classical model for studying one-phase Hele-Shaw flows is based on a highly nonlinear moving boundary problem with the fluid velocity related to pressure gradients via a Darcy-type law. In a standard configuration with the Hele-Shaw cell made up of two flat stationary plates, the pressure is harmonic. Therefore, conformal mapping techniques and boundary integral methods can be readily applied to study the key interfacial dynamics, including the Saffman–Taylor instability and viscous fingering patterns. As well as providing a brief review of these key issues, we present a flexible numerical scheme for studying both the standard and nonstandard Hele-Shaw flows. Our method consists of using a modified finite-difference stencil in conjunction with the level-set method to solve the governing equation for pressure on complicated domains and track the location of the moving boundary. Simulations show that our method is capable of reproducing the distinctive morphological features of the Saffman–Taylor instability on a uniform computational grid. By making straightforward adjustments, we show how our scheme can easily be adapted to solve for a wide variety of nonstandard configurations, including cases where the gap between the plates is linearly tapered, the plates are separated in time, and the entire Hele-Shaw cell is rotated at a given angular velocity.
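Since the pressure in the standard Hele-Shaw configuration is harmonic, the core numerical task is a Laplace solve on the fluid domain. The sketch below uses plain Jacobi iteration with Dirichlet boundary data as a minimal stand-in; the paper's modified stencil near the interface and the level-set coupling are not reproduced here.

```python
import numpy as np

def solve_pressure(p, mask, tol=1e-6, max_iter=5000):
    """Jacobi iteration for Laplace's equation on a uniform grid.
    `mask` marks interior fluid cells; values of `p` outside the mask
    act as Dirichlet boundary data."""
    p = p.copy()
    for _ in range(max_iter):
        p_new = p.copy()
        p_new[mask] = 0.25 * (np.roll(p, 1, 0) + np.roll(p, -1, 0)
                              + np.roll(p, 1, 1) + np.roll(p, -1, 1))[mask]
        if np.max(np.abs(p_new - p)) < tol:
            return p_new
        p = p_new
    return p

# Dirichlet strip problem: p = 1 on the left edge, p = 0 on the right edge
n = 21
p = np.zeros((n, n)); p[:, 0] = 1.0
mask = np.zeros((n, n), dtype=bool); mask[:, 1:-1] = True
p = solve_pressure(p, mask)
# Darcy's law then gives the fluid velocity u = -grad(p), which advects
# the interface in the level-set formulation.
```

For this strip geometry the converged solution is the linear profile p = 1 - x, a quick sanity check before moving to the complicated domains and moving boundaries that the level-set method handles.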
To assess the incidence of treatment-resistant depression (TRD) in Scotland and the treatments currently used for it in clinical practice.
Background
Patients with major depressive disorder (MDD) who have not responded to at least two successive antidepressant (AD) treatments within a single episode are described as having treatment-resistant depression (TRD). Epidemiological data on TRD in Scotland are lacking. Furthermore, to our knowledge there are no data on the therapies prescribed in Scottish clinical practice to treat TRD.
Method
A retrospective, longitudinal cohort study was conducted using Clinical Practice Research Datalink (CPRD) medical records. Adult patients were indexed on an AD prescription between January 2011 and May 2018, with an MDD diagnosis required within 90 days of indexing, a 360-day baseline period and a minimum of 180 days of follow-up. Failure of ≥2 adequate oral AD regimens following indexing constituted TRD classification. Incidence rates of MDD and of TRD (within the MDD cohort), and treatment lines following TRD classification, were derived.
Results
The analysis included 20,059 patients with MDD (mean age 44 years, 63% female, median follow-up 59 months); 1,374 (6.8%) were classified as TRD. Median time to TRD classification was 25 months. The incidence rate of MDD was 15.9 per 1,000 patient-years and of TRD was 14.7 per 1,000 MDD-patient-years. For all first four post-TRD treatment lines, SSRI monotherapy was the most commonly prescribed therapy, followed by combination (dual/triple) therapy and augmentation therapy (at least one oral AD supplemented with lithium, an antipsychotic or an anticonvulsant therapy). At the first line of TRD treatment, 1,050 (76.4%) patients received AD monotherapy, 212 (15.4%) received combination AD therapy and 112 (8.2%) received augmentation therapy. The most common monotherapy treatments at first-line TRD were sertraline (15.6%), mirtazapine (13.8%), fluoxetine (12.2%) and venlafaxine (11.6%). Among combination therapies, mirtazapine, venlafaxine, sertraline and amitriptyline were frequently used. In the MDD and TRD cohorts, no somatic treatments were coded in CPRD, although the use of these treatments was likely underestimated.
Conclusion
Monotherapy AD treatment was the most common therapy type for all four post-TRD treatment lines. These data support the need for new treatments that can achieve and maintain therapeutic response, and avoid continuous cycling through similar AD therapies.
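The cohort definition above is essentially an algorithm over prescription records: a patient is classified as TRD once at least two adequate oral AD regimens have failed after indexing. A minimal sketch follows; the 42-day adequacy threshold and the record fields are illustrative assumptions, not the study's operational definitions.

```python
from dataclasses import dataclass

@dataclass
class Regimen:
    drug: str
    days_covered: int   # duration of continuous prescription supply
    failed: bool        # switched/augmented despite an adequate trial

def is_trd(regimens, min_days=42, min_failures=2):
    """Classify a patient as TRD once at least `min_failures` adequate
    oral AD regimens (here, >= `min_days` days of supply) have failed.
    The adequacy threshold is illustrative, not the study's."""
    failures = sum(1 for r in regimens
                   if r.days_covered >= min_days and r.failed)
    return failures >= min_failures

history = [Regimen("sertraline", 90, True),
           Regimen("mirtazapine", 21, True),   # too short to count as adequate
           Regimen("venlafaxine", 60, True)]
```

Applying the rule to `history` classifies the patient as TRD on the third regimen: the short mirtazapine trial does not count, so sertraline and venlafaxine supply the two adequate failures.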
There is substantial evidence that voters’ choices are shaped by assessments of the state of the economy and that these assessments, in turn, are influenced by the news. But how does the economic news track the welfare of different income groups in an era of rising inequality? Whose economy does the news cover? Drawing on a large new dataset of US news content, we demonstrate that the tone of the economic news strongly and disproportionately tracks the fortunes of the richest households, with little sensitivity to income changes among the non-rich. Further, we present evidence that this pro-rich bias emerges not from pro-rich journalistic preferences but, rather, from the interaction of the media’s focus on economic aggregates with structural features of the relationship between economic growth and distribution. The findings yield a novel explanation of distributionally perverse electoral patterns and demonstrate how distributional biases in the economy condition economic accountability.
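The core empirical claim above — that news tone loads on top-income growth but not on income changes among the non-rich — is the kind of pattern a regression of tone on group-specific income growth would reveal. The sketch below uses simulated data built to contain exactly that bias; it is not the authors' dataset or model specification.

```python
import numpy as np

def ols(X, y):
    """Ordinary least squares with an intercept, via numpy's lstsq."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta  # [intercept, coef_top, coef_rest]

rng = np.random.default_rng(0)
n = 200
growth_top = rng.normal(2.0, 1.0, n)    # income growth of the rich (%)
growth_rest = rng.normal(1.0, 1.0, n)   # income growth of the non-rich (%)

# Simulated news tone that responds only to top-income growth
tone = 0.8 * growth_top + 0.0 * growth_rest + rng.normal(0, 0.5, n)

beta = ols(np.column_stack([growth_top, growth_rest]), tone)
```

Recovering a large coefficient on `growth_top` and a near-zero coefficient on `growth_rest` is the regression signature of the pro-rich tracking the chapter documents in real news data.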
The surface gravity wave pattern that forms behind a steadily moving disturbance is well known to comprise divergent waves and transverse waves, contained within a distinctive V-shaped wake. In this paper, we are concerned with a theoretical study of the limit of a slow-moving disturbance (small Froude numbers) in the absence of surface tension, for which the wake is dominated by transverse waves. Three configurations are considered: flow past a submerged source singularity, a submerged doublet and a pressure distribution applied to the surface. We treat the linearised version of these problems and use the method of stationary phase and exponential asymptotics to demonstrate that the apparent wake angle is less than the classical Kelvin angle and to quantify the decrease in apparent wake angle as the Froude number decreases. These results complement a number of recent studies for sufficiently fast-moving disturbances (large Froude numbers) where the apparent wake angle has also been shown to be less than the classical Kelvin angle. As well as shedding light on the issue of apparent wake angle, we also study the fully nonlinear problems for our three configurations under various limits to demonstrate the unique and interesting features of Kelvin wake patterns at small Froude numbers.
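For reference, the classical Kelvin angle against which the apparent wake angles above are compared follows from the deep-water dispersion relation and the stationary-phase argument:

```latex
% Deep-water dispersion: the group speed is half the phase speed
\omega^2 = g k \quad\Longrightarrow\quad c_g = \tfrac{1}{2}\,c_p .
% Stationary phase over all propagation directions then confines the
% linearised wake to a wedge of half-angle
\theta_{\mathrm{Kelvin}} = \arcsin\!\left(\tfrac{1}{3}\right) \approx 19.47^\circ ,
% independent of the speed of the disturbance in the infinite-depth theory.
```

The striking point of the small-Froude-number (and, in earlier work, large-Froude-number) analyses is that the *apparent* angle of the highest waves deviates from this speed-independent classical value.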
In recent years, a variety of efforts have been made in political science to enable, encourage, or require scholars to be more open and explicit about the bases of their empirical claims and, in turn, make those claims more readily evaluable by others. While qualitative scholars have long taken an interest in making their research open, reflexive, and systematic, the recent push for overarching transparency norms and requirements has provoked serious concern within qualitative research communities and raised fundamental questions about the meaning, value, costs, and intellectual relevance of transparency for qualitative inquiry. In this Perspectives Reflection, we crystallize the central findings of a three-year deliberative process—the Qualitative Transparency Deliberations (QTD)—involving hundreds of political scientists in a broad discussion of these issues. Following an overview of the process and the key insights that emerged, we present summaries of the QTD Working Groups’ final reports. Drawing on a series of public, online conversations that unfolded at www.qualtd.net, the reports unpack transparency’s promise, practicalities, risks, and limitations in relation to different qualitative methodologies, forms of evidence, and research contexts. Taken as a whole, these reports—the full versions of which can be found in the Supplementary Materials—offer practical guidance to scholars designing and implementing qualitative research, and to editors, reviewers, and funders seeking to develop criteria of evaluation that are appropriate—as understood by relevant research communities—to the forms of inquiry being assessed. We dedicate this Reflection to the memory of our coauthor and QTD working group leader Kendra Koivu.
“Cultural Revolutions” examines the politicization of culture around 1968. From Surrealist and Situationist attempts to redefine art as a utopian-socialist enterprise, to the public scandals created by subversive avant-gardes like the Dutch Provos, to the development of popular culture into a new field of youth radicalism centered on rock ’n’ roll and new styles of dress and behavior, the chapter shows that the new politics of the 1960s were inseparable from cultural innovations. This synergistic relationship frequently involved attempts to remake the self by reshaping the face of daily life, a goal central both to new aesthetic forms like the Happening and the growth of nonconformist subcultures and countercultures aimed at erasing the distinction between the personal and the political. The creation of local underground “scenes” in which much of the political-cultural work of the 1960s was accomplished was a key expression of this tendency, while the prominence of alternative media practices in and around those scenes highlights the importance around 1968 of efforts to create alternative sources of knowledge outside the mainstream.