The Hector Galaxy Survey is a new optical integral field spectroscopy (IFS) survey currently using the Anglo-Australian Telescope to observe up to 15 000 galaxies at low redshift ($z \lt 0.1$). The Hector instrument employs 21 optical fibre bundles feeding into two double-beam spectrographs, AAOmega and the new Spector spectrograph, to enable wide-field multi-object IFS observations of galaxies. To efficiently process the survey data, we adopt the data reduction pipeline developed for the SAMI Galaxy Survey, with significant updates to accommodate Hector’s dual-spectrograph system. These enhancements address key differences in spectral resolution and other instrumental characteristics relative to SAMI and are specifically optimised for Hector’s unique configuration. We introduce a two-dimensional arc fitting approach that reduces the root-mean-square (RMS) velocity scatter by a factor of 1.2–3.4 compared to fitting arc lines independently for each fibre. The pipeline also incorporates detailed modelling of chromatic optical distortion in the wide-field corrector, to account for wavelength-dependent spatial shifts across the focal plane. We assess data quality through a series of validation tests, including wavelength solution accuracy (1.2–2.7 km s$^{-1}$ RMS), spectral resolution (FWHM of 1.2–1.4 Å for Spector), throughput characterisation, astrometric precision ($\lesssim$ 0.03 arcsec median offset), sky subtraction residuals (1–1.6% median continuum residual), and flux calibration stability (4% systematic offset when compared to Legacy Survey fluxes). We demonstrate that Hector delivers high-fidelity, science-ready datasets, supporting robust measurements of galaxy kinematics, stellar populations, and emission-line properties, and we provide illustrative examples. Additionally, we address systematic uncertainties identified during the data processing and propose future improvements to enhance the precision and reliability of upcoming data releases.
This work establishes a robust data reduction framework for Hector, delivering high-quality data products that support a broad range of extragalactic studies.
Insomnia disorder, characterized by chronic sleep disruption, often co-occurs with maladaptive emotional memory processing. However, much remains unknown regarding the evolution of emotional memories and their neural representations over time among individuals with insomnia disorder.
Method
We examined the electroencephalographic (EEG) activities during emotional memory encoding, post-encoding sleep, and multiple retrieval phases – including immediate post-encoding, post-sleep, and a 7-day delayed retrieval – among 34 participants with insomnia disorder and 35 healthy control participants.
Results
Healthy controls exhibited adaptive dissipation of emotional memory: memory declined over time, accompanied by reduced subjective feelings toward negative memories. In contrast, participants with insomnia exhibited impaired dissipation: they retained both the emotional content and affective tone of the memories, with diminished time-dependent declines in memory and affect. Beyond behavioral performance, only participants with insomnia maintained stable neural representations of emotion over time, a pattern absent in healthy controls. Additionally, during post-encoding sleep, slow-wave sleep (SWS) and rapid eye movement (REM) sleep durations predicted the adaptive dissipation of emotional memory over time, but only among healthy participants.
Conclusion
These findings highlight abnormalities in emotional memory processing among individuals with insomnia disorder and underscore the important function of SWS and REM sleep in facilitating adaptive emotional memory processing.
Serious incident management and organisational learning are international patient safety priorities. Little is known about the quality of suicide investigations and, in turn, the potential for organisational learning. Suicide risk assessment is acknowledged as a complex phenomenon, particularly in the context of adult community mental health services. Root cause analysis (RCA) is the dominant investigative approach, although the evidence base underpinning RCA is contested, with little attention paid to the patient in context and their cumulative risk over time.
Results
Recent literature proposes a Safety-II approach in response to the limitations of RCA. The importance of applying these approaches within a mental healthcare system that advocates a zero suicide framework, grounded in a restorative just culture, is highlighted.
Clinical implications
Although integrative reviews and syntheses have clear methodological limitations, this approach facilitates the management of a disparate body of work to advance a critical understanding of patient safety in adult community mental healthcare.
No attempt to evaluate the longue durée of human settlement can ignore the environment as both a formative influence and as a cultural artefact. The environmental programme of the project collected data to complement the regional geomorphological and palynological record on patterns of landscape change in response to climate change and the influence of human activities. The geomorphological fieldwork focused on the catchment of the Marta river that flows from Lake Bolsena past Tuscania to the Tyrrhenian sea near Tarquinia. The Late Glacial environment c.15,000 years ago consisted of a steppe landscape. After a sedimentary hiatus in the Early and Mid Holocene, sediments started to be laid down again in the Later Etruscan period c.500-300 BC, reflecting the extensive nature of Etruscan agriculture. Significant human impacts began in the Roman Republican period. Then and during the Early/Mid Imperial periods the Marta and other rivers in the area were unstable braided and wandering gravel-bedded rivers quite unlike the modern rivers. Their dynamism largely reflected a colder wetter climate than today but also woodland clearance and increased arable cultivation. This combination pre-conditioned the landscape’s sensitivity to alluviation in the Late Medieval and Post Medieval periods.
The n-3 index, the erythrocyte proportion of the EPA + DHA fatty acids, is a clinical marker of age-related disease risk. It is unclear whether regular intake of α-linolenic acid (ALA), a plant-derived n-3 polyunsaturated fatty acid, raises the n-3 index in older adults. Of the 356 participants at the Loma Linda, CA centre from the original study, a randomly selected subset (n 192) was included for this secondary analysis (mostly Caucasian women, mean age 69 years). Participants were assigned to either the walnut (15 % of daily energy from walnuts) or the control group (usual diet, no walnuts) for 2 years. Erythrocyte fatty acids were determined at baseline and 1-year following intervention. No differences were observed for erythrocyte EPA, but erythrocyte DHA decreased albeit modestly in the walnut group (−0·125 %) and slightly improved in the control group (0·17 %). The change in n-3 index between the walnut and control groups was significantly different only among fish consumers (those who ate fish ≥ once/month). Longitudinal analyses combining both groups showed a significant inverse association between the 1-year changes of the n-3 index and fasting plasma TAG (β = −10), total cholesterol (β = −5·59) and plasma glucose (β = −0·27). Consuming ALA-rich walnuts failed to improve the n-3 index in older adults. A direct source of EPA/DHA may be needed to achieve a desirable n-3 index, as it is inversely associated with cardiometabolic risk. Nevertheless, incorporating walnuts as part of heart healthy diets is still encouraged.
This study aimed to investigate general factors associated with prognosis regardless of the type of treatment received, for adults with depression in primary care.
Methods
We searched Medline, Embase, PsycINFO and Cochrane Central (inception to 12/01/2020) for primary care depression RCTs that included the Revised Clinical Interview Schedule (CIS-R), the most commonly used comprehensive measure of depressive and anxiety disorder symptoms and diagnoses. Two-stage random-effects meta-analyses were conducted.
Results
Twelve (n = 6024) of thirteen eligible studies (n = 6175) provided individual patient data. There was a 31% (95% CI 25 to 37) difference in depressive symptoms at 3–4 months per standard deviation increase in baseline depressive symptoms. Four additional factors (duration of anxiety, duration of depression, comorbid panic disorder, and a history of antidepressant treatment) were also independently associated with poorer prognosis. There was evidence that the difference in prognosis when these factors were combined could be of clinical importance. Adding these variables improved the amount of variance explained in 3–4 month depressive symptoms from 16% using depressive symptom severity alone to 27%. Risk of bias (assessed with QUIPS) was low in all studies and quality (assessed with GRADE) was high. Sensitivity analyses did not alter our conclusions.
Conclusions
When adults seek treatment for depression clinicians should routinely assess for the duration of anxiety, duration of depression, comorbid panic disorder, and a history of antidepressant treatment alongside depressive symptom severity. This could provide clinicians and patients with useful and desired information to elucidate prognosis and aid the clinical management of depression.
Agitated patients account for 10% of all emergency psychiatric treatment. Management guidelines and clinicians’ preferred treatments differ in both opinion and practice. In Lebanon, the triple therapy of haloperidol plus promethazine plus chlorpromazine (HPC) is frequently used, but no studies involving this combination exist.
Method
A pragmatic randomised open trial (September 2018–July 2019) was conducted in the Lebanese Psychiatric Hospital of the Cross in Beirut, Lebanon. One hundred people requiring urgent intramuscular sedation due to aggressive behaviour were given intramuscular chlorpromazine 100 mg plus haloperidol 5 mg plus promethazine 25 mg (HPC) or intramuscular haloperidol 5 mg plus promethazine 25 mg.
Results
Primary outcome data were available for 94 (94%) people. People allocated to the haloperidol plus promethazine (HP) group showed no clear difference at 20 min compared with patients allocated to the HPC group [relative risk (RR) 0.84, 95% confidence interval (CI) 0.47–1.50].
Conclusions
Neither intervention consistently affected the outcomes of ‘calm’ or ‘asleep’, and neither had a discernible effect on the use of restraints, the use of additional drugs, or recurrence. For clinicians faced with uncertainty over which of the two combinations to use, the simpler HP is much more widely tested, and the addition of chlorpromazine offers no clear benefit while carrying a risk of additional adverse effects.
Intentional facial disfigurement is documented in archaeological contexts around the world. Here, the authors present the first archaeological evidence for intentional facial mutilation from Anglo-Saxon England—comprising the removal of the nose, upper lip and possible scalping—inflicted upon a young adult female. The injuries are consistent with documented punishments for female offenders. Although such mutilations do not appear in the written record until the tenth century AD, the instance reported here suggests that the practice may have emerged a century earlier. This case is examined in the context of a wider consideration of the motivations and significance of facial disfigurement in past societies.
Social jetlag (SJ) occurs when sleep-timing irregularities from social or occupational demands conflict with endogenous sleep–wake rhythms. SJ is associated with evening chronotype and poor mental health, but mechanisms supporting this link remain unknown. Impaired ability to retrieve extinction memory is an emotion regulatory deficit observed in some psychiatric illnesses. Thus, SJ-dependent extinction memory deficits may provide a mechanism for poor mental health. To test this, healthy male college students completed 7–9 nights of actigraphy, sleep questionnaires, and a fear conditioning and extinction protocol. As expected, greater SJ, but not total sleep time discrepancy, was associated with poorer extinction memory. Unexpectedly, greater SJ was associated with a tendency toward morning rather than evening chronotype. These findings suggest that deficient extinction memory represents a potential mechanism linking SJ to psychopathology and that SJ is particularly problematic for college students with a greater tendency toward a morning chronotype.
This study examined mental health status among Hurricane Sandy survivors in the most severely damaged areas of New York and New Jersey in 2014, approximately 2 years after this disaster. We used the 2014 Associated Press NORC survey of 1009 Sandy survivors to measure the prevalence of probable mental illness and to analyze its association with selected socioeconomic characteristics of survivors, direct impact by Sandy, as well as social support and social trust. The study found major disparities in mental illness by race/ethnicity, age groups, and employment status. Higher Sandy impact levels were strongly associated with higher rates of mental illness and accounted for much of the disparity between blacks and Hispanics compared with whites in our study group. Social support was more strongly associated with lower rates of mental illness than was social trust. In addition, social support served as a significant mitigating factor in the mental health disparities between blacks and whites. The severity of mental illness among Sandy survivors differed significantly among racial and ethnic groups but was moderated by both the direct impact of this disaster on their lives and the degree of social support they received, as well as how trusting they were.
Studies suggest that alcohol consumption and alcohol use disorders have distinct genetic backgrounds.
Methods
We examined whether polygenic risk scores (PRS) for consumption and problem subscales of the Alcohol Use Disorders Identification Test (AUDIT-C, AUDIT-P) in the UK Biobank (UKB; N = 121 630) correlate with alcohol outcomes in four independent samples: an ascertained cohort, the Collaborative Study on the Genetics of Alcoholism (COGA; N = 6850), and population-based cohorts: Avon Longitudinal Study of Parents and Children (ALSPAC; N = 5911), Generation Scotland (GS; N = 17 461), and an independent subset of UKB (N = 245 947). Regression models and survival analyses tested whether the PRS were associated with the alcohol-related outcomes.
Results
In COGA, AUDIT-P PRS was associated with alcohol dependence, AUD symptom count, maximum drinks (R2 = 0.47–0.68%, p = 2.0 × 10−8–1.0 × 10−10), and increased likelihood of onset of alcohol dependence (hazard ratio = 1.15, p = 4.7 × 10−8); AUDIT-C PRS was not an independent predictor of any phenotype. In ALSPAC, the AUDIT-C PRS was associated with alcohol dependence (R2 = 0.96%, p = 4.8 × 10−6). In GS, AUDIT-C PRS was a better predictor of weekly alcohol use (R2 = 0.27%, p = 5.5 × 10−11), while AUDIT-P PRS was more associated with problem drinking (R2 = 0.40%, p = 9.0 × 10−7). Lastly, AUDIT-P PRS was associated with ICD-based alcohol-related disorders in the UKB subset (R2 = 0.18%, p < 2.0 × 10−16).
Conclusions
AUDIT-P PRS was associated with a range of alcohol-related phenotypes across population-based and ascertained cohorts, while AUDIT-C PRS showed less utility in the ascertained cohort. We show that AUDIT-P is genetically correlated with both use and misuse and demonstrate the influence of ascertainment schemes on PRS analyses.
Despite established clinical associations among major depression (MD), alcohol dependence (AD), and alcohol consumption (AC), the nature of the causal relationship between them is not completely understood. We leveraged genome-wide data from the Psychiatric Genomics Consortium (PGC) and UK Biobank to test for the presence of shared genetic mechanisms and causal relationships among MD, AD, and AC.
Methods
Linkage disequilibrium score regression and Mendelian randomization (MR) were performed using genome-wide data from the PGC (MD: 135 458 cases and 344 901 controls; AD: 10 206 cases and 28 480 controls) and UK Biobank (AC-frequency: 438 308 individuals; AC-quantity: 307 098 individuals).
Results
Positive genetic correlation was observed between MD and AD (rgMD−AD = + 0.47, P = 6.6 × 10−10). AC-quantity showed positive genetic correlation with both AD (rgAD−AC quantity = + 0.75, P = 1.8 × 10−14) and MD (rgMD−AC quantity = + 0.14, P = 2.9 × 10−7), while there was negative correlation of AC-frequency with MD (rgMD−AC frequency = −0.17, P = 1.5 × 10−10) and a non-significant result with AD. MR analyses confirmed the presence of pleiotropy among these four traits. However, the MD-AD results reflect a mediated-pleiotropy mechanism (i.e. causal relationship) with an effect of MD on AD (beta = 0.28, P = 1.29 × 10−6). There was no evidence for reverse causation.
Conclusion
This study supports a causal role for genetic liability of MD on AD based on genetic datasets including thousands of individuals. Understanding mechanisms underlying MD-AD comorbidity addresses important public health concerns and has the potential to facilitate prevention and intervention efforts.
In 2017, Public Health England South East Health Protection Team (HPT) were involved in the management of an outbreak of Mycobacterium bovis (the causative agent of bovine tuberculosis) in a pack of working foxhounds. This paper summarises the actions taken by the team in managing the public health aspects of the outbreak, and lessons learned to improve the management of future potential outbreaks. A literature search was conducted to identify relevant publications on M. bovis. Clinical notes from the Public Health England (PHE) health protection database were reviewed and key points extracted. Animal and public health stakeholders involved in the management of the situation provided further evidence through unstructured interviews and personal communications. The PHE South East team initially provided ‘inform and advise’ letters to human contacts whilst awaiting laboratory confirmation to identify the infectious agent. Once M. bovis had been confirmed in the hounds, an in-depth risk assessment was conducted, and contacts were stratified into risk pools. Eleven of the 20 exposed persons with the greatest risk of exposure were recommended to attend TB screening; one tested positive but had no evidence of active TB infection. The number of human contacts working with foxhound packs can be large and varied. HPTs should undertake a comprehensive risk assessment of all potential routes of exposure, involve all other relevant stakeholders from an early stage and undertake regular risk assessments. Current guidance should be revised to account for the unique risks to human health posed by exposure to infected working dogs.
Invasion by cheatgrass and the associated high fire frequency can displace native plant communities from a perennial to an annual grass driven system. The overall objective of this study was to determine the potential to favor desired native perennial bunchgrasses over annual grasses by altering plant available mineral nitrogen (N). In the first study, we grew cheatgrass and three native bunchgrasses (combined in equal proportions) in an addition series experimental design and applied one of three N treatments (0, 137, and 280 mg N/kg soil). Regression models were used to derive the effects of intra- and interspecific competition on individual plant yield of cheatgrass and the native bunchgrasses (combined). In our second study, we compared the absolute growth rate of the four plant species grown in isolation in a randomized complete block design for 109 days under the same soil N treatments as the competition study. Predicted mean average weight of isolated individuals increased with increasing soil N concentrations for both cheatgrass and the three native perennials (P < 0.05). Biomass of cheatgrass and its competitive ability increased with increasing soil N concentrations (P < 0.0001) compared to the combined native bunchgrasses. However, the greatest resource partitioning occurred at the 137 mg N/kg soil N treatment compared to the 0 (control) and 280 mg N/kg soil treatments, suggesting there may be a level of N that minimizes competition. In the second study, the absolute growth of cheatgrass grown in isolation also increased with increasing N levels (P = 0.0297). Results and ecological implications of this study suggest that increasing soil N leads to greater competitive ability of cheatgrass, and that it may be possible to favor desired plant communities by modifying soil nutrient levels.
Rangeland health assessment provides qualitative information on ecosystem attributes. Successional management is a conceptual framework that allows managers to link information gathered in rangeland health assessment to ecological processes that need to be repaired to allow vegetation to change in a favorable direction. The objective of this paper is to detail how these two endeavors can be integrated to form a holistic vegetation management framework. The Rangeland Health Assessment procedures described by Pyke et al. (2002) and Pellant et al. (2005) currently are being adopted by land managers across the western United States. Seventeen standard indicators were selected to represent various ecological aspects of ecosystem health. Each of the indicators is rated from extreme to no (slight) departure from the Ecological Site Description and/or the Reference Area(s). Successional management identifies three general drivers of plant community change: site availability, species availability, and species performance, as well as specific ecological processes influencing these drivers. In this paper, we propose and provide examples of a method to link the information collected in rangeland health assessment to the successional management framework. Thus, this method not only allows managers to quantify a point-in-time indication of rangeland health but also allows managers to use this information to decide how various management options might influence vegetation trajectories. We argue that integrating the Rangeland Health Assessment with Successional Management enhances the usefulness of both systems and provides synergistic value to the decision-making process.
Invasion by annual grasses, such as cheatgrass, into the western U.S. sagebrush-steppe is a major concern of ecologists and resource managers. Maintaining or improving ecosystem health depends on our ability to protect or re-establish functioning, desired plant communities. In frequently disturbed ecosystems, nutrient status and the relative ability of species to acquire nutrients are important drivers of invasion, retrogression, and succession. Thus, these processes can potentially be modified to direct plant community dynamics toward a desired plant community. The overall objective of this review paper is to provide the ecological background of invasion by exotic plants and propose a concept to facilitate the use of soil nitrogen (N) management to achieve desired plant communities that resist invasion. Based on the literature, we propose a model that predicts the outcome of community dynamics based on N availability. The model predicts that at low N levels, native mid- and late-seral species are able to successfully out-compete early-seral and invasive annual species up to some optimal level. However, at some increased level of N, early-seral species and invasive annual grasses are able to grow and reproduce more successfully than native mid- and late-seral species. At the high end of N availability to plants, the community is most susceptible to invasion and ultimately, increased fire frequency. Soil N level can be managed by altering microbial communities, grazing, mowing, and using cover crops and bridge species during restoration. In these cases, management may be more sustainable since the underlying cause of invasion and succession is modified in the management process.