Antarctic ice-free coastal environments, like the Vestfold Hills (East Antarctica), are shaped by a complex interplay of physical processes. This study synthesizes new data and existing research from the Vestfold Hills across marine, terrestrial and cryosphere science, meteorology, geomorphology, coastal oceanography and hydrology to explore interconnected processes ranging from icescape morphology and sediment transport to ocean-floor scouring and ocean-atmosphere interactions. Coastal landforms and habitats result from the interaction of marine dynamics with the aeolian and fluvial transport of glacially derived sediments and geomorphic features. Rocky shorelines dominate the region, and extensive fjords are prominent coastal features, whereas intertidal sediments and beaches are scarce. The marine environment is characterized by slow currents, low-energy waves, annually variable land-fast ice, irregular sedimentation rates and a geomorphologically complex shoreline. Aeolian and fluvial sediment deposition into coastal waters and onto sea ice can significantly impact local ecological and physical processes. Human activity further modifies these dynamics. Ice-free coastal areas such as the Vestfold Hills are predicted to experience substantial environmental shifts due to climate change. Wind speeds, temperature and precipitation are increasing in the Vestfold Hills. Retreating grounded ice sheets are likely to expand this coastal area and increase meltwater and sediment inputs into nearshore marine systems. Concurrently, changes in sea-ice extent, thickness and/or duration may profoundly alter the structure and function of this coastal environment.
Anhedonia is a common and impairing symptom of psychopathology that predicts negative outcomes and may undermine peer relationships. Anhedonia comprises both trait (stable, time-invariant) and state (dynamic, time-varying) components. Relative to trait anhedonia, state anhedonia may be more strongly related to proximal risk for deleterious outcomes. Yet, associations between state anhedonia and daily-life socio-affective experiences in adolescence are not well understood. Thus, the present study used ecological momentary assessment (EMA) to examine within-person associations between state anhedonia and the quantity and quality of daily-life peer interactions in a sample of adolescents enriched for suicidality risk, a population at high risk for both anhedonia and peer problems. Participants included 102 adolescents assigned female at birth (ages 12–18; M[SD] = 15.34[1.50]; 67.6% at elevated risk for suicidality). State anhedonia, as well as being with peers, connectedness with peers, and positive affect with peers, was measured three times per day for 10 days via EMA (n = 30 prompts). Multilevel models demonstrated that within-person fluctuations in state anhedonia were related to reduced odds of being with peers, as well as decreased connectedness and positive affect with peers. Findings suggest that dynamic changes in state anhedonia are related to both the quantity and quality of peer experiences among adolescents.
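Within-person associations of the kind reported above are typically estimated by person-mean centering each EMA rating, so the predictor captures momentary deviation from a person's own average rather than stable between-person differences. A minimal sketch of that preprocessing step (illustrative only, not the study's code):

```python
def person_mean_center(ratings_by_person):
    """Split each person's EMA ratings into a person mean (between-person
    component) and momentary deviations from it (within-person component)."""
    centered = {}
    for person, ratings in ratings_by_person.items():
        mean = sum(ratings) / len(ratings)
        centered[person] = {
            "person_mean": mean,                       # between-person term
            "deviations": [r - mean for r in ratings], # within-person term
        }
    return centered

# Toy data: two participants' state-anhedonia ratings across prompts
data = {"p1": [2, 4, 6], "p2": [1, 1, 4]}
out = person_mean_center(data)
```

In a multilevel model, the `deviations` would enter as the level-1 (within-person) predictor and `person_mean` as a level-2 (between-person) covariate.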
Antipsychotic (AP) medication in individuals at clinical high risk for psychosis (CHR-P) is not routinely recommended by clinical guidelines but is commonly prescribed. Since little is known about the predictors of AP inception in CHR-P, we analyzed data from two observational cohorts.
Methods
To avoid baseline predictors being confounded by previous treatment, participants were selected for analysis from the 764 participants at CHR-P enrolled in NAPLS-2 and the 710 enrolled in NAPLS-3 by excluding those with lifetime histories of AP use. Baseline clinical variables available in both studies were employed as predictors of subsequent AP inception over the next 6 months in univariable and multivariable analyses.
Results
Preliminary analyses indicated no important effects of sample. The final combined population included 79 AP inception participants and 580 participants who did not have AP inception. The AP medications most commonly prescribed were risperidone, aripiprazole, and quetiapine. Univariable analyses identified seven significant predictors of AP inception. The final logistic regression model including these variables was highly significant (χ2 = 36.53, df = 7, p < 0.001). Three variables (current major depression, fewer years of education, and current benzodiazepine use) emerged as significant independent predictors of AP inception.
Conclusion
This study is the first to determine baseline characteristics that predict subsequent AP initiation in CHR-P. Some AP use in CHR-P appears to be intended as augmentation of antidepressant treatment for comorbid major depression. Some prescribers may not have detected the attenuated positive symptoms characteristic of CHR-P since their severity did not significantly predict AP inception.
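Univariable screening of binary baseline predictors, as described above, is commonly done with 2×2 odds ratios before candidate variables enter a multivariable logistic model. A generic sketch of that screening step (not the study's actual code; it assumes no empty cells):

```python
from math import exp, log, sqrt

def univariable_screen(exposed_events, exposed_n, unexposed_events, unexposed_n):
    """Odds ratio with 95% CI and a 1-df Pearson chi-square statistic for one
    binary baseline predictor of a binary outcome (illustrative only)."""
    a = exposed_events                  # predictor present, outcome occurred
    b = exposed_n - exposed_events      # predictor present, no outcome
    c = unexposed_events                # predictor absent, outcome occurred
    d = unexposed_n - unexposed_events  # predictor absent, no outcome
    or_ = (a * d) / (b * c)
    se = sqrt(1/a + 1/b + 1/c + 1/d)    # SE of the log odds ratio
    ci = (exp(log(or_) - 1.96 * se), exp(log(or_) + 1.96 * se))
    n = a + b + c + d
    chi2 = n * (a*d - b*c)**2 / ((a+b) * (c+d) * (a+c) * (b+d))
    return or_, ci, chi2
```

For example, `univariable_screen(20, 50, 10, 50)` gives an odds ratio of about 2.67 with χ² ≈ 4.76; predictors passing such a screen would then be fitted jointly in the multivariable model.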
Genetic research on nicotine dependence has utilized multiple assessments that are in weak agreement.
Methods
We conducted a genome-wide association study (GWAS) of nicotine dependence defined using the Diagnostic and Statistical Manual of Mental Disorders (DSM-NicDep) in 61,861 individuals (47,884 of European ancestry [EUR], 10,231 of African ancestry, and 3,746 of East Asian ancestry) and compared the results to other nicotine-related phenotypes.
Results
We replicated the well-known association at the CHRNA5 locus (lead single-nucleotide polymorphism [SNP]: rs147144681, p = 1.27e−11 in EUR; lead SNP: rs2036527, p = 6.49e−13 in the cross-ancestry analysis). DSM-NicDep showed strong positive genetic correlations with cannabis use disorder, opioid use disorder, problematic alcohol use, lung cancer, material deprivation, and several psychiatric disorders, and negative correlations with respiratory function and educational attainment. A polygenic score of DSM-NicDep predicted DSM-5 tobacco use disorder criterion count and all 11 individual diagnostic criteria in the independent National Epidemiologic Survey on Alcohol and Related Conditions-III sample. In genomic structural equation models, DSM-NicDep loaded more strongly on a previously identified factor of general addiction liability than on a “problematic tobacco use” factor (a combination of cigarettes per day and nicotine dependence defined by the Fagerström Test for Nicotine Dependence). Finally, DSM-NicDep showed a strong genetic correlation with a GWAS of tobacco use disorder as defined in electronic health records (EHRs).
Conclusions
Our results suggest that combining the wide availability of diagnostic EHR data with nuanced criterion-level analyses of DSM tobacco use disorder may produce new insights into the genetics of this disorder.
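The polygenic-score prediction mentioned above reduces, in its simplest form, to a weighted sum of risk-allele dosages, with per-variant weights taken from the discovery GWAS. A hedged sketch with entirely hypothetical SNP IDs and effect weights (a real score would use thousands of genome-wide variants after clumping/thresholding or a comparable method):

```python
def polygenic_score(dosages, weights):
    """Weighted sum of risk-allele dosages (0, 1, or 2 copies per SNP),
    using per-SNP effect sizes (e.g. log odds ratios) from a discovery GWAS."""
    assert dosages.keys() == weights.keys()
    return sum(dosages[snp] * weights[snp] for snp in dosages)

# Hypothetical SNPs and effect sizes, for illustration only
weights = {"rs0000001": 0.12, "rs0000002": -0.05, "rs0000003": 0.08}
person  = {"rs0000001": 2,    "rs0000002": 1,     "rs0000003": 0}
score = polygenic_score(person, weights)  # 2*0.12 + 1*(-0.05) + 0*0.08
```

Scores computed this way in an independent target sample (here, NESARC-III) can then be regressed on criterion counts or individual diagnostic criteria.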
Patients with posttraumatic stress disorder (PTSD) exhibit smaller regional brain volumes in commonly reported regions including the amygdala and hippocampus, regions associated with fear and memory processing. In the current study, we have conducted a voxel-based morphometry (VBM) meta-analysis using whole-brain statistical maps with neuroimaging data from the ENIGMA-PGC PTSD working group.
Methods
T1-weighted structural neuroimaging scans from 36 cohorts (PTSD n = 1309; controls n = 2198) were processed using a standardized VBM pipeline (ENIGMA-VBM tool). We meta-analyzed the resulting statistical maps for voxel-wise differences in gray matter (GM) and white matter (WM) volumes between PTSD patients and controls, performed subgroup analyses considering the trauma exposure of the controls, and examined associations between regional brain volumes and clinical variables including PTSD (CAPS-4/5, PCL-5) and depression severity (BDI-II, PHQ-9).
Results
PTSD patients exhibited smaller GM volumes across the frontal and temporal lobes and cerebellum, with the most significant effect in the left cerebellum (Hedges’ g = 0.22, corrected p = .001), and smaller cerebellar WM volume (peak Hedges’ g = 0.14, corrected p = .008). We observed similar regional differences when comparing patients to trauma-exposed controls, suggesting these structural abnormalities may be specific to PTSD. Regression analyses revealed that PTSD severity was negatively associated with GM volumes within the cerebellum (corrected p = .003), while depression severity was negatively associated with GM volumes within the cerebellum and superior frontal gyrus in patients (corrected p = .001).
Conclusions
PTSD patients exhibited widespread regional differences in brain volumes, with greater regional deficits appearing to reflect more severe symptoms. Our findings add to the growing literature implicating the cerebellum in PTSD psychopathology.
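Effect sizes of the kind reported above (Hedges' g) are bias-corrected standardized mean differences between patients and controls. A minimal sketch of the computation, using illustrative values rather than the study's data:

```python
from math import sqrt

def hedges_g(m1, s1, n1, m2, s2, n2):
    """Bias-corrected standardized mean difference (Hedges' g) between two
    groups with means m, standard deviations s, and sizes n."""
    df = n1 + n2 - 2
    s_pooled = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / df)
    j = 1 - 3 / (4 * df - 1)  # small-sample correction factor
    return j * (m1 - m2) / s_pooled
```

For instance, `hedges_g(10, 2, 50, 9, 2, 50)` returns ≈ 0.496; in a voxel-wise meta-analysis, such values are computed per voxel and pooled across cohorts.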
Background: Nipocalimab is a human IgG1 monoclonal antibody targeting FcRn that selectively reduces IgG levels without impacting antigen presentation or T- and B-cell functions. This study describes the effect of nipocalimab on vaccine response. Methods: This open-label, parallel, interventional study randomized participants 1:1 to receive intravenous nipocalimab at 30 mg/kg at Week 0 and 15 mg/kg at Weeks 2 and 4 (active) or no drug (control). On Day 3, participants received Tdap and PPSV®23 vaccinations and were followed through Week 16. Results: Twenty-nine participants completed the study and are included (active, n = 15; control, n = 14). The proportion of participants with a positive anti-tetanus IgG response was comparable between groups at Weeks 2 and 16 but lower at Week 4 (nipocalimab 3/15 [20%] vs control 7/14 [50%]; P = 0.089). All participants maintained anti-tetanus IgG above the protective threshold (0.16 IU/mL) through Week 16. While anti-pneumococcal-capsular-polysaccharide (PCP) IgG levels were lower during nipocalimab treatment, the percent increase from baseline at Weeks 2 and 16 was comparable between groups. Post-vaccination, anti-PCP IgG remained above 50 mg/L and showed a 2-fold increase from baseline throughout the study in both groups. Nipocalimab co-administration with vaccines was safe and well tolerated. Conclusions: These findings suggest that nipocalimab does not impact the development of an adequate IgG response to T-cell–dependent/independent vaccines and that nipocalimab-treated patients can follow recommended vaccination schedules.
Recent changes to US research funding are having far-reaching consequences that imperil the integrity of science and the provision of care to vulnerable populations. Resisting these changes, the BJPsych Portfolio reaffirms its commitment to publishing mental science and advancing psychiatric knowledge that improves the mental health of one and all.
Metabolic dysfunction-associated fatty liver disease (MAFLD) is the most common liver disease globally, affecting 1 in 3 Australian adults and up to 39% of people in rural communities(1). Behaviour changes targeting diet and physical activity to achieve weight loss are considered the cornerstones of MAFLD management. A Mediterranean diet (MedDiet) rich in wholegrains, vegetables, fruits, fish, olives, raw nuts and seeds is recommended in key global guidelines as the optimal dietary pattern for MAFLD(2). Additionally, research evidence indicates moderate-intensity aerobic exercise is effective in reducing liver fat and improving cardiometabolic health(3). Given the higher rates of MAFLD in rural communities and their limited access to healthcare services, digital health interventions present a valuable opportunity to improve the accessibility, availability and personalisation of healthcare services to address this important unmet need. However, no digital interventions addressing health risk behaviours in MAFLD, including diet and physical activity, are currently available. This research aimed to use best practice co-design methodology to develop a web-based healthy living intervention for people with MAFLD. An iterative co-design process using the Double Diamond Framework, comprising four key phases, was undertaken over 12 months. Twenty-seven adults (≥ 18 years) were recruited from The Alfred Hospital, Australia, including people with MAFLD (n = 10; 50% female; mean age: 63.6 years) and healthcare professionals (HCPs) (n = 17; 59% female; mean age: 37.1 years) [dietitians (n = 5), exercise professionals (n = 6), and clinicians/hepatologists (n = 6)]. Phase 1–discover. Barriers and facilitators were explored through semi-structured interviews to understand the needs of the target population regarding accessibility, appearance, resources and application of the web-based intervention.
Interviews were virtual, conducted one-on-one via Zoom™, transcribed, and inductively analysed using NVivo. Phase 2–define. A reflexive thematic analysis identified five key themes within the data: i) web-based functionality, navigation and formatting, ii) holistic behaviour change including MedDiet and physical activity, iii) digital health accessibility, iv) knowledge and resources, and v) intervention duration and reminders. Phase 3–develop. The knowledge gained from this process led to the development of the web-based intervention, taking into consideration expressed preferences for features that can enhance knowledge about the condition, offer dietary and physical activity support via targeted resources and videos, and increase engagement via a chat group and frequent reminders. Phase 4–deliver. The co-design process has led to the development of a web-based healthy living intervention that will be further evaluated for feasibility and implementation in a pilot trial. The resulting intervention aims to achieve behavioural change and promote healthier living amongst Australians with MAFLD. This knowledge has the potential to drive strategies to reduce barriers to accessing healthcare remotely, making the web-based intervention a valuable tool for both patients and professionals.
Objectives/Goals: We describe the prevalence of individuals with household exposure to SARS-CoV-2 who subsequently report symptoms consistent with COVID-19 while having PCR results persistently negative for SARS-CoV-2 (S[+]/P[-]). We assess whether paired serology can assist in identifying the true infection status of such individuals. Methods/Study Population: In a multicenter household transmission study, index patients with SARS-CoV-2 were identified and enrolled together with their household contacts within 1 week of the index's illness onset. For 10 consecutive days, enrolled individuals provided daily symptom diaries and nasal specimens for polymerase chain reaction (PCR). Contacts were categorized into 4 groups based on presence of symptoms (S[+/-]) and PCR positivity (P[+/-]). Acute and convalescent blood specimens from these individuals (30 days apart) were subjected to quantitative serologic analysis for SARS-CoV-2 anti-nucleocapsid, spike, and receptor-binding domain antibodies. The antibody change in S[+]/P[-] individuals was assessed by thresholds derived from receiver operating characteristic (ROC) analysis of S[+]/P[+] (infected) versus S[-]/P[-] (uninfected) individuals. Results/Anticipated Results: Among 1,433 contacts, 67% had ≥1 SARS-CoV-2 PCR[+] result, while 33% remained PCR[-]. Among the latter, 55% (n = 263) reported symptoms for at least 1 day, most commonly congestion (63%), fatigue (63%), headache (62%), cough (59%), and sore throat (50%). A history of both previous infection and vaccination was present in 37% of S[+]/P[-] individuals, 38% of S[-]/P[-], and 21% of S[+]/P[+] (P<0.05). Vaccination alone was present in 37%, 41%, and 52%, respectively. ROC analyses of paired serologic testing of S[+]/P[+] (n = 354) vs. S[-]/P[-] (n = 103) individuals found anti-nucleocapsid data had the highest area under the curve (0.87).
Based on the 30-day antibody change, 6.9% of S[+]/P[-] individuals demonstrated an increased convalescent antibody signal, although a similar seroresponse in 7.8% of the S[-]/P[-] group was observed. Discussion/Significance of Impact: Reporting respiratory symptoms was common among household contacts with persistent PCR[-] results. Paired serology analyses found similar seroresponses between S[+]/P[-] and S[-]/P[-] individuals. The symptomatic-but-PCR-negative phenomenon, while frequent, is unlikely attributable to true SARS-CoV-2 infections that go missed by PCR.
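The ROC step described above (deriving a seroconversion threshold from antibody changes in S[+]/P[+] vs. S[-]/P[-] individuals) can be sketched in a few lines. This is a generic illustration of AUC and Youden's J cutoff selection, not the study's pipeline:

```python
def roc_auc_and_threshold(pos, neg):
    """AUC via pairwise comparison (Mann-Whitney formulation) and the
    cutoff maximizing Youden's J = sensitivity + specificity - 1."""
    # AUC: probability a random positive scores above a random negative
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    auc = wins / (len(pos) * len(neg))
    best_j, best_cut = -1.0, None
    for cut in sorted(set(pos + neg)):
        sens = sum(p >= cut for p in pos) / len(pos)  # true positive rate
        spec = sum(n < cut for n in neg) / len(neg)   # true negative rate
        if sens + spec - 1 > best_j:
            best_j, best_cut = sens + spec - 1, cut
    return auc, best_cut
```

For example, `roc_auc_and_threshold([3, 4, 5, 6], [1, 2, 3, 4])` returns an AUC of 0.875 and a cutoff of 3; an analogous computation over 30-day anti-nucleocapsid changes underlies the 0.87 AUC reported above.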
Duchenne muscular dystrophy is a devastating neuromuscular disorder characterized by the loss of dystrophin, inevitably leading to cardiomyopathy. Despite publications on prophylaxis and treatment with cardiac medications to mitigate cardiomyopathy progression, gaps remain in the specifics of medication initiation and optimization.
Method:
This document is an expert opinion statement, addressing a critical gap in cardiac care for Duchenne muscular dystrophy. It provides thorough recommendations for the initiation and titration of cardiac medications based on disease progression and patient response. Recommendations are derived from the expertise of the Advanced Cardiac Therapies Improving Outcomes Network and are informed by established guidelines from the American Heart Association, American College of Cardiology, and Duchenne Muscular Dystrophy Care Considerations. These expert-derived recommendations aim to navigate the complexities of Duchenne muscular dystrophy-related cardiac care.
Results:
Comprehensive recommendations for initiation, titration, and optimization of critical cardiac medications are provided to address Duchenne muscular dystrophy-associated cardiomyopathy.
Discussion:
The management of Duchenne muscular dystrophy requires a multidisciplinary approach. However, the diversity of healthcare providers involved in Duchenne muscular dystrophy care can result in variations in cardiac management, complicating treatment standardization and affecting patient outcomes. The aim of this report is to provide a roadmap for managing Duchenne muscular dystrophy-associated cardiomyopathy by elucidating the timing and dosage nuances crucial for optimal therapeutic efficacy, ultimately improving cardiac outcomes and the quality of life of individuals with Duchenne muscular dystrophy.
Conclusion:
This document seeks to establish a standardized framework for cardiac care in Duchenne muscular dystrophy, aiming to improve cardiac prognosis.
Mica particles approximately 10 or 25 mm square and 0.5 mm thick were placed in NaCl-NaTPB solutions to make visual observations of the changes that occur in micas when the interlayer K is replaced by Na. Samples of muscovite, biotite, phlogopite, lepidolite, and lepidomelane were used, and the effects of different degradation periods were photographed.
An increase in the thickness of the particles due to basal planes splitting apart was observed with all micas. This exfoliation released interlayer K and in some cases caused the particles to cleave into separate flakes. Lepidomelane particles remained intact despite a 20-fold increase in thickness in 7 days. Even muscovite and lepidolite exfoliated and cleaved, but much longer degradation periods were needed.
There was a distinct change in the color of the dark biotite, phlogopite and lepidomelane particles when K was removed. Therefore, the initial stages of K depletion at holes, scratches, and edges of the particles were easily followed. As the degradation of the mica particles progressed, however, the color of the mica became a less reliable index of the stage of K depletion. Visual evidence of K depletion at the edges of particles was also obtained with muscovite, but not with lepidolite.
Transverse sections of 25-mm particles of K-depleted biotite were photographed to show the edge expansion that occurred when interlayer K was replaced by Na.
Interlayer K in muscovite, biotite, phlogopite, illite and vermiculite-hydrobiotite samples was replaced by cation exchange with Na. The rate and amount of exchange varied with the mineral and the level of K in solution.
Essentially all the K in muscovite, biotite, phlogopite and vermiculite was exchangeable when the mass-action effect of the replaced K was reduced by maintaining a very low level of K in solution. The time required for this exchange varied from < 10 hr with vermiculite to > 45 weeks with muscovite. Only 66% of the K in the illite was exchangeable under these conditions. When the replaced K was allowed to accumulate in the solution, the amount of exchange was determined by the level of K in solution required for equilibrium. These levels decreased with the degree of K-depletion and with the selectivity of the mica for K. The order of selectivity was muscovite > illite > biotite > phlogopite > vermiculite. Decreasing the K in solution from 10 to 7 ppm increased the exchangeable K in biotite from 30 to 100%. A K level of only 0.1 ppm restricted the exchange of K in muscovite to 17%.
A decrease in layer charge was not required for K exchange, but a decrease did occur in K-depleted biotite and vermiculite. Muscovite with the highest layer charge (247 meq/100 g), least expansion with Na (12.3 Å), and least sensitivity to solution pH had the highest selectivity for K and the slowest rate of exchange. The K in vermiculite was the most readily exchangeable.
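The mass-action control described above can be written as a simple cation-exchange equilibrium (a schematic formulation for illustration, not the authors' own notation):

$$\mathrm{K\text{-}mica} + \mathrm{Na^+_{(aq)}} \;\rightleftharpoons\; \mathrm{Na\text{-}mica} + \mathrm{K^+_{(aq)}}, \qquad K_{\mathrm{ex}} = \frac{[\mathrm{K^+}]\,X_{\mathrm{Na}}}{[\mathrm{Na^+}]\,X_{\mathrm{K}}}$$

where $X_{\mathrm{Na}}$ and $X_{\mathrm{K}}$ are the fractions of interlayer sites occupied by each cation. For a strongly K-selective mica such as muscovite, the forward reaction proceeds only while $[\mathrm{K^+}]$ is held very low (here by precipitation of replaced K with tetraphenylboron); letting K accumulate in solution shifts the equilibrium back and halts exchange, consistent with the ppm-level equilibrium thresholds reported above.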
Samples of several naturally fine-grained micaceous minerals were heated at 450°C for 24 hr (after the effects of other temperatures and heating periods were evaluated with the < 2 μm fraction of Grundite) and then characterized in terms of their release of K to NaCl-NaTPB (sodium tetraphenylboron) solutions and other potentially related properties.
This heat treatment produced a substantial increase in the amount of K that each mineral released when first placed in the NaCl-NaTPB solution (the greatest increase being 22 m-equiv K/100 g in Marblehead illite). Depending upon the mineral heated, the subsequent rate of K release was increased, decreased or unchanged. Also, all the minerals except glauconite exhibited an increase (ranging from 4 to 38 m-equiv K/100 g) in their maximum degree of K release if they were heated. Thus, it was established that the K release behavior of these minerals is not only subject to appreciable alteration by heat treatments but is altered in a manner that varies with the mineral. The nature of these alterations, however, did not clearly identify an involvement of the other mineral properties that were examined. An increase in NH4- and Cs-exchangeable K occurred when these minerals were heated—presumably as a result of exfoliation. With Morris illite samples, this increase was nearly 28 m-equiv/100 g. Thus, heated samples of these minerals may be useful sinks for the removal of NH4 and Cs in various wastes.
Knowledge of sex differences in risk factors for posttraumatic stress disorder (PTSD) can contribute to the development of refined preventive interventions. Therefore, the aim of this study was to examine if women and men differ in their vulnerability to risk factors for PTSD.
Methods
As part of the longitudinal AURORA study, 2924 patients seeking emergency department (ED) treatment in the acute aftermath of trauma provided self-report assessments of pre-, peri-, and post-traumatic risk factors, as well as 3-month PTSD severity. We systematically examined sex-dependent effects of 16 risk factors that have previously been hypothesized to show different associations with PTSD severity in women and men.
Results
Women reported higher PTSD severity at 3 months post-trauma. Z-score comparisons indicated that for five of the 16 examined risk factors, the association with 3-month PTSD severity was stronger in men than in women. In multivariable models, interaction effects with sex were observed for pre-traumatic anxiety symptoms and acute dissociative symptoms; both showed stronger associations with PTSD in men than in women. Subgroup analyses suggested trauma type-conditional effects.
Conclusions
Our findings indicate mechanisms to which men might be particularly vulnerable, demonstrating that known PTSD risk factors might behave differently in women and men. Analyses did not identify any risk factors to which women were more vulnerable than men, pointing toward further mechanisms to explain women's higher PTSD risk. Our study illustrates the need for a more systematic examination of sex differences in contributors to PTSD severity after trauma, which may inform refined preventive interventions.
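Z-score comparisons of the kind described above are commonly implemented as Fisher r-to-z tests of the difference between two independent correlations (one per sex). A generic sketch, with illustrative correlations and sample sizes rather than the study's values:

```python
from math import atanh, sqrt

def compare_correlations(r_men, n_men, r_women, n_women):
    """Fisher r-to-z test of whether a risk factor's correlation with an
    outcome (e.g. 3-month PTSD severity) differs between two groups."""
    z1, z2 = atanh(r_men), atanh(r_women)     # Fisher z-transform of each r
    se = sqrt(1 / (n_men - 3) + 1 / (n_women - 3))
    return (z1 - z2) / se                     # standard normal under H0
```

For example, `compare_correlations(0.5, 103, 0.3, 103)` returns z ≈ 1.70; |z| > 1.96 would indicate a significant sex difference at α = .05.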
Understanding characteristics of healthcare personnel (HCP) with SARS-CoV-2 infection supports the development and prioritization of interventions to protect this important workforce. We report detailed characteristics of HCP who tested positive for SARS-CoV-2 from April 20, 2020 through December 31, 2021.
Methods:
CDC collaborated with Emerging Infections Program sites in 10 states to interview HCP with SARS-CoV-2 infection (case-HCP) about their demographics, underlying medical conditions, healthcare roles, exposures, personal protective equipment (PPE) use, and COVID-19 vaccination status. We grouped case-HCP by healthcare role. To describe residential social vulnerability, we merged geocoded HCP residential addresses with CDC/ATSDR Social Vulnerability Index (SVI) values at the census tract level. We defined highest and lowest SVI quartiles as high and low social vulnerability, respectively.
Results:
Our analysis included 7,531 case-HCP. Most case-HCP with roles as certified nursing assistant (CNA) (444, 61.3%), medical assistant (252, 65.3%), or home healthcare worker (HHW) (225, 59.5%) reported their race and ethnicity as either non-Hispanic Black or Hispanic. More than one third of HHWs (166, 45.2%), CNAs (283, 41.7%), and medical assistants (138, 37.9%) reported a residential address in the high social vulnerability category. The proportion of case-HCP who reported using recommended PPE at all times when caring for patients with COVID-19 was lowest among HHWs compared with other roles.
Conclusions:
To mitigate SARS-CoV-2 infection risk in healthcare settings, infection prevention and control interventions should be specific to HCP roles and educational backgrounds. Additional interventions are needed to address high social vulnerability among HHWs, CNAs, and medical assistants.
Identifying long-term care facility (LTCF)-exposed inpatients is important for infection control research and practice, but ascertaining LTCF exposure is challenging. Across a large validation study, electronic health record data fields identified 76% of LTCF-exposed patients compared to manual chart review.
Cohort studies demonstrate that people who later develop schizophrenia, on average, present with mild cognitive deficits in childhood and undergo a decline in adolescence and adulthood. Yet, tremendous heterogeneity exists during the course of psychotic disorders, including the prodromal period. Individuals identified as being in this period (known as CHR-P) are at heightened risk for developing psychosis (~35%) and begin to exhibit cognitive deficits. Cognitive impairments in CHR-P (as a singular group) appear to be relatively stable or to ameliorate over time, although a sizeable proportion of individuals has been reported to decline on measures related to processing speed or verbal learning. The purpose of this analysis is to use data-driven approaches to identify latent subgroups among CHR-P based on cognitive trajectories. This will yield a clearer understanding of the timing and presentation of both general and domain-specific deficits.
Participants and Methods:
Participants included 684 young people at CHR-P (ages 12–35) from the second cohort of the North American Prodrome Longitudinal Study. Performance on the MATRICS Consensus Cognitive Battery (MCCB) and the Wechsler Abbreviated Scale of Intelligence (WASI-I) was assessed at baseline and at 12 and 24 months. Tested MCCB domains included verbal learning, speed of processing, working memory, and reasoning and problem-solving. Sex- and age-based norms were utilized. The Oral Reading subtest of the Wide Range Achievement Test (WRAT4) indexed premorbid IQ at baseline. Latent class mixture models were used to identify distinct trajectories of cognitive performance across two years. One- to five-class solutions were compared, with the best solution chosen on the basis of goodness-of-fit metrics, interpretability of latent trajectories, and proportion of subgroup membership (>5%).
Results:
A one-class solution was found for WASI-I Full-Scale IQ, as people at CHR-P predominantly demonstrated an average IQ that increased gradually over time. For individual domains, one-class solutions also best fit the trajectories for the speed of processing, verbal learning, and working memory domains. Two distinct subgroups were identified on one of the executive functioning domains, reasoning and problem-solving (NAB Mazes): a subgroup with unimpaired performance and mild improvement over time (Class I, 74%) and a subgroup with persistent performance two standard deviations below average (Class II, 26%). Between these classes, no significant differences were found for biological sex, age, years of education, or likelihood of conversion to psychosis (OR = 1.68, 95% CI 0.86 to 3.14). Individuals assigned to Class II did, however, demonstrate a lower WASI-I IQ at baseline (96.3 vs. 106.3) and a lower premorbid IQ (100.8 vs. 106.2).
Conclusions:
Youth at CHR-P demonstrate relatively homogeneous trajectories across time in terms of general cognition and most individual domains. In contrast, two distinct subgroups emerged on a higher-order cognitive task involving planning and foresight, and notably these subgroups exist independent of conversion outcome. Overall, these findings replicate and extend results from a recently published latent class analysis that examined 12-month trajectories among CHR-P using a different cognitive battery (Allott et al., 2022). Findings inform which individuals at CHR-P may be most likely to benefit from cognitive remediation and can shed light on the substrates of deficits by establishing meaningful subtypes.
Late Life Major Depressive Disorder (LLD) and Hoarding Disorder (HD) are common in older adults with prevalence estimates up to 29% and 7%, respectively. Both LLD and HD are characterized by executive dysfunction and disability. There is evidence of overlapping neurobiological dysfunction in LLD and HD suggesting potential for compounded executive dysfunction and disability in the context of comorbid HD and LLD. Yet, prevalence of HD in primary presenting LLD has not been examined and potential compounded impact on executive functioning, disability, and treatment response remains unknown. Thus, the present study aimed to determine the prevalence of co-occurring HD in primary presenting LLD and examine hoarding symptom severity as a contributor to executive dysfunction, disability, and response to treatment for LLD.
Participants and Methods:
Eighty-three adults ages 65-90 participating in a psychotherapy study for LLD completed measures of hoarding symptom severity (Saving Inventory-Revised: SI-R), executive functioning (WAIS-IV Digit Span, Letter-Number Sequencing, Coding; Stroop Interference; Trail Making Test-Part B; Letter Fluency), functional ability (World Health Organization Disability Assessment Schedule-II-Short), and depression severity (Hamilton Depression Rating Scale) at post-treatment. Pearson's chi-squared tests evaluated group differences in cognitive and functional impairment rates and depression treatment response between participants with (LLD+HD) and without (LLD-only) clinically significant hoarding symptoms. Linear regressions were used to examine the association between hoarding symptom severity and executive function performance and functional ability, with participant age, years of education, gender, and concurrent depression severity included as covariates.
Results:
At post-treatment, 24.1% (20/83) of participants with LLD met criteria for clinically significant hoarding symptoms (SI-R ≥ 41). Relative to LLD-only, the LLD+HD group demonstrated greater impairment rates in Letter-Number Sequencing (χ2(1)=4.0, p=.045) and Stroop Interference (χ2(1)=4.8, p=.028). Greater hoarding symptom severity was associated with poorer executive functioning performance on Digit Span (t(71)=-2.4, β=-0.07, p=.019), Letter-Number Sequencing (t(70)=-2.1, β=-0.05, p=.044), and Letter Fluency (t(71)=-2.8, β=-0.24, p=.006). Rates of functional impairment were significantly higher in the LLD+HD group (88.0%) than in the LLD-only group (62.3%) (χ2(1)=5.41, p=.020). Additionally, higher hoarding symptom severity was related to greater disability (t(72)=2.97, β=0.13, p=.004). Furthermore, depression treatment response rates were significantly lower in the LLD+HD group at 24.0% (6/25) compared to 48.3% (28/58) in the LLD-only group, χ2(1)=4.26, p=.039.
Conclusions:
The present study is among the first to report prevalence of clinically significant hoarding symptoms in primary presenting LLD. The findings of 24.1% co-occurrence of HD in primary presenting LLD and increased burden on executive functioning, disability, and depression treatment outcomes have important implications for intervention and prevention efforts. Hoarding symptoms are likely under-evaluated, and thus may be overlooked, in clinical settings where LLD is identified as the primary diagnosis. Taken together with results indicating poorer depression treatment response in LLD+HD, these findings underscore the need for increased screening of hoarding behaviors in LLD and tailored interventions for this LLD+HD group. Future work examining the course of hoarding symptomatology in LLD (e.g., onset age of hoarding behaviors) may provide insights into the mechanisms associated with greater executive dysfunction and disability.
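As a quick arithmetic check, the treatment-response comparison reported above (6/25 responders in LLD+HD vs. 28/58 in LLD-only) can be reproduced with a Pearson chi-squared test on the corresponding 2×2 table. A minimal sketch:

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square (1 df, no continuity correction) for the
    2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a*d - b*c)**2 / ((a+b) * (c+d) * (a+c) * (b+d))

# Rows: LLD+HD (6 responders, 19 non-responders), LLD-only (28, 30)
chi2 = chi2_2x2(6, 19, 28, 30)
```

This recovers the reported χ2(1) = 4.26 (without continuity correction).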