Childhood maltreatment is a robust predictor of aggression. Research indicates that both maltreatment experiences and aggression are moderately heritable. It has been hypothesized that gene–environment correlation may be at play, whereby genetic predispositions to aggression in parents and children may be confounded with family environments conducive to its expression. Building on this framework, we tested whether maltreatment mediates the association between a polygenic score for aggression (PGSAGG) and school-age aggression, and whether this varied for reactive and proactive aggression.
Methods:
The sample comprised 721 participants (44.9% males; 99.0% White) with prospective assessments of maltreatment from 5 months to 12 years (10 assessments; 1998–2010) and teacher-reported aggression from ages 6 to 13 (6 assessments; 2004–2011). The PGSAGG was derived using a Bayesian estimation method (PRS-CS).
Results:
PGSAGG was associated with most aggression measures across specific ages and trajectories. Maltreatment experiences partially mediated the association between PGSAGG and the Childhood-Limited trajectory of reactive – but not proactive – aggression.
Conclusion:
Children with higher genetic propensities for aggression were more likely to experience maltreatment, which partly explained the association between PGSAGG and a Childhood-Limited trajectory of reactive aggression during elementary school. This finding reinforces the possibility of confounding influences between genetic liability for aggression and maltreatment experiences.
To compare the incidence of surgical site infection (SSI) between cefazolin 3 g and 2 g surgical prophylaxis in patients weighing ≥120 kg who undergo elective colorectal surgery.
Methods:
A multicenter, retrospective cohort study was performed utilizing a validated database of elective colorectal surgeries in Michigan acute care hospitals. Adults weighing ≥120 kg who received cefazolin and metronidazole for surgical prophylaxis between 7/2012 and 6/2021 were included. The primary outcome was SSI, which was defined as an infection diagnosed within 30 days following the principal operative procedure. Multivariable logistic regression was used to identify variables associated with SSI; the exposure of interest was cefazolin 3 g surgical prophylaxis.
Results:
A total of 581 patients were included; of these, 367 (63.1%) received cefazolin 3 g, while 214 (36.8%) received 2 g. Patients who received cefazolin 3 g less often had optimally timed antibiotics (324 [88.3%] vs 200 [93.5%]; P = .043) and more often received at least 1 of the prophylaxis antibiotics after incision (22 [6.0%] vs 5 [2.3%]; P = .043). There was no difference in SSI between the cefazolin 3 g and 2 g cohorts (23 [6.3%] vs 16 [7.5%]; P = .574). When accounting for age, smoking status, and surgical duration, cefazolin 3 g was not associated with a reduction in SSI (adjOR, .64; 95% CI, .32–1.29).
Conclusions:
Surgical prophylaxis with cefazolin 3 g, in combination with metronidazole, was not associated with decreased SSI compared to 2 g dosing in obese patients undergoing elective colorectal surgery.
Childhood maltreatment is linked with later depressive symptoms, but not every maltreated child will experience symptoms later in life. Therefore, we investigated whether genetic predisposition for depression (i.e., a polygenic score for depression, PGSDEP) modifies the association between maltreatment and depressive symptoms, while accounting for different types of maltreatment and whether it was evaluated through prospective or retrospective reports. The sample included 541–617 participants from the Quebec Longitudinal Study of Child Development with information on maltreatment, including threat and deprivation, assessed prospectively (5 months–17 years) and retrospectively (reported at 23 years), the PGSDEP, and self-reported depressive symptoms (20–23 years). Using hierarchical linear regressions, we found that retrospective, but not prospective, indicators of maltreatment (threat/deprivation/cumulative) were associated with later depressive symptoms, above and beyond the PGSDEP. Our findings also show the presence of gene–environment interactions, whereby the association between maltreatment (retrospective cumulative maltreatment/threat, prospective deprivation) and depression was strengthened among youth with higher PGSDEP scores. Consistent with the Diathesis-Stress hypothesis, our findings suggest that a genetic predisposition for depression may exacerbate the putative impact of maltreatment on later depressive symptoms, especially when maltreatment is reported retrospectively. Understanding the gene–environment interplay emerging in the context of maltreatment has the potential to guide prevention efforts.
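The gene–environment interaction test described above amounts to including a maltreatment × PGS product term in the regression. The following is a minimal sketch on synthetic data (not the study's data; the effect sizes and sample size are invented for illustration):

```python
# Illustrative sketch (synthetic data) of a gene-environment interaction
# test: regressing depressive symptoms on maltreatment, a polygenic
# score (PGS), and their product term, as in a diathesis-stress model.
import numpy as np

rng = np.random.default_rng(0)
n = 600
maltreatment = rng.binomial(1, 0.3, n).astype(float)
pgs = rng.normal(0, 1, n)

# Synthetic outcome with a positive interaction (0.4): maltreatment
# matters more at higher PGS values. All coefficients are hypothetical.
symptoms = (0.3 * maltreatment + 0.2 * pgs
            + 0.4 * maltreatment * pgs + rng.normal(0, 1, n))

# Design matrix: intercept, main effects, and the interaction term.
X = np.column_stack([np.ones(n), maltreatment, pgs, maltreatment * pgs])
beta, *_ = np.linalg.lstsq(X, symptoms, rcond=None)
print("interaction coefficient estimate:", round(beta[3], 2))
```

A positive, significant coefficient on the product term is what "the association was strengthened among youth with higher PGSDEP scores" corresponds to statistically.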
This study identified 26 late invasive primary surgical site infections (IP-SSI) within 4–12 months of transplantation among 2073 SOT recipients at Duke University Hospital over the period 2015–2019. Thoracic organ transplants accounted for 25 late IP-SSI. Surveillance for late IP-SSI should be maintained for at least one year following transplant.
Residents of long-term care facilities (LTCFs) were disproportionately affected by the COVID-19 pandemic. We assessed the extent to which hospital-associated infections contributed to COVID-19 LTCF outbreaks in England. We matched addresses of cases between March 2020 and June 2021 to reference databases to identify LTCF residents. Linkage to health service records identified hospital-associated infections, with the number of days spent in hospital before positive specimen date used to classify these as definite or probable. Of 149,129 cases in LTCF residents during the study period, 3,748 (2.5%) were definite or probable hospital-associated and discharged to an LTCF. Overall, 431 (0.3%) were identified as index cases of potentially nosocomial-seeded outbreaks (2.7% [431/15,797] of all identified LTCF outbreaks). These outbreaks involved 4,521 resident cases and 1,335 deaths, representing 3.0% and 3.6% of all cases and deaths in LTCF residents, respectively. The proportion of outbreaks that were potentially nosocomial-seeded peaked in late June 2020, early December 2020, mid-January 2021, and mid-April 2021. Nosocomial seeding contributed to COVID-19 LTCF outbreaks but is unlikely to have accounted for a substantial proportion. The continued identification of such outbreaks after the implementation of preventative policies highlights the challenges of preventing their occurrence.
The continuing COVID-19 pandemic and social restrictions have impacted on the cognitive decline and mental health of people with dementia. Social isolation and loss of activities due to social restrictions may also have implications for the sense of identity of people with dementia. As part of the INCLUDE (Identifying and Mitigating the Individual and Dyadic Impact of COVID-19 and Life Under Physical Distancing on People with Dementia and Carers) component of the IDEAL (Improving the Experience of Dementia and Enhancing Active Life) cohort study, the overall aim of this subtle realist qualitative study was to explore the perspectives of people with dementia on living through the COVID-19 pandemic within the context of the ‘post-vaccine’ period and the national lockdowns in England and Wales, and to determine perceived challenges to and facilitators of ‘living well’ during the COVID-19 pandemic and beyond as restrictions were eased. In addition, the study findings are considered in relation to understandings of identity in dementia which the broader accounts of living through the pandemic have highlighted. Seven people with mild-to-moderate dementia were interviewed and themes were derived using framework analysis. Themes suggest interviewees' stoic acceptance of the pandemic and social restrictions but also fear of decline related to the temporality of their condition, as well as loss of self-confidence to re-engage with the world. Interviewees managed threats to social identity by striving to maintain social and emotional connections, where the importance of a shared, social identity, particularly for people with young-onset dementia, was also apparent. Unlike in previous studies during the pandemic, the relevance of occupation for identity was observed, where maintaining previous or new activities or occupations was important to facilitate identity as well as to keep a sense of purpose.
Therefore, as well as supporting people with dementia as the pandemic eases, future research into occupation and identity in dementia is of potential value.
Older people describe positive and negative age-related changes, but we do not know much about what contributes to making them aware of these changes. We used content analysis to categorize participants’ written comments and explored the extent to which the identified categories mapped onto theoretical conceptualizations of influences on awareness of age-related change (AARC).
Design:
Cross-sectional observational study.
Participants:
The study sample comprised 609 UK individuals aged 50 years or over (mean (SD) age = 67.9 (7.6) years), enrolled in the PROTECT study.
Measurements:
Between January and March 2019, participants provided demographic information, completed a questionnaire assessing awareness of age-related change (AARC-10 SF), and responded to an open-ended question asking them to comment on their responses.
Results:
While some of the emerging categories were in line with the existing conceptual framework of AARC (e.g. experiencing negative changes and attitudes toward aging), others were novel (e.g. engagement in purposeful activities or in activities that distract from age-related thoughts). Analysis revealed some of the thought processes involved in selecting responses to the questionnaire items, demonstrating different ways in which people make sense of specific items.
Conclusions:
Results support the ability of the AARC questionnaire to capture perceived age-related changes in cognitive functioning, physical and mental health, and engagement in social activities and in healthy and adaptive behaviors. However, findings also suggest ways of enriching the theoretical conceptualization of how AARC develops and offer insights into interpretation of responses to measures of AARC.
Evidence linking subjective concerns about cognition with poorer objective cognitive performance is limited by reliance on unidimensional measures of self-perceptions of aging (SPA). We used the awareness of age-related change (AARC) construct to assess self-perception of both positive and negative age-related changes (AARC gains and losses). We tested whether AARC has greater utility in linking self-perceptions to objective cognition compared to well-established measures of self-perceptions of cognition and aging. We examined the associations of AARC with objective cognition, several psychological variables, and engagement in cognitive training.
Design:
Cross-sectional observational study.
Participants:
The sample comprised 6056 cognitively healthy participants (mean [SD] age = 66.0 [7.0] years), divided into subgroups representing middle, early old, and advanced old age.
Measurements:
We used an online cognitive battery and measures of global AARC, AARC specific to the cognitive domain, subjective cognitive change, attitudes toward own aging (ATOA), subjective age (SA), depression, anxiety, and self-rated health (SRH).
Results:
Scores on the AARC measures showed stronger associations with objective cognition compared to other measures of self-perceptions of cognition and aging. Higher AARC gains were associated with poorer cognition in middle and early old age. Higher AARC losses were associated with poorer cognition across all subgroups. Higher AARC losses were also associated with greater depression and anxiety, more negative SPA, and poorer SRH, but not with engagement in cognitive training.
Conclusions:
Assessing both positive and negative self-perceptions of cognition and aging is important when linking self-perceptions to cognitive functioning. Objective cognition is one of the many variables – alongside psychological variables – related to perceived cognitive losses.
Survival into adult life in patients with aortic coarctation is typical following surgical and catheter-based techniques to relieve obstruction. Late sequelae are recognised, including stroke, hypertension, and intracerebral aneurysm formation, with the underlying mechanisms being unclear. We hypothesised that patients with a history of aortic coarctation may have abnormalities of cerebral blood flow compared with controls.
Methods
Patients with a history of aortic coarctation underwent assessment of cerebral vascular function. Vascular responsiveness of intracranial vessels to hypercapnia was assessed, and the degree of cerebral artery stiffness was measured using Doppler-derived pulsatility indices. Response to photic stimuli was used to assess neurovascular coupling, which reflects endothelial function in response to neuronal activation. Patient results were compared with those of age- and sex-matched controls.
Results
A total of 13 adult patients (10 males; 77%) along with 13 controls underwent evaluation. The mean age was 36.1±3.7 years in the patient group. Patients with a background of aortic coarctation were noted to have increased pulse pressure on blood pressure assessment at baseline with increased intracranial artery stiffness compared with controls. Patients with a history of aortic coarctation had less reactive cerebral vasculature to hypercapnic stimuli and impaired neurovascular coupling compared with controls.
Conclusion
Adult patients with aortic coarctation had increased intracranial artery stiffness compared with controls, in addition to cerebral vasculature showing less responsiveness to hypercapnic and photic stimuli. Further studies are required to assess the aetiology and consequences of these documented abnormalities in cerebral blood flow in terms of stroke risk, cerebral aneurysm formation, and cognitive dysfunction.
Reducing the use of seclusion to deal with challenging behaviour is a priority in secure services for women. This study describes the concurrent introduction of a series of initiatives based on recovery principles and the full involvement of patients in their risk management plans.
Following change implementation, the first 19 patients who had completed one year of treatment were matched with 19 patients who had completed their first year of treatment before the change.
A significant decline in both the number of seclusions and risk behaviour post-change was complemented by improved staff ratings of institutional behaviour, increased treatment engagement, and a reduction in time spent in medium security. Staff and patients differed in their ratings of the most effective strategies introduced. Patients favoured the Relational Security item of increased individual engagement and timetabled Behaviour Chain Analysis sessions. Staff viewed on-ward training and the use of de-escalation techniques as most effective.
Findings confirm results from mixed-gender forensic mental health samples that seclusion can be successfully reduced without an increase in patient violence or alternative coercive strategies. Limitations of the study are discussed along with the need for future evaluations to address issues of fidelity and utilise rigorously designed case studies.
Meal-induced thermogenesis (MIT) research findings have been highly inconsistent, in part, due to the variety of durations and protocols used to measure MIT. In the present study, we aimed to determine the following: (1) the proportion of a 6 h MIT response completed at 3, 4 and 5 h; (2) the associations between the shorter durations and the 6 h measures; (3) whether shorter durations improved the reproducibility of the measurement. MIT was measured in response to a 2410 kJ mixed composition meal in ten individuals (five males and five females) on two occasions. Energy expenditure was measured continuously for 6 h post-meal using indirect calorimetry, and MIT was calculated as the increase in energy expenditure above the pre-meal RMR. On average, 76, 89 and 96 % of the 6 h MIT response was completed within 3, 4 and 5 h, respectively, and MIT at each of these time points was strongly correlated with the 6 h MIT response (range for correlations, r 0·990–0·998; P< 0·01). The between-day CV for the 6 h measurement was 33 %, but it was significantly lower after 3 h of measurement (CV 26 %; P= 0·02). Despite variability in the total MIT between days, the proportion of MIT that was completed at 3, 4 and 5 h was reproducible (mean CV: 5 %). While 6 h are typically required to measure the complete MIT response, the 3 h measures provide sufficient information about the magnitude of the MIT response and may be applicable for testing individuals on repeated occasions.
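The two core calculations in the abstract above (MIT as energy expenditure above the pre-meal RMR, and the between-day coefficient of variation) can be sketched as follows. All numbers here are hypothetical, invented only to illustrate the arithmetic; they are not the study's data:

```python
# Sketch of the MIT calculations described above (hypothetical values).
# MIT is post-meal energy expenditure (EE) above the pre-meal resting
# metabolic rate (RMR), accumulated over the measurement period.
from statistics import mean, stdev

def mit_kj(ee_kj_per_h, rmr_kj_per_h, hours_per_sample=0.5):
    """Total MIT in kJ: EE above RMR, summed over sampling intervals."""
    return sum((ee - rmr_kj_per_h) * hours_per_sample for ee in ee_kj_per_h)

# Hypothetical half-hourly EE readings (kJ/h) over 6 h after a meal.
rmr = 300.0
ee = [360, 380, 370, 355, 345, 335, 328, 322, 318, 314, 310, 306]

mit_6h = mit_kj(ee, rmr)          # full 6 h response
mit_3h = mit_kj(ee[:6], rmr)      # first 3 h only
proportion_3h = mit_3h / mit_6h   # share of the response completed at 3 h

def cv_percent(values):
    """Between-day coefficient of variation for repeated measurements."""
    return stdev(values) / mean(values) * 100

day1, day2 = mit_6h, 0.7 * mit_6h  # hypothetical repeat measurement
print(f"3 h proportion: {proportion_3h:.2f}, CV: {cv_percent([day1, day2]):.0f}%")
```

With these invented readings, most of the thermogenic response accrues in the first 3 h, mirroring the ~76% figure the study reports for real data.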
We sought to determine the antibiotic susceptibility of organisms causing community-acquired urinary tract infections (UTIs) in adult females attending an urban emergency department (ED) and to identify risk factors for antibiotic resistance.
Methods:
We reviewed the ED charts of all nonpregnant, nonlactating adult females with positive urine cultures for 2008 and recorded demographics, diagnosis, complicating factors, organism susceptibility, and risk factors for antibiotic resistance. Odds ratios (ORs) and 95% confidence intervals (CIs) for potential risk factors were calculated.
Results:
Our final sample comprised 327 UTIs: 218 were cystitis (22 complicated) and 109 were pyelonephritis (22 complicated). Escherichia coli accounted for 82.3% of all UTIs, whereas Staphylococcus saprophyticus accounted for 5.2%. In uncomplicated cystitis, 9.5% of all isolates were resistant to ciprofloxacin and 24.0% to trimethoprim-sulfamethoxazole (TMP-SMX). In uncomplicated pyelonephritis, 19.5% of isolates were resistant to ciprofloxacin and 36.8% to TMP-SMX. In UTI (all types combined), any antibiotic use within the previous 3 months was a significant risk factor for resistance to both ciprofloxacin (OR 3.34, 95% CI 1.16–9.62) and TMP-SMX (OR 4.02, 95% CI 1.48–10.92). Being 65 years of age or older and having a history of UTI in the previous year were risk factors only for ciprofloxacin resistance.
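The odds ratios and confidence intervals reported above follow standard 2×2-table arithmetic. A minimal sketch, using invented counts (the abstract does not report the underlying cell counts), with a Wald interval on the log odds ratio:

```python
# Hedged sketch of odds-ratio arithmetic from a 2x2 table.
# Counts are hypothetical, for illustration only.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a/b: resistant/susceptible among exposed; c/d: among unexposed.
    Returns (OR, lower, upper) using a Wald 95% CI on log(OR)."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Hypothetical: 12 of 40 patients with recent antibiotic use had a
# resistant isolate, vs 20 of 200 patients without recent use.
or_, lo, hi = odds_ratio_ci(12, 28, 20, 180)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A CI whose lower bound exceeds 1, as for both ORs in the abstract, is what marks the exposure as a significant risk factor at the 5% level.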
Conclusions:
E. coli was the predominant urinary pathogen in this series. Resistance to ciprofloxacin and TMP-SMX was high, highlighting the importance of relevant, local antibiograms. Any recent antibiotic use was a risk factor for both ciprofloxacin and TMP-SMX resistance in UTI. Our findings should be confirmed with a larger prospective study.