Background: Hand hygiene is an important strategy for reducing healthcare-associated infections. While efforts to improve nursing home (NH) staff hand hygiene have been prioritized, there are few if any policies in place to improve resident hand hygiene. Further, CMS guidance requires that residents be bathed “twice a week.” The objective of this study was to characterize resident hand hygiene knowledge and habits, as well as bathing practices, to identify barriers that new intervention strategies could address. Methods: The survey was administered at 20 NHs across the United States between December 2023 and July 2024. Verbal consent was obtained from residents before survey administration. Survey questions explored residents’ hand hygiene knowledge and differences in hand hygiene habits and bathing practices since entering the NH from their last place of residence. Three knowledge-based questions assessed residents’ understanding of the recommended length of time to wash hands and to use hand sanitizer, as well as when hand washing should be used instead of hand sanitizer. We also assessed frequency of hand hygiene, with either soap and water or hand sanitizer; instances of when and how residents wash and dry their hands; and whether residents faced challenges. Results: Of the 495 residents who completed the survey, only 142 (29%) answered all three knowledge-based questions correctly. Residents who answered two or three questions correctly reported washing their hands more frequently at their previous residence than residents who answered zero or one question correctly (Figure 1). Frequency of hand hygiene was lower at the NH than at the previous residence across a variety of indications (Figure 2). More residents faced challenges with washing their hands at the NH than at their previous residence (30% vs. 7%, P<.001).
The most common challenges included mobility limitations, medical issues, need for assistance, bathroom accessibility, and inadequate bathroom supplies/equipment. About half of residents (53%) reported never being reminded to wash their hands, and 60% reported that they would use hand sanitizer if it were easily accessible. Half of residents (51%) reported bathing with soap and water less often at the NH than at their previous residence; reported causes included needing help and not receiving it, nursing home policy, medical issues, and mobility limitations. Conclusions: Survey results indicate opportunities for interventions aimed at reducing barriers to hand hygiene and improving bathing practices in NHs. Policy changes and hand hygiene education addressing these barriers could serve as potential strategies.
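The 30% vs. 7% comparison above is the kind of result a two-proportion z-test produces. A minimal sketch, using illustrative counts back-calculated from the reported percentages (the survey's actual denominators at each timepoint may differ):

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test with a pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: ~30% vs. ~7% of 495 respondents reporting challenges
z, p = two_proportion_z(149, 495, 35, 495)
```

With counts this far apart, the test statistic is large and the p-value is far below .001, consistent with the reported P<.001.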
Carers are critical to supporting discharge home from hospital at end of life, yet they remain under-represented in health service initiatives to assist this transition. A carer-focused intervention embedded into practice may facilitate hospital discharge. This open-label, single-arm phase 2 study aimed to determine the feasibility of (1) delivering a multi-staged intervention (CARENET) to carers of advanced cancer patients in a hospital setting and (2) the study design, to inform a phase 3 trial.
Methods
CARENET, delivered before and after discharge to address carer support needs, was tested in an Australian specialist cancer hospital. Eligible participants included carers of advanced cancer inpatients with planned discharge home. The primary outcome was intervention and trial feasibility (recruitment and adherence). Secondary outcomes were eligibility and effects of intervention on outcomes including carer preparedness.
Results
Of the 382 potential patient–carer dyads, 25 were recruited within the required time frames. The intervention adherence feasibility threshold of 80% of carer participants completing all 3 core components of CARENET was not achieved (60% completion). A trend toward improvement in overall carer preparedness was observed from baseline to discharge home (n = 12; mean [95% CI] 0.5 [−0.0007, 1.007]). However, a downward trend in preparedness to provide emotional care after discharge was observed (n = 12; mean [95% CI] 0.25 [−0.30, 0.80]).
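The preparedness estimates above are means with t-based 95% confidence intervals. A minimal sketch of that computation, using made-up change scores for n = 12 (not the study's data) and a hard-coded critical value:

```python
import math

def mean_ci(values, t_crit):
    """Mean with a t-based 95% CI; t_crit is the two-sided critical value for df = n - 1."""
    n = len(values)
    m = sum(values) / n
    var = sum((v - m) ** 2 for v in values) / (n - 1)  # sample variance
    half = t_crit * math.sqrt(var / n)
    return m, (m - half, m + half)

# Hypothetical change scores for 12 carers; t_crit = 2.201 is the
# two-sided 95% critical value for df = 11
scores = [1, 0, 1, -1, 2, 0, 1, 0, 1, -1, 1, 1]
m, (lo, hi) = mean_ci(scores, t_crit=2.201)
```

An interval that spans zero, as in the emotional-care result, indicates the trend is not statistically significant at the 5% level.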
Significance of results
Delivering all elements of the CARENET intervention to address carers’ needs in the discharge planning context was not feasible. However, some elements were feasible, including identifying and responding to carer need, whereas completing elements after discharge was less feasible. Findings can be explained by problems with adherence and eligibility, and by clinician barriers to fitting a multi-staged carer intervention into an acute healthcare setting. Future research should test a more adaptable intervention and delivery model that is accessible to all carers and compatible with acute care settings.
This article examines how Native Nations and institutions have been affected by a new directive in the revised NAGPRA regulations, the duty of care provision (43 CFR 10.1(d)), with a focus on the care of Indigenous Ancestral remains and cultural items. The Native Nations’ perspective is provided by the Osage Nation and the Eastern Band of Cherokee Indians. The South Carolina Institute of Archaeology and Anthropology; the University of Tennessee, Knoxville; the Illinois State Museum; and Indiana University share their viewpoints as institutions that house Indigenous Ancestral remains, cultural items, and archaeological collections and describe the initial impacts of the revised legislation on their programs. Several key takeaways emerge from its initial effects, including (1) an increased burden to Native Nations, given the substantial uptick in requests for consultation linked to new requirements for consent and the revised definitions of cultural items and research (although the end result of more consultations leading to repatriations is desired), (2) a disconnect between Native Nations and institutions regarding cultural item identification, (3) a strengthening of existing NAGPRA-related institutional policies and procedures, and (4) an emphasis on the importance of consultation between institutions and Native Nations to facilitate repatriation.
Female genital schistosomiasis (FGS) is a chronic disease manifestation of the waterborne parasitic infection Schistosoma haematobium that affects up to 56 million women and girls, predominantly in sub-Saharan Africa (SSA). Starting from early childhood, this stigmatizing gynaecological condition is caused by the presence of Schistosoma eggs and associated toxins within the genital tract. Schistosoma haematobium typically causes debilitating urogenital symptoms, mostly as a consequence of inflammation, which include bleeding, discharge and lower abdominal pelvic pain. Chronic complications of FGS include adverse sexual and reproductive health and rights outcomes such as infertility, ectopic pregnancy and miscarriage. FGS is associated with prevalent human immunodeficiency virus and may increase the susceptibility of women to high-risk human papillomavirus infection. Across SSA, and even in clinics outside endemic areas, the lack of awareness and available resources among both healthcare professionals and the public means FGS is underreported, misdiagnosed and inadequately treated. Several studies have highlighted research needs and priorities in FGS, including better training, accessible and accurate diagnostic tools, and treatment guidelines. On 6 September 2024, LifeArc, the Global Schistosomiasis Alliance and partners from the BILGENSA Research Network (Genital Bilharzia in Southern Africa) convened a consultative, collaborative and translational workshop: ‘Female Genital Schistosomiasis: Translational Challenges and Opportunities’. Its ambition was to identify practical solutions that could address these research needs and drive appropriate actions towards progress in tackling FGS. Here, we present the outcomes of that workshop – a series of discrete translational actions to better galvanize the community and research funders.
Electronic medical record (EMR) systems in primary care present an opportunity to address frailty, a significant health concern for older adults. Researchers in the UK used Read codes to develop a 36-factor electronic frailty index (eFI), which produces frailty scores for patients in primary care settings.
Aim:
We aimed to translate the 36-factor eFI to a Canadian context.
Methods:
We used manual and automatic mapping to develop a coding set based on standardized terminologies used in Canada to reflect the 36 factors of the eFI. Manual mapping was completed independently by two coders, followed by group consensus among the research team. Automatic mapping was completed using Apelon TermWorks. We then used EMR data from the British Columbia Canadian Primary Care Sentinel Surveillance Network. We searched structured data fields related to diagnoses and reasons for patient visits to develop a list of free text terms associated with any of the 36 factors.
Results and conclusions:
A total of 3,768 terms were identified; 3,021 were codes. A total of 747 free-text terms were identified from 527,521 reviewed data entries. Of the 36 frailty factors, 24 were captured mostly by codes, 7 mostly by free text, and 4 approximately equally by codes and free text. Three key findings emerged from this study: (1) it is difficult to capture frailty using only the standardized terminologies currently used in Canada, and a combination of standardized codes and free-text terms better captures the complexity of frailty; (2) EMRs in primary care can be better optimized; and (3) output from this study allows the development of a frailty screening algorithm that could be implemented in primary care settings to improve individual- and system-level outcomes related to frailty.
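The combined code-plus-free-text approach described above can be sketched as a simple lookup: each frailty factor is matched either by a standardized code in the entry or by a free-text term in the visit record. The factor names, codes, and terms below are hypothetical placeholders, not the study's actual coding set:

```python
# Illustrative mapping from frailty factors to standardized codes and
# free-text terms (placeholders, not the eFI coding set).
FRAILTY_FACTORS = {
    "falls": {"codes": {"W19", "R29.6"}, "terms": {"fall", "fell", "found on floor"}},
    "polypharmacy": {"codes": {"Z79.899"}, "terms": {"multiple medications"}},
}

def flag_factors(entry_codes, entry_text):
    """Return the frailty factors matched by an entry's codes or free text."""
    text = entry_text.lower()
    matched = set()
    for factor, spec in FRAILTY_FACTORS.items():
        if spec["codes"] & set(entry_codes):       # structured code match
            matched.add(factor)
        elif any(term in text for term in spec["terms"]):  # free-text fallback
            matched.add(factor)
    return matched

hits = flag_factors(["R29.6"], "Patient fell at home; reviewing multiple medications")
```

Here the falls factor is caught by a code and polypharmacy only by free text, mirroring the finding that some factors require both sources.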
Hypertensive heart disease and hypertrophic cardiomyopathy both lead to left ventricular hypertrophy despite differing in aetiology. Elucidating the correct aetiology of the presenting hypertrophy can be a challenge for clinicians, especially in patients with overlapping risk factors. Furthermore, drugs typically used to combat hypertensive heart disease may be contraindicated for the treatment of hypertrophic cardiomyopathy, making the correct diagnosis imperative. In this review, we discuss characteristics of both hypertensive heart disease and hypertrophic cardiomyopathy that may enable clinicians to discriminate between the two as causes of left ventricular hypertrophy. We summarise the current literature, which is primarily focused on adult populations, containing discriminative techniques available via diagnostic modalities such as electrocardiography, echocardiography, and cardiac MRI, noting strategies yet to be applied in paediatric populations. Finally, we review pharmacotherapy strategies for each disease with regard to pathophysiology.
Trace amine-associated receptor 1 (TAAR1) agonists offer a new approach, but there is uncertainty regarding their effects, exact mechanism of action and potential role in treating psychosis.
Aims
To evaluate the available evidence on TAAR1 agonists in psychosis, using triangulation of the output of living systematic reviews (LSRs) of animal and human studies, and provide recommendations for future research prioritisation.
Method
This study is part of GALENOS (Global Alliance for Living Evidence on aNxiety, depressiOn and pSychosis). In the triangulation process, a multidisciplinary group of experts, including those with lived experience, met and appraised the first co-produced living systematic reviews from GALENOS on TAAR1 agonists.
Results
The animal data suggested a potential antipsychotic effect, as TAAR1 agonists reduced locomotor activity induced by pro-psychotic drug treatment. Human studies showed few differences for ulotaront and ralmitaront compared with placebo in improving overall symptoms in adults with acute schizophrenia (four studies, n = 1291 participants, standardised mean difference (SMD) 0.15, 95% CI −0.05 to 0.34). Large placebo responses were seen in ulotaront phase 3 trials. Ralmitaront was less efficacious than risperidone (one study, n = 156 participants, SMD = −0.53, 95% CI −0.86 to −0.20). The side-effect profile of TAAR1 agonists was favourable compared with existing antipsychotics. Priorities for future studies included (a) using different animal models of psychosis with greater translational validity; (b) animal and human studies with wider outcomes, including cognitive and affective symptoms; and (c) mechanistic studies and investigations of other potential applications, such as adjunctive treatments and long-term outcomes. Recommendations for future iterations of the LSRs included (a) meta-analysis of individual human participant data, (b) including studies that used different methodologies, and (c) assessing other disorders and symptoms.
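The pooled effect sizes above are standardised mean differences. One common SMD definition, Cohen's d with a pooled standard deviation, can be sketched as follows; the group summaries are illustrative rather than data from the reviewed trials, and the GALENOS reviews may use a bias-corrected variant such as Hedges' g:

```python
import math

def standardised_mean_difference(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d: difference in means divided by the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Hypothetical symptom-change summaries (drug vs. placebo, lower = better)
smd = standardised_mean_difference(-20.0, 10.0, 100, -18.5, 10.0, 100)
```

An SMD of about 0.15, as reported for ulotaront and ralmitaront versus placebo, corresponds to a small effect by conventional benchmarks.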
Conclusions
This co-produced, international triangulation examined the available evidence and developed recommendations for future research and clinical applications for TAAR1 agonists in psychosis. Broader challenges included difficulties in assessing the risk of bias, reproducibility, translation and interpretability of animal models to clinical outcomes, and a lack of individual and clinical characteristics in the human data. The research will inform a separate, independent prioritisation process, led by lived experience experts, to prioritise directions for future research.
The March 2, 2022, United Nations Environment Assembly Resolution 5/14, “End plastic pollution: Toward an international legally binding instrument by 2024,” provides an important path for addressing global plastic pollution, from monomer design and production through the value chain to the final fate of plastic products, including resource recovery. Of the goals set for this effort, simplifying the polymer and additive universe is among the most significant. One primary obstacle to resource recovery from plastic waste is polymer variability, which renders post-use plastic inherently waste-like. While simplification will not address microplastics or the leaching of chemicals during use, it reduces the variability of the plastic universe and mitigates leakage, which is critical to ensuring circular plastic use. This study provides a pathway for simplifying formulations by eliminating problematic additives, and it reveals paths toward reducing variability in polymers, waste streams, and pollution while preserving critical uses. The study focuses on phenolic antioxidants to support this concept; however, these principles can be applied to other additive classes. The results show extensive duplication of chemical species under different trade names, and the appearance of only minor changes to species with the intention of evergreening patents for improved marketability.
Background: Nursing home (NH) residents are at high risk of COVID-19 from exposure to infected staff and other residents. Understanding SARS-CoV-2 viral RNA kinetics in residents and staff can guide testing, isolation, and return-to-work recommendations. We sought to determine the duration of antigen test and polymerase chain reaction (PCR) positivity in a cohort of NH residents and staff. Methods: We prospectively collected data on SARS-CoV-2 viral kinetics from April 2023 through November 2023. Staff and residents could enroll prospectively or upon a positive test (identified through routine clinical testing, screening, or outbreak response testing). Participating facilities performed routine clinical testing; asymptomatic testing of contacts was performed within 48 hours if an outbreak or known exposure occurred, and upon (re-)admission. Enrolled participants who tested positive for SARS-CoV-2 were re-tested daily for 14 days with both nasal antigen and nasal PCR tests. All PCR tests were run by a central lab with the same assay. We conducted a Kaplan-Meier survival analysis of time to first negative test, restricted to participants who initially tested positive (day zero) and had at least one test ≥10 days after the initial positive with the same test type; a participant could contribute to both the antigen and PCR survival curves. We compared survival curves for staff and residents using the log-rank test. Results: Twenty-four nursing homes in eight states participated; 587 participants (275 residents, 312 staff) enrolled in the evaluation, and participants were tested only through routine clinical or outbreak response testing. Seventy-two participants tested antigen-positive; of these, 63 tested PCR-positive. Residents remained antigen- and PCR-positive longer than staff (Figure 1), but this difference was statistically significant only for duration of PCR positivity (p=0.006).
Five days after the first positive test, 56% of 50 residents and 59% of 22 staff remained antigen-positive; 91% of 44 residents and 79% of 19 staff were PCR-positive. Ten days after the first positive test, 22% of 50 residents and 5% of 22 staff remained antigen-positive; 61% of 44 residents and 21% of 19 staff remained PCR-positive. Conclusions: Most NH residents and staff with SARS-CoV-2 remained antigen- or PCR-positive 5 days after the initial positive test; however, differences between staff and resident test positivity were noted at 10 days. These data can inform recommendations for testing, duration of NH resident isolation, and return to work guidance for staff. Additional viral culture data may strengthen these conclusions.
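The survival analysis described in the Methods can be sketched with a small product-limit (Kaplan-Meier) estimator for time to first negative test. The participant data below are invented for illustration, with censoring at day 14 for those still positive at the last test:

```python
# Each tuple is (day of first negative test, observed); observed=False means
# the participant was still positive at the last test (right-censored).
def kaplan_meier(times):
    """Return [(t, S(t))] for observed event times, via the product-limit estimator."""
    s, curve = 1.0, []
    for t in sorted({t for t, observed in times if observed}):
        at_risk = sum(1 for u, _ in times if u >= t)               # still positive at t
        events = sum(1 for u, observed in times if u == t and observed)
        s *= 1 - events / at_risk
        curve.append((t, s))
    return curve

# Hypothetical residents: four turn negative, two are censored at day 14
residents = [(6, True), (8, True), (11, True), (12, True), (14, False), (14, False)]
curve = kaplan_meier(residents)
```

Reading S(t) off such a curve at day 5 and day 10 yields the kind of "percent still positive" figures quoted above; comparing two curves would use the log-rank test, as in the study.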
Disclosure: Stefan Gravenstein: Received consulting and speaker fees from most vaccine manufacturers (Sanofi, Seqirus, Moderna, Merck, Janssen, Pfizer, Novavax, GSK) and has received or expects to receive grant funding from several (Sanofi, Seqirus, Moderna, Pfizer, GSK). Lona Mody: NIH, VA, CDC, Kahn Foundation; Honoraria: UpToDate; Contracted Research: Nano-Vibronix
Social connection is associated with better health, including reduced risk of dementia. Personality traits are also linked to cognitive outcomes; neuroticism is associated with increased risk of dementia. Personality traits and social connection are also associated with each other. Taken together, evidence suggests the potential impacts of neuroticism and social connection on cognitive outcomes may be linked. However, very few studies have simultaneously examined the relationships between personality, social connection and health.
Research objective:
We tested the association between neuroticism and cognitive measures while exploring the potential mediating roles of aspects of social connection (loneliness and social isolation).
Method:
We conducted a cross-sectional study with a secondary analysis of the Canadian Longitudinal Study on Aging (CLSA) Comprehensive Cohort, a sample of Canadians aged 45 to 85 years at baseline. We used only self-reported data collected at the first follow-up, between 2015 and 2018 (n = 27,765). We used structural equation modelling to assess the association between neuroticism (exposure) and six cognitive measures (Rey Auditory Verbal Learning Test immediate recall and delayed recall, Animal Fluency Test, Mental Alternation Test, Controlled Oral Word Association Test and Stroop Test interference ratio), with direct and indirect effects (through social isolation and loneliness). We included age, education and hearing in the models and stratified all analyses by sex: females (n = 14,133) and males (n = 13,632).
Preliminary results of the ongoing study:
We found positive, statistically significant associations between neuroticism and social isolation (p<0.05) and loneliness (p<0.05), for both males and females. We also found inverse, statistically significant associations between neuroticism and all cognitive measures (p<0.05), except the Stroop Test interference ratio. In these models, there was consistent evidence of indirect effects (through social isolation and loneliness) and, in some cases, evidence of direct effects. We found sex differences in the model results.
Conclusion:
Our findings suggest that the association between neuroticism and cognitive outcomes may be mediated by aspects of social connection and differ by sex. Understanding if and how modifiable risk factors mediate the association between personality and cognitive outcomes would help develop and target intervention strategies that improve social connection and brain health.
Depression has a well-established negative effect on cognitive functioning. Variations in the apolipoprotein E (APOE) and brain-derived neurotrophic factor (BDNF) genes likely contribute to this relationship. APOE4 and the BDNF Val66Met polymorphism are independently associated with late-life depression and cognitive dysfunction. The current study investigated the moderating effects of APOE4 and BDNFMet (i.e., the presence of the BDNF Val66Met polymorphism) on the relationship between depression and cognitive functioning in older adults.
Participants and Methods:
The sample included 103 older adults drawn from two clinical trials who were recruited from the VA Palo Alto Health Care System (VAPAHCS) and the Stanford/VA Alzheimer’s Disease Center. Depression was diagnosed using the Mini Neuropsychiatric Interview for the Diagnostic and Statistical Manual of Mental Disorders-IV (DSM-IV). The presence of APOE4 and BDNFMet alleles was dichotomized (i.e., yes/no) and determined from blood samples obtained via venipuncture. A comprehensive neuropsychological battery was used to assess attention (RAVLT Trial 1, WAIS-IV DSF), processing speed (TMTA, SDMT, Stroop Word, Stroop Color), working memory (WAIS-IV DSB, DSS), visuospatial functioning (JLO), language (VNT), memory (RAVLT Delayed Recall, WMS-IV Logical Memory II), and executive function (TMTB, Stroop Color-Word). Separate moderation analyses were conducted with depression as the predictor and APOE4 or BDNFMet status as the moderator using the SPSS PROCESS macro v4.0. Age was a covariate for models with processing speed, memory, language, and executive function as outcome variables.
Results:
Participants were largely male (93%) and White (75%). Ten percent met criteria for depression, 26% were APOE4 carriers, and 32% were BDNFMet carriers. The overall model examining depression, APOE4, and memory was significant (p < .01, R2 = .14). Depression was associated with lower memory performance (p < .05); however, APOE4 was not a significant moderator (p > .05). Similarly, the overall model examining depression, APOE4, and language was also significant (p < .05, R2 = .10). While the direct effects of depression and APOE4 on language were nonsignificant (p > .05), there was a significant two-way interaction between APOE4 and depression (p = .03). The overall model with depression, BDNFMet, and memory was significant (p < .001, R2 = .18). While neither depression nor BDNFMet had significant direct effects on memory (p > .05), a two-way interaction emerged between depression and BDNFMet (p = .05). Simple slopes analyses were used to further investigate significant interactions. Depression, APOE4, and BDNFMet did not significantly impact attention, processing speed, working memory, visuospatial functioning, or executive function, and no significant interactions were noted among variables. BDNFMet had no direct impact on language.
Conclusions:
APOE4 and BDNFMet were found to differentially moderate the relationship between depression and cognition. Specifically, APOE4 carriers with depression had worse language performance than those who were healthy, depressed only, or APOE4 carriers without depression. BDNFMet carriers with depression performed worse on measures of memory than those who were healthy, depressed only, or BDNFMet carriers without depression. The treatment of depression in APOE4 and BDNFMet carriers may reduce associated cognitive impairments. Limitations and future implications are also discussed.
Methamphetamine and cannabis are two widely used substances with possibly opposing effects on aspects of central nervous system functioning. Use of these substances is prevalent among people with HIV (PWH), though their combined effects on HIV-associated neurocognitive impairment (NCI) are unknown. Adverse effects of methamphetamine use on cognition are well documented. Cannabis may disturb cognition acutely, though its longer-term effects in PWH are not well understood. Our prior analysis of people without HIV (PWoH) found that cotemporaneous cannabis use was associated with better neurocognitive outcomes among methamphetamine users. The aim of this study was to assess how lifetime cannabis and methamphetamine use disorder relate to neurocognitive outcomes in PWH.
Participants and Methods:
HIV-positive participants (n=472) were on average 45.6±11.5 years of age, male (86.4%), White (60.6%), and had 13.9±2.5 years of education. Most participants were on ART (81.9%) and virally suppressed (70%). Participants were stratified by lifetime methamphetamine (M-/M+) and cannabis (C-/C+) DSM-IV abuse/dependence disorder into four groups: M-C- (n=187), M-C+ (n=68), M+C- (n=82), and M+C+ (n=135), and completed a comprehensive neurobehavioral assessment. Demographically corrected T-scores and deficit scores were used for analyses. Group differences in global and domain NC performances (i.e., T-scores) were examined using multiple linear regression, holding constant covariates that were associated with study groups and/or cognition. Specifically, M+ participants displayed higher rates of Hepatitis C infection (p=.004), higher current depressive symptom scores (p<.001), and higher rates of detectable plasma HIV RNA (p=.014). Multiple logistic regression was used to test for group differences in probability of neurocognitive impairment (i.e., deficit scores>0.5), including the same covariates. Pooling data with a sample of HIV-negative participants (n=423), we used generalized linear mixed effect models to examine how neurocognitive performance and impairment profiles varied by methamphetamine and/or cannabis use group, HIV disease characteristics, and their interactions.
Results:
Compared to M+C+, M+C- performed worse on measures of executive functions (β=-3.17), learning (β=-3.95), memory (β=-5.58), and working memory (β=-4.05) and were more likely to be classified as impaired in the learning (OR=2.93), memory (OR=5.24), and working memory (OR=2.48) domains. M-C- performed better than M+C+ on measures of learning (β=3.46) and memory (β=5.19), but worse than M-C+ on measures of executive functions (β=-3.90), learning (β=-3.32), memory (β=-3.38), and working memory (β=-3.38). Generalized linear mixed effect models indicate that detectable plasma HIV RNA (β=-1.85) and low nadir CD4 T-cell counts (nadir CD4<200; β=-1.07) were associated with worse neurocognitive performance, and these effects did not differ in size or direction by substance use group.
Conclusions:
In PWH, lifetime methamphetamine use disorder and both current and legacy markers of HIV disease severity are associated with worse neurocognitive outcomes. Cannabis use disorder does not appear to exacerbate methamphetamine-related deficits in PWH. Instead, results are consistent with findings from preclinical studies that cannabis use may protect against methamphetamine’s deleterious effects. Profile analysis models showed that participants with a history of cannabis use disorder display better overall neurocognitive performance than comparison (M-C-) participants. Mechanisms underlying a potential protective effect of cannabis may be elucidated by examining the temporal relationship between cannabis and methamphetamine consumption and neurocognitive performance.
Methamphetamine and cannabis are two widely used, and frequently co-used, substances with possibly opposing effects on the central nervous system. Evidence of neurocognitive deficits related to use is robust for methamphetamine and mixed for cannabis. Findings regarding their combined use are inconclusive. We aimed to compare neurocognitive performance in people with lifetime cannabis or methamphetamine use disorder diagnoses, or both, relative to people without substance use disorders.
Method:
A total of 423 participants (71.9% male, aged 44.6 ± 14.2 years), stratified by the presence or absence of lifetime methamphetamine (M−/M+) and/or cannabis (C−/C+) DSM-IV abuse/dependence, completed a comprehensive neuropsychological, substance use, and psychiatric assessment. Neurocognitive domain T-scores and impairment rates were examined using multiple linear and binomial regression, respectively, controlling for covariates that may impact cognition.
Results:
Globally, M+C+ performed worse than M−C− but better than M+C−. M+C+ outperformed M+C− on measures of verbal fluency, information processing speed, learning, memory, and working memory. M−C+ did not display lower performance than M−C− globally or on any domain measures, and M−C+ even performed better than M−C− on measures of learning, memory, and working memory.
Conclusions:
Our findings are consistent with prior work showing that methamphetamine use confers risk for worse neurocognitive outcomes, and that cannabis use does not appear to exacerbate and may even reduce this risk. People with a history of cannabis use disorder performed similarly to our non-substance-using comparison group and outperformed them in some domains. These findings warrant further investigation into whether cannabis use may ameliorate methamphetamine neurotoxicity.
Background: Long-term care facility (LTCF) employees pose a potential risk for COVID-19 outbreaks. The association between employee infection prevention (IP) adherence and facility COVID-19 outbreaks remains a knowledge gap. Methods: From April through December 2020, prior to COVID-19 vaccination, we tested asymptomatic Veterans Affairs (VA) community living center (CLC) residents twice weekly and employees monthly (increasing to weekly with known exposure) for SARS-CoV-2 via nasopharyngeal PCR. Employees voluntarily completed multiple-choice questionnaires assessing self-reported IP adherence at and outside work. Surveys were administered longitudinally in April, June, July, and October 2020. Changes in paired employee responses for each period were analyzed using the McNemar test. We obtained COVID-19 community rates for the surrounding Davidson and Rutherford counties from the Tennessee Department of Health public data set. CLC resident COVID-19 cases were obtained from VA IP data. Incidence rates and numbers of positive tests were calculated. Results: Between April and December 2020, 444 employees completed at least 1 survey; 177 completed surveys in both April and June, 179 in both June and July, and 140 in both July and October (Fig. 1). Across periods, employee surveys demonstrated an increase in masking at work and outside work between April and June (63% to 95% [P < .01] and 36% to 63% [P < .01], respectively) and between June and July (95% to 99% [P < .05] and 71% to 84% [P < .01], respectively); both increases were maintained between July and October (Fig. 2). Distancing at work and limiting social contacts outside work decreased significantly from April to June but increased in subsequent periods, although not significantly. COVID-19 community incidence peaked in July and again in December, whereas CLC resident COVID-19 cases peaked in August, declined, and remained low through December (Fig. 3).
Discussion: Mask wearing at work, which was mandatory, increased, as did voluntary employee masking outside work. CLC COVID-19 cases mirrored community increases in July and August; however, community cases increased again later in 2020 while CLC cases remained low. The proportion of employees reporting distancing at work and limiting social contacts outside work decreased before the initial rise in CLC cases but increased and remained high after July. Conclusions: These data from the pre–COVID-19 vaccination era suggest that widespread, increased support for and emphasis on LTCF IP adherence, especially masking, may have effectively prevented COVID-19 outbreaks in the vulnerable LTCF population.
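The paired-survey comparisons above rely on the McNemar test, which uses only the discordant pairs (employees who changed their answer between two periods). As an illustrative sketch, not part of the study's analysis, the exact two-sided version can be computed from those two counts alone; the counts below are hypothetical.

```python
from math import comb

def mcnemar_exact(b: int, c: int) -> float:
    """Exact two-sided McNemar test p-value.

    b = pairs adherent in period 1 only; c = pairs adherent in
    period 2 only. Under the null, discordant pairs split 50/50,
    so the p-value is a two-sided exact binomial tail probability.
    """
    n = b + c
    k = min(b, c)
    tail = sum(comb(n, i) for i in range(k + 1)) / 2**n
    return min(2 * tail, 1.0)

# Hypothetical example: among paired respondents, 5 masked in the
# first period only and 40 in the second period only -> a highly
# significant increase in adherence.
p_value = mcnemar_exact(5, 40)
print(p_value < 0.01)  # True
```

With balanced discordant counts (e.g., 10 and 10) the statistic gives p = 1.0, reflecting no net change, which is why large one-directional shifts such as the masking increases reach significance.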
Bloodstream infections (BSIs) are a frequent cause of morbidity in patients with acute myeloid leukemia (AML), due in part to the presence of central venous access devices (CVADs) required to deliver therapy.
Objective:
To determine the differential risk of bacterial BSI during neutropenia by CVAD type in pediatric patients with AML.
Methods:
We performed a secondary analysis in a cohort of 560 pediatric patients (1,828 chemotherapy courses) receiving frontline AML chemotherapy at 17 US centers. The exposure was CVAD type at course start: tunneled externalized catheter (TEC), peripherally inserted central catheter (PICC), or totally implanted catheter (TIC). The primary outcome was course-specific incident bacterial BSI; secondary outcomes included mucosal barrier injury (MBI)-BSI and non-MBI BSI. Poisson regression was used to compute adjusted rate ratios comparing BSI occurrence during neutropenia by line type, controlling for demographic, clinical, and hospital-level characteristics.
Results:
The rate of BSI did not differ by CVAD type: 11 BSIs per 1,000 neutropenic days for TECs, 13.7 for PICCs, and 10.7 for TICs. After adjustment, there was no statistically significant association between CVAD type and BSI: PICC incidence rate ratio [IRR] = 1.00 (95% confidence interval [CI], 0.75–1.32) and TIC IRR = 0.83 (95% CI, 0.49–1.41) compared to TEC. When MBI-BSI and non-MBI BSI were examined separately, results were similar.
Conclusions:
In this large, multicenter cohort of pediatric AML patients, we found no difference in the rate of BSI during neutropenia by CVAD type. This may be due to a risk profile for BSI that is unique to AML patients.
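The abstract reports adjusted IRRs from Poisson regression. As a hedged illustration of what an IRR is, the crude (unadjusted) version is simply a ratio of incidence rates, with a Wald confidence interval on the log scale. The event counts below are hypothetical, chosen only to reproduce the reported crude rates of 11 and 13.7 BSIs per 1,000 neutropenic days; the study's IRRs are adjusted, so they need not match these crude values.

```python
from math import exp, log, sqrt

def crude_irr(events_exp: int, days_exp: float,
              events_ref: int, days_ref: float, z: float = 1.96):
    """Crude incidence rate ratio with a Wald 95% CI on the log scale."""
    irr = (events_exp / days_exp) / (events_ref / days_ref)
    # Approximate standard error of log(IRR) for Poisson counts.
    se = sqrt(1 / events_exp + 1 / events_ref)
    return irr, exp(log(irr) - z * se), exp(log(irr) + z * se)

# Hypothetical counts over 10,000 neutropenic days per group:
# 137 BSIs for PICCs (13.7/1,000 days) vs 110 for TECs (11/1,000 days).
irr, lo, hi = crude_irr(137, 10_000, 110, 10_000)
print(round(irr, 2))  # 1.25
```

An adjusted analysis like the study's would instead fit a Poisson model with covariates and an offset for log person-time, which can pull a crude ratio like this toward the null (the reported adjusted PICC IRR was 1.00).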
Objective:
Understand how the built environment can affect safety and efficiency outcomes during doffing of personal protective equipment (PPE) in the context of coronavirus disease 2019 (COVID-19) patient care.
Study design:
We conducted (1) field observations and surveys administered to healthcare workers (HCWs) performing PPE doffing, (2) focus groups with HCWs and infection prevention experts, and (3) a design charrette with healthcare design experts.
Settings:
This study was conducted in 4 inpatient units treating patients with COVID-19, in 3 hospitals of a single healthcare system.
Participants:
The study included 24 nurses, 2 physicians, 1 respiratory therapist, and 2 infection preventionists.
Results:
The doffing task sequence and the layout of doffing spaces varied considerably across sites, with field observations showing most doffing tasks occurring around the patient room door and PPE support stations. Behaviors perceived as most risky included touching contaminated items and inadequate hand hygiene. Doffing space layout and types of PPE storage and work surfaces were often associated with inadequate cleaning and improper storage of PPE. Focus groups and the design charrette provided insights on how design affording standardization, accessibility, and flexibility can support PPE doffing safety and efficiency in this context.
Conclusions:
There is a need to define, organize, and standardize PPE doffing spaces in healthcare settings and to understand the environmental implications of COVID-19–specific issues related to supply shortages and staff workload. Low-effort, low-cost adaptations of the layout and design of PPE doffing spaces may improve HCW safety and efficiency in existing healthcare facilities.
The rapid growth in web-based grocery food purchasing has outpaced federal regulatory attention to the online provision of nutrition and allergen information historically required on food product labels. We sought to characterise the extent and variability with which online retailers disclose required and regulated information and to identify the legal authorities under which the federal government could require online food retailers to disclose such information.
Design:
We performed a limited scan of ten products across nine national online retailers and conducted legal research using LexisNexis to analyse federal regulatory agencies’ authorities.
Setting:
USA.
Participants:
N/A.
Results:
The scan of products revealed that required information (Nutrition Facts Panels, ingredient lists, common food allergens and per cent juice for fruit drinks) was present, conspicuous and legible for an average of only 36·5 % of the products surveyed, ranging from 11·4 % for potential allergens to 54·2 % for ingredients lists. More commonly, voluntary nutrition-related claims were prominently and conspicuously displayed (63·5 % across retailers and products). Our legal examination found that the Food and Drug Administration, Federal Trade Commission and United States Department of Agriculture have existing regulatory authority over labelling, online sales and advertising, and Supplemental Nutrition Assistance Programme retailers that can be utilised to address deficiencies in the provision of required information in the online food retail environment.
Conclusions:
Information regularly provided to consumers in conventional settings is not being uniformly provided online. Congress or the federal agencies can require that online food retailers disclose required nutrition and allergen information to support health, nutrition, equity, and informed consumer decision-making.
Jennifer Morgan describes how civil society worked to secure a fair and ambitious multilateral agreement as seen from her perspective as Global Director of the Climate Program at the World Resources Institute at the time. In her view, civil society had four different roles and functions: civil society (1) gathered together idea generators, analysts, and researchers ahead of time, (2) provided informal intelligence and diplomatic service combined with high-trust networks, (3) brought the voices of people into the negotiations, and (4) explained COP 21’s complexities and alerted the outside world. Morgan concludes that no single function or role of civil society made the difference in Paris, but the combination of them all did so. She highlights the “bursting of the UNFCCC bubble” as a main function and assesses that the 1.5°C goal came into the Paris Agreement as a result of peoples’ voices being heard and listened to. Without the NGO contribution, it would have been much more difficult, perhaps impossible, to sow the seeds of change. Morgan suggests that civil society can use its principled and objective-based approach to its advantage and listen to allies in its movement.