Cults have captivated the public imagination, gained visibility in the media, and become a popular topic of discourse. While anecdotal and journalistic accounts offer compelling insights, systematic research on the structure of these groups, the psychological predispositions of their members, and their relevance to clinical and legal settings is comparatively scarce. This disparity highlights a crucial need for rigorous scholarly inquiry that moves beyond media portrayals to uncover the foundational mechanisms that sustain and shape these enigmatic groups. Authored by experts in forensic psychiatry and psychology, this book consolidates the extant literature in reviewing the theoretical, sociocultural, clinical, and forensic issues surrounding cultist groups. It applies evidence-based study to identify group subtypes and to explore mediators and moderators that may be relevant in clinical and legal contexts. The authors address these issues as they relate to a variety of subpopulations, comorbid mental disorders, mind-altering substances, treatment, and the legal implications inherent to cults and persuasive leadership. This book may be especially pertinent to mental health professionals and those working in the criminal justice system.
Seeking a “21st-century way of working” that promotes increased labor productivity and “diverse work styles,” Japan's Ministry of Health, Labour and Welfare (MHLW) has issued a report recommending major changes to work-hour regulations. The centerpiece of the proposal is an exemption from overtime pay for white-collar workers.
This article provides an overview of individuals with schizophrenia who become unhoused and explores current approaches to managing this severe illness in people who often do not want care or do not believe they need it. Individuals with schizophrenia who are unhoused face numerous adverse consequences, including premature mortality and increased rates of suicide. There is a dearth of research evidence demonstrating the efficacy of the Housing First (HF) model and the harm reduction approach in decreasing psychotic symptoms in individuals with schizophrenia. Ensuring medication adherence in individuals with psychosis, both housed and unhoused, is important to prevent prolonged periods of untreated psychosis and chronic deterioration.
Prime Minister Abe Shinzō has made Work Style Reform (hatarakikata kaikaku) part of his core policy agenda, promising above all to remedy the Japanese way of work's two greatest problems: dangerously long work hours and grossly unequal wage gaps between regular and non-regular workers. However, critics charge that the proposals will likely aggravate these problems, given that labor policymaking is dominated by conservative business and political leaders bent on deregulation. This paper examines the current Work Style Reform proposals, explaining how the work hour reduction and equal pay for equal work proposals are being promoted to the public, and why they ultimately fail as reforms from the worker's point of view. Despite these serious problems, the government's effective marketing has helped to defuse potential resistance, and the reform plans may become law in 2018.
Coastal wetlands are hotspots of carbon sequestration, and their conservation and restoration can help to mitigate climate change. However, there remains uncertainty about when and where coastal wetland restoration can most effectively act as a natural climate solution (NCS). Here, we synthesize current understanding to illustrate the requirements for coastal wetland restoration to benefit climate, and discuss potential paths forward that address key uncertainties impeding implementation. To be effective as NCS, coastal wetland restoration projects must accrue climate cooling benefits that would not occur without management action (additionality), must be implementable (feasibility), and must persist over management-relevant timeframes (permanence). Several issues make it uncertain whether these minimum requirements are met. First, coastal wetlands serve as both a landscape source and sink of carbon for other habitats, increasing uncertainty in additionality. Second, coastal wetlands can potentially migrate outside of project footprints as they respond to sea-level rise, increasing uncertainty in permanence. To address these first two issues, a system-wide approach may be necessary, rather than basing cooling benefits only on changes that occur within project boundaries. Third, because NCS must function over management-relevant decadal timescales, methane responses may need to be included in coastal wetland restoration planning and monitoring. Finally, there is uncertainty about how much data are required to justify restoration action. We summarize the minimum data required to make a binary decision on whether there is a net cooling benefit from a management action, noting that these data are more readily available than the data required to quantify the magnitude of cooling benefits for carbon crediting purposes.
Reducing these uncertainties will allow coastal wetland restoration to be implemented at the scale required to contribute significantly to addressing the current climate crisis.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. 
Given the constraints that weeds and invasive plants impose on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including the sustainability of agriculture and natural resources, economic resilience and reliability, and societal health and well-being.
Congenital heart disease (CHD) care is resource-intensive. Unwarranted variation in care may increase cost and result in poorer health outcomes. We hypothesise that process variation exists within the pre-operative evaluation and planning process for children undergoing repair of atrial septal defect or ventricular septal defect, and that substantial variation occurs in a small number of care points.
Methods:
From interviews with staff of an integrated congenital heart centre, an initial process map was constructed. A retrospective chart review of patients with isolated surgical atrial septal defect and ventricular septal defect repair from 7/1/2018 through 11/1/2020 informed revisions of the process map. The map was assessed for points of consistency and variability.
Results:
Thirty-two surgical atrial septal defect/ventricular septal defect repair patients were identified. Ten (31%) were reviewed by interventional cardiology before surgical review. Of these, 6 (60%) had a failed catheter-based closure and 4 (40%) were deemed inappropriate for catheter-based closure. Thirty (94%) were reviewed in case conference, all attended surgical clinic, and none were admitted prior to surgery. The process map from interviews alone identified surgery rescheduling as a point of major variability; however, chart review revealed this was not as prominent a source of variability as pre-operative interventional cardiology review.
Conclusions:
Significant variation in the pre-operative evaluation and planning process for surgical atrial septal defect/ventricular septal defect patients was identified. If such process variation is widespread throughout CHD care, it may contribute to the variations in outcome and cost previously documented within CHD surgery. Future research will focus on determining whether this variation is warranted or unwarranted, and on the health outcomes and cost variation attributable to these differences in care processes.
In difficult-to-treat depression (DTD) the outcome metrics historically used to evaluate treatment effectiveness may be suboptimal. Metrics based on remission status and on single end-point (SEP) assessment may be problematic given infrequent symptom remission, temporal instability, and poor durability of benefit in DTD.
Methods
Self-report and clinician assessment of depression symptom severity were regularly obtained over a 2-year period in a chronic and highly treatment-resistant registry sample (N = 406) receiving treatment as usual, with or without vagus nerve stimulation. Twenty alternative metrics for characterizing symptomatic improvement were evaluated, contrasting SEP metrics with integrative (INT) metrics that aggregated information over time. Metrics were compared in effect size and discriminating power when contrasting groups that did (N = 153) and did not (N = 253) achieve a threshold level of improvement in end-point quality-of-life (QoL) scores, and in their association with continuous QoL scores.
Results
Metrics based on remission status had smaller effect size and poorer discrimination of the binary QoL outcome and weaker associations with the continuous end-point QoL scores than metrics based on partial response or response. The metrics with the strongest performance characteristics were the SEP measure of percentage change in symptom severity and the INT metric quantifying the proportion of the observation period in partial response or better. Both metrics contributed independent variance when predicting end-point QoL scores.
Conclusions
Revision is needed in the metrics used to quantify symptomatic change in DTD with consideration of INT time-based measures as primary or secondary outcomes. Metrics based on remission status may not be useful.
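The integrative (INT) metric described above, the proportion of the observation period spent in partial response or better, can be sketched as follows. This is an illustration only: the function name is hypothetical, and the ≥25% improvement threshold for partial response is an assumed convention, as the registry's exact definition is not given here.

```python
# Sketch of an integrative (INT) outcome metric: the fraction of follow-up
# assessments meeting at least partial response. The 25% improvement threshold
# is an assumption for illustration, not taken from the registry protocol.

def proportion_in_partial_response(baseline, scores, threshold=0.25):
    """Fraction of assessments showing >= `threshold` improvement from baseline.

    `baseline` and `scores` are symptom-severity ratings (lower = better).
    """
    improved = [s for s in scores if (baseline - s) / baseline >= threshold]
    return len(improved) / len(scores)

# Hypothetical severity scores at five follow-up assessments (baseline = 28):
# improvements are 7%, 29%, 36%, 14%, and 50%, so 3 of 5 meet the threshold.
print(proportion_in_partial_response(28, [26, 20, 18, 24, 14]))  # 0.6
```

Unlike a single end-point measure, this aggregate rewards sustained improvement across the whole observation window, which is the property the abstract argues matters in DTD.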
Approximately one-third of individuals in a major depressive episode will not achieve sustained remission despite multiple, well-delivered treatments. These patients experience prolonged suffering and disproportionately utilize mental and general health care resources. The recently proposed clinical heuristic of ‘difficult-to-treat depression’ (DTD) aims to broaden our understanding and focus attention on the identification, clinical management, treatment selection, and outcomes of such individuals. Clinical trial methodologies developed to detect short-term therapeutic effects in treatment-responsive populations may not be appropriate in DTD. This report reviews three essential challenges for clinical intervention research in DTD: (1) how to define and subtype this heterogeneous group of patients; (2) how, when, and by what methods to select, acquire, compile, and interpret clinically meaningful outcome metrics; and (3) how to choose among alternative clinical trial design options to promote causal inference and generalizability. The boundaries of DTD are uncertain, and an evidence-based taxonomy and reliable assessment tools are preconditions for clinical research and subtyping. Traditional outcome metrics in treatment-responsive depression may not apply to DTD, as they largely reflect only short-term symptomatic change and do not incorporate durability of benefit, side effect burden, or sustained impact on quality of life or daily function. Trial methodology will also require modification, as trials will likely be of longer duration to examine sustained impact, raising complex issues regarding control group selection, blinding and its integrity, and concomitant treatments.
Excited delirium, which has been defined as combativeness, agitation, and altered sensorium, requires immediate treatment in prehospital or emergency department (ED) settings for the safety of both patients and caregivers. Prehospital ketamine use is prevalent, although the evidence on safety and efficacy is limited. Many patients with excited delirium are intoxicated with illicit substances. This investigation explores whether patients treated with prehospital ketamine for excited delirium with concomitant substance intoxication have higher rates of subsequent intubation in the ED compared to those without confirmed substance usage.
Methods:
Over 28 months at two large community hospitals, medical records were retrospectively searched for all patients aged 18 years or older who received prehospital intramuscular (IM) ketamine for excited delirium, and illicit and prescription substance co-ingestions were identified. Trained abstractors collected demographic characteristics, history of present illness (HPI), urine drug screens (UDS), alcohol levels, and any additional sedative administrations. Substance intoxication was determined by UDS and alcohol positivity or negativity, as well as the physician's HPI. Patients without toxicological testing or documentation of substance intoxication, or who may have tested positive due to ED sedation, were excluded from the relevant analyses. Subsequent ED intubation was the primary pre-specified outcome. Odds ratios (OR) and 95% confidence intervals (CI) were calculated to compare variables.
Results:
Among 86 patients given prehospital IM ketamine for excited delirium, baseline characteristics including age, ketamine dose, and body mass index were similar between those who did or did not undergo intubation. Men had higher intubation rates. Patients testing positive for alcohol, amphetamines, barbiturates, benzodiazepines, ecstasy, marijuana, opiates, and synthetic cathinones (both bath salts and flakka) had similar rates of intubation compared to those negative for these substances. Of 27 patients with excited delirium and concomitant cocaine intoxication, nine (33%) were intubated, compared with four of 50 (8%) without cocaine intoxication, yielding an OR of 5.75 (95% CI, 1.57 to 21.05; P = .009).
Conclusion:
Patients treated with IM ketamine for excited delirium with concomitant cocaine intoxication had a statistically significant 5.75-fold increased rate of subsequent intubation in the ED. Among the other substances examined, no trends with intubation were noted, but further study is warranted.
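The reported odds ratio can be reproduced from the counts in the abstract (9 of 27 cocaine-positive patients intubated vs. 4 of 50 cocaine-negative patients), using a standard Wald interval on the log-odds scale; variable names here are illustrative.

```python
import math

# 2x2 counts from the abstract:
a, b = 9, 27 - 9    # cocaine-positive: intubated, not intubated
c, d = 4, 50 - 4    # cocaine-negative: intubated, not intubated

odds_ratio = (a * d) / (b * c)

# Wald 95% confidence interval on the log-odds scale
se = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se)
hi = math.exp(math.log(odds_ratio) + 1.96 * se)

print(f"OR = {odds_ratio:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
# OR = 5.75, 95% CI (1.57, 21.05)
```

The result matches the reported OR of 5.75 (95% CI, 1.57 to 21.05), suggesting the authors used this standard large-sample method.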
Herbicide-resistant (HR) kochia is a growing problem in the Great Plains region of Canada and the United States. Resistance to up to four herbicide sites of action, including photosystem II inhibitors, acetolactate synthase inhibitors, synthetic auxins, and the 5-enolpyruvylshikimate-3-phosphate synthase inhibitor glyphosate, has been reported in many areas of this region. Despite being present in the United States since 1993/1994, auxinic-HR kochia is a recent and growing phenomenon in Canada. This study was designed to characterize 1) the level of resistance and 2) patterns of cross-resistance to dicamba and fluroxypyr in 12 putative auxinic-HR kochia populations from western Canada. The incidence of dicamba-resistant individuals ranged among populations from 0% to 85%, while that of fluroxypyr-resistant individuals ranged from 0% to 45%. In whole-plant dose-response bioassays, the populations exhibited up to 6.5-fold resistance to dicamba and up to 51.5-fold resistance to fluroxypyr based on visible injury 28 d after application. Based on plant survival estimates, the populations exhibited up to 3.7-fold resistance to dicamba and up to 72.5-fold resistance to fluroxypyr. Multiple patterns of synthetic auxin resistance were observed, in which one population from Cypress County, Alberta, was resistant to dicamba but not fluroxypyr, whereas another from Rocky View County, Alberta, was resistant to fluroxypyr but not dicamba, based on single-dose population screening and dose-response bioassays. These results suggest that multiple mechanisms may confer resistance to dicamba and/or fluroxypyr in Canadian kochia populations. Further research is warranted to determine these mechanisms. Farmers are urged to adopt proactive nonchemical weed management practices in an effort to preserve the efficacy of the remaining herbicide options available for control of HR kochia.
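The fold-resistance levels above come from resistance ratios (R/S): the dose producing a fixed effect (e.g. the ED50, the dose causing 50% injury or mortality) in a resistant population divided by the dose producing the same effect in a susceptible reference. A minimal sketch, with hypothetical ED50 values chosen only so the ratio matches the 6.5-fold dicamba level reported:

```python
# Resistance ratio (R/S) from dose-response bioassays. ED50 values below are
# hypothetical placeholders, not data from the study.

def resistance_ratio(ed50_resistant, ed50_susceptible):
    """Fold-level of resistance relative to a susceptible reference population."""
    return ed50_resistant / ed50_susceptible

# e.g. a hypothetical resistant population needing 780 g ae/ha for 50% injury
# vs. a susceptible reference needing 120 g ae/ha:
print(resistance_ratio(780.0, 120.0))  # 6.5
```

In practice the ED50s themselves are estimated by fitting a dose-response model (commonly log-logistic) to whole-plant injury or survival data, which is why the abstract reports separate fold levels for visible injury and survival.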