Archaeological sediments can be used to retrieve evidence for parasites that infected past populations, providing insight into disease, diet, sanitation, and migration in the past. To increase our understanding of parasite infections in Roman Britain and determine which parasites may have infected people living at Vindolanda, sediment samples were collected from a drain connected to a latrine at the bath complex of Vindolanda. These samples were used to look for preserved parasite eggs and cysts deposited in the drain with the faeces of people who used the latrine. Microscopic analysis was used to identify eggs of helminths, and enzyme-linked immunosorbent assay (ELISA) was used to look for protozoan parasites that can cause severe diarrhoea. Eggs of Ascaris sp. (roundworm) and Trichuris sp. (whipworm) were found by microscopy, and Giardia duodenalis was detected using ELISA. All of these parasites are transmitted by the faecal-oral route, usually through contaminated food and water. This is the first evidence for G. duodenalis in Roman Britain. A range of zoonotic and faecal-oral parasites have been found at other sites in Roman Britain, yet the drain studied at Vindolanda contained only faecal-oral parasites that can be transmitted directly between humans. This predominance of faecal-oral parasites is similar to a pattern found in large urban sites in the Roman Mediterranean and other military sites in the empire. In contrast, larger urban centres in Roman Britain, such as London and York, appear to have a more diverse range of parasites.
Recent corporate scandals and excessively egotistical behavior on the part of organizational leaders underscore the need for industrial and organizational (I-O) psychology and human resource (HR) scholars and practitioners to critically examine how organizational systems and practices can stimulate leader narcissism. Whereas most organizational scholarship considers leader narcissism to be a stable input that influences important organizational outcomes, we challenge organizational scholars and practitioners to further inspect how organizational practices may either stimulate or suppress leader narcissism. We focus on HR practices as one specific set of organizational practices within the area of expertise of I-O psychologists and HR professionals. Drawing on self-categorization theory, we argue that highly personalizing HR practices (e.g., hypercompetitive leader selection, high-potential programs, elevated leader pay) can encourage leaders to define themselves in terms of a “special” personal identity in ways that set them apart from the broader collective within organizations and in turn facilitate leader narcissism. In contrast, we argue that depersonalizing HR practices (e.g., rotational leader selection, inclusive developmental programs, interdependent rewards) can encourage leaders to act in group-oriented ways that benefit the interests of others in an organization—and beyond. We call on organizational scholars and practitioners to consider more carefully how HR practices—often designed with the goal of cultivating leadership potential—may unintentionally reinforce leader narcissism. With this analysis, we hope to stimulate research in this area and offer insights to shape HR policies and practices in ways that discourage destructive forms of leader narcissism.
The district of Realejo in Granada, Spain, was a renowned centre for the production of fine silk cloth from the medieval period onwards. During the excavation of a building on the south side of the square of Campo del Principe, two cesspits were identified that dated to the 17th–18th centuries. Historical evidence suggests this building might have been associated with the guild of silk workers, or might have been a residential property. Samples of sediment from each cesspit were taken at the time of excavation. Optical microscopy identified the eggs of Ascaris sp. (roundworm), Trichuris sp. (whipworm), probable Fasciola sp., Spirometra sp. and Capillaria sp. The presence of Ascaris and Trichuris likely reflects infection of the population by these helminths, and indicates ineffective sanitation. However, the eggs of Fasciola, Spirometra and Capillaria are more likely to reflect infection of animals rather than humans. The eggs could have been deposited in the cesspit if humans ate the organs of infected herbivores (Fasciola), if the faeces of companion animals such as cats or dogs were discarded in the cesspits (Spirometra), or if rodents defecated inside the cesspits as they explored the waste discarded there (Capillaria). While we cannot be sure whether those who used these toilets were involved in silk manufacturing, merchants who traded in silk, or other members of society, the pattern of parasite species recovered helps provide a vivid picture of the lives of the people who lived and worked in the silk district of Granada 300–400 years ago.
While the interaction between humans and their parasites is well studied today, taking a long view of infection throughout human evolution helps to place the current picture in context and identify trends in infection over time. After considering how early technologies may have facilitated the transmission of parasites to humans, we examine the association between humans and parasites through time using archaeological and genetic evidence. Techniques such as microscopy, immunoenzymatic assays and DNA analysis have identified a range of protozoa, helminths and ectoparasites in our ancestors. Evidence is discussed for the origins and impact upon societies through time for protozoa causing malaria, leishmaniasis, Chagas disease and diarrhoeal illnesses, helminths such as schistosomiasis, soil-transmitted helminths, Taenia tapeworms, fish tapeworms and liver flukes, and ectoparasites such as fleas, body lice and pubic lice. Prevalence studies show widespread infection for some parasites, such as 36% with falciparum malaria in ancient Egypt, and 40% with Chagas disease in prehistoric Peru and northern Chile. Humans have been responsible for the inadvertent spread of a range of parasites around the world, ranging from African heirloom parasites with early human migrations to the introduction of malaria and schistosomiasis to the Americas with the transatlantic slave trade in the 1600s–1800s. It is clear that the epidemics due to bacterial pathogens spread by ectoparasites since the Bronze Age must have had major impacts upon past societies, particularly for bubonic plague and epidemic typhus.
People with severe mental illness (SMI) die 10–20 years earlier than the general population. This is largely due to non-communicable diseases (NCDs) such as hypertension, diabetes and hypercholesterolaemia increasing the risk of cardiovascular disease, which is the greatest contributor to the excess mortality seen. The effect of these NCDs is likely to be greater in low- and middle-income countries such as Bangladesh, India and Pakistan due to additional barriers to health care access, lack of resources and other sociodemographic variables.
Objectives
Our study aimed to estimate the proportion of individuals with SMI in Bangladesh, India, and Pakistan who were screened for NCDs and offered health risk modification advice. We also explored sociodemographic factors associated with the likelihood of being screened for NCDs within this population.
Methods
This cross-sectional study gathered data from three national mental health institutions in South Asia. Participants aged ≥18 years diagnosed with SMI were included. Data collection involved face-to-face interviews based on the World Health Organisation Stepwise (WHO-STEPS) approach to NCD risk factor surveillance, supplemented by anthropometric measurements and blood tests to confirm NCDs. The prevalence of screening, diagnosis, health risk modification advice, and treatment for diabetes, hypertension, and high cholesterol was assessed. A logistic regression model assessed the associations of sociodemographic characteristics with NCD screening.
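As a hedged illustration (not the authors' code), the sketch below shows one way the logistic regression of screening status on sociodemographic characteristics could be specified; the file and column names ('steps_survey.csv', 'screened', 'sex', 'bmi', 'education', 'country') are hypothetical placeholders for the survey variables described above.

```python
# Hedged sketch: logistic regression of NCD screening status on
# sociodemographic characteristics. Column names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("steps_survey.csv")  # one row per participant

fit = smf.logit("screened ~ age + C(sex) + bmi + C(education) + C(country)",
                data=df).fit()
print(fit.summary())        # coefficients on the log-odds scale
print(np.exp(fit.params))   # odds ratios for each characteristic
```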
Results
3,989 participants were recruited. Screening prevalence varied by country and disease, with hypertension the most commonly screened NCD (Bangladesh = 52.5% [50.0-55.1], India = 43.1% [40.3-45.9], Pakistan = 60.9% [58.2-63.5]) and cholesterol the least (Bangladesh = 4.1% [3.2-5.2], India = 14.8% [12.9-17.0], Pakistan = 9.6% [8.1-11.3]). Characteristics such as BMI, age and education level were positively associated with screening, and females were more likely to be screened than males. The provision of health risk modification advice was most common in India (diet = 66.7% [62.1-71.1], physical activity = 71.5% [67.0-75.6], smoking = 17.1% [13.8-21.0]) and least common in Bangladesh (diet = 17.8% [15.8-20.0], physical activity = 12.0% [10.3-13.8], smoking = 9.8% [8.3-11.5]).
Conclusions
There is a consistent gap in the screening of NCDs among individuals with SMI in South Asia, with marked sociodemographic disparities. There is a pressing need for standardised screening protocols and health risk modification interventions tailored to South Asian populations. Improving health literacy and implementing culturally sensitive, cost-effective prevention strategies could mitigate the increased risk of NCDs in South Asian individuals with SMI.
Knowledge of the status of ecosystems is vital to help develop and implement conservation strategies. This is particularly relevant to the Arctic, where the need for biodiversity conservation and monitoring has long been recognised, but where issues of local capacity and logistic barriers make surveys challenging. This paper demonstrates how long-term monitoring programmes outside the Arctic can contribute to developing composite trend indicators, using monitoring of annual abundance and population-level reproduction of species of migratory Arctic-breeding waterbirds on their temperate non-breeding areas. Using data from the UK and the Netherlands, countries with year-round waterbird monitoring schemes that support substantial shares of Arctic-breeding waterbird populations, we present example multi-species abundance and productivity indicators related to the migratory pathways used by different biogeographical populations of Arctic-breeding wildfowl and wader species in the East Atlantic Flyway. These composite trend indicators show that long-term increases in population size have slowed markedly in recent years and in several cases indicate declines over at least the last decade. These results constitute proof of concept. Some other non-Arctic countries located on the flyways of Arctic-breeding waterbirds also annually monitor abundance and breeding success, and we advocate that future development of “Arctic waterbird indicators” should be as inclusive of data as possible to derive the most robust outputs and help account for effects of current changes in non-breeding waterbird distributions. The incorporation of non-Arctic datasets into assessments of the status of Arctic biodiversity is recognised as highly desirable because logistic constraints limit effective population-scale monitoring within the Arctic region; such datasets in effect enable “monitoring at a distance”.
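As a rough illustration of how a composite trend indicator can be assembled from species-level monitoring data, the sketch below computes a geometric-mean multi-species index scaled to a base year. This is one common construction for such indicators; it is not necessarily the exact method used in the paper, and the input values are invented.

```python
# Hedged sketch: composite multi-species abundance indicator as the geometric
# mean of species-level annual indices, each scaled to a base year (= 100).
import numpy as np
import pandas as pd

def composite_indicator(indices: pd.DataFrame, base_year: int) -> pd.Series:
    """indices: rows = years, columns = population-level abundance indices."""
    scaled = indices.div(indices.loc[base_year])   # each series equals 1.0 in the base year
    return 100 * np.exp(np.log(scaled).mean(axis=1))

# Invented annual indices for three hypothetical populations
years = range(2000, 2006)
demo = pd.DataFrame({"pop_A": [50, 55, 60, 58, 57, 54],
                     "pop_B": [200, 210, 205, 190, 185, 180],
                     "pop_C": [10, 11, 12, 12, 11, 10]},
                    index=years)
print(composite_indicator(demo, base_year=2000))
```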
Background: Adverse effects and risks associated with glucocorticoid (GC) treatment are frequently encountered in immune-mediated neuromuscular disorders. However, significant variability exists in the management of these complications. Our aim was to establish international consensus guidance on the management of GC-related complications in neuromuscular disorders. Methods: An international task force of 15 experts was assembled to develop clinical recommendations for managing GC-related complications in neuromuscular patients. The RAND/UCLA Appropriateness Method (RAM) was employed to formulate consensus guidance statements. Initial statements were drafted following a comprehensive literature review and were refined based on anonymous expert feedback, with up to three rounds of email voting to achieve consensus. Results: Consensus was reached on statements addressing general patient care, monitoring during GC therapy, osteoporosis prevention, vaccinations, infection screening, and prophylaxis for Pneumocystis jirovecii pneumonia. A multidisciplinary approach to managing GC-related complications was highlighted as a key recommendation. Conclusions: These statements represent the first consensus guidance in the neurological literature on GC-related complications and offer clinicians structured guidance on mitigating and managing common adverse effects associated with both short- and long-term GC use. They also provide a foundation for future debate, quality improvement, and research in this area.
Increasing consumer demand for sustainably sourced products has created a need to benchmark sustainability at the field level. To address this issue, some companies are offering incentives to producers, but participation remains low. This study estimated producers’ willingness to accept (WTA) for participating in sustainability programs and implementing sustainable practices at the field level, using a double-bounded dichotomous-choice framework. The results revealed preferences for longer contract lengths, industry as the verification party, supplemental benefits that yield an economic incentive, and a per-bale payment. This project gives new insights into the value and importance of documentation, verification, and traceability throughout the supply chain.
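For readers unfamiliar with the framework, here is a minimal sketch of the double-bounded dichotomous-choice (interval-data) likelihood, assuming WTA is normally distributed and ignoring covariates; the bids and responses are invented, and the study's actual specification is richer than this.

```python
# Hedged sketch: double-bounded dichotomous-choice model with WTA ~ Normal(mu, sigma).
# A "yes" means the offered payment (bid) covers the respondent's WTA, i.e. WTA <= bid.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def neg_loglik(params, bid1, bid2, yes1, yes2):
    # bid2 is the lower follow-up bid after a "yes" and the higher one after a "no"
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    F = lambda b: norm.cdf((b - mu) / sigma)            # P(WTA <= b)
    p = np.where(yes1 & yes2, F(bid2),                  # WTA below the lower follow-up bid
        np.where(yes1 & ~yes2, F(bid1) - F(bid2),       # WTA between the follow-up and first bid
        np.where(~yes1 & yes2, F(bid2) - F(bid1),       # WTA between the first and higher follow-up bid
                 1.0 - F(bid2))))                       # WTA above the higher follow-up bid
    return -np.sum(np.log(np.clip(p, 1e-12, None)))

# Invented per-acre bids and yes/no responses for six respondents
bid1 = np.array([20., 20., 30., 30., 25., 25.])
bid2 = np.array([10., 40., 15., 60., 12.5, 50.])
yes1 = np.array([True, False, True, False, True, False])
yes2 = np.array([False, True, True, False, True, True])

res = minimize(neg_loglik, x0=[25.0, np.log(10.0)],
               args=(bid1, bid2, yes1, yes2), method="Nelder-Mead")
print(f"mean WTA ≈ {res.x[0]:.2f}, sd ≈ {np.exp(res.x[1]):.2f}")
```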
Objective:
To evaluate the impact of COVID-19 prevention training with video-based feedback on nursing home (NH) staff safety behaviors.
Design:
Public health intervention
Setting & Participants:
Twelve NHs in Orange County, California, June 2020–April 2022
Methods:
NHs received direct-to-staff COVID-19 prevention training and weekly feedback reports with video montages about hand hygiene, mask-wearing, and mask/face-touching. One-hour periods of recorded streaming video from common areas (breakroom, hallway, nursing station, entryway) were sampled randomly across days of the week and nursing shifts and audited for safe behavior. Multivariable models assessed the intervention impact.
Results:
Video auditing encompassed 182,803 staff opportunities for safe behavior. Hand hygiene error rates improved from the first (67.0%) to the last (35.7%) month of the intervention, decreasing by 7.6% per month (OR = 0.92, 95% CI = 0.92–0.93, P < 0.001); masking error rates improved from the first (10.3%) to the last (6.6%) month, decreasing by 2.3% per month (OR = 0.98, 95% CI = 0.97–0.99, P < 0.001); face/mask-touching rates improved from the first (30.0%) to the last (10.6%) month, decreasing by 2.5% per month (OR = 0.98, 95% CI = 0.97–0.98, P < 0.001). Hand hygiene errors were most common in entryways and on weekends, with similar rates across shifts. Masking and face/mask-touching errors were most common in breakrooms, with the latter occurring most often during the day (7 A.M.–3 P.M.) shift and at similar rates across weekdays and weekends. Error reductions were seen across camera locations, days of the week, and nursing shifts, suggesting a widespread benefit within participating NHs.
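A minimal sketch of the monthly-trend estimate is shown below, assuming one row per audited opportunity with a binary error indicator; the column names and file name are hypothetical, and the study's full multivariable models (including any adjustment for clustering by facility) are not reproduced here.

```python
# Hedged sketch: logistic model of error (1 = unsafe behavior) against month of
# intervention, adjusted for location, shift, and weekend. Names are placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("video_audit_opportunities.csv")

fit = smf.logit("error ~ month + C(location) + C(shift) + C(weekend)", data=df).fit()
print(f"odds ratio per month: {np.exp(fit.params['month']):.2f}")
```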
Conclusion:
Direct-to-staff training with video-based feedback was temporally associated with improved hand hygiene, masking, and face/mask-touching behaviors among NH staff during the COVID-19 pandemic.
In recent years, scholars worldwide have begun organising and developing a coherent framework and research agenda focused on the emerging field of ‘global administrative law’. This nascent body of law, unlike domestic or national forms of administrative law, does not operate within the bounds of unitary nation states, and unlike traditional accounts of public international law, it does not arise exclusively between nation states. Instead, it operates in a transnational or global space occupied by a vast variety of administrative actors responsible for trans-governmental regulation and administration: the field of ‘global governance’. To combat growing concerns that there are crucial legitimacy, accountability and democratic deficiencies inherent in this system of global governance, numerous administrative-law-type mechanisms and principles have been developed by global administrative bodies. Global administrative law embodies the totality of these various mechanisms and principles.
Soldiers must be treated in the first instance with humanity but kept under control by means of iron discipline.
In the past fifty years the High Court of Australia has scrutinised the constitutional validity of military service tribunals on several occasions. Each time, the validity of service tribunals to conduct trials and impose punishment in relation to the particular offence has been upheld on the basis that it derives from a proper exercise by the legislature of its power under s 51(vi) of the Constitution. On no occasion has this been considered by the Court as a whole to involve a breach of the separation of powers doctrine. However, while it is generally accepted that service tribunals exercise what would ordinarily be seen as falling within the definition of “judicial power”, there has been no unifying and satisfactory explanation as to why this does not breach the separation of powers doctrine.
Recent theories have implicated inflammatory biology in the development of psychopathology and maladaptive behaviors in adolescence, including suicidal thoughts and behaviors (STB). Examining specific biological markers related to inflammation is thus warranted to better understand risk for STB in adolescents, for whom suicide is a leading cause of death.
Method:
Participants were 211 adolescent females (ages 9–14 years; mean age = 11.8 years, SD = 1.8 years) at increased risk for STB. This study examined the prospective association between basal levels of inflammatory gene expression (average of 15 proinflammatory mRNA transcripts) and subsequent risk for suicidal ideation and suicidal behavior over a 12-month follow-up period.
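As a hedged illustration of the composite measure described above, the sketch below averages z-scored transcript levels into a single proinflammatory score and relates it to follow-up ideation while adjusting for baseline ideation; the column names are hypothetical and this is not the authors' analysis code.

```python
# Hedged sketch: composite proinflammatory score (mean of z-scored transcripts)
# predicting 12-month suicidal ideation, controlling for baseline ideation and age.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("study_data.csv")                              # hypothetical file
transcripts = [c for c in df.columns if c.startswith("mrna_")]  # the 15 proinflammatory transcripts

z = (df[transcripts] - df[transcripts].mean()) / df[transcripts].std()
df["proinflammatory_score"] = z.mean(axis=1)

fit = smf.logit("ideation_12mo ~ proinflammatory_score + ideation_baseline + age",
                data=df).fit()
print(fit.summary())
```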
Results:
Controlling for past levels of STB, greater proinflammatory gene expression was associated with prospective risk for STB in these youth. Similar effects were observed for CD14 mRNA level, a marker of monocyte abundance within the blood sample. Sensitivity analyses controlling for other relevant covariates, including history of trauma, depressive symptoms, and STB prior to data collection, yielded similar patterns of results.
Conclusions:
Upregulated inflammatory signaling in the immune system is prospectively associated with STB among at-risk adolescent females, even after controlling for history of trauma, depressive symptoms, and STB prior to data collection. Additional research is needed to identify the sources of inflammatory up-regulation in adolescents (e.g., stress psychobiology, physiological development, microbial exposures) and strategies for mitigating such effects to reduce STB.
The purpose of this study is to analyze agricultural producers’ willingness to adopt regenerative cover crop practices in their operations and the effects of producer and farm characteristics on willingness to accept (WTA) values. The paper utilizes the double-bounded contingent valuation method to analyze survey responses submitted by producers and non-operating landowners in the Texas and Oklahoma portions of the Southern Great Plains. Results showed an average WTA of $26.38/acre for producers to adopt cover crops and suggest that programs aimed at increasing adoption may require more substantial investment than those focused on continuity with current adopters.
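Building on the double-bounded sketch shown earlier in this section, the snippet below illustrates how an average WTA is recovered once the model's location parameter depends on producer and farm characteristics; the design matrix, column meanings, and coefficients are purely illustrative and are not estimates from this study.

```python
# Hedged sketch: with WTA_i ~ Normal(x_i' beta, sigma), mean WTA is the average
# fitted linear index. The X columns and beta values below are invented examples.
import numpy as np

def mean_wta(X: np.ndarray, beta: np.ndarray) -> float:
    """X: design matrix of respondent characteristics (with intercept); beta: fitted coefficients."""
    return float(np.mean(X @ beta))

# e.g. columns: [intercept, acres_operated, years_farming, prior_cover_crop_use]
X = np.array([[1.0, 500.0, 20.0, 0.0],
              [1.0, 1200.0, 8.0, 1.0]])
beta = np.array([15.0, 0.005, 0.3, -4.0])   # illustrative, not estimated values
print(f"average WTA ≈ ${mean_wta(X, beta):.2f}/acre")
```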
The term ‘natural theology’ provokes a variety of reactions, spanning from whole-hearted endorsement to passionate rejection. Charged as it is with polemical and pejorative undertones, this debate begs for an intervention. If the scholarly community is to engage constructively with the concept and practice of natural theology – either by way of acceptance, rejection, or something in between – clarity in its definition and identification is imperative. The aim of this paper is to try to shed some light on three of the most common definitions of ‘natural theology’ in contemporary scholarship, to provide clarity about the ways in which they differ, and to propose some conceptual refinements in the hope that, if adopted, more fruitful discourse may take place in relation to this much-debated and interdisciplinary phrase.
The aim of this study is to estimate the minimum prevalence of intestinal parasites in the population of Roman London through analysis of pelvic sediment from 29 third- to fourth-century burials from the 1989 excavations of the western cemetery at 24–30 West Smithfield, 18–20 Cock Lane and 1–4 Giltspur Street (WES89). Microscopy was used to identify roundworm eggs in 10.3 per cent of burials. We integrate these results with past palaeoparasitological work in the province of Britannia to explore disease, hygiene and diet. The most commonly found parasites (whipworm and roundworm) were spread by poor sanitation, but other species caught from animals were also present (fish tapeworm, beef/pork tapeworm and liver flukes). Parasite diversity was highest in urban sites. The health impacts of these infections range from asymptomatic to severe.
This article examines shifts towards onshoring pharmaceutical manufacturing, a response to the vulnerabilities exposed by the COVID-19 pandemic in global supply chains. It delves into how globalization, public policy, and geopolitical tensions have shaped pharmaceutical markets, compelling nations to seek solutions that ensure reliable medicine access and reduce dependency on foreign supplies. The study highlights disparities in regulatory oversight and geographic concentration of production, which contribute to frequent shortages, particularly of generic medicines. The pandemic intensified these issues, prompting increased state interventions and heightening concerns over geopolitical risks. As a result, onshoring efforts, often encapsulated in local content measures, have expanded, and are now driven by both economic motives and imperatives of national security and public health.
In the absence of written records, disease and parasite loads are often used as indicators of sanitation in past populations. Here, the authors adopt the novel approach of integrating the bioarchaeological analysis of cesspits in an area of medieval Leiden (the Netherlands) with historical property records to explore living conditions. Using light microscopy and enzyme-linked immunosorbent assays (ELISA) they identify evidence of parasites associated with ineffective sanitation (whipworm, roundworm and the protozoan Giardia duodenalis)—at residences of all social levels—and the consumption of infected livestock and freshwater fish (Diphyllobothriidae, cf. Echinostoma sp., cf. Fasciola hepatica and Dicrocoelium sp.).
The question posed is how deep-time perspectives contribute to tackling contemporary One Health challenges, improving understanding and disease mitigation. Using evidence from the field of paleopathology, it is possible to explore this question and highlight key learning points from the past to focus the minds of those making healthcare policy decisions today. In previous centuries, urbanization led to poorer health across a wide range of indicators, including life expectancy, sanitation and intestinal parasites, airway disorders such as maxillary sinusitis, metabolic diseases such as rickets, and even conditions resulting from clothing fashions such as bunions. Modern concerns regarding the quality of urban air and rivers show that we have yet to incorporate these lessons. When we consider major infectious diseases affecting past societies such as bubonic plague, tuberculosis and leprosy, interaction between humans and wild mammal reservoirs was key. Wild red squirrels in Britain today remain infected by the medieval strain of leprosy that affected people 1,500 years ago. It is clear that the One Health focus on the interaction between humans, animals and their environment is important. Eradicating zoonotic infectious diseases from humans but not these reservoirs leaves the door open to their spread back to people in the future.
Evaluation of the impact of adult antibiotic order sets (AOSs) on antibiotic stewardship metrics has been limited. The primary outcome was the standardized antimicrobial administration ratio (SAAR). Secondary outcomes included antibiotic days of therapy (DOT) per 1,000 patient days (PD); selected antibiotic use; AOS utilization; Clostridioides difficile infection (CDI) cases; and clinicians’ perceptions of the AOS via a survey following the final study phase.
Design:
This 5-year, single-center, quasi-experimental study comprised 5 phases from 2017 to 2022, each covering a 10-month period between August 1 and May 31.
Setting:
The study was conducted in a 752-bed tertiary care, academic medical center.
Intervention:
Our institution implemented AOSs in the electronic medical record (EMR) for common infections among hospitalized adults.
Results:
For the primary outcome, a statistically significant decrease in SAAR was detected from phase 1 to phase 5 (1.0 vs 0.90; P < .001). Statistically significant decreases were also detected in DOT per 1,000 PD (4,884 vs 3,939; P = .001), fluoroquinolone orders (407 vs 175; P < .001), carbapenem orders (147 vs 106; P = .024), and clindamycin orders (113 vs 73; P = .01). No statistically significant change in mean vancomycin orders was detected (991 vs 902; P = .221). A statistically significant decrease in CDI cases was also detected (7.8 vs 2.4; P = .002) but may have been attributable to changes in CDI case diagnosis. Clinicians indicated that the AOSs were easy to use overall and that they helped them select the appropriate antibiotics.
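For reference, the two headline metrics reduce to simple ratios, as sketched below; the NHSN-predicted antimicrobial days entering the SAAR denominator are treated as a given input rather than re-derived, and the numbers are illustrative.

```python
# Hedged sketch of the metric arithmetic: DOT per 1,000 patient days and the
# SAAR (observed antimicrobial days divided by risk-adjusted predicted days).
def dot_per_1000_pd(days_of_therapy: float, patient_days: float) -> float:
    return 1000.0 * days_of_therapy / patient_days

def saar(observed_days: float, predicted_days: float) -> float:
    return observed_days / predicted_days

# Illustrative inputs only (not the study's underlying counts)
print(dot_per_1000_pd(days_of_therapy=39_390, patient_days=10_000))  # 3939.0 DOT/1,000 PD
print(saar(observed_days=9_000, predicted_days=10_000))              # 0.9
```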
Conclusions:
Implementing AOSs in the EMR was associated with statistically significant reductions in the SAAR, antibiotic DOT per 1,000 PD, selected antibiotic orders, and CDI cases.
Bentonites are readily available clays used in the livestock industry as feed additives to reduce aflatoxin (AF) exposure; their potential interaction with nutrients is the main concern limiting their use, however. The objective of the present study was to determine the safety of a dietary sodium-bentonite (Na-bentonite) supplement as a potential AF adsorbent, using juvenile Sprague Dawley (SD) rats as a research model. Animals were fed either a control diet or a diet containing Na-bentonite at 0.25% and 2% (w/w) inclusion rate. Growth, serum, and blood biochemical parameters, including selected serum vitamins (A and E) and elements such as calcium (Ca), potassium (K), iron (Fe), and zinc (Zn), were measured. The mineral characteristics and the aflatoxin B1 sorption capacity of Na-bentonite were also determined. By the end of the study, males gained more weight than females in control and Na-bentonite groups (p ≤ 0.0001); the interaction between treatment and sex was not significant (p = 0.6780), however. Some significant differences between the control group and bentonite treatments were observed in serum biochemistry and vitamin and mineral measurements; however, parameters fell within reference clinical values reported for SD rats and no evidence of dose-dependency was found. Serum Na and Na/K ratios were increased, while K levels were decreased in males and females from Na-bentonite groups. Serum Zn levels were decreased only in males from Na-bentonite treatments. Overall, results showed that inclusion of Na-bentonite at 0.25% and 2% did not cause any observable toxicity in a 3-month rodent study.