Liquid biopsy (LB) is a minimally invasive technique that enables analysis of circulating tumor DNA (ctDNA) by next-generation sequencing (NGS) and has several potential applications in oncology. The main objective was to describe the diagnostic accuracy of LB, compared with tissue biopsy, for screening, diagnosis, and tumor mutation profiling in breast, lung, and colorectal cancer, including at different cancer stages.
Methods
Initially, a literature overview was conducted to select systematic reviews with meta-analyses or health technology assessment reports. When these yielded no data, a systematic review was conducted to select individual studies. For both procedures, a systematic literature search, two-step study selection, quality appraisal, and narrative synthesis were conducted.
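Throughout, the accuracy figures follow the standard definitions of sensitivity and specificity against the reference standard (here, tissue biopsy), computed from true/false positive and negative counts:

\[
\text{Sensitivity} = \frac{TP}{TP + FN}, \qquad \text{Specificity} = \frac{TN}{TN + FP}
\]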
Results
The only study in the screening setting involved colorectal cancer (sensitivity 14 percent, specificity 90 percent). Regarding initial cancer diagnosis, LB in breast cancer had a sensitivity ranging from 31 to 74 percent and a specificity of 66 to 92 percent, with inconsistent results by stage. For lung cancer, LB had a sensitivity of 22 to 86 percent and a specificity of 42 to 91 percent, with improved utility in advanced stages. Colorectal cancer studies using multicomponent tests showed sensitivities of 83 to 93 percent and specificities of more than 95 percent. LB ctDNA analysis in molecular profiling of advanced cancer had high accuracy in breast (PIK3CA), lung (epidermal growth factor receptor mutations), and colorectal cancer (NRAS/KRAS), with sensitivity values of up to 83 percent and specificity values of around 90 percent.
Conclusions
No results were available for screening in breast and lung cancer. Limited evidence for colorectal cancer showed low sensitivity but high specificity for LB, compared with tissue biopsy. Initial cancer diagnosis results were heterogeneous, with inconsistent findings across stages in breast and colorectal cancer but improved performance in advanced lung cancer. Tumor profiling provided the strongest evidence, with high diagnostic utility relative to tissue biopsy across all three cancer types.
Hospital malnutrition is caused by metabolic imbalance and is associated with increased rates of morbidity and mortality and unfavorable prognosis for hospitalized patients. In 2001, severe malnutrition was detected in 12.5 percent of hospitalized patients in Brazil. The aim of this study was to evaluate evidence on the efficacy and safety of oral nutritional supplements for malnutrition in hospitalized patients.
Methods
Search strategies developed for the PubMed, Embase, LILACS, and Cochrane Library databases were used to identify and select the available evidence. The included evidence compared oral nutritional supplementation with standard hospital care or placebo. A total of 47 clinical trials were selected. For mortality, a meta-analysis was performed combining the effect measures of 40 clinical trials, using the meta package in R. The revised Cochrane risk-of-bias tool was used to assess the risk of bias.
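For illustration only, a minimal Python sketch of the inverse-variance random-effects pooling this kind of meta-analysis performs (the authors used R's meta package; the trial counts below are hypothetical, not the study's data):

```python
# Minimal sketch of inverse-variance random-effects pooling of relative risks
# with DerSimonian-Laird tau^2. Trial data are made up for illustration.
import numpy as np

# Hypothetical per-trial counts: events/total in supplemented and control arms
trials = [(12, 100, 15, 98), (8, 250, 9, 245), (30, 400, 33, 410)]

log_rr, var = [], []
for e1, n1, e0, n0 in trials:
    log_rr.append(np.log((e1 / n1) / (e0 / n0)))
    var.append(1 / e1 - 1 / n1 + 1 / e0 - 1 / n0)  # variance of log RR
log_rr, var = np.array(log_rr), np.array(var)

# DerSimonian-Laird between-trial variance
w = 1 / var
q = np.sum(w * (log_rr - np.sum(w * log_rr) / np.sum(w)) ** 2)
tau2 = max(0.0, (q - (len(trials) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

# Random-effects pooled RR with 95% CI
w_re = 1 / (var + tau2)
pooled = np.sum(w_re * log_rr) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
print(f"RR {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96 * se):.2f}, {np.exp(pooled + 1.96 * se):.2f})")
```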
Results
For the primary outcomes of mortality (relative risk [RR] 0.94, 95% CI: 0.77, 1.15), length of hospital stay, and number of hospital readmissions (RR 0.97, 95% CI: 0.74, 1.28), oral supplementation offered no benefit, compared with the control group. Regarding infectious and non-infectious complications, there was a significant reduction in the number of patients with complications in the supplemented group, compared with the control group (RR 0.71, 95% CI: 0.59, 0.86). Regarding the risk-of-bias assessment, most of the studies had a high risk of bias and limitations in almost all areas.
Conclusions
The designs of the selected studies were quite heterogeneous in terms of follow-up time, population, and sampling, among others, and several doubts were raised about the methodology used in the clinical trials. These aspects were reflected in the low quality and high risk of bias of the primary studies, which proved to be an important limitation.
Generative artificial intelligence (AI) is transforming the generation of real-world evidence in health care. This study compares chemotherapy recommendations for women with node-positive breast cancer, as provided by a Delphi panel of experts, with those generated by ChatGPT and Copilot. The objective was to assess concordance and discrepancies between AI-generated and expert-driven recommendations.
Methods
A Delphi panel of 10 independent breast cancer experts, blinded to each other’s responses, participated in an online survey to evaluate chemotherapy recommendations for early-stage, node-positive breast cancer, stratified by menopausal status and Oncotype DX Breast Recurrence Score® (RS: <14, 14 to 25, >25). ChatGPT and Copilot, configured via automatic prompt engineering as agent-based oncologists, addressed the same scenarios. Expert recommendations were summarized as arithmetic means, while AI responses were analyzed for concordance. A one-sample t-test compared mean estimates between the Delphi panel and each AI tool.
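As an illustration of this comparison (with hypothetical numbers, not the study's data), each scenario's ten Delphi responses can be tested against an AI tool's single recommendation rate:

```python
# Hedged sketch of the one-sample t-test described above: the ten Delphi
# panel responses for one scenario are compared against an AI tool's
# recommendation rate, treated as the hypothesised mean.
# All values below are hypothetical, not the study's data.
from scipy import stats

delphi_rates = [80, 85, 90, 75, 80, 85, 80, 85, 80, 82]  # % recommending chemotherapy
ai_rate = 22.5  # e.g., an AI tool's rate for the same scenario

t_stat, p_value = stats.ttest_1samp(delphi_rates, popmean=ai_rate)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 indicates significant divergence
```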
Results
Chemotherapy recommendation rates (%) for premenopausal women by the Delphi panel, ChatGPT, and Copilot, respectively, were: RS less than 14 (82.2, 2.5, 10); RS 14 to 25 (96.7, 22.5, 30); RS greater than 25 (100, 80, 60). For postmenopausal women, these were: RS less than 14 (3.3, 1.5, 15); RS 14 to 25 (17.8, 17.5, 35); RS greater than 25 (94.4, 72.5, 65). Significant differences were observed, especially in the lower RS ranges. Differences in means between the Delphi panel and AI for postmenopausal women with an RS greater than 25 and premenopausal women with an RS of 25 or less were statistically significant (p<0.05).
Conclusions
The recommendations from AI demonstrated significant divergence from the “gold standard” panel, especially for premenopausal patients. The observed deviations from Brazilian experts’ recommendations likely stem from differences in local practices, compared with international published data, which may have influenced AI outputs. These findings emphasize the continued importance of human panels to account for regional variations in clinical decision-making.
HIV-1 compromises the immune system, leading to immunodeficiency. Standard treatment involves the use of antiretroviral therapies; however, the development of resistance is common, highlighting the need for alternative therapeutic options. This study evaluated the therapeutic value and cost effectiveness of fostemsavir in combination with optimized background therapy (OBT), compared with OBT alone for patients living with multidrug-resistant HIV-1.
Methods
A cost-effectiveness analysis was conducted to estimate the incremental cost-effectiveness ratio (ICER) of fostemsavir combined with OBT for people living with HIV-1 resistant to at least four classes of antiretroviral therapies, compared with OBT alone. The analysis was performed from the perspective of the Brazilian Unified Health System (SUS), considering a one-year treatment horizon. For OBT, it was assumed that the regimen could include the simultaneous use of up to six medications. The efficiency frontier methodology was applied to determine a proportional price based on the cost and therapeutic response benefit.
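The ICER used here follows the standard definition, with C the total one-year cost and E the therapeutic response rate of each strategy:

\[
\mathrm{ICER} = \frac{C_{\text{FTR+OBT}} - C_{\text{OBT}}}{E_{\text{FTR+OBT}} - E_{\text{OBT}}}
\]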
Results
Fostemsavir demonstrated an incremental effectiveness of 54 percent in therapeutic response rate (viral load reduction). Over a one-year horizon, the cost-effectiveness analysis revealed a cost of BRL163,402.21 (USD27,233.70) for fostemsavir combined with OBT, compared with BRL23,271.60 (USD3,878.60) for OBT alone. The incremental cost of using fostemsavir was BRL260,127.37 (USD43,354.56) per therapeutic response. Based on the efficiency frontier approach, the incremental cost of fostemsavir should not exceed BRL19,819.83 (USD3,303.30), calculated based on the amount currently spent by the SUS plus the 54 percent additional benefit.
Conclusions
The use of fostemsavir combined with OBT demonstrates a superior virologic response rate, compared with OBT alone. Efficiency frontier analysis suggested a maximum justified cost, providing a framework for pricing decisions. This study highlighted the importance of balancing innovation with cost effectiveness in healthcare decision-making.
Generic medicines form the backbone of affordable health care in low- and middle-income countries (LMICs), constituting a significant portion of pharmaceutical consumption. However, issues such as fragmented pricing policies and insufficient regulatory frameworks hinder equitable access and affordability. This study evaluated pricing models that address these challenges by integrating best practices, including tiered pricing mechanisms and enhanced transparency. The aim was to guide policymakers toward sustainable solutions that improve medicine affordability, equity, and market resilience.
Methods
A comparative analysis was conducted across 20 countries, chosen for their diverse healthcare profiles and potential to be best practice examples for LMIC contexts. Key elements analyzed included pricing mechanisms (e.g., tiered pricing, external reference pricing, internal reference pricing), regulatory frameworks, and incentives for local production. Data were gathered from government reports, healthcare provider feedback, and industry insights. A policy evaluation framework was applied to assess each model’s effectiveness in addressing affordability, accessibility, and equity. The analysis prioritized generalizability to LMICs while accounting for contextual adaptations such as regional economic challenges and varying levels of regulatory capacity.
This work was supported by the TUBITAK 2224-A Program for participation in HTAi 2025; the funder had no role in the conduct of the study.
Results
The findings revealed that tiered pricing structures significantly enhanced affordability by incentivizing market competition and reducing prices as generics entered the market. Transparent regulatory practices, including centralized databases and consistent price reviews, mitigated regional price disparities and increased trust among stakeholders. Countries fostering domestic production reduced import dependencies, stabilizing supply chains and ensuring medicine availability. These strategies, when applied collectively, have great potential to improve equitable access to generics in resource-constrained settings. The results highlighted the importance of adapting global best practices to the economic and healthcare realities of LMICs, emphasizing sustainability and resilience in pharmaceutical markets.
Conclusions
Integrating tiered pricing frameworks, regulatory transparency, and local production incentives provided a pragmatic pathway to improve generic medicine access in LMICs. These approaches align with global best practices and address affordability and equity challenges. Policymakers should prioritize such strategies while leveraging stakeholder collaboration. Future research should examine dynamic pricing models and their long-term impacts on market stability and healthcare outcomes.
Impaired social functioning is commonly observed in youth at clinical high risk (CHR) for psychosis. Interpersonal synchrony, defined as the temporal alignment of movement between interacting partners, is a key component of successful social interactions. This study aimed to investigate interpersonal head synchrony in naturalistic virtual settings among CHR individuals using automated video analysis tools.
Methods
We analyzed short video recordings from virtual clinical interviews involving 116 participants: 50 CHR participants, 36 individuals with sub-threshold positive symptoms (SUB), and 30 healthy controls (HC). Vertical head movement time series were extracted using an open-access video-based head-tracking tool. Interpersonal head synchrony was computed using Windowed Cross-Correlation to assess group differences and associations with clinical symptoms and functioning.
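A minimal sketch of windowed cross-correlation on two head-movement time series (one per interaction partner) is shown below; the window length, step, and maximum lag are illustrative assumptions, not the study's parameters:

```python
# Illustrative windowed cross-correlation: within each sliding window, the
# correlation between the two movement series is computed over a range of
# lags; the peak |r| reflects synchrony strength and its lag the delay.
import numpy as np

def windowed_cross_correlation(x, y, win=150, step=30, max_lag=30):
    """Mean peak |r| (strength) and mean absolute peak lag (delay) across windows."""
    strengths, delays = [], []
    for start in range(0, len(x) - win - 2 * max_lag + 1, step):
        a = x[start + max_lag : start + max_lag + win]
        best_r, best_lag = 0.0, 0
        for lag in range(-max_lag, max_lag + 1):
            b = y[start + max_lag + lag : start + max_lag + lag + win]
            r = np.corrcoef(a, b)[0, 1]
            if abs(r) > abs(best_r):
                best_r, best_lag = r, lag
        strengths.append(abs(best_r))
        delays.append(abs(best_lag))
    return np.mean(strengths), np.mean(delays)

# Synthetic example: y loosely follows x after ~10 frames
rng = np.random.default_rng(0)
x = rng.standard_normal(1000).cumsum()
y = np.roll(x, 10) + rng.standard_normal(1000)
print(windowed_cross_correlation(x, y))
```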
Results
CHR participants showed significantly reduced strength of synchrony compared to HC (β = −0.05, 95% CI [−0.09, −0.02], p = .004), although 14% of variance in strength of synchrony was attributable to assessor identity. No significant group differences were found for delay of synchrony. Within the CHR group, delay of synchrony was positively associated with social anhedonia (r = 0.29). Strength of synchrony correlated with better social (r = 0.33) and role (r = 0.28) functioning.
Conclusion
Our findings suggest that impaired interpersonal head synchrony is already present in the psychosis-risk state and relates to negative symptoms and social and role functioning. These findings support the utility of nonverbal synchrony as a potential biomarker and demonstrate the feasibility of automated tools and virtual assessments to study social processes in at-risk populations.
Head and neck squamous cell carcinoma (HNSCC) is the most common malignancy of the oral cavity, pharynx, and larynx. Nivolumab has demonstrated significant survival benefits in patients with recurrent or metastatic HNSCC after platinum-based chemotherapy. This study evaluated the cost utility of nivolumab compared with standard chemotherapy for recurrent or metastatic HNSCC in adults in the Brazilian Unified Health System (SUS).
Methods
A partitioned survival model was developed using survival data from the CheckMate-141 trial. Health states included progression-free survival (PFS), post-progression survival, and death. Estimates for overall survival (OS) and PFS were extrapolated from Kaplan-Meier curves using parametric distributions. Costs and utilities were analyzed from the SUS perspective over a five-year horizon, with a five percent annual discount rate for both costs and benefits. Utilities were derived from international literature. Costs were measured in BRL and converted to purchasing-power parity in 2023 (USD1=BRL2.44). Incremental cost-utility ratios (ICUR) were calculated using quality-adjusted life years (QALYs) as the outcome.
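To make the model structure concrete, here is a schematic partitioned survival model in Python; the exponential extrapolations, costs, and utilities are placeholders, not CheckMate-141 estimates:

```python
# Schematic partitioned survival model: state occupancy is partitioned from
# the OS and PFS curves each monthly cycle; QALYs and costs are discounted
# at 5% per year over five years. All parameter values are placeholders.
import numpy as np

months = np.arange(60)                   # five-year horizon, monthly cycles
disc = 1 / (1.05 ** (months / 12))       # 5% annual discount factor

def model(os_rate, pfs_rate, monthly_cost, u_pfs=0.75, u_pps=0.55):
    os_curve = np.exp(-os_rate * months)     # proportion alive
    pfs_curve = np.exp(-pfs_rate * months)   # proportion progression-free
    pfs_state = np.minimum(pfs_curve, os_curve)
    pps_state = os_curve - pfs_state         # post-progression occupancy
    qalys = np.sum((pfs_state * u_pfs + pps_state * u_pps) * disc) / 12
    costs = np.sum(os_curve * monthly_cost * disc)
    return costs, qalys

cost_niv, qaly_niv = model(0.06, 0.20, monthly_cost=12000)   # hypothetical
cost_chemo, qaly_chemo = model(0.09, 0.22, monthly_cost=2500)
icur = (cost_niv - cost_chemo) / (qaly_niv - qaly_chemo)
print(f"ICUR: {icur:,.0f} per QALY gained")
```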
Results
Nivolumab resulted in incremental costs of USD56,040 (BRL136,737) and incremental benefits of 0.12 QALYs and 0.37 life years gained (LY), compared with chemotherapy, leading to an ICUR of USD458,039 (BRL1,117,615) per QALY gained, or USD152,588 (BRL372,315) per LY. This exceeded the commonly accepted threshold of USD49,180 (BRL120,000) per QALY for the SUS. Sensitivity analyses identified drug costs and utility values as the most impactful parameters. Despite benefits for OS (hazard ratio 0.68, 95% confidence interval [CI]: 0.54, 0.86), nivolumab was not cost effective under current thresholds due to its high costs and lack of effect on PFS (hazard ratio 0.89, 95% CI: 0.70, 1.13).
Conclusions
While nivolumab offered significant clinical benefits, including improved OS and a favorable safety profile, its high incremental cost means it is not a cost-effective option for treating recurrent or metastatic HNSCC from the SUS perspective. Strategic pricing adjustments or alternative access mechanisms may be necessary to enable broader access to this drug in the Brazilian healthcare system.
Visceral leishmaniasis (VL), caused by Leishmania infantum, is a significant public health challenge in the Americas and the Mediterranean basin. Effective disease control requires timely and accurate diagnostic methods, particularly in endemic regions. This study systematically evaluated the diagnostic accuracy of polymerase chain reaction (PCR), the rK39 rapid test, and ELISA with respect to their sensitivity and specificity against parasitological examination across diverse populations.
Methods
A systematic review was conducted using the MEDLINE, Embase, Cochrane Library, and LILACS databases without restrictions on publication date, language, or publication status (published, unpublished, and ongoing studies were eligible). Studies assessing the diagnostic accuracy of PCR, the rK39 rapid test, and ELISA for detecting VL against direct parasitological examination were included based on a predefined PIROS framework (Population, Index test, Reference standard, Outcomes, and Study design). Study selection and data extraction were independently performed by two reviewers, with disagreements resolved by a third reviewer. Data were synthesized narratively and through meta-analyses to estimate sensitivity and specificity. Methodological quality was assessed using QUADAS-2, and evidence certainty was evaluated using the GRADE approach.
Results
Twenty studies were included. In immunocompetent patients, PCR had high sensitivity (91 to 98.5 percent) and specificity (up to 100 percent) for detecting VL, providing a less invasive alternative to bone marrow aspiration. The rK39 test had a sensitivity of 82.4 to 97 percent and a specificity close to 100 percent, making it suitable for endemic regions. ELISA sensitivity ranged from 89 to 94 percent. In immunosuppressed patients, PCR retained superior accuracy (sensitivity of 83 to 100 percent), while rK39 sensitivity declined to 60 to 67 percent. Certainty of the evidence was moderate for immunocompetent patients but low for immunosuppressed groups, due to heterogeneity and the limited number of studies.
Conclusions
PCR was the most accurate method for detecting Leishmania infantum in endemic regions, especially for immunosuppressed patients. The rK39 rapid test was effective for initial screening in high-prevalence areas, and ELISA was helpful in specific contexts. These results emphasize the need for non-invasive, accurate diagnostics. Future studies should address evidence limitations, particularly for vulnerable populations.
To explore nurses’ perceptions regarding their knowledge, degree of autonomy, and the difficulties encountered in managing diabetic foot in Primary Care.
Background:
Diabetes mellitus is a chronic condition with a high prevalence in Spain, predominantly type 2. One of its most serious complications is diabetic foot disease, affecting between 19% and 34% of patients and associated with considerable morbidity and amputation risk. Primary Care, particularly nursing professionals, plays a pivotal role in the prevention, assessment, and management of diabetic foot. However, institutional, methodological, and personal barriers continue to affect care quality.
Methods:
A descriptive, cross-sectional observational study was conducted using quantitative and qualitative methods. A validated ad hoc questionnaire was administered to 176 nurses from the Murcian Health Service participating in a blended learning course on diabetic foot. Variables assessed included professional autonomy, knowledge, dressing use, clinical documentation, training, and perceived challenges. Qualitative analysis of open-ended responses was conducted using content analysis.
Findings:
A total of 88.1% of nurses reported autonomy in performing foot examinations; however, only 45.5% managed wound care independently. Just 19.9% considered themselves sufficiently trained, while 42.6% felt confident in selecting dressings appropriate to the healing phase. Although 56.8% regularly completed specific clinical documentation forms, many still expressed uncertainty about dressing use. Qualitative analysis identified five key barriers: lack of knowledge, patient complexity, institutional constraints, issues of authority and communication, and professional insecurity. These findings provide a current picture of persistent barriers in diabetic foot care and reinforce the need for targeted training and institutional support.
In Brazil, decisions regarding the incorporation of health technologies, guided by health technology assessment (HTA), are made independently within the public and private health systems. There is ongoing discussion about establishing a unified HTA agency to address incorporation across both sectors. This study sought to compare the decision-making criteria for technology incorporation in Brazil’s public and private health systems.
Methods
A dataset of new medicines registered in Brazil between 2010 and 2020, along with their corresponding indications, was compiled to identify and assess medication-indication pairs. Case studies were developed for pairs that underwent coverage evaluations in both the public and private health systems. These HTA coverage decisions were systematically analyzed and compared using a published evidence-based methodological framework. This methodology requires that research questions from different HTA agencies be sufficiently alike to enable a meaningful comparison of decision-making criteria.
Results
Twenty medicines across 10 indications were analyzed. Oncology-related pairs (10 medicines, five indications) could not be compared due to differing research focuses: public evaluations covered entire treatment lines, whereas private assessments focused on specific medicine-indication pairs. Non-oncology cases (10 medicines, five indications) targeted eosinophilic asthma, ulcerative colitis, psoriasis, multiple sclerosis, and neuronal ceroid lipofuscinosis type 2. Public sector decisions prioritized unmet needs, disease impact, clinical benefits, and negotiated prices. In contrast, the private sector focused on expanding therapeutic options, with clinical uncertainties having limited influence. The private sector did not consider costs in decision-making, even when medicines lacked clinical superiority over comparators.
Conclusions
Decision-making in the public sector relies on HTA principles, while the private sector focuses on therapeutic expansion, ignoring efficiency and clinical uncertainties. A unified HTA agency in Brazil could negatively affect the public sector if private sector priorities dominate. Moreover, such an agency is unlikely to reduce the workload, as research questions differ significantly across sectors.
The world is facing an urgent environmental emergency that calls for ambitious, coordinated action by governments to improve humankind’s relationship with nature. Mitigation (reducing pollution) and adaptation (adjusting to pollution) are both necessary, and “NextGen” health technology assessment can play a key role in helping to guide health system decision-makers to achieve environmental sustainability.
Methods
In 2022, the International Network of Agencies for Health Technology Assessment (INAHTA) identified environmental impact assessment (EIA) as an important and urgent topic for a white paper. An international author group (n=10), formed of staff from INAHTA member agencies, wrote the paper. The author group met seven times over 2022 to 2024 to develop the paper, which is based on a literature review and expert opinion of INAHTA members. A second group of members formed an international advisory group (n=11), which reviewed the draft twice. The paper was approved by the INAHTA Board before release on the INAHTA website.
Results
EIA can be useful to help achieve the “green” policy goals of health systems by identifying technologies that are comparatively harmful to the environment. However, health technology assessment (HTA) agencies are subject to the institutional structures and regulations of their local health system that can constrain their ability to independently adapt to incorporate EIA. Nevertheless, HTA agencies are starting to do this with some success, but difficult challenges remain, particularly around lack of data and methods. Looking ahead, raising awareness and working together across the HTA ecosystem will be key to achieving the shared goal of environmental sustainability.
Conclusions
The successful inclusion of EIA in priority setting, assessment, and appraisal, as well as knowledge sharing and dissemination activities of HTA, is a challenge since the operating principles, methods, and data in this area are not yet mature. The successful inclusion of environmental impacts in HTA will require some reconsideration of existing value frameworks and methods.
The aim of this paper is to review the latest evidence on food reformulation as a public health policy to improve our understanding of how different policy designs can drive reformulation and influence food system change. The focus is on three key nutrients of concern—trans fatty acids, salt and sugar.
In recent times, food reformulation has been categorised as either mandatory or voluntary, a distinction that can help assess policy effectiveness. However, this binary classification oversimplifies a far more complex landscape. Some policies—whether mandated by government or voluntarily suggested to industry—are explicitly intended to trigger reformulation. Others, by contrast, may have never been designed with the intention to encourage reformulation but have nonetheless prompted significant changes in product composition, intake and potential health outcomes.
Within what is commonly described as mandatory reformulation, for example, we find a broad mix of policy tools that operate very differently. Some, such as the UK’s Soft Drinks Industry Levy, were deliberately created to incentivise reformulation by applying financial pressure. Others, including front-of-pack nutrition labelling systems (particularly warning labels) and school food standards, have encouraged reformulation only as a positive unintended consequence. These indirect drivers are not always evaluated for their impact on reformulation, which may lead to an incomplete understanding of their contribution to reducing the intake of nutrients of concern and to health outcomes.
Nevertheless, emerging evidence suggests that no single policy drives reformulation alone; instead, a combination of approaches is likely to drive it and contribute to meaningful and sustained food system change.
To evaluate eligibility and participation in nutrition assistance programmes (Supplemental Nutrition Assistance Program (SNAP) and Women, Infants and Children (WIC)) among transgender and gender diverse (TGD) adults in the USA and to capture their experiences when accessing food benefits.
Design:
This was a cross-sectional analysis of the US Transgender Survey (USTS) dataset – the largest survey of TGD adults in the USA. SNAP and WIC participation and experiences when visiting the public assistance office were reported using descriptive statistics; stratified analyses by race were conducted using multivariate logistic regression modelling.
Setting:
The USTS was completed electronically in the USA.
Participants:
27 715 TGD adults.
Results:
Approximately 40·9 % of the full sample were SNAP eligible, yet only 30·6 % of those eligible were receiving the benefit; 0·45 % of the sample reported receiving WIC. TGD adults avoided the public assistance office because they feared being mistreated (3·2 %), were identified as transgender (46·2 %), were denied equal treatment (6·5 %) or were verbally harassed (5·2 %). People of colour were more likely to be denied equal treatment and verbally harassed at the public benefits office than their white peers. The impact of age, education level, employment status, relationship status and census region varied within each racial group.
Conclusions:
Far more TGD adults need food assistance compared with the general population, yet fewer are receiving the benefit. Culturally informed interventions are urgently needed to resolve the root causes of food insecurity, increase SNAP participation and address the negative experiences of TGD adults when accessing food benefits.
Aspects of the school food environment can influence food purchasing and consumption among adolescents, particularly those without access to a school meal programme. Our objective was to describe and compare food vendors of junior high schools (JHS) in Ghana.
Design:
We conducted structured observations of food vendors within a 0·25-km radius of eight JHSs. We compared foods sold and hygiene practices by vendor and community characteristics, such as on- v. off-campus location, urban v. rural, and predominant income-generating activity of the community. We also assessed the relationship between adolescent diet quality (food group diversity, all-5, NCD-protect and NCD-risk scores) and procurement method for foods consumed during the school day.
Setting:
Cape Coast and Elmina, Ghana.
Participants:
200 randomly selected students.
Results:
Of 265 identified vendors, 25·3 % sold foods on-campus. On-campus vendors were less likely to sell branded snacks (19·4 % v. 33·8 %, P = 0·001) and beverages (17·9 % v. 35·4 %, P = 0·008) and more likely to sell prepared dishes (53·7 % v. 31·8 %, P = 0·001) than off-campus vendors. Vendors practised an average of 38·8 % of applicable food hygiene practices, which did not differ by on- or off-campus location. In the previous month, 59·4 % of students most often purchased food on campus. There were no significant relationships between method of food procurement and diet quality.
Conclusion:
Many adolescents purchased food at school, and there were differences in foods sold by on- and off-campus vendors. School policies may be a promising avenue to alter food environments for adolescents.
The present study examined the association of body mass index (BMI), screen and sleep time, physical fitness and eating behaviour with Mediterranean diet (MD) adherence in a sample of pre-schoolers from Granada, Spain.
Design:
A cross-sectional, non-randomised design was employed. A multilinear regression model with backward elimination was used for analysis.
Setting:
Variables included age, BMI, screen time, hours of nightly sleep, physical fitness, food approach and food avoidance. The developed model met assumptions of multiple regression in terms of linearity, homoscedasticity, normality, independence and non-multicollinearity.
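A minimal sketch of the backward-elimination procedure described above, using statsmodels with simulated data (the 0.05 retention threshold, the simulated values, and the constructed outcome are illustrative assumptions, not the study's data):

```python
# Illustrative backward elimination for a multiple linear regression:
# repeatedly refit and drop the least significant predictor until all
# remaining predictors meet the threshold. Data are entirely simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 653  # matches the study's sample size; values are simulated
X = pd.DataFrame({
    "age": rng.uniform(3, 6, n),
    "bmi": rng.normal(16, 2, n),
    "screen_time": rng.normal(2, 1, n),
    "sleep_hours": rng.normal(10, 1, n),
    "fitness": rng.normal(0, 1, n),
    "food_approach": rng.normal(0, 1, n),
    "food_avoidance": rng.normal(0, 1, n),
})
# Outcome constructed so that sleep, screen time, and food avoidance matter
y = (0.3 * X["sleep_hours"] - 0.4 * X["screen_time"]
     - 0.5 * X["food_avoidance"] + rng.normal(0, 1.5, n))

predictors = list(X.columns)
while True:
    model = sm.OLS(y, sm.add_constant(X[predictors])).fit()
    pvals = model.pvalues.drop("const")
    worst = pvals.idxmax()
    if pvals[worst] <= 0.05:   # stop when all remaining predictors are significant
        break
    predictors.remove(worst)   # drop the least significant predictor and refit

print(model.summary())
```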
Participants:
Data were collected from 653 of the 2250 three-to-six-year-old children attending the 18 schools invited to take part in the present study.
Results:
Greater nightly sleep time and lower screen time and food avoidance were found to be predictive of MD adherence. These variables explained 15% of the variance in pre-schoolers’ MD adherence.
Conclusions:
The present study suggests that sleep and screen time and food avoidance are important components to consider when targeting improvements in MD adherence in pre-schoolers. Future research should explore the way in which parental health behaviours influence their children’s health habits in order to better understand outcomes.
The role of healthcare provider ownership in shaping health system performance remains contested. An umbrella review was conducted to synthesise evidence on the relationship between healthcare provider ownership and performance in high-income countries. Systematic reviews were included that examined the performance of healthcare providers based on ownership status. Searches yielded 1,862 results, with 31 systematic reviews meeting the inclusion criteria and one further systematic review identified through grey literature searches. Following the exclusion of 10 reviews classified as low quality and two previous umbrella reviews, both published in 2014, 20 reviews were eligible for data extraction and synthesis. Evidence on the relationship between healthcare provider ownership and several performance indicators, including health outcomes, technical efficiency, and patient satisfaction, was inconsistent across reviews. Private hospitals tended to serve wealthier patients, select less complex or costly patients, and charge higher payments for care than public comparators. Private for-profit (FP) providers of hospital and long-term care generally had poorer workforce outcomes than private not-for-profit or public providers, including reduced staffing levels, higher workloads, and lower job satisfaction. Private FP hospitals and nursing homes had better financial performance based on revenues or profit margins. Our findings underscore the need for nuanced regulatory responses to the expansion of private FP provision within publicly funded systems.
This study aims to examine the awareness, attitudes, and acceptability of medical aid in dying (MAiD) among healthcare professionals in Pakistan, a predominantly Muslim country where cultural and religious values heavily influence medical ethics and end-of-life decisions.
Methods
A cross-sectional survey was conducted online among 70 healthcare professionals, including physicians, nurses, and allied health workers in Pakistan. Data were collected via a structured, self-administered online questionnaire assessing knowledge, attitudes, and willingness to participate in MAiD-related actions. Descriptive and correlational analyses were conducted to identify patterns and associations.
Results
Participants demonstrated moderate knowledge about MAiD (M = 17.13, SD = 3.42) and moderate support for its legalization (M = 18.89, SD = 4.99). However, levels of negative attitudes (M = 32.21, SD = 6.11) and legal and ethical concerns (M = 24.73, SD = 3.66) were high. Behavioral willingness to engage in MAiD-related actions remained low (M = 2.42, SD = 3.38), with limited intent to assist (M = 0.39), refer (M = 0.64), or approve physician-assisted MAiD (M = 0.81). A significant negative correlation emerged between knowledge and support for legalization (r = − .25, p = .037), while no significant associations were observed between knowledge and willingness to participate in MAiD. Gender and profession did not significantly influence attitudes or willingness.
Significance of results
While Pakistani healthcare professionals display a conceptual understanding of MAiD, their readiness to participate remains low, primarily due to ethical, legal, and religious concerns. These findings highlight the need for creating awareness regarding MAiD and for providing culturally sensitive education, structured training in palliative care, and the development of clear legal frameworks to guide end-of-life decision-making in Muslim-majority contexts.
To explore the experiences of military medical first responders managing mass casualty incidents (MCIs) during the ongoing conflict in Ukraine to identify key challenges and insights.
Methods
This qualitative study employed in-depth, semi-structured interviews with medical first responders who managed MCIs in Ukraine. Our research team used thematic analysis to identify recurring themes and patterns within the interview data.
Results
Our results revealed crucial takeaways related to (1) the need for preparedness and training, (2) the variability of triage, (3) the importance of communication and teamwork, and (4) the resulting psychological strain.
Conclusions
These firsthand accounts offer valuable lessons for identifying challenges of first responders, developing areas of future research for MCI response strategies, and enhancing the readiness and well-being of medical first responders in current and future conflicts.
This study assessed iron-rich food consumption and its associated factors among children aged 6–23 months in South and Southeast Asia.
Design:
A cross-sectional study using data from the Standard Demographic and Health Surveys (2015–2022).
Setting:
South and Southeast Asian countries.
Subjects:
Data collected from 95,515 children aged 6 to 23 months, including information from their parents or caregivers.
Results:
The overall proportion of children aged 6 to 23 months consuming iron-rich foods in the region was 29.87% (95% CI: 29.58, 30.16). Higher odds of iron-rich food consumption were observed among children aged 12–23 months (AOR = 3.59; 95% CI: 3.45–3.76), children with a history of exclusive breastfeeding (AOR = 1.17; 95% CI: 1.12–1.23), those born to teenage mothers (AOR = 1.09; 95% CI: 1.02–1.17), those born in a health institution (AOR = 1.10; 95% CI: 1.02–1.19), and those whose mothers were pregnant at the time of the survey (AOR = 1.60; 95% CI: 1.50–1.72). Children of birth order 2–4 (AOR = 1.26; 95% CI: 1.20–1.32) and 5+ (AOR = 1.29; 95% CI: 1.18–1.43), those from female-headed households (AOR = 1.06; 95% CI: 1.01–1.12), and those with household mass media exposure (AOR = 1.27; 95% CI: 1.19–1.36) also had significantly higher odds of iron-rich food consumption. Additionally, higher odds (AOR > 1) of iron-rich food consumption were observed in Cambodia, Bangladesh, Indonesia, Myanmar, Maldives, the Philippines, Pakistan, and Timor-Leste.
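For reference, adjusted odds ratios of this kind are conventionally the exponentiated coefficients of a multivariable logistic regression model:

\[
\log\frac{p}{1-p} = \beta_0 + \beta_1 x_1 + \cdots + \beta_k x_k, \qquad \mathrm{AOR}_j = e^{\beta_j}
\]

so, for example, an AOR of 1.27 corresponds to 27 percent higher odds of iron-rich food consumption, holding the other covariates constant.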
Conclusion:
Across countries, only about 30% of children consumed iron-rich foods, with significant variation. Targeted public health efforts are essential to address maternal, child, and household factors that influence intake.