The aim of this paper is to review several key aspects of undernutrition in later life, with a major focus on undernutrition in community-dwelling older adults. The prevalence of undernutrition in community-dwelling older adults is about 8.5%, but higher in vulnerable subgroups such as the oldest old (19.3%), those reporting poor appetite (22.4%), and those receiving home care (15.8%). Frequently reported risk factors for undernutrition in the community include poor appetite, functional limitations and previous hospitalization. The Determinants of Malnutrition in Aged Persons (DoMAP) model provides a clear framework to structure the different direct and indirect potential determinants of undernutrition in old age. Low body mass index as well as involuntary weight loss, both important phenotypic criteria of undernutrition, are associated with early mortality in older adults. Furthermore, undernutrition in community-dwelling older adults is associated with a subsequent increased risk of frailty, falls, functional decline and rehospitalization. Qualitative studies indicate poor awareness of undernutrition among healthcare professionals working in community care as well as among older adults themselves. The Malnutrition Awareness Scale can be used to objectively measure an older person's undernutrition awareness. In conclusion, the prevalence of undernutrition among older adults living in the community is substantial and has several negative consequences for health and functioning. Strategies towards greater undernutrition awareness among primary care professionals as well as older adults themselves are therefore necessary.
This chapter delves into the severe health impacts of climate change, focusing on issues such as heat stress, infectious diseases, and food insecurity. Medical doctor Sweta Koirala from Nepal shares insights on increasing heat-related illnesses and the spread of vector-borne diseases such as dengue fever. The chapter highlights the critical need for climate adaptation measures to protect human health, emphasizing the vulnerability of agricultural systems and labour productivity. Personal stories, such as those of outdoor workers facing extreme heat in Bangladesh, illustrate the direct effects on daily life and economic stability. The CVF's Monitor and the Lancet Countdown's work on Health and Climate Change address the interplay between climate adaptation, public health, and agricultural productivity, stressing the urgent need for comprehensive health and food security policies to mitigate these impacts.
Cross-cutting issues like nutrition have not been adequately addressed for children with severe visual impairment studying in integrated schools of Nepal. To support advocacy, this study aimed to determine the nutritional status of this vulnerable group, using a descriptive cross-sectional design involving 101 students aged 5–19 years from two integrated public schools near Kathmandu Valley and two in western Nepal. The weight-for-age z-score (WAZ), height-for-age z-score (HAZ), and body mass index-for-age z-score (BAZ) were computed and categorised using World Health Organization cut-off values (overnutrition: z-score > +2.0 standard deviations (SD), healthy weight: −2.0 SD to +2.0 SD, moderate undernutrition: −3.0 SD ≤ z-score < −2.0 SD, severe undernutrition: z-score < −3.0 SD) to assess nutritional status. A child was considered to have undernutrition if any z-score was < −2.0 SD. Multivariate logistic regression was used to analyse variables linked to undernutrition. The mean age of participants was 11.86 ± 3.66 years, and the male-to-female ratio was nearly 2:1. Among the participants, 71.29% had blindness, and 28.71% had low vision. The mean BAZ and HAZ scores decreased with age. The WAZ, HAZ, and BAZ scores indicated that 6.46% were underweight, 20.79% were stunted, and 5.94% were thin, respectively. Overall, 23.76% of students had undernutrition and 7.92% had overnutrition. More than three in ten students had malnutrition, and stunting was the most prevalent form. Older students and females were more likely to have undernutrition. These findings highlight the need for nutrition interventions within inclusive education settings, particularly targeting girls with visual impairments who may face compounded vulnerabilities.
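For readers who want to operationalise the WHO cut-offs quoted above, a minimal sketch is shown below; the function names and example values are illustrative and are not taken from the study.

```python
def classify_z_score(z: float) -> str:
    """Classify a single anthropometric z-score (WAZ, HAZ or BAZ)
    using the WHO cut-offs quoted in the abstract."""
    if z > 2.0:
        return "overnutrition"
    if z >= -2.0:
        return "healthy weight"
    if z >= -3.0:
        return "moderate undernutrition"
    return "severe undernutrition"


def has_undernutrition(waz: float, haz: float, baz: float) -> bool:
    """Undernutrition is flagged if any z-score falls below -2.0 SD."""
    return any(z < -2.0 for z in (waz, haz, baz))


# Illustrative child: stunted (HAZ < -2) but otherwise within the healthy range
print(classify_z_score(-2.4))                 # moderate undernutrition
print(has_undernutrition(-1.1, -2.4, -0.8))   # True
```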
Malnutrition is a significant issue among older New Zealanders, with 24% malnourished and 35% at high risk(1). Oral nutritional supplements (ONS) are prescribed to improve nutrient intake in malnourished or at-risk individuals. Evidence supports that ONS can enhance energy and protein intake(2). However, efficacy depends on regular and adequate consumption. Fonterra Research and Development Centre sponsored a research programme of three interventions with the aim of assessing the liking, absorption, and compliance of ONS formulations (containing functional proteins at 9.6% and 14.4% w/v protein) versus commercial comparators. A feasibility study was also done to assess whether ONS could be used to fortify foods in a residential care setting. All trials received ethics approval. In study one (trial registration: NCT04397146), the palatability and satiating effects were evaluated in 104 participants. Fonterra’s 14.4% protein ONS was well-received for sweetness, creaminess, and texture, while the 9.6% protein ONS had lower palatability. Satiety levels were similar across all products. Key drivers of overall liking included smooth texture, pleasant taste, and ease of drinking. In study two (ACTRN12621000127808), a randomized, double-blind crossover trial of 18 healthy adults, the post-prandial effects of Fonterra’s formulation compared to energy and protein matched commercial products on amino acid (AA) appearance and gastric emptying were examined. Fonterra’s 14.4% protein ONS significantly increased the incremental area under the curve and peak concentration of essential and branched-chain AA, including leucine, compared to control (p<0.05). These findings suggest potential benefits for muscle mass preservation in at-risk patients. In study three (ACTRN12622000842763), a randomized, single-blind crossover trial, 100 older adults completed compliance and tolerance assessments of Fonterra’s formulation compared to energy and 9.6% protein matched commercial product. Compliance for all three ONS was high, with mean compliance rates of 96.1% for Fonterra 9.6%, 94.5% for Fonterra 14%, and 95.2% for comparator. Palatability scores were not significantly different. Adverse events were minimal and short-lived, mainly occurring on the first day; 30-50% of participants reported tolerance issues, such as flatulence, bloating, and burping, regardless of the product. No significant differences in satiety were observed between the interventions. Lastly, a pilot study assessed the feasibility of incorporating ONS into foods in a residential care setting. The chef found the ONS easy to work with and add to desserts, which subsequently increased the protein and calcium content of main meals. Residents found the fortified desserts palatable and acceptable. This research programme supports the use of ONS assisting older adults to meet their nutrient requirements and demonstrates that formulations containing Fonterra’s functional proteins are well-accepted, effective in increasing amino acid appearance, and easily incorporated into institutional diets, with high consumption compliance and minimal adverse effects.
Head and neck cancer (HNC), characterised by malignant neoplasms originating in the oral cavity, upper aerodigestive tract, the sinuses, salivary glands, bone, and soft tissues of the head and neck, is diagnosed in approximately 600 people annually in New Zealand. Although HNC is a less common cancer, it has a profound effect on almost all aspects of the lives of those affected, particularly the nutritional and social domains. This is due to the common treatment modality being surgery and/or radiotherapy, which can result in major structural and physiological changes in the affected areas, which in turn affects chewing, swallowing, and speaking(1). Specific nutrition impact symptoms (NIS) of HNC have been identified and are significant predictors of reduced dietary intake and malnutrition risk(2). We aimed to identify and describe the malnutrition risk, prevalence of NIS, and protein and energy intake of community living adult HNC survivors 6 months–3 years post treatment in New Zealand. Participants were recruited through virtual HNC support groups in New Zealand. A descriptive observational case series design was used. Malnutrition risk was determined using the Patient-Generated Subjective Global Assessment Short Form (PG-SGA SF). Malnutrition was defined as a PG-SGA SF score between 2 - 8 (mild/suspected - moderate malnutrition) or ≥9 (severely malnourished). NIS were obtained via a validated symptom checklist specific for HNC patients(3), and dietary data was collected using a four-day food record. Participants (N=7) are referred to as PTP1 – PTP7. PTP1 was well-nourished. PTP3 through PTP7 were categorised as mildly/suspected to moderately malnourished (scores ranged from 2-7), and PTP2 was severely malnourished (score of 16). NIS were experienced by all seven participants, with “difficulty chewing” and “difficulty swallowing” being the most selected and highest scored NIS that interfered with oral intake. PTP2 (severely malnourished) scored loss of appetite, difficulty chewing, and difficulty swallowing highly (interfering “a lot”), indicating a high degree of prevalence and impact. Despite being well-nourished, PTP1 had inadequate energy intake (85.5% of their estimated energy requirement (EER)). PTP2, 3, 6, and 7 also had inadequate energy intake (79.3%, 79.3%, 73.9%, and 99.3%, respectively, of their EER). All participants had adequate protein intake based on a range of 1.2-1.5 g/kg body weight per day. The prevalence of malnutrition and NIS in this case series indicates an urgent need for research to identify the true extent of malnutrition in community living HNC survivors post treatment.
Persistent malnutrition is associated with poor clinical outcomes in cancer; however, assessing its reversibility can be challenging. The present study aimed to use machine learning (ML) to predict reversible malnutrition (RM) in patients with cancer. This was a multicentre cohort study of hospitalised oncology patients. Malnutrition was diagnosed using an international consensus definition. RM was defined as a positive diagnosis of malnutrition upon patient admission that turned negative one month later. Time-series data on body weight and skeletal muscle were modelled using a long short-term memory architecture to predict RM. The model was named WAL-net, and its performance, explainability, clinical relevance and generalisability were evaluated. We investigated 4254 patients with cancer-associated malnutrition (discovery set = 2977, test set = 1277). There were 2783 men and 1471 women (median age = 61 years). RM was identified in 754 (17·7 %) patients. The RM and non-RM groups showed distinct patterns of weight and muscle dynamics, and RM was negatively correlated with the progressive stages of cancer cachexia (r = –0·340, P < 0·001). WAL-net was the best-performing model among all ML algorithms evaluated, demonstrating favourable performance in predicting RM in the test set (AUC = 0·924, 95 % CI = 0·904, 0·944) and an external validation set (n 798, AUC = 0·909, 95 % CI = 0·876, 0·943). Model-predicted RM using baseline information was associated with lower future risks of underweight, sarcopenia, performance status decline and progression of malnutrition (all P < 0·05). This study presents an explainable deep learning model, WAL-net, for early identification of RM in patients with cancer. These findings may help the management of cancer-associated malnutrition to optimise patient outcomes in multidisciplinary cancer care.
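The abstract names the architecture (an LSTM over serial weight and muscle measurements) but not its layers or hyperparameters; the PyTorch sketch below is a hypothetical minimal version of such a classifier, with layer sizes and example data chosen purely for illustration and not drawn from WAL-net itself.

```python
import torch
import torch.nn as nn


class RMClassifier(nn.Module):
    """Minimal LSTM classifier in the spirit of WAL-net: it consumes a time
    series of (body weight, skeletal muscle) measurements and outputs the
    probability of reversible malnutrition (RM). Sizes are illustrative."""

    def __init__(self, n_features: int = 2, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time_steps, n_features)
        _, (h_n, _) = self.lstm(x)          # last hidden state summarises the trajectory
        return torch.sigmoid(self.head(h_n[-1])).squeeze(-1)


# Example: 4 patients, 3 serial measurements of weight (kg) and muscle mass (kg)
x = torch.tensor([[[58.0, 21.0], [57.1, 20.6], [56.5, 20.2]],
                  [[72.0, 27.0], [72.4, 27.1], [73.0, 27.4]],
                  [[60.0, 22.0], [59.0, 21.5], [58.2, 21.0]],
                  [[80.0, 30.0], [79.5, 29.8], [79.9, 30.1]]])
model = RMClassifier()
print(model(x))  # untrained probabilities of RM, one per patient
```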
Adolescent girls are a vulnerable group whose nutrition deserves the utmost attention. This scoping review endeavours to identify the determinants of malnutrition among adolescent girls in Pakistan and to comprehend the interventions to improve their health and nutritional status. This review of the literature was conducted using Google Scholar, PubMed/Medline, Scopus and Web of Science for articles published between 2015 and 2024. MeSH terms used for the search were as follows: adolescent, youth, health, malnutrition, nutrition interventions, systems approach. In addition, reports from the WHO, the UN, the World Bank, the Government of Pakistan and other organisations were also critically reviewed. Moreover, this paper used the Pathways framework, which advocates multi-sectoral approaches for poverty reduction. In most developing countries, the compromised nutritional status of adolescent girls, compounded by poverty, has life-long health and economic consequences; their infants are also likely to have nutritional deficits and to grow up stunted. Abundant evidence has shown that nutrition-sensitive and nutrition-specific interventions can improve their nutritional status and that of subsequent generations. There is a dire need to involve key stakeholders from health, education, nutrition, population, women's development, social welfare and other relevant sectors. It is imperative to design interventions for adolescent girls in each country's context to break the intergenerational cycle of malnutrition and to improve economic productivity. Political commitment and effective governance, along with policy coherence, are required for their healthy transitions into adulthood.
Despite previous observational studies suggesting that malnutrition could be involved in venous thromboembolism (VTE), definitive evidence of causality is lacking. This study aims to explore the genetic causal association between malnutrition and VTE. The study was performed using summary statistics from genome-wide association studies for VTE (cases = 23 367; controls = 430 366). SNPs associated with the exposures were selected based on quality control steps. The primary analysis employed the inverse variance weighted (IVW) method, with additional support from Mendelian randomisation (MR)-Egger, weighted median and weighted mode approaches. MR-Egger, leave-one-SNP-out analysis and MR pleiotropy residual sum and outlier (MR-PRESSO) were used for sensitivity analysis. Cochran's Q test was used to assess heterogeneity between instrumental variables (IV). IVW suggested that overweight has a positive genetic causal effect on VTE (OR = 1·1344, 95 % CI = 1·056, 1·2186, P < 0·001). No genetic causal effect of malnutrition (IVW: OR = 0·9983, 95 % CI = 0·9593, 1·0388, P = 0·9333) was found on VTE. Cochran's Q test suggested no heterogeneity for either exposure. The results of the MR-Egger regression suggest that the analysis is not affected by horizontal pleiotropy. The results of the MR-PRESSO suggest that there are no outliers. The results revealed a statistical genetic association whereby overweight correlates with an increased risk of VTE. Meanwhile, no genetic causal link was observed between malnutrition and VTE. Further research is warranted to deepen our understanding of these associations.
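As a reference for the primary analysis method, the fixed-effect inverse variance weighted estimator can be computed directly from per-SNP summary statistics; the sketch below uses made-up numbers and is not drawn from the study's data.

```python
import numpy as np


def ivw_estimate(beta_exp, beta_out, se_out):
    """Fixed-effect inverse variance weighted (IVW) Mendelian randomisation
    estimate from per-SNP summary statistics.
    beta_exp: SNP-exposure effects; beta_out: SNP-outcome effects;
    se_out: standard errors of the SNP-outcome effects."""
    beta_exp, beta_out, se_out = map(np.asarray, (beta_exp, beta_out, se_out))
    w = beta_exp ** 2 / se_out ** 2                      # inverse-variance weights
    beta = np.sum(beta_exp * beta_out / se_out ** 2) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return beta, se


# Illustrative (not real) summary statistics for three instruments
beta, se = ivw_estimate([0.08, 0.11, 0.05], [0.010, 0.015, 0.004], [0.006, 0.007, 0.005])
print(f"OR = {np.exp(beta):.3f}, "
      f"95% CI = {np.exp(beta - 1.96 * se):.3f} to {np.exp(beta + 1.96 * se):.3f}")
```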
Malnutrition is a relevant prognostic factor in cardiovascular disease. However, it has not been studied in adults with CHD and Fontan circulation.
Methods:
Retrospective, single-centre cohort study including all consecutive adults with Fontan circulation. Objectives: (1) to evaluate the prevalence of malnutrition, defined according to the Controlling Nutritional Status score, which includes albumin, lymphocytes, and cholesterol (a scoring sketch follows this abstract), and (2) to assess its utility as a prognostic marker.
Results:
We included 93 patients (55.9% male) with a mean age of 32.7 ± 8.3 years. After a median follow-up of 5.5 years (interquartile range 2.2 – 10.6), 14 patients (15.1%) met the combined primary outcome of death or heart transplant. Moderate or severe malnutrition (Controlling Nutritional Status score ≥ 5) was detected in 18.3%. Overweight was found in 21.5% of patients, obesity in 4.3%, and low weight in 8.6%, with no significant differences in malnutrition parameters across weight categories. Patients with malnutrition had worse functional capacity (58.8% in New York Heart Association (NYHA) class III–IV vs. 33.3% in patients without malnutrition, p = 0.05).
In univariate analysis, malnutrition was associated with a worse prognosis (death or heart transplant) with a hazard ratio of 3.7 (95% confidence interval 1.3 to 10.7, p = 0.01). In the adjusted model including cyanosis, functional class, and protein-losing enteropathy, malnutrition did not reach statistical significance (p = 0.81).
Conclusion:
Malnutrition as defined by the Controlling Nutritional Status score is common in adults with Fontan circulation and represents a strong prognostic marker. The Controlling Nutritional Status score could be used in Fontan patients as a simple tool to identify a high-risk population.
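The abstract defines malnutrition with the Controlling Nutritional Status (CONUT) score (albumin, lymphocytes and cholesterol, with moderate or severe malnutrition at a score ≥ 5). The sketch below uses the commonly published CONUT cut-offs, which are an assumption here and should be checked against the paper's exact definition.

```python
def conut_score(albumin_g_dl: float, lymphocytes_per_mm3: float, cholesterol_mg_dl: float) -> int:
    """Controlling Nutritional Status (CONUT) score using commonly published
    cut-offs (albumin in g/dL, lymphocytes per mm3, cholesterol in mg/dL)."""
    if albumin_g_dl >= 3.5:
        alb = 0
    elif albumin_g_dl >= 3.0:
        alb = 2
    elif albumin_g_dl >= 2.5:
        alb = 4
    else:
        alb = 6

    if lymphocytes_per_mm3 >= 1600:
        lym = 0
    elif lymphocytes_per_mm3 >= 1200:
        lym = 1
    elif lymphocytes_per_mm3 >= 800:
        lym = 2
    else:
        lym = 3

    if cholesterol_mg_dl >= 180:
        chol = 0
    elif cholesterol_mg_dl >= 140:
        chol = 1
    elif cholesterol_mg_dl >= 100:
        chol = 2
    else:
        chol = 3

    return alb + lym + chol


score = conut_score(3.2, 1100, 150)
print(score, "moderate/severe malnutrition" if score >= 5 else "none/mild")
```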
This study aimed to assess the concordance between different anthropometric indexes in the Global Leadership Initiative on Malnutrition (GLIM) criteria and the Geriatric Nutritional Risk Index (GNRI) for evaluating muscle mass, while also exploring performance-based criteria for the GLIM reduced muscle mass component suitable for elderly patients with intermediate and advanced tumours. A total of 312 patients admitted to Shanghai Tenth People's Hospital between September 2022 and June 2023 were retrospectively included. Nutritional assessments were conducted using the GLIM framework, employing grip strength, upper arm circumference and calf circumference as criteria for muscle mass evaluation. The diagnostic value of these tools was compared against the GNRI as a reference standard. Among the participants, 127 (40·71 %) were diagnosed as malnourished by GNRI, while the GLIM assessments yielded 138 (44·23 %), 128 (41·03 %) and 162 (51·92 %) malnutrition diagnoses based on grip strength, calf circumference and upper arm circumference, respectively. Both GNRI and GLIM-grip strength were significantly associated with complications and length of hospital stay. Notably, using GNRI as a reference, GLIM-grip strength demonstrated good consistency in diagnosing malnutrition (K value = 0·692, P < 0·001), with calf circumference having the highest diagnostic value. In conclusion, grip strength is a practical and effective performance-based criterion within the GLIM standards and has the potential to enhance malnutrition diagnosis in elderly patients with advanced malignancies, highlighting its relevance in nutritional science.
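The reported agreement (kappa = 0·692) between GLIM-grip strength and GNRI corresponds to an unweighted Cohen's kappa on two binary malnutrition classifications; a minimal sketch with toy data is shown below.

```python
import numpy as np


def cohen_kappa(a, b):
    """Unweighted Cohen's kappa between two binary raters/tools
    (e.g. GLIM-grip strength vs. GNRI malnutrition diagnoses)."""
    a, b = np.asarray(a), np.asarray(b)
    po = np.mean(a == b)                                             # observed agreement
    pe = np.mean(a) * np.mean(b) + np.mean(1 - a) * np.mean(1 - b)   # chance agreement
    return float((po - pe) / (1 - pe))


# Toy example: 1 = malnourished, 0 = not malnourished
gnri = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
glim_grip = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]
print(round(cohen_kappa(gnri, glim_grip), 3))  # 0.6 for this toy data
```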
Understanding the determinants of malnutrition is pivotal for public health interventions. This study aimed to identify socio-economic, demographic, dietary and maternal determinants of wasting and overweight among Brazilian children between 6 and 59 months. Data from the Brazilian National Survey on Child Nutrition were analysed (n 11 789). Children’s weight-for-height Z-scores were calculated according to the WHO growth standard and classified as wasting (Z < −2), normal weight (–2 ≤ Z ≤ 1), overweight risk (1 < Z ≤ 2) and overweight (Z > 2). Socio-economic, demographic, dietary and maternal covariables were considered. Adjusted multinomial logistic regression (OR and 95 % CI) was employed. The prevalence of overweight and wasting was 9·5 and 2·6 %, respectively. In the adjusted model, younger age (6–23 months: OR: 1·7; 95 % CI: 1·3, 2·2), consumption of ≥ 5 ultra-processed food groups (OR: 1·8; 95 % CI: 1·1, 3·1), maternal underweight (OR: 0·4; 95 % CI: 0·2, 0·9), overweight (OR: 1·5; 95 % CI: 1·2, 1·9) and mild food insecurity (OR: 0·8; 95 % CI: 0·6, 1·0) were associated with child overweight. The Brazilian Northeast (OR: 4·9; 95 % CI: 2·1, 11·3), Southeast (OR: 7·1; 95 % CI: 3·0, 16·6), South (OR: 4·7; 95 % CI: 1·8, 12·1), Midwest regions (OR: 2·7; 95 % CI: 1·2, 6·2) and maternal underweight (OR: 5·4; 95 % CI: 2·7, 10·7) were associated with wasting. Overweight in Brazil is prevalent among children between 6 and 59 months, while wasting is not a major public health problem. The main determinants of these Brazilian children’s nutritional status were age, ultra-processed food consumption and maternal nutritional status.
To examine power and governance arrangements in food and nutrition policy formulation and agenda-setting in South Africa.
Design:
Analysis of the policy implementation environment and in-depth interviews were conducted focussing on: existing policy content and priorities across food system sectors; institutional structures for cross-sectoral and external stakeholder engagement; exercise of power in relation to food system policies; and opportunities to strengthen action on nutrition.
Setting:
South Africa
Participants:
Interviews were conducted with 48 key stakeholders involved in the food and nutrition policy sphere: government sectors relevant to food systems (n=21), the private sector (n=4), academia (n=10), NGOs (n=11) and farmers (n=2).
Results:
This study found that power dynamics involved in shaping the planning agenda are inadvertently generating a food system that undermines the right to food. The concept of nutrition governance remains poorly defined and is applied in different ways, usually based on a relatively narrow interpretation, thereby limiting policy coherence and coordination. South Africa has strong legal institutions and practices, and social policies that support public provisioning of food, but a non-interventionist approach to the food system.
Conclusions:
The right to food and nutrition, as outlined in the South African Constitution, has not yet been effectively utilized to establish a robust normative and legal basis for tackling the dual challenges of food insecurity and malnutrition. Currently, the governance of the food system is grappling with substantial obstacles, balancing the influence of powerful stakeholders who uphold the status quo against its responsibilities for food justice.
The current study is an attempt to explore under-five child malnutrition in a low-income population setting using the Extended Composite Index of Anthropometric Failure (ECIAF).
Design:
Data from the Bangladesh Demographic and Health Survey 2017–2018 were analysed. Malnutrition was estimated using the ECIAF, which combines stunting, wasting, underweight and overweight (a classification sketch follows this abstract). Multilevel logistic regression models identified factors associated with malnutrition. Geospatial analysis was conducted using R.
Setting:
Bangladesh.
Participants:
Children under 5 years of age.
Results:
In Bangladesh, as indicated by the ECIAF, approximately 40·8 % (95 % CI: 39·7, 41·9) of children under five experience malnutrition, whereas about 3·3 % (95 % CI: 2·9, 3·7) were overweight. Children of parents with no formal education (56·3 %, 95 % CI: 50·8, 61·8), with underweight mothers (53·4 %, 95 % CI: 50·4, 56·3), belonging to the lowest socio-economic strata (50·6 %, 95 % CI: 48·3, 53·0), residing in rural areas (43·3 %, 95 % CI: 41·9, 44·6) and aged below 3 years (47·7 %, 95 % CI: 45·2, 50·2) demonstrated a greater age- and sex-adjusted prevalence of malnutrition. The Sylhet division (Eastern region) exhibited a higher prevalence of malnutrition (> 55·0 %). Maternal lack of formal education (adjusted OR (AOR): 1·51, 95 % CI: 1·08, 2·10), maternal underweight (AOR: 1·54, 95 % CI: 1·03, 1·83), poorest socio-economic status (AOR: 2·14, 95 % CI: 1·64, 2·81), age 24–35 months (AOR: 2·37, 95 % CI: 1·97, 2·85) and fourth or higher birth order (AOR: 1·41, 95 % CI: 1·16, 1·72) were identified as key factors associated with childhood malnutrition after adjusting for community- and household-level variation.
Conclusions:
In Bangladesh, two out of five children were malnourished, and one in thirty-five children was overweight. Continuous monitoring of the ECIAF over time would facilitate tracking changes in the prevalence of different forms of malnutrition, helping to plan interventions and assess their effectiveness in addressing both undernutrition and overweight.
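A minimal sketch of the anthropometric-failure flags underlying the ECIAF is shown below; it assumes the conventional −2/+2 z-score thresholds and only flags each condition, whereas the published index further splits children into mutually exclusive failure subgroups.

```python
def eciaf_flags(haz: float, whz: float, waz: float) -> dict:
    """Sketch of the anthropometric-failure flags behind the ECIAF:
    stunting (HAZ < -2), wasting (WHZ < -2), underweight (WAZ < -2)
    and overweight (WHZ > +2). Thresholds are the conventional WHO ones."""
    flags = {
        "stunted": haz < -2,
        "wasted": whz < -2,
        "underweight": waz < -2,
        "overweight": whz > 2,
    }
    flags["any_malnutrition"] = any(flags.values())
    return flags


# Illustrative child: stunted only
print(eciaf_flags(haz=-2.3, whz=-0.5, waz=-1.4))
# {'stunted': True, 'wasted': False, 'underweight': False, 'overweight': False, 'any_malnutrition': True}
```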
Low vegetable consumption among school-age children and adolescents puts them at risk of micronutrient malnutrition and non-communicable diseases. There is a dearth of synthesised literature on vegetable intake and interventions to promote increased consumption among this age group in West Africa. This study pooled evidence on vegetable consumption and interventions to promote vegetable consumption among school-age children and adolescents (6–19 years) in West Africa. PubMed, African Journals Online (AJOL) and Google Scholar were searched electronically for quantitative and qualitative studies published between 2002 and 2023. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines were followed in reporting this review (PROSPERO ID: CRD42023444444). The Joanna Briggs Institute critical appraisal tool was used to appraise the quality of the included studies. Forty studies met the search criteria out of 5080 non-duplicated records. Meta-analysis was not possible due to high heterogeneity. Low vegetable consumption, expressed in frequency or amounts, was recorded among school-age children and adolescents in the reviewed studies. Intervention studies were mostly among adolescents; the most common type of intervention was nutrition education. Insufficient evidence and high heterogeneity of studies reflect the need for more high-quality interventions using globally identified standards but applied contextually. School-age children appear to be an under-served population in West Africa with regard to nutrition interventions to promote vegetable consumption. There is a need for multi-component intervention studies that encourage vegetable consumption as a food group. Gardening, parental involvement, gamification and goal setting are promising components that could improve the availability, accessibility and consumption of vegetables.
An assessment of systemic inflammation and nutritional status may form the basis of a framework to examine the prognostic value of cachexia in patients with advanced cancer. The objective of the study was to examine the prognostic value of the Global Leadership Initiative on Malnutrition criteria, including BMI, weight loss (WL) and systemic inflammation (as measured by the modified Glasgow Prognostic Score (mGPS)), in advanced cancer patients. Three criteria were examined in a combined cohort of patients with advanced cancer, and their relationship with survival was examined using Cox regression methods. Data were available on 1303 patients. Considering BMI and the mGPS, the 3-month survival rate varied from 74 % (BMI > 28 kg/m2) to 61 % (BMI < 20 kg/m2) and from 84 % (mGPS 0) to 60 % (mGPS 2). Considering WL and the mGPS, the 3-month survival rate varied from 81 % (WL ± 2·4 %) to 47 % (WL ≥ 15 %) and from 93 % (mGPS 0) to 60 % (mGPS 2). Considering BMI/WL grade and mGPS, the 3-month survival rate varied from 86 % (BMI/WL grade 0) to 59 % (BMI/WL grade 4) and from 93 % (mGPS 0) to 63 % (mGPS 2). When these criteria were combined, they better predicted survival. On multivariate survival analysis, the most highly predictive factors were BMI/WL grade 3 (HR 1·454, P = 0·004), BMI/WL grade 4 (HR 2·285, P < 0·001) and mGPS 1 and 2 (HR 1·889, HR 2·545, all P < 0·001). In summary, a high BMI/WL grade and a high mGPS as outlined in the BMI/WL grade/mGPS framework were consistently associated with poorer survival of patients with advanced cancer. It can be readily incorporated into the routine assessment of patients.
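The survival analysis described above amounts to fitting a Cox proportional hazards model with BMI/WL grade and mGPS as covariates; the sketch below is a hypothetical illustration using the lifelines package and made-up data, not the study's cohort.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Illustrative survival data (months), not the study's: one row per patient with
# the BMI/weight-loss grade (0-4) and the modified Glasgow Prognostic Score (0-2).
df = pd.DataFrame({
    "months":       [3, 12, 7, 2, 18, 5, 9, 1, 14, 4],
    "died":         [1, 0, 1, 1, 0, 1, 0, 1, 0, 1],
    "bmi_wl_grade": [4, 1, 3, 4, 3, 1, 0, 4, 2, 2],
    "mgps":         [2, 0, 1, 2, 1, 2, 0, 1, 0, 2],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="died")
print(cph.summary[["exp(coef)", "p"]])   # hazard ratios for BMI/WL grade and mGPS
```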
To describe the economic, lifestyle and nutritional impact of the COVID-19 pandemic on parents, guardians and children in Malaysia, Indonesia, Thailand and Vietnam.
Design:
Data from the SEANUTS II cohort were used. Questionnaires, including a COVID-19 questionnaire, were used to study the impact of the pandemic on parents/guardians and their children with respect to work status, household expenditures and children’s dietary intake and lifestyle behaviours.
Setting:
Data were collected in Malaysia, Indonesia, Thailand and Vietnam between May 2019 and April 2021.
Participants:
In total, 9203 children, aged 0·5–12·9 years, including their parents/guardians.
Results:
Children and their families were significantly affected by the pandemic. Although the impact of lockdown measures on children's food intake was relatively mild in all countries, food security was negatively impacted, especially in Indonesia. Surprisingly, in Malaysia, lockdown resulted in overall healthier dietary patterns, with more basic food groups and fewer discretionary foods; consumption of milk/dairy products, however, decreased. In the other countries, self-reported intake of most food groups did not change much during lockdown. Only in rural Thailand did some marginal decreases in food intake during lockdown persist after lockdown. Physical activity of children, monthly household income and job security of the parents/guardians were negatively affected in all countries due to the pandemic.
Conclusion:
The COVID-19 pandemic has significantly impacted societies in South-East Asia. To counteract negative effects, economic measures should be combined with strategies to promote physical activity and nutrient-adequate diets, increasing the resilience of the population.
The aim of this study was to analyse the validity and reliability of the Turkish version of the renal inpatient nutrition screening tool (Renal iNUT) for haemodialysis patients. The Renal iNUT and the malnutrition universal screening tool (MUST) were used in adult haemodialysis patients at two different centres to identify malnutrition. The subjective global assessment (SGA), regarded as the gold standard for nutritional status assessment, was utilised for comparison. Structural validity was assessed using biochemical values and anthropometric measurements, while reliability was assessed by repeating the Renal iNUT assessment. Of the 260 patients admitted, 42·3 % were malnourished (SGA score B or C). According to the Renal iNUT, 59·6 % of the patients were at increased risk for malnutrition (score ≥ 1) and 3·8 % required referral to a dietitian (score ≥ 2). According to the MUST, 13·1 % of the patients were at increased risk for malnutrition and 8·5 % required referral to a dietitian. The Renal iNUT was found to be more sensitive in detecting increased risk of malnutrition in haemodialysis patients compared with the MUST (59·6 % v. 13·1 %). With the SGA as reference, the sensitivity of the Renal iNUT was higher than that of the MUST (89 % and 45 %, respectively). Kappa-assessed reliability of the Renal iNUT was 0·48 (95 % CI, 0·58, 0·9), indicating moderate concordance. The Renal iNUT is a valid and reliable nutritional screening tool for evaluating the nutritional status of haemodialysis patients. The use of the Renal iNUT by dietitians will contribute to the identification of malnutrition and its treatment.
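The reported sensitivities (89 % v. 45 %) are computed against the SGA as the reference standard; a minimal sketch of that calculation with toy data is shown below.

```python
import numpy as np


def sensitivity_specificity(screen_positive, reference_positive):
    """Sensitivity and specificity of a screening tool (e.g. Renal iNUT score >= 1)
    against a reference standard (e.g. SGA rating of B or C)."""
    s = np.asarray(screen_positive, dtype=bool)
    r = np.asarray(reference_positive, dtype=bool)
    sens = np.sum(s & r) / np.sum(r)      # true positives / all reference positives
    spec = np.sum(~s & ~r) / np.sum(~r)   # true negatives / all reference negatives
    return float(sens), float(spec)


# Toy data: 1 = flagged/malnourished, 0 = not
inut = [1, 1, 0, 0, 0, 1, 1, 0, 0, 1]
sga = [1, 1, 0, 1, 0, 0, 1, 0, 0, 1]
print(sensitivity_specificity(inut, sga))  # (0.8, 0.8) for this toy data
```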
Central America was a “hot spot” in the Cold War, constituting a strategic zone for US campaigns against communism from the 1960s to the 1980s. During the same period, the region was also a “hot spot” due to the critical nutritional situation of its poorest populations. Informed by the idea of a “protein gap,” international organizations and scientific institutions carried out field investigations and nutritional surveys to identify dietary deficiencies, their causes, and possible solutions. This chapter explores the role that bean varietal improvement played in this situation of war and nutritional crisis, and the political and social conditions under which bean research took shape. It describes the research programs that the International Center for Tropical Agriculture (CIAT) promoted in Latin America through the 1980s and Central American countries’ participation in these. It reviews the bean program established by CIAT in Latin America and Africa and a regional program created specifically for Central America and the Caribbean. It then interprets the evolution of these programs in the context of civil war and economic crisis in Central America between 1970 and 1990.
Previous studies have shown that patients who are readmitted to the hospital from a skilled nursing facility (SNF) have a higher mortality rate. The objective of this study was to identify factors associated with a high mortality rate among older adults who require hospital readmission during a presumed short stay in a SNF, so that such events can trigger a goals-of-care discussion.
Methods
Retrospective study of 847 patients aged 65 and above who were discharged from one large urban academic medical center to multiple SNFs in 2019.
Results
Charts of 847 patients admitted to SNFs after an acute hospital stay were reviewed; their overall 1-year mortality rate was 28.3%. The 1-year mortality rate among individuals readmitted to the hospital within 30 days of discharge to a SNF was 50%, whereas for those who did not require readmission, the rate was 22%. For the most common diagnostic categories (nervous system and musculoskeletal), patients readmitted to the hospital within 30 days of discharge to a SNF had a roughly threefold higher 1-year mortality rate. A worse frailty score on hospital readmission, poor nutrition, and weight loss were the individual factors associated with the highest mortality, up to 83%.
Significance of results
Hospital discharge to SNF and readmission from SNF within 30 days, further decline in functional status, and malnutrition characterize high-risk groups that should trigger care preference and prognostic discussions with patients as these events may be markers of vulnerability and are associated with high 1-year mortality rates.
The negative role of malnutrition in patients with Crohn's disease is known; however, many coexisting disease-related factors could cause misinterpretation of the real culprit. This study aimed to describe the role of malnutrition using a novel methodology, entropy balancing. This was a retrospective analysis of consecutive patients undergoing elective major surgery for Crohn's disease, preoperatively screened following the European Society for Clinical Nutrition and Metabolism guidelines. Two-step entropy balancing was applied to the group of malnourished patients to obtain an equal cohort having a null or low risk of malnutrition. The first reweighting homogenised the cohorts for non-modifiable confounding factors. The second reweighting matched the two groups for modifiable nutritional factors, assuming successful treatment of malnutrition. The entropy balancing was evaluated using the d-value. Postoperative results are reported as mean difference or OR, with a 95 % CI. Of the 183 patients, 69 (37·7 %) were at moderate/high risk for malnutrition. The malnourished patients had lower BMI (d = 1·000), Hb (d = 0·715) and serum albumin (d = 0·981), and higher lymphocyte count (d = 0·124), Charlson Comorbidity Index (d = 0·257), American Society of Anesthesiologists class (d = 0·327) and Harvey-Bradshaw score (d = 0·696). Protective loop ileostomy was more frequently performed (d = 0·648) in the malnourished group. After the first reweighting, malnourished patients experienced a prolonged length of stay (mean difference = 1·9 days; 95 % CI 0·11, 3·71), a higher overall complication rate (OR 4·42; 1·39, 13·97) and a higher comprehensive complication index score (mean difference = 8·9; 2·2, 15·7). After the second reweighting, the postoperative course of the two groups was comparable. Entropy balancing showed the independent role of preoperative malnutrition and the possible advantages obtainable from a pre-habilitation programme in Crohn's disease patients awaiting surgery.
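Entropy balancing reweights one group so that its weighted covariate means match the other group's while staying as close as possible to uniform weights. The sketch below is a generic, hypothetical implementation of a single reweighting step (not the study's two-step procedure), using illustrative covariates and target means.

```python
import numpy as np
from scipy.optimize import minimize


def entropy_balance(X_donor: np.ndarray, target_means: np.ndarray) -> np.ndarray:
    """Sketch of entropy balancing: find weights for one group that stay as
    close as possible to uniform (minimum Kullback-Leibler divergence) while
    making its weighted covariate means equal the target means."""
    n = X_donor.shape[0]
    centred = X_donor - target_means            # constraint: weighted mean of `centred` = 0

    def dual(lmbda):
        # Convex log-sum-exp dual; its gradient is the weighted, centred mean,
        # so minimising it enforces the moment constraints.
        return np.log(np.sum(np.exp(centred @ lmbda))) - np.log(n)

    res = minimize(dual, np.zeros(X_donor.shape[1]), method="BFGS")
    w = np.exp(centred @ res.x)
    return w / w.sum()


# Toy example: reweight 5 patients on (BMI, albumin) to match target means (24.0, 4.0)
X = np.array([[19.5, 3.1], [21.0, 3.2], [23.5, 4.0], [25.0, 4.2], [26.5, 4.4]])
w = entropy_balance(X, np.array([24.0, 4.0]))
print(w)                                  # balancing weights
print((w[:, None] * X).sum(axis=0))       # weighted means, approximately the target
```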