Currently, the Infectious Diseases Society of America (IDSA) Guidelines for Uncomplicated Urinary Tract Infections (UTIs) recommend a 3- to 7-day course of oral beta-lactam agents when other recommended agents are not feasible. In recent years, studies have demonstrated that shorter courses of antimicrobial therapy for acute uncomplicated cystitis are as efficacious as longer courses, but there are limited data regarding intravenous beta-lactams for acute uncomplicated cystitis.
Methods:
This single-center, retrospective, non-inferiority cohort study included adult patients admitted to University of Kentucky Albert B. Chandler Medical Center or Good Samaritan Hospital with acute uncomplicated cystitis. The primary outcome was treatment failure, defined as the need for retreatment with additional antibiotic therapy within 30 days of antibiotic completion. Secondary outcomes included incidence of C. difficile infection within 30 days of antibiotic therapy, hospital readmission, and outpatient telephone encounters within 30 days of discharge. Patients were divided into a short-course group (three days or fewer of beta-lactam antibiotics, with at least one day given intravenously) and a long-course group (four or more days of beta-lactam antibiotics).
Results:
Overall, 52 patients met inclusion criteria, with 33 in the short-course beta-lactam group and 19 in the long-course beta-lactam group. Failure rates in the short- and long-course groups were 15.2% and 15.8%, respectively (p=1.000). Ceftriaxone was the most commonly utilized antibiotic in both groups. The median total antibiotic duration was 3 days in the short-course group and 6 days in the long-course group (p<0.001).
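The reported failure-rate comparison can be reproduced with a two-sided Fisher exact test on the 2×2 table of failures by group. The counts below are back-computed from the reported percentages and group sizes (15.2% of 33 ≈ 5 failures; 15.8% of 19 ≈ 3), so they are an assumption rather than the study's raw data; a stdlib-only sketch:

```python
from math import comb

# Failure counts back-computed from the reported rates; the actual study
# counts are an assumption here, not taken from the paper's raw data.
short_fail, short_n = 5, 33
long_fail, long_n = 3, 19

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]."""
    n, r1, c1 = a + b + c + d, a + b, a + c
    def p_table(x):  # hypergeometric probability of the top-left cell being x
        return comb(r1, x) * comb(n - r1, c1 - x) / comb(n, c1)
    p_obs = p_table(a)
    lo, hi = max(0, c1 - (n - r1)), min(r1, c1)
    # Sum probabilities of all tables at least as extreme as the observed one.
    return sum(p_table(x) for x in range(lo, hi + 1) if p_table(x) <= p_obs + 1e-12)

p = fisher_exact_two_sided(short_fail, short_n - short_fail,
                           long_fail, long_n - long_fail)
print(round(short_fail / short_n * 100, 1))  # 15.2
print(round(long_fail / long_n * 100, 1))    # 15.8
print(round(p, 3))                           # 1.0, matching the reported p=1.000
```

With these reconstructed counts the observed table sits at the mode of the hypergeometric distribution, so the two-sided p-value is 1.0, consistent with the reported p=1.000.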
Conclusions:
In hospitalized patients warranting initial IV therapy for acute uncomplicated cystitis, a 3-day total course of beta-lactam therapy, with transition from IV to oral administration, should be considered.
The stars of the Milky Way carry the chemical history of our Galaxy in their atmospheres as they journey through its vast expanse. Like barcodes, we can extract the chemical fingerprints of stars from high-resolution spectroscopy. The fourth data release (DR4) of the Galactic Archaeology with HERMES (GALAH) Survey, based on a decade of observations, provides the chemical abundances of up to 32 elements for 917 588 stars that also have exquisite astrometric data from the Gaia satellite. For the first time, these elements include life-essential nitrogen to complement carbon and oxygen, as well as more measurements of rare-earth elements critical to modern-life electronics, offering unparalleled insights into the chemical composition of the Milky Way. For this release, we use neural networks to simultaneously fit stellar parameters and abundances across the whole wavelength range, leveraging synthetic grids computed with Spectroscopy Made Easy. These grids account for atomic line formation in non-local thermodynamic equilibrium for 14 elements. In a two-iteration process, we first fit stellar labels to all 1 085 520 spectra, then co-add repeated observations and refine these labels using astrometric data from Gaia and 2MASS photometry, improving the accuracy and precision of stellar parameters and abundances. Our validation thoroughly assesses the reliability of spectroscopic measurements and highlights key caveats. GALAH DR4 represents yet another milestone in Galactic archaeology, combining detailed chemical compositions from multiple nucleosynthetic channels with kinematic information and age estimates. The resulting dataset, covering nearly a million stars, opens new avenues for understanding not only the chemical and dynamical history of the Milky Way but also the broader questions of the origin of elements and the evolution of planets, stars, and galaxies.
Objectives/Goals: The Wake Forest Clinical and Translational Science Institute (CTSI) has integrated academic goals of T0-T4 translation, scholarship, and education into our Academic Learning Health System (aLHS) framework. Our Translation Research Academy (TRA) provides rigorous training for outstanding and diverse K12 and early-career faculty to develop LHS core competencies. Methods/Study Population: The TRA Forum is the main vehicle for delivering an aLHS-oriented curriculum. Currently, the program includes six K12 scholars and 18 other early-career research faculty with facilitated access to CTSI resources. The TRA Forum is a 2-year seminar series that meets twice a month to discuss topics relevant to the aLHS, leadership, and career development. Inclusion of first- and second-year scholars facilitates peer mentorship, allowing Year 2 scholars to share insights with new scholars. Forum sessions are developed around adult learning theory: each participant is asked to contribute their experience to discussions, and sessions focus on real-world examples. Results/Anticipated Results: Scholar and faculty commitment is very high. For the first 30 minutes, scholars present their work in small groups. This broadens the range of disciplines scholars are exposed to (64% of TRA graduates found this very helpful) and promotes the translational traits of boundary crosser, team player, and systems thinker. Participants view the TRA as an opportunity to form internal peer networks, promote peer mentoring, and establish new collaborations. The remaining 60 minutes are used for education. Sessions include nominated topics and those providing a solid foundation in core aLHS competencies and characteristics of translational scientists. Ninety-seven percent of educational sessions were rated helpful or very helpful.
Discussion/Significance of Impact: TRA scholars receive rigorous training in a highly supportive environment to produce aLHS researchers with skills to transcend boundaries, innovate systems, create new knowledge, and rigorously evaluate results.
Through compositional inclusion or exclusion, the photograph can assert and communicate what belongs in a picture, in a landscape, in an ecosystem. It can illuminate what we deem conservation-worthy, or, on a larger scale, which extinctions are attention-worthy. Photographic practice helps to illuminate the active nature of extinction, and our choices as actors and witnesses within that process. Here, researchers from the University of Leeds’ Extinction Studies Doctoral Training Programme present individual reflections on interdisciplinary practice-led research in the Scottish Small Isles. We consider how photography, as a form of praxis, can generate new forms of knowledge surrounding extinction: its meanings, representations, and legacies, particularly through visual representation. We offer seven perspectives on contemporary image-making, from disciplines including philosophy, conservation biology, literature, sociology, geology, cultural anthropology, and palaeontology. Researchers gathered experiential, ethical, even biological meanings from considering what to include or exclude in images: from the micro to the macro, the visible to the invisible, the aesthetic to the ecological. We draw conclusions around meaning-making through the process of photography itself, and the tensions encountered through framing and decision-making in a time of mass ecological decline.
In July 2022, a genetically linked and geographically dispersed cluster of 12 cases of Shiga toxin-producing Escherichia coli (STEC) O103:H2 was detected by the UK Health Security Agency using whole genome sequencing. Review of food history questionnaires identified cheese (particularly an unpasteurized brie-style cheese) and mixed salad leaves as potential vehicles. A case–control study was conducted to investigate exposure to these products. Case food history information was collected by telephone. Controls were recruited using a market research panel and self-completed an online questionnaire. Univariable and multivariable analyses were undertaken using Firth Logistic Regression. Eleven cases and 24 controls were included in the analysis. Consumption of the brie-style cheese of interest was associated with illness (OR 57.5, 95% confidence interval: 3.10–1,060). Concurrently, the production of the brie-style cheese was investigated. Microbiological sample results for the cheese products and implicated dairy herd did not identify the outbreak strain, but did identify the presence of stx genes and STEC, respectively. Together, epidemiological, microbiological, and environmental investigations provided evidence that the brie-style cheese was the vehicle for this outbreak. Production of unpasteurized dairy products was suspended by the business operator, and a review of practices was performed.
Neurocognitive decline is prevalent in patients with metastatic cancers and is attributed to various disease, treatment, and individual factors. Whether the presence of brain metastases (BrMets) contributes to neurocognitive decline is unclear. The aims of this study were to examine neurocognitive performance in BrMets patients and compare findings to patients with advanced metastatic cancer without BrMets. Here, we present baseline findings from an ongoing, prospective longitudinal study.
Participants and Methods:
English-speaking adults with advanced metastatic cancers were recruited from the brain metastases and lung clinics at the Princess Margaret Cancer Centre. Participants completed standardized tests (WTAR, HVLT-R, BVMT-R, COWAT, Trailmaking test, WAIS-IV Digit Span) and questionnaires (FACT-Cog v3, EORTC-QLQ C30 and BN20, PROMIS Depression(8a) and Anxiety(6a)) prior to cranial radiotherapy for those who required it. Test scores were converted to z-scores based on published normative data and averaged to create a composite neurocognitive performance score and domain scores for memory, attention/working memory, processing speed and executive function. Neurocognitive impairment was defined according to International Cancer and Cognition Task Force criteria. Univariate and multivariate regressions were used to identify individual, disease and treatment variables that predict cognitive performance.
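As a rough illustration of the scoring pipeline described above — raw test scores converted to z-scores against published norms, averaged into a composite, and impairment classified per ICCTF criteria — the sketch below uses hypothetical normative means and SDs, and one common operationalization of the ICCTF rule (two or more tests at or below −1.5 SD, or any test at or below −2.0 SD); the study's exact normative tables and cutoffs may differ:

```python
# Normative means/SDs below are illustrative placeholders, not the study's
# actual published norms.
def to_z(raw, norm_mean, norm_sd):
    """Convert a raw test score to a z-score against normative data."""
    return (raw - norm_mean) / norm_sd

def icctf_impaired(z_scores, mild=-1.5, severe=-2.0):
    """One common operationalization of the ICCTF impairment criteria:
    impaired if >=2 tests fall at or below -1.5 SD, or any test at or
    below -2.0 SD, relative to normative data (an assumption here)."""
    return sum(z <= mild for z in z_scores) >= 2 or any(z <= severe for z in z_scores)

# Hypothetical patient: three tests, each scored against its own norms.
zs = [to_z(38, 45, 5), to_z(52, 50, 8), to_z(41, 47, 4)]  # -1.4, 0.25, -1.5
composite = sum(zs) / len(zs)  # z-scores averaged into a composite score
print(round(composite, 2))
print(icctf_impaired(zs))      # only one test at -1.5 SD -> not impaired
```

The composite here averages across all tests; the study additionally computed domain scores (memory, attention/working memory, processing speed, executive function) by averaging within each domain.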
Results:
76 patients (mean (SD) age: 63.2 (11.7) years; 53% male) with BrMets were included. 61% experienced neurocognitive impairment overall; impairment rates varied across domains (38% memory, 39% executive functioning, 13% attention/working memory, 8% processing speed). BrMets quantity, volume, and location were not associated with neurocognitive performance. Better performance status (ECOG; β[95%CI]: -0.38[-0.70,-0.05], p=0.021), higher premorbid IQ (0.34[0.10,0.58], p=0.005) and greater cognitive concerns (0.02[-3.9e-04,0.04], p=0.051) were associated with better neurocognitive performance in univariate analyses. Only premorbid IQ (0.37[0.14,0.60], p=0.003) and cognitive concerns (0.02[0.0004, 0.03], p=0.05) remained significant in multivariate analysis. We also recruited 31 patients with metastatic non-small cell lung cancer (mNSCLC) with no known BrMets (age: 67.5 (8.3); 32% male) and compared them to the subgroup of BrMets patients in our sample with mNSCLC (N=32; age: 67.8 (11.7); 53% male). We found no differences in impairment rates (BrMets/non-BrMets: Cognitive Composite, 59%/55%; Memory, 31%/32%; Executive Functioning, 35%/29%; Attention/working memory, 16%/13%; Processing speed, 7%/6%; Wilcoxon rank-sum test, all p-values > 0.5). The presence or absence of BrMets did not predict neurocognitive performance. Among patients with mNSCLC, higher education (0.11[0.03,0.18], p=0.004) and premorbid IQ (0.36[0.12,0.61], p=0.003), fewer days since primary diagnosis (-0.0029[-0.0052,-0.0005], p=0.015), fewer pack-years smoking history (-0.01[-0.02,-0.001], p=0.027) and greater cognitive concerns (0.02[7e-5,0.04], p=0.045) were associated with better neurocognitive performance in univariate analyses; only premorbid IQ (0.26[0.02,0.51], p=0.04) and cognitive concerns (0.02[0.01,0.04], p=0.02) remained significant in multivariate analysis.
Conclusions:
Cognitive impairment is prevalent in patients with advanced metastatic cancers, particularly affecting memory and executive functioning. However, 39% of patients in our sample were not impaired in any domain. We found no associations between the presence of BrMets and neurocognitive function in patients with advanced cancers prior to cranial radiation. Premorbid IQ, a proxy for cognitive reserve, was associated with cognitive outcomes in our sample. Our longitudinal study will allow us to identify risk and resilience factors associated with neurocognitive changes in patients with metastatic cancers to better inform therapeutic interventions in this population.
Patients with Fontan failure are high-risk candidates for heart transplantation and other advanced therapies. Understanding the outcomes following initial heart failure consultation can help define appropriate timing of referral for advanced heart failure care.
Methods:
This is a survey study of heart failure providers seeing any Fontan patient for initial heart failure care. Part 1 of the survey captured data on clinical characteristics at the time of heart failure consultation, and Part 2, completed 30 days later, captured outcomes (death, transplant evaluation outcome, and other interventions). Patients were classified as “too late” (death or declined for transplant due to being too sick) and/or “care escalation” (ventricular assist device implanted, inotrope initiated, and/or listed for transplant) within 30 days. “Late referral” was defined as being referred too late and/or having care escalation.
Results:
Between 7/2020 and 7/2022, 77 Fontan patients (52% inpatient) had an initial heart failure consultation. Ten per cent were referred too late (six were too sick for heart transplantation, with one subsequent death, and two others died without heart transplantation evaluation, within 30 days), and 36% had care escalation (21 listed for transplant, with or without ventricular assist device implantation (n=5) and/or inotrope initiation (n=6)). Overall, 42% were late referrals. Heart failure consultation < 1 year after Fontan surgery was strongly associated with late referral (OR 6.2, 95% CI 1.8–21.5, p=0.004).
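As a sanity check, the reported odds ratio and its 95% CI should be consistent on the log scale, where a Wald interval is log(OR) ± 1.96 × SE. The standard error below is back-computed from the published interval and is only approximate, since the reported values are rounded:

```python
import math

or_hat, ci_lo, ci_hi = 6.2, 1.8, 21.5  # reported OR and 95% CI

# Back-compute the SE of log(OR) from the CI width; this is an
# approximation because the published numbers are rounded.
se = (math.log(ci_hi) - math.log(ci_lo)) / (2 * 1.96)

# Reconstruct the interval from log(OR) +/- 1.96*SE and confirm it
# lands close to the reported bounds.
lo = math.exp(math.log(or_hat) - 1.96 * se)
hi = math.exp(math.log(or_hat) + 1.96 * se)
print(round(se, 2), round(lo, 1), round(hi, 1))  # 0.63 1.8 21.4
```

The reconstructed bounds (1.8, ~21.4) agree with the reported CI to rounding error, confirming the interval is internally consistent.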
Conclusions:
Over 40% of Fontan patients seen for an initial heart failure consultation were late referrals, with 10% dying or being declined for transplant within a month of consultation. Earlier referral, particularly for those with heart failure soon after Fontan surgery, should be encouraged.
In England, a range of mental health crisis care models and approaches to organising crisis care systems have been implemented, but characteristics associated with their effectiveness are poorly understood.
Aims
To (a) develop a typology of catchment area mental health crisis care systems and (b) investigate how crisis care service models and system characteristics relate to psychiatric hospital admissions and detentions.
Method
Crisis systems data were obtained from a 2019 English national survey. Latent class analyses were conducted to identify discernible typologies, and mixed-effects negative binomial regression models were fitted to explore associations between crisis care models and admissions and detention rates, obtained from nationally reported data.
Results
No clear typology of catchment area crisis care systems emerged. Regression models suggested that provision of a crisis telephone service within the local crisis system was associated with an 11.6% lower admission rate and a 15.3% lower detention rate. Provision of a crisis cafe was associated with a 7.8% lower admission rate. Provision of a crisis assessment team separate from the crisis resolution and home treatment service was associated with a 12.8% higher admission rate.
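The percentage effects quoted above correspond to rate ratios from the log-link negative binomial model: a coefficient β implies a (exp(β) − 1) × 100 % change in the admission or detention rate. The coefficients below are back-computed from the reported percentages for illustration, not the fitted model's actual estimates:

```python
import math

def pct_change_from_coef(beta):
    """Percent change in the outcome rate implied by a log-link
    negative binomial coefficient: rate ratio exp(beta)."""
    return (math.exp(beta) - 1) * 100

# Illustrative coefficients back-computed from the reported percentages
# (an assumption, not the paper's fitted estimates).
beta_phone = math.log(1 - 0.116)  # crisis telephone: 11.6% lower admissions
beta_cafe  = math.log(1 - 0.078)  # crisis cafe: 7.8% lower admissions
beta_team  = math.log(1 + 0.128)  # separate assessment team: 12.8% higher

print(round(pct_change_from_coef(beta_phone), 1))  # -11.6
print(round(pct_change_from_coef(beta_cafe), 1))   # -7.8
print(round(pct_change_from_coef(beta_team), 1))   # 12.8
```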
Conclusions
The configuration of crisis care systems varies considerably in England, but we could not derive a typology that convincingly categorised crisis care systems. Our results suggest that a crisis phone line and a crisis cafe may be associated with lower admission rates. However, our findings suggest that crisis assessment teams separate from home treatment teams may not be associated with reductions in admissions and detentions.
Diets with a low proportion of energy from protein have been shown to cause overconsumption of non-protein energy, a phenomenon known as protein leverage. Older adults are susceptible to nutritional inadequacy. The aim of this study was to investigate associations between the protein to non-protein energy ratio (P:NP) and intakes of dietary components, and to assess the nutritional adequacy of individuals aged 65–75 years from the Nutrition for Healthy Living (NHL) Study.
Design:
Cross-sectional. Nutritional intakes from seven-day weighed food records were compared with the Nutrient Reference Values for Australia and New Zealand, Australian Guide to Healthy Eating, Australian Dietary Guidelines and World Health Organisation Free Sugar Guidelines. Associations between P:NP and intakes of dietary components were assessed through linear regression analyses.
Setting:
NHL Study.
Participants:
113 participants.
Results:
Eighty-eight participants (59 female and 29 male) with plausible dietary data had a median (interquartile range) age of 69 years (67–71); most had a high education level (86 %) and sources of income apart from the age pension (81 %). Substantial proportions had intakes below recommendations for dairy and alternatives (89 %) and wholegrains (89 %), while simultaneously exceeding recommendations for discretionary foods (100 %) and saturated fat (92 %). In adjusted analyses, P:NP (per 1 % increment) was associated with lower intakes of energy, saturated fat, free sugar and discretionary foods and higher intakes of vitamin B12, Zn, meat and alternatives, red meat, poultry and wholegrain percentage (all P < 0·05).
Conclusions:
Higher P:NP was associated with lower intakes of energy, saturated fat, free sugar and discretionary foods. Our study revealed substantial nutritional inadequacy in this group of higher socio-economic individuals aged 65–75 years.
In November 2019, an outbreak of Shiga toxin-producing Escherichia coli O157:H7 was detected in South Yorkshire, England. Initial investigations established consumption of milk from a local dairy as a common exposure. A sample of pasteurised milk tested the next day failed the phosphatase test, indicating contamination of the pasteurised milk by unpasteurised (raw) milk. The dairy owner agreed to immediately cease production and initiate a recall. Inspection of the pasteuriser revealed a damaged seal on the flow divert valve. Ultimately, there were 21 confirmed cases linked to the outbreak, of which 11 (52%) were female, and 12/21 (57%) were either <15 or >65 years of age. Twelve (57%) patients were treated in hospital, and three cases developed haemolytic uraemic syndrome. Although the outbreak strain was not detected in the milk samples, it was detected in faecal samples from the cattle on the farm. Outbreaks of gastrointestinal disease caused by milk pasteurisation failures are rare in the UK. However, such outbreaks are a major public health concern as, unlike unpasteurised milk, pasteurised milk is marketed as ‘safe to drink’ and sold to a larger, and more dispersed, population. The rapid, co-ordinated multi-agency investigation initiated in response to this outbreak undoubtedly prevented further cases.
Yarkoni's analysis clearly articulates a number of concerns limiting the generalizability and explanatory power of psychological findings, many of which are compounded in infancy research. ManyBabies addresses these concerns via a radically collaborative, large-scale and open approach to research that is grounded in theory-building, committed to diversification, and focused on understanding sources of variation.
To assess the contribution of different food groups to total salt purchases and to estimate the reduction in salt purchases if the mandatory maximum salt limits in South African legislation were fully complied with.
Design:
This study conducted a cross-sectional analysis of purchasing data from Discovery Vitality members. Data were linked to the South African FoodSwitch database to determine the salt content of each food product purchased. Food category and total annual salt purchases were determined by summing salt content (kg) per each unit purchased across a whole year. Reductions in annual salt purchases were estimated by applying legislated maximum limits to product salt content.
Setting:
South Africa.
Participants:
The study utilised purchasing data from 344 161 households, members of Discovery Vitality, collected for a whole year between January and December 2018.
Results:
Vitality members purchased R12·8 billion worth of food products in 2018, representing 9562 products from which 264 583 kg of salt was purchased. The main contributors to salt purchases were bread and bakery products (23·3 %); meat and meat products (19 %); dairy (12·2 %); sauces, dressings, spreads and dips (11·8 %); and convenience foods (8·7 %). The projected total quantity of salt that would be purchased after implementation of the salt legislation was 250 346 kg, a reduction of 5·4 % from 2018 levels.
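The headline 5·4 % reduction follows directly from the two totals reported above; a minimal arithmetic check:

```python
baseline_kg = 264_583   # total salt purchased in 2018
projected_kg = 250_346  # projected total after applying legislated limits

# Percent reduction relative to the 2018 baseline.
reduction_pct = (baseline_kg - projected_kg) / baseline_kg * 100
print(round(reduction_pct, 1))  # 5.4
```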
Conclusions:
A projected reduction in salt purchases of 5·4 % from 2018 levels suggests that meeting the mandatory maximum salt limits in South Africa will make a meaningful contribution to reducing salt purchases.
To assess the associations between nutrient intake and dietary patterns with different sarcopenia definitions in older men.
Design:
Cross-sectional study.
Setting:
Sarcopenia was defined using the Foundation for the National Institutes of Health (FNIH), the European Working Group on Sarcopenia in Older People (EWGSOP) and the European Working Group on Sarcopenia in Older People 2 (EWGSOP2) definitions. Dietary adequacy of fourteen nutrients was assessed by comparing participants’ intakes with the Nutrient Reference Values (NRV). Attainment of the NRV was summarised as ‘poor’ (meeting ≤ 9) v. ‘good’ (meeting ≥ 10) using the cut-point method. Two dietary pattern measures (the monounsaturated:saturated fat ratio and the n-6:n-3 fatty acid ratio) and individual nutrients were also used as predictor variables.
Participants:
A total of 794 men aged ≥75 years participated in this study.
Results:
The prevalence of sarcopenia by the FNIH, EWGSOP and EWGSOP2 definitions was 12·9 %, 12·9 % and 19·6 %, respectively. After adjustment, poor nutrient intake was significantly associated with FNIH-defined sarcopenia (OR: 2·07 (95 % CI 1·16, 3·67)), but not with the EWGSOP and EWGSOP2 definitions. The lowest and second-lowest quartiles of protein, Mg and Ca intakes and the lowest quartiles of n-6 PUFA and n-3 PUFA intakes were significantly associated with FNIH-defined sarcopenia. Each unit decrease in the n-6:n-3 ratio was significantly associated with a 9 % increased risk of FNIH-defined sarcopenia (OR: 1·09 (95 % CI 1·04, 1·16)).
Conclusions:
Inadequate intakes of nutrients are associated with FNIH-defined sarcopenia in older men, but not with the other two sarcopenia definitions. Further studies are required to understand these relationships.
To examine changes in micronutrient intake over 3 years and identify any associations between socio-economic, health, lifestyle and meal-related factors and these changes in micronutrient intakes among older men.
Design:
Prospective study.
Setting:
Dietary adequacy of each micronutrient was compared with the estimated average requirement from the Nutrient Reference Values (NRV). Attainment of the NRV for twelve micronutrients was incorporated into a dichotomised variable, ‘not meeting’ (meeting ≤ 6) or ‘meeting’ (meeting ≥ 7), and categorised into four categories to assess change in micronutrient intake over 3 years. Multinomial logistic regression analyses were conducted to model predictors of changes in micronutrient intake.
Participants:
Seven hundred and ninety-four men participated in a detailed diet history interview at the third wave (baseline nutrition) and 718 men participated at the fourth wave (3-year follow-up).
Results:
The mean age was 81 years (range 75–99 years). Median intakes of the majority of micronutrients decreased significantly over the 3-year follow-up. Inadequacy relative to the NRV for thiamine, dietary folate, Zn, Mg, Ca and I was significantly greater at the 3-year follow-up than at baseline. At follow-up, 21 % of participants had developed inadequate micronutrient intake and 16·4 % remained inadequate. Changes in micronutrient intakes were significantly associated with being born in the UK or Italy, low levels of physical activity, having ≥2 medical conditions and use of meal services.
Conclusions:
Micronutrient intake decreases with age in older men. Our results suggest that strategies to improve some of the suboptimal micronutrient intakes might need to be developed and implemented for older men.
Nutritional therapy is a cornerstone of burns management. The optimal macronutrient intake for wound healing after burn injury has not been identified, although high-energy, high-protein diets are favoured. The present study aimed to identify the optimal macronutrient intake for burn wound healing. The geometric framework (GF) was used to analyse wound healing after a 10 % total body surface area contact burn in mice fed ad libitum one of eleven high-energy diets varying in macronutrient composition: protein (P5−60 %), carbohydrate (C20−75 %) and fat (F20−75 %). In the GF study, the optimal ratio for wound healing was identified as a moderate-protein, high-carbohydrate diet with a protein:carbohydrate:fat (P:C:F) ratio of 1:4:2. High carbohydrate intake was associated with lower mortality, improved body weight and a beneficial pattern of body fat reserves. Protein intake was essential to prevent weight loss and mortality, but a protein intake target of about 7 kJ/d (about 15 % of energy intake) was identified, above which no further benefit was gained. High protein intake was associated with delayed wound healing and increased liver and spleen weight. Because the GF study demonstrated that an initial very high protein intake prevented mortality, a very high-protein, moderate-carbohydrate diet (P40:C42:F18) was specifically designed, and a dynamic feeding experiment was conducted to combine the benefits of an initial very high protein intake (preventing mortality) with a subsequent moderate-protein, high-carbohydrate intake (optimising wound healing). This experiment showed that switching from the initial very high-protein diet to the optimal moderate-protein, high-carbohydrate diet accelerated wound healing whilst preventing mortality and liver enlargement.
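Assuming the P:C:F ratios above are expressed as shares of energy intake (consistent with the stated ~15 % protein target), they convert to percentages as follows; note the optimal 1:4:2 diet implies roughly 14·3 % of energy from protein:

```python
def ratio_to_energy_pct(p, c, f):
    """Convert a P:C:F ratio (assumed to be shares of energy intake)
    to percentages of total energy."""
    total = p + c + f
    return tuple(round(100 * x / total, 1) for x in (p, c, f))

print(ratio_to_energy_pct(1, 4, 2))     # optimal diet: ~14.3% protein energy,
                                        # in line with the ~15% protein target
print(ratio_to_energy_pct(40, 42, 18))  # the dynamic very-high-protein diet
```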
This study identified factors affecting seniors’ ability to self-manage their health following an Emergency Department (ED) visit. Surveys (n = 380) completed by older adults and their caregivers in the ED assessed their understanding of information provided. Interviews (n = 51) completed with a participant subsample up to four weeks post-ED visit examined self-management factors. Perceived understanding of the information (“Yes, definitely”) received in the ED was greater at the time of the visit (91%) than at follow-up (71%). Patients reported self-management was influenced by communication with ED staff, understanding of post-discharge expectations and the health condition(s), caregiver availability, and various external factors. Caregivers also identified support for caregivers and patient resistance to recommendations. Senior-friendly strategies (e.g., recommendations in writing, confirmed understanding of recommendations), particularly those related to identifying those at risk and needing greater transitional supports, and greater access to and integration with community supports could enhance post-ED self-management.
Herbicide active ingredients, formulation type, ambient temperature, and humidity can influence volatility. A method was developed using volatility chambers to compare the relative volatility of different synthetic auxin herbicide formulations in controlled environments. 2,4-D or dicamba acid vapors emanating after application were captured in air-sampling tubes at 24, 48, 72, and 96 h after herbicide application. The 2,4-D or dicamba was extracted from the sample tubes and quantified using liquid chromatography and tandem mass spectrometry. Volatility from 2,4-D dimethylamine (DMA) was determined to be greater than that of 2,4-D choline in chambers where temperatures were held at 30 or 40 C and relative humidity (RH) was 20% or 50%. Air concentration of 2,4-D DMA was 0.399 µg m−3 at 40 C and 20% RH compared with 0.005 µg m−3 for 2,4-D choline at the same temperature and humidity at 24 h after application. Volatility from both 2,4-D DMA and 2,4-D choline increased as temperature increased from 30 to 40 C; however, volatility from 2,4-D choline remained lower than that observed from 2,4-D DMA. Volatility from 2,4-D choline at 40 C increased from 0.00458 to 0.0263 µg m−3 and from 0.00341 to 0.025 µg m−3 when humidity increased from 20% to 50% at 72 and 96 h after treatment, respectively, whereas volatility from 2,4-D DMA tended to be higher at 20% RH compared with 50% RH. Air concentration of dicamba diglycolamine was similar at all time points when measured at 40 C and 20% RH. By 96 h after treatment, there was a trend for lower air concentration of dicamba compared with earlier timings. This method using volatility chambers provided good repeatability with low variability across replications, experiments, and herbicides.