Introduction. Some medical centers and surgeons require patients to stop smoking cigarettes prior to elective orthopaedic surgeries in an effort to decrease surgical complications. Given higher rates of smoking among rural individuals, rural patients may be disproportionately impacted by these requirements. We assessed the perceptions and experiences of rural-residing Veterans and clinicians related to this requirement. Methods. We conducted qualitative semistructured one-on-one interviews with 26 rural-residing Veterans, 10 VA orthopaedic surgery staff (from two Veterans Integrated Services Networks), 24 primary care providers (PCPs) who serve rural Veterans (14 VA; 10 non-VA), and 4 VA pharmacists. Using the knowledge, attitudes, and behavior framework, we performed conventional content analysis. Results. We found three primary themes across respondents: (1) knowledge of the requirement and its evidence base varied widely; (2) respondents held strong personal attitudes toward the requirement; and (3) respondents differed on the implementation and possible implications of the requirement. All surgery staff reported knowledge of requirements at their institution. VA PCPs reported knowledge of requirements but typically could not recall specifics. Most patients were unaware. The majority of respondents felt this requirement could increase motivation to quit smoking. Some PCPs felt a more thorough explanation of smoking-related complications would result in more quit attempts. About half of all patients felt the requirement was reasonable, regardless of their initial awareness of it. Respondents expressed little concern that the requirement might increase rural-urban disparities. Most PCPs and patients felt there should be exceptions allowing surgery, while surgical staff disagreed. Discussion. Most respondents thought elective surgery was a good motivator to quit smoking, but patients, PCPs, and surgical staff differed on whether there should be exceptions to the requirement that patients quit preoperatively.
Future efforts to augment perioperative smoking cessation may benefit from improved coordination across services and more patient education about the benefits of quitting.
Growth failure in infants born with CHD is a persistent problem, even in those provided with adequate nutrition.
Objective:
To summarise the published data describing the change in urinary metabolites during metabolic maturation in infants with CHD and to identify pathways amenable to therapeutic intervention.
Design:
Scoping review.
Eligibility criteria:
Studies using qualitative or quantitative methods to describe urinary metabolites pre- and post-cardiac surgery and the relationship with growth in infants with CHD.
Sources of evidence:
NICE Healthcare Databases website was used as a tool for multiple searches.
Results:
A total of 347 records were identified, of which 37 were duplicates. Following the removal of duplicate records, 310 abstracts and titles were screened for inclusion. The full texts of eight articles were reviewed for eligibility, of which only two related to infants with CHD. The studies included in the scoping review described urinary metabolites in 42 infants. A content analysis identified two overarching themes: metabolic variation predictive of neurodevelopmental abnormalities associated with anaerobic metabolism, and a metabolic signature associated with impacts on gut microbiota, inflammation, energy, and lipid digestion.
Conclusion:
The results of this scoping review suggest that there are considerable gaps in our knowledge relating to metabolic maturation of infants with CHD, especially with respect to growth. Surgery is a key early life feature for CHD infants and has an impact on the developing biochemical phenotype with implications for metabolic pathways involved in immunomodulation, energy, gut microbial, and lipid metabolism. These early life fingerprints may predict those individuals at risk for neurodevelopmental abnormalities.
This article is a clinical guide which discusses the “state-of-the-art” usage of the classic monoamine oxidase inhibitor (MAOI) antidepressants (phenelzine, tranylcypromine, and isocarboxazid) in modern psychiatric practice. The guide is for all clinicians, including those who may not be experienced MAOI prescribers. It discusses indications, drug-drug interactions, side-effect management, and the safety of various augmentation strategies. There is a clear and broad consensus (more than 70 international expert endorsers), based on 6 decades of experience, for the recommendations herein exposited. They are based on empirical evidence and expert opinion—this guide is presented as a new specialist-consensus standard. The guide provides practical clinical advice, and is the basis for the rational use of these drugs, particularly because it improves and updates knowledge, and corrects the various misconceptions that have hitherto been prominent in the literature, partly due to insufficient knowledge of pharmacology. The guide suggests that MAOIs should always be considered in cases of treatment-resistant depression (including those melancholic in nature), and prior to electroconvulsive therapy—while taking into account patient preference. In selected cases, they may be considered earlier in the treatment algorithm than has previously been customary, and should not be regarded as drugs of last resort; they may prove decisively effective when many other treatments have failed. The guide clarifies key points on the concomitant use of incorrectly proscribed drugs such as methylphenidate and some tricyclic antidepressants. It also illustrates the straightforward “bridging” methods that may be used to transition simply and safely from other antidepressants to MAOIs.
Seed retention, and ultimately seed shatter, are extremely important for the efficacy of harvest weed seed control (HWSC) and are likely influenced by various agroecological and environmental factors. Field studies investigated seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated the potential drivers of seed shatter in terms of weather conditions, growing degree days, and plant biomass. Based on the results, weather conditions had no consistent impact on weed seed shatter. However, greater individual weed plant biomass was associated with delayed seed shatter at harvest. This work demonstrates that HWSC can potentially reduce weed seedbank inputs of plants that have escaped early-season management practices and retained seed through harvest. However, smaller individuals within the same population that shatter seed before harvest pose a risk of escaping both early-season management and HWSC.
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling its collection and processing at crop harvest. However, seed retention likely is influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds in southern latitudes and shatter rate increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
Non-invasive prenatal testing (NIPT) for the detection of foetal aneuploidy through analysis of cell-free DNA (cfDNA) in maternal blood is offered routinely by many healthcare providers across the developed world. This testing has recently been recommended for evaluative implementation in the UK National Health Service (NHS) foetal anomaly screening pathway as a contingent screen following an increased risk of trisomy 21, 18 or 13. In preparation for delivering a national service, we have implemented cfDNA-based NIPT in our Regional Genetics Laboratory. Here, we describe our validation and verification processes and initial experiences of the technology prior to rollout of a national screening service.
Methods
Data are presented from more than 1000 patients (215 retrospective and 840 prospective) from ‘high- and low-risk pregnancies’ with outcome data following birth or confirmatory invasive prenatal sampling. NIPT was performed using the Illumina Verifi® test.
Results
Our data confirm a high-fidelity service with a failure rate of ~0.24% and high sensitivity and specificity for the detection of foetal trisomies 13, 18 and 21. The data also show that a significant proportion of patients continue their pregnancies without invasive prenatal testing or intervention after receiving a high-risk cfDNA-based result. A total of 46.5% of patients referred to date were referred for reasons other than a high screen risk. Ten percent (76/840 clinical service referrals) of patients were referred with an ultrasonographic finding of a foetal structural anomaly, and data analysis indicates both high- and low-risk scan indications for NIPT.
Conclusions
NIPT can be successfully implemented into NHS regional genetics laboratories to provide high-quality services. NHS provision of NIPT in patients with high-risk screen results will allow for a reduction of invasive testing and partially improve equality of access to cfDNA-based NIPT in the pregnant population. Patients at low risk for a classic trisomy or with other clinical indications are likely to continue to access cfDNA-based NIPT as a private test.
Knowledge of the effects of burial depth and burial duration on seed viability and, consequently, seedbank persistence of Palmer amaranth (Amaranthus palmeri S. Watson) and waterhemp [Amaranthus tuberculatus (Moq.) J. D. Sauer] ecotypes can be used for the development of efficient weed management programs. This is of particular interest, given the great fecundity of both species and, consequently, their high seedbank replenishment potential. Seeds of both species collected from five different locations across the United States were investigated in seven states (sites) with different soil and climatic conditions. Seeds were placed at two depths (0 and 15 cm) for 3 yr. Each year, seeds were retrieved, and seed damage (shrunken, malformed, or broken) plus losses (deteriorated and futile germination) and viability were evaluated. Greater seed damage plus loss averaged across seed origin, burial depth, and year was recorded for lots tested at Illinois (51.3% and 51.8%) followed by Tennessee (40.5% and 45.1%) and Missouri (39.2% and 42%) for A. palmeri and A. tuberculatus, respectively. The site differences for seed persistence were probably due to higher volumetric water content at these sites. Rates of seed demise were directly proportional to burial depth (α=0.001), whereas the percentage of viable seeds recovered after 36 mo on the soil surface ranged from 4.1% to 4.3% compared with 5% to 5.3% at the 15-cm depth for A. palmeri and A. tuberculatus, respectively. Seed viability loss was greater in the seeds placed on the soil surface compared with the buried seeds. The greatest influences on seed viability were burial conditions and time and site-specific soil conditions, more so than geographical location. Thus, management of these weed species should focus on reducing seed shattering, enhancing seed removal from the soil surface, or adjusting tillage systems.
A field study was conducted during the 2014 and 2015 growing seasons in Arkansas, Indiana, Illinois, Missouri, Ohio, and Tennessee to determine the effect of cereal rye and either oats, radish, or annual ryegrass on the control of Amaranthus spp. when integrated with comprehensive herbicide programs in glyphosate-resistant and glufosinate-resistant soybean. Amaranthus species included redroot pigweed, waterhemp, and Palmer amaranth. The two herbicide programs included were: a PRE residual herbicide followed by a POST application of foliar and residual herbicide (PRE/POST); or a PRE residual herbicide followed by a POST application of foliar and residual herbicide, followed by another POST application of residual herbicide (PRE/POST/POST). Control was not affected by the type of soybean resistance trait. At the end of the season, herbicides controlled 100 and 96% of the redroot pigweed and Palmer amaranth, respectively, versus 49 and 29% in the absence of herbicides, averaged over sites and other factors. The PRE/POST and PRE/POST/POST herbicide treatments controlled 83 and 90% of waterhemp at the end of the season, respectively, versus 14% without herbicide. Cover crop treatments affected control of waterhemp and Palmer amaranth and soybean yield only in the absence of herbicides. The rye cover crop consistently reduced Amaranthus spp. density in the absence of herbicides compared with no cover crop.
Pigweeds are among the most abundant and troublesome weed species across Midwest and mid-South soybean production systems because of their prolific growth characteristics and ability to rapidly evolve resistance to several herbicide sites of action. This has renewed interest in diversifying weed management strategies by implementing integrated weed management (IWM) programs to efficiently manage weeds, increase soybean light interception, and increase grain yield. Field studies were conducted across 16 site-years to determine the effectiveness of soybean row width, seeding rate, and herbicide strategy as components of IWM in glufosinate-resistant soybean. Sites were grouped according to optimum adaptation zones for soybean maturity groups (MGs). Across all MG regions, pigweed density and height at the POST herbicide timing, and end-of-season pigweed density, height, and fecundity, were reduced in IWM programs using a PRE followed by (fb) POST herbicide strategy. Furthermore, a PRE fb POST herbicide strategy increased soybean cumulative intercepted photosynthetically active radiation (CIPAR) and, subsequently, soybean grain yield across all MG regions. Soybean row width and seeding rate manipulation effects were highly variable. Narrow row width (≤ 38 cm) and a high seeding rate (470,000 seeds ha−1) reduced end-of-season height and fecundity variably across MG regions compared with wide row width (≥ 76 cm) and moderate to low (322,000 to 173,000 seeds ha−1) seeding rates. However, narrow row widths and high seeding rates did not reduce pigweed density at the POST herbicide application timing or at soybean harvest. Across all MG regions, soybean CIPAR increased as soybean row width decreased and seeding rate increased; however, row width and seeding rate had variable effects on soybean yield. Furthermore, soybean CIPAR was not associated with end-of-season pigweed growth and fecundity.
A PRE fb POST herbicide strategy was a necessary component for an IWM program as it simultaneously managed pigweeds, increased soybean CIPAR, and increased grain yield.
Herbicide-resistant Amaranthus spp. continue to cause management difficulties in soybean. New soybean technologies under development, including resistance to various combinations of glyphosate, glufosinate, dicamba, 2,4-D, isoxaflutole, and mesotrione, will make possible the use of more herbicide sites of action in soybean than are currently available. When this research was conducted, these soybean traits were still regulated, and testing herbicide programs with the appropriate soybean genetics in a single experiment was not feasible. Therefore, the effectiveness of various herbicide programs (PRE herbicides followed by POST herbicides) was evaluated in bare-ground experiments on glyphosate-resistant Palmer amaranth and glyphosate-resistant waterhemp (both tall and common) at locations in Arkansas, Illinois, Indiana, Missouri, Nebraska, and Tennessee. Twenty-five herbicide programs were evaluated: 5 were PRE herbicides only, 10 were PRE herbicides followed by POST herbicides 3 to 4 wk after (WA) the PRE application (EPOST), and 10 were PRE herbicides followed by POST herbicides 6 to 7 WA the PRE application (LPOST). Programs with EPOST herbicides provided 94% or greater control of Palmer amaranth and waterhemp at 3 to 4 WA the EPOST. Overall, programs with LPOST herbicides resulted in a period of weed emergence in which weeds would typically compete with a crop. Weeds were not completely controlled with the LPOST herbicides because weed sizes were larger (≥ 15 cm) than at the EPOST application (≤ 7 cm). Most programs with LPOST herbicides provided 80 to 95% control at 3 to 4 WA the LPOST application. Based on an orthogonal contrast, using a synthetic-auxin herbicide LPOST improved control of Palmer amaranth and waterhemp compared with programs not containing a synthetic-auxin herbicide LPOST.
These results show that herbicides usable in soybean containing auxin- or HPPD-resistance traits will provide growers with an opportunity for better control of glyphosate-resistant Palmer amaranth and waterhemp over a wide range of geographies and environments.
Field studies were conducted at 35 sites throughout the north-central United States in 1998 and 1999 to determine the effect of postemergence glyphosate application timing on weed control and grain yield in glyphosate-resistant corn. Glyphosate was applied at various timings based on the height of the most dominant weed species. Weed control and corn grain yields were considerably more variable when glyphosate was applied only once. The most effective and consistent season-long annual grass and broadleaf weed control occurred when a single glyphosate application was delayed until weeds were 15 cm or taller. Two glyphosate applications provided more consistent weed control when weeds were 10 cm tall or less and higher corn grain yields when weeds were 5 cm tall or less, compared with a single application. Weed control averaged at least 94 and 97% across all sites in 1998 and 1999, respectively, with two glyphosate applications but was occasionally less than 70% because of late emergence of annual grass and Amaranthus spp. or reduced control of Ipomoea spp. With a single application of glyphosate, corn grain yield was most often reduced when the application was delayed until weeds were 23 cm or taller. Averaged across all sites in 1998 and 1999, corn grain yields from a single glyphosate application at the 5-, 10-, 15-, 23-, and 30-cm timings were 93, 94, 93, 91, and 79% of the weed-free control, respectively. There was a significant effect of herbicide treatment on corn grain yield in 23 of the 35 sites when weed reinfestation was prevented with a second glyphosate application. When weed reinfestation was prevented, corn grain yield at the 5-, 10-, and 15-cm application timings was 101, 97, and 93% of the weed-free control, respectively, averaged across all sites. 
Results of this study suggested that the optimum timing for initial glyphosate application to avoid corn grain yield loss was when weeds were less than 10 cm in height, no more than 23 d after corn planting, and when corn growth was not more advanced than the V4 stage.
Palmer amaranth and waterhemp have become increasingly troublesome weeds throughout the United States. Both species are highly adaptable and emerge continuously throughout the summer months, presenting the need for a residual PRE application in soybean. To improve season-long control of Amaranthus spp., 19 PRE treatments were evaluated on glyphosate-resistant Palmer amaranth in 2013 and 2014 at locations in Arkansas, Indiana, Nebraska, Illinois, and Tennessee; and on glyphosate-resistant waterhemp at locations in Illinois, Missouri, and Nebraska. The two Amaranthus species were analyzed separately; data for each species were pooled across site-years, and site-year was included as a random variable in the analyses. The dissipation of weed control over the course of the experiments was compared among treatments with the use of regression analysis, in which percent weed control was described as a function of time (the number of weeks after treatment [WAT]). At the mean WAT (4.3 and 3.2 WAT for Palmer amaranth and waterhemp, respectively), isoxaflutole + S-metolachlor + metribuzin had the highest predicted control of Palmer amaranth (98%) and waterhemp (99%). Isoxaflutole + S-metolachlor + metribuzin, S-metolachlor + mesotrione, and flumioxazin + pyroxasulfone had a predicted control ≥ 97% and similar model parameter estimates, indicating that control declined at similar rates for these treatments. Dicamba and 2,4-D provided some short-lived residual control of Amaranthus spp. When dicamba was added to metribuzin or S-metolachlor, control increased compared with dicamba alone. Flumioxazin + pyroxasulfone, a currently labeled PRE, performed similarly to treatments containing isoxaflutole or mesotrione. Additional sites of action will provide soybean growers more opportunities to control these weeds and reduce the potential for herbicide resistance.
Thirteen field experiments were conducted in Illinois, Indiana, Ohio, and Ontario from 2005 to 2007 to determine the effects of simulated glyphosate drift followed by in-crop applications of nicosulfuron/rimsulfuron plus dicamba/diflufenzopyr or foramsulfuron plus bromoxynil plus atrazine on nontransgenic corn injury, height, stand count, shoot dry weight, and yield. Simulated glyphosate drift at 100 and 200 g/ha resulted in 11 to 61% visual crop injury and a 19 to 45% decrease in corn height. Simulated glyphosate drift at 200 g/ha reduced shoot dry weight by 46%, stand count by 28%, and yield by 49 to 56%. Generally, simulated glyphosate drift followed by the in-crop herbicides resulted in an additive response with respect to visual crop injury, height, stand count, shoot dry weight, and yield.
Field experiments were conducted across the north-central United States to determine the benefits of various weed control strategies in corn. Weed control, corn yield, and economic return increased when a preemergence (PRE) broad-spectrum herbicide was followed by (fb) postemergence (POST) herbicides. Weed control decisions based on field scouting after a PRE broad-spectrum herbicide application increased weed control and economic return. Application of a PRE grass herbicide fb a POST herbicide based on field scouting resulted in less control of velvetleaf and morningglory species and lower corn yield and economic return compared with a PRE broad-spectrum herbicide application fb scouting. Cultivation after a PRE broad-spectrum herbicide application increased weed control and corn yield compared with the herbicide applied alone, but economic return was not increased. An early-postemergence herbicide application fb cultivation resulted in the highest level of broadleaf weed control, the highest corn yield, and the greatest economic return compared with all other strategies. Weed control based on scouting proved useful in reducing the effect of weed escapes on corn yield and increased economic return compared with PRE herbicide application alone. However, economic return was not greater than with the PRE fb planned POST or total POST strategies.
The Middle Jurassic is a poorly sampled time interval for non-pelagic neosuchian crocodyliforms, which obscures our understanding of the origin and early evolution of major clades. Here we report a lower jaw from the Middle Jurassic (Bathonian) Duntulm Formation of the Isle of Skye, Scotland, UK, which consists of an isolated and incomplete left dentary and part of the splenial. Morphologically, the Skye specimen closely resembles the Cretaceous neosuchians Pachycheilosuchus and Pietraroiasuchus, in having a proportionally short mandibular symphysis, shallow dentary alveoli and inferred weakly heterodont dentition. It differs from other crocodyliforms in that the Meckelian canal is dorsoventrally expanded posterior to the mandibular symphysis and drastically constricted at the 7th alveolus. The new specimen, together with the presence of Theriosuchus sp. from the Valtos Formation and indeterminate neosuchians from the Kilmaluag Formation, indicates the presence of a previously unrecognised, diverse crocodyliform fauna in the Middle Jurassic of Skye, and Europe more generally. Small-bodied neosuchians were present, and ecologically and taxonomically diverse, in nearshore environments in the Middle Jurassic of the UK.
The amplitude of the cortically generated somatosensory evoked potential (SSEP) is used to predict outcome in comatose patients. The relationship between epileptiform discharges and SSEP amplitude has not been elucidated in those patients.
Methods
Bilateral median nerve SSEP and electroencephalograph (EEG) studies were performed in a comatose patient (patient 1) 1 day after cardiac surgery and repeated 4 days later. He had tranexamic acid administered before and during surgery. Another comatose patient (patient 2) had the same studies performed 1 day after sustaining 10 minutes of pulseless electrical cardiac activity.
Results
Both comatose patients had epileptiform discharges (on EEG) that were coincident with giant cortically generated SSEPs. In patient 1, the EEG and SSEP studies repeated 5 days postoperatively showed no epileptiform discharges, and the cortically generated SSEP amplitude was decreased (normalized) compared with that obtained one day postoperatively. He emerged from coma and had a good recovery. Patient 2 died shortly after EEG and SSEP testing.
Conclusions
Epileptiform discharges were associated with giant cortically generated median nerve SSEP amplitude (tranexamic acid was implicated in patient 1 and anoxic brain injury in patient 2). Accordingly, those who use the amplitude of cortically generated SSEPs for predicting outcome in comatose patients should consider the presence of epileptiform discharges (detected by EEG) as a potential confounding factor.