Evidence suggests a crucial role for dysfunctional default mode (DMN), salience and frontoparietal (FPN) networks, collectively termed the triple-network model, in the pathophysiology of treatment-resistant depression (TRD).
Aims
Using graph theory- and seed-based functional connectivity analyses, we attempted to elucidate the role of low-dose ketamine in the triple networks, namely the DMN, salience network and FPN.
Method
Resting-state functional connectivity magnetic resonance imaging (rs–fcMRI) data derived from two previous clinical trials of a single, low-dose ketamine infusion were analysed. In clinical trial 1 (Trial 1), patients with TRD were randomised to either a ketamine or a normal saline group, whereas in clinical trial 2 (Trial 2), patients with TRD and pronounced suicidal symptoms received a single infusion of either 0.05 mg/kg ketamine or 0.045 mg/kg midazolam. All participants underwent rs–fcMRI before the infusion and again on Day 3 post infusion. Graph theory- and seed-based functional connectivity analyses were each performed independently.
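As a minimal sketch of the graph metrics named in the Method (degree centrality, cluster coefficient and characteristic path length), the Python fragment below computes them from a simulated, thresholded functional connectivity matrix. The parcellation size, binarising threshold and data are placeholder assumptions, not the trials' actual pipeline.

```python
# Illustrative only: simulated ROI time series and an assumed 0.25 threshold.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
n_rois = 20                                           # placeholder parcellation size
fc = np.corrcoef(rng.normal(size=(n_rois, 50)))       # stand-in ROI-by-ROI correlations
np.fill_diagonal(fc, 0)

adjacency = (np.abs(fc) > 0.25).astype(int)           # assumed binarising threshold
graph = nx.from_numpy_array(adjacency)

degree_centrality = nx.degree_centrality(graph)       # degree centrality per node
cluster_coefficient = nx.clustering(graph)            # cluster coefficient per node
largest = graph.subgraph(max(nx.connected_components(graph), key=len))
char_path_length = nx.average_shortest_path_length(largest)  # characteristic path length
print(char_path_length)
```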
Results
Trial 1 demonstrated significant group-by-time effects on the degree centrality and cluster coefficient in the right posterior cingulate cortex (PCC) ventral 23a and b (DMN) and on the cluster coefficient in the right supramarginal gyrus perisylvian language area (salience network). Trial 2 found a significant group-by-time effect on the characteristic path length in the left PCC 7Am (DMN). In addition, both ketamine and normal saline infusions exerted a time effect on the cluster coefficient in the right dorsolateral prefrontal cortex a9-46v (FPN) in Trial 1.
Conclusions
These findings may support the utility of the triple-network model in elucidating ketamine’s antidepressant effect. Alterations in DMN, salience and FPN function may underlie this effect.
This Element provides a transregional overview of Pride in Asia, exploring the multifaceted nature of Pride in contemporary LGBTQIA+ events in Thailand, the Philippines, Taiwan, and Hong Kong. This collaborative research, which combines individual studies, draws on linguistic landscapes as an analytical and methodological approach. Each section examines the different manifestations of Pride as a discourse and the affordances and limitations of this discourse in facilitating the social, political, and cultural projects of LGBTQIA+ people in Asia, illustrating both commonalities and specificities in Asian Pride movements. Analyzing a variety of materials, such as protest signs, t-shirts, and media reports, each section illustrates how modes of semiosis, through practice, connect notions of gender and sexuality with broader social and political formations. The authors thus emphasize the need to view Pride not as a uniform global phenomenon but as a dynamic, locally shaped expression of LGBTQIA+ solidarity.
We developed a real-world evidence (RWE)-based Markov model to project the 10-year cost of care for patients with depression from the public payer's perspective, to inform early policy and resource planning in Hong Kong.
Methods
The model considered treatment-resistant depression (TRD) and the development of comorbidities along the disease course. The outcomes included costs for all-cause and psychiatric care. From our territory-wide electronic medical records, we identified 25,190 patients with newly diagnosed depression during the period from 2014 to 2016, with follow-up until December 2020, to capture real-world time-to-event patterns. Costs and time-varying transition inputs were derived using negative binomial and parametric survival modeling. The model can be run as a closed cohort, which follows a fixed cohort of incident patients, or as an open cohort, which introduces new patients every year. Utility values and the number of incident cases per year were derived from published sources.
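To make the projection mechanics concrete, here is a minimal closed-cohort Markov sketch of the kind described above; the states, one-year transition probabilities and per-state annual costs are illustrative placeholders, not the study's real-world-evidence inputs.

```python
# Hypothetical inputs for illustration; only the 10-year horizon and the
# closed-cohort idea come from the abstract.
import numpy as np

states = ["depression", "TRD", "comorbidity", "death"]
P = np.array([                       # assumed one-year transition matrix (rows sum to 1)
    [0.80, 0.10, 0.07, 0.03],
    [0.00, 0.75, 0.20, 0.05],
    [0.00, 0.00, 0.93, 0.07],
    [0.00, 0.00, 0.00, 1.00],
])
annual_cost = np.array([1000.0, 4000.0, 6000.0, 0.0])  # placeholder USD per state-year

cohort = np.array([9217.0, 0.0, 0.0, 0.0])   # incident cases entering the closed cohort
total_cost = 0.0
for year in range(10):                        # 10-year horizon
    total_cost += cohort @ annual_cost        # cost accrued by current state occupancy
    cohort = cohort @ P                       # move the cohort through one yearly cycle

print(f"Projected 10-year cost: USD {total_cost:,.0f}")
```

An open-cohort variant would simply add a new vector of incident cases to the cohort at the start of each yearly cycle.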
Results
There were 9,217 new patients with depression in 2023. Our closed cohort model projected that the cumulative costs of all-cause and psychiatric care for these patients would reach USD309 million and USD58 million by 2032, respectively. In our open cohort model, 55,849 to 57,896 active prevalent cases would cost more than USD322 million and USD61 million annually in all-cause and psychiatric care, respectively. Although fewer than 20 percent of patients would develop TRD or its associated comorbidities, they would contribute 31 to 54 percent of the costs. The key cost drivers were the number of annual incident cases, the probability of developing TRD and associated comorbidities, and the probability of becoming a low-intensity service user. These factors are relevant to early disease stages.
Conclusions
A small proportion of patients with depression develop TRD, but they contribute to a high proportion of the care costs. Our projection also demonstrates the application of RWE to model the long-term costs of care, which can aid policymakers in anticipating foreseeable burden and undertaking budget planning to prepare for future care needs.
Purple nutsedge (Cyperus rotundus L.) is one of the world's most resilient upland weeds, primarily spreading through its tubers. Its emergence in rice (Oryza sativa L.) fields has been increasing, likely due to changing paddy-farming practices. This study aimed to investigate how C. rotundus, an upland weed, can withstand soil flooding and become a problematic weed in rice fields. The first comparative analysis focused on the survival and recovery characteristics of growing and mature tubers of C. rotundus exposed to soil-flooding conditions. Notably, mature tubers exhibited significant survival and recovery abilities in these environments. Based on this observation, further investigation was carried out to explore the morphological structure, nonstructural carbohydrates, and respiratory mechanisms of mature tubers in response to prolonged soil flooding. Over time, the mature tubers did not form aerenchyma but instead gradually accumulated lignified sclerenchymal fibers, with lignin content also increasing. After 90 d, the lignified sclerenchymal fiber and lignin contents were 4.0 and 1.1 times higher, respectively, than those in the no-soil-flooding treatment. Concurrently, soluble sugar content decreased while starch content increased, providing energy storage, and alcohol dehydrogenase activity rose to support anaerobic respiration via alcohol fermentation. These results indicated that mature tubers survived in soil-flooding conditions by adopting a low-oxygen quiescence strategy, which involves morphological adaptation through the development of lignified sclerenchymal fibers, increased starch reserves for energy storage, and enhanced anaerobic respiration. This mechanism likely underpins the flooding tolerance of mature C. rotundus tubers, allowing them to endure unfavorable conditions and subsequently germinate and grow once flooding subsides. This study provides a preliminary explanation of the mechanism by which mature tubers of C. rotundus from upland areas tolerate flooding, shedding light on the reasons behind this weed's increasing presence in rice fields.
The right inferior frontal gyrus (RIFG) is a potential beneficial brain stimulation target for autism. This randomized, double-blind, two-arm, parallel-group, sham-controlled clinical trial assessed the efficacy of intermittent theta burst stimulation (iTBS) over the RIFG in reducing autistic symptoms (NCT04987749).
Methods
Conducted at a single medical center, the trial enrolled 60 intellectually able autistic individuals (aged 8–30 years; 30 active iTBS). The intervention comprised 16 sessions (two stimulations per week for eight weeks) of neuro-navigated iTBS or sham over the RIFG. Fifty-seven participants (28 active) completed the intervention and assessments at Week 8 (the primary endpoint) and follow-up at Week 12.
Results
Autistic symptoms (primary outcome) based on the Social Responsiveness Scale decreased in both groups (significant time effect), but there was no significant difference between groups (null time-by-treatment interaction). Likewise, there was no significant between-group difference in changes in repetitive behaviors or in the exploratory outcomes of adaptive function and emotion dysregulation. Changes in social cognition (secondary outcome) differed between groups in feeling scores on the Frith-Happé Animations (Week 8, p = 0.026; Week 12, p = 0.025). Post-hoc analysis showed that the active group improved more than the sham group on this social cognition measure. Dropout rates did not differ between groups; the most common adverse event in both groups was local pain. Notably, our findings would not survive stringent multiple-comparison corrections.
Conclusions
Our findings suggest that iTBS over the RIFG does not differ from sham in reducing autistic symptoms and emotion dysregulation. Nonetheless, RIFG iTBS may improve the social-cognitive ability to mentalize others' feelings in autistic individuals.
Taiwan's National Health Insurance (NHI) has provided additional markups on dental service fees for people with specific disabilities, and the expenditure has increased significantly from TWD473 million (USD15 million) in 2016 to TWD722 million (USD24 million) in 2022. The purpose of this study was to determine oral health risk and to develop a risk assessment model for capitation-based outpatient dental payments for children with autism.
Methods
Based on the literature and expert opinion, we developed an oral health risk level model from 2019 claims records. The model uses outpatient dental claims data to analyze: (i) the degree of caries disease; (ii) the level of dental fear or cooperation; and (iii) the level of tooth structure. Each factor was given a score from zero to four, and a total score was calculated. Low-, medium-, and high-risk groups were formed based on the total score. The oral health risk capitation models were estimated by ordinary least squares, using an individual's annual outpatient dental expenditure in 2019 as the dependent variable. For subgroups based on age group and level of disability, expenditures predicted by the models were compared with actual outpatient dental expenditures. Predictive R-squared and predictive ratios were used to evaluate the models' predictability.
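The sketch below illustrates the evaluation logic described above: fit an ordinary least squares capitation model and compare predicted with actual expenditure per subgroup via predictive ratios. All variable names and data are hypothetical stand-ins for the claims records.

```python
# Assumed toy data; a predictive ratio above 1 indicates overprediction for a subgroup.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "age_group": rng.integers(0, 3, n),            # hypothetical age bands
    "risk_score": rng.integers(0, 13, n),          # total of three 0-4 factor scores
})
df["expenditure"] = 150 + 60 * df["risk_score"] + rng.gamma(2.0, 100.0, n)

X = sm.add_constant(df[["age_group", "risk_score"]])
model = sm.OLS(df["expenditure"], X).fit()         # OLS capitation model
df["predicted"] = model.predict(X)

grouped = df.groupby("age_group")[["predicted", "expenditure"]].sum()
predictive_ratio = grouped["predicted"] / grouped["expenditure"]
print(predictive_ratio)
```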
Results
The demographic variables, level of oral health risk, preventive dental care, and the type of dental health care predicted 30 percent of subsequent outpatient dental expenditure in children with autism. For subgroups (age group and disability level) of high-risk patients, the model substantially overpredicted the expenditure, whereas underprediction occurred in the low-risk group.
Conclusions
The risk-adjusted model based principally on oral health was more accurate in predicting an individual's future expenditure than the model in the relevant previous study in Taiwan. The findings provide insight into the important risk factors for outpatient dental expenditure in children with autism and into the funding planning of dental services for people with specific disabilities.
To evaluate the mental health of paediatric cochlear implant users and analyse the relationship between six dimensions (movements, cognitive ability, emotion and will, sociality, living habits and language) and hearing and speech rehabilitation.
Methods
Eighty-two cochlear implant users were assessed using the Mental Health Survey Questionnaire. Age at implantation, time of implant use and listening modes were investigated. Categories of Auditory Performance and the Speech Intelligibility Rating Scale were used to score hearing and speech abilities.
Results
More recipients scored lower in cognitive ability and language. Age at implantation had a statistically significant effect (p < 0.05) on movements, cognitive ability, emotion and will, and language. Time of implant use and listening mode showed statistically significant effects (p < 0.05) on cognitive ability, sociality and language.
Conclusion
Timely attention should be paid to the mental health of paediatric cochlear implant users, and corresponding psychological interventions should be implemented as part of personalised rehabilitation plans.
Longitudinal studies on variations in the phenotypic and genotypic characteristics of K. pneumoniae across two decades are rare. We aimed to determine the antimicrobial susceptibility and virulence factors of K. pneumoniae isolated from patients with bacteraemia or urinary tract infection (UTI) from 1999 to 2022. A total of 699 and 1,267 K. pneumoniae isolates were obtained from bacteraemia and UTI patients, respectively, and their susceptibility to twenty antibiotics was determined; PCR was used to identify capsular serotypes and virulence-associated genes. The K64 and K1 serotypes were most frequently observed in UTI and bacteraemia, respectively, with an increasing frequency of K20, K47, and K64 observed in recent years. entB and wabG predominated across all isolates and serotypes; the least frequent virulence gene was htrA. Most isolates were susceptible to carbapenems, amikacin, tigecycline, and colistin, with the exception of K20, K47, and K64, in which resistance was widespread. The highest average number of virulence genes was observed in K1, followed by K2, K20, and K5 isolates, suggesting a contribution to the high virulence of K1. In conclusion, we found that the distribution of antimicrobial susceptibility, virulence gene profiles, and capsular types of K. pneumoniae over two decades was associated with clinical source.
Purple nutsedge (Cyperus rotundus L.) is a globally distributed noxious weed that poses a significant control challenge due to its fast and efficient propagation through the tuber, its primary reproductive organ. Gibberellic acid (GA3) has proven to be crucial for tuberization in tuberous plants. Therefore, understanding the relationship between GA3 and tuber development and propagation in C. rotundus will provide valuable information for controlling this weed. This study shows that GA3 content decreases with tuber development, corresponding to lower expression of bioactive GA3 synthesis genes (CrGA20ox and two CrGA3ox genes) and upregulation of two GA3 catabolism genes (CrGA2ox genes), indicating that GA3 is involved in tuber development. Simultaneously, the expression of two CrDELLA genes and CrGID1 declines with tuber growth and decreasing GA3, and yeast two-hybrid assays confirm that GA3 signaling is DELLA-dependent. Furthermore, exogenous application of GA3 markedly reduces the number and width of tubers and represses the growth of the tuber chain, further confirming the negative impact of GA3 on tuber development and propagation. Taken together, these results demonstrate that GA3 is involved in tuber development through a DELLA-dependent pathway in C. rotundus and plays a negative role in tuber development and propagation.
Randomized clinical trials (RCTs) are the foundation of medical advances, but participant recruitment remains a persistent barrier to their success. This retrospective data analysis aims to (1) identify clinical trial features associated with successful participant recruitment, measured by accrual percentage, and (2) compare the characteristics of the RCTs with the most and least successful recruitment, defined by varying accrual-percentage thresholds such as ≥ 90% vs ≤ 10%, ≥ 80% vs ≤ 20%, and ≥ 70% vs ≤ 30%.
Methods:
Data from the internal research registry at Columbia University Irving Medical Center and the Aggregated Analysis of ClinicalTrials.gov were collected for 393 randomized interventional treatment studies closed to further enrollment. We compared two regularized linear regression models and six tree-based machine learning models for predicting accrual percentage (i.e., reported accrual to date divided by the target accrual). The best-performing model and Tree SHapley Additive exPlanations (TreeSHAP) were used to analyze feature importance for participant recruitment. The identified features were compared between the two subgroups.
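As a hedged illustration of the modelling approach (not the registry dataset or the authors' code), the sketch below fits a CatBoost regressor to predict accrual percentage and ranks features with TreeSHAP; all feature names and values are invented placeholders.

```python
# Requires the catboost and shap packages; data are simulated placeholders.
import numpy as np
import pandas as pd
from catboost import CatBoostRegressor
import shap

rng = np.random.default_rng(42)
n_trials = 393
X = pd.DataFrame({
    "government_funded": rng.integers(0, 2, n_trials),
    "offers_compensation": rng.integers(0, 2, n_trials),
    "cancer_trial": rng.integers(0, 2, n_trials),
    "website_recruitment": rng.integers(0, 2, n_trials),
})
y = rng.uniform(0.0, 1.2, n_trials)    # accrual percentage = reported / target accrual

model = CatBoostRegressor(iterations=200, depth=4, verbose=False)
model.fit(X, y)

explainer = shap.TreeExplainer(model)             # TreeSHAP feature attributions
shap_values = explainer.shap_values(X)
importance = pd.Series(np.abs(shap_values).mean(axis=0), index=X.columns)
print(importance.sort_values(ascending=False))
```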
Results:
The CatBoost regressor outperformed the other models. Key features positively associated with recruitment success, as measured by accrual percentage, included government funding and participant compensation, whereas cancer research and non-conventional recruitment methods (e.g., websites) were negatively associated with recruitment success. Statistically significant subgroup differences (corrected p-value < .05) were found in 15 of the top 30 most important features.
Conclusion:
This multi-source retrospective study highlighted key features influencing RCT participant recruitment, offering actionable steps for improvement, including flexible recruitment infrastructure and appropriate participant compensation.
Evidence has suggested that emotional dysregulation is a transdiagnostic feature in schizophrenia and major affective disorders. However, the relationship between emotional dysregulation and appetite hormone disturbance remains unknown in nonobese adolescents with first-episode schizophrenia, bipolar disorder, and major depressive disorder.
Methods
In total, 22 adolescents with schizophrenia, 31 with bipolar disorder, 33 with major depressive disorder, and 41 healthy age-, sex-, and body mass index (BMI)/BMI percentile-matched controls were enrolled, and their levels of the appetite hormones leptin, ghrelin, insulin, and adiponectin were assessed. Emotional dysregulation symptoms were measured using the parent-reported Child Behavior Checklist-Dysregulation Profile.
Results
Adolescents with first-episode schizophrenia, bipolar disorder, and major depressive disorder exhibited greater emotional dysregulation symptoms than the control group (P = .037). Adolescents with bipolar disorder demonstrated higher log-transformed levels of insulin (P = .029) and lower log-transformed levels of leptin (P = .018) compared with the control group. BMI (P < .05) and log-transformed ghrelin levels (P = .028) were positively correlated with emotional dysregulation symptoms.
Discussion
Emotional dysregulation and appetite hormone disturbance may occur in the early stage of severe mental disorders. Further studies are required to clarify the unidirectional or bidirectional association of emotional dysregulation with BMI/BMI percentile and appetite hormones among patients with severe mental disorder.
We obtained 24 air samples in 8 general wards temporarily converted into negative-pressure wards admitting coronavirus disease 2019 (COVID-19) patients infected with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) omicron variant BA.2.2 in Hong Kong. SARS-CoV-2 RNA was detected in 19 (79.2%) of 24 samples despite enhanced indoor air dilution. It is difficult to prevent airborne transmission of SARS-CoV-2 in hospitals.
Air dispersal of respiratory viruses other than SARS-CoV-2 has not been systematically reported. The incidence and factors associated with air dispersal of respiratory viruses are largely unknown.
Methods:
We performed air sampling, collecting 72,000 L of air over 6 hours, for pediatric and adolescent patients infected with parainfluenza virus 3 (PIF3), respiratory syncytial virus (RSV), rhinovirus, or adenovirus. The patients were isolated singly or cohorted in pairs in airborne infection isolation rooms (AIIRs) from December 3, 2021, to January 26, 2022. The viral load in nasopharyngeal aspirates (NPA) and air samples was measured. Factors associated with air dispersal were investigated.
Results:
Of 20 singly isolated patients with a median age of 30 months (range, 3 months–15 years), 7 (35%) had air dispersal of viruses compatible with their NPA results. These included 4 (40%) of 10 PIF3-infected patients, 2 (66%) of 3 RSV-infected patients, and 1 (50%) of 2 adenovirus-infected patients. The mean viral load in their room air samples was 1.58×10³ copies/mL. Compared with 13 patients (65%) without air dispersal, these 7 patients had a significantly higher mean viral load in their NPA specimens (6.15×10⁷ copies/mL vs 1.61×10⁵ copies/mL; P < .001). Another 14 patients were placed in cohorts as 7 pairs infected with the same virus (PIF3, 2 pairs; RSV, 3 pairs; rhinovirus, 1 pair; and adenovirus, 1 pair) in double-bed AIIRs, all of which had air dispersal. The mean room air viral load in 2-patient cohorts was significantly higher than in rooms of singly isolated patients (1.02×10⁴ copies/mL vs 1.58×10³ copies/mL; P = .020).
Conclusion:
Air dispersal of common respiratory viruses may have infection prevention and public health implications.
In contrast to the well-described effects of early intervention (EI) services for youth-onset psychosis, the potential benefits of the intervention for adult-onset psychosis remain uncertain. This paper aims to examine the effectiveness of EI on functioning and symptomatic improvement in adult-onset psychosis, and the optimal duration of the intervention.
Methods
In total, 360 patients with psychosis aged 26–55 years were randomized to receive either standard care (SC, n = 120) or case management for 2 years (2-year EI, n = 120) or 4 years (4-year EI, n = 120) in a 4-year rater-masked, parallel-group, superiority, randomized controlled trial of treatment effectiveness (ClinicalTrials.gov: NCT00919620). Primary (i.e. social and occupational functioning) and secondary outcomes (i.e. positive and negative symptoms, and quality of life) were assessed at baseline, at 6 months, and yearly for 4 years.
Results
Compared with SC, patients with 4-year EI had better Role Functioning Scale (RFS) immediate [interaction estimate = 0.008, 95% confidence interval (CI) = 0.001–0.014, p = 0.02] and extended social network (interaction estimate = 0.011, 95% CI = 0.004–0.018, p = 0.003) scores. Specifically, these improvements were observed in the first 2 years. Compared with the 2-year EI group, the 4-year EI group had better RFS total (p = 0.01), immediate (p = 0.01), and extended social network (p = 0.05) scores at the fourth year. Meanwhile, the 4-year (p = 0.02) and 2-year EI (p = 0.004) groups had less severe symptoms than the SC group at the first year.
Conclusions
Specialized EI treatment for patients with psychosis aged 26–55 years should be provided for at least the initial 2 years of illness. Further treatment up to 4 years conferred little additional benefit in this age range over the course of the study.
Cognitive impairment is common in late-life depression, which may increase Alzheimer disease (AD) risk. Therefore, we aimed to investigate whether patients with late-life major depressive disorder (MDD) have worse cognition and increased AD-characteristic neuropathology. Furthermore, we carried out a comparison between treatment-resistant depression (TRD) and non-TRD. We hypothesized that patients with late-life depression and TRD may have increased β-amyloid (Aβ) deposits in brain regions responsible for global cognition.
Methods
We recruited 81 subjects, including 54 patients with MDD (27 TRD and 27 non-TRD) and 27 matched healthy controls (HCs). Neurocognitive tasks, including the Mini-Mental State Examination and the Montreal Cognitive Assessment, were administered to assess global cognitive function. PET with Pittsburgh compound B and with fluorodeoxyglucose was used to capture brain Aβ pathology and glucose use, respectively, in a subset of patients.
Results
MDD patients performed worse on the Montreal Cognitive Assessment (p = 0.003) and had more Aβ deposits than HCs across the brain (family-wise error-corrected p < 0.001), with the most significant finding in the left middle frontal gyrus. Significant negative correlations between global cognition and prefrontal Aβ deposits were observed in MDD patients, whereas positive correlations were noted in HCs. TRD patients had significantly more deposits in left-sided brain regions (corrected p < 0.001). The findings were not explained by APOE genotypes. No between-group fluorodeoxyglucose difference was detected.
Conclusions
Patients with late-life depression, particularly those with TRD, had increased brain Aβ deposits and showed vulnerability to Aβ deposits. A detrimental role of Aβ deposits in global cognition in patients with late-onset or non-late-onset MDD supports the theory that late-life MDD could be a risk factor for AD.
Honeybees cannot synthesize arachidonic acid (ARA) themselves and can only obtain it from food. Most pollen is deficient in ARA or contains only small amounts. The necessity of supplementary ARA in bees' diets has not been studied. The objective of this study was to investigate the effects of dietary ARA levels on the growth and immunity of Apis mellifera ligustica. A total of 25 honeybee colonies were randomly assigned to five dietary groups, which were fed basic diets supplemented with 0, 2, 4, 6, or 8% ARA. The diet with 4% ARA improved the body weight of newly emerged worker bees compared with the control group. Supplementing ARA in honeybee diets changed the fatty acid composition of the honeybee body: SFA and MUFA contents declined, and PUFA content rose in the ARA groups. Compared with the control group, ARA supplementation significantly increased the contents of ARA, C22:6n-3 (DHA), and C18:3n-6 in bees' bodies, but decreased the contents of C16:1 and C18:3n-3. The diet supplemented with 4% ARA reduced the mortality rate of honeybees infected with Escherichia coli. The activities of immune enzymes (phenoloxidase, antitrypsin, and lysozyme) and the mRNA expression levels of immune genes (defensin-2, toll, myd88, and dorsal) were improved by ARA diets to varying degrees depending on the ARA level, especially 4% ARA. These results suggest that dietary ARA can improve the growth, survival, and immune function of honeybees, and that supplementing ARA in bees' diets would be valuable for honeybee fitness.
Frozen embryo transfer (FET) has been adopted by a growing number of reproductive medicine centers because of improved outcomes compared with fresh embryo transfer. However, few studies have focused on the impact of embryo cryopreservation duration on pregnancy-related complications and neonatal birthweight. Thus, a retrospective cohort study including all FET cycles with livebirth deliveries in a university-affiliated hospital from May 2010 to September 2017 was conducted. These deliveries were grouped by the cryopreservation duration of the transferred embryo (≤3 months, 4–6 months, 7–12 months, and >12 months). The associations between embryo cryopreservation duration and pregnancy-related complications were evaluated among the groups using multinomial logistic regression. Neonatal birthweight was compared for singletons and multiples using multinomial and multilevel logistic regression, respectively. Among all 12,158 FET cycles, a total of 3,864 livebirth deliveries comprising 2,995 singletons and 1,739 multiples were included. Compared with women whose embryos were cryopreserved for ≤3 months, women undergoing FET after a cryopreservation time longer than 3 months did not show any increased risk of gestational diabetes mellitus, gestational hypertension, preeclampsia, meconium staining of the amniotic fluid, or preterm birth. Furthermore, the risk of low birthweight, macrosomia, small-for-gestational-age, or large-for-gestational-age infants for either singletons or multiples was not affected by long-term cryopreservation. In summary, embryo cryopreservation duration does not have negative effects on pregnancy-related complications or birthweight after FET.
Recent imaging studies of large datasets suggested that psychiatric disorders have common biological substrates. This study aimed to identify all the common neural substrates with connectomic abnormalities across four major psychiatric disorders by using the data-driven connectome-wide association method of multivariate distance matrix regression (MDMR).
Methods
This study analyzed a resting-state functional magnetic resonance imaging dataset of 100 patients with schizophrenia, 100 patients with bipolar I disorder, 100 patients with bipolar II disorder, 100 patients with major depressive disorder, and 100 healthy controls (HCs). We calculated a voxel-wise 4,330 × 4,330 matrix of whole-brain functional connectivity (FC) at 8-mm isotropic resolution for each participant and then performed MDMR to identify structures in which the overall multivariate pattern of FC differed significantly between each patient group and the HC group. A conjunction analysis was performed to identify common neural regions with FC abnormalities across the four psychiatric disorders.
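For readers unfamiliar with MDMR, the simplified sketch below shows the core idea for a single voxel: build a subject-by-subject distance matrix from connectivity maps and test whether group membership explains that distance structure with a PERMANOVA-style pseudo-F and a permutation p-value. The data, sizes and distance metric are assumptions, not the authors' pipeline.

```python
# Conceptual MDMR-style test for one voxel with simulated data.
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_voxels = 40, 500                       # placeholders (study used 4,330 voxels)
conn_maps = rng.normal(size=(n_subjects, n_voxels))  # one voxel's FC map per subject
group = np.repeat([0, 1], n_subjects // 2)           # patients vs healthy controls

dist = 1.0 - np.corrcoef(conn_maps)                  # 1 - Pearson r between subjects

def pseudo_f(dist, group):
    """PERMANOVA-style pseudo-F for a two-group comparison of a distance matrix."""
    n = len(group)
    total_ss = (dist[np.triu_indices(n, 1)] ** 2).sum() / n
    within_ss = 0.0
    for g in np.unique(group):
        idx = np.where(group == g)[0]
        sub = dist[np.ix_(idx, idx)]
        within_ss += (sub[np.triu_indices(len(idx), 1)] ** 2).sum() / len(idx)
    between_ss = total_ss - within_ss
    return between_ss / (within_ss / (n - 2))        # df = 1 and n - 2 for two groups

observed = pseudo_f(dist, group)
perms = np.array([pseudo_f(dist, rng.permutation(group)) for _ in range(999)])
p_value = ((perms >= observed).sum() + 1) / 1000
print(f"pseudo-F = {observed:.2f}, permutation p = {p_value:.3f}")
```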
Results
The conjunction of the MDMR maps revealed that the four groups of patients shared connectomic abnormalities in distributed cortical and subcortical structures, which included bilateral thalamus, cerebellum, frontal pole, supramarginal gyrus, postcentral gyrus, lingual gyrus, lateral occipital cortex, and parahippocampus. The follow-up analysis based on pair-wise FC of these regions demonstrated that these psychiatric disorders also shared similar patterns of FC abnormalities characterized by sensory/subcortical hyperconnectivity, association/subcortical hypoconnectivity, and sensory/association hyperconnectivity.
Conclusions
These findings suggest that major psychiatric disorders share common connectomic abnormalities in distributed cortical and subcortical regions and provide crucial support for the common network hypothesis of major psychiatric disorders.