The long-tailed goral Naemorhedus caudatus is a small ungulate that inhabits mountainous regions in eastern Russia, China and Korea. It is highly sensitive to human disturbance and is categorized as Vulnerable on the IUCN Red List. We present the first distribution map of the long-tailed goral in South Korea and identify habitats critical for enhancing conservation and recovery efforts. We conducted a two-step modelling process: we first used MaxEnt to identify sites for field surveys and subsequently to draft a distribution map, and then used linear mixed-effects modelling to identify predictors of goral presence. Based on 641 records of the goral, we used MaxEnt to identify 364 of 1,027 10×10 km grid cells as potentially suitable for the species. In field surveys during 2019–2022, we confirmed goral presence at 892 of 1,232 survey sites in 123 of the 364 grid cells, primarily in the north-eastern and central-eastern mountains. There were no detections south of latitude 36°16′N. Using linear mixed-effects models, we examined the contribution of 14 environmental and anthropogenic variables to the prediction of goral presence. Elevation, land-cover type, human footprint, distances to the nearest express highway, paved road and national park, and land price were significant predictors of goral presence. In combination, the distribution map and predictive model of goral presence can be used to monitor and protect the remaining goral populations.
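As an illustration of the final modelling step, presence/absence at survey sites can be related to a scaled predictor such as elevation with a logistic model. The sketch below is hypothetical: it fits a plain (not mixed-effects) logistic regression by gradient descent on simulated data, not the authors' 14-variable specification.

```python
import math, random

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit presence (1) / absence (0) against a single scaled predictor
    (e.g. standardized elevation) by gradient descent on the log-loss."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1 / (1 + math.exp(-(w * x + b)))
            gw += (p - y) * x
            gb += (p - y)
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

# Toy survey data: presence more likely at higher (scaled) elevation.
random.seed(0)
xs = [random.uniform(-2, 2) for _ in range(200)]
ys = [1 if random.random() < 1 / (1 + math.exp(-2 * x)) else 0 for x in xs]
w, b = fit_logistic(xs, ys)
print(round(w, 2))  # fitted slope; positive = presence increases with elevation
```

A real analysis would add the remaining predictors and a random effect for grid cell, which is what motivates the mixed-effects model in the study.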
Structural abnormalities in cortical and subcortical brain regions are consistently observed in schizophrenia; however, substantial inter-individual variability complicates the identification of clear neurobiological biomarkers. The Person-Based Similarity Index (PBSI) quantifies individual structural variability, yet its applicability across schizophrenia stages remains unclear. This study aimed to compare cortical and subcortical structural variability in recent-onset and chronic schizophrenia and to explore associations with clinical measures.
Methods:
Neuroimaging data from 41 patients with recent-onset schizophrenia, 32 with chronic schizophrenia, and 59 healthy controls were analysed. The PBSI scores were calculated for cortical thickness, surface area, cortical grey matter volume, and subcortical volumes. Group differences in PBSI scores were assessed using linear regression and analysis of variance. Correlations between the PBSI scores and clinical measures were also examined.
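For context, a PBSI score is typically computed as the mean correlation between one subject's regional morphometric profile and every other subject's profile; lower scores indicate a more atypical brain. A minimal sketch on made-up cortical-thickness profiles (not the study's FreeSurfer data):

```python
import math

def pearson(a, b):
    """Pearson correlation between two equal-length profiles."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def pbsi(profiles):
    """PBSI for each subject: mean correlation of that subject's
    regional profile with every other subject's profile."""
    scores = []
    for i, p in enumerate(profiles):
        rs = [pearson(p, q) for j, q in enumerate(profiles) if j != i]
        scores.append(sum(rs) / len(rs))
    return scores

# Three toy thickness profiles over 5 regions
profiles = [
    [2.5, 2.7, 3.0, 2.4, 2.8],
    [2.6, 2.8, 3.1, 2.5, 2.9],   # similar to subject 0 -> higher PBSI
    [3.1, 2.4, 2.2, 3.0, 2.3],   # atypical profile -> lowest PBSI
]
print([round(s, 2) for s in pbsi(profiles)])
```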
Results:
Both patients with recent-onset and chronic schizophrenia exhibited significantly lower PBSI scores than healthy controls, indicating greater morphometric heterogeneity. However, significant differences between the recent-onset and chronic patient groups were limited to subcortical and cortical thickness PBSI scores. Correlations between PBSI scores and clinical symptoms were sparse and primarily restricted to surface area variability and symptom severity in patients with recent-onset schizophrenia.
Conclusion:
Patients with schizophrenia show marked structural brain heterogeneity compared with healthy controls, which is detectable even in the early stages of the illness. Although there were few differences in PBSI scores between the recent-onset and chronic schizophrenia groups and limited correlations between PBSI scores and clinical measures, the PBSI may still provide valuable insights into individual differences contributing to clinical heterogeneity in schizophrenia.
This study examined the interaction of different types of crosslinguistic cues in second language (L2) morphosyntactic processing. Our target constructions, Korean morphological causatives, contain morphosyntactic constraints that present interlingual overlap for Japanese speakers when the construction is derived from an intransitive verb, while constituting interlingual contrast when derived from a transitive verb. For Chinese speakers, these constraints only exist in the L2 and thus constitute L2-unique information. In two self-paced reading experiments involving proficiency-matched Japanese- and Chinese-speaking learners of Korean, we found that Japanese speakers successfully detected morphosyntactic errors only in the intransitive-based construction, which shares overlapping constraints with Japanese, but not in the transitive-based construction whose morphosyntactic constraints contrast with the Japanese counterparts. In contrast, Chinese speakers exhibited sensitivity to the violations in both intransitive- and transitive-based constructions. These findings suggest that crosslinguistic competition causes a major problem in L2 sentence processing.
The integration of multiple linguistic modules – syntax, semantics and pragmatics – poses a persistent challenge for adult second language (L2) learners, as posited by the interface hypothesis (IH). This study examines how crosslinguistic influence impacts L2 learners’ acquisition and processing of Korean quotative constructions at the syntax–semantics–pragmatics interface. Using offline acceptability judgment and online self-paced reading tasks, we compared Japanese- and Chinese-speaking learners of Korean. The results revealed that Japanese-speaking learners outperformed Chinese-speaking learners in offline tasks, demonstrating native-like sensitivity to case-marking constraints, likely due to the structural similarities between Japanese and Korean. However, neither learner group exhibited sensitivity to case-marking violations during real-time processing, unlike native Korean speakers. These findings suggest a dissociation between explicit knowledge and online processing abilities, supporting the IH and emphasizing the persistent challenges of integrating multiple linguistic domains in L2 processing. This study underscores the role of crosslinguistic influence in facilitating explicit knowledge acquisition while revealing its limitations in fostering native-like automaticity in online processing.
The questions of whether first language (L1) speakers and second language (L2) learners can both predict what follows based on given linguistic cues and what factors may influence this predictive processing are still underexplored. Prior research has focused on the success or failure of predictions in real-time processing, paying relatively less attention to the speed of prediction. This study addresses these gaps by investigating the role of word co-occurrence frequency and proficiency in L1 and L2 predictive processing, using the Korean classifier system. In a webcam-based visual-world eye-tracking experiment, both L1-Korean speakers and L2-Korean learners showed sound predictive processing, with the frequency of co-occurrence between classifiers and nouns playing a crucial role. Higher co-occurrence frequency expedited predictive processing for L1-Korean speakers and boosted the ability to make online predictions for L2-Korean learners. The study also revealed a proficiency effect, where more advanced L2-Korean learners made predictions regardless of co-occurrence frequency, unlike their less advanced counterparts. Our findings suggest that predictive mechanisms in L1 and L2 operate in a qualitatively similar way. In addition, the use of webcam eye-tracking is expected to create a more inclusive and equitable research environment for (applied) psycholinguistics.
This study aimed to identify the roles of community pharmacists (CPs) during the coronavirus disease 2019 (COVID-19) pandemic, the differences in their role performance compared with their perceived importance, and limiting factors.
Methods:
A cross-sectional online survey of CPs was conducted. The CPs self-measured the importance and performance of each role during the pandemic using a 5-point Likert scale. A paired t-test was used to compare each role’s importance and performance scores. A logistic regression analysis of the roles with low performance scores, despite their level of importance, was conducted to determine the factors affecting performance. The limiting factors were also surveyed.
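The central comparison, each role's importance score versus its performance score within the same respondent, is a paired t-test. A stdlib sketch on hypothetical 5-point Likert responses (not the survey's data):

```python
import math

def paired_t(a, b):
    """Paired t statistic for matched samples a and b."""
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# Hypothetical scores from 8 respondents for one role
importance = [5, 4, 5, 4, 5, 3, 4, 5]
performance = [3, 3, 4, 2, 4, 3, 3, 4]
t = paired_t(importance, performance)
print(round(t, 2))  # positive t: importance exceeds performance
```

A large positive t (compared against the t distribution with n − 1 degrees of freedom) corresponds to the significantly lower performance scores reported here.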
Results:
The 436 responses to the questionnaire were analyzed. The performance scores were significantly lower than the perceived importance scores for 15 of the 17 roles. The source and update frequency of COVID-19 information and participation in outreach pharmaceutical services were associated with low performance scores. Insufficient economic compensation, the lack of communication channels, and legal limitations were the limiting factors in performing the CPs’ roles.
Conclusions:
Participation in outreach pharmaceutical services, economic compensation, and communication channels should be improved to motivate CPs to perform their roles.
Recent arguments claim that behavioral science has focused – to its detriment – on the individual over the system when construing behavioral interventions. In this commentary, we argue that tackling economic inequality using both framings in tandem is invaluable. By studying individuals who have overcome inequality, “positive deviants,” and the system limitations they navigate, we offer potentially greater policy solutions.
Identifying more homogeneous subtypes of patients with obsessive–compulsive disorder (OCD) using biological evidence is critical for understanding the complexities of the disorder in this heterogeneous population. Age of onset serves as a useful subtyping scheme for dividing OCD into two subgroups that align with neurodevelopmental perspectives. The underlying neurobiological markers of these distinct neurodevelopmental differences can be identified by investigating gyrification changes to establish biological evidence-based homogeneous subtypes.
Methods
We compared whole-brain cortical gyrification in 84 patients with early-onset OCD, 84 patients with late-onset OCD, and 152 healthy controls (HCs) to identify potential markers for early neurodevelopmental deficits using the local gyrification index (lGI). Then, the relationships between lGI in clusters showing significant differences and performance in visuospatial memory and verbal fluency, which are considered trait-related neurocognitive impairments in OCD, were further examined in early-onset OCD patients.
Results
The early-onset OCD patients exhibited significantly greater gyrification than late-onset OCD patients and HCs in frontoparietal and cingulate regions, including the bilateral precentral, postcentral, precuneus, paracentral, posterior cingulate, superior frontal, and caudal anterior cingulate gyri. Moreover, impaired neurocognitive functions in early-onset OCD patients were correlated with increased gyrification.
Conclusions
Our findings provide a neurobiological marker for distinguishing the OCD population into more neurodevelopmentally homogeneous subtypes, which may contribute to understanding the neurodevelopmental underpinnings of the etiology of early-onset OCD, consistent with the accumulated phenotypic evidence of greater neurodevelopmental deficits in early-onset than in late-onset OCD.
It has been suggested that psychosocial factors are related to the survival time of inpatients with cancer. However, few studies have examined the relationship between spiritual well-being (SWB) and survival time across countries. This study investigated the relationship between SWB and survival time in three East Asian countries.
Methods
This international multicenter cohort study is a secondary analysis involving newly admitted inpatients with advanced cancer in palliative care units in Japan, South Korea, and Taiwan. SWB was measured using the Integrated Palliative Outcome Scale (IPOS) at admission. We performed multivariate analysis using the Cox proportional hazards model to identify independent prognostic factors.
Results
A total of 2,638 patients treated at 37 palliative care units from January 2017 to September 2018 were analyzed. The median survival time was 18.0 days (95% confidence interval [CI] 16.5–19.5) in Japan, 23.0 days (95% CI 19.9–26.1) in Korea, and 15.0 days (95% CI 13.0–17.0) in Taiwan. SWB was a significant factor correlated with survival in Taiwan (hazard ratio [HR] 1.27; 95% CI 1.01–1.59; p = 0.04), while it was insignificant in Japan (HR 1.10; 95% CI 1.00–1.22; p = 0.06), and Korea (HR 1.02; 95% CI 0.77–1.35; p = 0.89).
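For reference, medians such as those above are typically read off a Kaplan–Meier curve, which also accommodates patients censored before death; the adjusted analysis here additionally used a Cox model. A minimal sketch on hypothetical follow-up data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.
    times: follow-up in days; events: 1 = death observed, 0 = censored.
    Returns the list of (time, S(t)) steps at each observed death."""
    s = 1.0
    curve = []
    at_risk = len(times)
    for t, e in sorted(zip(times, events)):
        if e:
            s *= (at_risk - 1) / at_risk
            curve.append((t, s))
        at_risk -= 1
    return curve

def median_survival(curve):
    """First time at which S(t) drops to 0.5 or below."""
    for t, s in curve:
        if s <= 0.5:
            return t
    return None  # median not reached

# Toy cohort: 10 patients, 2 censored before death
times  = [5, 8, 12, 15, 18, 21, 30, 45, 60, 90]
events = [1, 1,  0,  1,  1,  1,  1,  0,  1,  1]
curve = kaplan_meier(times, events)
print(median_survival(curve))  # -> 21 (days)
```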
Significance of results
SWB on admission was associated with survival in patients with advanced cancer in Taiwan but not in Japan or Korea. The findings suggest a possible positive relationship between spiritual care and survival time in patients with far-advanced cancer.
Background: Although small- and medium-sized hospitals comprise most healthcare providers in South Korea, data on antibiotic usage in these facilities are limited. We evaluated the pattern of antibiotic usage and its appropriateness in hospitals with <400 beds in South Korea. Methods: A multicenter retrospective study was conducted in 10 hospitals (6 long-term care hospitals, 3 acute-care hospitals, and 1 orthopedic hospital) with <400 beds in South Korea. We analyzed the patterns of antibiotic prescription and their appropriateness in the participating hospitals. Data on monthly antibiotic prescriptions and patient days for hospitalized patients were collected using electronic databases from each hospital. To avoid the effect of the COVID-19 pandemic, data were collected from January to December 2019. To evaluate the appropriateness of prescription, 25 patients under antibiotic therapy were randomly selected at each hospital over 2 separate periods. Due to the heterogeneity of its characteristics, the orthopedic hospital was excluded from this analysis. The collected data were reviewed, and the appropriateness of antibiotic prescriptions was evaluated by 5 specialists in infectious diseases (adult and pediatric). Data from 2 hospitals were assigned to each specialist. The appropriateness of antibiotic prescriptions was evaluated in 3 aspects: route of administration, dose, and class. If all 3 aspects were ‘optimal,’ the prescription was considered ‘optimal.’ If the route was ‘optimal’ but the dose and/or class was ‘suboptimal’ (though not ‘inappropriate’), the prescription was considered ‘suboptimal.’ If even 1 aspect was ‘inappropriate,’ it was classified as ‘inappropriate.’ Results: The most commonly prescribed antibiotic class in long-term care hospitals was fluoroquinolones, followed by antipseudomonal β-lactam/β-lactamase inhibitors.
In acute-care hospitals, these were third-generation cephalosporins, followed by first- and second-generation cephalosporins. The antibiotic class most prescribed in the orthopedic hospital was first-generation cephalosporins. Only 2.3% of the antibiotics were administered by an inappropriate route, whereas 15.3% of patients were prescribed an inappropriate dose. Inappropriate antibiotic prescriptions accounted for 30.6% of the total antibiotic prescriptions. Conclusions: Antibiotic usage patterns vary between small- and medium-sized hospitals in South Korea. The proportion of inappropriate prescriptions exceeded 30% of the total antibiotic prescriptions.
Background: The δ (delta) variant has spread rapidly worldwide and has become the predominant strain of SARS-CoV-2. We analyzed an outbreak caused by vaccine breakthrough infections in a hospital with an active infection control program in which 91.9% of healthcare workers were vaccinated. Methods: We investigated a SARS-CoV-2 outbreak between September 9 and October 2, 2021, in a referral teaching hospital in Korea. We retrospectively collected data on demographics, vaccination history, transmission, and clinical features of confirmed COVID-19 in patients, healthcare workers, and caregivers. Results: During the outbreak, 94 individuals tested positive for SARS-CoV-2 by reverse transcription-polymerase chain reaction (RT-PCR) testing. Testing identified infections in 61 healthcare workers, 18 patients, and 15 caregivers, and 70 (74.5%) of the 94 cases were vaccine breakthrough infections. We detected 3 superspreading events: in the hospital staff cafeteria and offices (n = 47 cases, 50%), on the 8th floor of the main building (n = 22 cases, 23.4%), and on the 7th floor of the maternal and child healthcare center (n = 12 cases, 12.8%). These superspreading events accounted for 81 (86.2%) of the 94 transmissions (Fig. 1, 2). The median interval between completion of vaccination and COVID-19 infection was 117 days (range, 18–187). There was no significant difference in the mean Ct value of the RdRp/ORF1ab gene between fully vaccinated individuals (mean 20.87, SD ± 6.28) and unvaccinated individuals (mean 19.94, SD ± 5.37; P = .52) at the time of diagnosis. Among healthcare workers and caregivers, only 1 required oxygen supplementation. In contrast, among the 18 patients, there were 4 fatal cases (22.2%), 3 of whom were unvaccinated (Table 1). Conclusions: Superspreading infection among fully vaccinated individuals occurred in an acute-care hospital while the δ (delta) variant was dominant.
Given the potential for severe complications, as this outbreak demonstrated, preventive measures including adequate ventilation should be emphasized to minimize transmission in hospitals.
Nosocomial transmission of COVID-19 among immunocompromised hosts can have a serious impact on COVID-19 severity, underlying disease progression and SARS-CoV-2 transmission to other patients and healthcare workers within hospitals. We experienced a nosocomial outbreak of COVID-19 in a daycare unit for paediatric and young adult cancer patients. Between 9 and 18 November 2020, 473 individuals (181 patients, 247 caregivers/siblings and 45 staff members) were exposed to the index case, a member of the nursing staff. Among them, three patients and four caregivers were infected. Two 5-year-old cancer patients with COVID-19 were not severely ill, but a 25-year-old cancer patient showed prolonged shedding of SARS-CoV-2 RNA for at least 12 weeks and probably infected his mother at home approximately 7–8 weeks after the initial diagnosis. Except for this case, no secondary transmission was observed from the confirmed cases in either the hospital or the community. To conclude, in the daycare setting for immunocompromised children and young adults, the rate of in-hospital transmission of SARS-CoV-2 was 1.6% under a stringent policy of infection prevention and control, including universal mask application and rapid, extensive contact investigation. Severely immunocompromised children and young adults with COVID-19 should be carefully managed after the mandatory isolation period, keeping in mind the possibility of prolonged shedding of live virus.
The dissipation of ion-acoustic surface waves propagating in a semi-bounded, collisional plasma with a vacuum boundary is theoretically investigated, and this result is used for the analysis of edge-relevant plasma simulated by the Divertor Plasma Simulator-2 (DiPS-2). The collisional damping of the surface wave is investigated for weakly ionized plasmas by comparing collisionless Landau damping with collisional damping, as follows: (1) the ratio of ion temperature $T_i$ to electron temperature $T_e$ should be very small for weak collisionality ($T_i/T_e \ll 1$); (2) collisionless Landau damping is dominant for small parallel wavenumbers, with decay constant $\gamma \approx -\sqrt{\pi/2}\, k_\parallel \lambda_{De}\, \omega_{pi}^2/\omega_{pe}$; and (3) collisional damping dominates for large parallel wavenumbers, with decay constant $\gamma \approx -\nu_{in}/16$, where $\nu_{in}$ is the ion–neutral collision frequency. An experimental test of the above theoretical predictions has been performed in the argon plasma of DiPS-2, which has the following parameters: plasma density $n_e = (2\text{--}9) \times 10^{11}\;\mathrm{cm}^{-3}$, $T_e = 3.7\text{--}3.8\;\mathrm{eV}$, $T_i = 0.2\text{--}0.3\;\mathrm{eV}$ and collision frequency $\nu_{in} = 23\text{--}127\;\mathrm{kHz}$. Although the wavelength should be specified for the given DiPS-2 parameters, the collisional damping is found to be $\gamma = (-0.9\ \text{to}\ {-5}) \times 10^4\;\mathrm{rad\,s^{-1}}$ for $k_\parallel \lambda_{De} = 10$, while the Landau damping is found to be $\gamma = (-4\ \text{to}\ {-9}) \times 10^4\;\mathrm{rad\,s^{-1}}$ for $k_\parallel \lambda_{De} = 0.1$.
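The quoted collisional decay range can be checked directly from $\gamma \approx -\nu_{in}/16$, assuming (our reading, not stated explicitly above) that the reported $\nu_{in}$ in kHz is converted to an angular frequency before dividing by 16:

```python
import math

def collisional_decay(nu_in_hz):
    """Collisional decay constant gamma ~ -nu_in/16 in rad/s,
    with nu_in converted from linear frequency (Hz) to angular frequency."""
    return -2 * math.pi * nu_in_hz / 16

lo = collisional_decay(23e3)   # nu_in = 23 kHz
hi = collisional_decay(127e3)  # nu_in = 127 kHz
print(f"{lo:.2e}", f"{hi:.2e}")  # ~ -0.9e4 and -5e4 rad/s
```

These endpoints reproduce the reported range of $(-0.9\ \text{to}\ {-5}) \times 10^4\;\mathrm{rad\,s^{-1}}$, supporting the angular-frequency reading.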
Litter-dwelling arthropods play an important role in maintaining forest ecosystem function. This study was designed to understand seasonal variations and diversity of litter-dwelling adult beetles, one of the most diverse groups of arthropods. Sampling was conducted in mixed-wood forests of South Korea between March and December 2019, covering all seasons, including winter. We used a sifting method and a Berlese funnel to collect arthropods living in leaf litter and soil. We collected a total of 5820 invertebrates representing six orders, of which 1422 were beetles representing 24 families and minimum 141 species. Beetle species richness was highest in spring and lowest in summer based on rarefaction and extrapolation. However, beetle abundance was lowest in spring, but abundance was similar among the other seasons. Beetle assemblage composition was correlated significantly with soil surface and atmospheric temperature. The assemblage composition differed among seasons, except between spring and winter, which overlapped slightly. The combined sifting–Berlese funnel method showed great advantages for investigating the diversity of overwintering arthropods. Continued study of the relationship between arthropods and the leaf-litter environment is essential to understand this microecosystem and will increase the chance of discovering new beetle species.
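Rarefaction comparisons like the one above have a closed form: the expected species richness in a random subsample of n individuals is the sum over species of 1 − C(N − Ni, n)/C(N, n), where N is the total count and Ni the count of species i. A sketch with a made-up abundance list (not the study's beetle counts):

```python
from math import comb

def rarefied_richness(abundances, n):
    """Expected species richness in a random subsample of n individuals
    drawn without replacement from the full sample (Hurlbert rarefaction)."""
    N = sum(abundances)
    return sum(1 - comb(N - Ni, n) / comb(N, n) for Ni in abundances)

# Toy abundances: one dominant species and several rare ones
abundances = [50, 10, 5, 2, 1, 1, 1]
print(round(rarefied_richness(abundances, 10), 2))
```

Evaluating this at a common n across seasons is what makes richness comparable despite unequal sample sizes.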
Accurate prognostication is important for patients and their families to prepare for the end of life. Objective Prognostic Score (OPS) is an easy-to-use tool that does not require the clinicians’ prediction of survival (CPS), whereas Palliative Prognostic Score (PaP) needs CPS. Thus, inexperienced clinicians may hesitate to use PaP. We aimed to evaluate the accuracy of OPS compared with PaP in inpatients in palliative care units (PCUs) in three East Asian countries.
Method
This study was a secondary analysis of a cross-cultural, multicenter cohort study. We enrolled inpatients with far-advanced cancer in PCUs in Japan, Korea, and Taiwan from 2017 to 2018. We calculated the area under the receiver operating characteristics (AUROC) curve to compare the accuracy of OPS and PaP.
Results
A total of 1,628 inpatients in 33 PCUs in Japan and Korea were analyzed. OPS and PaP were calculated in 71.7% of the Japanese patients and 80.0% of the Korean patients. In Taiwan, PaP was calculated for 81.6% of the patients. The AUROC for 3-week survival was 0.74 for OPS in Japan, 0.68 for OPS in Korea, 0.80 for PaP in Japan, and 0.73 for PaP in Korea. The AUROC for 30-day survival was 0.70 for OPS in Japan, 0.71 for OPS in Korea, 0.79 for PaP in Japan, and 0.74 for PaP in Korea.
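An AUROC of this kind can be computed directly from the score-outcome pairs via the Mann–Whitney formulation: it is the probability that a randomly chosen patient who died within the window received a higher risk score than one who survived. A stdlib sketch on hypothetical scores (not the study's data):

```python
def auroc(scores_pos, scores_neg):
    """AUROC via the Mann-Whitney statistic: fraction of
    (positive, negative) pairs ranked correctly, ties counted half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical prognostic scores: higher = predicted shorter survival
died_3wk     = [7, 6, 6, 5, 8]
survived_3wk = [3, 4, 5, 2, 6]
print(round(auroc(died_3wk, survived_3wk), 2))  # -> 0.9
```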
Significance of results
Both OPS and PaP showed good performance in Japan and Korea. Compared with PaP, OPS could be more useful for inexperienced physicians who hesitate to estimate CPS.
Several studies have supported the usefulness of “the surprise question” for predicting the 1-year mortality of patients. “The surprise question” requires a “Yes” or “No” answer to the question “Would I be surprised if this patient died in [specific time frame]?” However, the 1-year time frame is often too long for advanced cancer patients seen by palliative care personnel. “The surprise question” with shorter time frames is needed for decision making. We examined the accuracy of “the surprise question” for 7-day, 21-day, and 42-day survival in hospitalized patients admitted to palliative care units (PCUs).
Method
This was a prospective multicenter cohort study of 130 adult patients with advanced cancer admitted to 7 hospital-based PCUs in South Korea. The accuracy of “the surprise question” was compared with that of the temporal question for the clinician's prediction of survival.
Results
We analyzed 130 inpatients who died in PCUs during the study period. The median survival was 21.0 days. The sensitivity, specificity, and overall accuracy for the 7-day “the surprise question” were 46.7, 88.7, and 83.9%, respectively. The sensitivity, specificity, and overall accuracy for the 7-day temporal question were 6.7, 98.3, and 87.7%, respectively. The c-indices of the 7-day “the surprise question” and 7-day temporal question were 0.662 (95% CI: 0.539–0.785) and 0.521 (95% CI: 0.464–0.579), respectively. The c-indices of the 42-day “the surprise question” and 42-day temporal question were 0.554 (95% CI: 0.509–0.599) and 0.616 (95% CI: 0.569–0.663), respectively.
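The sensitivity, specificity, and overall accuracy quoted above follow directly from the 2×2 confusion counts. A small helper with hypothetical counts (not the study's raw data):

```python
def diagnostics(tp, fn, tn, fp):
    """Sensitivity, specificity and overall accuracy from a 2x2 table:
    tp/fn = deaths correctly/incorrectly predicted,
    tn/fp = survivals correctly/incorrectly predicted."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    acc = (tp + tn) / (tp + fn + tn + fp)
    return sens, spec, acc

# Toy counts for a 'surprise question' asked of 130 patients
sens, spec, acc = diagnostics(tp=7, fn=8, tn=103, fp=12)
print(round(sens, 3), round(spec, 3), round(acc, 3))
```

Note how a rare outcome (few deaths in the window) lets overall accuracy stay high even when sensitivity is poor, which is the pattern seen in the 7-day results.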
Significance of results
Surprisingly, “the surprise questions” and temporal questions had similar accuracies. The high specificities for the 7-day “the surprise question” and 7- and 21-day temporal question suggest they may be useful to rule in death if positive.
The explosive outbreak of COVID-19 led to a shortage of medical resources, including hospital isolation rooms, healthcare workers (HCWs) and personal protective equipment. Here, we constructed a new model, non-contact community treatment centres, to monitor and quarantine asymptomatic and mildly symptomatic COVID-19 patients, who recorded their own vital signs using a smartphone application. This new model, implemented in Korea, is useful for overcoming shortages of medical resources and minimising the risk of infection transmission to HCWs.
We aim to determine the association between Fe status and the metabolic syndrome (MetS) during menopause. Records of 1069 premenopausal and 703 postmenopausal Korean women were retrieved from the database of the fifth Korean National Health and Nutrition Examination Survey (KNHANES V 2012) and analysed. The association between the MetS and Fe status was assessed using multivariable-adjusted analyses, and a prediction model for the MetS was subsequently developed using margin effects. We found that the risk of Fe depletion among postmenopausal women was lower than among premenopausal women (PR = 0·813, 95 % CI 0·668, 0·998, P = 0·038). The risk of the MetS was 2·562-fold lower among premenopausal women with Fe depletion than among those without (PR = 0·390, 95 % CI 0·266, 0·571, P < 0·001). In contrast, the risk of the MetS tended to be higher among postmenopausal women with Fe depletion than among those without (PR = 1·849, 95 % CI 1·406, 2·432, P < 0·001). As serum ferritin levels increased, the risk of the MetS increased in both premenopausal and postmenopausal women. The margin effects showed that an increase in serum Hb and ferritin was associated with an increase in the risk of the MetS according to menopausal status and age group. Therefore, ferritin, the most validated and widely used Fe marker, could be of potential clinical value in predicting and monitoring the MetS during menopause. Further prospective or longitudinal studies, especially clinically related studies on menopause and Fe status, are needed to clarify the causality between serum ferritin levels and the MetS, which could offer novel treatments for the MetS.
Over the past two decades, early detection and early intervention in psychosis have become essential goals of psychiatry. However, clinical impressions are insufficient for predicting psychosis outcomes in clinical high-risk (CHR) individuals; a more rigorous and objective model is needed. This study aims to develop and internally validate a model for predicting the transition to psychosis within 10 years.
Methods
Two hundred and eight help-seeking individuals who fulfilled the CHR criteria were enrolled from the prospective, naturalistic cohort program for CHR at the Seoul Youth Clinic (SYC). The least absolute shrinkage and selection operator (LASSO)-penalized Cox regression was used to develop a predictive model for a psychotic transition. We performed k-means clustering and survival analysis to stratify the risk of psychosis.
Results
The predictive model, which includes clinical and cognitive variables, identified the following six baseline variables as important predictors: 1-year percentage decrease in the Global Assessment of Functioning score, IQ, California Verbal Learning Test score, Strange Stories test score, and scores in two domains of the Social Functioning Scale. The predictive model showed a cross-validated Harrell's C-index of 0.78 and identified three subclusters with significantly different risk levels.
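Harrell's C-index generalizes the AUROC to censored time-to-event data: among usable pairs, it is the fraction in which the subject predicted to be at higher risk actually transitioned earlier. A stdlib sketch with hypothetical risk scores (not the study's cohort):

```python
def harrell_c(times, events, risks):
    """Harrell's C-index. A pair is usable when the earlier time is an
    observed event; it is concordant when that subject also has the
    higher predicted risk. Ties in risk count half."""
    conc = usable = 0.0
    n = len(times)
    for i in range(n):
        for j in range(i + 1, n):
            a, b = (i, j) if times[i] < times[j] else (j, i)
            if not events[a]:
                continue  # earlier subject censored: pair not usable
            usable += 1
            if risks[a] > risks[b]:
                conc += 1
            elif risks[a] == risks[b]:
                conc += 0.5
    return conc / usable

times  = [12, 30, 45, 60, 80]    # months to transition / censoring
events = [1,  1,  0,  1,  0]     # 1 = transitioned to psychosis
risks  = [0.9, 0.5, 0.4, 0.7, 0.2]
print(round(harrell_c(times, events, risks), 3))
```

A value of 0.5 would mean the risk scores are uninformative; the 0.78 reported above indicates good discrimination.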
Conclusions
Overall, our predictive model showed good predictive ability and could facilitate a personalized therapeutic approach to the different risk levels of high-risk individuals.
Radiocarbon (14C) dating has been widely used to determine the age of deposits, but there have been frequent reports of inconsistencies in age among different dating materials. In this study, we performed radiocarbon dating on a total of 33 samples from 8-m-long sediment cores recovered from the wetland of the Muljangori volcanic cone on Jeju Island, South Korea. Ten pairs of humic acid (HA) and plant fragment (PF) samples, and three pairs of HA and humin samples, from the same depths were compared in terms of age. The PF were consistently younger than the HA. Interestingly, the age difference between the HA and PF samples showed a long-term change over the past 8000 years. To test whether there was an association between this long-term age difference and climate change, we compared it with the carbon/nitrogen (C/N) ratios and total organic carbon isotope (δ13CTOC) values of the sediments, as indicators of the relative abundance of terrestrial and aquatic plants; these parameters showed similar long-term trends. This suggests that the increasing (decreasing) trend in the age difference was influenced by long-term dry (wet) climate change.