Adverse childhood experiences (ACEs) are associated with physical and mental health difficulties in adulthood. This study examines the associations of ACEs with functional impairment and life stress among military personnel, a population disproportionately affected by ACEs. We also evaluate the extent to which the associations of ACEs with functional outcomes are mediated through internalizing and externalizing disorders.
Methods
The sample included 4,666 STARRS Longitudinal Study (STARRS-LS) participants who provided information about ACEs upon enlistment in the US Army (2011–2012). Mental disorders were assessed in wave 1 (LS1; 2016–2018), and functional impairment and life stress were evaluated in wave 2 (LS2; 2018–2019) of STARRS-LS. Mediation analyses estimated the indirect associations of ACEs with physical health-related impairment, emotional health-related impairment, financial stress, and overall life stress at LS2 through internalizing and externalizing disorders at LS1.
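The indirect-effect logic behind such mediation analyses can be sketched with the classic product-of-coefficients approach. Everything below — variable names, effect sizes, and plain OLS in place of the study's survey-weighted models — is illustrative only, not STARRS-LS data:

```python
import numpy as np

# Synthetic data with hypothetical effect sizes (not values from the study).
rng = np.random.default_rng(0)
n = 5000
aces = rng.poisson(1.5, n).astype(float)              # ACE count at enlistment
mediator = 0.5 * aces + rng.normal(0, 1, n)           # internalizing symptoms (a-path)
outcome = 0.3 * mediator + 0.1 * aces + rng.normal(0, 1, n)  # functional impairment

def ols_slopes(y, X):
    """OLS coefficients via least squares, intercept dropped from the return."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

a = ols_slopes(mediator, aces)[0]                     # ACEs -> mediator
b, c_prime = ols_slopes(outcome, np.column_stack([mediator, aces]))
total = ols_slopes(outcome, aces)[0]                  # total effect of ACEs
indirect = a * b                                      # mediated (indirect) effect
print(f"indirect={indirect:.3f}, direct={c_prime:.3f}, "
      f"proportion mediated={indirect / total:.1%}")
```

In practice, indirect effects and their confidence intervals are usually estimated by bootstrapping rather than read off point estimates as in this sketch.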
Results
ACEs had significant indirect effects via mental disorders on all functional impairment and life stress outcomes, with internalizing disorders displaying stronger mediating effects than externalizing disorders (explaining 31–92% vs 5–15% of the total effects of ACEs, respectively). Additionally, ACEs exhibited significant direct effects on emotional health-related impairment, financial stress, and overall life stress, implying ACEs are also associated with these longer-term outcomes via alternative pathways.
Conclusions
This study indicates ACEs are linked to functional impairment and life stress among military personnel in part because of associated risks of mental disorders, particularly internalizing disorders. Consideration of ACEs should be incorporated into interventions to promote psychosocial functioning and resilience among military personnel.
Posttraumatic stress disorder (PTSD) has been associated with advanced epigenetic age cross-sectionally, but the association between these variables over time is unclear. This study conducted meta-analyses to test whether new-onset PTSD diagnosis and changes in PTSD symptom severity over time were associated with changes in two metrics of epigenetic aging over two time points.
Methods
We conducted meta-analyses of the association between change in PTSD diagnosis and symptom severity and change in epigenetic age acceleration/deceleration (age-adjusted DNA methylation age residuals as per the Horvath and GrimAge metrics) using data from 7 military and civilian cohorts participating in the Psychiatric Genomics Consortium PTSD Epigenetics Workgroup (total N = 1,367).
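The cohort-level pooling step in such a meta-analysis can be sketched as inverse-variance-weighted (fixed-effect) averaging of per-cohort regression coefficients. The function and numbers below are illustrative, not the workgroup's actual pipeline:

```python
import numpy as np

def fixed_effect_meta(betas, ses):
    """Inverse-variance-weighted (fixed-effect) pooling of per-cohort
    coefficients and their standard errors."""
    betas = np.asarray(betas, dtype=float)
    w = 1.0 / np.asarray(ses, dtype=float) ** 2   # weight = 1 / SE^2
    pooled = np.sum(w * betas) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    return pooled, pooled_se

# Hypothetical per-cohort interaction coefficients and standard errors.
beta, se = fixed_effect_meta([0.21, 0.12, 0.15], [0.09, 0.11, 0.08])
z = beta / se                                     # pooled test statistic
```

Random-effects models (which add a between-cohort variance component) are the common alternative when cohort heterogeneity is expected.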
Results
Meta-analysis revealed that the interaction between Time 1 (T1) Horvath age residuals and new-onset PTSD over time was significantly associated with Horvath age residuals at T2 (meta β = 0.16, meta p = 0.02, p-adj = 0.03). The interaction between T1 Horvath age residuals and changes in PTSD symptom severity over time was significantly related to Horvath age residuals at T2 (meta β = 0.24, meta p = 0.05). No associations were observed for GrimAge residuals.
Conclusions
Results indicated that individuals who developed new-onset PTSD or showed increased PTSD symptom severity over time evidenced greater epigenetic age acceleration at follow-up than would be expected based on baseline age acceleration. This suggests that PTSD may accelerate biological aging over time and highlights the need for intervention studies to determine if PTSD treatment has a beneficial effect on the aging methylome.
This study examines the prospective associations of alcohol and drug misuse with suicidal behaviors among service members who have left active duty. We also evaluate potential moderating effects of other risk factors and whether substance misuse signals increased risk of transitioning from thinking about to attempting suicide.
Method
US Army veterans and deactivated reservists (N = 6,811) completed surveys in 2016–2018 (T1) and 2018–2019 (T2). Weights-adjusted logistic regression was used to estimate the associations of binge drinking, smoking/vaping, cannabis use, prescription drug abuse, illicit drug use, alcohol use disorder (AUD), and drug use disorder (DUD) at T1 with suicide ideation, plan, and attempt at T2. Interaction models tested for moderation of these associations by sex, depression, and recency of separation/deactivation. Suicide attempt models were also fit in the subgroup with ideation at T1 (n = 1,527).
Results
In models controlling for socio-demographic characteristics and prior suicidality, binge drinking, cannabis use, prescription drug abuse, illicit drug use, and AUD were associated with subsequent suicidal ideation (AORs = 1.42–2.60, ps < .01). Binge drinking, AUD, and DUD were associated with subsequent suicide plan (AORs = 1.23–1.95, ps < .05). None of the substance use variables had a main effect on suicide attempt; however, interaction models suggested certain types of drug use predicted attempts among those without depression. Additionally, the effects of smoking/vaping and AUD differed by sex. Substance misuse did not predict the transition from ideation to attempt.
Conclusions
Alcohol and drug misuse are associated with subsequent suicidal behaviors in this population. Awareness of differences across sex and depression status may inform suicide risk assessment.
While previous studies have reported high rates of documented suicide attempts (SAs) in the U.S. Army, the extent to which soldiers make SAs that are not identified in the healthcare system is unknown. Understanding undetected suicidal behavior is important in broadening prevention and intervention efforts.
Methods
Representative survey of U.S. Regular Army enlisted soldiers (n = 24 475). Reported SAs during service were compared with SAs documented in administrative medical records. Logistic regression analyses examined sociodemographic characteristics differentiating soldiers with an undetected SA v. documented SA. Among those with an undetected SA, chi-square tests examined characteristics associated with receiving a mental health diagnosis (MH-Dx) prior to SA. Discrete-time survival analysis estimated risk of undetected SA by time in service.
Results
Prevalence of undetected SA (unweighted n = 259) was 1.3%. Annual incidence was 255.6 per 100 000 soldiers, suggesting one in three SAs goes undetected. In multivariable analysis, rank ⩾E5 (OR = 3.1 [95% CI 1.6–5.7]) was associated with increased odds of undetected v. documented SA. Females were more likely to have a MH-Dx prior to their undetected SA (Rao-Scott χ²(1) = 6.1, p = .01). Over one-fifth of undetected SAs resulted in at least moderate injury. Risk of undetected SA was greater during the first four years of service.
Conclusions
Findings suggest that substantially more soldiers make SAs than indicated by estimates based on documented attempts. A sizable minority of undetected SAs result in significant injury. Soldiers reporting an undetected SA tend to be higher ranking than those with documented SAs. Undetected SAs require additional approaches to identifying individuals at risk.
Insecure attachment styles are associated with retrospectively reported suicide attempts (SAs). It is not known if attachment styles are prospectively associated with medically documented SAs.
Methods
A representative sample of US Army soldiers entering service (n = 21 772) was surveyed and followed via administrative records for their first 48 months of service. Attachment style (secure, preoccupied, fearful, dismissing) was assessed at baseline. Administrative medical records identified SAs. Discrete-time survival analysis examined associations of attachment style with future SA during service, adjusting for time in service, socio-demographics, service-related variables, and mental health diagnosis (MH-Dx). We examined whether associations of attachment style with SA differed based on sex and MH-Dx.
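Discrete-time survival analysis of this kind rests on a person-period data structure: each soldier contributes one record per month at risk, and an ordinary logistic regression on those records estimates the monthly hazard. A minimal sketch of the expansion step, with hypothetical records:

```python
# Person-period expansion: one row per soldier per month until the event
# (suicide attempt) or censoring; a logistic regression on these rows,
# with month indicators as predictors, yields the discrete-time hazard.
def person_periods(records):
    rows = []
    for rec in records:
        months = rec["event_month"] or rec["censor_month"]
        for t in range(1, months + 1):
            rows.append({
                "id": rec["id"],
                "month": t,
                "event": int(rec["event_month"] == t),
            })
    return rows

# Hypothetical records, not study data.
data = [
    {"id": 1, "event_month": 3, "censor_month": None},  # SA in month 3
    {"id": 2, "event_month": None, "censor_month": 4},  # censored at month 4
]
rows = person_periods(data)
print(len(rows), sum(r["event"] for r in rows))  # 7 rows, 1 event
```

Covariates such as attachment style would be merged onto each person-period row before model fitting.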
Results
In total, 253 respondents attempted suicide. Endorsed attachment styles included secure (46.8%), preoccupied (9.1%), fearful (15.7%), and dismissing (19.2%). Examined separately, insecure attachment styles were associated with increased odds of SA: preoccupied [OR 2.5 (95% CI 1.7–3.4)], fearful [OR 1.6 (95% CI 1.1–2.3)], dismissing [OR 1.8 (95% CI 1.3–2.6)]. Examining attachment styles simultaneously along with other covariates, preoccupied [OR 1.9 (95% CI 1.4–2.7)] and dismissing [OR 1.7 (95% CI 1.2–2.4)] remained significant. The dismissing attachment and MH-Dx interaction was significant. In stratified analyses, dismissing attachment was associated with SA only among soldiers without MH-Dx. Other interactions were non-significant. Soldiers endorsing any insecure attachment style had elevated SA risk across the first 48 months in service, particularly during the first 12 months.
Conclusions
Insecure attachment styles, particularly preoccupied and dismissing, are associated with increased future SA risk among soldiers. Elevated risk is most substantial during the first year of service but persists through the first 48 months. Dismissing attachment may indicate risk specifically among soldiers not identified by the mental healthcare system.
Risk of suicide-related behaviors is elevated among military personnel transitioning to civilian life. An earlier report showed that high-risk U.S. Army soldiers could be identified shortly before this transition with a machine learning model that included predictors from administrative systems, self-report surveys, and geospatial data. Based on this result, a Veterans Affairs and Army initiative was launched to evaluate a suicide-prevention intervention for high-risk transitioning soldiers. To make targeting practical, though, a streamlined model and risk calculator were needed that used only a short series of self-report survey questions.
Methods
We revised the original model in a sample of n = 8335 observations from the Study to Assess Risk and Resilience in Servicemembers-Longitudinal Study (STARRS-LS) who participated in one of three Army STARRS 2011–2014 baseline surveys while in service and in one or more subsequent panel surveys (LS1: 2016–2018, LS2: 2018–2019) after leaving service. We trained ensemble machine learning models with constrained numbers of item-level survey predictors in a 70% training sample. The outcome was self-reported post-transition suicide attempts (SA). The models were validated in the 30% test sample.
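Test-sample ROC-AUC, the validation metric used here, equals the probability that a randomly chosen case receives a higher predicted risk than a randomly chosen non-case. A minimal rank-based implementation, with toy labels and scores rather than STARRS-LS data:

```python
import numpy as np

def roc_auc(y_true, scores):
    """ROC-AUC via the Mann-Whitney identity: the probability that a
    random positive outranks a random negative (ties count half)."""
    y_true = np.asarray(y_true)
    scores = np.asarray(scores, dtype=float)
    pos, neg = scores[y_true == 1], scores[y_true == 0]
    greater = (pos[:, None] > neg[None, :]).sum()   # pairwise comparisons
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Toy example: 2 positives v. 3 negatives.
y_true = [0, 0, 1, 0, 1]
scores = [0.1, 0.4, 0.35, 0.8, 0.9]
auc = roc_auc(y_true, scores)
```

The companion statistic reported below — the share of outcomes captured in the top percentiles of predicted risk — is obtained by sorting respondents on their scores and cumulating events.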
Results
Twelve-month post-transition SA prevalence was 1.0% (s.e. = 0.1). The best constrained model, with only 17 predictors, had a test sample ROC-AUC of 0.85 (s.e. = 0.03). The 10–30% of respondents with the highest predicted risk included 44.9–92.5% of 12-month SAs.
Conclusions
An accurate SA risk calculator based on a short self-report survey can target transitioning soldiers shortly before leaving service for intervention to prevent post-transition SA.
Identification of genetic risk factors may inform the prevention and treatment of posttraumatic stress disorder (PTSD). This study evaluates the associations of polygenic risk scores (PRS) with patterns of posttraumatic stress symptoms following combat deployment.
Method
US Army soldiers of European ancestry (n = 4900) provided genomic data and ratings of posttraumatic stress symptoms before and after deployment to Afghanistan in 2012. Latent growth mixture modeling was used to model posttraumatic stress symptom trajectories among participants who provided post-deployment data (n = 4353). Multinomial logistic regression models tested independent associations between trajectory membership and PRS for PTSD, major depressive disorder (MDD), schizophrenia, neuroticism, alcohol use disorder, and suicide attempt, controlling for age, sex, ancestry, and exposure to potentially traumatic events, and weighted to account for uncertainty in trajectory classification and missing data.
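A polygenic risk score is, at its core, a weighted sum of risk-allele dosages, with per-variant weights taken from an external GWAS. The dosage matrix and weights below are made up for illustration:

```python
import numpy as np

# Rows = individuals, columns = SNPs; entries are risk-allele dosages (0/1/2).
# All numbers are hypothetical, not values from any study.
dosages = np.array([[0, 1, 2],
                    [2, 2, 0]])
# Per-SNP weights, e.g. log-odds ratios from external GWAS summary statistics.
weights = np.array([0.05, -0.02, 0.10])

prs = dosages @ weights                      # raw score per individual
prs_z = (prs - prs.mean()) / prs.std()       # standardized, as used in regression
```

Real PRS pipelines add steps this sketch omits: quality control, linkage-disequilibrium pruning or shrinkage of weights, and adjustment for ancestry principal components.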
Results
Participants were classified into low-severity (77.2%), increasing-severity (10.5%), decreasing-severity (8.0%), and high-severity (4.3%) posttraumatic stress symptom trajectories. Standardized PTSD-PRS and MDD-PRS were associated with greater odds of membership in the high-severity v. low-severity trajectory [adjusted odds ratios and 95% confidence intervals, 1.23 (1.06–1.43) and 1.18 (1.02–1.37), respectively] and the increasing-severity v. low-severity trajectory [1.12 (1.01–1.25) and 1.16 (1.04–1.28), respectively]. Additionally, MDD-PRS was associated with greater odds of membership in the decreasing-severity v. low-severity trajectory [1.16 (1.03–1.31)]. No other associations were statistically significant.
Conclusions
Higher polygenic risk for PTSD or MDD is associated with more severe posttraumatic stress symptom trajectories following combat deployment. PRS may help stratify at-risk individuals, enabling more precise targeting of treatment and prevention programs.
Emotion reactivity and risk behaviors (ERRB) are transdiagnostic dimensions associated with suicide attempt (SA). ERRB patterns may identify individuals at increased risk of future SAs.
Methods
A representative sample of US Army soldiers entering basic combat training (n = 21 772) was surveyed and followed via administrative records for their first 48 months of service. Latent profile analysis of baseline survey items assessing ERRB dimensions, including emotion reactivity, impulsivity, and risk-taking behaviors, identified distinct response patterns (classes). SAs were identified using administrative medical records. A discrete-time survival framework was used to examine associations of ERRB classes with subsequent SA during the first 48 months of service, adjusting for time in service, socio-demographic and service-related variables, and mental health diagnosis (MH-Dx). We examined whether associations of ERRB classes with SA differed by year of service and for soldiers with and without a MH-Dx.
Results
Of 21 772 respondents (86.2% male, 61.8% White non-Hispanic), 253 made a SA. Four ERRB classes were identified: ‘Indirect Harming’ (8.9% of soldiers), ‘Impulsive’ (19.3%), ‘Risk-Taking’ (16.3%), and ‘Low ERRB’ (55.6%). Compared to Low ERRB, Impulsive [OR 1.8 (95% CI 1.3–2.4)] and Risk-Taking [OR 1.6 (95% CI 1.1–2.2)] had higher odds of SA after adjusting for covariates. The ERRB class and MH-Dx interaction was non-significant. Within each class, SA risk varied across service time.
Conclusions
SA risk within the four identified ERRB classes varied across service time. Impulsive and Risk-Taking soldiers had increased risk of future SA. MH-Dx did not modify these associations, suggesting ERRB classes may help identify risk among soldiers not yet receiving mental healthcare.
Personality traits (e.g. neuroticism) and the social environment predict risk for internalizing disorders and suicidal behavior. Studying these characteristics together and prospectively within a population confronted with high stressor exposure (e.g. U.S. Army soldiers) has not been done, yet could uncover unique and interactive predictive effects that may inform prevention and early intervention efforts.
Methods
Five broad personality traits and social network size were assessed via self-administered questionnaires among experienced soldiers preparing for deployment (N = 4645) and new soldiers reporting for basic training (N = 6216). Predictive models examined associations of baseline personality and social network variables with recent distress disorders or suicidal behaviors assessed 3- and 9-months post-deployment and approximately 5 years following enlistment.
Results
Among the personality traits, elevated neuroticism was consistently associated with increased mental health risk following deployment. Small social networks were also associated with increased mental health risk following deployment, beyond the variance accounted for by personality. Limited support was found for social network size moderating the association between personality and mental health outcomes. Small social networks also predicted distress disorders and suicidal behavior 5 years following enlistment, whereas unique effects of personality traits on these more distal outcomes were rare.
Conclusions
Heightened neuroticism and small social networks predict a greater risk for negative mental health sequelae, especially following deployment. Social ties may mitigate adverse impacts of personality traits on psychopathology in some contexts. Early identification and targeted intervention for these distinct, modifiable factors may decrease the risk of distress disorders and suicidal behavior.
In times of repeated disaster events, including natural disasters and pandemics, public health workers must recover rapidly to respond to subsequent events. Understanding predictors of time to recovery and developing predictive models of time to recovery can aid planning and management.
Methods:
We examined 681 public health workers (ages 21–72 years; mean [standard deviation, SD] = 48.25 [10.15]; 79% female) 1 mo before (T1) and 9 mo after (T2) the 2005 hurricane season. Demographics, trauma history, social support, time to recover from the previous hurricane season, and predisaster work productivity were assessed at T1. T2 assessed previous disaster work, initial emotional response, and personal hurricane injury/damage. The primary outcome was time to recover from the most recent hurricane event.
Results:
Multivariate analyses found that less social support (T1; odds ratio [OR] = 0.74 [95% confidence interval (CI) 0.60–0.92]), longer previous recovery time (T1; OR = 5.22 [95% CI 3.01–9.08]), lower predisaster work productivity (T1; OR = 1.98 [95% CI 1.08–3.61]), disaster-related personal injury/damage (T2; OR = 3.08 [95% CI 1.70–5.58]), and stronger initial emotional response (T2; OR = 1.71 [95% CI 1.34–2.19]) were associated with longer recovery time (T2).
Conclusions:
Recovery time was longer among disaster responders with a history of longer recovery time, personal injury/damage, lower work productivity following prior hurricanes, and stronger initial emotional responses, whereas responders with greater social support had shorter recovery times. Predictors of recovery time should be a focus for disaster preparedness planners.
The transition from military service to civilian life is a high-risk period for suicide attempts (SAs). Although stressful life events (SLEs) faced by transitioning soldiers are thought to be implicated, systematic prospective evidence is lacking.
Methods
Participants in the Army Study to Assess Risk and Resilience in Servicemembers (STARRS) completed baseline self-report surveys while on active duty in 2011–2014. Two self-report follow-up Longitudinal Surveys (LS1: 2016–2018; LS2: 2018–2019) were subsequently administered to probability subsamples of these baseline respondents. As detailed in a previous report, a SA risk index based on survey, administrative, and geospatial data collected before separation/deactivation identified 15% of the LS respondents who had separated/deactivated as being high-risk for self-reported post-separation/deactivation SAs. The current report presents an investigation of the extent to which self-reported SLEs occurring in the 12 months before each LS survey might have mediated/modified the association between this SA risk index and post-separation/deactivation SAs.
Results
The 15% of respondents identified as high-risk had a significantly elevated prevalence of some post-separation/deactivation SLEs. In addition, the associations of some SLEs with SAs were significantly stronger among predicted high-risk than lower-risk respondents. Demographic rate decomposition showed that 59.5% (s.e. = 10.2) of the overall association between the predicted high-risk index and subsequent SAs was linked to these SLEs.
Conclusions
It might be possible to prevent a substantial proportion of post-separation/deactivation SAs by providing high-risk soldiers with targeted preventive interventions for exposure/vulnerability to commonly occurring SLEs.
Problematic anger is frequently reported by soldiers who have deployed to combat zones. However, evidence is lacking with respect to how anger changes over a deployment cycle, and which factors prospectively influence change in anger among combat-deployed soldiers.
Methods
Reports of problematic anger were obtained from 7298 US Army soldiers who deployed to Afghanistan in 2012. A series of mixed-effects growth models estimated linear trajectories of anger over a period of 1–2 months before deployment to 9 months post-deployment, and evaluated the effects of pre-deployment factors (prior deployments and perceived resilience) on average levels and growth of problematic anger.
Results
A model with random intercepts and slopes provided the best fit, indicating heterogeneity in soldiers' levels and trajectories of anger. First-time deployers reported the lowest anger overall, but the most growth in anger over time. Soldiers with multiple prior deployments displayed the highest anger overall, which remained relatively stable over time. Higher pre-deployment resilience was associated with lower reports of anger, but its protective effect diminished over time. First- and second-time deployers reporting low resilience displayed different anger trajectories (stable v. decreasing, respectively).
Conclusions
Change in anger from pre- to post-deployment varies based on pre-deployment factors. The observed differences in anger trajectories suggest that efforts to detect and reduce problematic anger should be tailored for first-time v. repeat deployers. Ongoing screening is needed even for soldiers reporting high resilience before deployment, as the protective effect of pre-deployment resilience on anger erodes over time.
Definition of disorder subtypes may facilitate precision treatment for posttraumatic stress disorder (PTSD). We aimed to identify PTSD subtypes and evaluate their associations with genetic risk factors, types of stress exposures, comorbidity, and course of PTSD.
Methods
Data came from a prospective study of three U.S. Army Brigade Combat Teams that deployed to Afghanistan in 2012. Soldiers with probable PTSD (PTSD Checklist for Diagnostic and Statistical Manual of Mental Disorders-Fifth Edition ≥31) at three months postdeployment comprised the sample (N = 423) for latent profile analysis using Gaussian mixture modeling and PTSD symptom ratings as indicators. PTSD profiles were compared on polygenic risk scores (derived from external genomewide association study summary statistics), experiences during deployment, comorbidity at three months postdeployment, and persistence of PTSD at nine months postdeployment.
Results
Latent profile analysis revealed profiles characterized by prominent intrusions, avoidance, and hyperarousal (threat-reactivity profile; n = 129), anhedonia and negative affect (dysphoric profile; n = 195), and high levels of all PTSD symptoms (high-symptom profile; n = 99). The threat-reactivity profile had the most combat exposure and the least comorbidity. The dysphoric profile had the highest polygenic risk for major depression, and more personal life stress and co-occurring major depression than the threat-reactivity profile. The high-symptom profile had the highest rates of concurrent mental disorders and persistence of PTSD.
Conclusions
Genetic and trauma-related factors likely contribute to PTSD heterogeneity, which can be parsed into subtypes that differ in symptom expression, comorbidity, and course. Future studies should evaluate whether PTSD typology modifies treatment response and should clarify distinctions between the dysphoric profile and depressive disorders.
Community characteristics, such as collective efficacy, a measure of community strength, can affect behavioral responses following disasters. We measured collective efficacy 1 month before multiple hurricanes in 2005 and assessed its association with preparedness 9 months following the hurricane season.
Methods:
Participants were 631 Florida Department of Health workers who responded to multiple hurricanes in 2004 and 2005. They completed questionnaires that were distributed electronically approximately 1 month before (June 2005; T1) and 9 months after (June 2006; T2) several storms over the 2005 hurricane season. Collective efficacy, preparedness behaviors, and socio-demographics were assessed at T1, and preparedness behaviors and hurricane-related characteristics (injury, community-related damage) were assessed at T2. Participant ages ranged from 21 to 72 years (mean [SD] = 48.50 [10.15]), and the majority were female (78%).
Results:
In linear regression models, univariate analyses indicated that being older (B = 0.01, SE = 0.003, P < 0.001), White (B = 0.22, SE = 0.08, P < 0.01), and married (B = 0.05, SE = 0.02, P < 0.001) were each associated with preparedness following the 2005 hurricanes. Multivariate analyses, adjusting for socio-demographics, preparedness (T1), and hurricane-related characteristics (T2), found that higher collective efficacy (T1) was associated with preparedness after the hurricanes (B = 0.10, SE = 0.03, P < 0.01 and B = 0.47, SE = 0.04, P < 0.001, respectively).
Conclusion:
Programs enhancing collective efficacy may be a significant part of prevention practices and promote preparedness efforts before disasters.
This study examined the relationship of perceived safety and confidence in local law enforcement and government to changes in daily life activities during the Washington, DC, sniper attacks.
Methods:
Participants were 1238 residents from the Washington, DC metropolitan area who were assessed using an Internet survey that included items related to safety at work, at home, and in general, confidence in law enforcement/government, and changes in routine daily life activities.
Results:
A majority of participants (52%, n = 640) reported changing their daily life activities, with approximately one-third identifying changes related to being in large places and getting gas. Perceived safety was associated with confidence in local law enforcement/government. After adjusting for demographics, lower feelings of safety and less confidence in law enforcement/government were related to a higher likelihood of altered daily activities. Confidence in local law enforcement/government modified the association of safety with changes in daily activities. Among participants with high safety, less confidence in local law enforcement/government was associated with greater changes in daily life activities.
Conclusions:
Serial shooting events affect feelings of safety and disrupt routine life activities. Focus on enhancing experiences of safety and confidence in local law enforcement and government may decrease the life disruption associated with terrorist shootings.
Unit cohesion may protect service member mental health by mitigating effects of combat exposure; however, questions remain about the origins of potential stress-buffering effects. We examined buffering effects associated with two forms of unit cohesion (peer-oriented horizontal cohesion and subordinate-leader vertical cohesion) defined as either individual-level or aggregated unit-level variables.
Methods
Longitudinal survey data from US Army soldiers who deployed to Afghanistan in 2012 were analyzed using mixed-effects regression. Models evaluated individual- and unit-level interaction effects of combat exposure and cohesion during deployment on symptoms of post-traumatic stress disorder (PTSD), depression, and suicidal ideation reported at 3 months post-deployment (model n's = 6684 to 6826). Given the small effective sample size (k = 89), the significance of unit-level interactions was evaluated at a 90% confidence level.
Results
At the individual-level, buffering effects of horizontal cohesion were found for PTSD symptoms [B = −0.11, 95% CI (−0.18 to −0.04), p < 0.01] and depressive symptoms [B = −0.06, 95% CI (−0.10 to −0.01), p < 0.05]; while a buffering effect of vertical cohesion was observed for PTSD symptoms only [B = −0.03, 95% CI (−0.06 to −0.0001), p < 0.05]. At the unit-level, buffering effects of horizontal (but not vertical) cohesion were observed for PTSD symptoms [B = −0.91, 90% CI (−1.70 to −0.11), p = 0.06], depressive symptoms [B = −0.83, 90% CI (−1.24 to −0.41), p < 0.01], and suicidal ideation [B = −0.32, 90% CI (−0.62 to −0.01), p = 0.08].
Conclusions
Policies and interventions that enhance horizontal cohesion may protect combat-exposed units against post-deployment mental health problems. Efforts to support individual soldiers who report low levels of horizontal or vertical cohesion may also yield mental health benefits.
Whereas genetic susceptibility increases the risk for major depressive disorder (MDD), non-genetic protective factors may mitigate this risk. In a large-scale prospective study of US Army soldiers, we examined whether trait resilience and/or unit cohesion could protect against the onset of MDD following combat deployment, even in soldiers at high polygenic risk.
Methods
Data were analyzed from 3079 soldiers of European ancestry assessed before and after their deployment to Afghanistan. Incident MDD was defined as no MDD episode at pre-deployment, followed by a MDD episode following deployment. Polygenic risk scores were constructed from a large-scale genome-wide association study of major depression. We first examined the main effects of the MDD PRS and each protective factor on incident MDD. We then tested the effects of each protective factor on incident MDD across strata of polygenic risk.
Results
Polygenic risk showed a dose–response relationship to depression, such that soldiers at high polygenic risk had greatest odds for incident MDD. Both unit cohesion and trait resilience were prospectively associated with reduced risk for incident MDD. Notably, the protective effect of unit cohesion persisted even in soldiers at highest polygenic risk.
Conclusions
Polygenic risk was associated with new-onset MDD in deployed soldiers. However, unit cohesion – an index of perceived support and morale – was protective against incident MDD even among those at highest genetic risk, and may represent a potent target for promoting resilience in vulnerable soldiers. Findings illustrate the value of combining genomic and environmental data in a prospective design to identify robust protective factors for mental health.
Distinguishing a disorder of persistent and impairing grief from normative grief allows clinicians to identify this often undetected and disabling condition. As four diagnostic criteria sets for a grief disorder have been proposed, their similarities and differences need to be elucidated.
Methods
Participants were family members bereaved by US military service death (N = 1732). We conducted analyses to assess the accuracy of each criteria set in identifying threshold cases (participants who endorsed baseline Inventory of Complicated Grief ⩾30 and Work and Social Adjustment Scale ⩾20) and excluding those below this threshold. We also calculated agreement among criteria sets by varying numbers of required associated symptoms.
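Pairwise agreement among criteria sets of this kind is typically quantified with Cohen's kappa, which corrects observed agreement for the chance agreement implied by each set's case rate. A minimal sketch with hypothetical case classifications (1 = meets criteria, 0 = does not):

```python
import numpy as np

def cohens_kappa(x, y):
    """Cohen's kappa for two binary case classifications."""
    x, y = np.asarray(x), np.asarray(y)
    p_obs = np.mean(x == y)                        # observed agreement
    p1, q1 = x.mean(), y.mean()                    # marginal case rates
    p_exp = p1 * q1 + (1 - p1) * (1 - q1)          # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical classifications from two criteria sets.
kappa = cohens_kappa([1, 1, 0, 1, 0], [1, 1, 0, 0, 0])  # ≈ 0.615
```

Values of roughly 0.81–1.00 are conventionally labeled 'very good' agreement, the range reported for the optimized criteria sets below.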
Results
All four criteria sets accurately excluded participants below our identified clinical threshold (i.e. correctly excluding 86–96% of those subthreshold), but they varied in identification of threshold cases (i.e. correctly identifying 47–82%). When the number of associated symptoms was held constant, criteria sets performed similarly. Accurate case identification was optimized when one or two associated symptoms were required. When employing optimized symptom numbers, pairwise agreements among criteria became correspondingly ‘very good’ (κ = 0.86–0.96).
Conclusions
The four proposed criteria sets describe a similar condition of persistent and impairing grief, but differ primarily in criteria restrictiveness. Diagnostic guidance for prolonged grief disorder in the International Classification of Diseases, 11th Revision (ICD-11) functions well, whereas the criteria put forth in Section III of the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) are unnecessarily restrictive.
Community characteristics, such as perceived collective efficacy, a measure of community strength, can affect mental health outcomes following disasters. We examined the association of perceived collective efficacy with posttraumatic stress disorder (PTSD) and frequent mental distress (14 or more mentally unhealthy days in the past month) following exposure to the 2004 and 2005 hurricane seasons.
Methods
Participants were 1486 Florida Department of Health workers who completed anonymous questionnaires that were distributed electronically 9 months after the 2005 hurricane season. Participant ages ranged from 20 to 79 years (mean, 48; SD, 10.7), and the majority were female (79%), white (75%), and currently married (64%). Fifty percent had a BA/BS degree or higher.
Results
In 2 separate logistic regression models, each adjusted for individual sociodemographics, community socioeconomic characteristics, individual injury/damage, and community storm damage, higher perceived collective efficacy was significantly associated with a lower likelihood of PTSD (OR, 0.93; 95% CI, 0.90-0.96) and with less frequent mental distress (OR, 0.94; 95% CI, 0.92-0.96).
Conclusions
Programs enhancing community collective efficacy may be a significant part of prevention practices and possibly lead to a reduction in the rate of PTSD and persistent distress postdisaster. (Disaster Med Public Health Preparedness. 2019;13:44–52).