Objective: While early-life adverse experiences have been linked to late-life cognitive decline, few studies have explored exposure to war specifically. Paradoxically, one study even indicated a late-life cognitive advantage of early-childhood war exposure. In the present study, we explored these associations.
Methods: We examined older adults who had been exposed, during early or middle childhood, to World War II (1940–1944; n = 1179) or the subsequent Greek Civil War (1946–1949; n = 962) in Greece. Participants underwent a comprehensive neuropsychological assessment and ApoE-ε allele genotyping, and demographic information and medical history were collected.
Results: For WWII, higher performance on language tasks predicted membership in the middle-childhood, relative to the early-childhood, exposure group (B = 0.316, p = .038, OR = 1.372, 95% CI: 1.018–1.849), primarily in men, while higher attention/speed (B = 0.818, p = .002, OR = 2.265, 95% CI: 1.337–3.838) and total cognitive scores (B = 0.536, p = .040, OR = 1.709, 95% CI: 1.026–2.849) predicted middle-childhood group membership in men only. Individuals who did not meet criteria for mild cognitive impairment (MCI)/dementia were more likely to belong to the middle-childhood exposure group. Similarly, for the Civil War, higher cognitive scores (visuospatial score: B = 0.544, p = .001, OR = 1.723, 95% CI: 1.246–2.381; MMSE: B = 0.134, p = .020, OR = 1.143, 95% CI: 1.021–1.297) and a lower likelihood of meeting criteria for MCI/dementia predicted middle-, relative to early-, childhood exposure group membership, primarily in women. Results remained consistent after adjusting for multimorbidity, sex, education, current age, depression, and anxiety.
Conclusion: The present findings suggest that better cognitive performance and a lower likelihood of MCI or dementia were associated with exposure to significant hardships, such as war, during middle rather than early childhood, independent of potential confounders. Further studies are needed to shed light on this relationship.