A fascinating study exploring the longevity of Black Death survivors in fourteenth-century England has shed light on how famine and early-life malnutrition can affect health well into adulthood.
The research, published in Science Advances, reveals a biophysical trade-off the body makes to cope with severe nutritional stress in the womb or in childhood, a trade-off that may still apply today.
While surviving famine increased the likelihood of reaching adolescence, it came at the cost of increased susceptibility to death later in life from common causes such as cardiovascular disease and cancer.
“What our findings suggest is that relationships between malnutrition early in life and poor health in middle and late adulthood have a very deep history,” researcher Sharon DeWitte, PhD, from the University of Colorado, told Inside Precision Medicine.
“This highlights the need to ensure that pregnant individuals and young children have access to nutritious diets in order to lessen the risks of poor health across the lifespan, and for people who know they experienced malnutrition as children to be really vigilant about diets and other healthy lifestyle habits as adults.”
The study examined the relationship between childhood nutritional stress and health outcomes using skeletal remains from 275 people who lived in London and rural Lincolnshire from approximately 1000 to 1540 CE, during and after the Black Death.
Famines struck England every 14 years on average from the eleventh to the thirteenth centuries CE, becoming slightly less frequent in the fourteenth to sixteenth centuries.
These famines are often attributed to climatic shifts at the time, with their effects exacerbated by social and economic factors, a dynamic captured by the concept of the "syndemic."
The syndemic framework recognizes that diseases are not purely biological: outcomes are not simply the product of a pathogen interacting with a person's immune system but emerge from synergistic effects at the broader individual and societal levels.
Understanding disease susceptibility and outcomes therefore requires careful attention to biological, environmental, physical, and social contexts, which can include coinfection with multiple pathogens, malnutrition, wealth inequality, and social marginalization.
The team examined bones and teeth, focusing on isotopic profiles preserved in dentine, the mineralized tissue beneath the enamel that can record periods of nutritional stress.
Because teeth form on a well-understood, predictable schedule during childhood and adolescence and, unlike bone, do not remodel, changes in the carbon and nitrogen isotope ratios along the tooth can be used to track changes in diet and physiology over defined periods of life.
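Measurements of this kind are conventionally reported in delta notation, in per mil relative to a reference standard (VPDB for carbon, atmospheric N2 for nitrogen). The short sketch below shows that standard conversion; it is illustrative only, not code from the study, and the dentine-increment ratios are invented for the example.

```python
# Minimal sketch: converting measured isotope ratios into standard
# delta notation, delta = (R_sample / R_standard - 1) * 1000, in per mil.
# Reference ratios are the conventional VPDB (carbon) and AIR (nitrogen)
# standards; the sample values below are hypothetical.

R_VPDB_13C = 0.0111802   # 13C/12C of the VPDB standard
R_AIR_15N = 0.0036765    # 15N/14N of atmospheric N2 (AIR)

def delta_per_mil(r_sample: float, r_standard: float) -> float:
    """Return the delta value in per mil for a measured ratio."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Hypothetical measurements from successive dentine increments;
# a rise in d15N across increments can flag nutritional stress.
increments = [
    ("age ~2 y", 0.0109650, 0.0037100),
    ("age ~5 y", 0.0109420, 0.0037250),
]

for label, r13c, r15n in increments:
    d13c = delta_per_mil(r13c, R_VPDB_13C)
    d15n = delta_per_mil(r15n, R_AIR_15N)
    print(f"{label}: d13C = {d13c:+.1f} per mil, d15N = {d15n:+.1f} per mil")
```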
Restricting their analysis to people who died before the age of 30, the researchers found that survivors of nutritional stress in utero or in early childhood proved resilient, dying in childhood, adolescence, or early adulthood at rates comparable to those without such stress signatures.
“However, in our study, the lack of a significant difference in hazards of death for those who died below the age of 30 with and without signatures of childhood nutritional stress cautions us against overinterpreting the survival analysis findings for this age group,” the authors wrote.
Perhaps more meaningful, among people older than 30 at death, survivorship was significantly higher and the hazard of death significantly lower for those without signatures of severe childhood nutritional stress than for those with them.
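For readers curious how such a comparison works in practice, the sketch below outlines a survivorship and hazards analysis of this general kind using the lifelines library. It is an assumption-laden illustration, not the paper's actual code: the DataFrame columns, ages at death, and stress flags are all hypothetical.

```python
# Minimal sketch (not the study's code): comparing survivorship and
# hazards of death for adults with vs. without isotopic signatures of
# severe early-life nutritional stress. All data below are invented.
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical skeletal data: age at death (years) and a binary flag
# for signatures of severe childhood nutritional stress.
df = pd.DataFrame({
    "age_at_death": [34, 41, 55, 38, 62, 47, 33, 58, 36, 44, 66, 39],
    "stress_signature": [1, 1, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1],
})
adults = df[df["age_at_death"] >= 30]   # restrict to deaths at 30+
adults = adults.assign(observed=1)      # age at death is known for all

# Kaplan-Meier survivorship curves, one per stress group
for flag, label in [(0, "no stress signature"), (1, "stress signature")]:
    grp = adults[adults["stress_signature"] == flag]
    KaplanMeierFitter().fit(grp["age_at_death"], grp["observed"], label=label)

# Log-rank test for a difference in survivorship between the groups
a = adults[adults["stress_signature"] == 1]
b = adults[adults["stress_signature"] == 0]
print(logrank_test(a["age_at_death"], b["age_at_death"]).p_value)

# Cox proportional hazards model: a hazard ratio above 1 for the
# stress flag would indicate a higher risk of death with stress.
cph = CoxPHFitter().fit(adults, duration_col="age_at_death", event_col="observed")
cph.print_summary()
```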
“Perhaps the people in our study who suffered early-life nutritional stress underwent developmental adaptations as a result that shaped their physiology for a future of nutritional deprivations, but they subsequently experienced relative nutritional abundance in adulthood,” the authors speculated in their article.
“Thus, they may have suffered higher rates of diseases common in later adulthood, such as cardiovascular disease and other ‘degenerative’ diseases, as has been suggested to explain the relationships between early-life stress and noncommunicable disease outcomes for present-day people.”