Category: 8. Health

  • Is fruit juice good or bad for diabetes? New research reveals a genetic twist

    New research reveals that enjoying fruit juice may help offset genetic risk for type 2 diabetes, at least for some Japanese adults, underscoring the promise of personalized nutrition in disease prevention.

    Inverse association between fruit juice consumption and type 2 diabetes among individuals with high genetic risk of type 2 diabetes: The J-MICC Study. Image Credit: Garna Zarina / Shutterstock

    In a recent study published in the British Journal of Nutrition, researchers investigated whether 100% fruit juice consumption influences the risk of type 2 diabetes (T2D) in Japanese adults.

    T2D prevalence has steadily increased in Japan, affecting about 8% of adults. Adiposity, genetic predisposition, and age are known drivers of T2D, with modifiable dietary factors being important targets for prevention. Fruit juice is often studied in this context, although the findings remain inconsistent: various studies have demonstrated null, positive, or inverse associations between fruit juice and incident T2D.

    This heterogeneity reflects differences in juice type (a key distinction being between sugar-sweetened beverages and 100% fruit juice), as well as population differences in obesity, diet, and genetic risk, and differences in analytical approach. Rapidly absorbable sugars could increase postprandial glucose levels, weight gain, and hepatic lipogenesis, elevating T2D risk, whereas fruit-derived micronutrients may improve insulin sensitivity and reduce oxidative stress, decreasing risk.

    Polygenic risk scores (PRSs) summarize thousands of common T2D-related variants and estimate inherited risk. A growing body of evidence suggests that gene-diet interactions could attenuate or amplify nutritional effects on glycemic outcomes. However, no study has investigated fruit juice consumption across strata of T2D polygenic risk in East Asians, and whether fruit juice intake is associated with T2D among Japanese adults remains unclear.

    About the study

    In the present study, researchers evaluated the associations between fruit juice intake and T2D risk in Japanese adults. They used data from the Japan Multi-Institutional Collaborative Cohort (J-MICC) study, which recruited around 100,000 adults aged 35-69 from 2005 to 2014. It is important to note that this was a cross-sectional study, which examines data at a single point in time and therefore cannot establish a cause-and-effect relationship. Participants completed a questionnaire on sociodemographics, lifestyle, and medical history at baseline.

    Anthropometric measurements were taken, and blood samples were collected at baseline. The International Physical Activity Questionnaire was used to assess physical activity. The Food Frequency Questionnaire was administered to estimate food intake. The questionnaire evaluated the consumption of 100% fruit juice across seven categories, which for primary analysis were grouped into “no intake” or “at least once weekly.”

    The primary outcome was self-reported, physician-diagnosed T2D. In total, 14,068 J-MICC participants were genotyped. Two East-Asian polygenic scores for T2D were identified from the Polygenic Score Catalog: PGS002379 with 920,930 variants and PGS001294 with 3,496 variants. The primary analyses used PGS002379 as it had broader genomic coverage.

    PRS was calculated for each participant and standardized to a Z-score. The Z-score was stratified into quintiles (high, middle-high, middle, middle-low, and low), with the top quintile being the high genetic risk group. Multivariate logistic regression models were used to estimate odds ratios and 95% confidence intervals for T2D. Model 1 was adjusted for participants’ age, sex, and site (or residential location).

    Model 2 was additionally adjusted for hypertension, dyslipidemia, education, and family history of T2D. Model 3 was further adjusted for physical activity, alcohol and smoking status, and daily average sleep duration. In addition, the team investigated the effect of interactions between PRS and fruit juice intake on T2D. Finally, they assessed the association between fruit juice intake and T2D stratified by PRS quintiles.
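    The stratified analysis described above (PRS standardized to a Z-score, cut into quintiles, then odds ratios estimated within the high-risk stratum) can be sketched in outline. This is a minimal illustration on synthetic data: the variables, effect sizes, and the crude odds-ratio calculation are assumptions for demonstration, not the study's actual data or its multivariate adjusted models.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000

# Synthetic stand-in for the cohort (illustrative values, not J-MICC data).
prs = rng.normal(size=n)                     # raw polygenic risk score
juice = rng.integers(0, 2, size=n)           # 1 = 100% fruit juice at least weekly
p = 1 / (1 + np.exp(-(-3 + 0.5 * prs - 0.4 * juice)))
t2d = rng.random(n) < p                      # simulated T2D outcome

# Standardize the PRS to a Z-score, then stratify into quintiles (0..4).
z = (prs - prs.mean()) / prs.std()
quintile = np.searchsorted(np.quantile(z, [0.2, 0.4, 0.6, 0.8]), z)
high_risk = quintile == 4                    # top quintile = high genetic risk

# Crude odds ratio for juice intake within the high-risk stratum,
# with a Woolf (log-OR) 95% confidence interval.
m = high_risk
a = np.sum(m & (juice == 1) & t2d)           # exposed cases
b = np.sum(m & (juice == 1) & ~t2d)          # exposed non-cases
c = np.sum(m & (juice == 0) & t2d)           # unexposed cases
d = np.sum(m & (juice == 0) & ~t2d)          # unexposed non-cases
or_ = (a * d) / (b * c)
se = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo, hi = np.exp(np.log(or_) + np.array([-1.96, 1.96]) * se)
print(f"High-PRS stratum OR for juice intake: {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

    The study itself fitted multivariate logistic regressions (its models 1-3) rather than crude two-by-two odds ratios; the sketch only conveys the stratify-then-estimate logic.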

    Findings

    The study included 13,769 participants, of whom 7,517 were female. Of these, 814 individuals (267 females and 547 males) were diagnosed with T2D. Individuals with T2D were significantly older and had higher weight and height than those without T2D. Further, T2D subjects had longer daily sleep duration and lower physical activity than non-T2D subjects.

    Alcohol intake, smoking, hypertension, and hyperlipidemia were more prevalent among T2D subjects than non-T2D subjects. Notably, a greater proportion of T2D subjects had quit smoking and drinking than those without T2D. Participants who consumed 100% fruit juice had significantly lower odds of T2D than those who did not. A more detailed analysis showed a dose-response pattern, in which higher consumption was associated with lower odds of T2D.

    This association persisted after additional adjustment for confounding variables (models 2 and 3). After finding statistical evidence of a gene-diet interaction, the key finding emerged from the stratified analysis: the odds of T2D were significantly lower only among fruit juice consumers with high genetic risk, with no significant association in those with low or moderate genetic risk. However, the study’s design leaves open the possibility of reverse causation, whereby individuals may have reduced their fruit juice intake after receiving a T2D diagnosis.

    Conclusions

    Taken together, the findings suggest an inverse association between T2D and 100% fruit juice intake among Japanese people with high genetic risk for T2D. Notably, this association was not observed in people with low genetic risk for T2D, strengthening the suggestion of potential interactions between dietary factors and genetic predisposition. Given the study’s limitations, particularly its cross-sectional nature, these findings do not prove that fruit juice prevents diabetes. Further longitudinal studies are needed to clarify whether this association is causal and to identify the specific genetic variants that may modulate the metabolic response to fruit juice constituents.

  • One Piece of Advice to Parents Slashed Food Allergies in Children : ScienceAlert

    New research highlights a crucial time window very early in life, during which introducing eggs and peanut butter into babies’ diets significantly reduces the chances of them becoming allergic to these foods later on.

    The findings run counter to previous advice given to parents to avoid giving these foods to their kids until they’re at least a year old over concerns that they might trigger allergic reactions.

    Here, researchers led by a team from the University of Western Australia compared the experiences of two groups of children in Australia: 506 whose parents got no specific feeding advice, and 566 whose parents were advised to start adding eggs and peanut butter to the diets of the infants at around six months.

    Related: Skin Injuries And Food Allergies May Have a Mysterious Connection

    “For the babies in group two – whose caregivers followed the updated guidelines and introduced peanut butter and egg around six months of age – egg allergy reduced from 12 percent to 3 percent, and peanut allergy reduced from around 6 percent to 1 percent,” says Summer Walker, a health scientist at the University of Western Australia.

    In other words, earlier introduction of these foods at the six-month mark made a notable difference to the number of kids who went on to develop allergies by 12 months of age. Cow’s milk was also included, though here the difference was smaller.

    The advice itself isn’t new, and the six-month milestone has in fact now been added to the official Infant Feeding and Allergy Prevention Guidelines proposed by the Australasian Society of Clinical Immunology and Allergy (ASCIA). Testing the guidelines in real population groups confirms the recommendations are safe and effective.

    The researchers tracked allergic reactions in babies up to 12 months old. (Walker et al., J Allergy Clin Immunol Pract., 2025)

    The parents of the second group of 566 were provided with hard copies of the ASCIA guidelines, and the researchers are keen to raise awareness of the latest expert advice on how best to reduce allergy risk.

    “By increasing the distribution of guidelines and encouraging health professionals to share the information, we can considerably reduce the incidence of food allergies in the community,” says Walker.

    Understanding why allergies develop is a complex challenge, and it’s important to note that these infants were only tested for allergies up to 12 months – and that allergies to peanut butter and eggs weren’t completely eradicated.

    Nonetheless, amid signs that food allergies in children have been on the rise in spite of instructions to avoid specific food items, a review of the relationship between diet and immune responses is critical.

    This is just one part of the picture, but it’s strong evidence that the latest guidelines do make a difference – even in relation to a higher level of genetic risk. All the babies involved in the research had a close relative with an allergy to one of these foods, and allergies often run in families.

    “Some parents are still confused about when to introduce allergens – especially those families with a history of allergies,” says research dietitian Debbie Palmer, from the University of Western Australia.

    The research has been published in the Journal of Allergy and Clinical Immunology: In Practice.

  • Ultra-processed foods threaten brain health in kids and teens, review warns

    A major review finds that diets high in ultra-processed foods may rewire the developing brain, amplifying risks for ADHD, depression, and even dementia, spotlighting the urgent need to rethink what children and expectant mothers eat.

    Review: The consequences of ultra-processed foods on brain development during prenatal, adolescent and adult stages. Image Credit: Lightspring / Shutterstock

    In a recent review article published in the journal Frontiers in Public Health, researchers in Switzerland examined the impacts of consuming ultra-processed foods (UPFs) on brain function and development during the critical periods of childhood, adolescence, and pregnancy.

    Their conclusions raise concerns that exposure to UPFs during early life could impair cognitive development and increase risks to long-term mental health, including neurodevelopmental disorders such as ADHD and ASD, and later-life risks such as dementia and Alzheimer’s disease. The review also describes habitual overconsumption and reward dysfunction related to UPF-driven changes in brain reward circuitry, though it does not explicitly characterize these as “addiction-like eating behaviours.” This highlights the urgent need for public health strategies targeting maternal and child nutrition.

    Growing concerns over UPFs

    UPFs, which are energy-dense products high in unhealthy fats, salt, and sugar, have become a major part of modern diets and are increasingly being linked to mental disorders, metabolic disease, and obesity.

    While effects on adults are well-documented, the impact of UPFs on brain development during vulnerable life stages, such as early childhood, adolescence, and pregnancy, is less understood.

    The development and spread of convenience foods began in the mid-20th century, with products like frozen dinners and the introduction of microwaves boosting their popularity.

    By the 1980s, concerns over health impacts began to grow, and in 2009, the NOVA classification system formally defined UPFs, distinguishing them from minimally processed and whole foods.

    UPFs are designed for palatability, affordability, and long shelf life, but are nutritionally poor and often contain additives and harmful byproducts from processing and packaging. The NOVA system groups foods into four categories based on processing levels, with UPFs being the most altered and least nutritious.

    UPFs now account for more than half of dietary energy intake in many developed nations, with consumption also rising in middle-income countries.

    Of particular concern is the increasing intake among children and adolescents, a population highly susceptible to nutritional deficiencies. This trend could have profound effects on brain development and mental health. It may also exacerbate sensory-driven selective eating behaviours, such as those seen in avoidant/restrictive food intake disorder (ARFID), which the review discusses in the context of UPFs’ uniform texture and sensory properties without causally linking the disorder to UPF consumption. Such effects may reinforce a cycle of poor health outcomes across generations.

    Given the complexity of brain maturation and the role of nutrition in shaping outcomes throughout one’s life, understanding the impacts of UPFs on neurodevelopment is crucial.

    The lifelong and intergenerational impact of ultra-processed foods (UPFs) on health and neurodevelopment. This figure illustrates the profound and cumulative effects of UPF consumption across different life stages (prenatal period, childhood, adolescence, adulthood, and old age), highlighting their role in a broad spectrum of neurodevelopmental, metabolic, cardiovascular, and cognitive disorders. The interconnected arrows emphasize how exposure to UPFs in one stage can amplify health risks in later stages, creating a continuous and reinforcing cycle of adverse health outcomes.

    Health consequences of UPF consumption

    Large-scale studies consistently tie UPFs to weight gain and obesity across age groups. Diets rich in UPFs also elevate risks of certain cancers, cardiovascular disease, type 2 diabetes, metabolic syndrome, dyslipidemia, and hypertension.

    During pregnancy, high UPF intake predicts pre-eclampsia, gestational diabetes, and poorer neonatal outcomes such as congenital heart defects and pre-term births.

    Beyond these disorders, UPFs displace nutrient-dense foods, producing micronutrient deficits that are especially harmful during rapid growth and neural maturation. Emerging evidence further links heavy UPF consumption to hyperactivity, inattention, depression, and anxiety, with potential for cumulative, lifelong neurocognitive harm.

    The review notes that deficiencies in specific nutrients, such as iron and zinc, associated with high UPF intake, may impair neurodevelopmental processes and cognitive functions in offspring. Animal evidence is also discussed, such as findings that trans-fat intake during pregnancy can induce hippocampal inflammation and memory deficits in offspring.

    UPFs thrive where food skills, budgets, and time are limited. Lower-income, single-parent or dual-working households lean on cheap, ready-to-eat products, while school meal programmes often reinforce these habits, as UPFs provide 65% or more of lunchtime calories in many UK schools.

    Urbanization, frequent snacking, dining out, poor sleep, and persuasive marketing magnify exposure. Adolescents are the heaviest consumers, although intake falls with age. Cultural context matters, with Japan’s education-focused school lunches showing that policy can curb reliance on UPFs.

    Eating is regulated by intertwined homeostatic (energy need) and hedonic (reward seeking) systems. Dopaminergic pathways from the ventral tegmental area to the striatum and prefrontal cortex drive the powerful reward response evoked by palatable UPFs, often overriding satiety signals. The review discusses how repeated UPF exposure can hypersensitize reward circuits and reinforce habitual or compulsive consumption, using terms such as “reward dysfunction” and “habitual overconsumption” rather than “addiction-like behaviours.”

    Key hubs include the hypothalamus, amygdala, hippocampus, and insular cortex, all integrating metabolic cues, memories, and emotions to shape food choice. Disruption of these circuits is implicated in attention deficit hyperactivity disorder (ADHD), autism spectrum disorder (ASD), and binge eating behaviours.

    The third trimester and early childhood are highly plastic phases; inadequate maternal or infant nutrition can permanently alter synaptogenesis and myelination. Adolescence represents a second vulnerable window: the prefrontal cortex and mesolimbic dopamine system are still maturing, heightening sensitivity to rewarding foods and emotional stress.

    Repeated UPF exposure during these periods strengthens hedonic pathways and weakens inhibitory control.

    UPF availability, aggressive advertising, and screen time create an obesogenic environment that cements taste preferences for energy-dense, sweet, and salty foods. Early life UPF consumption predicts chronic inflammation, persistent obesity, metabolic dysfunction, and a greater risk of mental health disorders into adulthood. The review also notes that the prevalence of ARFID and other eating disorders may be exacerbated by UPFs’ sensory properties, but does not present a direct causal link.

    Curbing UPF intake in mothers, children, and adolescents is therefore vital to break the intergenerational cycle of diet-related disease.

    Maternal consumption of UPFs during pregnancy can negatively affect fetal brain development during the critical gestational window of 24–42 weeks. UPFs may disrupt key neurodevelopmental processes like synapse formation, myelination, and neurotransmitter signaling, primarily through inflammation, oxidative stress, epigenetic changes, and gut microbiome alterations.

    Deficiencies in nutrients like long-chain fatty acids, zinc, iron, and protein due to high UPF intake may impair emotional regulation, memory, and cognition in children. These impacts can be long-lasting and may also increase the risk for neurodevelopmental disorders such as ADHD and ASD. Some UPF components, including nanoparticles and additives, may further harm the developing brain. For example, nanoparticles such as titanium dioxide and certain additives can cross the blood–brain barrier, potentially impairing memory and learning, while exposure to bisphenols may disrupt dopamine and serotonin signaling in the developing brain.

    The review further emphasizes the role of the gut–brain axis as a mechanistic link between UPF intake and brain health, highlighting how UPF-induced alterations in the gut microbiome may impair the synthesis of neurotransmitters like serotonin and brain-derived neurotrophic factor (BDNF), both critical for cognitive development and mood regulation.

    Conclusions

    Cumulative exposure to UPFs from fetal life through adulthood is now clearly associated with a wide range of neurocognitive consequences, from early executive dysfunction to increased dementia risk later in life.

    The mechanisms include altered brain reward signaling, gut–brain axis disruption, and inflammation-driven neural changes. Because these effects begin early and build over time, preventive action during pregnancy and childhood offers the greatest potential benefit.

    The authors call for public health policy levers—such as reducing UPF availability, mandating unambiguous front-of-pack food labelling, and stimulating product reformulation—as well as prioritizing longitudinal neuroimaging research to confirm causality and pinpoint sensitive developmental windows.

    Policy efforts should aim to reduce UPF availability, improve food labelling, and promote reformulation. Clinicians should also encourage diets rich in fiber and minimally processed foods to support brain development and long-term cognitive health.

  • The #1 Food to Limit to Reduce Your Risk of Dementia

    • Limiting candy in your diet may support brain health, as high added sugar intake could increase the risk of Alzheimer’s disease. 
    • A brain-healthy lifestyle includes regular exercise, managing chronic diseases, staying socially engaged and following diets like the MIND diet. 
    • Making mindful dietary and lifestyle choices can enhance cognitive health and overall well-being over time.

    More than 55 million people have dementia worldwide, with Alzheimer’s disease being the most common form, contributing to 60% to 70% of dementia cases. Having Alzheimer’s disease means living with a progressive disorder that causes brain cells to degenerate and die, leading to a continuous decline in memory, thinking skills and the ability to perform everyday tasks. Sadly, as the disease progresses, even basic activities and communication become challenging.

    Several factors influence the risk of developing dementia, with some being completely beyond your control. Aging is the most significant risk factor, as individuals over the age of 65 are more susceptible. Genetics also play a crucial role, with specific genetic mutations directly linked to Alzheimer’s disease. However, along with unchangeable factors, certain lifestyle choices can help lower the risk of cognitive decline, with diet being a pivotal piece of the puzzle. “Some of the best foods for brain health are antioxidant-rich wild blueberries, salad greens for B vitamins, salmon for its anti-inflammatory fatty acids, fiber-rich black beans, and walnuts, the best source of plant-based omega-3 ALA among nuts,” says Maggie Moon, M.S., RD. There are some foods you should avoid when focusing on brain health support too, with candy being the #1 food on that list. 

    Why You Should Limit Candy for Brain Health

    Taking steps to reduce dementia risk is a positive move for brain health. While no single food will cause dementia, high-added-sugar candy tops the list of foods to limit on a brain-healthy diet.

    “Candies are not your brain’s friend,” Moon says. She points to a study that found that eating too much added sugar more than doubled the risk for dementia. “That includes added sugar from candies, as well as other sweets like pastries, sweetened café drinks and sodas,” she says. Researchers think that high blood sugar and insulin levels are risk factors for Alzheimer’s because insulin resistance may also occur in the brain, which may impact memory.

    Of course, everything can be eaten in moderation in a healthy, balanced eating plan. “While fine once in a while, research has found that a diet that is consistently high in added sugar may increase the amyloid plaque buildup in the brain,” says Laura M. Ali, M.S., RDN. “These plaques disrupt the communication system in our brain, and scientists have found that people with Alzheimer’s disease tend to have more of these plaques.”

    In fact, says Ali, one study found that every 10 grams of added sugar consumed per day (equivalent to 2½ teaspoons of sugar or 8 gummy candies) was associated with a 1.3% to 1.4% increased risk of developing Alzheimer’s disease. Those with the highest daily added sugar intake had 19% higher odds of developing Alzheimer’s disease.

    Other Ways to Reduce Your Risk of Dementia 

    Limiting sweetened candy doesn’t guarantee that you won’t get dementia, but it is a positive step forward. Along with limiting added sugar in your diet, here are some other ways to reduce your dementia risk:

    • Exercise by participating in both aerobic activity and resistance exercise. 
    • If you smoke cigarettes, take the first steps to quit. 
    • Limit alcohol intake. If you regularly drink alcohol, try to do so in moderation. Excessive drinking is linked to cognitive decline. Moderate drinking means two drinks or less in a day for men and one drink or less in a day for women.
    • Stay socially engaged. Maintaining social connections builds your cognitive reserve to maintain good brain function with age. 
    • If you have chronic diseases, such as high blood pressure and diabetes, make sure you’re managing these well. Stiffness in arteries and blood vessels can damage the brain. If you need help or individualized advice, reach out to a healthcare professional. 
    • Include brain-healthy foods in your diet. The MIND diet emphasizes foods like whole grains, nuts, berries, vegetables and olive oil, which research shows may help support brain health. “The brain-healthy MIND diet limits foods high in saturated fats and added sugars because both are linked to oxidative stress, inflammation and the brain plaques and tangles associated with Alzheimer’s disease,” says Moon. She clarifies that this diet limits—but does not eliminate—fried foods, pastries and sweets, red meat, whole-fat cheese and butter.

    Our Expert Take 

    Nothing will guarantee that you will live a life free from dementia. But certain steps may help reduce your risk, with your dietary choices being one factor. And along with eating brain-healthy foods, limiting your candy intake can help keep you cognitively sharp. Enjoying a small handful of candy corn on Halloween or conversation hearts on Valentine’s Day won’t “cause” dementia. “It’s important to remember that no single food eaten once, or even once in a while, is going to make or break your brain health,” Moon adds.

  • New research urges redesign of sensory spaces for autistic adults

    Researchers are calling for a rethinking of calming environments, as new findings highlight how autistic people experience the world differently.

    Published in Autism in Adulthood, the research surveyed 96 autistic adults across multiple countries. Respondents identified common elements that support wellbeing, including music, nature, solitude and the ability to personalise their surroundings.

    The study also revealed that sensory experiences differ significantly among autistic adults—what calms one person may overstimulate or distress another.

    Flexibility is essential

    Lead author and University of South Australia (UniSA) PhD candidate Connor McCabe said calming spaces must move beyond child-focused models and prioritise autonomy.

    “Our research highlights the incredible diversity of sensory needs within the autistic community and the importance of offering flexibility and personal control within these spaces,” McCabe said.

    Participants highlighted lighting, sound, and touch as key factors that affected their ability to relax. Dim or adjustable lighting, access to television, books, video games, and natural soundscapes were commonly mentioned as beneficial. While trends were identified, the researchers warned against a “one-size-fits-all” approach.

    “That’s why it’s so important that these spaces offer choice – adjustable lighting, varied seating, different soundscapes and – above all – privacy,” McCabe said.

    Traditional features fall short

    The study, conducted with Dr Nigel Newbutt from the University of Florida, found that many conventional sensory room features—such as vibration-based devices, wall projections, and standard sensory toys—were not widely valued by participants.

    Instead, respondents called for more natural features, including greenery, calming water elements, and even interaction with animals.

    Co-author Professor Tobias Loetscher, a cognitive psychologist at UniSA, said participants frequently stressed the importance of controlling aspects of the space, such as temperature, sound levels, and who is permitted entry.

    Virtual reality as an emerging solution

    McCabe is finalising a second study involving the co-design of a virtual reality (VR) sensory room in collaboration with autistic adults. The project aims to offer personalised, adaptable environments through immersive digital technology.

    “This VR sensory experience differs quite largely from what is typically found in a sensory room, as the virtual aspect allows much more freedom in terms of the environments we can create, and the stimulation that can be provided,” McCabe said.

    “With virtual reality, people can engage in calming activities like virtual forest walks or immersive soundscapes without needing large physical spaces.”

  • Stop scaring the young witless about cancer

    There have been countless reports recently warning of the ‘explosion’ of various forms of cancer among young(ish) people. Prompted by research documenting the rising rate of cancer diagnoses among people under 50, media outlets and politicians, from the Guardian and the BBC to RFK Jr, have been eagerly speculating and fearmongering about the likely causes.

    What’s often missing from this discussion, however, is the crucial context of year-on-year improvements in cancer incidence and mortality, and substantial declines in the age-standardised death rates. Among people of the same ages, the cancer mortality risk has fallen by about one-third since 1990.

    Collectively, the evidence suggests that the increases in early-onset cancers, especially gastrointestinal and breast cancers, are real, but they still account for only around 10 per cent of all cancer diagnoses. The remaining 90 per cent occur in people over 50. Early-onset cancers are certainly distressing, not least because of the emotional toll, protracted treatment and associated side effects like infertility. But their public-health significance depends not just on how serious they are, but also on how many people they affect.

    Plenty of causes have been proposed for the uptick, but the usual suspects dominate: rising levels of obesity, excess alcohol consumption, lack of exercise and poor sleep. In the US, obesity roughly doubled from the early 1990s to 2017. That timing makes for an awkward fit: today’s 30- to 50-year-olds were largely too old to have grown up during the peak of that surge, and most cancers – especially solid tumours like those in the gastrointestinal tract – have long latency periods. Even if we were to assume obesity is the cause, that still leaves many other early-onset cancers unaccounted for. Gastrointestinal cancer represents only about 10 per cent of the increasing cancer incidence. The rest includes cancers like thyroid, testicular and melanoma, which have no strong lifestyle links.

    Then there are many other possible causes to explore. Rising parental age, increased childlessness and reduced breastfeeding may shape early-life biological risk and contribute to an increased incidence of breast cancer in women. There is ongoing speculation about the role of endocrine-disrupting chemicals, antibiotics and other environmental exposures. There is also the intriguing possibility that the sharp decline in smoking means those genetically predisposed to cancer are not developing typical smoking cancers, but other forms, such as gastrointestinal cancer.



    Of course, the expanded use of screening – alongside broader access to healthcare, insurance and routine checks – inevitably leads to more cancers being caught earlier. The fact that mortality remains flat even as incidence rises supports the idea that overall, we’re seeing more diagnosis, not more disease.

    It is an entirely reasonable endeavour to try to reduce all incidences of cancer, but there are limits to what current public-health advice can achieve. Even if everyone ate mostly plants, drank little or no alcohol, exercised daily and slept soundly, cancer incidence might drop by 30 to 50 per cent at most. That is because the modifiable risk factors, while not irrelevant, aren’t especially powerful.

    Obesity is associated with a two- to four-fold increase in the risk of developing certain cancers – most notably endometrial, oesophageal, colorectal, kidney and postmenopausal breast cancer. For most cancer types, however, the increased risk associated with obesity is below two-fold. Far more impactful is excessive alcohol consumption, which roughly doubles the risk of liver and oesophageal cancers, and increases breast and colorectal cancer risk by 30 to 60 per cent. Even these numbers pale in comparison with smoking. Smoking increases the lifetime risk of lung cancer by about 25 times, doubles the risk of serious illness in middle age and triples or quadruples the risk of fatal cancers overall. The effects of other lifestyle factors, like sleep, stress and physical inactivity – which increasingly crop up in cancer-related headlines – are very small.

    It is morally dubious to give the impression that one’s health is entirely under one’s personal control because (with the exception of chronic smoking) it isn’t. More than that, trying to optimise your health above all else often means giving up the kind of life most people actually want to live. If you find joy in military-grade dietary discipline, precision-tuned workouts and lifelong abstinence, then fine – you might gain an extra two to four years of life. But scaring people into conformity because of a modest rise in early cancer is unreasonable.

    Yet that’s exactly where some interventions courted by healthcare bodies are headed. The ‘health belief model’, for example, urges people to see themselves as high-risk and to absorb that risk as a personal concern. This way, they can be more readily nudged into preventative health behaviours. Applying that logic to adolescents means fuelling fresh anxieties in a generation already spooked by crime, climate collapse and economic doom.

    For years, the medical authorities have inflated health risks beyond proportion, fuelling an unhealthy obsession with staying well. But health at all costs is too high a price. It’s frankly anti-social to stoke panic over a modest rise in early-onset cancers just to herd everyone under 30 into bland diets, joyless workouts, routine screenings and a state of constant bodily surveillance. That’s not healthy living, it’s lifestyle asceticism for zealots. Most people accept a little risk in exchange for a life that’s actually worth living. What they reasonably reject is misery now in return for a few extra years at the end – years that are never guaranteed to be gained and are always guaranteed to end.

    Stuart Derbyshire is an associate professor of psychology at the National University of Singapore.



  • Common genetic mutation increases the risk of dementia in men

    Common genetic mutation increases the risk of dementia in men

    About one man in 36 carries two copies of a tiny change in the HFE gene and that detail can more than double his odds of developing dementia, according to a new analysis of nearly 12,200 Australians and Americans.

    Dementia already affects about 433,000 Australians, and while women still outnumber men, this fresh genetic signal shows that older males with the variant face a steeper climb toward memory loss.

    HFE gene, men, and dementia


    The haemochromatosis gene keeps the body’s iron traffic moving smoothly, yet its H63D version is anything but rare, turning up in one in three people as a single copy and in roughly one in 36 as a double hit.

    H63D does not usually overload the liver the way the better‑known C282Y variant can, but laboratory work shows that mutant HFE can upset cellular iron sensors and let free iron trigger damaging chemical reactions.

    Genes, iron, brains and memory

    Iron is vital but volatile, and when it piles up inside microglia, the brain’s immune cells, it sparks oxidative stress and inflammation tied to Alzheimer’s pathology.

    Reviews of animal and human data echo the theme: iron dyshomeostasis fans reactive oxygen species, injures neurons, and may even set off ferroptosis, a specialized form of cell death linked to cognitive decline.

    HFE variants have also been linked to Parkinson’s disease and motor neuron disorders, suggesting that iron mismanagement may be a shared pathway across multiple brain conditions.

    Studies have shown that abnormal iron levels in regions like the substantia nigra and motor cortex are associated with earlier onset and faster progression of these diseases in genetically susceptible individuals.

    What the ASPREE data revealed

    The ASPREE trial followed 19,114 healthy seniors for a median 6.4 years to test low‑dose aspirin, generating a gold‑mine of aging data in the process.

    Genome scans within ASPREE showed that men with two H63D copies logged an adjusted hazard ratio of 2.39 for incident dementia, while women with the same genotype saw no extra risk.

    “Having two copies of the variant more than doubled the risk of dementia in men, but not women,” said Professor John Olynyk of Curtin University Medical School, who co‑led the project.

    Why only men

    Menstruation, pregnancy, and lower mid‑life ferritin give women a natural outlet for iron, which may blunt the variant’s impact; haemochromatosis complications likewise appear later and milder in females.

    Brain‑imaging studies find that women who stop menstrual bleeding early accumulate more iron in deep brain regions, hinting that lifelong iron exposure could be the pivot that tips male brains toward damage.

    Sex hormones complicate matters too, because estrogen modulates iron metabolism and offers anti‑inflammatory benefits that may shield neural circuits.

    HFE analysis is already part of routine work‑ups when doctors suspect iron overload, but the new results suggest that broader screening in aging men could flag a hidden dementia risk before symptoms start.

    Identifying carriers is inexpensive, and unlike many predictive markers, this one points to an actionable pathway: iron handling and its downstream inflammatory effects.

    Practical steps today

    The research team found no clear link between blood ferritin levels and dementia, yet keeping iron in the normal range, treating chronic inflammation, staying active, and eating a plant‑biased diet all intersect with brain health.

    Regular physical activity alone can slash dementia risk by roughly 30 percent, according to World Health Organization estimates.

    Doctors also emphasize blood pressure control, smoking cessation, and social engagement, interventions that help every older adult, variant carrier or not.

    Current dementia prevention efforts often rely on broad lifestyle advice or age thresholds, but this variant suggests a need for sex‑aware screening policies.

    If men with double H63D copies truly carry double the risk, early genetic testing could steer them toward more aggressive prevention long before symptoms appear.

    Men, dementia, and the future

    Professor Paul Lacaze from Monash University hopes that untangling how H63D alters brain pathways will spark medicines that block the harm even when the gene cannot be changed.

    Future work will track variant carriers with MRI iron mapping and fluid biomarkers to catch early inflammatory changes, then test whether diet, phlebotomy, or anti‑ferroptotic drugs can tilt the balance back toward healthy aging.

    Genetics rarely hands scientists such a common, sex‑specific clue, and this one arrives in time for a generation of men now crossing the 70‑year threshold.

    The study is published in Neurology.




  • HIV Self-Testing Lauded, But It’s Not Included in WHO Lenacapavir Recommendations

    HIV Self-Testing Lauded, But It’s Not Included in WHO Lenacapavir Recommendations

    One way to cope with cuts in HIV prevention funding is to continue to push to increase the number of people who self-test for HIV instead of getting tested at a clinic or hospital, experts at a series of sessions on HIV self-testing said today at the International AIDS Society (IAS) 2025 meeting in Kigali, Rwanda.

    Self-testing can “help us preserve the hard-won gains across the treatment and prevention components of our programs,” said Cheryl Johnson, M.A., Ph.D., a technical officer on HIV testing services at the World Health Organization (WHO). Johnson, who spoke remotely to the session attendees, said the foundation for self-testing is strong, and “now it’s really time to act boldly to embed self-testing into routine service delivery.”

    Michelle Rodolph, M.S., M.P.H.

    But the WHO’s message on self-testing was somewhat mixed today. After Johnson’s full-throated endorsement, supported by a raft of statistics, Michelle Rodolph, M.S., M.P.H., who is also a technical officer at WHO but in HIV prevention overall, said that self-testing was not recommended in the much anticipated WHO guidelines for twice-a-year lenacapavir injections for HIV preexposure prophylaxis that are scheduled to be released at the IAS meeting tomorrow. Instead, the guidelines strongly recommend rapid diagnostic tests.

    “There wasn’t enough evidence to come up with a recommendation on HIV self-testing for long-acting injectable PrEP,” Rodolph said at the mid-morning session. She stressed the distinction between the lack of evidence needed for a WHO recommendation and the advisability of using self-testing as a practical matter. “I want to emphasize that HIV self-testing may be an important implementation consideration in some contexts, which should enable greater program flexibility and increase testing frequency,” said Rodolph, adding that WHO will review implementation research as the results come in and update the recommendations accordingly. “For you implementation researchers sitting in the room, this is a call and a request to all of you.”

    HIV self-testing was one of a set of topics explored in sponsored pre-conference sessions held today, before the full-fledged conference. The others included co-infections, success stories, African HIV vaccine research and “national leadership, integration, and sustainable HIV programs in an era of funding transition,” a veiled reference to the Trump administration’s pullback of funding for the U.S. President’s Emergency Plan for AIDS Relief, more commonly known as PEPFAR. In the fiscal year that ends in September 2025, the funding of PEPFAR was $.5 billion, with most of the money directed at HIV prevention and treatment in sub-Saharan Africa.

    HIV self-tests check for HIV antibodies. Some involve swabbing the gums; others use a finger prick to collect a small amount of blood. According to Rodolph, the WHO recommends HIV self-testing for initiation, reinitiation, and continuation of other forms of PrEP, including oral PrEP and the dapivirine vaginal ring. That recommendation also covers post-exposure prophylaxis, a short course of antiretroviral drugs taken after possible exposure to HIV.

    Johnson listed a wide range of benefits from HIV self-testing, including increased engagement, autonomy, access, equity and efficiency, as well as reduced costs to health systems. She shared data showing that 23 countries in Africa have distributed 6.3 million HIV self-testing kits, mostly in eastern and southern Africa. Johnson said the uptake of self-testing was greater among men, including the 35- to 49-year-old age group that has the largest number of undiagnosed people living with HIV.

    Johnson also discussed temporal trends, including a drop in price from $5-$40 to $1-$3 in low- and middle-income countries.

    Brooke Nichols, M.Sc., Ph.D.

    The session also included a presentation by Brooke Nichols, M.Sc., Ph.D., an associate professor at the Boston University School of Public Health, of a study that combined data on the effects of HIV self-testing in South Africa, Zambia, Kenya, Uganda and Lesotho. The findings shared by Nichols showed that in those countries, HIV self-testing resulted in 0.6-6 additional HIV-positive diagnoses per 100 tests distributed. Kenya, though, was the only country to show a significant effect on additional antiretroviral initiations, according to Nichols’ findings.

    Johnson urged the audience to check out a poster Nichols is presenting later in the conference that shows that self-testing leads to savings and reduces the burden on healthcare facilities and staff. “We’re seeing across Africa some new work showing this [HIV self-testing] can be quite cost-effective and could be a really good opportunity when we don’t have maybe the same capacity in the health facilities that we’ve had before.”


  • A Cross-Sectional Study Based on the National Health Interview Surveys

    A Cross-Sectional Study Based on the National Health Interview Surveys

    Correlation Between Visual Impairment and Breast Cancer: A Cross-Sectional Study Based on the National Health Interview Surveys

    Background/Significance

    Previous studies suggesting a negative correlation between breast cancer and visual impairment are limited by small sample sizes, underscoring the need for larger-scale analyses to clarify this relationship and its clinical implications.

    Materials and Methods

    To better understand this correlation, we conducted a cross-sectional study using data from 39,439 individuals from the National Health Interview Surveys, ensuring sufficient sample sizes across all degrees of visual impairment.

    Results

    Our results showed increased breast cancer prevalence across all degrees of visual impairment, with the highest prevalence among women who were completely blind. These results suggest that the melatonin hypothesis may not apply outside of animal models, and that lifestyle challenges faced by visually impaired women may increase the risk of developing breast cancer.

    Conclusion

    Further studies should be conducted to draw definitive conclusions, keeping in mind the possibility of a positive correlation between breast cancer and visual impairment, despite the conclusions established by past studies.


  • Early-life exposure to endocrine-disrupting chemicals may fuel food preferences

    Early-life exposure to endocrine-disrupting chemicals may fuel food preferences

    Exposure to endocrine-disrupting chemicals in early life, including during gestation and infancy, results in a higher preference for sugary and fatty foods later in life, according to an animal study being presented Sunday at ENDO 2025, the Endocrine Society’s annual meeting in San Francisco, Calif.

    Endocrine-disrupting chemicals are substances in the environment (air, soil or water supply), food sources, personal care products and manufactured products that interfere with the normal function of the body’s endocrine system. To determine if early-life exposure to these chemicals affects eating behaviors and preferences, researchers from the University of Texas at Austin conducted a study of 15 male and 15 female rats exposed to a common mixture of these chemicals during gestation or infancy.

    “Our research indicates that endocrine-disrupting chemicals can physically alter the brain’s pathways that control reward preference and eating behavior. These results may partially explain increasing rates of obesity around the world,” said Emily N. Hilz, Ph.D., a postdoctoral research fellow at the University of Texas at Austin in Austin, Texas. “Understanding the harmful health impact that exposure to these types of chemicals can have on eating patterns may help inform public health recommendations and personal efforts to improve diet-related health complications.”

    Researchers administered behavioral studies throughout the rats’ lifespans, including into adulthood, to observe preferences for high-fat foods and a sucrose solution. Findings showed that male rats with early-life exposure to endocrine-disrupting chemicals had a temporary preference for the sucrose solution, while female rats showed a strong preference for high-fat food that resulted in weight gain. In addition, testosterone was reduced in exposed males, while estradiol in females remained unchanged.

    During the study, areas of the brain were sequenced to determine if early-life exposure to endocrine-disrupting chemicals resulted in physical changes to the regions important to controlling food intake and responding to reward. Researchers observed changes to gene expression throughout all areas sequenced in male rat brains, and varying changes to gene expression in the region of female rat brains associated with reward. These physical changes were predictive of changes to eating behavior and food preferences.

    “It’s important that people understand that there are negative impacts associated with consuming or being near endocrine-disrupting chemicals early in life. With this knowledge in hand, consumers may want to consider reducing personal interaction with environments, food and other types of products containing these chemicals during pregnancy and early childhood to reduce the risk of developing obesity later in life,” Hilz added.

    This research was supported by the National Institutes of Health’s National Institute of Environmental Health Sciences.


    About Endocrine Society
    Endocrinologists are at the core of solving the most pressing health problems of our time, from diabetes and obesity to infertility, bone health, and hormone-related cancers. The Endocrine Society is the world’s oldest and largest organization of scientists devoted to hormone research and physicians who care for people with hormone-related conditions.

    The Society has more than 18,000 members, including scientists, physicians, educators, nurses, and students in 122 countries. To learn more about the Society and the field of endocrinology, visit our site at www.endocrine.org. Follow us on X (formerly Twitter) at @TheEndoSociety and @EndoMedia.

