Category: 8. Health

  • ‘Shampoo’ could protect against hair loss during chemo

    ‘Shampoo’ could protect against hair loss during chemo



    Researchers have developed a shampoo-like gel that has been studied in animal models and could protect hair from falling out during chemotherapy treatment.

    Chemotherapy-induced alopecia causes personal, social, and professional anxiety for those who experience it.

    Currently, there are few solutions—the only ones that are approved are cold caps worn on the patient’s head, which are expensive and have their own extensive side effects.

    Bryan Smith, an associate professor in the Michigan State University College of Engineering and with MSU’s Institute for Quantitative Health Science and Engineering, has developed a gel the consistency of shampoo that he hopes will help protect patients’ hair throughout treatment.

    When Smith was a trainee at Stanford University, he learned and used an approach that inverts the typical engineering process: objectively identifying and fully characterizing critical clinical needs before attempting to solve them.

    “This unmet need of chemotherapy-induced alopecia appealed to me because it is adjacent to the typical needs in medicine such as better treatments and earlier, more accurate diagnostics for cancer,” Smith says.

    “This is a need on the personal side of cancer care that, as an engineer, I didn’t fully recognize until I began interviewing cancer physicians and former cancer patients about it. Once I understood, it became clear to me that better solutions are very important to many cancer patients’ quality of life.”

    This rigorous process of specifying the need, identifying possible solutions, developing an initial prototype, and refining and testing it led to the development of a gel described in a new paper in Biomaterials Advances.

    The gel is a hydrogel, which absorbs a lot of water and provides long-lasting delivery of drugs to the patient’s scalp. The hydrogel is designed to be applied to the patient’s scalp before the start of chemotherapy and left on their head as long as the chemotherapy drugs are in their system—or until they are ready to easily wash it off.

    During chemotherapy treatment, chemotherapeutic drugs circulate throughout the body. When these drugs reach the blood vessels surrounding the hair follicles on the scalp, they kill or damage the follicles, releasing the hair shaft from the follicle and causing the hair to fall out. The gel, containing the drugs lidocaine and adrenalone, prevents most of the chemotherapy drugs from reaching the hair follicles by restricting blood flow to the scalp. Dramatically reducing the amount of drug that reaches the follicle helps protect the hair and prevent it from falling out.

    To support practical use of this “shampoo,” the gel is designed to be temperature responsive. At body temperature, the gel is thicker and clings to the patient’s hair and scalp surface; when exposed to slightly cooler temperatures, it becomes thinner and more liquid, so it can be easily washed away.

    Smith and his team hope to obtain federal and/or venture funding to move this research forward into clinical trials and, eventually, to human patients.

    “The research has the potential to help many people,” Smith says. “All the individual components are well-established, safe materials, but we can’t move forward with follow-up studies and clinical trials on humans without the support of substantial funding.”

    Source: Michigan State University

    Continue Reading

  • Listen: How trees boost your focus and mental health

    Listen: How trees boost your focus and mental health



    In a new podcast episode, a psychologist explains how trees boost your attention, improve mental health—and even reduce crime.

    University of Chicago psychologist Marc Berman’s research on “soft fascination” and nature’s cognitive effects is reshaping how we think about everything from urban planning to depression treatment.

    From groundbreaking hospital studies to surprising results with plastic plants, Berman’s work uncovers the deep—and often invisible—power that natural environments hold over our minds and bodies.

    Whether you’re a city planner, a parent, or just someone feeling mentally fatigued, the conversation on this episode of the Big Brains podcast may just change the way you think about a walk in the park.

    Source: University of Chicago

    Continue Reading

  • Single Dose of MM120 (LSD) Shows Lasting Anxiety Reduction in GAD: Phase 2b Results

    Single Dose of MM120 (LSD) Shows Lasting Anxiety Reduction in GAD: Phase 2b Results


    A single 100 μg dose of MM120 (lysergide D-tartrate, LSD) significantly reduced anxiety symptoms in adults with generalized anxiety disorder (GAD), with improvements sustained through 12 weeks, according to the first randomized, placebo-controlled trial of its kind published today in JAMA.1

    “This study is a true turning point in the field of psychiatry,” investigator Maurizio Fava, MD, member of the MindMed Scientific Advisory Board and chair of the Mass General Brigham Department of Psychiatry, said in a statement.2 “For the first time, LSD has been studied with modern scientific rigor, and the results are both clinically meaningful and potentially paradigm-shifting for the treatment of GAD.”

    Although an estimated 26 million adults in the US have been diagnosed with GAD, the US Food and Drug Administration (FDA) has not approved a new medication for this indication since 2007.2 Moreover, 50% of patients fail first-line GAD treatments.

    “I have seen firsthand the devastating toll GAD takes on patients and their families, which is why it is so significant that a single dose of MM120 delivered rapid, robust, and lasting effects,” Fava continued.2 “These results highlight the promise of psychedelics in psychiatric medicine.”

    MindMed conducted MMED008, a multicenter, randomized, double-blind, placebo-controlled, phase 2b study evaluating a single administration of MM120 at 4 dose levels (25 μg [n = 39], 50 μg [n = 40], 100 μg [n = 40], or 200 μg [n = 40]) as monotherapy, without any psychotherapeutic intervention, in adults with moderate to severe GAD.1 The sample included 198 adults aged 18–74 years (mean age, 41.3 years); 56.7% were female and 83% were White (7.7% Black or African American and 3.6% Asian). The dose-response relationship was evaluated using the multiple comparison procedure modeling (MCP-Mod) method for change in Hamilton Anxiety Rating Scale (HAM-A) score at week 4.
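
    As a rough illustration of what this kind of dose-response analysis tests for (the trial’s MCP-Mod procedure itself is not reproduced here), the sketch below fits an Emax curve, one of the standard MCP-Mod candidate models, to hypothetical week-4 change scores. The dose levels mirror the study arms; the response values and function names are placeholders, not trial data.

        # Illustrative only: fit an Emax dose-response curve to hypothetical
        # week-4 HAM-A change scores. Not the trial's MCP-Mod analysis.
        import numpy as np
        from scipy.optimize import curve_fit

        doses = np.array([0.0, 25.0, 50.0, 100.0, 200.0])       # ug, mirrors the study arms
        change = np.array([-13.5, -15.0, -16.5, -19.5, -20.5])  # hypothetical placeholder values

        def emax_model(dose, e0, emax, ed50):
            """Emax model: E0 + Emax * dose / (ED50 + dose)."""
            return e0 + emax * dose / (ed50 + dose)

        params, _ = curve_fit(emax_model, doses, change, p0=[-13.0, -8.0, 50.0])
        print(dict(zip(["E0", "Emax", "ED50"], np.round(params, 2))))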

    The study met its primary endpoint, with MM120 demonstrating a dose-response relationship at week 4 with the 100 μg (least-squares mean difference, −5.0 points; 95% confidence interval [CI], −9.6 to −0.4 points) and the 200 μg (−6.0 points; 95% CI, −9.8 to −2.0 points) dose groups vs placebo. The study found no significant difference between the 25 μg (least-squares mean difference, −1.2 points; 95% CI, −6.0 to 3.5 points) and 50 μg (−1.8 points; 95% CI, −7.6 to 4.0 points) doses compared with placebo.1

    The trial also met its key secondary endpoint, demonstrating significant symptom improvement versus placebo on the HAM-A. The analysis identified 100 μg as the optimal dose of MM120: at week 4, this dose achieved a 7.6-point greater reduction in HAM-A score than placebo (−21.3 vs −13.7; P <.0004). Participants receiving MM120 100 μg had a 65% clinical response rate and a 48% clinical remission rate sustained through week 12.1

    This study also met other secondary outcomes, including clinician-rated disease severity measured by the Clinical Global Impression-Severity (CGI-S) scale, changes from baseline in the Montgomery-Åsberg Depression Rating Scale (MADRS), and measures of functional disability and quality of life.

    On average, CGI-S scores in the 100 μg dose group improved from 4.8 to 2.2, reflecting a 2-category shift from “markedly ill” to “borderline ill” at week 12, compared with an improvement from 4.9 to 3.5 in the placebo group (P =.003).1

    Furthermore, MM120 showed rapid clinical activity, with effects evident as early as day 2 and sustained through weeks 4 and 12. MM120 100 μg also led to significant MADRS improvement vs placebo, with a difference of 5.7 points at week 4 (P ≤.05) and a difference of 6.4 points at week 12 (P ≤.05).1

    The safety profile of MM120 was consistent with previous findings, with mild-to-moderate adverse events occurring on dosing day. Common adverse events included visual perceptual changes, such as illusions, pseudo-hallucinations, and visual hallucinations.1 These visual perceptual changes occurred in 46.2% of participants who received 25 μg of MM120, 75% who received 50 μg, 92.5% who received 100 μg, 100% who received 200 μg, and 10.3% who received placebo.

    Another common adverse event was nausea, occurring in 7.7% of participants receiving 25 μg, 27.5% receiving 50 μg, 40% receiving 100 μg, 60% receiving 200 μg, and 7.7% receiving placebo.1 Headaches occurred in 12.8%, 22.5%, 35.0%, 27.5%, and 23.1% of participants, respectively.

    The dose-response results inform MindMed’s ongoing phase 3 trial focusing on the MM120 Orally Disintegrating Tablet (ODT). Topline results for MindMed’s Voyage trial, evaluating the efficacy, durability, and safety of MM120 ODT for GAD, are expected in the first half of 2026.2

    “Our Phase 2b results—marking the first well-controlled clinical study to evaluate dose-response relationships of LSD in a psychiatric population—demonstrate the meaningful impact of a single 100 μg dose of MM120 in significantly reducing anxiety symptoms,” said Daniel Karlin, MD, chief medical officer of MindMed.2

    References

    1. Robison R, Barrow R, Conant C, et al. Single Treatment with MM120 (Lysergide) in Generalized Anxiety Disorder. JAMA. Published online September 4, 2025. doi:10.1001/jama.2025.13481
    2. Journal of the American Medical Association (JAMA) Publishes Results from First-Ever Randomized, Placebo-Controlled Clinical Trial Assessing the Dose-Dependent Efficacy of MM120 (Lysergide D-Tartrate, LSD) in Generalized Anxiety Disorder (GAD). MindMed. Accessed September 4, 2025.

    Continue Reading

  • Journal of the American Medical Association (JAMA) Publishes Results from First-Ever Randomized, Placebo-Controlled Clinical Trial Assessing the Dose-Dependent Efficacy of MM120 (Lysergide D-Tartrate, LSD) in Generalized Anxiety Disorder (GAD) – Business Wire

    1. Journal of the American Medical Association (JAMA) Publishes Results from First-Ever Randomized, Placebo-Controlled Clinical Trial Assessing the Dose-Dependent Efficacy of MM120 (Lysergide D-Tartrate, LSD) in Generalized Anxiety Disorder (GAD)  Business Wire
    2. A single dose of LSD seems to reduce anxiety  New Scientist
    3. Researchers pinpoint LSD dose that could keep anxiety at bay for weeks  Yahoo News Canada
    4. LSD shows promise for reducing anxiety in drugmaker’s midstage study  The Independent

    Continue Reading

  • New Gel May Help Prevent Hair Loss During Chemotherapy, Early Research Finds | Health

    New Gel May Help Prevent Hair Loss During Chemotherapy, Early Research Finds | Health

    Continue Reading

  • Thermographic Assessment of Lyme Borreliosis Without Erythema Migrans

    Thermographic Assessment of Lyme Borreliosis Without Erythema Migrans

    Introduction

    Lyme disease (LD) is the most common vector-borne infection in the Northern Hemisphere, leading to substantial economic and medical burdens.1 In the USA alone, insurance claims for clinician-diagnosed LD rose from approximately 329,000 in 2010 to 476,000 in 2018.2,3 If left untreated, LD can result in serious complications affecting the skin, joints, nervous system, and heart.4,5 Although most patients recover fully after antibiotic therapy, a subset experiences persistent symptoms lasting months or even years. This post-treatment manifestation, which remains poorly understood, is the focus of ongoing research due to its clinical and economic implications. Annual medical costs in the United States are estimated at $712 million to $1.3 billion.6–8

    Diagnosing LD can be challenging in the absence of erythema migrans, which is considered pathognomonic for the infection. While erythema migrans appears in the majority of cases, its absence has been reported in both US and European studies. In the United States, 13–18% of early LD cases lacked erythema migrans despite systemic symptoms or laboratory confirmation.9,10 Population-based data from Germany indicate that approximately 11% of cases lack classic erythema migrans.11 In such cases, diagnosis relies on two-tiered serological testing, which detects specific antibodies to Borrelia. However, this approach has limitations: IgM appears in only 20–50% of acute cases and peaks 4–6 weeks after onset. Early testing may yield false negatives due to the serology window, immune suppression, or antigenic variability. False positives also occur in various infectious and autoimmune diseases.12,13

    To enhance diagnostic accuracy in patients without erythema migrans, we supplemented serological testing with thermography – a non-invasive technique that records infrared radiation from the body. Thermography detects pathological changes through alterations in thermal distribution, primarily reflecting vascular responses.14 This method has been explored in diverse medical fields, including oncology, surgery, neurology, rheumatology, dermatology, and infectious diseases.15–17 A recently published study18 demonstrated the use of infrared thermography to visualize erythema migrans in patients with early Lyme borreliosis, showing localized hyperthermia and distinct thermal patterns in visible lesions. While that work confirmed the feasibility of thermography in clinically apparent erythema migrans, it did not address cases without visible skin changes.

    The aim of the present study is to explore whether infrared thermography can also detect subclinical cutaneous inflammation at the site of a tick bite in patients lacking erythema migrans.

    Patients and Methods

    We observed 16 patients with Lyme disease without erythema migrans who received outpatient or inpatient care at the Infectious Disease Department of the Ternopil City Communal Hospital of Emergency Medicine. The diagnosis was based on clinical history (notably, tick bites confirmed by visits to the hospital’s trauma center for tick removal) and laboratory findings (positive two-tiered Lyme disease serologic testing results)19 in serum samples taken 10–20 days after the bite. At the time of sampling, hyperemia at the site of the bite was typically resolved, and no visible erythema migrans was present. Antibodies to B. burgdorferi sensu lato (s.l.) complex antigens in the blood serum of patients were detected using enzyme-linked immunosorbent assay (ELISA) with test systems from Euroimmun AG (Germany): IgM antibodies were determined using the Anti-Borrelia burgdorferi ELISA (IgM) test, and IgG antibodies using the Anti-Borrelia plus VlsE ELISA (IgG) test.

    A control group consisted of 22 individuals who had experienced a tick bite but tested seronegative for Borrelia burgdorferi (serum IgM and/or IgG antibodies negative on samples taken 10–20 days after the bite). These controls were representative of the patient group in terms of age and sex distribution.

    All subjects completed a questionnaire that included personal information, as well as history details about the time, place, and circumstances of the tick bite, along with any prior treatment. Participation in the study was based on voluntary consent.

    Thermographic Assessment

    It was hypothesized that Borrelia, after penetrating the skin during a tick bite, causes minimal local inflammation. Even in the absence of visible hyperemia, this inflammation may still be associated with a localized increase in body temperature, or “warming”, which can be detected by infrared thermography. Initial thermography was performed at the same visit when the first serology sample was taken (10–20 days after the bite), before results were available, to assess its potential as an early detection method.

    Preparation for thermography and the examination itself followed the guidelines provided by the manufacturer, ULIRvision (China).20 Patients refrained from physiotherapy, massage, vasoactive drugs, and topical preparations for at least 24 hours, and from smoking or eating for 40–60 minutes before the examination. The room temperature was maintained at 18–22 °C, with a 10–15 minute acclimatization period. The camera was positioned 100–150 cm from the site of interest. The focus was on the site of tick attachment and surrounding tissues. For paired areas (eg, limbs or lateral sides of the torso), the symmetrical part of the body was also examined.

    Thermal images were initially analyzed visually by comparing color patterns to those of surrounding areas. In cases where thermal asymmetry was observed, the thermograms were described using the following criteria: presence of asymmetry, localization of areas with increased or decreased infrared radiation intensity, absolute temperature values, and differences relative to the symmetrical area. Subsequently, the images were processed using IRSee software, which automatically recorded temperature values at all points within the image. For greater precision, thermographs and histograms were used to present temperature distributions in the region of interest, as well as in specific points. When analyzing the area of interest, the shape (focal or diffuse), uniformity (homogeneous or heterogeneous), and contour clarity (clear or fuzzy) of the thermally active region were considered. The system can simultaneously process up to 20 point objects, 10 lines, 20 fields, and 10 polygonal or elliptical shapes. To compare symmetrical areas or the thermographically detected warm focus with surrounding tissues, points, horizontal lines, and oval fields were primarily used. The temperature at marked points was recorded, enabling the calculation of the temperature difference (ΔT) along the line of interest. A temperature difference (ΔT) of more than 0.5 °C between symmetrical areas is considered indicative of localized inflammatory activity, as described in previous studies.15,21,22 Extreme values (maximum and minimum temperatures) were automatically indicated, and an oval field enabled the construction of a histogram that visually represented the temperature distribution from minimum to maximum. Processed thermograms were stored in an electronic archive for future comparison during patient follow-up. All study subjects, both patients and controls, underwent repeat examination 3–4 months after the initial visit.
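
    To make the asymmetry criterion concrete, the sketch below shows one way the ΔT comparison could be computed from a thermal image array. It is an illustrative approximation, not the IRSee workflow used in the study, and the array and mask names are hypothetical.

        # Illustrative sketch: mean-temperature difference (dT) between a region
        # of interest around the bite site and a symmetrical reference region,
        # flagged against the 0.5 degC threshold cited in the text.
        import numpy as np

        def roi_mean(thermal_img: np.ndarray, mask: np.ndarray) -> float:
            """Mean temperature (degC) over the pixels selected by a boolean mask."""
            return float(thermal_img[mask].mean())

        def delta_t(thermal_img: np.ndarray, bite_mask: np.ndarray,
                    reference_mask: np.ndarray) -> float:
            """dT = mean(bite-site ROI) - mean(symmetrical reference ROI)."""
            return roi_mean(thermal_img, bite_mask) - roi_mean(thermal_img, reference_mask)

        def suggests_local_inflammation(dt: float, threshold: float = 0.5) -> bool:
            """dT > 0.5 degC between symmetrical areas is the criterion used above."""
            return dt > threshold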

    Results

    The patients ranged in age from 20 to 62 years, with an average age of 36.8 ± 3.4 years. Among the participants, 7 (43.8%) were men and 9 (56.2%) were women. Thermography of the affected areas in all 16 patients with the non-erythematous form of Lyme borreliosis (LB) revealed hyperthermia (a marker of local skin inflammation) around the tick bite site in the form of a ring-shaped zone of higher temperature (ΔT = 0.6–3.8°C) resembling erythema migrans. This annular thermal pattern persisted for at least two weeks, even after antibacterial therapy. In approximately one-third of the patients, ΔT ranged from 0.6 to 1.1°C; in half, it ranged from 1.2 to 1.6°C; and in 18.7% of cases, ΔT exceeded 1.6°C (Table 1).

    Table 1 Results of Thermal Imaging Examination of Tick Bite Sites in Patients with Non-Erythematous Form of LB (n=16)

    In 17 of the 22 seronegative controls, a primary hyperemia at the site of tick attachment was observed but resolved spontaneously or after desensitizing therapy within 3–7 days. The remaining five controls did not develop hyperemia at the bite site. Thermographic examination performed 10–20 days after the bite revealed no abnormal localized warming in any control subject. A follow-up thermographic assessment conducted 3–4 months after the initial visit likewise revealed no local hyperthermia or pathological thermal asymmetry in either the patient or control groups. The transient hyperemia observed in 17 control individuals is consistent with a typical short-lived allergic reaction to the bite rather than infection-related inflammation.

    Illustrative Case Study

    A 36-year-old man presented with a history of a tick bite; the tick had been removed at the trauma center of the Ternopil City Municipal Emergency Hospital. The tick measured over 6 mm, indicating prolonged attachment (more than 3 days). At the site of the bite (Figure 1A), a hyperemic spot approximately 5 mm in size was observed, which appeared to be an allergic reaction to the tick bite. No erythema migrans was noted.

    Figure 1 Primary lesion at the site of a tick bite (A) and the corresponding hot spot (automatically marked with a cross) on the thermogram of the affected leg (B) of the patient with Lyme borreliosis, where erythema migrans is absent, but localized inflammation is detected through thermography (suberythematous form).

    Thermography during this period revealed significant hyperthermia (ΔT = 1.1°C) at the site of the bite, coinciding with the hyperemic spot’s outline (Figures 1B and 2). No antibiotics were administered at this stage, and serological tests (ELISA) did not initially confirm the presence of specific antibodies to Borrelia burgdorferi s.l. However, a follow-up serological examination after 30 days, using both ELISA and immunoblot, showed seroconversion of IgM and IgG antibodies against B. burgdorferi s.l. This, combined with the unaltered appearance of the skin on the lower leg (Figure 3A), led to a diagnosis of the cutaneous, non-erythematous form of Lyme borreliosis.

    Figure 2 Analysis of the thermogram of the patient using IRSee Software. Upper left corner (A) Infrared image with line L1 (indicating the warming epicenter) and the zone of interest E1 (surrounding area) encircled by an ellipse; markings L1 and E1 represent measurement points as displayed in the software during analysis. Center upper section (B) Tick bite site on the left shin showing a primary lesion without erythema migrans. Upper right corner (C) Histogram of temperatures within zone E1 (surrounding tissue near the bite site), ranging from 29.1 to 30.6 °C, shown as relative frequency (%). Lower left corner (D) Display of the obtained indicators. Center lower section (E) Temperature palette corresponding to the different temperature values. Lower right corner (F) Temperature plot along line L1, showing temperatures from 29.5 to 30.6 °C.

    Figure 3 (A) Absence of any visible changes in the skin of the left shin of the patient 30 days after the tick bite; (B) infrared thermogram of the patient: despite the absence of visible erythema migrans, the annular hyperthermia is visible, corresponding to the pathognomonic erythema migrans rash, not detectable to the unaided eye.

    Subsequent thermographic analysis of the left lower leg revealed local rounded hyperthermia in the form of a concentric circle with a diameter of up to 7 cm (ΔT = 0.8°C, Figures 3B and 4). Fourteen days after the completion of doxycycline hydrochloride therapy (200 mg per day, divided into two 100 mg doses for 2 weeks), the local hyperthermia at the tick bite site resolved, as confirmed by a follow-up thermogram (Figure 5). Repeat thermographic assessment performed 34 days later (64 days after the tick bite) showed a normal thermal distribution without any signs of abnormal thermal asymmetry.

    Figure 4 Thermogram of the left shin of the patient depicting the ring-shaped infrared glow, a thermographic pattern in which a distinct ring of increased temperature surrounds the area of inflammation, with a moderate temperature difference (∆T=0.8 °C), despite the absence of visible erythema. Upper left corner (A) Infrared image of the same area as in Figure 2A (line L1 and the zone of interest E1 within the ellipse) 30 days after the bite. Center upper section (B) Tick bite site on the left shin with no visible changes 30 days after the bite. Upper right corner (C) Temperature distribution in zone E1 (surrounding tissue near the bite site), ranging from 29.3 to 31.2 °C, shown as relative frequency (%). Lower left corner (D) Display of the obtained indicators. Center lower section (E) Temperature palette corresponding to the different temperature values. Lower right corner (F) Temperature plot along line L1, ranging from 29.3 to 30.1 °C.

    Figure 5 Analysis of the thermogram of the left shin of the same patient, 2 weeks after the completion of antibiotic therapy. Upper left corner (A) Infrared image of the same zone as in Figure 2A after completion of therapy. Center upper section (B) Tick bite site on the left shin with unchanged appearance after therapy. Upper right corner (C) Temperature distribution in zone E1 (surrounding tissue near the bite site), ranging from 31.0 to 31.8 °C, shown as relative frequency (%). Lower left corner (D) Display of the obtained indicators. Center lower section (E) Temperature palette corresponding to the different temperature values. Lower right corner (F) Temperature plot along line L1, ranging from 31.4 to 31.7 °C.

    Discussion

    To our knowledge, this is the first study to investigate the diagnostic utility of infrared thermography in patients with Lyme borreliosis (LB) who do not exhibit the pathognomonic sign of erythema migrans, building on our prior preliminary report22 that proposed this method for forms of the disease lacking visible erythema. In such cases, the absence of cutaneous manifestations significantly complicates timely diagnosis, particularly during the early phase of infection, when serological tests may still yield negative results. In this study, we employed thermography as a non-invasive, radiation-free technique for detecting localized increases in skin temperature suggestive of inflammation. From a practical standpoint, the method is simple, requiring only brief patient acclimatization, basic environmental control, and standard camera positioning. These steps are easily reproducible and demand minimal additional resources. The procedure can be carried out in standard outpatient settings without the need for specialized infrastructure beyond the imaging device itself. In addition to the aforementioned advantages, this method, unlike serological testing, does not depend on the development of a detectable antibody response, making it potentially useful in the very early stages of infection. Our findings suggest that infrared thermography can serve as a practical, noninvasive adjunct to serological testing in suspected Lyme borreliosis, particularly in cases without visible erythema migrans. In addition to the patient cohort, we also evaluated a control group of seronegative individuals with documented tick bites; none demonstrated abnormal localized warming on thermography at either 10–20 days or 3–4 months post-bite, which is consistent with the absence of infection-related inflammation in this group.

    The physiological basis for thermographic findings lies in the vascular and metabolic changes associated with inflammation. Skin temperature reflects the underlying dynamics of blood flow, tissue metabolism, and immune activity, with vascular factors being the primary determinant of thermal asymmetry.15–18

    Our findings align with the known pathophysiology of LB. In typical cases, the hematogenous or lymphogenous dissemination of Borrelia from the site of inoculation leads to the development of erythema migrans, mediated by inflammatory cytokines such as TNF-α, IL-1, and IL-6.20,22 However, the absence of visible erythema in some patients may reflect either a lower inoculum of spirochetes, individual variability in immune response, or its poor visibility on darker skin tones. Nonetheless, the local tissue still mounts a subclinical inflammatory reaction – one that includes warmth (calor) as a cardinal sign of inflammation.21–24 Thermography appears to capture this subvisible inflammatory process with high sensitivity, as evidenced by the concentric hyperthermic zones we observed around the tick bite sites, even in the absence of clinically evident erythema.

    In this context, we propose the descriptive term “suberythematous form” for cases without visible erythema migrans but with thermographically detected annular hyperthermia. While not a formally recognized subtype, this presentation may benefit from early detection via infrared thermography, which could serve as a valuable adjunct to serological testing. Infrared thermography is already well established in other fields (eg, oncology, neurology, angiology, and dermatology), and its application to infectious diseases has been developed by several groups.15–18 Our results complement recent works applying infrared thermography to visible erythema migrans lesions, including both single and disseminated forms.18,25 In contrast, the present study demonstrates that infrared thermography can also reveal localized hyperthermia in the absence of visible erythema, which we define as the suberythematous form of LB. This finding supports emerging evidence that infrared thermography can detect inflammatory changes associated with Borrelia burgdorferi infection even when erythema migrans is not clinically apparent. Importantly, such imaging may be valuable not only when the rash is absent, but also when it is subtle, difficult to discern in visible light, or less apparent in patients with darker skin pigmentation.18 Given its non-invasiveness, lack of contraindications, and ability to detect otherwise occult inflammation, infrared thermography has potential as a complementary diagnostic tool alongside serology and clinical evaluation in suspected Lyme borreliosis. We did not perform tick testing because the presence of Borrelia burgdorferi in a tick does not guarantee transmission, which typically requires ≥36–48 hours of attachment,26 and negative results cannot reliably exclude infection, as cases of LD have been documented after bites from PCR-negative ticks.27 Future studies with larger cohorts and standardized thermographic criteria are needed to refine the diagnostic utility of infrared thermography and define its role within clinical guidelines.

    Conclusions

    1. Infrared thermography of the tick bite site can visualize the pathognomonic sign of Lyme borreliosis, the annular erythema migrans rash, even when it is not detectable on visual inspection.
    2. In patients with Lyme borreliosis who lack visible erythema migrans, thermographic imaging enables the detection of a temperature difference greater than 0.5 °C between the site of the tick bite and adjacent or symmetrical areas of the body.
    3. Early thermographic identification of the suberythematous cutaneous form of Lyme borreliosis supports the initiation of etiotropic therapy to prevent the development of long-term complications.

    Data Sharing Statement

    The data supporting the findings of this study are available from the corresponding author upon reasonable request.

    Ethics Approval and Consent to Participate

    This study was approved by the Institutional Bioethics Committee of Ivan Horbachevsky Ternopil National Medical University (protocol №. 81 dated April 3, 2025). All patients gave informed consent for participation in the study. Informed consent for publication of the case details and accompanying images was obtained from the patient described in the illustrative case study. The study was conducted in accordance with the ethical standards of the Declaration of Helsinki and its subsequent amendments.

    Author Contributions

    All authors made a significant contribution to the work reported, including conception, study design, execution, acquisition of data, analysis and interpretation, or in all these areas; took part in drafting, revising or critically reviewing the manuscript; gave final approval of the version to be published; have agreed on the journal to which the article has been submitted; and agree to be accountable for all aspects of the work.

    Funding

    This study was not supported by any external funds.

    Disclosure

    The authors declare that they have no competing interests in this work.

    References

    1. Gocko X, Lenormand C, Lemogne C, et al. Lyme borreliosis and other tick-borne diseases. Guidelines from the French scientific societies. Med Mal Infect. 2019;49(5):296–317. doi:10.1016/j.medmal.2019.05.006

    2. Kugeler KJ, Schwartz AM, Delorey MJ, Mead PS, Hinckley AF. Estimating the frequency of Lyme disease diagnoses, United States, 2010-2018. Emerg Infect Dis. 2021;27(2):616–619. doi:10.3201/eid2702.202731

    3. Nelson CA, Saha S, Kugeler KJ, et al. Incidence of clinician-diagnosed Lyme disease, United States, 2005-2010. Emerg Infect Dis. 2015;21(9):1625–1631. doi:10.3201/eid2109.150417

    4. Steere AC, Strle F, Wormser GP, et al. Lyme borreliosis. Nat Rev Dis Primers. 2016;2(1):16090. doi:10.1038/nrdp.2016.90

    5. Smiyan S, Komorovsky R, Koshak B, Duve K, Shkrobot S. Central nervous system manifestations in rheumatic diseases. Rheumatol Int. 2024;44(10):1803–1812. doi:10.1007/s00296-024-05679-1

    6. Adrion ER, Aucott J, Lemke KW, Weiner JP. Health care costs, utilization and patterns of care following Lyme disease. PLoS One. 2015;10(2):e0116767. doi:10.1371/journal.pone.0116767

    7. Bobe JR, Jutras BL, Horn EJ, et al. Recent progress in Lyme disease and remaining challenges. Front Med. 2021;8:666554. doi:10.3389/fmed.2021.666554

    8. Tick-Borne Disease Working Group. Report to Congress 2018. HHS; 2018. Available from: https://www.hhs.gov/sites/default/files/tbdwg-report-to-congress-2018.pdf. Accessed March 16, 2025.

    9. Aucott J, Morrison C, Munoz B, Rowe PC, Schwarzwalder A, West SK. Diagnostic challenges of early Lyme disease: lessons from a community case series. BMC Infect Dis. 2009;9(1):79. doi:10.1186/1471-2334-9-79

    10. Steere AC, Sikand VK. The presenting manifestations of Lyme disease and the outcomes of treatment. N Engl J Med. 2003;348(24):2472–2474. doi:10.1056/NEJM200306123482423

    11. Huppertz HI, Böhme M, Standaert SM, Karch H, Plotkin SA. Incidence of Lyme borreliosis in the Würzburg region of Germany. Eur J Clin Microbiol Infect Dis. 1999;18(10):697–703. doi:10.1007/s100960050381

    12. Stanek G, Strle F. The history, epidemiology, clinical manifestations and treatment of Lyme borreliosis. In: Hunfeld KP, Gray J, editors. Lyme Borreliosis. Cham: Springer; 2022:77–105.

    13. Jakubowska K, Janocha A, Jerzak A, Ziemba P. Diagnosis, clinical manifestations and treatment of Lyme disease. Qual Sport. 2024;18:53283. doi:10.12775/QS.2024.18.53283

    14. Choi MS, Seong GH, Park MJ, et al. Rapidly progressing generalized morphea with high Lyme disease titer. Indian J Dermatol. 2020;65(5):432–434. doi:10.4103/ijd.IJD_279_18

    15. Andreychyn MA, Kopcha Yu V. Remote thermography and its significance for the diagnosis of acute tonsillitis. Infektsiyni khvoroby. 2016;3(85):82–88. Ukrainian. doi:10.11603/1681-2727.2016.3.6897

    16. Liu Q, Li M, Wang W, et al. Infrared thermography in clinical practice: a literature review. Eur J Med Res. 2025;30(1):33. doi:10.1186/s40001-025-02278-z

    17. Kesztyüs D, Brucher S, Wilson C, Kesztyüs T. Use of infrared thermography in medical diagnosis, screening, and disease monitoring: a scoping review. Medicina. 2023;59(12):2139. doi:10.3390/medicina59122139

    18. Brothers NI, Rebman A, Zenilman JM, et al. Use of infrared thermography in visualizing erythema migrans. Cureus. 2025;17(8):e89242. doi:10.7759/cureus.89242

    19. U.S. Centers for Disease Control and Prevention. Standard two-tiered testing: suggested result reporting and interpretation. Available from: https://www.cdc.gov/lyme/media/pdfs/2024/05/Standard_Two_Tiered_Testing_Suggested_Results_Reporting_Interpretation.pdf. Accessed May 12, 2025.

    20. Thermal imager Xintest HT-18. Available from: https://brom.ua/index.php?route=product/product&product_id=18190. Accessed May 12, 2025.

    21. Andreychyn MA, Korda MM. [Lyme Borreliosis: Monograph]. Ternopil: TNMU; 2021:376. Ukrainian.

    22. Andreychyn MA, Korda MM, Kopcha VS, Shkilna MI. Method of diagnosis of non-erythematous Lyme disease. Infektsiyni khvoroby. 2020;1(99):16–19. Ukrainian. doi:10.11603/1681-2727.2020.1.11092

    23. Nielsen MC, Miller NS. Epidemiology and diagnosis of Lyme disease in the United States. Clin Lab Med. 2025;45(1):137–144. doi:10.1016/j.cll.2024.10.008

    24. Strle F, Wormser GP. Early Lyme disease (erythema migrans) and its mimics (southern tick-associated rash illness and tick-associated rash illness). Infect Dis Clin North Am. 2022;36(3):523–539. doi:10.1016/j.idc.2022.03.005

    25. Huk MT, Andreychyn MA, Shkilna MI, Zaporozhan SY. Thermographic study of migrating erythema. Sci Bul Uzhhorod Univ Series. 2021;63(1):43–48.

    26. Cook M. Lyme borreliosis: a review of data on transmission time after tick attachment. Int J Gen Med. 2015;8:1–8. doi:10.2147/IJGM.S73791

    27. Hofhuis A, van de Kassteele J, Sprong H, et al. Predicting the risk of Lyme borreliosis after a tick bite, using a structural equation model. PLoS One. 2017;12(7):e0181807. doi:10.1371/journal.pone.0181807

    Continue Reading

  • Relationship between sodium, calcium, magnesium, phosphorus levels and

    Relationship between sodium, calcium, magnesium, phosphorus levels and

    Introduction

    Hypertension stands as a significant chronic disease globally, with high rates of incidence and prevalence, and is also a frequently encountered cardiovascular syndrome in China.1 It is estimated that in 2010, 31.1% (1.39 billion) of the global adult population suffered from hypertension.2 Between 1990 and 2019, the number of hypertensive patients aged 30 to 79 worldwide has doubled,3 and the prevalence of hypertension among adults in China has reached 27.5%.4 The circadian rhythm of blood pressure, a common rhythm of the cardiovascular system, plays a pivotal role in the progression of hypertension.5 The normal circadian rhythm of blood pressure is characterized by a dipper pattern, where the systolic and diastolic blood pressure at night decreases by 10% to 20% compared to the daytime. A decrease of less than 10% is indicative of a non-dipper or reverse dipper pattern.6 Non-dipper and reverse dipper patterns are closely associated with a decline in cardiac and renal function and an increased risk of cardiovascular mortality.7,8

    The circadian rhythm of blood pressure in hypertensive patients may be influenced by the status of several electrolytes, including sodium, calcium, magnesium, and phosphorus. Dietary sodium intake has been shown to modulate nocturnal blood pressure decline in individuals with salt-sensitive essential hypertension.9 Hypomagnesemia attenuates the magnitude of nocturnal blood pressure dipping and promotes a non-dipper pattern, possibly via enhanced nocturnal sympathetic activity and disruption of the renin–angiotensin–aldosterone system rhythm.10 Dysregulated calcium–phosphorus homeostasis may also indirectly disturb blood pressure rhythmicity by impairing vascular function.11 Nevertheless, data directly linking circulating levels of sodium, calcium, magnesium, and phosphorus to the circadian blood pressure profile in hypertensive patients remain scarce. These ions are common elements in human body fluids and are intimately connected with human water and sodium metabolism, neuroendocrine functions, and fluid balance. They can partially reflect the metabolic status of the body, and clinical data are relatively easy to obtain. Therefore, this study aims to collect comprehensive clinical data from hypertensive patients and to analyze the associations between circulating and urinary concentrations of sodium, calcium, magnesium, and phosphorus and the dipper pattern of blood pressure. By doing so, we seek to determine whether a threshold level of these electrolytes in body fluid triggers an alteration in the circadian rhythm of blood pressure.

    Materials and Methods

    Study Subjects and Ethical Approval

    This was a single-center, cross-sectional study. Patients with hypertension admitted to the Department of Cardiology at Longhua Hospital Affiliated to Shanghai University of Traditional Chinese Medicine from January 2021 to December 2023 were selected for this study. The inclusion criterion was a diagnosis of hypertension according to the 2018 Chinese guidelines for the management of hypertension.12 Exclusion criteria were secondary hypertension, severe renal failure, and incomplete clinical data; secondary hypertension and severe renal failure were deliberately excluded because both conditions may substantially perturb electrolyte homeostasis. This study was conducted in accordance with the Declaration of Helsinki and was approved by the Ethics Committee of Longhua Hospital Affiliated to Shanghai University of Traditional Chinese Medicine (approval number: 2020LCSY044). All patients voluntarily participated in the study and provided written informed consent. All enrolled patients maintained their usual diet. Based on recent national estimates, the prevalence of reverse-dipper hypertension in China is approximately 49%.13 Using an absolute precision of ±5%, a two-sided α of 0.05, and allowing for 10% anticipated data loss, the calculated sample size was 426 participants. After excluding cases with incomplete data, a total of 419 patients were ultimately included in the final analysis.

    Study Methods

    General Data Collection

    Comprehensive clinical data were collected for each patient, including gender, age, hypertension duration, and the presence or absence of comorbidities such as diabetes mellitus, myocardial infarction, and cerebral infarction. Additionally, detailed records of antihypertensive medications taken by the patients were maintained, encompassing angiotensin receptor-neprilysin inhibitors (ARNI), angiotensin-converting enzyme inhibitors (ACEI), angiotensin receptor blockers (ARB), calcium channel blockers (CCB), diuretics (including thiazide and loop diuretics), mineralocorticoid receptor antagonists (MRA), beta-blockers, alpha-blockers, and clonidine. In cases where patients were using single-pill combinations, the medications were categorized based on their constituent components.

    24-Hour Ambulatory Blood Pressure Monitoring and Grouping

    24-hour ambulatory blood pressure monitoring was conducted by a dedicated nurse using the TM-2430 ambulatory blood pressure monitor. The cuff was secured on the patient’s left upper arm, with the lower edge of the cuff positioned 2 to 3 cm above the elbow crease. The device was programmed to automatically inflate and measure blood pressure at 20-minute intervals during the day (from 8:00 to 20:00) and at 30-minute intervals at night (from 20:00 to 08:00 the following day). The nocturnal blood pressure fall rate was calculated using the formula: (daytime average blood pressure − nighttime average blood pressure) / daytime average blood pressure × 100%, in accordance with the 2020 Chinese Hypertension League Guidelines on Ambulatory Blood Pressure Monitoring.6 Patients with a nocturnal blood pressure fall rate of less than 10% were categorized into the non-dipper/reverse dipper group, while those with a rate of 10% or more were classified into the dipper/over-dipper group. In cases where systolic and diastolic pressures did not align, the systolic pressure was taken as the reference standard.14 Controlled hypertension was defined as a 24-hour ambulatory systolic/diastolic blood pressure <130/80 mmHg.15 For each participant, data were collected continuously over a 24-hour period, beginning at 08:00 on day 1 and ending at 08:00 on the following day.
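
    For clarity, the grouping rule above can be expressed as a short calculation. The sketch below is illustrative only (function names are ours, not the authors’); it also shows the conventional dipper/over-dipper split at 20%, although the study pooled those two categories into one group.

        # Illustrative sketch: nocturnal fall rate and dipping-pattern categories.
        def nocturnal_fall_rate(day_avg: float, night_avg: float) -> float:
            """(Daytime average - nighttime average) / daytime average * 100%."""
            return (day_avg - night_avg) / day_avg * 100.0

        def dipping_pattern(fall_rate_pct: float) -> str:
            """Conventional categories; the study pooled dipper/over-dipper vs the rest."""
            if fall_rate_pct < 0:
                return "reverse dipper"
            if fall_rate_pct < 10:
                return "non-dipper"
            if fall_rate_pct <= 20:
                return "dipper"
            return "over-dipper"

        # Example: daytime mean SBP 135 mmHg, nighttime mean SBP 128 mmHg -> ~5.2%, "non-dipper"
        print(dipping_pattern(nocturnal_fall_rate(135, 128)))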

    Serum and 24-Hour Urine Testing

    Following a 12-hour overnight fast, venous blood samples were collected from patients at 7:00 AM the next day to measure serum concentrations of sodium, calcium, and magnesium ions. Additionally, patients were instructed to begin collecting a 24-hour urine sample starting at 6:00 AM the day after admission to assess the levels of sodium, calcium, magnesium, and phosphate ions in the urine. Venous blood samples were collected by dedicated nurses. Patients self-collected 24-hour urine into standardized containers provided by the hospital; the next day, dedicated nurses delivered these containers to the hospital laboratory for analysis. For each participant, data were collected continuously over a 36-hour interval, commencing at 19:00 on day 1 and concluding at 07:00 on day 3.

    Statistical Methods

    Statistical analysis was performed using SPSS 20.0 and R 4.2.2 software. For normally distributed continuous variables, the mean ± standard deviation was used for description, and the independent samples t-test was applied for group comparisons. For continuous variables not conforming to a normal distribution, the median (interquartile range) [M (QL~QU)] was used, and the Mann–Whitney U-test was employed for group comparisons. Variables that conformed to a normal distribution were age, systolic and diastolic blood pressures, the 24-hour mean systolic and diastolic blood pressures, the daytime and nighttime mean systolic and diastolic blood pressures, and serum electrolyte concentrations. Those exhibiting skewed distributions were hypertension duration, nocturnal systolic and diastolic blood pressure decline rates, and 24-hour urinary electrolyte excretion. Categorical data were described using frequencies and analyzed with the chi-square test. Univariate logistic regression analysis was conducted to assess the relationship between age, gender, diabetes, controlled hypertension, serum sodium, calcium, and magnesium concentrations, and 24-hour urine sodium, calcium, magnesium, and phosphate levels with the dipper/over-dipper group. Variables with statistically significant results in the univariate analysis were then subjected to multivariate logistic regression analysis (coding: dipper/over-dipper group = 1, non-dipper/reverse dipper group = 0). Considering the potential nonlinear relationships among these factors, we then used restricted cubic spline (RCS) analysis with four knots to examine how age, sex, diabetes, serum sodium, calcium and magnesium concentrations, as well as 24-hour urinary sodium, calcium, magnesium and phosphate excretion, are associated with (1) dipper/over-dipper status and (2) the magnitude of nocturnal systolic and diastolic blood pressure decline. The significance level was set at P < 0.05.
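
    The modelling steps can be sketched as follows. This is an approximation in Python rather than the authors’ SPSS/R workflow; the dataset and column names (dipper, age, alpha_blocker, serum_ca, urine_ca_24h) are hypothetical, and patsy’s natural cubic spline basis with 3 degrees of freedom stands in for a 4-knot restricted cubic spline.

        # Illustrative sketch: multivariate logistic regression and a spline term.
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("abpm_electrolytes.csv")   # hypothetical dataset

        # Multivariate logistic regression on variables significant in the
        # univariate screen (dipper/over-dipper = 1, non-dipper/reverse dipper = 0).
        multi = smf.logit("dipper ~ age + alpha_blocker + serum_ca + urine_ca_24h",
                          data=df).fit()
        print(multi.summary())

        # Flexible (spline) association between serum calcium and dipping status,
        # using patsy's cr() natural cubic spline basis (df=3) as an RCS stand-in.
        rcs = smf.logit("dipper ~ cr(serum_ca, df=3) + age", data=df).fit()
        print(rcs.summary())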

    Results

    Baseline Characteristics

    A total of 419 patients were included in this study, comprising 224 males and 195 females. There were 116 patients in the dipper/over-dipper group and 303 in the non-dipper/reverse dipper group. The average age was 65.08 ± 13.52 years, with a hypertension duration of 12.00 (6.00–22.00) years. The nocturnal decline rates for systolic and diastolic blood pressure were 4.13 (−2.19 to 10.94)% and 5.14 (−0.81 to 11.82)%, respectively. The serum concentrations of sodium, calcium, and magnesium ions were 140.76 ± 2.86 mmol/L, 2.24 ± 0.20 mmol/L, and 0.88 ± 0.13 mmol/L, respectively. The 24-hour urinary levels of sodium, calcium, magnesium, and phosphate ions were 138.10 (105.00–178.00) mmol, 2.51 (1.06–4.83) mmol, 3.07 (2.18–4.19) mmol, and 15.50 (11.11–20.18) mmol, respectively.

    Compared with the non-dipper/reverse dipper group, patients in the dipper/over-dipper group were significantly younger (P=0.005), exhibited a significantly greater nocturnal decline in systolic and diastolic blood pressure (P<0.001), had a lower 24-hour average systolic blood pressure (P=0.001), a higher daytime average diastolic blood pressure (P=0.030), and a significantly lower nighttime average systolic and diastolic blood pressure (P<0.001). The use of alpha-blockers was also lower in the dipper/over-dipper group (P=0.010), and this group had higher serum calcium ion concentrations and 24-hour urinary phosphate ion levels (P=0.030, P=0.041 respectively). No other significant differences were observed (P>0.05). See Table 1 for details.

    Table 1 Baseline Characteristics of All Study Participants

    Logistic Regression Analysis for Dipper/Over-Dipper Hypertension Pattern

    Univariate logistic regression analysis was performed to examine the association between various factors and the dipper/over-dipper group. The factors included age, gender, diabetes, cerebral infarction, controlled hypertension, antihypertensive medications, serum sodium, calcium, and magnesium concentrations, and 24-hour urine sodium, calcium, magnesium, and phosphate levels. The analysis revealed that age, the use of alpha-blockers, serum calcium ion concentration, and 24-hour urine calcium ion level were significantly associated with the dipper/over-dipper group (P < 0.05), while the other factors were not (P > 0.05). Subsequently, a multivariate logistic regression analysis was conducted with these significant factors. The results indicated that age and the use of alpha-blockers were independently associated with the dipper/over-dipper group (P = 0.013, P = 0.015). For further details, refer to Table 2.

    Table 2 Results of Logistic Regression Analysis for Dipper/Over-Dipper Hypertension Pattern (n=419)

    Relationship Between Dipper Blood Pressure Pattern and Various Factors

    Restricted cubic spline analysis was performed to examine the relationship between the dipper/over-dipper group and various factors such as age, gender, diabetes, serum sodium, calcium, and magnesium concentrations, and 24-hour urine sodium, calcium, magnesium, and phosphate levels. The results indicated that the dipper/over-dipper group was associated with age, serum sodium, calcium ion concentrations, and 24-hour urine calcium ion levels (P=0.007, P=0.005, P=0.037, P=0.018), while no significant associations were found with the other factors (P > 0.05).

    Although age showed a potential linear relationship with dipper blood pressure (P for non-linearity = 0.051), the trend plot revealed a turning point at 54 years. The likelihood of dipper blood pressure was maximized at a serum sodium ion concentration of 139.55 mmol/L and a 24-hour urine calcium ion level of 5.34 mmol, and minimized at a 24-hour urine calcium ion level of 1.65 mmol. A serum calcium ion concentration greater than 2.20 mmol/L was significantly associated with an increased likelihood of dipper blood pressure. For more details, refer to Figure 1.

    Figure 1 Relationship between dipper blood pressure pattern and various factors. (A) age, (B) serum sodium concentration, (C) serum calcium concentration, (D) 24-hour urine calcium ion level. The bold numerals indicate the most relevant turning point between the two variables.

    Relationship Between Nocturnal Systolic Blood Pressure Decline Rate and Various Factors

    Restricted cubic spline analysis was also conducted to investigate the relationship between the nocturnal systolic blood pressure decline rate and various factors including age, gender, diabetes, serum sodium, calcium, and magnesium concentrations, and 24-hour urine sodium, calcium, magnesium, and phosphate levels. The results showed that the nocturnal systolic blood pressure decline rate was associated with age, serum calcium ion concentration, and 24-hour urine calcium ion level (P < 0.001, P = 0.003, P = 0.015), while no significant associations were found with the other factors (P > 0.05).

    The maximum nocturnal systolic blood pressure decline rate corresponded to an age of 57 years, a serum calcium ion concentration of 2.41 mmol/L, and a 24-hour urine calcium ion level of 5.47 mmol. The minimum nocturnal systolic blood pressure decline rate was observed at a 24-hour urine calcium ion level of 1.57 mmol. A serum calcium ion concentration greater than 2.20 mmol/L was significantly associated with an increased nocturnal systolic blood pressure decline rate. For more details, refer to Figure 2.

    Figure 2 Relationship between nocturnal systolic blood pressure decline rate and various factors. (A) age, (B) serum calcium concentration, (C) 24-hour urine calcium ion level. The bold numerals indicate the most relevant turning point between the two variables.

    Relationship Between Nocturnal Diastolic Blood Pressure Decline Rate and Various Factors

    A restricted cubic spline analysis was also conducted to assess the correlation between the nocturnal diastolic blood pressure decline rate and various factors, including age, gender, diabetes, serum sodium, calcium, and magnesium concentrations, as well as 24-hour urinary sodium, calcium, magnesium, and phosphate levels. The analysis indicated a significant correlation between the nocturnal diastolic blood pressure decline rate and age, serum sodium, calcium, and magnesium concentrations (P=0.032, P=0.039, P=0.001, P=0.041), with no significant correlation observed with other factors (P>0.05).

    The maximum nocturnal diastolic blood pressure decline rate corresponded to serum sodium, calcium, and magnesium concentrations of 139.03 mmol/L, 2.42 mmol/L, and 0.95 mmol/L, respectively. A serum calcium concentration greater than 2.20 mmol/L was significantly associated with an increased nocturnal diastolic blood pressure decline rate. Although age appeared to have a linear relationship with the nocturnal diastolic blood pressure decline rate (P for non-linearity =0.194), a trend plot revealed a distinct inflection point at the age of 54 years. See Figure 3 for details.

    Figure 3 Relationship between nocturnal diastolic blood pressure decline rate and various factors. (A) age, (B) serum sodium concentration, (C) serum calcium concentration, (D) serum magnesium concentration. The bold numerals indicate the most relevant turning point between the two variables.

    Discussion

    The circadian rhythm of blood pressure, integral to cardiovascular homeostasis, is influenced by a complex interplay of hormonal and physiological factors, including melatonin, atrial natriuretic peptide, and the renin-angiotensin-aldosterone system.16,17 This rhythm can be significantly altered by conditions such as obstructive sleep apnea, which disrupts the normal 24-hour blood pressure pattern and increases cardiovascular risk.18 Our study revealed a significant association between advanced age and the non-dipper pattern of blood pressure, identifying age as an independent predictor of dipper blood pressure (Table 1 and 2). Restricted cubic spline (RCS) analysis further illuminated this relationship, indicating a linear association between age and dipper blood pressure with a notable inflection point at 54 years (Figure 1). Similarly, the nocturnal systolic and diastolic blood pressure decline rates showed significant age-related trends, peaking at 57 years and 54 years, respectively (Figures 2 and 3). These findings suggest a critical age threshold between 54 and 57 years, beyond which the likelihood of maintaining a dipper blood pressure pattern declines precipitously. These findings are corroborated by an Indian observational study of 27,472 patients, which documented a decrease in the prevalence of dipper blood pressure from 42.5% to 17.9% and a concurrent increase in reverse dipper blood pressure from 7.3% to 34.2% among individuals aged 30 to 80 years.19 Research by Deng et al further supports this association, highlighting the independent impact of age on the amplitude of nocturnal blood pressure decline, particularly in elderly populations.20 Collectively, these results underscore the age-related alterations in blood pressure rhythms and their clinical relevance in hypertension management.

    Our study unveiled a significant positive correlation between calcium ions and dipper blood pressure patterns. Calcium, an abundant mineral in the human body, is crucial for maintaining skeletal structure, muscle contraction, vascular tone, blood clotting, nerve transmission, and energy metabolism.21 In blood pressure regulation, calcium ions play a pivotal role by influencing ion channels on vascular smooth muscle cells, particularly the L-type voltage-dependent calcium channels, which are essential in modulating the tone of small arteries.22 During hypertension, the function and expression of these ion channels may be altered, thereby affecting vascular tone. Although some studies have suggested that low-dose calcium intake may help lower blood pressure,23,24 with more pronounced effects in younger individuals,25 research on the correlation between calcium ion levels in blood and urine and dipper blood pressure patterns is relatively scarce. Our study found a correlation between serum calcium ions and dipper blood pressure (see Table 2), and both serum and urinary calcium levels were significantly associated with dipper blood pressure and the decline rates of nocturnal systolic and diastolic blood pressure. Higher levels of calcium ions in the blood and urine of hypertensive patients were associated with a greater likelihood of dipper blood pressure patterns (see Figures 1–3). Previous studies on the correlation between urinary calcium excretion and hypertension have found an inverse relationship between urinary calcium excretion and calcium concentration in renal interstitial fluid, suggesting that a decrease in calcium concentration in the renal cortical interstitial fluid may mediate renal vasoconstriction, thereby affecting blood pressure elevation and rhythm disruption.26 Some research has indicated that compared to healthy individuals, hypertensive patients have higher urinary calcium excretion and parathyroid hormone levels, possibly due to intrinsic defects in renal calcium handling.27 While the correlation and specific mechanisms between calcium ion concentrations in blood and urine and dipper blood pressure patterns are not yet fully understood, it is clear that calcium ion balance is crucial for maintaining overall blood pressure health. For hypertensive patients, monitoring and adjusting calcium intake and excretion may be an important direction for future research. We anticipate that further basic and clinical studies will explore this area to provide more insights into the role of calcium in hypertension management.

    In this study, logistic regression analysis did not reveal a correlation between blood and urine sodium levels and dipper blood pressure profiles. Similarly, RCS analysis did not identify a relationship between 24-hour urine sodium levels and dipper blood pressure profiles. However, we found a significant association between serum sodium levels and dipper blood pressure profiles, as well as the nocturnal diastolic blood pressure fall rate, with inflection points at 139.55 and 139.03 mmol/L, respectively (see Figures 1 and 3). This suggests that hypertensive patients with a serum sodium level of approximately 139 mmol/L have the highest likelihood of exhibiting a dipper blood pressure pattern. Some studies have indicated a “J-shaped” relationship between serum sodium concentration and primary cardiovascular events, with the lowest cardiovascular risk observed in individuals with serum sodium levels between 141 and 143 mmol/L.28 Higher serum sodium concentrations are associated with an increased risk in hypertensive patients, and reducing salt intake can lower blood pressure in both hypertensive and normotensive individuals.29 It is also recommended that hypertensive patients in China limit their daily sodium intake to less than 2 g (less than 5 g of sodium chloride).1 Additionally, salt intake is correlated with blood pressure circadian rhythm. In salt-sensitive hypertensive rats, a high-salt diet leads to elevated blood pressure and non-dipper blood pressure profiles, which normalize upon a return to a regular salt diet, reverting to dipper blood pressure profiles.30 Sodium overload promotes hypertension and disrupts circadian blood pressure rhythms through multiple, convergent pathophysiological pathways. High dietary Na⁺ raises plasma osmolality, provoking arginine vasopressin release, extracellular fluid expansion, and sustained activation of the renin–angiotensin–aldosterone system; together these events potentiate vasoconstriction, vascular smooth-muscle hypertrophy, and a progressive rise in systemic vascular resistance.31 Concurrently, excess Na⁺ amplifies central sympathetic outflow via up-regulation of hypothalamic prolyl endopeptidase, accelerating norepinephrine release.32 Furthermore, a high-sodium milieu directly impairs endothelial function by attenuating endothelial nitric-oxide synthase activity, diminishing nitric-oxide bioavailability, and triggering reactive oxygen species formation, thereby intensifying oxidative stress and blunting endothelium-dependent vasodilation.33 The cumulative impact of these mechanisms not only elevates baseline blood pressure but also desynchronizes central and peripheral circadian clocks, driving the transition from a physiological nocturnal-dip pattern to non-dipping or reverse-dipping hypertension. It is thus evident that sodium is closely related to blood pressure and its circadian rhythm. The 24-hour urine sodium level is a more reliable indicator of salt intake.34 Previous studies have found that individuals with dipper blood pressure profiles have lower 24-hour urine sodium levels compared to those with non-dipper profiles.35 However, our results differ slightly from those of other scholars, possibly due to the single-center nature of our study, along with a smaller sample size and an older study population.
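
    As a quick arithmetic check of the dietary limits cited above (using standard molar masses, which are not part of the guideline citation), the 2 g sodium and 5 g sodium chloride figures describe essentially the same allowance, since sodium makes up roughly 39% of sodium chloride by mass:

\[
\frac{M_{\mathrm{Na}}}{M_{\mathrm{NaCl}}} = \frac{22.99}{22.99 + 35.45} \approx 0.393,
\qquad
5\ \text{g NaCl} \times 0.393 \approx 1.97\ \text{g Na} \approx 2\ \text{g Na}.
\]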

    In our study, we did not find a correlation between magnesium ions and dipper blood pressure profiles or the nocturnal systolic blood pressure fall rate. However, serum magnesium levels were associated with the nocturnal diastolic blood pressure fall rate, with an inflection point at 0.95 mmol/L (see Figure 3), suggesting that magnesium ions may be linked to the circadian rhythm of blood pressure. Magnesium is the second most abundant intracellular cation in the human body and is widely involved in various physiological functions, affecting vascular endothelial cells, vascular inflammation, and oxidative stress. Insufficient magnesium intake may be associated with an increased risk of hypertension,36 while higher serum magnesium levels are independently associated with a lower risk of cardiovascular events.37 A cross-sectional study of 443 children aged 6–9 years in Guangzhou, China, found that serum magnesium and calcium levels were negatively correlated with blood pressure.38 Although numerous studies have demonstrated a close relationship between lower levels of serum magnesium and the occurrence and progression of hypertension,39 and dietary magnesium supplementation has been shown to have an antihypertensive effect,40 research on the relationship between magnesium ions and blood pressure circadian rhythm is still relatively scarce. Magnesium ions may influence the nocturnal diastolic blood pressure fall rate by acting as a calcium antagonist, affecting the renin-angiotensin-aldosterone system, improving endothelial function, influencing vascular remodeling and stiffness, and regulating the release of catecholamines.41 In the analysis of the correlation between phosphate ions and dipper blood pressure profiles, although patients in the dipper group had higher 24-hour urinary phosphate levels than those in the non-dipper group, no other statistically significant differences were found. This suggests that the correlation between phosphate ions and blood pressure circadian rhythm may require further exploration.

    Non-dipper or reverse-dipper blood pressure patterns are associated with an increased risk of cardiovascular events and renal failure.42 Aggressive blood pressure control, particularly optimized antihypertensive therapy, can yield greater benefits for hypertensive patients.1 In our study, we delved into the correlation between serum levels of sodium, calcium, magnesium, and phosphate and the dipper pattern of blood pressure among hypertensive patients. Our findings underscore a significant association between these electrolytes and the dipper blood pressure profile, suggesting that the monitoring and modulation of these ions’ intake and excretion could represent a promising avenue for intervention in hypertension management. We observed that the likelihood of maintaining a dipper blood pressure pattern diminishes with advancing age, indicating a need for more nuanced assessment and vigilant monitoring of blood pressure circadian rhythms in the elderly hypertensive population. Furthermore, our analysis revealed an independent negative correlation between alpha-blockers and the dipper blood pressure pattern (as detailed in Table 2), which implies that such pharmacological agents may not be optimal for preserving the normal circadian rhythm of blood pressure in hypertensive patients. This hypothesis warrants further investigation through additional clinical studies. The clinical implications of our study are substantial. They suggest that personalized blood pressure management, coupled with strategic lifestyle modifications—such as sodium restriction and the optimization of calcium, phosphate, and magnesium intake—could enhance the control of hypertension and its circadian variations. This, in turn, may contribute to a reduction in the risk of cardiovascular events. The application of ambulatory blood pressure monitoring is also highlighted as a valuable tool for evaluating blood pressure rhythms, offering crucial insights for the diagnosis and therapeutic management of hypertension.

    Limitations

    As a single-center cross-sectional study, our conclusions may be subject to certain biases that could limit the generalizability of our findings. The study population was skewed towards older age groups and had an imbalanced gender ratio, which may have influenced the analysis of the correlation between ion levels and blood pressure rhythm. We did not fully account for patients’ lifestyle and dietary habits, factors that could impact blood pressure rhythm and ion levels. The cross-sectional design of our study also restricts our ability to establish causality. Future research should employ multicenter, large-sample, prospective cohort study designs to enhance the universality and reliability of the findings and to explore additional potential biomarkers and interventions. Through these approaches, we can gain a more comprehensive understanding of the regulatory mechanisms of blood pressure rhythm and provide more effective treatment strategies for hypertensive patients. Future investigations should also aim to elucidate the underlying mechanisms by which these electrolyte levels influence blood pressure circadian rhythms and to evaluate the efficacy of these management approaches across diverse patient demographics.

    Conclusion

    In conclusion, our study revealed significant associations between serum levels of sodium, magnesium, calcium, and other ions and dipper blood pressure patterns, suggesting that these ions in blood and urine may serve as potential predictors or early warning indicators of dipper blood pressure. Maintaining these ions within certain ranges may also aid in preserving the dipper pattern. Our research substantiates the critical role of integrating blood pressure rhythm and electrolyte homeostasis into hypertension management strategies, potentially improving patient outcomes.

    Data Sharing Statement

    Data will be made available on request from the corresponding author, Yi-Hong Wei.

    Ethics Approval and Consent to Participate

    This study was conducted in accordance with the Declaration of Helsinki and approved by the ethics committee of Longhua Hospital affiliated to Shanghai University of Traditional Chinese Medicine.

    Acknowledgments

    Jia-Ying Huang and Wang Zheng are co-first authors for this study.

    Funding

    This work was supported by the Shanghai Health Commission Project (grant number 202040341), the Shanghai University of Traditional Chinese Medicine Industry Development Center’s Integrated Medical and Elderly Care Innovation Project (grant number 602061D), and the Shanghai Health Commission Traditional Chinese Medicine Research Project (grant number 2024QN032).

    Disclosure

    The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this article.

    References

    1. Writing Group of 2018 Chinese Guidelines for the Management of Hypertension, Chinese Hypertension League, China International Exchange and Promotive Association for Hypertension, Chinese Gerontological Society Hypertension Branch, Hypertension Branch of Chinese Geriatrics Society, Chinese Stroke Association. 2024 Chinese guidelines for the management of hypertension. Chinese J Hypertensions. 2024;32(7):603–700.

    2. Mills KT, Stefanescu A, He J. The global epidemiology of hypertension. Nat Rev Nephrol. 2020;16(4):223–237. doi:10.1038/s41581-019-0244-2

    3. NCD Risk Factor Collaboration. Worldwide trends in hypertension prevalence and progress in treatment and control from 1990 to 2019: a pooled analysis of 1201 population-representative studies with 104 million participants. Lancet. 2021;398(10304):957–980. doi:10.1016/S0140-6736(21)01330-1

    4. Mei Z, Jing W, Xiao Z, et al. Prevalence and control of hypertension in adults in China, 2018. Chin J Epidemiol. 2021;42(10):1780–1789. doi:10.3760/cma.j.cn112338-20210508-00379

    5. Chen L, Yang G. Recent advances in circadian rhythms in cardiovascular system. Front Pharmacol. 2015;6:71. doi:10.3389/fphar.2015.00071

    6. Writing Group of the 2020 Chinese Hypertension League Guidelines on Ambulatory Blood Pressure Monitoring. 2020 Chinese hypertension league guidelines on ambulatory blood pressure monitoring. Chinese Circulation J. 2021;36(4):313–328.

    7. Salles GF, Reboldi G, Fagard RH, et al. Prognostic effect of the nocturnal blood pressure fall in hypertensive patients: the ambulatory blood pressure collaboration in patients with hypertension (ABC-H) meta-analysis. Hypertension. 2016;67(4):693–700. doi:10.1161/HYPERTENSIONAHA.115.06981

    8. Motiejunaite J, Flamant M, Arnoult F, et al. Predictors of daytime blood pressure, nighttime blood pressure, and nocturnal dipping in patients with chronic kidney disease. Hypertension Res. 2024;47(9):2511–2520. doi:10.1038/s41440-024-01778-5

    9. Uzu T, Ishikawa K, Fujii T, Nakamura S, Inenaga T, Kimura G. Sodium restriction shifts circadian rhythm of blood pressure from nondipper to dipper in essential hypertension. Circulation. 1997;96(6):1859–1862. doi:10.1161/01.CIR.96.6.1859

    10. Wu W, Gong M, Liu P, Yu H, Gao X, Zhao X. Hypomagnesemia: exploring its multifaceted health impacts and associations with blood pressure regulation and metabolic syndrome. Diabetol Metab Syndr. 2025;17(1):217. doi:10.1186/s13098-025-01772-y

    11. Hu J, Yang Z, Chen X, et al. Thromboxane A(2) is involved in the development of hypertension in chronic kidney disease rats. Eur J Pharmacol. 2021;909:174435. doi:10.1016/j.ejphar.2021.174435

    12. Writing Group of 2018 Chinese Guidelines for the Management of Hypertension, Chinese Hypertension League, Chinese Society of Cardiology, Chinese Medical Doctor Association Hypertension Committee, Hypertension Branch of China International Exchange and Promotive Association for Medical and Health Care, Hypertension Branch of Chinese Geriatric Medical Association. 2018 Chinese guidelines for the management of hypertension. Chinese J Cardiovasc Med. 2019;24(1):24–56.

    13. Liu J, Li Y, Asayama K, et al. Asian expert consensus on nocturnal hypertension management. Hypertension. 2025;82(6):945–956. doi:10.1161/HYPERTENSIONAHA.124.24026

    14. Liping D, Yu W, Jie X, Yilei G, Chenyu W. The relationship between blood pressure diurnal rhythm and chronotherapy in elderly hypertensive patients. Chinese J Hypertensions. 2023;31(8):764–768.

    15. Cheng YB, Thijs L, Zhang ZY, et al. Outcome-driven thresholds for ambulatory blood pressure based on the new American college of cardiology/American heart association classification of hypertension. Hypertension. 2019;74(4):776–783. doi:10.1161/HYPERTENSIONAHA.119.13512

    16. Veglio F, Pietrandrea R, Ossola M, Vignani A, Angeli A. Circadian rhythm of the angiotensin converting enzyme (ACE) activity in serum of healthy adult subjects. Chronobiologia. 1987;14(1):21–25.

    17. Li H, Sun NL, Wang J, Liu AJ, Su DF. Circadian expression of clock genes and angiotensin II type 1 receptors in suprachiasmatic nuclei of sinoaortic-denervated rats. Acta Pharmacol Sin. 2007;28(4):484–492. doi:10.1111/j.1745-7254.2007.00543.x

    18. Cuspidi C, Tadic M, Sala C, Gherbesi E, Grassi G, Mancia G. Blood pressure non-dipping and obstructive sleep apnea syndrome: a meta-analysis. J Clin Med. 2019;8(9):1367. doi:10.3390/jcm8091367

    19. Kaul U, Omboni S, Arambam P, et al. Blood pressure related to age: the India ABPM study. J Clin Hypertens. 2019;21(12):1784–1794. doi:10.1111/jch.13744

    20. Deng M, Chen DW, Dong YF, et al. Independent association between age and circadian systolic blood pressure patterns in adults with hypertension. J Clin Hypertens. 2017;19(10):948–955. doi:10.1111/jch.13057

    21. Xiao H, Yan Y, Gu Y, Zhang Y. Strategy for sodium-salt substitution: on the relationship between hypertension and dietary intake of cations. Food Res Int. 2022;156:110822. doi:10.1016/j.foodres.2021.110822

    22. Guowei Z, Bo L, Guozhen Z, Fei S. L-type calcium channels remodeling in vascular smooth muscle cell promoted by hypertension. Acad J Chinese PLA Med School. 2015;36(2):190–192.

    23. Cormick G, Belizan JM. Calcium intake and health. Nutrients. 2019;11(7):1606. doi:10.3390/nu11071606

    24. Hamer O, Mohamed A, Ali-Heybe Z, Schnieder E, Hill JE. Calcium supplementation for the prevention of hypertension: a synthesis of existing evidence and implications for practise. British J Cardiac Nursing. 2024;19(2):0010. doi:10.12968/bjca.2023.0010

    25. Cormick G, Ciapponi A, Cafferata ML, Cormick MS, Belizan JM. Calcium supplementation for prevention of primary hypertension. Cochrane Database Syst Rev. 2022;1(1):CD010037. doi:10.1002/14651858.CD010037.pub4

    26. Pointer MA, Eley S, Anderson L, et al. Differential effect of renal cortical and medullary interstitial fluid calcium on blood pressure regulation in salt-sensitive hypertension. Am J Hypertens. 2015;28(8):1049–1055. doi:10.1093/ajh/hpu255

    27. Papagalanis ND, Skopelitis P, Kourti A, et al. Urine calcium excretion, nephrogenous cyclic-adenosine monophosphate and serum parathyroid hormone levels in patients with essential hypertension. Nephron. 1991;59(2):226–231. doi:10.1159/000186555

    28. Cole NI, Suckling RJ, Swift PA, et al. The association between serum sodium concentration, hypertension and primary cardiovascular events: a retrospective cohort study. J Human Hypertens. 2019;33(1):69–77. doi:10.1038/s41371-018-0115-5

    29. He FJ, Tan M, Ma Y, MacGregor GA. Salt reduction to prevent hypertension and cardiovascular disease: JACC State-of-the-Art review. J American College Cardiol. 2020;75(6):632–647. doi:10.1016/j.jacc.2019.11.055

    30. Sufiun A, Rahman A, Rafiq K, et al. Association of a disrupted dipping pattern of blood pressure with progression of renal injury during the development of salt-dependent hypertension in rats. Int J Mol Sci. 2020;21(6):2248. doi:10.3390/ijms21062248

    31. Elijovich F, Weinberger MH, Anderson CA, et al. Salt sensitivity of blood pressure: a scientific statement from the American heart association. Hypertension. 2016;68(3):e7–e46. doi:10.1161/HYP.0000000000000047

    32. Fujita T. Mechanism of salt-sensitive hypertension: focus on adrenal and sympathetic nervous systems. J American Soc Nephrol. 2014;25(6):1148–1155. doi:10.1681/ASN.2013121258

    33. Karaulov AV, Mikhaylova IV, Smolyagin AI, et al. The immunotoxicological pattern of subchronic and chronic benzene exposure in rats. Toxicol Lett. 2017;275:1–5. doi:10.1016/j.toxlet.2017.04.006

    34. Mente A, O’Donnell M, Rangarajan S, et al. Urinary sodium excretion, blood pressure, cardiovascular disease, and mortality: a community-level prospective epidemiological cohort study. Lancet. 2018;392(10146):496–506. doi:10.1016/S0140-6736(18)31376-X

    35. Qimangul E, Jian-zhong X, Xiao-feng T, et al. The relationship of 24-hour urinary sodium with plasma renin activity, aldosterone concentration and the circadian blood pressure variations in patients with essential hypertension. Chinese J Hypertensions. 2017;25(8):762–766.

    36. Schutten JC, Joosten MM, de Borst MH, Bakker SJL. Magnesium and blood pressure: a physiology-based approach. Adv Chronic Kidney Dis. 2018;25(3):244–250. doi:10.1053/j.ackd.2017.12.003

    37. Ferre S, Liu YL, Lambert JW, et al. Serum magnesium levels and cardiovascular outcomes in systolic blood pressure intervention trial participants. Kidney Med. 2023;5(6):100634. doi:10.1016/j.xkme.2023.100634

    38. Chen G, Li Y, Deng G, et al. Associations of plasma copper, magnesium, and calcium levels with blood pressure in children: a cross-sectional study. Biol Trace Elem Res. 2021;199(3):815–824. doi:10.1007/s12011-020-02201-z

    39. AlShanableh Z, Ray EC. Magnesium in hypertension: mechanisms and clinical implications. Front Physiol. 2024;15:1363975. doi:10.3389/fphys.2024.1363975

    40. Asbaghi O, Hosseini R, Boozari B, Ghaedi E, Kashkooli S, Moradi S. The effects of magnesium supplementation on blood pressure and obesity measure among type 2 diabetes patient: a systematic review and meta-analysis of randomized controlled trials. Biol Trace Elem Res. 2021;199(2):413–424. doi:10.1007/s12011-020-02157-0

    41. Dominguez L, Veronese N, Barbagallo M. Magnesium and hypertension in old age. Nutrients. 2020;13(1):139. doi:10.3390/nu13010139

    42. Yang WY, Melgarejo JD, Thijs L, et al. Association of office and ambulatory blood pressure with mortality and cardiovascular outcomes. JAMA. 2019;322(5):409–420. doi:10.1001/jama.2019.9811


  • NEWTON-CABG CardioLink-5: Evolocumab Doesn’t Prevent SVG Failure in Cardiac Surgery

    NEWTON-CABG CardioLink-5: Evolocumab Doesn’t Prevent SVG Failure in Cardiac Surgery

    The idea that further LDL-lowering helps with this issue after CABG can be put to bed, says Subodh Verma.

    MADRID, Spain—Adding evolocumab (Repatha; Amgen) to statin therapy does not lower the risk of saphenous vein graft (SVG) failure among patients who undergo coronary artery bypass surgery, the NEWTON-CABG CardioLink-5 study showed.

    Presented earlier this week at the European Society of Cardiology Congress 2025, and published simultaneously in the Lancet, the study is a failed attempt to find a solution to a persistent and difficult surgical problem.

    “I would say 90% of procedures that occur involve the use of these [saphenous] veins,” said lead investigator Subodh Verma, MD (St. Michael’s Hospital/Unity Health, Toronto, Canada). “Vein graft failure has remained a recalcitrant problem since the advent [of CABG surgery]. It is a stubborn problem for which we have no solutions. Twenty, 40, and 50% of vein grafts undergo vein graft failure by year two, three, or four, respectively, and when vein grafts fail, they’re associated with poor prognosis.”

    SVG failure is caused by multiple factors, including the technical challenges of surgery, such as poor distal-target-vessel quality and competition from native coronary blood flow. It might also be caused by maladaptive structural changes that result when the vein graft is exposed to the high pressures of the arterial circulation.

    The role of LDL cholesterol in SVG failure is not known, said Verma. While some studies have linked higher LDL levels to SVG occlusion and suggested that lower levels could reduce the risk, trials of statin therapy to lower LDL cholesterol and reduce the risk of SVG failure haven’t panned out. The results of this randomized, controlled trial, however, are pretty clear, said Verma.

    “Further cholesterol-lowering with PCSK9 inhibitors is not the solution to this problem,” he said. “I think the LDL hypothesis is put to rest.” 

    Straightforward Answers

    The NEWTON-CABG CardioLink-5 trial, which was done at 23 sites in Canada, the United States, Australia, and Hungary, investigated the role of more intensive lipid-lowering with evolocumab, a PCSK9 inhibitor that is approved for lowering LDL cholesterol and reducing the risk of cardiovascular events in patients with and without established cardiovascular disease.

    In total, 782 patients (mean age 66 years; 14% female) were randomized to evolocumab or placebo on top of background statin therapy. Median LDL cholesterol level at baseline was 1.85 mmol/L (72 mg/dL). Nearly all patients were taking aspirin and beta-blockers at baseline, and roughly 40% were taking an ACE inhibitor or ARB. Baseline systolic and diastolic blood pressures were well controlled at 123/74 mm Hg. Nearly all patients received a left internal thoracic artery graft (94%), and a median of three SVGs were used during the procedure, the majority of which were harvested endoscopically (68%).

    The primary efficacy endpoint was the vein graft disease rate, defined as the proportion of SVGs with 50% or greater stenosis or total occlusion on coronary CT angiography or clinically indicated coronary angiography 24 months after the index CABG procedure.
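
    As a purely illustrative sketch of how this endpoint reduces to a simple proportion over all studied grafts (the graft readings below are invented, not trial data):

```python
# Toy illustration of the endpoint definition above: the vein graft disease
# rate is the share of saphenous vein grafts showing >=50% stenosis or total
# occlusion on 24-month imaging. These per-graft readings are made up.
grafts = [
    {"stenosis_pct": 30, "occluded": False},
    {"stenosis_pct": 65, "occluded": False},
    {"stenosis_pct": 0,  "occluded": True},
    {"stenosis_pct": 10, "occluded": False},
]
diseased = sum(1 for g in grafts if g["occluded"] or g["stenosis_pct"] >= 50)
rate = diseased / len(grafts) * 100
print(f"Vein graft disease rate: {rate:.1f}%")  # 50.0% in this toy example
```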


    At 24 months, evolocumab cut LDL-cholesterol levels by more than 52% while there was a 4% reduction in the placebo group. However, the rate of vein graft disease, the study’s primary endpoint, did not differ between treatment arms: 21.7% in the evolocumab group and 19.7% in the placebo arm (P = 0.44). There was no benefit on any of the key secondary endpoints, including the percentage of patients with 100% graft occlusion, nor in any of the prespecified subgroups.

    Verma stressed that despite the results, lowering LDL cholesterol remains an important aspect of secondary prevention, even after surgery.

    What About Arterial Grafting?

    There is evidence that using arterial grafts, particularly the internal thoracic arteries, as well as the radial artery, results in better long-term graft patency than using the SVG. Revascularization guidelines from the US and Europe emphasize grafting the internal thoracic artery to the LAD in patients undergoing CABG surgery and both documents give preference to the radial artery over a saphenous vein conduit for other non-LAD lesions.    

    Even among patients who receive multiple arterial grafts, however, the saphenous vein is still commonly used, said Verma.

    “We are using more and more arterial grafts, but the use of total arterial revascularization, even in centers that do a lot of arterial grafts, will often have two arteries and you need one vein,” he said. “The solution to the global problems of vein graft failure is not to say everybody should do arterial grafting, because there are arterial grafting issues. Radial arteries can only be used on certain types of target vessels, because they are subjected to competitive flow.”

    Nonetheless, when appropriate, “the push towards using more arteries needs to be emphasized,” added Verma.

    François Mach, MD (Geneva University Hospital, Switzerland), the discussant for the study, stressed that NEWTON-CABG CardioLink-5 is not a negative trial, but one that provides an answer to an important clinical question. He noted that the reduction in LDL cholesterol is on par with what was achieved in the FOURIER trial, the landmark study that showed using evolocumab on top of statin therapy reduced the risk of MACE in patients with established cardiovascular disease.

    The 24-month follow-up, which Verma cited as a potential limitation, isn’t a problem for Mach either. In HUYGENS and PACMAN-AMI, for example, 12 months of treatment with evolocumab and alirocumab (Praluent; Sanofi/Regeneron) led to significant changes in the “quality and quantity of atherosclerotic plaques in the coronary arteries,” said Mach. However, both Verma and Mach noted that the response to treatment likely differs between coronary arteries and native veins grafted into the arterial circulation.

    “The biology may be different,” said Verma.

    One potential area of research for the prevention of SVG failure could be targeting inflammation or thrombosis. Additionally, it might be worth investigating whether intraoperative or postoperative imaging could be used to help identify technical failures or abnormal flow patterns that could be driving graft occlusion, said Verma. A provocative solution raised by Mach would be to avoid SVGs altogether through xenotransplantation, using arteries from animals in CABG surgery.



  • A Lifelong Battle With Crouzon Syndrome: A Detailed Case Report of Extensive Craniofacial Surgeries and Complex Psychiatric Care

    A Lifelong Battle With Crouzon Syndrome: A Detailed Case Report of Extensive Craniofacial Surgeries and Complex Psychiatric Care



  • Understanding the role of botanicals in medicine

    Understanding the role of botanicals in medicine

    Dr Erin C. Berthold from Planted in Science Consulting LLC discusses the uses and perceptions of botanical medicines, emphasizing the necessity for coordinated global efforts to understand and regulate these substances to ensure their safe integration into healthcare

    Botanicals have long been used in traditional medicine practices. In developing nations, particularly in Africa, up to 80% of the population relies on traditional medicine as their primary source of care. (1) This historical use of botanicals as medicines also profoundly shaped modern pharmacology, with botanical compounds serving as frameworks or direct sources for roughly one-third of all drugs approved by the U.S. Food and Drug Administration (FDA). (2)

    The growing demand for ‘natural’ remedies

    Today, however, the landscape of botanical medicine is undergoing a significant transformation. A surge in wellness trends and a growing consumer preference for ‘natural’ products have led to a boom in their use in developed countries. This shift, combined with a steady increase in the use of prescription medications, is creating potential for botanical-drug interactions. Sales data show that from 2017 to 2023, there was an increase of over $4 billion in sales of botanical supplements in the United States, while at the same time, pharmaceutical sales grew by over $100 billion (Figure 1).

    Figure 1: Growth in sales of both botanical supplements and pharmaceuticals in the US

    Safety concerns surrounding botanical supplementation

    A pervasive misconception about botanicals is that they are inherently safer than pharmaceuticals. This belief is based on the long history of traditional use of plant medicines, the accessibility of botanical supplements, and a bias against ‘synthetic’ chemicals. This leads many individuals to use botanicals concurrently with their prescription medications without informing their healthcare providers. This co-administration of botanicals and pharmaceuticals is not a niche behavior. A survey of US participants revealed that 38% used herbals alongside prescription drugs, with a concerning prevalence among adults over 70 – a demographic particularly vulnerable to adverse effects from polypharmacy due to altered metabolism. (3) Compounding this issue, as many as two-thirds of individuals do not disclose their botanical supplement use to their physicians, creating a significant information gap in patient care. (4)

    While the pharmaceutical industry is held to a rigorous standard, requiring extensive pre-market research into drug-drug and food-drug interactions, botanical supplements are often governed by different, far less stringent rules, and issues are often only recognized in post-market surveillance. This regulatory disparity leads to a marketplace where botanical products lack standardized potency, clear instructions, and any meaningful data on their potential interactions with prescription medications.

    Botanicals are chemically complex, containing a multitude of compounds that may alter the effects of a co-administered drug. In some cases, they may increase the effects of a prescribed medication, leading to adverse events, a particularly dangerous scenario for drugs with a narrow therapeutic window. Conversely, they may decrease the efficacy of a drug, which can be life-threatening or life-changing when the medication is for a critical condition like cardiac disease or contraception. (5)

    The clinical consequences of this regulatory gap and public misconception are becoming increasingly visible. One trend is the rising incidence of drug-induced liver injury (DILI) due to herbal and dietary supplements. While DILI from botanicals is often idiosyncratic, its prevalence is growing. Data from the Drug-Induced Liver Injury Network (DILIN) shows that the percentage of DILI cases attributed to herbal and dietary supplements soared from 7% in 2004-2005 to 20% in 2013-2014. (6)

    Botanicals and research gaps

    Despite the growing prevalence and clear risks, research into botanical-drug interactions has not kept pace. In fact, a search of the biomedical database PubMed reveals a decline in published clinical trials on the topic. In the ten-year period from 2002 to 2012, there were 51 published randomized controlled clinical trials on herb-drug interactions, but in the subsequent decade, that number dropped to just 25.

    A globally coordinated effort to understand and regulate botanical medicines is long overdue. As these products become more integrated into the daily lives of millions, it is imperative that we bridge the gap between historical tradition and modern pharmacology. We must move beyond the ‘natural is safe’ fallacy and dedicate the necessary resources to research, regulation, and education. This will require a collaborative approach that brings together researchers, regulatory agencies, and healthcare providers to develop standardized methodologies and a centralized database for reporting and tracking these interactions. Only by doing so can we ensure that the use of botanical medicine is not just a nod to tradition, but a truly safe and informed choice.

    References

    1. World Health Organization. (2002). WHO Traditional Medicine Strategy 2002-2005. Geneva, Switzerland.
    2. Newman, D. J., & Cragg, G. M. (2020). Natural Products as Sources of New Drugs over the Last 40 Years. Journal of Natural Products, 83(3), 770-803.
    3. Posadzki, P., Watson, L. K., & Ernst, E. (2012). Prevalence of herbal medicine use in the US: A systematic review. European Journal of Clinical Pharmacology, 68(11), 1395-1402.
    4. Kaptchuk, T. J., & Eisenberg, D. M. (2001). The persuasive placebo: A re-examination of the relationship between mind, body, and drug. New England Journal of Medicine, 344(23), 1735-1744.
    5. Zeller, A., & O’Brien, R. (2009). The potential for herb-drug interactions with cardiovascular medications. Journal of Cardiovascular Nursing, 24(4), 263-270.
    6. Navarro, V. J., Barnhart, H., Bonkovsky, H. L., Davern, R. J., Fontana, H. R., Jones, D. B.,… & the DILIN Network. (2014). Liver Injury from Herbal and Dietary Supplements in the US. Hepatology, 60(2), 585-594.
