Category: 7. Science

  • Understanding the human brain architecture through gene coexpression analysis

    In a comprehensive Genomic Press Interview published in Brain Medicine, Dr. Michael C. Oldham shares his unconventional journey from advertising executive to computational neuroscientist and his groundbreaking contributions to understanding the human brain’s cellular and molecular architecture through gene coexpression analysis.

    From Madison Avenue to molecular neuroscience

    Dr. Oldham’s path to neuroscience was anything but direct. After graduating from Duke University at age 20 with a pre-med focus, he found himself unable to commit to medical school, recognizing he lacked the intrinsic desire to treat patients. Following a stint in San Francisco’s advertising industry during the dot-com boom, his fascination with human language evolution and what distinguishes human brains from those of our closest primate relatives led him back to academia.

    “The genetic changes that gave rise to the modern human brain were the catalyst for life as we know it,” Dr. Oldham reflects in the interview. This fundamental question drove him to pursue a PhD at UCLA, where he would make discoveries that continue to shape neuroscience research today.

    Pioneering gene coexpression network analysis

    Working with Dr. Dan Geschwind at UCLA and biostatistician Dr. Steve Horvath, Dr. Oldham performed the first genome-wide analysis of transcriptional covariation in the human brain. His eureka moment came when he realized that recurrent patterns of gene activity in brain samples corresponded to transcriptional signatures of different cell types.

    “Variation in the cellular composition of bulk tissue samples should inevitably drive the covariation of markers for different cell types,” Dr. Oldham explains. This insight, published in Nature Neuroscience in 2008, demonstrated how gene coexpression analysis could reveal optimal markers of cell types and states, a principle that still forms the central thesis for his laboratory at UCSF.

    The approach, known as Weighted Gene Coexpression Network Analysis (WGCNA), has become a cornerstone technique in genomics research. Unlike traditional differential expression analysis that compares individual genes between cohorts, WGCNA identifies robust patterns of coordinated gene activity within biological systems. This methodology has proven particularly powerful for understanding complex tissues like the brain, where multiple cell types interact in intricate ways.
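    The core idea behind coexpression analysis can be sketched in a few lines: correlate every pair of genes across samples, soft-threshold the correlations, and cluster. The following is a minimal illustration on toy data using only NumPy and SciPy; it is a simplification of the published WGCNA method, and the gene counts, noise levels, and soft-thresholding power are invented for demonstration.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Toy expression matrix: 40 samples x 6 genes. Genes 0-2 co-vary
# (e.g. markers of one cell type), genes 3-5 co-vary independently.
rng = np.random.default_rng(0)
driver_a = rng.normal(size=40)
driver_b = rng.normal(size=40)
expr = np.column_stack(
    [driver_a + 0.3 * rng.normal(size=40) for _ in range(3)]
    + [driver_b + 0.3 * rng.normal(size=40) for _ in range(3)]
)

# 1. Pairwise gene-gene correlation across samples.
corr = np.corrcoef(expr.T)

# 2. Soft-thresholding: raise |r| to a power so that weak correlations
#    shrink toward zero while strong ones are preserved.
beta = 6
adjacency = np.abs(corr) ** beta

# 3. Hierarchically cluster genes on the resulting dissimilarity
#    to recover coexpression modules.
dissim = 1 - adjacency
condensed = dissim[np.triu_indices_from(dissim, k=1)]
modules = fcluster(linkage(condensed, method="average"), t=2, criterion="maxclust")
print(modules)  # genes 0-2 fall in one module, genes 3-5 in the other
```

    In the real method, module detection is followed by summarizing each module with an eigengene and relating modules to traits; this sketch shows only the network-construction step.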

    From brain evolution to brain tumors

    Dr. Oldham’s early research focused on analyzing patterns of gene activity in the brains of humans and other species. These efforts identified functionally significant gene expression changes in human radial glia (Nature, 2014), interneurons (Cerebral Cortex, 2018), and astrocytes (Nature Neuroscience, 2018), while introducing novel methods for aggregating and comparing patterns of gene activity among biological systems.

    More recently, his research focus has shifted from studying what makes human brains unique to tackling one of medicine’s most challenging diseases: malignant gliomas. As a faculty member in UCSF’s Department of Neurological Surgery and Brain Tumor Center, he applies his computational approaches to these notoriously heterogeneous brain tumors.

    His team has analyzed gene activity patterns from over 17,000 human brain samples, including approximately 10,000 normal and 7,000 malignant glioma samples. This massive undertaking has led to the development of OMICON (theomicon.ucsf.edu), a platform designed to make the patterns of gene activity in these complex datasets accessible to the broader research community. The resource contains over 100,000 gene coexpression modules that have been extensively characterized via enrichment analysis with thousands of curated gene sets, providing researchers worldwide with unprecedented insights into brain function and dysfunction.
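    Enrichment analysis of this kind typically asks whether a module overlaps a curated gene set more than chance would allow, via a one-sided hypergeometric (Fisher's exact) test. A minimal sketch follows; the universe size, module size, and overlap are invented for illustration and are not OMICON's actual numbers.

```python
from scipy.stats import hypergeom

# Hypothetical counts: a "universe" of 20,000 genes, a coexpression
# module of 200 genes, and a curated gene set of 100 genes, 40 of
# which happen to fall inside the module.
universe, module_size, set_size, overlap = 20_000, 200, 100, 40

# By chance we would expect ~1 gene of the set in the module
# (200 * 100 / 20,000); the survival function gives P(overlap >= 40).
p_value = hypergeom.sf(overlap - 1, universe, set_size, module_size)
print(f"enrichment p-value: {p_value:.3g}")
```

    Run across thousands of curated gene sets and 100,000+ modules, this simple test (with multiple-testing correction) is what turns anonymous modules into biologically annotated ones.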

    By comparing patterns of gene activity between normal human brains and malignant gliomas, Dr. Oldham and his team are pinpointing highly reproducible molecular changes in specific cell types of the glioma microenvironment, including vascular cells and neurons. These molecular signatures provide opportunities for developing novel biomarkers and targeted treatment strategies for glioma patients. For example, cell-surface markers of glioma vasculature provide a potential molecular ‘zip code’ for targeting gliomas via the bloodstream.

    Confronting the reproducibility crisis

    Beyond his primary research, Dr. Oldham has become increasingly concerned with what he describes as science’s reproducibility crisis. “If most of the findings we toil to produce cannot feasibly be reproduced, what is the point?” he asks, highlighting a challenge that extends far beyond neuroscience.

    His response has been to take leadership roles addressing these systemic issues. As Vice Chair of UCSF’s Academic Senate Committee on Library and Scholarly Communication, he has launched a pan-UCSF Task Force on research data and metadata standardization. While the topic might sound technical, Dr. Oldham emphasizes its critical importance: these standards are essential prerequisites for more open and reproducible science, more precise biomedical knowledge representation, and more efficient collaboration.

    “Although there are many factors that affect the reproducibility of published research findings, there is no reason in principle why data analysis should not be completely reproducible,” Dr. Oldham notes. “By standardizing how we package and describe our research data, we can accelerate data discovery and analysis, including the use of artificial intelligence. More generally, standardized data packages with persistent identifiers can serve as building blocks for new technology infrastructure to modernize scholarly communication around reproducible data analysis.”
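    To make the idea concrete, here is a minimal, hypothetical example of a self-describing data package of the kind such standards envision, serialized as JSON. The field names and identifier below are invented for illustration; they are not a schema proposed by Dr. Oldham or UCSF.

```python
import json

# A toy metadata record: machine-readable provenance plus a
# persistent identifier, bundled alongside the data files.
package = {
    "identifier": "doi:10.0000/example.0001",  # placeholder persistent ID
    "title": "Bulk RNA-seq, human cortex, pilot run",
    "created": "2025-01-15",
    "files": [
        {"path": "counts.tsv", "format": "tsv", "sha256": "<checksum>"},
    ],
    "provenance": {"pipeline": "salmon 1.10", "genome": "GRCh38"},
}
print(json.dumps(package, indent=2))
```

    The point is not the particular fields but that a standardized, checksummed, identifier-bearing package lets any downstream tool (or AI system) discover and reproduce an analysis without human mediation.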

    The human side of scientific discovery

    The interview reveals personal insights that shaped Dr. Oldham’s career trajectory. His decision to spend two additional years in graduate school after his first major publication, a choice some considered “nuts”, resulted in a second, even more impactful paper that secured his selection as a UCSF Sandler Faculty Fellow. This prestigious position provided him with immediate independence and funding to establish his own laboratory.

    When not advancing neuroscience, Dr. Oldham can be found on the trails of Marin County, where he lives, often walking alone and lost in thought. He maintains close friendships from his San Francisco advertising days, adhering to their motto: “ABC (always be celebrating!).”

    Looking ahead, Dr. Oldham sees the integration of multiscale and multimodal data as crucial for understanding brain complexity. He advocates for standardized data production strategies that leverage robotic automation to generate reproducible datasets at scale. Dr. Oldham also believes that neuroscientists must ‘flip the switch’ from descriptive analysis of biological systems to predictive analysis using statistical models. “There is a big difference between describing what you think a dataset means versus predicting what you will see in the next dataset,” he says.

    Dr. Michael C. Oldham’s Genomic Press interview is part of a larger series called Innovators & Ideas that highlights the people behind today’s most influential scientific breakthroughs. Each interview in the series offers a blend of cutting-edge research and personal reflections, providing readers with a comprehensive view of the scientists shaping the future. By combining a focus on professional achievements with personal insights, this interview style invites a richer narrative that both engages and educates readers. This format provides an ideal starting point for profiles that explore the scientist’s impact on the field, while also touching on broader human themes. More information on the research leaders and rising stars featured in our Innovators & Ideas – Genomic Press Interview series can be found on our publications website: https://genomicpress.kglmeridian.com/.

    Journal reference:

    Oldham, M. C. (2025). Michael C. Oldham: Clarifying the cellular and molecular architecture of the human brain in health and disease through gene coexpression analysis. Brain Medicine. https://doi.org/10.61373/bm025k.0080


  • Narrow Spaces Trigger Stem Cells To Become Bone Cells

    In a discovery that could reshape approaches to regenerative medicine and bone repair, researchers have found that human stem cells can be prompted to begin turning into bone cells simply by squeezing through narrow spaces.

    The study suggests that the physical act of moving through tight, confining spaces, like those between tissues, can influence how stem cells develop. This could open new possibilities for engineering materials and therapies by guiding cell behaviour using physical, rather than chemical, signals.

    The research was led by Assistant Professor Andrew Holle (Biomedical Engineering and the NUS Mechanobiology Institute) and was published on 8 May 2025 in the journal Advanced Science.

    Asst Prof Holle leads the Confinement Mechanobiology Lab at NUS. His lab studies how physical constraints – especially the tight spaces cells encounter as they move – affect how cells behave, function, and develop. While most earlier research in this area focused on cancer and immune cells, his team is among the first to explore how these forces affect stem cells, with the aim of applying their findings to future therapies.

    Mechanical ‘memory’

    The researchers focused on a type of adult stem cell known as a mesenchymal stem cell, or MSC. These cells are found in bone marrow and other tissues and are known for their ability to develop into bone, cartilage, and fat cells. Because of these properties, MSCs are widely used in research on tissue repair and regeneration.

    “To test how physical forces influence stem cell fate, we developed a specialised microchannel system that mimics the narrow tissue spaces cells navigate in the body,” said Asst Prof Holle.

    They found that when MSCs squeezed through the smallest channels (just three micrometres wide), the pressure caused lasting changes to the cells’ shape and structure. These cells showed increased activity in a gene called RUNX2, which plays a key role in bone formation. Even after exiting the channels, they retained this effect – suggesting they carry a kind of mechanical ‘memory’ of the experience.

    “Most people think of stem cell fate as being determined by chemical signals,” Asst Prof Holle said. “What our study shows is that physical confinement alone – squeezing through tight spaces – can also be a powerful trigger for differentiation.”

    While traditional methods of directing stem cells rely on chemical cues or growing them on stiff or soft materials, Asst Prof Holle’s team believes confinement-based selection may offer a simpler, cheaper, and potentially safer alternative. “This method requires no chemicals or genetic modification – just a maze for the cells to crawl through,” he said. “In theory, you could scale it up to collect millions of preconditioned cells for therapeutic use.”

    The researchers say their findings could help improve the design of biomaterials and scaffolds used in bone repair, by creating physical environments that naturally encourage the right kind of cell development. “By tuning the mechanical properties of materials, we might be able to steer stem cells more reliably toward the cell types we want,” Asst Prof Holle said.

    Next steps

    The approach could one day be used to speed up recovery from bone fractures or enhance the effectiveness of stem cell therapies. “We’d like to test whether preconditioned cells that have gone through this mechanical selection are better at promoting healing when introduced at injury sites,” Asst Prof Holle said. “That’s one of the next steps.”

    Beyond bone repair, the research may have broader implications. MSCs are also known to migrate toward tumours, and the research team is interested in whether mechanically preconditioned cells might be better equipped to move through dense tumour tissue – a challenge that has limited the success of many current cell therapies.

    The group is also exploring whether the technique could apply to more potent stem cell types, such as induced pluripotent stem cells (iPSCs), which can develop into almost any tissue in the body.

    “We suspect that confinement plays a role even in embryonic development,” Asst Prof Holle said. “Cells migrating through crowded environments early in life are exposed to mechanical stress that could shape their fate. We think this idea has potential far beyond just MSCs.”

    Reference: Gao X, Li Y, Lee JWN, et al. Confined migration drives stem cell differentiation. Adv Sci. 2025;12(21):2415407. doi: 10.1002/advs.202415407

    This article has been republished from the following materials. Note: material may have been edited for length and content. For further information, please contact the cited source.


  • Massive underwater eruption suffocated marine life

    In 2022, the underwater Hunga volcano exploded, blasting ash 37 miles into the sky – the largest volcanic plume ever captured by satellites. What followed offered researchers a rare chance to see how the ocean floor responds to sudden, massive disruption.

    Months later, scientists from the University of Oregon, along with teams from the University of Rhode Island and Western Washington University, headed out on a research cruise.


    Onboard, undergraduate student Marcus Chaknova discovered something unexpected: thick layers of volcanic ash coating the seafloor. The ash had suffocated deep-sea ecosystems that rely on fragile chemical exchanges to survive.

    Underwater eruptions usually go unseen

    “This was an extremely rare opportunity,” Chaknova said. “Observing the mass movement of underwater sediment is something that hasn’t been studied much.”

    Now a graduate student in Earth Sciences at the University of Oregon, Chaknova led a research project analyzing what the underwater eruption left behind.

    Working with Professor Thomas Giachetti and 16 other experts from around the world, he became the lead author on a study that looked at how volcanic ash travels – and what it does to life underwater.

    “We had scientists from every single time zone you could think of,” Chaknova said. His project needed expertise from the fields of marine biology, geochemistry, and micropaleontology.

    Tracking ash from the volcano

    The first step was to confirm that the underwater ash had indeed come from Hunga, located about 40 miles from Tonga’s main island in the South Pacific.

    After an eruption, it can take weeks or even months for ash to fall through the water and settle on the ocean floor. As it drifts, wind and currents push it farther from the eruption site.

    “One grain of sediment will take weeks or months to reach the bottom of the ocean. It’s like a leaf falling from a tree. Because of the wind, it might end up somewhere completely different,” Giachetti explained.

    Back in the lab, Chaknova matched the ash samples to those found near the volcano. The grains were varied – some jagged and sharp, others rounded and smooth.

    In some places, the sediment was more than a meter thick. Most of it was made up of very fine particles, around the width of a human hair.

    What the eruption left behind

    Chaknova discovered that much of the ash came from the volcano’s caldera walls and was carried away by fast-moving underwater flows – something like underwater avalanches.

    Those flows were so powerful they damaged submarine cables and carved small canyons into the seafloor. The researchers even used the timing of power loss from those broken cables to calculate how fast the ash surged.
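    The cable-timing method is essentially distance-over-time arithmetic: if two cables at known distances from the volcano lose power at known times, the flow's average speed between them follows directly. The positions and times below are invented to illustrate the calculation and are not the study's measured values.

```python
# Illustrative only: hypothetical cable positions and break times,
# not the values measured after the Hunga eruption.
cable_breaks_km = [15.0, 50.0]   # distance of each break from the volcano (km)
break_times_min = [10.0, 40.0]   # minutes after the eruption when power was lost

distance_km = cable_breaks_km[1] - cable_breaks_km[0]
elapsed_h = (break_times_min[1] - break_times_min[0]) / 60
speed_kmh = distance_km / elapsed_h
print(f"average flow speed ~ {speed_kmh:.0f} km/h")  # 35 km in 0.5 h -> 70 km/h
```

    The real analysis is more involved (the flow decelerates and spreads), but the timestamped cable failures act as accidental speed gates on the seafloor.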

    Using computer models, Chaknova plans to simulate how the ash moved and where it went. Giachetti said this research could change how scientists think about sediment movement in the oceans.

    Deep-sea life suffocated

    The underwater eruption didn’t just leave a geological footprint. It disrupted entire ecosystems.

    In the deep sea, where sunlight doesn’t reach, life depends on chemosynthesis – organisms use chemicals like methane or ammonia from hydrothermal vents, instead of sunlight, to produce energy.

    Roughly 90% of marine life lives on the seafloor, according to the World Wide Fund for Nature. After the eruption, much of that life was buried in thick sediment.

    Some creatures like worms and anemones can survive brief burials, but this sudden wave of ash was too much. Many deep-sea species are suspension feeders. They grab tiny particles of food from the water and filter them through their gills.

    “With all the displaced sediment, these organisms are only grabbing sediment,” Chaknova said. “That’s going to clog their gills, it’s going to clog their intestines, and that’s going to have a dramatic effect on their ability to create energy.”

    Fallout from the underwater eruption

    Chaknova’s early findings also show that ash made it all the way to Tonga’s coral reefs. At first, it caused a short-lived plankton bloom at the surface.

    But as the ash settled, it threatened coral ecosystems that support larger marine life. When coral suffers, everything above it in the food chain is affected.

    For Tonga, the eruption of this underwater volcano had more than environmental consequences. It also affected livelihoods.

    Fishing is a way of life in Tonga. According to the World Bank, about 82% of households rely on reef fishing in some way for income. Marine tourism accounts for more than 7% of Tonga’s GDP.

    “Although this eruption occurred on the seafloor, there is a chain of both positive and negative effects,” Chaknova said.

    “The negative effects go farther than just losing power or Wi-Fi from submarine cables. This is some people’s livelihood. They need fish for food. Fishing is incredibly important for the economic and food security of Tonga.”

    A warning for deep-sea mining

    The study also has bigger implications. As the world turns toward clean energy, demand is growing for metals like copper and cobalt – many of which sit in potato-sized nodules beneath the ocean floor.

    Private companies have approached small Pacific nations, including Tonga, with offers to mine these resources.

    “The area where we collected the sediment is within the Kingdom of Tonga, and we found that they are very rich in minerals,” Chaknova said. “A lot of companies are interested in collecting these minerals, and so this area that belongs to the Kingdom of Tonga will be up for bid, in coming years, for deep-sea mining.”

    While commercial mining hasn’t started yet, researchers are urging caution. The plumes of sediment created by mining could be just as harmful as those created by volcanic eruptions – clogging gills, burying habitats, and destroying fragile ecosystems.

    Chaknova’s work provides some of the only real-world data we have on what that kind of disturbance could look like.

    The research gives scientists and policymakers a better understanding of what’s at stake – and what might be lost if we move forward without enough knowledge.

    The full study was published in the journal Geochemistry, Geophysics, Geosystems.

    Image Credit: NOAA




  • See tomorrow’s space rovers at Prince’s Collection

    The Prince’s Palace is transforming its prestigious Car Collection into a portal to infinity, playing host to four futuristic rovers that redefine the boundaries between terrestrial motoring and space exploration.

    Until 2 September, visitors to the Car Collection of H.S.H. the Prince of Monaco can contemplate humanity’s future via four exceptional rover vehicles, the fruits of a joint venture between Venturi Space and Venturi Astrolab. The temporary exhibition transforms the showcase for prestigious cars into a space innovation laboratory, where each rover tells a story of interplanetary conquest.

    Flex: lunar ambassador for 2027

    The Flex astromobile, the jewel in the crown of the ephemeral collection, is preparing to set foot, or rather wheel, on the moon in 2027 via SpaceX. Shortlisted by NASA, this technological marvel features Monegasque innovations: hyper-deformable wheels designed in Switzerland and high-performance batteries developed in the Principality.

    © Michael Alesi / Prince’s Palace

    From the Moon to Mars: unlimited ambition

    The Flip rover will reach the lunar South Pole in 2026, while its Martian counterpart will pave the way for colonisation of the Red Planet. Mona Luna embodies European excellence, with a Franco-Monegasque design for the European Space Agency and CNES, the French national space agency.

    Venturi Space: electric meets cosmic

    Under the visionary leadership of Gildo Pastor, Venturi started a strategic revolution in 2021, shifting its focus from terrestrial automotive innovation to the space industry. This metamorphosis is a perfect illustration of Monegasque DNA: transforming boldness into excellence, and innovation into legend.


    The unique exhibition reveals how Monaco, a Principality with a limited terrestrial footprint, nurtures infinite ambitions, writing the first pages of European space history.

    Prince of Monaco’s Car Collection, 54 route de la Piscine – Port Hercule.



  • SpaceX launches 28 Starlink satellites from Cape Canaveral

    SpaceX launched 28 more Starlink satellites for its low Earth orbit constellation on Tuesday (July 8).

    A Falcon 9 rocket lifted off with the broadband internet units (Group 10-28) at 4:21 a.m. EDT (0821 GMT) from Space Launch Complex 40 at Cape Canaveral Space Force Station in Florida. About nine minutes later, the satellites reached space and, 50 minutes after that, were deployed into orbit.

    “Deployment of 28 Starlink satellites confirmed,” SpaceX announced on the X social media network.

    A SpaceX Falcon 9 rocket carrying 28 Starlink satellites launches from Cape Canaveral Space Force Station in Florida on Tuesday, July 8, 2025. (Image credit: SpaceX)


  • Astronomers witness a star’s failed death

    In a cosmic twist worthy of a sci-fi thriller, astronomers have just caught a massive star in the act of dying, not with a bang, but with a stifled X-ray-powered whisper.

    Using a global network of telescopes, including the International Gemini Observatory and the SOAR Telescope in Chile, astronomers have observed the closest-ever example of a mysterious cosmic event called a fast X-ray transient (FXT). This particular flash, named EP 250108a, was spotted in January 2025 by the newly launched Einstein Probe, and it’s helping astronomers rewrite the story of how stars die.

    FXTs are brief, powerful bursts of X-rays from distant galaxies that last just seconds to hours. They’ve puzzled astronomers for years, until now. EP 250108a, located a mere 2.8 billion light-years away (close by cosmic standards), gave scientists their best look yet.

    Once the Einstein Probe raised the alarm, telescopes around the world sprang into action. Gemini South’s FLAMINGOS-2 and Gemini North’s GMOS captured the event in infrared and optical light, revealing the glowing aftermath of a supernova.

    A gamma-ray burst that never escaped

    But this wasn’t your typical supernova. Instead of blasting jets of energy into space like a gamma-ray burst (GRB), this star’s jets got stuck inside, heating up the star’s outer layers and releasing X-rays in the process. Think of it as a cosmic pressure cooker.

    “This FXT supernova is nearly a twin of past supernovae that followed GRBs,” said Rob Eyles-Ferris, lead author of one of two companion studies. “But here, the jets failed to escape.”

    Gemini North and Gemini South Capture the Fading Light of SN 2025kg
    This sequence of images shows the fading light of the supernova SN 2025kg, which followed the fast X-ray transient EP 250108a, a powerful blast of X-rays that was detected by Einstein Probe (EP) in early 2025. Using a combination of telescopes, including the International Gemini Observatory, funded in part by the U.S. National Science Foundation and operated by NSF NOIRLab, and the SOAR telescope at Cerro Tololo Inter-American Observatory in Chile, a Program of NSF NOIRLab, a team of astronomers studied the evolving signal of EP 250108a/SN 2025kg to uncover details about its origin. Their analysis reveals that fast X-ray transients can result from the ‘failed’ explosive death of a massive star.

    Credit:
    International Gemini Observatory/NOIRLab/NSF/AURA
    Acknowledgment: PI: J. Rastinejad (Northwestern University)
    Image processing: J. Miller & M. Rodriguez (International Gemini Observatory/NSF NOIRLab), M. Zamani (NSF NOIRLab)

    That failure turned out to be a scientific jackpot. By watching the event unfold over days and weeks, astronomers confirmed that the FXT was tied to a Type Ic broad-lined supernova, SN 2025kg, likely from a star 15–30 times the mass of our Sun.

    “The X-ray data alone cannot tell us what phenomena created the FXT,” says Jillian Rastinejad, PhD student at Northwestern University and lead author of the second companion paper. “Our optical monitoring campaign of EP 250108a was key to identifying the aftermath of the FXT and assembling the clues to its origin.”

    Clues in the fading light

    After the initial X-ray flash from EP 250108a, astronomers noticed the area getting brighter in optical light for a few weeks before fading. The light also showed special patterns, called broad absorption lines, that revealed the FXT was linked to a powerful kind of explosion known as a Type Ic broad-lined supernova.

    To learn more, the team used the SOAR Telescope in Chile to observe the event in near-infrared light. These observations helped them estimate how bright the explosion got and what kind of star caused it.

    Their best guess? A massive star weighing 15 to 30 times more than our Sun, a true cosmic heavyweight that ended its life with a dramatic, if slightly muffled, bang.

    “Our analysis shows definitively that FXTs can originate from the explosive death of a massive star,” says Rastinejad. “It also supports a causal link between GRB-supernovae and FXT-supernovae, in which GRBs are produced by successful jets and FXTs are produced by trapped or weak jets.”

    Failed jets may be more common than successful ones

    FXTs are now being detected several times a month, while GRBs are rare, only about once a year. This suggests that “failed” jet explosions like EP 250108a may be far more common than their flashier cousins.

    “This discovery opens a new window into how massive stars die,” said Rastinejad. “It shows that even when a star’s final act is muted, it still has a powerful story to tell.”

    With the upcoming Vera C. Rubin Observatory set to begin its Legacy Survey of Space and Time, astronomers expect to uncover even more of these hidden stellar dramas. And thanks to the rapid-response power of observatories like Gemini, we’ll be ready to catch them in the act.

    Journal References:

    1. Eyles-Ferris, R. A. J., et al. The kangaroo’s first hop: the early fast cooling phase of EP250108a/SN 2025kg. The Astrophysical Journal Letters. DOI: 10.48550/arXiv.2504.08886
    2. Rastinejad, J. C., et al. EP 250108a/SN 2025kg: Observations of the most nearby Broad-Line Type Ic Supernova following an Einstein Probe Fast X-ray Transient. The Astrophysical Journal Letters. DOI: 10.48550/arXiv.2504.08889


  • Optics & Photonics News – Nanostructures Bring Cloud Physics Down to Earth

    [Image: Stockbyte/ Getty Images]

    Scientists in Finland and Germany have developed a new kind of nanostructure that mimics the scattering and absorptive properties of clouds to either cool or heat objects exposed to sunlight (Adv. Mater., doi: 10.1002/adma.202501080). They say that the structure’s low emissions at infrared wavelengths make it well suited to military and materials applications requiring invisibility to thermal imaging.

    Passive cooling

    Objects can be passively cooled by shielding them from solar radiation and insulating them from their environment while enhancing emission at infrared wavelengths, allowing heat to be transferred from the objects to the cold of outer space. However, this infrared emission renders such objects susceptible to thermal imaging. (White paints, which scatter light diffusely across all wavelengths, also emit significant thermal radiation.)

    An alternative approach to passive cooling is to exploit the thermal physics of white clouds. These clouds cool air via backscattering—scattering incoming sunlight back in the direction it came from. Because they do so across a broad range of visible and infrared wavelengths, the clouds avoid heating up and emitting thermal radiation. Conversely, when clouds are located underneath a layer of aerosols, their backscattered light is absorbed by the particles, turning the clouds gray and heating the air around them.

    In the latest work, Mady Elbahri, Aalto University, Finland, and colleagues have taken inspiration from these processes to develop plasmonic metasurfaces that can modify both the optical and thermal properties of materials exposed to the sun’s rays. Their idea was to tailor the characteristics of silver nanoparticles lying on a substrate so that they backscatter light across visible and near-infrared wavelengths.

[Graph: white versus gray cloud cooling and heating]

    Researchers have built a new type of metasurface that mimics the cooling and heating effects of clouds while remaining thermally camouflaged. [Image: Mady Elbahri / Aalto University]

    Mimicking clouds

The researchers fabricated a metasurface to mimic a white cloud by first depositing a very thin film of silver onto a layer of silicon and then carrying out vacuum annealing at 650°C for 50 minutes. After producing a certain size and distribution of silver islands atop the silicon, they deposited a thicker (120-nm) layer of silver on top. This upper reflecting layer not only made the surface more stable but also converted what would otherwise have been forward-scattered light into backscattered light.

    Elbahri and colleagues found that they could control the size of the silver nanoparticles by varying the thickness of the initial film. Thicker films yielded bigger and more varied particles, which pushed backscattering to longer wavelengths. With 50-nm-thick films, the team achieved backscattering across much of the visible and near-infrared spectrum, preventing heat from building up and so minimizing emissions in the infrared.

    To reproduce the effects of a gray cloud, the scientists then deposited a plasmonic nanocomposite—an aluminium oxide matrix containing randomly distributed copper nanoparticles acting like aerosols—on top of the existing metasurface. This extra layer absorbed almost all of the light backscattered by the silver, changing the device’s original white color to gray.

To confirm that their metasurface switched from mimicking white to gray clouds, the researchers placed the device on a block of foam, exposed it to simulated sunlight and measured its temperature with a thermocouple. They found that without the nanocomposite, their metasurface reduced the temperature of the underlying substrate by 10°C. With the additional layer, in contrast, the metasurface increased the relative temperature by 10°C, even surpassing the absorptive capacity of black surfaces.

    Toward energy efficiency and beyond

    Elbahri and colleagues argue that their technology could be exploited in adaptive coatings for energy efficiency in buildings, solar systems and textiles, as well as for military thermal camouflage. Having already fabricated different samples to demonstrate the mimicking of white and gray clouds, they say their next step is to build a device that can switch between the two states using, for example, electrochromic or phase-changing layers.

They add that while the scattering efficiency of the white sample dropped slightly after remaining in storage, that of the gray sample remained almost unchanged, demonstrating, they say, “the metasurfaces’ practical stability for consumer applications.”


  • Summer In The Northern Hemisphere Will Be 15 Minutes Shorter Than Last Year

    Put down your beach volleyball and take off your Crocs, summer fans, as this year your favorite season will be a little bit shorter than usual.

As reliable as the seasons are, they do not always last the same amount of time. The seasons, as you are probably aware, are the result of the Earth’s axis being tilted 23.5 degrees from the perpendicular to our orbital plane. When your hemisphere is tilted toward the Sun it receives more sunlight, and that’s summer. When it is tilted away it receives less, and that’s your winter.

Some believe that the seasons are caused by changes in the Earth’s distance from the Sun. While this is incorrect, the Earth’s orbit can have a small effect on the length of the seasons. As the Earth moves between aphelion, the farthest point from the Sun in its orbit, and perihelion, its closest approach, its orbital velocity changes.

    “The imaginary line joining a planet and the Sun sweeps out – or covers – equal areas of space during equal time intervals as the planet orbits. Basically, the planets do not move with constant speed along their orbits. Instead, their speed varies so that the line joining the centers of the Sun and the planet covers an equal area in equal amounts of time,” NASA explains of Kepler’s second law. “The point of nearest approach of the planet to the Sun is called perihelion. The point of greatest separation is aphelion, hence by Kepler’s second law, a planet is moving fastest when it is at perihelion and slowest at aphelion.”

As aphelion currently takes place during summer in the Northern Hemisphere, and perihelion during southern summer, the Northern Hemisphere currently enjoys around four more days of summer than the Southern Hemisphere. This won’t be the case forever. Because our calendars don’t perfectly match our orbit around the Sun, the timing of aphelion and perihelion shifts over the years. In 1,000 years, summers in the Northern Hemisphere will be around six hours longer than they are today, per Timeanddate.com.
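Kepler’s second law pins down exactly how much faster the Earth moves at its closest approach: conservation of angular momentum gives v_perihelion × r_perihelion = v_aphelion × r_aphelion. A quick sketch, using rounded publicly quoted distances purely for illustration:

```python
# Conservation of angular momentum (Kepler's second law) fixes the ratio of
# Earth's orbital speeds at perihelion and aphelion: v_p * r_p = v_a * r_a.
# Distances are rounded public values, used here only for illustration.
r_perihelion = 147.1e6  # km, reached in early January
r_aphelion = 152.1e6    # km, reached in early July

speed_ratio = r_aphelion / r_perihelion  # equals v_perihelion / v_aphelion
print(f"Earth moves about {100 * (speed_ratio - 1):.1f}% faster at perihelion")
```

That few-percent speed difference, accumulated over months, is what stretches the northern summer (spent near slow-moving aphelion) by roughly four days relative to the southern one.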

    While that is the long-term trend, this year’s summer will be particularly short compared to last year. Timeanddate.com explains that due to the complex motions of the Earth and its axis of rotation, as well as the pull of the Moon, Jupiter, and other bodies of the Solar System, the summer will last 93 days, 15 hours, 37 mins in the Northern Hemisphere from beginning to end. This is a full 15 minutes shorter than the summer of 2024, which lasted 93 days, 15 hours, 52 mins. 

    While you may feel robbed of your summer, next year you will enjoy 93 days, 15 hours, and 40 mins of Croc season, or an extra three minutes to get a tan. And at least you aren’t in the Southern Hemisphere, where the summer remains a full four days shorter than in the north.
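The quoted season lengths are easy to sanity-check with ordinary date arithmetic, for instance using Python’s `timedelta`:

```python
from datetime import timedelta

# Season lengths as quoted by Timeanddate.com for the Northern Hemisphere.
summer_2024 = timedelta(days=93, hours=15, minutes=52)
summer_2025 = timedelta(days=93, hours=15, minutes=37)
summer_2026 = timedelta(days=93, hours=15, minutes=40)

print(summer_2024 - summer_2025)  # this year is 15 minutes shorter than last
print(summer_2026 - summer_2025)  # next year gains 3 minutes back
```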

    [H/T: Timeanddate.com]


  • [INTERVIEW] NASA Return to the Moon missions powered by Additive Manufacturing

NASA’s metal additive manufacturing (AM) efforts have entered a new phase of maturity as engineers shift attention toward extremely large, feature-dense components. According to Paul Gradl, Principal Engineer at NASA’s Marshall Space Flight Center, the agency is now 3D printing rocket engine hardware that approaches two meters in diameter and nearly three meters in height, using directed energy deposition (DED). These parts feature complex internal geometries with “thousands of cooling passages” for advanced propulsion systems built for commercial space partners and potentially for NASA’s super-heavy-lift Space Launch System (SLS). The SLS rocket will power Artemis, the agency’s return-to-the-Moon program, and enable the establishment of an off-world base for future missions to Mars.

    Gradl explained that laser powder bed fusion (LPBF), once at the frontier of AM innovation, has become sufficiently stable for routine use. “We’ve gotten to a semi-normalisation with laser powder bed fusion. It’s something that we don’t even think twice about using now,” he said, noting that while nuance and qualification efforts persist, the real frontier now lies in large-scale DED and new materials.


3D printing timelines have expanded dramatically. Whereas typical metal AM projects see parts printed in days or weeks, these next-generation components may require months of continuous build time, sometimes approaching a full year when design and setup are included. The scale and duration present new challenges in thermal management, error detection, and build interruptions. “We have hard lessons learned regarding build interruptions from powder bed fusion,” Gradl noted, referencing a combustion chamber failure and highlighting the need for robust early error detection.

Full Scale RS25 Additive Nozzle Liner printed by DM3D Technologies. Photo via NASA

Unlike PBF, which is sensitive to build pauses, DED processes face “almost continuous build interruptions with every layer,” according to Gradl. As parts cool between successive toolpaths, some taking hours per pass, planned interruptions become inherent to the workflow. The use of several tonnes of unmelted powder per build requires staged powder reclamation and restarts, making build interruptions “a lot more tolerable” than in powder bed processes.

Material science has played a pivotal role in this transition. NASA’s GRX-810, a nickel-cobalt-chromium alloy designed for high-temperature, high-strength use, has now entered commercial availability. “We get 1,000 times better creep at temperatures up to about 1100°C,” Gradl said, positioning the material as “fundamentally changing the way we design” long-life, high-temperature propulsion systems. GRX-810 incorporates oxide dispersion strengthened (ODS) feedstocks, coating metal powders with compounds like yttria to create materials that withstand ultra-high temperatures.

Speed in alloy qualification has also improved. While GRX-810 took roughly 18+ months from simulation to build, a new oxygen-resistant alloy (internally referred to as ORCAlloy, or Oxygen Resistant Compatible alloy) reached build readiness in just three months. Gradl attributes this rapid pace to advances in computational tools such as Integrated Computational Materials Engineering (ICME), proprietary feedstock production techniques, and increased machine stability.

    The acceleration has moved the industry from constrained options, “any colour as long as it’s black,” as Gradl joked, to bespoke alloys tailored for specific propulsion environments. While AM was once synonymous with Ti-6Al-4V and stainless steel, the materials palette now includes purpose-designed formulations inaccessible to processes such as casting or forging.

    According to Gradl, the industry is approaching a significant inflection point in materials for extreme environments. “In 10 years, we will probably see a different landscape of materials than we do today,” Gradl said. “We’re making alloys that are required for high temperature, high pressure, extreme environments, and taking advantage of the process for that.”

    A core challenge remains cultural rather than technical. Gradl noted that organisations are more comfortable adopting modified versions of known alloys, such as Inconel 718, than accepting new chemistries, even when the performance is equivalent or superior. “There’s still a lot of psychology in how materials are being adopted,” he said. “If it has the same name, companies and agencies are a lot more comfortable with it.”

    However, Gradl cautioned against premature claims of qualification. “There’s a different definition of qualification,” he explained. “We want to see a material that is really matured—not just understanding the build parameters but all the subsequent properties and process sensitivities… how it’s going to machine, how it’s going to weld.” NASA intends to publish a framework later this year for materials qualification in aerospace contexts, aiming to standardise what constitutes a mature material.

    Gradl also acknowledged broader pressures, including supply chain risk and critical materials. While avoiding specific geopolitical commentary, he noted that NASA is using ICME to explore reductions in the content of costly or scarce refractory metals. “If I had a material like a C-103 that’s 80 or 90 percent niobium, can I reduce that to 40 or 50 percent and add other materials?” he said. The agency is simulating thousands of potential chemistries to identify equivalent performance with more readily available elements, an approach that could reduce reliance on supply-constrained inputs.
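The screening idea behind that substitution question can be illustrated with a toy sketch: enumerate candidate chemistries under a cap on the scarce element and keep those a property model predicts will match a baseline alloy. To be clear, the linear "model", the element set, and every coefficient below are invented placeholders for illustration; they are not real materials data and not NASA's actual ICME workflow.

```python
import itertools

# Toy composition screen: sweep candidate chemistries and keep those a
# (made-up) property model predicts will match a baseline alloy.
def predicted_strength(frac):
    # Hypothetical per-fraction contributions in MPa -- placeholder values only.
    coeffs = {"Nb": 400.0, "Hf": 420.0, "Ti": 250.0}
    return sum(coeffs[el] * x for el, x in frac.items())

# Baseline loosely patterned on a niobium-dominated C-103-like mix.
baseline = predicted_strength({"Nb": 0.89, "Hf": 0.10, "Ti": 0.01})

grid = [i / 20 for i in range(21)]  # mass fractions 0.00 .. 1.00 in 5% steps
candidates = []
for nb, hf in itertools.product(grid, grid):
    ti = round(1.0 - nb - hf, 2)    # remainder assigned to the third element
    if 0.0 <= ti <= 1.0 and nb <= 0.5:  # cap costly niobium at 50%
        if predicted_strength({"Nb": nb, "Hf": hf, "Ti": ti}) >= baseline:
            candidates.append({"Nb": nb, "Hf": hf, "Ti": ti})
```

Real ICME tools replace the one-line model with physics-based simulation of thousands of chemistries, but the shape of the search, constrain the scarce input and look for property-equivalent substitutes, is the same.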


    NASA pushes additive manufacturing frontiers with multi-material propulsion systems

    NASA’s additive innovation is not limited to material discovery. Its invention of the year was a multi-material, multi-process thrust chamber liner assembly, produced using a combination of powder bed fusion, laser-directed energy deposition (DED), cold spray, and composite filament winding. This hybrid structure includes both axial and radial transitions between materials, allowing for locally tuned thermal and mechanical properties.

    “We were able to combine three different materials and several different processes,” Gradl said. For example, high thermal conductivity copper alloys are used on interior surfaces, while superalloys with high strength-to-weight ratios are applied externally. A polymer matrix composite overwrap, typically carbon fibre, delivers a 40 percent weight reduction and structural reinforcement.

    Gradl emphasised that AM is not displacing traditional processes indiscriminately. “We’re only replacing [them] where it makes sense, where it has an economic advantage, or higher performance,” he said. Increasingly, additive serves as a complementary technology that enables designs and integration steps previously not feasible.

    Yet multi-material builds present unique challenges. Engineers must manage differences in thermal expansion, residual stress, and phase compatibility. “You can create conditions where I might embrittle the material, or have low ductility and actually crack the material,” Gradl warned. “There’s a lot of fundamental material science that we need to make sure we have the correct folks involved, it’s not just a crazy dream of a couple of engineers (although it might start like that).”

    NASA’s experience points to a clear message: the path to high-performance, flight-ready components is paved with deep material science, multi-process integration, deep process understanding and control, and rigorous qualification, not shortcuts.

An industrial-scale DED nozzle. Photo via NASA

    Insights into NASA’s advanced approach to additive manufacturing with post-processing, data sharing, and software breakthroughs

    “You will not be successful in metal additive manufacturing unless you truly understand post-processing,” said Gradl. This includes not only machining and heat treatment but also inspection, powder removal, and defect mitigation across build stages.

    In the case of large-format DED, many traditional inspection methods are no longer feasible. “CT scanning has come a long way for additive, but certainly at large scale, we don’t have access to scanners of that size,” Gradl said. Instead, NASA is exploring a mix of in-process melt pool monitoring, ultrasonic testing, and adapted X-ray techniques. The agency is also evaluating powder cleaning processes, such as vacuum cycle nucleation, to mitigate risks associated with residual powder and foreign object debris.

    Design for additive manufacturing (DfAM) is undergoing a corresponding shift. Beyond overhangs and build orientation, engineers now integrate post-processing constraints into early-stage designs. For example, complex parts must be inspectable, machinable, and supportable in fixturing. “DfAM is much more than just the build side,” Gradl said. “We need to design for inspection, powder removal, and machining.”

    Topology optimisation and generative design tools are also being employed more widely, but with caution. “You need to have a good handle on your environmental loads,” Gradl noted, particularly in aerospace applications where both input loads and material properties carry significant uncertainty. Without that rigour, designs risk overcomplexity and fatigue issues that outweigh marginal performance gains.

    “One of the things holding us back from more complex designs is the lack of validated fatigue, creep, and size-dependent material properties,” Gradl said. Despite widespread recognition of differences in build orientation, size effects in thin-walled or small-scale features are less well understood.

    NASA’s influence extends beyond its own launch systems. Through Space Act Agreements, the agency is advising startups and established firms alike on process education and qualification frameworks. This includes teaching designers to account for feedstock variation, gas flow dynamics, and machine-specific quirks that can affect material performance. “Even where I position my parts on the build plate could ultimately affect my material property,” said Gradl.

    To accelerate tech transfer, Gradl emphasised the need for more open collaboration around data. “It would be nice if industry and government could start sharing material properties and develop more publicly accessible databases,” he said. Proprietary secrecy around allowables, he warned, may be inhibiting broader industry growth.

    Across both propulsion and satellite systems, additive manufacturing is now considered a strategic enabler. “I can really optimize for weight, and I have to be able to control these satellite systems,” Gradl explained, referencing the growing role of extreme temperature materials and GRX-810 for in-space propulsion.

    The next wave of capability may come not from hardware but software and control. NASA is exploring laser beam shaping, green and blue wavelength lasers for specific alloys, and fine control over energy input to achieve tailored material properties and geometries. “How do I really control the laser to make unique geometries that I couldn’t make with standard parameters?” Gradl asked. “These will help introduce more complexity and potentially even unique material properties.”

    “We’re past the hype,” said Gradl. “Companies are trying to make it real, and they’re running into cultural psychology roadblocks.” The problem is not necessarily a lack of capability but how organisations evaluate risk, define qualification, and decide whether to adopt new material systems or part designs. Even when additive offers a clear performance gain, institutional reluctance to depart from established materials or certification paths can slow progress.

    As NASA continues to support additive implementation across industry, Gradl sees the role of communication and expectation-setting as critical. “We’re in a unique spot,” he said. “We can openly talk about what’s working and what’s coming, and help companies think about how they might want to use multi-material additive manufacturing or new processes.” Public-private partnerships will be key to translating laboratory-scale innovation into operational advantage.



  • Fat factory helped Neanderthals survive 125,000 years ago – mindmatters.ai

    1. Fat factory helped Neanderthals survive 125,000 years ago  mindmatters.ai
    2. 125,000-year-old ‘fat factory’ run by Neanderthals discovered in Germany  CNN
    3. Neanderthals Ran “Fat Factories” 125,000 Years Ago  Universiteit Leiden
    4. Fig. 5. Spatial distribution of faunal remains at NN2. distribution of…  researchgate.net
    5. Archaeologists Were Digging Into a Hill—and Stumbled Upon a 125,000-Year-Old Factory  Popular Mechanics
