Category: 7. Science

  • Scientists Rewire Immune Cells To Supercharge Cancer-Fighting Power

    Scientists Rewire Immune Cells To Supercharge Cancer-Fighting Power

    A new study shows that blocking one protein can reprogram immune cells to fight cancer more powerfully, offering hope for next-generation therapies. Credit: Shutterstock

    Blocking a single protein boosts T cell metabolism and tumor-fighting strength. The discovery could lead to next-generation cancer immunotherapies.

    Scientists have identified a strategy to greatly enhance the cancer-fighting abilities of the immune system’s T cells. By inhibiting a protein known as Ant2, they succeeded in reshaping how these cells produce and use energy, effectively redesigning their internal power systems.

    This adjustment makes T cells stronger, more durable, and more efficient at targeting tumors. The discovery opens new possibilities for therapies that reinforce the body’s natural defenses, offering a more precise and effective form of cancer treatment.

    International team leads breakthrough

    A recent study suggests the potential for a new wave of cancer therapies that rely on training the immune system to perform more effectively. The work, led by PhD student Omri Yosef and Prof. Michael Berger at the Hebrew University Faculty of Medicine, brought together collaborators including Prof. Magdalena Huber of Philipps University of Marburg and Prof. Eyal Gottlieb of the University of Texas MD Anderson Cancer Center. Their findings show that carefully adjusting immune cell metabolism significantly boosts the cells’ ability to destroy tumors.

    Central to the research is a striking observation: when T cells—the immune system’s front-line fighters—are compelled to change the way they generate and manage energy, they become far more adept at recognizing and eliminating cancer cells.

    Tumor Size Under Different Treatment Conditions
    This image displays tumors collected at the end of the experiment from three experimental groups: untreated controls, OT-I CD8+ T cell recipients without additional treatment, and those receiving OT-I CD8+ T cells followed by ATR therapy. The 5 mm scale bar highlights visible differences in tumor volume. Growth was measured daily starting on day 17 post-implantation to evaluate treatment effectiveness. Credit: Omri Yosef and the authors

    “By disabling Ant2, we triggered a complete shift in how T cells produce and use energy,” explains Prof. Berger. “This reprogramming made them significantly better at recognizing and killing cancer cells.” In simpler terms, blocking this protein forces the immune cells to adapt their metabolism, turning them into stronger, faster, and more aggressive cancer fighters.

    Mitochondria as cellular engines

    Published in Nature Communications, the study focuses on the mitochondria—the “metabolic hub” of cells. By deliberately disrupting a specific energy pathway inside T cells, the team essentially rewired the cells’ engines, creating a state of heightened readiness and potency. The altered T cells exhibited greater stamina, faster replication, and sharper targeting of cancerous threats.

    Perhaps most importantly, the researchers showed that this metabolic rewiring can be triggered not only through genetic modifications but also with drugs—opening the door for potential clinical applications.

    This discovery is part of a growing movement in cancer immunotherapy that focuses not only on guiding the immune system but also on upgrading its inner machinery. While more studies and clinical trials are needed, the implications of this breakthrough are promising: new treatments that harness the body’s own defenses, fine-tuned for peak performance.

    “This work highlights how deeply interconnected metabolism and immunity truly are,” says Prof. Berger. “By learning how to control the power source of our immune cells, we may be able to unlock therapies that are both more natural and more effective.”

    Reference: “Metabolic reprogramming driven by Ant2 deficiency augments T Cell function and anti-tumor immunity in mice” by Omri Yosef, Leonor Cohen-Daniel, Oded Shamriz, Zahala Bar-On, Wajeeh Salaymeh, Amijai Saragovi, Ifat Abramovich, Bella Agranovich, Veronika Lutz, Joseph Tam, Anna Permyakova, Eyal Gottlieb, Magdalena Huber and Michael Berger, 8 May 2025, Nature Communications.
    DOI: 10.1038/s41467-025-59310-3

  • All shark no bite: Ocean acidification might leave species toothless

    All shark no bite: Ocean acidification might leave species toothless

    The rising issue of ocean acidification could leave some of the ocean’s oldest apex predators ‘all shark and no bite’: a new study finds that more acidic oceans could leave many shark species with weaker, more brittle teeth.

    Even sharks’ famous ability to replace their teeth – with new ones constantly growing in as the current set wears down – might not be enough to stave off the pressures of a warming planet and an ocean increasingly vulnerable to acidification.

    These findings come from recently published research at Heinrich Heine University Düsseldorf in Germany, in which shark teeth were examined under different ocean acidification scenarios. The study found that shark teeth – “despite being composed of highly mineralised phosphates” – are still vulnerable to corrosion under future ocean acidification scenarios.

    “They are highly developed weapons built for cutting flesh, not resisting ocean acid,” said the paper’s first author, Maximilian Baum, a biologist at Heinrich Heine University Düsseldorf, whose study appears in Frontiers in Marine Science. “Our results show just how vulnerable even nature’s sharpest weapons can be.”

    Ocean acidification is the process by which the ocean’s pH keeps decreasing, resulting in more acidic water. It is driven mostly by the ocean’s uptake of human-generated carbon dioxide. According to the article, the current average pH of the world’s ocean is 8.1, and it is expected to drop to 7.3 by the year 2300, making the water several times more acidic than it is today (pH is a logarithmic scale).
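
    Because pH is the negative base-10 logarithm of the hydrogen ion concentration, the projected drop can be converted into a relative acidity change with a quick back-of-the-envelope calculation (ours, not a figure from the study):

    $$
    \frac{[\mathrm{H^+}]_{2300}}{[\mathrm{H^+}]_{\mathrm{today}}} = \frac{10^{-7.3}}{10^{-8.1}} = 10^{0.8} \approx 6.3
    $$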

    Damage to coral reefs, loss of habitats, and threats to the survival of shell-building marine creatures are among the impacts already being felt across the ocean due to acidification. Until recently, ocean acidification was deemed not to have crossed its ‘planetary boundary’, but a major study released in June this year, led by the UK’s Plymouth Marine Laboratory and the US National Oceanic and Atmospheric Administration, found that the boundary had in fact been crossed five years ago.

    As a result of the ocean acidification experienced so far, selected tropical and sub-tropical coral reefs have lost 43% of their suitable habitats, sea butterflies in polar regions have lost up to 61% of their habitat, and coastal shellfish species have lost 13% of their global coastline habitats in which they can sustain their essential biological processes.

    Now, it would appear that sharks could stand to lose their teeth.


  • We drilled deep under the sea to learn more about mega-earthquakes and tsunamis

    We drilled deep under the sea to learn more about mega-earthquakes and tsunamis

    Far beneath the waves, down in the depths of the Japan Trench — seven kilometres below sea level — lie hidden clues about some of the most powerful earthquakes and tsunamis on Earth.

    From September to December 2024, Expedition 405 of the International Ocean Discovery Program (IODP) embarked on a four-month mission offshore Japan. Aboard the Chikyu — the world’s largest scientific drilling ship — 60 scientists teamed up with experienced drillers to recover deep-sea sediment cores from beneath the sea floor.

    The scientists included sedimentologists like myself, alongside geochemists, micropaleontologists, structural geologists, geophysicists and paleomagnetists. We drilled into a fault zone where only one prior expedition had drilled directly before. IODP Expedition 405 — also called Tracking Tsunamigenic Slip Across the Japan Trench (JTRACK) — is only the second deep-drilling mission to access this area.

    This time, we reached and sampled the décollement, or the basal detachment, of the fault that ruptured during the devastating 2011 Tōhoku mega-earthquake. We collected cores that will help scientists better understand how such powerful earthquakes are triggered.

    An unexpected slip

    On March 11, 2011, the Tōhoku mega-earthquake struck off the northeast coast of Japan, triggering a catastrophic tsunami. At magnitude 9.1, it was the most powerful earthquake ever recorded in Japan and one of the deadliest natural disasters in the country’s modern history.

    More than 18,000 people died. The earthquake severely damaged the Fukushima nuclear plant; there was an estimated US$235 billion in damages. Scientists were surprised not by the scale of the earthquake, but by the location of the largest plate slip that had triggered it: not deep underground, but just beneath the sea floor, at the shallowest part of the plate boundary.

    The rupture took place along the Japan Trench, where the Pacific Plate dives beneath the Okhotsk Plate. Until then, this shallow section of subduction zones was thought to slip slowly and quietly.

    But during the Tōhoku event, more than 50 metres of slip occurred on a fault that ruptured the sea floor, displacing huge amounts of water and generating the devastating tsunami.

    Drilling into the fault

    During the IODP 405 expedition, we set out to understand the conditions that make such tsunamis possible.

    The Japan Trench provides a natural laboratory to investigate the fundamental processes of tsunamigenic earthquakes that trigger massive tsunamis.

    For that reason, we drilled deep into the plate boundary fault, the exact zone that ruptured during the 2011 earthquake. This meant drilling more than 800 metres beneath the seafloor and into the fault itself to recover samples of rocks and sediments.

    We also installed a long-term observatory to monitor temperature and fluid pressure at the earthquake’s source, hoping to detect subtle signals locked in the material that once unleashed one of the most powerful earthquakes in history.

    Retrieving cores

    On board the Chikyu, operations ran 24-7. Every three hours, a new core arrived on deck — a long, cylindrical archive of Earth’s memory. As sedimentologists, we got to work right away peering through the transparent liners with flashlights, scanning for traces of sand, volcanic ash or anything hinting at past geological events.

    Each core told a chapter of a story written over millions of years. Layer by layer, they revealed a sequence of faulted, fractured or deformed sediments and rocks. Some contained smectite — a slippery clay mineral known to reduce friction along faults. These are precisely the kinds of materials that can allow tectonic plates to slip easily, even at shallow depths near the sea floor — exactly the kind of setting that could produce a tsunami-generating earthquake.

    One of the most thrilling moments came when we hit layers of chert — a hard, glassy rock that marks the transition from deep-sea sediments to oceanic crust. We had reached the décollement zone, the very boundary where one tectonic plate dives beneath another.

    In the lab, slicing open the cores revealed something else: beautifully banded colourful clays, tinted in rich shades of chocolate, vanilla and caramel — a natural palette created from geological processes deep within the Earth.

    Each new core entered a tightly co-ordinated workflow: scanned by high-resolution, X-ray-computed tomography, tested for physical and chemical properties, then split in half. One half was carefully preserved in a permanent archive, while the other was examined and sampled thoroughly by scientists from various countries and disciplines.

    My research focuses on the sedimentary signature of past earthquakes and tsunamis. On the Chikyu, I searched for deposits called homogenite-turbidite sequences. These form when a quake shakes the sea floor, triggering a submarine landslide (the turbidite), followed by a slow rain of fine particles stirred up by the tsunami (the homogenite). These sequences are geological time capsules, helping us estimate how often giant earthquakes have struck in the past.

    Fault evolution

    The Chikyu returned to the original site drilled soon after the 2011 earthquake. This gave us something rare in geoscience: an opportunity to observe how the fault has evolved over more than a decade. We installed a borehole observatory, deeper and more advanced than any before in this region.

    Over the coming years, it will monitor temperature and fluid flow in real time, giving us a window into the living, breathing dynamics of a megathrust fault.

    Using this data, scientists will simulate earthquake conditions using numerical models or experiments to test how these rocks respond under pressure. They will analyze the chemistry of the fluids trapped within the fault and use advanced logging tools to build a detailed picture of the fault’s internal architecture.

    Others — like myself — will focus on the sedimentary record, deciphering past events to better understand the frequency of earthquakes and tsunamis.

    From understanding to preparedness

    The Japan Trench is not an isolated case. Subduction zones around the world, from Chile to Alaska to Indonesia, pose similar risks, often just offshore from densely populated regions. If shallow slip can happen there too, then our current models and preparedness strategies must evolve accordingly.

    Our goal wasn’t just to understand why the 2011 Tōhoku earthquake happened, but to help prepare for the next one. By improving tsunami hazard assessments and deepening our understanding of mega-earthquake fault behaviour, we contribute to building global resilience.

    IODP Expedition 405 marks a major milestone for earthquake and tsunami science. In the coming years, data from the new borehole observatory, along with lab experiments and sediment analyses, will offer unprecedented insights into how these faults evolve and how we can better anticipate and mitigate the impacts of future megathrust earthquakes.

    This article is republished from The Conversation, a nonprofit, independent news organisation bringing you facts and trustworthy analysis to help you make sense of our complex world. It was written by: Morgane Brunet, Université du Québec à Rimouski (UQAR)

    Morgane Brunet receives funding from the European Commission through a Marie Curie Postdoctoral Global Fellowship.

  • Jupiter’s Age Revealed By “Molten Rock Raindrops”

    Jupiter’s Age Revealed By “Molten Rock Raindrops”

    A team from Nagoya University in Japan and the Italian National Institute for Astrophysics (INAF) has used a new method to gauge the age of the gas giant Jupiter, suggesting it could be used to pin down when the planets of the Solar System formed.

    Figuring out the age of the planets is, as you might imagine, a tricky business. It was only in 1953 that we finally pinned down an approximate age of the Earth, when geochemist Clair Cameron Patterson measured lead isotopes in meteorite samples, tightly constraining the age of our Solar System. Further analysis of the oldest zircons on Earth has helped to support this dating, putting the age of our planet at around 4.543 billion years.

    “In the Solar System, radionuclides are the key to dating planets. These are special atoms that slowly release energy over a long period of time. As natural clocks, radionuclides help scientists determine the ages of all kinds of things, from rocks to bones and pottery,” Adam Burgasser, Professor of Astronomy & Astrophysics at the University of California, San Diego, explains in a piece for The Conversation.

    “Using this method, scientists have determined that the oldest known meteorite is 4.57 billion years old, almost identical to the Sun’s asteroseismology measurement of 4.58 billion years. The oldest known rocks on Earth have slightly younger ages of 4.40 billion years. Similarly, soil brought back from the Moon during the Apollo missions had radionuclide ages of up to 4.6 billion years.”
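
    As general background (the standard age equation of radiometric dating, not something drawn from the article): a radionuclide clock works by exponential decay. If a mineral crystallizes with $N_0$ parent atoms and none of the daughter isotope, then after a time $t$

    $$
    N(t) = N_0\,e^{-\lambda t}, \qquad \lambda = \frac{\ln 2}{t_{1/2}}, \qquad t = \frac{1}{\lambda}\,\ln\!\left(1 + \frac{D}{N}\right),
    $$

    where $N$ is the number of parent atoms remaining, $D$ the number of daughter atoms produced, and $t_{1/2}$ the half-life. Measuring the parent-to-daughter ratio in a meteorite therefore reads off its crystallization age.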

    Other methods are used to estimate the age of planets and Solar System bodies, including counting the number of craters from meteorite impacts, and estimating the expected number of impacts over a certain timescale. This is particularly useful for bodies like Mercury, a planet from which we only have one meteorite sample (maybe). 

    The surface of gas giants, though, is pretty unaffected by meteor impacts. They could be hit constantly, and though we might notice the flashes, they would leave no crater in gas. It is still possible to use the crater method on their moons, but this isn’t altogether ideal. Looking at iron-rich meteorites, though, scientists have found that there are two distinct populations of these rocks. From there, they have suggested that Jupiter’s formation was the cause of this distinction, making it likely that it is the oldest planet in the Solar System.

    In a new study, researchers came up with a new method for estimating Jupiter’s age. They looked at chondrules, which make up large volumes of chondrite meteorites.

    “Chondrules are spherical particles of crystallized liquid silicates that constitute large volume fractions of chondritic meteorites. Chondrules typically range 0.1–2 mm in size and solidified with estimated cooling rates of 10–1,000 K h⁻¹,” the team explains in their new paper. “The widespread presence, high volume fraction (exceeding 80 percent in ordinary chondrites) and spherical (or subspherical) shape of chondrules suggest that they originated from an unknown melting process occurring ubiquitously in the solar nebula.”

    The team attempted to model how these chondrules form, finding that they can be naturally formed in the collisions between planetesimals containing volatile minerals. They found that their size and the rate at which they formed depend upon how water-rich the colliding planetesimals were.

    “When planetesimals collided with each other, water instantly vaporized into expanding steam. This acted like tiny explosions and broke apart the molten silicate rock into the tiny droplets we see in meteorites today,” co-lead author Professor Sin-iti Sirono from Nagoya University’s Graduate School of Earth and Environmental Sciences explained in a statement.

    “Previous formation theories couldn’t explain chondrule characteristics without requiring very specific conditions, while this model requires conditions that naturally occurred in the early Solar System when Jupiter was born.” 

    Modeling Jupiter’s formation and growth, and how its increasing gravity affected the impacts of planetesimals, the team found that the timeline fit well with previous ages of the Solar System.

    “We compared the characteristics and abundance of simulated chondrules to meteorite data and found that the model spontaneously generated realistic chondrules,” Dr Diego Turrini, co-lead author and senior researcher at INAF, added. 

    “The model also shows that chondrule production coincides with Jupiter’s intense accumulation of nebular gas to reach its massive size. As meteorite data tell us that peak chondrule formation took place 1.8 million years after the Solar System began, this is also the time at which Jupiter was born.”

    This would place Jupiter as forming earlier than the other planets. However, there are more mysteries to investigate. Chondrules of many different sizes, shapes, and ages are found in meteorites. The team suggests that this wide range could be the result of other giant planets (such as Saturn, Neptune, and Uranus) triggering planetesimal collisions of their own as they formed. If the study holds up, this new method could help us narrow down the order of formation of planets in our home system.

    The study is published in Scientific Reports.

  • NASA James Webb Space Telescope observes interstellar comet 3i/ATLAS

    NASA James Webb Space Telescope observes interstellar comet 3i/ATLAS

    Aug. 27 (UPI) — NASA observed interstellar comet 3I/ATLAS through the lens of the James Webb Space Telescope for the first time.

    On Aug. 6, the telescope’s Near-Infrared Spectrograph (NIRSpec) instrument analyzed the comet, providing more information about its size, physical properties and chemical makeup.

    This kind of observation can help make clear what conditions were like in the system where comets like 3I/ATLAS formed.

    3I/ATLAS is outgassing as it approaches the sun, as astronomers had speculated it would.

    Astronomers identified the outgassed material as carbon dioxide, water vapor, water ice, carbon monoxide and the smelly gas carbonyl sulfide.

    The comet has the highest ratio of carbon dioxide to water ever observed in a comet, which implies that it contains ices that were exposed to much higher levels of radiation than those of comets in our own solar system.

    The team also suggested the comet could have formed near a “carbon dioxide ice line.” Additionally, a previous study found that the comet could be around 7 billion years old.

    The comet was first discovered on July 1 by NASA’s Asteroid Terrestrial-impact Last Alert System (ATLAS).

  • Ready for Takeoff? ChatGPT Could Steer the Next Spacecraft

    Ready for Takeoff? ChatGPT Could Steer the Next Spacecraft

    When you think of ChatGPT, the first image that comes to mind is often a virtual assistant answering questions or writing essays. 

    Well, not anymore, say rocket scientists from MIT and the Universidad Politécnica de Madrid (UPM). They have shown how this large language model (LLM) could pilot a simulated spacecraft, hinting that autonomous space exploration is much closer than previously imagined.

    The Kerbal Challenge

    The ChatGPT experiment was MIT-UPM’s entry for the Kerbal Space Program Differential Game Challenge, a community-driven initiative inspired by the wildly popular Kerbal Space Program video game that mimics realistic spaceflight scenarios to test autonomous control systems.

    Participating teams pitted their AI systems against each other in complex tasks such as satellite interception, pursuit-and-evasion missions, and navigation in limited timeframes. The challenge’s goal was to create AI agents that could operate spacecraft without human input.

    ChatGPT’s Space Pilot

    Unlike competitors who built specialized systems from scratch, the MIT-UPM team relied on ChatGPT. Their first prompt to OpenAI’s virtual assistant read, “You operate as an autonomous agent controlling a pursuit spacecraft.”

    Prompts on the spacecraft’s current state, objectives, etc., were fed into ChatGPT, and the LLM generated maneuvering instructions to steer the spacecraft with such remarkable precision that it finished second in the Kerbal Differential Game Challenge.
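
    The article does not include the team’s code, but the workflow it describes amounts to a loop that serializes the simulator’s state into a text prompt and parses the model’s reply into a throttle command. Below is a minimal Python sketch of that pattern using the OpenAI chat API; the model name, the JSON reply format, and the simulator hooks (`get_state`, `apply_thrust`) are our assumptions for illustration, not details from the MIT-UPM entry.

    ```python
    import json
    from openai import OpenAI  # assumes the openai>=1.0 Python client is installed

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    SYSTEM_PROMPT = (
        "You operate as an autonomous agent controlling a pursuit spacecraft. "
        "Reply ONLY with JSON of the form {\"throttle\": [x, y, z]}, "
        "with each component between -1 and 1."
    )

    def llm_pilot_step(state: dict) -> list[float]:
        """Turn the current simulator state into a prompt and parse the model's maneuver."""
        user_prompt = (
            f"Pursuer position {state['pursuer_pos']}, velocity {state['pursuer_vel']}; "
            f"evader position {state['evader_pos']}, velocity {state['evader_vel']}. "
            "Choose the next throttle vector to close the distance to the evader."
        )
        resp = client.chat.completions.create(
            model="gpt-4o",  # assumption: any chat-capable model; the team's choice may differ
            messages=[
                {"role": "system", "content": SYSTEM_PROMPT},
                {"role": "user", "content": user_prompt},
            ],
        )
        # In practice the reply should be validated; here we trust the JSON-only instruction.
        command = json.loads(resp.choices[0].message.content)
        return command["throttle"]

    # Hypothetical control loop against a Kerbal-style simulator interface:
    # while not sim.done():
    #     sim.apply_thrust(llm_pilot_step(sim.get_state()))
    ```

    The appeal of this design, as the article goes on to note, is that almost all of the “training” lives in the prompt rather than in a bespoke control policy.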

    Autonomous Space Explorations

    MIT-UPM’s ChatGPT breakthrough points to a future where LLM-based copilots could assist and even replace humans in piloting spacecraft. Traditional autonomous systems have struggled in this area because they require extensive training cycles and real-time feedback.

    Since ChatGPT is pre-trained on vast tomes of human knowledge, it can “understand” tasks with minimal additional guidance, making it well-suited for navigating hazardous environments and operating remote robotic systems across our solar system.

    Image credit: Dima Zel/Shutterstock

  • The ancient oxygen flood that forever changed life in the oceans

    The ancient oxygen flood that forever changed life in the oceans

    Some 390 million years ago in the ancient ocean, marine animals began colonizing depths previously uninhabited. New research indicates this underwater migration occurred in response to a permanent increase in deep-ocean oxygen, driven by the aboveground spread of woody plants — precursors to Earth’s first forests. 

    That rise in oxygen coincided with a period of remarkable diversification among fish with jaws — the ancestors of most vertebrates alive today. The finding suggests that oxygenation might have shaped evolutionary patterns among prehistoric species.

    “It’s known that oxygen is a necessary condition for animal evolution, but the extent to which it is the sufficient condition that can explain trends in animal diversification has been difficult to pin down,” said co-lead author Michael Kipp, assistant professor of earth and climate sciences in the Duke University Nicholas School of the Environment. “This study gives a strong vote that oxygen dictated the timing of early animal evolution, at least for the appearance of jawed vertebrates in deep-ocean habitats.”

    For a time, researchers thought that deep-ocean oxygenation occurred once at the beginning of the Paleozoic Era, some 540 million years ago. But more recent studies have suggested that oxygenation occurred in phases, with nearshore waters first becoming livable to breathing organisms, followed by deeper environments.

    Kipp and colleagues homed in on the timing of those phases by studying sedimentary rocks that formed under deep seawater. Specifically, they analyzed the rocks for selenium, an element that can be used to determine whether oxygen existed at life-sustaining levels in ancient seas. 

    In the marine environment, selenium occurs in different forms called isotopes that vary by weight. Where oxygen levels are high enough to support animal life, the ratio of heavy to light selenium isotopes varies widely. But at oxygen levels prohibitive to most animal life, that ratio is relatively consistent. By determining the ratio of selenium isotopes in marine sediments, researchers can infer whether oxygen levels were sufficient to support animals that breathe underwater.
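
    For readers unfamiliar with the notation, isotope ratios of this kind are conventionally reported as per-mil delta values relative to a reference standard. The general definition below is standard geochemistry rather than a detail taken from the paper, and labs differ on which selenium isotope pair they report ($^{82}\mathrm{Se}/^{78}\mathrm{Se}$ is a common choice):

    $$
    \delta^{82}\mathrm{Se} = \left(\frac{(^{82}\mathrm{Se}/^{78}\mathrm{Se})_{\mathrm{sample}}}{(^{82}\mathrm{Se}/^{78}\mathrm{Se})_{\mathrm{standard}}} - 1\right) \times 1000\ \text{‰}
    $$

    Wide scatter in this value among samples of similar age points to oxygenated bottom waters; tightly clustered values point to oxygen levels too low for most animal life.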

    Working with research repositories around the world, the team assembled 97 rock samples dating back 252 to 541 million years ago. The rocks had been excavated from areas across five continents that, hundreds of millions of years ago, were located along the outermost continental shelves — the edges of continents as they protrude underwater, just before giving way to steep drop-offs.

    After a series of steps that entailed pulverizing the rocks, dissolving the resulting powder and purifying selenium, the team analyzed the ratio of selenium isotopes that occurred in each sample.

    Their data indicated that two oxygenation events occurred in the deeper waters of the outer continental shelves: a transient episode around 540 million years ago, during a Paleozoic period known as the Cambrian, and an episode that began 393–382 million years ago, during an interval called the Middle Devonian, and has continued to this day. In the roughly 150 million years between those episodes, oxygen dropped to levels inhospitable to most animals. The team published their findings in Proceedings of the National Academy of Sciences in August.

    “The selenium data tell us that the second oxygenation event was permanent. It began in the Middle Devonian and persisted in our younger rock samples,” said co-lead author Kunmanee “Mac” Bubphamanee, a Ph.D. candidate at the University of Washington.

    That event coincided with numerous changes in oceanic evolution and ecosystems — what some researchers refer to as the “mid-Paleozoic marine revolution.” As oxygen became a permanent feature in deeper settings, jawed fish, called gnathostomes, and other animals began invading and diversifying in such habitats, according to the fossil record. Animals also got bigger, perhaps because oxygen supported their growth.

    The Middle Devonian oxygenation event also overlapped with the spread of plants with hard stems of wood.

    “Our thinking is that, as these woody plants increased in number, they released more oxygen into the air, which led to more oxygen in deeper ocean environments,” said Kipp, who began this research as a Ph.D. student at the University of Washington.

    The cause of the first, temporary oxygenation event during the Cambrian is more enigmatic.

    “What seems clear is that the drop in oxygen after that initial pulse hindered the spread and diversification of marine animals into those deeper environments of the outer continental shelves,” Kipp said.

    Though the team’s focus was on ancient ocean conditions, their findings are relevant now.

    “Today, there’s abundant ocean oxygen in equilibrium with the atmosphere. But in some locations, ocean oxygen can drop to undetectable levels. Some of these zones occur through natural processes. But in many cases, they’re driven by nutrients draining off continents from fertilizers and industrial activity that fuel plankton blooms that suck up oxygen when they decay,” Kipp said.

    “This work shows very clearly the link between oxygen and animal life in the ocean. This was a balance struck about 400 million years ago, and it would be a shame to disrupt it today in a matter of decades.”

    Funding: MAK was supported by an NSF Graduate Research Fellowship and Agouron Institute Postdoctoral Fellowship. Additional support was provided by the NASA Astrobiology Institute’s Virtual Planetary Laboratory.

  • Scientists Track Down Fresh Boulder Falls on the Moon

    Scientists Track Down Fresh Boulder Falls on the Moon

    As a boulder rolls down a cliff slope on the Moon, it kicks up lunar dust, leaving behind a telltale herringbone pattern of ejecta.

    In a recent study, for the first time, scientists geolocated and dated evidence of such boulder falls. They identified 245 fresh tracks created as boulders rolled, bounced, and slid down crater walls.

    “For a long time, there was this belief that the Moon is geologically dead.…Our study shows that boulders with sizes ranging [from] tens to hundreds of meters and [with] weights in tons have moved from their places over time,” said Sivaprahasam Vijayan, the study’s lead author and an associate professor at the Physical Research Laboratory in Ahmedabad, India. “It is equally important to know how recent these boulder fall events are to understand the time periods when the geological agents were active.”

    Tracking Boulder Falls

    As lunar boulders bounce, they scoop up bright, unweathered subsurface material and bring it to the surface. As a result, fresh boulder fall tracks appear brighter than older ones.

    “One can identify a boulder fall to be a recent one considering the boulder fall ejecta,” said Senthil Kumar Perumal, principal scientist with the Planetary Sciences Group at the National Geophysical Research Institute in Hyderabad, India, who was not involved in the new study.

    To identify relatively recent boulder tracks, Vijayan and his colleagues first manually searched thousands of images of the lunar surface between 40°S and 40°N. At these latitudes, the Sun makes the bright boulder tracks distinguishable from the rest of the lunar surface. Once they identified a track, the researchers studied corresponding images taken by NASA’s Lunar Reconnaissance Orbiter Narrow Angle Camera between 2009 and 2022.

    Next, scientists estimated the age of the tracks by studying regions with both boulder fall ejecta (BFE) and distinct impact ejecta blankets. (Such blankets, nicknamed the “lunar equivalent of fossils,” have long been used to estimate the age of impact events.) The craters analyzed by Vijayan and his colleagues were found to be around 400,000 years old—which means the BFE tracks are more recent.

    Finally, the scientists identified possible seismic faults or impact craters nearby that could have triggered the boulder falls.

    Mapping the Moon

    The new geological map of boulder falls, published in Icarus, highlights seismically active spots and fresh impact sites on the Moon. Researchers say these regions could be potential landing sites for future lunar missions focused on recent surface and subsurface activity.

    The study authors plan to integrate artificial intelligence methods into the next iteration of their work, but ultimately, Vijayan said, “the next step is to more precisely determine whether the cause [of a fall] is endogenic or exogenic, which can be achieved by deploying additional seismometers in upcoming missions.”

    Kumar concurred. “We need to have a large network of seismometers covering the entire [Moon] that monitors seismic activity continuously for several decades,” he said.

    —Unnati Ashar, Science Writer

    Citation: Ashar, U. (2025), Scientists track down fresh boulder falls on the Moon, Eos, 106, https://doi.org/10.1029/2025EO250314. Published on 27 August 2025.
    Text © 2025. The authors. CC BY-NC-ND 3.0

  • Bluesky now platform of choice for science community

    Bluesky now platform of choice for science community

    Marine conservation biologist David Shiffman, the author of Why Sharks Matter, recently described early Twitter on the blog Southern Fried Science as “the world’s most interesting cocktail party.”

    “Then it stopped being useful,” Shiffman told Ars. “I was worried for a while that this incredibly powerful way of changing the world using expertise was gone. It’s not gone. It just moved. It’s a little different now, and it’s not as powerful as it was, but it’s not gone. It was for me personally, immensely reassuring that so many other people were having the same experience that I was. But it was also important to document that scientifically.”

    Eager to gather solid data on the migration phenomenon to bolster his anecdotal observations, Shiffman turned to social scientist Julia Wester, one of the scientists who had joined Twitter at Shiffman’s encouragement years earlier and who had likewise become fed up and migrated to Bluesky. Despite being “much less online” than the indefatigable Shiffman, Wester was intrigued by the proposition. “I was interested not just in the anecdotal evidence, the conversations we were having, but also in identifying the real patterns,” she told Ars. “As a social scientist, when we hear anecdotal evidence about people’s experiences, I want to know what that looks like across the population.”

    Shiffman and Wester targeted scientists, science communicators, and science educators who used (or had used) both Twitter and Bluesky. Questions explored user attitudes toward, and experiences with, each platform in a professional capacity: when they joined, respective follower and post counts, which professional tasks they used each platform for, the usefulness of each platform for those purposes relative to 2021, how they first heard about Bluesky, and so forth.

    The authors acknowledge that they are looking at a very specific demographic among social media users in general and that there is an inevitable self-selection effect. However, “You want to use the sample and the method that’s appropriate to the phenomenon that you’re looking at,” said Wester. “For us, it wasn’t just the experience of people using these platforms, but the phenomenon of migration. Why are people deciding to stay or move? How they’re deciding to use both of these platforms? For that, I think we did get a pretty decent sample for looking at the dynamic tensions, the push and pull between staying on one platform or opting for another.”

  • CHORD to Revolutionize Canadian Radio Astronomy

    CHORD to Revolutionize Canadian Radio Astronomy

    Construction is underway of CHORD, the most ambitious radio telescope project ever built on Canadian soil. Short for the Canadian Hydrogen Observatory and Radio-transient Detector, CHORD will give astronomers an unprecedented opportunity to explore some of the most exciting and mysterious questions in astrophysics and cosmology, from Fast Radio Bursts (FRBs) and dark energy to the measurements of fundamental particles, and beyond.

    “This telescope will be an order of magnitude more powerful than its predecessor, the CHIME telescope, and it will all be enabled by Canadian technology and expertise,” said Matt Dobbs, a professor of physics at McGill University and one of the project leads. It incorporates the latest advances in radio dish fabrication, electronics designed to minimize radio interference, and digital signal processing that harnesses state-of-the-art computing technologies.

    Launched in 2017, CHIME, the Canadian Hydrogen Intensity Mapping Experiment, placed Canadian scientists at the forefront of unravelling the mysteries of FRBs. A type of transient radio signal, FRBs last only a fraction of a second. They are caused by an astrophysical phenomenon not yet understood by scientists. FRBs originate far outside our galaxy and are highly energetic at their source, though by the time they reach Earth, the signal can be quite weak.

    CHIME has detected thousands of FRBs since 2018, but CHORD is designed to be even more sensitive, allowing researchers to observe fainter signals and detect a broader range of frequencies. It will be able to detect thousands of FRBs in real time with unparalleled precision.

    “CHORD’s increased frequency bands and sensitivity mean we can build on CHIME’s success as the most effective FRB-detecting instrument on Earth, helping us find more FRBs and understand them in greater detail,” said Kendrick Smith, who leads CHORD’s software design at Perimeter Institute for Theoretical Physics. “Eventually, once we have detected enough FRBs, we can make a statistical map of electrons in the universe.”
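
    The “map of electrons” follows from dispersion, a standard piece of radio astronomy rather than anything specific to CHORD: the lower frequencies of an FRB arrive slightly later than the higher ones, delayed in proportion to the column density of free electrons along the line of sight, known as the dispersion measure (DM):

    $$
    \mathrm{DM} = \int_0^{d} n_e\,\mathrm{d}l, \qquad \Delta t \approx 4.15\ \mathrm{ms}\times\left(\frac{\mathrm{DM}}{\mathrm{pc\,cm^{-3}}}\right)\left(\frac{\nu}{\mathrm{GHz}}\right)^{-2}
    $$

    Each localized FRB therefore delivers one line-of-sight measurement of the intervening electrons; with thousands of them, those measurements can be assembled into the statistical map Smith describes.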

    ‘The first dish is the hardest’

    In January, the team installed its first dish; the dishes continue to be rolled out, with about 50 expected by the end of this year. All of CHORD’s components – hardware, software and science campaigns – are to be brought together for a test run in fall 2025 using the first few dishes, before the system is built out to full capacity in 2027.

    “The first dish is the hardest, since it requires all the pieces to be in place for the whole production: the facility, the staff, and the supply chain of materials,” said CHORD Collaboration project manager Dallas Wulf.

    The telescope’s core array will be located at the National Research Council of Canada’s (NRC) Dominion Radio Astrophysical Observatory, near Penticton, British Columbia. There will also be two outrigger sites, smaller versions of the CHORD instrument located in northern California and central West Virginia, which will help pinpoint the exact galaxy from which the FRB signal emerged.

    “The CHORD outrigger stations will dramatically enhance the scientific return of the project by enabling precise localizations for the FRBs detected by the core array,” said Juan Mena-Parra, a senior science group member for the instrument from the Dunlap Institute for Astronomy & Astrophysics at the University of Toronto. “This level of accuracy allows us to confidently identify the host galaxies and source environments of FRBs, key steps toward understanding their origins and unlocking their power as probes of the large-scale structure of the universe.”

    A flagship for Canadian astronomy

    Beyond unravelling the mysteries of FRBs, CHORD is also deepening partnerships with Canadian industry to design, build and produce the telescope on Canadian soil.

    “One of the first things we did was to build an arena-sized factory, and we brought on a small army of technicians to work with young scientists and engineers,” Dobbs said. “Homegrown technology runs all up and down the instrument, with Canadians from different sectors and different provinces collaborating to build the foundations of making this breakthrough science possible.”

    The NRC’s Herzberg Astronomy and Astrophysics Research Centre is behind some of the homegrown technology, including the pioneering single-piece reflectors used by CHORD.

    “Dishes like this don’t exist off the shelf. To get the surface form that we needed, our team designed and fabricated everything by hand and on-site,” said Brian Hoff, NRC’s CHORD project manager. “The most complex part was maintaining the surface accuracy needed at each step of the process. Our design is both low-cost and highly reproducible, which is incredibly important when fabricating 640 dishes.”

    The software and analytic tools are also Canadian designed. CHORD will gather an enormous amount of data from its surveys of the universe. Smith estimated it will collect more than a terabyte of data every second – approximately the data rate of the entire North American cell phone network. To sort through that amount of data will require powerful new software and algorithms suited to the task. Canadian researchers are leading the charge.

    CHORD is supported by funding from the Canada Foundation for Innovation (CFI). Institutions involved in building the telescope are McGill University, the University of Toronto, the National Research Council, Perimeter Institute and the University of Calgary. Other CHORD partners are Queen’s University, the University of British Columbia and York University, together with Arizona State University, Italy’s Istituto Nazionale di Astrofisica, the Massachusetts Institute of Technology and West Virginia University.
