The endangered South African cycad Encephalartos horridus may resemble a relic from the Jurassic age, but the species itself evolved long after dinosaurs disappeared. Still, it carries a biochemical legacy inherited from its distant ancestors—plants that once thrived alongside Jurassic fauna. A team led by Hiroshima University (HU) researchers found that its spiky, silvery-blue leaves owe their color not to pigment, but to a wax-based optical effect produced by a lipid compound that may date back to the dawn of land plants.
In a study published in the Journal of Experimental Botany, researchers revealed that the coating of epicuticular wax on E. horridus leaves forms tubular crystals that reflect light from ultraviolet (UV) to blue wavelengths, giving the plant its bluish sheen. The paper will also appear on the front cover of the journal’s upcoming Volume 76, Issue 12.
Nonacosan-10-ol, the key wax compound, is found across diverse plant lineages—including gymnosperms such as ginkgos, conifers, and cycads, and has even been detected in certain mosses—suggesting the ability to produce it emerged early in the evolution of land plants. However, only a few species can organize it into specialized wax structures that produce structural color, vivid hues generated not by pigment, but by microscopic architectures that scatter light. It’s the same optical effect behind the iridescent wings of morpho butterflies and the vibrant plumage of blue jays—both of which appear blue despite lacking blue pigment.
The team found, however, that this unique color in E. horridus doesn’t come from the wax alone. It also depends on how the wax interacts with the dark green, chlorophyll-rich tissues underneath.
“The blue color of Encephalartos horridus leaves comes not from pigments but from a clever natural trick. Tiny wax crystals on the surface create what’s called ‘structural coloration,’” explained study corresponding author Takashi Nobusawa, assistant professor at HU’s Graduate School of Integrated Sciences for Life.
“The leaf surface is coated with ultra-thin wax crystals about one ten-thousandth of a millimeter wide. Peeling off the leaf’s surface layer makes the blue disappear. But placing it back on a dark surface brings the blue back, as if by magic.”
UV defense and pollinator lure?
To understand why the leaves of E. horridus appear bluish, researchers ran Monte Carlo multi-layered (MCML) simulations to model how light interacts with the wax crystals, which are about 0.1 micrometers in diameter, thousands of times smaller than a typical grain of sand. The simulations revealed that when the wax layer sits against a dark background, it minimizes unwanted reflection, intensifying the blue hue. But if there is an air gap between the wax and the underlying tissue, reflectivity increases, causing a grayish cast. Replacing the air with water restores the original color by letting more light reach the chlorophyll-rich cells beneath the wax.
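The authors’ MCML code isn’t reproduced here, but the core idea of layered Monte Carlo light transport can be illustrated with a toy random-walk model. The Python sketch below uses made-up scattering probabilities and a Rayleigh-like assumption (sub-wavelength wax crystals scatter blue light more strongly than red); it shows only qualitatively why a dark chlorophyll backing yields blue-dominated reflectance while a brighter backing raises reflectance at all wavelengths, washing the color toward gray.

```python
import random

def reflectance(scatter_prob, backing_reflectance, n_photons=20000, max_steps=200):
    """Toy 1D Monte Carlo: photons random-walk through a scattering wax layer
    (10 sub-layers deep) sitting on a backing that reflects some of the light.
    Returns the fraction of photons that re-emerge from the top surface."""
    layer_thickness = 10              # wax layer depth in arbitrary sub-layer units
    reflected = 0
    for _ in range(n_photons):
        depth, direction = 0, +1      # photon enters at the surface, heading down
        for _ in range(max_steps):
            if random.random() < scatter_prob:
                direction = random.choice((-1, +1))   # isotropic scattering event
            depth += direction
            if depth < 0:                             # escaped through the top: counted as reflected
                reflected += 1
                break
            if depth >= layer_thickness:              # reached the backing
                if random.random() < backing_reflectance:
                    depth, direction = layer_thickness - 1, -1   # bounced back into the wax
                else:
                    break                             # absorbed by the backing
    return reflected / n_photons

# Assumed, illustrative numbers: blue photons scatter more often than red ones
# in the tiny wax crystals (Rayleigh-like behavior).
for backing, r_back in [("dark chlorophyll tissue", 0.05), ("air gap / bright backing", 0.60)]:
    blue = reflectance(scatter_prob=0.45, backing_reflectance=r_back)
    red = reflectance(scatter_prob=0.10, backing_reflectance=r_back)
    print(f"{backing}: blue reflectance {blue:.2f}, red reflectance {red:.2f}")
```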
Although the superhydrophobic properties of nonacosan-10-ol have been well-documented, its connection to efficient UV reflection remains less understood. Shielding against UV rays is important for survival in desert environments, where the radiation can harm plant cells. However, the researchers suspect there’s more to it. The glaucous sheen could also be a visual cue for insect pollinators like a neon sign pointing toward the plant’s reproductive organs. Insects can see UV light, which is invisible to the human eye, and many also have heightened sensitivity to blue wavelengths.
Lost to time
Although E. horridus is known to accumulate the secondary alcohol nonacosan-10-ol in its epicuticular wax, how this compound is biosynthesized remains a mystery. By contrast, wax biosynthesis has been extensively studied in Arabidopsis thaliana and other model plants in the angiosperm group (flowering plants), which evolved much later. In Arabidopsis, nonacosan-10-ol is not detected; instead, nonacosan-14-ol and nonacosan-15-ol are produced as secondary alcohols by a characterized pathway.
To investigate how E. horridus produces its distinctive wax compound, the team focused on KCS (ketoacyl-CoA synthase) enzymes, which they suspected to be responsible for nonacosan-10-ol biosynthesis. However, introducing these enzymes into a model plant did not result in production of the compound—suggesting that additional, as-yet-unknown pathways are likely involved.
“Why do the leaves of Encephalartos horridus, an endangered South African cycad, appear strikingly blue even though they contain no blue pigments? The question itself is scientifically fascinating—it uncovers a natural optical strategy far more refined than we might expect from plants. Understanding this mechanism not only deepens our grasp of plant adaptation in extreme environments but could also inspire nature-based technologies,” Nobusawa said.
“The next step is to figure out how the plant makes the special wax compound, nonacosan-10-ol, and to uncover the genes and enzymes behind it. In the long run, the goal is to understand how this adaptation evolved and to use these insights to develop new materials inspired by nature.”
Other members of the research team were Makoto Kusaba, also from HU’s Graduate School of Integrated Sciences for Life; Takashi Okamoto from Kyushu Institute of Technology; and Michiharu Nakano from Kochi University.
The first stars in the universe may have been much smaller than we thought, new research hints — possibly explaining why it’s so hard to find evidence they ever existed.
According to the new research, the earliest generation of stars had a difficult history. These stars came to be in a violent environment: inside a huge gas cloud whipping with supersonic-speed turbulence at velocities five times the speed of sound (as measured in Earth’s atmosphere).
A simulation underpinning the new research also showed gases clustering into lumps and bumps that appeared to herald a coming starbirth. The cloud broke apart, creating pieces from which clusters of stars seemed poised to emerge. One gas cloud eventually settled into the right conditions to form a star eight times the mass of our sun — much smaller than the 100-solar-mass behemoths researchers previously imagined in our early universe.
These findings hint that the first stars in history may have formed in stellar groups, not in splendid isolation, as previously thought.
“With the presence of supersonic turbulence, the cloud becomes fragmented into multiple smaller clumps, leading to the formation of several less massive stars instead,” principal researcher Ke-Jung Chen, a research fellow at the Academia Sinica Institute of Astronomy and Astrophysics in Taiwan, told LiveScience by email.
This glimpse of our early history is crucial in learning about the origins of our galaxy, as well as our solar system.
“These first stars played a crucial role in shaping the earliest galaxies, which eventually evolved into systems like our own Milky Way,” Chen wrote. With this new model in hand, he added, fresh observations can bring the research further, studying starbirth and galaxy formation using both computer models and NASA’s powerful James Webb Space Telescope.
Simulating the universe
Researchers generated their fresh understanding of early stars using the Gizmo simulation code, which is used to study astronomical phenomena ranging from black holes to magnetic fields, and a project called IllustrisTNG that has previously been shown to accurately replicate galaxy formation. Their goal was to study the conditions in our cosmos a few hundred million years after the Big Bang, 13.8 billion years ago.
Given the sheer scale of the universe, the simulation focused on a single area: a dense structure, roughly 10 million times the mass of our sun, called a dark matter minihalo. (Dark matter makes up most of the matter in our universe, but it doesn’t interact with light and cannot be detected directly by telescopes. We can, however, infer its presence through its gravitational effect on other objects.)
Simulations of a huge structure, known as a dark matter minihalo, show gas moving in an extremely turbulent environment at supersonic speed. From left to right are images showing different stages in the minihalo’s formation, with lumpy structures researchers believe are caused by gas flows. (Image credit: ASIAA/Meng-Yuan Ho & Pei-Cheng Tung)
The researchers examined how gas particles were moving in relatively small regions of space inside the halo, each region measuring roughly three light-years across. Simulations showed the dark matter minihalo attracts gas through sheer gravity, and by doing so, generates both supersonic-speed turbulence and gas cloud clumping. Violence was therefore a part of creating early stars.
This traumatic environment created another side effect: there were fewer huge, early stars than we previously imagined. Previous research had suggested we could have had early stars of more than 100 solar masses each. Eventually, these old stars would have exploded as supernovas, leaving behind traceable remnants that newer stars would incorporate as they grew.
Newer stars, however, do not show any chemical signatures of giant elders inside them — showing that a first generation of enormous stars may have been rare indeed.
Chen’s team isn’t done yet. They are now using the dark matter halos to see how supersonic turbulence worked more generally in the early universe, especially as the first stars came to light in an era more than 13 billion years ago, called “the cosmic dawn.”
“This paper is part of a collaborative effort aimed at understanding the cosmic dawn through investigating the formation and evolution of the first stars,” Chen said.
The next set of simulations may also include magnetic fields, he added. We can see in galaxies today that supersonic turbulence boosts magnetic fields and influences star formation; it may very well be that magnetism was just as crucial to star formation in the early universe.
Chen’s team published their results July 30 in The Astrophysical Journal Letters.
Earth’s skies are about to be graced by a full Moon, plump and shining in the sky, swimming across a lake of stars like a giant sturgeon.
Okay, maybe not, but the full Moon that will rise on Friday 8 and Saturday 9 August is called the full Sturgeon Moon, the old US Farmer’s Almanac name for the full Moon of the late-summer month of August.
This particular full Moon will arrive just in time to mess with the Perseid meteor shower, which is due to peak on August 13 – but at least when you look at it, you’ll know why we call it the Sturgeon Moon.
Each month’s full Moon has its own Farmer’s Almanac name, and each name can be one of several, depending on cultural context and geographical location. These names can have to do with the geography of the region, agricultural calendars, and Native American customs.
The sturgeon is among the largest of the freshwater bony fish, with a lineage that can be traced back to the Jurassic. It has a cartilaginous skeleton, and its long body is armored with bony plates called scutes. Sturgeons live a long time: their average lifespan is 50 to 60 years, and the largest of them can grow to several meters in length.
It’s highly prized for its meat and caviar, and can be found – among other places throughout the Northern Hemisphere – in the Great Lakes of North America. Late summer is apparently prime sturgeon fishing season in the Great Lakes, which is why August’s Moon is named for it.
It’s worth noting that sturgeons are more critically endangered than any other group of species, and maybe ought to be left alone.
Beluga sturgeons are the largest recorded members of the species group. (Zocha_K/E+/Getty Images)
Other names for August’s full Moon, according to the Farmer’s Almanac, include the Flying Up Moon, from the Cree, describing young birds leaving the nest; the Mountain Shadows Moon, from the Tlingit; and several other names to do with gathering the harvest before colder weather sets in.
The full Moon always sits on the opposite side of Earth from the Sun, so look for it rising above the eastern horizon at sunset. It’ll be at its fullest in the early hours of August 9, just before it sets below the western horizon.
The northern lights will appear in 18 U.S. states due to a geomagnetic storm
The northern lights could be visible in 18 U.S. states on August 7 and 8 thanks to a G2 geomagnetic storm.
Space weather forecasters predicted minor (G1) to moderate (G2) geomagnetic storm levels, with a chance of a strong (G3) storm.
The storm is a result of a coronal mass ejection (CME) from the sun. The charged particles, created by an explosion in the outermost atmosphere of the sun, have been rushing towards Earth since Tuesday, August 5.
These particles interact with the magnetic field of our planet in a way that triggers aurora borealis, also known as northern lights.
According to the U.S. National Oceanic and Atmospheric Administration (NOAA), the northern lights could be visible in Washington, New York, Michigan, New Hampshire, Alaska, Minnesota, Montana, Illinois, Nebraska, Oregon, Idaho, Vermont, Maine, South Dakota, North Dakota, Wisconsin, Iowa, and Wyoming.
Auroras require a dark sky to be visible, and the window between 9 p.m. and midnight will be prime viewing time, when the “moderate” storm is expected to make the display most visible.
To view the dazzling sky in colours, go out to an open place away from city lights and look toward the northern side of the sky.
According to NOAA, the geomagnetic storm could cause “manageable effects to some technological infrastructure.”
Aurora forecasting apps and NOAA’s Space Weather Prediction Center provide real-time updates.
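For readers who want to check conditions programmatically rather than through an app, the Space Weather Prediction Center also publishes machine-readable data. The minimal Python sketch below is illustrative only; the endpoint path and the exact response layout are assumptions based on SWPC’s public JSON products and should be verified before use.

```python
import json
import urllib.request

# Assumed SWPC endpoint for recent planetary K-index values; verify the path
# and the response schema on the Space Weather Prediction Center site first.
URL = "https://services.swpc.noaa.gov/products/noaa-planetary-k-index.json"

with urllib.request.urlopen(URL, timeout=10) as resp:
    rows = json.load(resp)        # expected: a list of rows, with a header row first

header, readings = rows[0], rows[1:]
print("columns:", header)
for row in readings[-4:]:         # print the most recent few readings
    print(row)
```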
If the sky is clear and you are standing in the right place, such as on a lakeside or in a field, you will find colourful (green, purple and reddish) hues among the stars.
What causes the Northern Lights?
The interaction of the charged particles of the Sun with Earth’s magnetic field causes northern lights.
Researchers have successfully induced lucid dreams involving feelings of compassion and a sense of ego-loss in four participants. The feat was achieved by exposing the quartet to a specially designed virtual reality experience in the hours before bedtime, illustrating the potential of VR to influence subconscious processes and generate lasting psychological changes.
“By bridging the realms of virtual waking and dreaming states, this study opens new avenues for understanding how combining immersive technologies and sleep-engineering technologies might be leveraged for therapeutic and personal growth in waking life,” write the researchers in a new study.
To conduct their unique experiment, the team recruited four people who claimed to have regular lucid dreams, in which a sleeper becomes aware that they are dreaming and can often control elements of the dreamscape. Using virtual reality headsets, the participants were introduced to a program called Ripple, which aimed to generate feelings of awe, oneness and the loss of self – also known as ego-attenuation.
Previous studies have demonstrated that similar VR programs can trigger mystical experiences and ego-dissolution to the same extent as psychedelic drugs. In the case of Ripple, users saw themselves as a glowing sphere of light which then moved in synchrony with other people’s “energetic bodies”, before merging with them to produce a sense of oneness between participants and facilitators.
After an initial introduction to the experience, the volunteers were asked to return to the lab a week later for a second session, this time bringing their pyjamas. Three hours before going to bed, the VR headsets were fired up and participants re-entered the world of Ripple, before the researchers monitored their sleep using electroencephalography (EEG).
When brain activity readings indicated that the slumbering subjects had entered REM sleep, the researchers quietly played sounds from Ripple in an attempt to trigger lucid dreams resembling the VR. “Three participants experienced lucid dreams about Ripple that night, and all four reported dreams containing elements of Ripple,” they write.
Follow-up interviews then confirmed that the emotional and psychological effects of Ripple were recapitulated in these lucid dreams and even spilled over into waking life. For instance, the study authors explain that “Participant 4 reported a profound experience of interconnectedness and ego-dissolution,” while “participants 2 and 3 reported heightened waking sensory perception, such as touch and smell, for several days.”
Despite the small scale of the study, the researchers conclude that their results “underscore a way to expand VR’s benefits via VR-based dreaming.”
“This study opens the door for future research to now test the degree to which lucid dreaming combined with VR can benefit psychological well-being,” they write. “In particular, we envision many ways for dream content to synchronize with ego-attenuation and the perpetuation of awe in VR environments.”
The study has been published in the journal Neuroscience of Consciousness.
A recent coronal mass ejection is expected to disrupt Earth’s magnetic field once again Friday night, potentially bringing the northern lights to more than a dozen states in the northern U.S., according to the National Oceanic and Atmospheric Administration.
Earth’s magnetic field will likely be disrupted by a recent coronal mass ejection, forecasters said.
APA/AFP via Getty Images
Key Facts
NOAA forecast a Kp index of five on a scale of nine, suggesting the northern lights have a minimal chance of being seen as far south as Iowa, with a higher chance of seeing aurora borealis in states along the Canadian border.
Periods of minor geomagnetic storms are expected late Friday and early Saturday because of “influences” from a coronal mass ejection emitted from the sun on Aug. 5, according to NOAA.
Calmer auroral activity is forecast for Saturday and Sunday night with a maximum Kp index of about four and just over three expected, respectively, according to NOAA’s three-day outlook.
Where Will The Northern Lights Be Visible?
Northern Canada and Alaska will have the highest likelihood of viewing the phenomenon, once the sun sets in the state. A lesser chance is forecast for parts of Washington, Idaho, Montana, Wyoming, North Dakota, South Dakota, Minnesota, Iowa, Wisconsin, Michigan, New York, Vermont, New Hampshire and Maine. (See map below.)
Friday night’s view line.
NOAA
What’s The Best Way To See The Northern Lights?
NOAA suggests traveling to a north-facing, high vantage point away from light pollution sometime between 10 p.m. and 2 a.m. local time, when the lights are most active.
What’s The Best Way To Photograph The Northern Lights?
If using a regular camera, photography experts told National Geographic it’s best to use a wide-angle lens, an aperture or F-stop of four or less and a focus set to the furthest possible setting. With a smartphone, NOAA recommends enabling night mode and disabling flash, while also using a tripod to stabilize the image.
Key Background
More people in the U.S. have been exposed to northern lights displays in the last year as activity peaked on the sun’s surface. This peak, a “solar maximum,” occurs once during the sun’s roughly 11-year cycle, alternating with a quieter “solar minimum,” and brings an increase in solar events like coronal mass ejections and solar flares. The swirling, colorful lights of the aurora borealis are created when electrons from these events collide with molecules of oxygen and nitrogen in the Earth’s atmosphere, causing them to become “excited” before releasing energy in the form of light.
Further Reading
Forbes: “Northern Lights Displays Hit A 500-Year Peak In 2024—Here’s Where You Could Catch Aurora Borealis In 2025” by Ty Roush
Researchers at TU Wien (Vienna University of Technology), working with colleagues at ETH Zurich, have unlocked quantum states in glass spheres smaller than a grain of sand without having to resort to ultra-low temperatures.
This record-breaking achievement has pushed the boundaries of quantum physics, making it easier to study quantum properties in ways that were considered impossible before.
Quantum physics is a relatively new field of science that attempts to explain the world around us through the study of matter and energy at atomic and subatomic scales.
While we are only beginning to scratch the surface of this field, applications in areas such as sensing, computation, simulation, and cryptography are already being developed.
As the field expands, researchers are also keen to know the limits of quantum physics. So far, studies have focused on understanding properties such as entanglement or superposition at subatomic levels.
However, researchers at ETH Zurich and TU Wien wondered if objects larger than atoms and molecules also display quantum properties.
Oscillations in quantum states
In the everyday world, we look at oscillations as big movements. For instance, the pendulum of a clock can oscillate at various angles and varying speeds. But as we zoom into microscopic levels, oscillations take a different form. Microscopic particles wobble at all times.
“This oscillation depends on the energy and on how the particle is influenced by its environment and its temperature,” explained Carlos Gonzalez-Ballestero from the Institute of Theoretical Physics at TU Wien, who led the work.
“In the quantum world, however, things are different: if you look at oscillations with very low energy, you find that there are very specific ‘oscillation quanta’.”
The lowest-energy vibration is known as the ground state, and excited states sit above it at discrete, higher energy levels. There are no intermediate states, but a particle can exist in a superposition of different vibration states.
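As a point of reference (this is standard textbook quantum mechanics, not a result of the new study), the allowed energies of a harmonic oscillator with angular frequency $\omega$ are quantized:

$$E_n = \hbar\omega\left(n + \tfrac{1}{2}\right), \qquad n = 0, 1, 2, \ldots$$

Here $n = 0$ is the ground state, each excited state sits exactly one quantum $\hbar\omega$ above the previous one, and a general state can be a superposition $\sum_n c_n \lvert n \rangle$ of these discrete levels.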
To identify the quantum states of a particle, scientists need to isolate it from perturbations arising from its surroundings. This is why quantum experiments are carried out at extremely low temperatures close to absolute zero.
Quantum state at room temperature
The research collaboration worked on a technique to bring a nanoparticle toward its quantum ground state even when it was not ultracooled. The nanoparticle used in the experiments was not perfectly round but slightly elliptical.
“When you hold such a particle in an electromagnetic field, it starts to rotate around an equilibrium orientation, much like the needle of a compass,” added Gonzalez-Ballestero in a press release.
Graphic representation of the system of lasers and mirrors used by scientists in their experiments. Image credit: Lorenzo Dania (ETHZ)
To study the quantum properties of this vibration, the research team used lasers and mirror systems that could perform the dual role of supplying energy to the particle’s rotation or extracting energy from it.
“By adjusting the mirrors in a suitable way, you can ensure that energy is extracted with a high probability and only added with a low probability. The energy of the rotational movement thus decreases until we approach the quantum ground state,” Gonzalez-Ballestero further explained.
The researchers succeeded in bringing the nanoparticle’s rotation to a state resembling the ground state. Interestingly, this was achieved when the particle was several hundred degrees hot, instead of being ultracooled.
“You have to consider different degrees of freedom separately,” said Gonzalez-Ballestero, explaining their achievement. “This allows the energy of the rotational movement to be reduced very effectively without having to reduce the internal thermal energy of the nanoparticle at the same time. Amazingly, the rotation can freeze, so to speak, even though the particle itself has a very high temperature.”
The achievement allows particles to be studied in significantly ‘purer’ quantum states without requiring ultracold temperatures.
The research findings were published in the journal Nature Physics.
Scientists have found new evidence that a massive comet trail may have caused climate upheaval on Earth more than 12,000 years ago.
Tiny particles detected in ocean sediment cores suggest that dust from a large, disintegrating comet entered Earth’s atmosphere around the beginning of the Younger Dryas event, a period of abrupt cooling that caused temperatures in the Northern Hemisphere to plummet by up to 18 degrees Fahrenheit (10 degrees Celsius) within about a year. The researchers shared their findings Aug. 6 in the journal PLOS One.
“The amount of comet dust in the atmosphere was enough to cause a short-term ‘impact winter,’” which led to an extended period of cooling, study co-author Vladimir Tselmovich, an Earth scientist at Borok Geophysical Observatory in Russia, said in a statement.
After 7,000 years of gradual warming, Earth experienced a period of rapid cooling about 12,900 years ago. Dubbed the Younger Dryas, after the wildflowers of the Dryas genus that flourished in colder temperatures, this chillier era lasted about 1,200 years before warming resumed.
Competing hypotheses describe what kicked off the Younger Dryas. Most scientists think cold freshwater lakes poured into oceans as Earth’s glaciers melted, and this weakened large-scale ocean currents that brought warm water northward from the tropics. Others have proposed that impacts from a disintegrating comet filled the atmosphere with dust and destabilized the planet’s ice sheets, triggering long-term cooling.
However, no one has found evidence of an impact crater dated to the start of the Younger Dryas that could have triggered such an event. What’s more, some scientists claim that some of the supposed evidence for the hypothesis — such as “black mats” that contain metals common to asteroids from around the start of the Younger Dryas — could instead be explained by more mundane processes.
In the new study, researchers studied ocean sediment cores from Baffin Bay, between Greenland and Canada, to search for evidence of a possible impact. The team found tiny metallic particles that could have come from comet dust, along with even smaller particles with high levels of platinum and iridium, elements that are common in comets and meteorites.
They also found microscopic spherical particles that most likely formed on Earth but may contain small amounts of material from a comet or asteroid. All of these appeared around the time the Younger Dryas began.
The new study doesn’t directly confirm the impact hypothesis. Instead, the particles act as indirect evidence of an impact or “airburst,” which occurs when a meteor explodes inside a planet’s atmosphere before hitting the ground.
These impacts might have come from a large, disintegrating comet that later gave rise to Comet Encke and the Taurid Complex, the source of the annual Taurid meteor shower, the researchers wrote in the study.
However, more research is needed to confirm this proposal. The team plans to test other ocean cores for similar particles to confirm whether the Younger Dryas began shortly after those particles appear in the geological record.
Shrinking glaciers present some of the most visible signs of ongoing anthropogenic climate change. Their melting also alters landscapes, increases local geohazards, and affects regional freshwater availability and global sea level rise.
Worldwide observations show that global glacier decline through the early 21st century has been historically unprecedented. Modeling research suggests that 16 kilograms of glacier ice melt for every kilogram of carbon dioxide (CO2) emitted, and other studies indicate that every centimeter of sea level rise exposes an additional 2–3 million people to annual flooding.
For more than 130 years, the World Glacier Monitoring Service and its predecessor organizations have coordinated international glacier monitoring. This effort started with the worldwide collection, analysis, and distribution of in situ observations [World Glacier Monitoring Service, 2023]. In the 20th century, remote sensing data from airborne and spaceborne sensors began complementing field observations.
Over the past 2 decades, science has benefited from a diverse set of spaceborne missions and strategies for estimating glacier mass changes at regional to global scales (see Figure 2 of Berthier et al. [2023]). However, this research is challenged by its dependence on the open accessibility of observations from scientific satellite missions. The continuation and accessibility of several satellite missions are now at risk, presenting potential major gaps in our ability to observe glaciers from space.
We discuss here the history, strengths, and limitations of several strategies for tracking changes in glaciers and how combining studies from the multiple approaches available—as exemplified by a recent large-scale effort within the research community—improves the accuracy of analyses of glacial mass changes. We also outline actions required to secure the future of long-term glacier monitoring.
The Glacier Mass Balance Intercomparison Exercise
Glaciological observations from in situ measurements of ablation and accumulation, generally carried out with ablation stakes and in snow pits, represent the backbone of glacier mass balance monitoring. For more than 30 years, glaciologists have undertaken this work at 60 reference glaciers worldwide, with some observations extending back to the early 20th century.
Researchers conduct glaciological fieldwork, using ablation stakes and other tools, on Findelengletscher in Zermatt, Switzerland, in October 2024. Credit: Andreas Linsbauer, University of Zurich
These observations provide good estimates of the interannual variability of glacier mass balance and have been vital for process understanding, model calibration, and long-term monitoring. However, because of the limited spatial coverage of in situ observations, the long-term trends they indicate may not accurately represent mass change across entire glacial regions, and some largely glacierized regions are critically undersampled.
Airborne geodetic surveys provide wider views of individual glaciers compared with point measurements on the ground, and comparing ice elevation changes in airborne data allows researchers to identify and quantify biases in field observations at the glacier scale [Zemp et al., 2013]. Meanwhile, geodetic surveys from spaceborne sensors enable many opportunities to assess glacier elevation and mass changes at regional to global scales.
The Glacier Mass Balance Intercomparison Exercise (GlaMBIE), launched in 2022, combined observations from in situ and remote sensing approaches, compiling 233 regional glacier mass change estimates from about 450 data contributors organized in 35 research teams.
The results of this community effort, published in February 2025, show that since 2000, glaciers have lost between 2% and 39% of their mass depending on the region and about 5% globally [The GlaMBIE Team, 2025]. These cumulative losses amount to 273 gigatons of water annually and contribute 0.75 millimeter to mean global sea level rise each year. Compared with recent estimates for the ice sheets [Otosaka et al., 2023], glacier mass loss is about 18% larger than the loss from the Greenland Ice Sheet and more than twice the loss from the Antarctic Ice Sheet.
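As a rough plausibility check (the ocean surface area of about $3.6\times10^{14}\ \mathrm{m^2}$ is a standard figure, not one given in this article), spreading 273 gigatons of meltwater over the ocean gives

$$\frac{273\times10^{12}\ \mathrm{kg}}{1000\ \mathrm{kg\,m^{-3}}\times 3.6\times10^{14}\ \mathrm{m^2}} \approx 7.6\times10^{-4}\ \mathrm{m} \approx 0.76\ \mathrm{mm\ per\ year},$$

consistent with the reported 0.75-millimeter annual contribution to sea level rise.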
GlaMBIE provided the first comprehensive assessment of glacier mass change measurements from heterogeneous in situ and spaceborne observations and a new observational baseline for global glacier change and impact assessments [Berthier et al., 2023]. It also revealed opportunities and challenges ahead for monitoring glaciers from space.
Strategies for Glacier Monitoring from Space
GlaMBIE used a variety of technologies and approaches for studying glaciers from space. Many rely on repeated mapping of surface elevations to create digital elevation models (DEMs) and determine glacier elevation changes. This method provides multiannual views of glacier volume changes but requires assumptions about the density of snow, firn, and ice to convert volume changes to mass changes, which adds uncertainty because conversion factors can vary substantially.
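In its simplest form the geodetic conversion is (the specific density value below is a commonly assumed figure in the glaciological literature, not one stated in this article):

$$\Delta M = \rho_{\mathrm{conv}}\,\Delta V \approx 850\ \mathrm{kg\,m^{-3}} \times \int_A \Delta h\,\mathrm{d}A,$$

where $\Delta h$ is the measured elevation change over the glacier area $A$ and $\rho_{\mathrm{conv}}$ must lie somewhere between the density of snow and firn and that of glacier ice (roughly 900 kg/m³); uncertainty in this conversion factor propagates directly into the mass change estimate.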
Optical stereophotogrammetry applied to spaceborne imagery allows assessment of glacier elevation changes across scales. Analysis of imagery from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) on board Terra from 2000 to 2019 yielded elevation changes at 100-meter horizontal resolution for almost all of Earth’s glaciers [Hugonnet et al., 2021]. At the regional scale, finer spatial resolution is possible with the ongoing French Satellite pour l’Observation de la Terre (SPOT) mission series and with satellites like GeoEye, Pléiades, and WorldView.
The Glacier Mass Balance Intercomparison Exercise (GlaMBIE) used data from a fleet of satellites that monitor glaciers worldwide using optical, radar, laser, and gravity measurements. Clockwise from left in this image are illustrations of Terra, CryoSat, ICESat-2, and the twin GRACE spacecraft above a map of elevation change for the Vatnajökull ice cap in Iceland. Credit: ESA/NASA/Planetary Visions
Two spaceborne missions applying synthetic aperture radar (SAR) interferometry have been used to assess glacier elevation changes. In February 2000, the Shuttle Radar Topography Mission (SRTM) produced a near-global DEM at a spatial resolution of 30 meters. The second mission, TerraSAR-X add-on for Digital Elevation Measurement (TanDEM-X), has operated since 2010, providing worldwide DEMs at a pixel spacing of 12 meters.
Data from both missions can be used to assess glacier elevation changes in many regions [Braun et al., 2019], capitalizing on the high spatial resolution and the ability of radar signals to penetrate clouds. However, measurements using C- and X-band radar—as both SRTM and TanDEM-X have—are subject to uncertainties because of topographic complexity in high mountain terrain and because the radar signals can penetrate into snow and firn.
Laser and radar altimetry allow us to determine glacier elevation changes along ground tracks or swaths, which can be aggregated to produce regional estimates. Laser altimetry has been carried out by NASA’s Ice, Cloud and Land Elevation Satellites (ICESat and ICESat-2) and the Global Ecosystem Dynamics Investigation (GEDI) on board the International Space Station (ISS) [Menounos et al., 2024; Treichler et al., 2019].
Spaceborne radar altimetry has a long tradition of measuring ocean and land surfaces, but these missions’ large detection footprints and the challenges of mountainous terrain hampered the use of early missions (e.g., ERS, Envisat) for glacier applications. The European Space Agency’s (ESA) CryoSat-2, which launched in 2010, offered improved coverage of the polar regions, denser ground coverage, a sharper footprint, and other enhanced capabilities that opened its use for monitoring global glacier elevation changes [Jakob and Gourmelen, 2023].
Both laser altimetry and radar altimetry provide elevation change time series at monthly or quarterly resolution for regions with large ice caps and ice fields (e.g., Alaska, the Canadian Arctic, Svalbard, and the periphery of the Greenland and Antarctic Ice Sheets). However, assessing mountain regions with smaller glaciers (e.g., Scandinavia, central Europe, Caucasus, and New Zealand) remains challenging because of the complex terrain and limited spatial coverage.
Spaceborne gravimetry offers an alternative to elevation-focused methods, allowing scientists to estimate mass changes across the ocean and in water and ice reservoirs by measuring changes in Earth’s gravitational field. Two missions, the NASA/German Aerospace Center Gravity Recovery and Climate Experiment (GRACE) and the NASA/GFZ Helmholtz Centre for Geosciences GRACE Follow-on mission (GRACE-FO), have provided such measurements almost continuously since 2002.
Gravimetry offers more direct estimates of glacier mass change than other methods [Wouters et al., 2019]. However, the data must be corrected to account for effects of atmospheric drag and oceanic variability, glacial isostatic adjustment, nonglacier hydrological features, and other factors [Berthier et al., 2023].
Estimates from the GRACE and GRACE-FO missions are most relevant for regions with large areas of glacier coverage (>15,000 square kilometers) because of the relatively coarse resolution (a few hundred kilometers) of the gravity data and because of issues such as poor signal-to-noise ratios in regions with small glacier areas and related small mass changes.
Securing Glacier Monitoring over the Long Term
The work of GlaMBIE to combine observations from the diverse approaches above points to several interconnected requirements and challenges for improving the comprehensiveness of global glacier change assessments and securing the future of long-term glacier monitoring.
First, we must extend the existing network of in situ glaciological observations to fill major data gaps. Such gaps remain in many regions, including Central Asia, Karakoram, Kunlun, and the Central Andes, where glaciers are vital for freshwater availability, as well as in the polar regions, where glaciers are key contributors to sea level rise. These networks also need to be updated to provide real-time monitoring, which will improve understanding of glacier processes and help calibrate and validate remote sensing data and numerical modeling.
A sequence of aerial photographs taken in 1980 from about 11,000-meter altitude of Grey Glacier in the Southern Patagonian Ice Field (top) was used to generate a 3D model of the glacier (bottom). Credit: 3D reconstruction by Livia Piermattei and Camilo Rada using images from the Servicio Aerofotogramétrico de la Fuerza Aérea de Chile (SAF)
Second, we must continue unlocking historical archives of airborne and spaceborne missions to expand the spatiotemporal coverage of the observations used in glacier mass change assessments. Data from declassified spy satellites, such as CORONA and Hexagon, have provided stereo observing capabilities at horizontal resolutions of a few meters and offer potential to assess glacier elevation changes back to the 1960s and 1970s. Aerial photography has provided unique opportunities to reconstruct glacier changes since the early 20th century, such as in Svalbard, Greenland, and Antarctica. Beyond individual and institutional studies, we need open access to entire national archives of historical images to safeguard records of past glacier changes on a global scale.
Third, we must ensure the continuation of space-based glacier monitoring with open-access and high-resolution sensors, following the examples of the Sentinel missions, SPOT 5, and Pléiades. For optical sensors, there is an urgent need for new missions collecting open-access, high-resolution stereo imagery. The French space agency’s forthcoming CO3D (Constellation Optique en 3D) mission, scheduled for launch in 2025, will help meet this need if its data are openly available. But with the anticipated decommissioning of ASTER [Berthier et al., 2024] and the suspended production of new ArcticDEM and REMA (Reference Elevation Model of Antarctica) products from WorldView satellite data as a result of recent U.S. funding cuts, additional replacement missions are needed to observe elevation changes over individual glaciers.
For radar imaging and altimetry, the SAR mission TanDEM-X and the radar altimeter CryoSat-2 are still operating with expected mission extensions into the late 2020s, and ESA’s SAR-equipped Harmony mission is expected to launch in 2029. With the planned launch of the Copernicus Polar Ice and Snow Topography Altimeter (CRISTAL) in 2027, ESA aims to establish a long-term, cryosphere-specific monitoring program.
The challenge with CRISTAL will be to ensure that its sensor and mission specifications are tailored for application over glaciers—a difficult task because of glaciers’ relatively small sizes, steep slopes, and distribution in mountain terrain. In the event of a gap between the end of CryoSat-2 and the start of CRISTAL, a bridging airborne campaign, similar to NASA’s Operation IceBridge between ICESat and ICESat-2, will be needed.
Grosser Aletschgletscher in Switzerland, seen in October 2015. Credit: Jürg Alean, SwissEduc, Glaciers online
For laser altimetry, we may face gaps in observations as well, as no follow-on is planned for the current ICESat-2 and GEDI science missions. ICESat-2, which measures Earth’s glacier topography and provides validation and coregistration points for other altimetry missions, is projected to run until the early to mid-2030s, but continuing missions must be initiated now. Future missions should combine high accuracy with full coverage over individual glaciers. Proposed concepts with swath lidar, such as EDGE (Earth Dynamics Geodetic Explorer) and CASALS (Concurrent Artificially-Intelligent Spectrometry and Adaptive Lidar System), could be game changers for repeated mapping because they would provide full coverage of glacier topography. Advancing Surface Topography and Vegetation (STV) as a targeted observable for NASA missions, as recommended by the National Academies’ 2017–2027 Decadal Survey, could extend such observations beyond the current science missions.
For gravimetry, we also face a potential gap in observations, depending on when GRACE-FO is decommissioned and when approved follow-up missions—GRACE-C and NGGM (Next Generation Gravity Mission)—launch. Regardless of launch dates, the usefulness of future missions for monitoring glacier mass changes across regions will strongly depend on the spatial resolution of their data and on the ability to separate glacier and nonglacier signals. Cofounded gravity missions such as the Mass-Change and Geosciences International Constellation (MAGIC), a planned joint NASA-ESA project with four satellites operating in pairs, could significantly improve the spatial and temporal resolution of the gravity data and their utility for glacier monitoring.
Bringing It All Together
Glaciers worldwide are diminishing with global warming, and they’re doing so at alarming rates, affecting everything from geohazards to freshwater supplies to sea level rise. Understanding as well as possible the details of glacier change from place to place and how these changes may affect different communities requires combining careful observations from a variety of field, airborne, and—increasingly—spaceborne approaches.
In light of major existing and impending observational gaps, the scientific community along with government bodies and others should work together to expand access to relevant historical data and extend present-day monitoring capabilities. Most important, space agencies and their sponsor nations must work rapidly to replace and improve upon current satellite missions to ensure long-term glacier monitoring from space. Given the climate crisis, we also call for open scientific access to data from commercial and defense missions to fill gaps and complement civil missions.
As the work of GlaMBIE reiterated, the more complete the datasets we have, the better positioned we will be to comprehend and quantify glacier changes and related downstream impacts.
Acknowledgments
We thank Etienne Berthier, Dana Floricioiu, and Noel Gourmelen for their contributions to this article and all coauthors and data contributors of GlaMBIE for constructive and fruitful discussions during the project, which built the foundation to condense the information presented here. This article was enabled by support from ESA projects GlaMBIE (4000138018/22/I-DT) and The Circle (4000145640/24/NL/SC), with additional contributions from the International Association of Cryospheric Sciences (IACS).
References
Berthier, E., et al. (2023), Measuring glacier mass changes from space—A review, Rep. Prog. Phys., 86(3), 036801, https://doi.org/10.1088/1361-6633/acaf8e.
Berthier, E., et al. (2024), Earth-surface monitoring is at risk—More imaging tools are urgently needed, Nature, 630(8017), 563, https://doi.org/10.1038/d41586-024-02052-x.
Braun, M. H., et al. (2019), Constraining glacier elevation and mass changes in South America, Nat. Clim. Change, 9(2), 130–136, https://doi.org/10.1038/s41558-018-0375-7.
Hugonnet, R., et al. (2021), Accelerated global glacier mass loss in the early twenty-first century, Nature, 592(7856), 726–731, https://doi.org/10.1038/s41586-021-03436-z.
Jakob, L., and N. Gourmelen (2023), Glacier mass loss between 2010 and 2020 dominated by atmospheric forcing, Geophys. Res. Lett., 50(8), e2023GL102954, https://doi.org/10.1029/2023GL102954.
Menounos, B., et al. (2024), Brief communication: Recent estimates of glacier mass loss for western North America from laser altimetry, Cryosphere, 18(2), 889–894, https://doi.org/10.5194/tc-18-889-2024.
Otosaka, I. N., et al. (2023), Mass balance of the Greenland and Antarctic Ice Sheets from 1992 to 2020, Earth Syst. Sci. Data, 15(4), 1,597–1,616, https://doi.org/10.5194/essd-15-1597-2023.
The GlaMBIE Team (2025), Community estimate of global glacier mass changes from 2000 to 2023, Nature, 639, 382–388, https://doi.org/10.1038/s41586-024-08545-z.
Treichler, D., et al. (2019), Recent glacier and lake changes in high mountain Asia and their relation to precipitation changes, Cryosphere, 13(11), 2,977–3,005, https://doi.org/10.5194/tc-13-2977-2019.
World Glacier Monitoring Service (2023), Global Glacier Change Bulletin No. 5 (2020–2021), edited by M. Zemp et al., 134 pp., Zurich, Switzerland, https://wgms.ch/downloads/WGMS_GGCB_05.pdf.
Wouters, B., A. S. Gardner, and G. Moholdt (2019), Global glacier mass loss during the GRACE satellite mission (2002–2016), Front. Earth Sci., 7, 96, https://doi.org/10.3389/feart.2019.00096.
Zemp, M., et al. (2013), Reanalysing glacier mass balance measurement series, Cryosphere, 7(4), 1,227–1,245, https://doi.org/10.5194/tc-7-1227-2013.
Author Information
Michael Zemp ([email protected]), University of Zurich, Switzerland; Livia Jakob, Earthwave Ltd., Edinburgh, U.K.; Fanny Brun, Université Grenoble Alpes, Grenoble, France; Tyler Sutterley, University of Washington, Seattle; and Brian Menounos, University of Northern British Columbia, Prince George, Canada
Citation: Zemp, M., L. Jakob, F. Brun, T. Sutterley, and B. Menounos (2025), Glacier monitoring from space is crucial, and at risk, Eos, 106, https://doi.org/10.1029/2025EO250290. Published on [DAY MONTH] 2025.
In medicine and biotechnology, the ability to evolve proteins with new or improved functions is crucial, but current methods are often slow and laborious. Now, Scripps Research scientists have developed a synthetic biology platform that accelerates evolution itself, enabling researchers to evolve proteins with useful, new properties thousands of times faster than nature. The system, named T7-ORACLE, was described in Science on August 7, 2025, and represents a breakthrough in how researchers can engineer therapeutic proteins for cancer, neurodegeneration and essentially any other disease area.
“This is like giving evolution a fast-forward button. You can now evolve proteins continuously and precisely inside cells without damaging the cell’s genome or requiring labor-intensive steps.”
Pete Schultz, co-senior author, the President and CEO of Scripps Research, where he also holds the L.S. “Sam” Skaggs Presidential Chair
Directed evolution is a laboratory process that involves introducing mutations and selecting variants with improved function over multiple cycles. It’s used to tailor proteins with desired properties, such as highly selective, high-affinity antibodies or enzymes with new specificities or catalytic properties, or to investigate the emergence of resistance mutations in drug targets. However, traditional methods often require repeated rounds of DNA manipulation and testing, with each round taking a week or more. Systems for continuous evolution, where proteins evolve inside living cells without manual intervention, aim to streamline this process by enabling simultaneous mutation and selection with each round of cell division (roughly 20 minutes for bacteria). But existing approaches have been limited by technical complexity or modest mutation rates.
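To make that cadence concrete, here is a quick back-of-envelope comparison; the roughly 500-fold figure is simple arithmetic from the numbers above, not a value reported in the paper.

```python
minutes_per_week = 7 * 24 * 60     # 10,080 minutes in one week
division_time_min = 20             # E. coli doubling time cited above (~20 minutes)

continuous_rounds = minutes_per_week / division_time_min  # one mutation/selection round per division
manual_rounds = 1                                          # roughly one classical round per week

print(f"continuous rounds per week: {continuous_rounds:.0f}")                    # ~504
print(f"speedup over manual cycling: {continuous_rounds / manual_rounds:.0f}x")  # ~504x
```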
T7-ORACLE circumvents these bottlenecks by engineering E. coli bacteria, a standard model organism in molecular biology, to host a second, artificial DNA replication system derived from bacteriophage T7, a virus that infects bacteria and has been widely studied for its simple, efficient replication system. T7-ORACLE enables continuous hypermutation and accelerated evolution of biomacromolecules, and is designed to be broadly applicable to many protein targets and biological challenges. Conceptually, T7-ORACLE builds on and extends efforts on existing orthogonal replication systems (meaning they operate separately from the cell’s own machinery), such as OrthoRep in Saccharomyces cerevisiae (baker’s yeast) and EcORep in E. coli. In comparison to these systems, T7-ORACLE benefits from the combination of high mutagenesis, fast growth, high transformation efficiency, and the ease with which both the E. coli host and the circular replicon plasmid can be integrated into standard molecular biology workflows.
The T7-ORACLE orthogonal system targets only plasmid DNA (small, circular pieces of genetic material), leaving the host cell’s genome untouched. By engineering T7 DNA polymerase (a viral enzyme that replicates DNA) to be error-prone, the researchers introduced mutations into target genes at a rate 100,000 times higher than normal without damaging the host cells.
“This system represents a major advance in continuous evolution,” says co-senior author Christian Diercks, an assistant professor of chemistry at Scripps Research. “Instead of one round of evolution per week, you get a round each time the cell divides-so it really accelerates the process.”
To demonstrate the power of T7-ORACLE, the research team inserted a common antibiotic resistance gene, TEM-1 β-lactamase, into the system and exposed the E. coli cells to escalating doses of various antibiotics. In less than a week, the system evolved versions of the enzyme that could resist antibiotic levels up to 5,000 times higher than the original. This proof-of-concept demonstrated not only T7-ORACLE’s speed and precision, but also its real-world relevance by replicating how resistance develops in response to antibiotics.
“The surprising part was how closely the mutations we saw matched real-world resistance mutations found in clinical settings,” notes Diercks. “In some cases, we saw new combinations that worked even better than those you would see in a clinic.”
But Diercks emphasizes that the study isn’t focused on antibiotic resistance per se.
“This isn’t a paper about TEM-1 β-lactamase,” he explains. “That gene was just a well-characterized benchmark to show how the system works. What matters is that we can now evolve virtually any protein, like cancer drug targets and therapeutic enzymes, in days instead of months.”
The broader potential of T7-ORACLE lies in its adaptability as a platform for protein engineering. Although the system is built into E. coli, the bacterium serves primarily as a vessel for continuous evolution. Scientists can insert genes from humans, viruses or other sources into plasmids, which are then introduced into E. coli cells. T7-ORACLE mutates these genes, generating variant proteins that can be screened or selected for improved function. Because E. coli is easy to grow and widely used in labs, it provides a convenient, scalable system for evolving virtually any protein of interest.
This could help scientists more rapidly evolve antibodies to target specific cancers, evolve more effective therapeutic enzymes, and design proteases that target proteins involved in cancer and neurodegenerative disease.
“What’s exciting is that it’s not limited to one disease or one kind of protein,” says Diercks. “Because the system is customizable, you can drop in any gene and evolve it toward whatever function you need.”
Moreover, T7-ORACLE works with standard E. coli cultures and widely used lab workflows, avoiding the complex protocols required by other continuous evolution systems.
“The main thing that sets this apart is how easy it is to implement,” adds Diercks. “There’s no specialized equipment or expertise required. If you already work with E. coli, you can probably use this system with minimal adjustments.”
T7-ORACLE reflects Schultz’s broader goal: to rebuild key biological processes, such as DNA replication, RNA transcription and protein translation, so they function independently of the host cell. This separation allows scientists to reprogram these processes without disrupting normal cellular activity. By decoupling fundamental processes from the genome, tools like T7-ORACLE help advance synthetic biology.
“In the future, we’re interested in using this system to evolve polymerases that can replicate entirely unnatural nucleic acids: synthetic molecules that resemble DNA and RNA but with novel chemical properties,” says Diercks. “That would open up possibilities in synthetic genomics that we’re just beginning to explore.”
Currently, the research team is focused on evolving human-derived enzymes for therapeutic use, and on tailoring proteases to recognize specific cancer-related protein sequences.
“The T7-ORACLE approach merges the best of both worlds,” says Schultz. “We can now combine rational protein design with continuous evolution to discover functional molecules more efficiently than ever.”
Source:
Scripps Research Institute
Journal reference:
Diercks, C. S., et al. (2025) An orthogonal T7 replisome for continuous hypermutation and accelerated evolution in E. coli. Science. doi.org/10.1126/science.adp9583.