Category: 7. Science

  • Northern lights may appear in 18 US states

    The northern lights could be visible in 18 U.S. states on August 7 and 8, caused by a G2 geomagnetic storm.

    Space weather forecasters predicted minor (G1) to moderate (G2) geomagnetic storm levels, with a chance of a strong (G3) storm.

    The storm is a result of a coronal mass ejection (CME) from the sun. The charged particles, created by an explosion in the outermost atmosphere of the sun, have been rushing towards Earth since Tuesday, August 5.

    These particles interact with the magnetic field of our planet in a way that triggers aurora borealis, also known as northern lights.

    According to the U.S. National Oceanic and Atmospheric Administration (NOAA), the northern lights could be visible in Washington, New York, Michigan, New Hampshire, Alaska, Minnesota, Montana, Illinois, Nebraska, Oregon, Idaho, Vermont, Maine, South Dakota, North Dakota, Wisconsin, Iowa, and Wyoming.

    Aurora requires a dark sky to be visible. The prime viewing window will be between 9 p.m. and midnight, when the “moderate” storm is expected to make the auroras most visible.

    To view the dazzling colours, go out to an open place away from city lights and look toward the northern horizon.

    According to NOAA, the geomagnetic storm could cause “manageable effects to some technological infrastructure.”

    Aurora-forecasting apps and NOAA’s Space Weather Prediction Center provide real-time updates.

    If the sky is clear and you are standing in the right place, such as on a lakeside or in a field, you will find colourful (green, purple and reddish) hues among the stars.

    What causes the Northern Lights?

    The interaction of the charged particles of the Sun with Earth’s magnetic field causes northern lights.

  • Scientists Used Virtual Reality To Alter People’s Lucid Dreams

    Researchers have successfully induced lucid dreams involving feelings of compassion and a sense of ego-loss in four participants. The feat was achieved by exposing the quartet to a specially designed virtual reality experience in the hours before bedtime, illustrating the potential of VR to influence subconscious processes and generate lasting psychological changes.

    “By bridging the realms of virtual waking and dreaming states, this study opens new avenues for understanding how combining immersive technologies and sleep-engineering technologies might be leveraged for therapeutic and personal growth in waking life,” write the researchers in a new study.

    To conduct their unique experiment, the team recruited four people who claimed to have regular lucid dreams, in which a sleeper becomes aware that they are dreaming and can often control elements of the dreamscape. Using virtual reality headsets, the participants were introduced to a program called Ripple, which aimed to generate feelings of awe, oneness and the loss of self – also known as ego-attenuation.

    Previous studies have demonstrated that similar VR programs can trigger mystical experiences and ego-dissolution to the same extent as psychedelic drugs. In the case of Ripple, users saw themselves as a glowing sphere of light which then moved in synchrony with other people’s “energetic bodies”, before merging with them to produce a sense of oneness between participants and facilitators.

    After an initial introduction to the experience, the volunteers were asked to return to the lab a week later for a second session, this time bringing their pyjamas. Three hours before going to bed, the VR headsets were fired up and participants re-entered the world of Ripple, before the researchers monitored their sleep using electroencephalography (EEG).

    When brain activity readings indicated that the slumbering subjects had entered REM sleep, the researchers quietly played sounds from Ripple in an attempt to trigger lucid dreams resembling the VR. “Three participants experienced lucid dreams about Ripple that night, and all four reported dreams containing elements of Ripple,” they write.

    Follow-up interviews then confirmed that the emotional and psychological effects of Ripple were recapitulated in these lucid dreams and even spilled over into waking life. For instance, the study authors explain that “Participant 4 reported a profound experience of interconnectedness and ego-dissolution,” while “participants 2 and 3 reported heightened waking sensory perception, such as touch and smell, for several days.”

    Despite the small scale of the study, the researchers conclude that their results “underscore a way to expand VR’s benefits via VR-based dreaming.”

    “This study opens the door for future research to now test the degree to which lucid dreaming combined with VR can benefit psychological well-being,” they write. “In particular, we envision many ways for dream content to synchronize with ego-attenuation and the perpetuation of awe in VR environments.”

    The study has been published in the journal Neuroscience of Consciousness.

  • 15 States Might See Aurora Borealis Friday Night

    Topline

    A recent coronal mass ejection is expected to disrupt Earth’s magnetic field once again Friday night, potentially bringing the northern lights to more than a dozen states in the northern U.S., according to the National Oceanic and Atmospheric Administration.

    Key Facts

    NOAA forecast a Kp index of five on a scale of nine, suggesting the northern lights have a minimal chance of being seen as far south as Iowa, with a higher chance of seeing aurora borealis in states along the Canadian border.

    Periods of minor geomagnetic storms are expected late Friday and early Saturday because of “influences” from a coronal mass ejection emitted from the sun on Aug. 5, according to NOAA.

    Calmer auroral activity is forecast for Saturday and Sunday nights, with maximum Kp indices of about four and just over three expected, respectively, according to NOAA’s three-day outlook.

    Where Will The Northern Lights Be Visible?

    Northern Canada and Alaska will have the highest likelihood of viewing the phenomenon once the sun sets there. A lesser chance is forecast for parts of Washington, Idaho, Montana, Wyoming, North Dakota, South Dakota, Minnesota, Iowa, Wisconsin, Michigan, New York, Vermont, New Hampshire and Maine.

    What’s The Best Way To See The Northern Lights?

    NOAA suggests traveling to a north-facing, high vantage point away from light pollution sometime between 10 p.m. and 2 a.m. local time, when the lights are most active.

    What’s The Best Way To Photograph The Northern Lights?

    If using a regular camera, photography experts told National Geographic it’s best to use a wide-angle lens, an aperture or F-stop of four or less and a focus set to the furthest possible setting. With a smartphone, NOAA recommends enabling night mode and disabling flash, while also using a tripod to stabilize the image.

    Key Background

    More people in the U.S. have been exposed to northern lights displays in the last year as activity peaked on the sun’s surface. This peak, a “solar maximum,” occurs during the sun’s 11-year cycle, alternating with a “solar minimum,” and brings an increase in solar events like coronal mass ejections and solar flares. The swirling, colorful lights of the aurora borealis are created when electrons from these events collide with oxygen and nitrogen molecules in Earth’s atmosphere, “exciting” them before they release energy in the form of light.

    Further Reading

    Forbes: Northern Lights Displays Hit A 500-Year Peak In 2024—Here’s Where You Could Catch Aurora Borealis In 2025

  • Quantum state unlocked in object at room temperature in world-first

    Researchers at TU Wien (Vienna University of Technology), in collaboration with colleagues at ETH Zurich, have unlocked quantum states in glass spheres smaller than a grain of sand, without having to resort to ultra-low temperatures.

    This record-breaking achievement has pushed the boundaries of quantum physics, making it easier to study quantum properties in ways that were considered impossible before. 

    Quantum physics is a relatively young field of science that attempts to explain the world around us through the study of matter and energy at atomic and subatomic scales.

    While we are still beginning to scratch the surface of this field, applications in areas such as sensing, computation, simulation and cryptography are already being developed.

    As the field expands, researchers are also keen to know the limits of quantum physics. So far, studies have focused on understanding properties such as entanglement or superposition at subatomic levels.

    However, researchers at ETH Zurich and TU Wien wondered whether objects larger than atoms and molecules also display quantum properties.

    Oscillations in quantum states

    In the everyday world, we look at oscillations as big movements. For instance, the pendulum of a clock can oscillate at various angles and varying speeds. But as we zoom into microscopic levels, oscillations take a different form. Microscopic particles wobble at all times. 

    “This oscillation depends on the energy and on how the particle is influenced by its environment and its temperature,” explained Carlos Gonzalez-Ballestero from the Institute of Theoretical Physics at TU Wien, who led the work. 

    “In the quantum world, however, things are different: if you look at oscillations with very low energy, you find that there are very specific ‘oscillation quanta’.”

    The minimum vibration amplitude is known as the ground state; excited states occur at successively higher vibration amplitudes and energy levels. While there are no intermediate states, a particle can exist in a combination of different vibration states.
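
    These discrete vibration states follow the textbook quantum harmonic oscillator (a standard result of quantum mechanics, not a formula from this study), whose allowed energies are:

    ```latex
    E_n = \hbar \omega \left( n + \tfrac{1}{2} \right), \qquad n = 0, 1, 2, \ldots
    ```

    Here ω is the oscillation frequency and n = 0 is the ground state; no energies exist between successive levels, though a particle can occupy a superposition of several values of n.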

    To identify the quantum states of a particle, scientists need to isolate it from perturbations arising from its surroundings. This is why quantum experiments are carried out at extremely low temperatures close to absolute zero. 

    Quantum state at room temperature

    The research collaboration worked on a technique to bring a nanoparticle to its quantum ground state even when it was not ultracooled. The nanoparticle used in the experiments was not perfectly round but slightly elliptical.

    “When you hold such a particle in an electromagnetic field, it starts to rotate around an equilibrium orientation, much like the needle of a compass,” added Gonzalez-Ballestero in a press release. 

    Graphic representation of the system of lasers and mirrors used by scientists in their experiments. Image credit: Lorenzo Dania (ETHZ)

    To study the quantum properties of this vibration, the research team used lasers and mirror systems that could both supply energy to and extract energy from it.

    “By adjusting the mirrors in a suitable way, you can ensure that energy is extracted with a high probability and only added with a low probability. The energy of the rotational movement thus decreases until we approach the quantum ground state,” Gonzalez-Ballestero further explained.

    The researchers succeeded in bringing the nanoparticle’s rotation to a state resembling the ground state. Interestingly, this was achieved while the particle itself was several hundred degrees hot, rather than ultracooled.

    “You have to consider different degrees of freedom separately,” said Gonzalez-Ballestero, explaining their achievement. “This allows the energy of the rotational movement to be reduced very effectively without having to reduce the internal thermal energy of the nanoparticle at the same time. Amazingly, the rotation can freeze, so to speak, even though the particle itself has a very high temperature.” 

    The achievement allows particles to be studied in significantly ‘purer’ quantum states without requiring ultracold temperatures. 

    The research findings were published in the journal Nature Physics.  

  • Massive comet trail may have transformed Earth’s climate more than 12,000 years ago, tiny particles suggest

    Scientists have found new evidence that a massive comet trail may have caused climate upheaval on Earth more than 12,000 years ago.

    Tiny particles detected in ocean sediment cores suggest that dust from a large, disintegrating comet entered Earth’s atmosphere around the beginning of the Younger Dryas event, a period of abrupt cooling that caused temperatures in the Northern Hemisphere to plummet by up to 18 degrees Fahrenheit (10 degrees Celsius) within about a year. The researchers shared their findings Aug. 6 in the journal PLOS One.

  • Glacier Monitoring from Space Is Crucial, and at Risk

    Shrinking glaciers present some of the most visible signs of ongoing anthropogenic climate change. Their melting also alters landscapes, increases local geohazards, and affects regional freshwater availability and global sea level rise.

    Worldwide observations show that global glacier decline through the early 21st century has been historically unprecedented. Modeling research suggests that 16 kilograms of glacier ice melt for every kilogram of carbon dioxide (CO2) emitted, and other studies indicate that every centimeter of sea level rise exposes an additional 2–3 million people to annual flooding.

    For more than 130 years, the World Glacier Monitoring Service and its predecessor organizations have coordinated international glacier monitoring. This effort started with the worldwide collection, analysis, and distribution of in situ observations [World Glacier Monitoring Service, 2023]. In the 20th century, remote sensing data from airborne and spaceborne sensors began complementing field observations.

    Over the past 2 decades, science has benefited from a diverse set of spaceborne missions and strategies for estimating glacier mass changes at regional to global scales (see Figure 2 of Berthier et al. [2023]). However, this research is challenged by its dependence on the open accessibility of observations from scientific satellite missions. The continuation and accessibility of several satellite missions are now at risk, presenting potential major gaps in our ability to observe glaciers from space.

    We discuss here the history, strengths, and limitations of several strategies for tracking changes in glaciers and how combining studies from the multiple approaches available—as exemplified by a recent large-scale effort within the research community—improves the accuracy of analyses of glacial mass changes. We also outline actions required to secure the future of long-term glacier monitoring.

    The Glacier Mass Balance Intercomparison Exercise

    Glaciological observations from in situ measurements of ablation and accumulation, generally carried out with ablation stakes and in snow pits, represent the backbone of glacier mass balance monitoring. For more than 30 years, glaciologists have undertaken this work at 60 reference glaciers worldwide, with some observations extending back to the early 20th century.

    Researchers conduct glaciological fieldwork, using ablation stakes and other tools, on Findelengletscher in Zermatt, Switzerland, in October 2024. Credit: Andreas Linsbauer, University of Zurich

    These observations provide good estimates of the interannual variability of glacier mass balance and have been vital for process understanding, model calibration, and long-term monitoring. However, because of the limited spatial coverage of in situ observations, the long-term trends they indicate may not accurately represent mass change across entire glacial regions, and some largely glacierized regions are critically undersampled.

    Airborne geodetic surveys provide wider views of individual glaciers compared with point measurements on the ground, and comparing ice elevation changes in airborne data allows researchers to identify and quantify biases in field observations at the glacier scale [Zemp et al., 2013]. Meanwhile, geodetic surveys from spaceborne sensors enable many opportunities to assess glacier elevation and mass changes at regional to global scales.

    The Glacier Mass Balance Intercomparison Exercise (GlaMBIE), launched in 2022, combined observations from in situ and remote sensing approaches, compiling 233 regional glacier mass change estimates from about 450 data contributors organized in 35 research teams.

    The results of this community effort, published in February 2025, show that since 2000, glaciers have lost between 2% and 39% of their mass depending on the region and about 5% globally [The GlaMBIE Team, 2025]. These cumulative losses amount to 273 gigatons of water annually and contribute 0.75 millimeter to mean global sea level rise each year. Compared with recent estimates for the ice sheets [Otosaka et al., 2023], glacier mass loss is about 18% larger than the loss from the Greenland Ice Sheet and more than twice the loss from the Antarctic Ice Sheet.
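
    The reported sea level contribution can be sanity-checked with a quick back-of-the-envelope conversion. The ocean surface area below is an assumed round value (~3.62 × 10⁸ km²), not a figure from the study:

    ```python
    # Convert GlaMBIE's annual glacier mass loss to equivalent global sea level rise.
    mass_loss_kg = 273e12      # 273 gigatons per year (1 Gt = 1e12 kg)
    water_density = 1000.0     # kg per cubic meter of meltwater
    ocean_area_m2 = 3.62e14    # assumed global ocean area, ~3.62e8 km^2

    rise_m_per_year = mass_loss_kg / water_density / ocean_area_m2
    print(f"{rise_m_per_year * 1000:.2f} mm/yr")  # ~0.75 mm/yr, matching the study's figure
    ```

    Spreading the meltwater volume over the ocean surface reproduces the study’s 0.75 millimeter per year.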

    GlaMBIE provided the first comprehensive assessment of glacier mass change measurements from heterogeneous in situ and spaceborne observations and a new observational baseline for global glacier change and impact assessments [Berthier et al., 2023]. It also revealed opportunities and challenges ahead for monitoring glaciers from space.

    Strategies for Glacier Monitoring from Space

    GlaMBIE used a variety of technologies and approaches for studying glaciers from space. Many rely on repeated mapping of surface elevations to create digital elevation models (DEMs) and determine glacier elevation changes. This method provides multiannual views of glacier volume changes but requires assumptions about the density of snow, firn, and ice to convert volume changes to mass changes, which adds uncertainty because conversion factors can vary substantially.
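
    The DEM-differencing approach described above can be sketched in a few lines. The toy grid, pixel size, and the 850 kg/m³ density conversion factor are illustrative assumptions, not values from the text:

    ```python
    import numpy as np

    # Toy digital elevation models (DEMs) of one glacier on a common grid,
    # in meters above sea level, acquired several years apart.
    dem_early = np.full((3, 3), 3000.0)
    dem_late = dem_early - 2.0              # uniform 2 m of surface lowering

    pixel_area_m2 = 100.0 * 100.0           # 100 m horizontal resolution per pixel
    ice_density_kg_m3 = 850.0               # assumed volume-to-mass conversion factor

    dh = dem_late - dem_early               # elevation change per pixel (m)
    volume_change_m3 = dh.sum() * pixel_area_m2
    mass_change_kg = volume_change_m3 * ice_density_kg_m3
    print(mass_change_kg)                   # negative value indicates mass loss
    ```

    The key uncertainty the text mentions enters in the last step: the chosen density factor converts a measured volume change into an inferred mass change, and that factor varies with how much of the change occurred in snow, firn, or ice.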

    Optical stereophotogrammetry applied to spaceborne imagery allows assessment of glacier elevation changes across scales. Analysis of imagery from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) on board Terra from 2000 to 2019 yielded elevation changes at 100-meter horizontal resolution for almost all of Earth’s glaciers [Hugonnet et al., 2021]. At the regional scale, finer spatial resolution is possible with the ongoing French Satellite pour l’Observation de la Terre (SPOT) mission series and with satellites like GeoEye, Pléiades, and WorldView.

    The Glacier Mass Balance Intercomparison Exercise (GlaMBIE) used data from a fleet of satellites that monitor glaciers worldwide using optical, radar, laser, and gravity measurements. Clockwise from left in this image are illustrations of Terra, CryoSat, ICESat-2, and the twin GRACE spacecraft above a map of elevation change for the Vatnajökull ice cap in Iceland. Credit: ESA/NASA/Planetary Visions

    Two spaceborne missions applying synthetic aperture radar (SAR) interferometry have been used to assess glacier elevation changes. In February 2000, the Shuttle Radar Topography Mission (SRTM) produced a near-global DEM at a spatial resolution of 30 meters. The second mission, TerraSAR-X add-on for Digital Elevation Measurement (TanDEM-X), has operated since 2010, providing worldwide DEMs at a pixel spacing of 12 meters.

    Data from both missions can be used to assess glacier elevation changes in many regions [Braun et al., 2019], capitalizing on the high spatial resolution and the ability of radar signals to penetrate clouds. However, measurements using C- and X-band radar—as both SRTM and TanDEM-X have—are subject to uncertainties because of topographic complexity in high mountain terrain and because the radar signals can penetrate into snow and firn.

    Laser and radar altimetry allow us to determine glacier elevation changes along ground tracks or swaths, which can be aggregated to produce regional estimates. Laser altimetry has been carried out by NASA’s Ice, Cloud and Land Elevation Satellites (ICESat and ICESat-2) and the Global Ecosystem Dynamics Investigation (GEDI) on board the International Space Station (ISS) [Menounos et al., 2024; Treichler et al., 2019].

    Spaceborne radar altimetry has a long tradition of measuring ocean and land surfaces, but these missions’ large detection footprints and the challenges of mountainous terrain hampered the use of early missions (e.g., ERS, Envisat) for glacier applications. The European Space Agency’s (ESA) CryoSat-2, which launched in 2010, offered improved coverage of the polar regions, denser ground coverage, a sharper footprint, and other enhanced capabilities that opened its use for monitoring global glacier elevation changes [Jakob and Gourmelen, 2023].

    Both laser altimetry and radar altimetry provide elevation change time series at monthly or quarterly resolution for regions with large ice caps and ice fields (e.g., Alaska, the Canadian Arctic, Svalbard, and the periphery of the Greenland and Antarctic Ice Sheets). However, assessing mountain regions with smaller glaciers (e.g., Scandinavia, central Europe, Caucasus, and New Zealand) remains challenging because of the complex terrain and limited spatial coverage.

    Spaceborne gravimetry offers an alternative to elevation-focused methods, allowing scientists to estimate mass changes across the ocean and in water and ice reservoirs by measuring changes in Earth’s gravitational field. Two missions, the NASA/German Aerospace Center Gravity Recovery and Climate Experiment (GRACE) and the NASA/GFZ Helmholtz Centre for Geosciences GRACE Follow-on mission (GRACE-FO), have provided such measurements almost continuously since 2002.

    Gravimetry offers more direct estimates of glacier mass change than other methods [Wouters et al., 2019]. However, the data must be corrected to account for effects of atmospheric drag and oceanic variability, glacial isostatic adjustment, nonglacier hydrological features, and other factors [Berthier et al., 2023].

    Estimates from the GRACE and GRACE-FO missions are most relevant for regions with large areas of glacier coverage (>15,000 square kilometers) because of the relatively coarse resolution (a few hundred kilometers) of the gravity data and because of issues such as poor signal-to-noise ratios in regions with small glacier areas and related small mass changes.

    Securing Glacier Monitoring over the Long Term

    The work of GlaMBIE to combine observations from the diverse approaches above points to several interconnected requirements and challenges for improving the comprehensiveness of global glacier change assessments and securing the future of long-term glacier monitoring.

    First, we must extend the existing network of in situ glaciological observations to fill major data gaps. Such gaps remain in many regions, including Central Asia, Karakoram, Kunlun, and the Central Andes, where glaciers are vital for freshwater availability, as well as in the polar regions, where glaciers are key contributors to sea level rise. These networks also need to be updated to provide real-time monitoring, which will improve understanding of glacier processes and help calibrate and validate remote sensing data and numerical modeling.

    A sequence of aerial photographs taken in 1980 from about 11,000-meter altitude of Grey Glacier in the Southern Patagonian Ice Field (top) was used to generate a 3D model of the glacier (bottom). Credit: 3D reconstruction by Livia Piermattei and Camilo Rada using images from the Servicio Aerofotogramétrico de la Fuerza Aérea de Chile (SAF)

    Second, we must continue unlocking historical archives of airborne and spaceborne missions to expand the spatiotemporal coverage of the observations used in glacier mass change assessments. Data from declassified spy satellites, such as CORONA and Hexagon, have provided stereo observing capabilities at horizontal resolutions of a few meters and offer potential to assess glacier elevation changes back to the 1960s and 1970s. Aerial photography has provided unique opportunities to reconstruct glacier changes since the early 20th century, such as in Svalbard, Greenland, and Antarctica. Beyond individual and institutional studies, we need open access to entire national archives of historical images to safeguard records of past glacier changes on a global scale.

    Third, we must ensure the continuation of space-based glacier monitoring with open-access and high-resolution sensors, following the examples of the Sentinel missions, SPOT 5, and Pléiades. For optical sensors, there is an urgent need for new missions collecting open-access, high-resolution stereo imagery. The French space agency’s forthcoming CO3D (Constellation Optique en 3D) mission, scheduled for launch in 2025, will help meet this need if its data are openly available. But with the anticipated decommissioning of ASTER [Berthier et al., 2024] and the suspended production of new ArcticDEM and REMA (Reference Elevation Model of Antarctica) products from WorldView satellite data as a result of recent U.S. funding cuts, additional replacement missions are needed to observe elevation changes over individual glaciers.

    For radar imaging and altimetry, the SAR mission TanDEM-X and the radar altimeter CryoSat-2 are still operating with expected mission extensions into the late 2020s, and ESA’s SAR-equipped Harmony mission is expected to launch in 2029. With the planned launch of the Copernicus Polar Ice and Snow Topography Altimeter (CRISTAL) in 2027, ESA aims to establish a long-term, cryosphere-specific monitoring program.

    The challenge with CRISTAL will be to ensure that its sensor and mission specifications are tailored for application over glaciers—a difficult task because of glaciers’ relatively small sizes, steep slopes, and distribution in mountain terrain. In the event of a gap between the end of CryoSat-2 and the start of CRISTAL, a bridging airborne campaign, similar to NASA’s Operation IceBridge between ICESat and ICESat-2, will be needed.

    Grosser Aletschgletscher in Switzerland, seen in October 2015. Credit: Jürg Alean, SwissEduc, Glaciers online

    For laser altimetry, we may face gaps in observations as well, as no follow-on is planned for the current ICESat-2 and GEDI science missions. ICESat-2, which measures Earth’s glacier topography and provides validation and coregistration points for other altimetry missions, is projected to run until the early to mid-2030s, but continuing missions must be initiated now. Future missions should combine high accuracy with full coverage over individual glaciers. Proposed concepts with swath lidar, such as EDGE (Earth Dynamics Geodetic Explorer) and CASALS (Concurrent Artificially-Intelligent Spectrometry and Adaptive Lidar System), could be game changers for repeated mapping because they would provide full coverage of glacier topography. Advancing Surface Topography and Vegetation (STV) as a targeted observable for NASA missions, as recommended by the National Academies’ 2017–2027 Decadal Survey, could extend such observations beyond the current science missions.

    For gravimetry, we also face a potential gap in observations, depending on when GRACE-FO is decommissioned and when approved follow-up missions—GRACE-C and NGGM (Next Generation Gravity Mission)—launch. Regardless of launch dates, the usefulness of future missions for monitoring glacier mass changes across regions will strongly depend on the spatial resolution of their data and on the ability to separate glacier and nonglacier signals. Co-funded gravity missions such as the Mass-Change and Geosciences International Constellation (MAGIC), a planned joint NASA-ESA project with four satellites operating in pairs, could significantly improve the spatial and temporal resolution of the gravity data and their utility for glacier monitoring.

    Bringing It All Together

    Glaciers worldwide are diminishing with global warming, and they’re doing so at alarming rates, affecting everything from geohazards to freshwater supplies to sea level rise. Understanding as well as possible the details of glacier change from place to place and how these changes may affect different communities requires combining careful observations from a variety of field, airborne, and—increasingly—spaceborne approaches.

    In light of major existing and impending observational gaps, the scientific community along with government bodies and others should work together to expand access to relevant historical data and extend present-day monitoring capabilities. Most important, space agencies and their sponsor nations must work rapidly to replace and improve upon current satellite missions to ensure long-term glacier monitoring from space. Given the climate crisis, we also call for open scientific access to data from commercial and defense missions to fill gaps and complement civil missions.

    As the work of GlaMBIE reiterated, the more complete the datasets we have, the better positioned we will be to comprehend and quantify glacier changes and related downstream impacts.

    Acknowledgments

    We thank Etienne Berthier, Dana Floricioiu, and Noel Gourmelen for their contributions to this article and all coauthors and data contributors of GlaMBIE for constructive and fruitful discussions during the project, which built the foundation to condense the information presented here. This article was enabled by support from ESA projects GlaMBIE (4000138018/22/I-DT) and The Circle (4000145640/24/NL/SC), with additional contributions from the International Association of Cryospheric Sciences (IACS).

    References

    Berthier, E., et al. (2023), Measuring glacier mass changes from space—A review, Rep. Prog. Phys., 86(3), 036801, https://doi.org/10.1088/1361-6633/acaf8e.

    Berthier, E., et al. (2024), Earth-surface monitoring is at risk—More imaging tools are urgently needed, Nature, 630(8017), 563, https://doi.org/10.1038/d41586-024-02052-x.

    Braun, M. H., et al. (2019), Constraining glacier elevation and mass changes in South America, Nat. Clim. Change, 9(2), 130–136, https://doi.org/10.1038/s41558-018-0375-7.

    Hugonnet, R., et al. (2021), Accelerated global glacier mass loss in the early twenty-first century, Nature, 592(7856), 726–731, https://doi.org/10.1038/s41586-021-03436-z.

    Jakob, L., and N. Gourmelen (2023), Glacier mass loss between 2010 and 2020 dominated by atmospheric forcing, Geophys. Res. Lett., 50(8), e2023GL102954, https://doi.org/10.1029/2023GL102954.

    Menounos, B., et al. (2024), Brief communication: Recent estimates of glacier mass loss for western North America from laser altimetry, Cryosphere, 18(2), 889–894, https://doi.org/10.5194/tc-18-889-2024.

    Otosaka, I. N., et al. (2023), Mass balance of the Greenland and Antarctic Ice Sheets from 1992 to 2020, Earth Syst. Sci. Data, 15(4), 1,597–1,616, https://doi.org/10.5194/essd-15-1597-2023.

    The GlaMBIE Team (2025), Community estimate of global glacier mass changes from 2000 to 2023, Nature, 639, 382–388, https://doi.org/10.1038/s41586-024-08545-z.

    Treichler, D., et al. (2019), Recent glacier and lake changes in high mountain Asia and their relation to precipitation changes, Cryosphere, 13(11), 2,977–3,005, https://doi.org/10.5194/tc-13-2977-2019.

    World Glacier Monitoring Service (2023), Global Glacier Change Bulletin No. 5 (2020–2021), edited by M. Zemp et al., 134 pp., Zurich, Switzerland, https://wgms.ch/downloads/WGMS_GGCB_05.pdf.

    Wouters, B., A. S. Gardner, and G. Moholdt (2019), Global glacier mass loss during the GRACE satellite mission (2002–2016), Front. Earth Sci., 7, 96, https://doi.org/10.3389/feart.2019.00096.

    Zemp, M., et al. (2013), Reanalysing glacier mass balance measurement series, Cryosphere, 7(4), 1,227–1,245, https://doi.org/10.5194/tc-7-1227-2013.

    Author Information

    Michael Zemp ([email protected]), University of Zurich, Switzerland; Livia Jakob, Earthwave Ltd., Edinburgh, U.K.; Fanny Brun, Université Grenoble Alpes, Grenoble, France; Tyler Sutterley, University of Washington, Seattle; and Brian Menounos, University of Northern British Columbia, Prince George, Canada

    Citation: Zemp, M., L. Jakob, F. Brun, T. Sutterley, and B. Menounos (2025), Glacier monitoring from space is crucial, and at risk, Eos, 106, https://doi.org/10.1029/2025EO250290. Published on [DAY MONTH] 2025.
    Text © 2025. The authors. CC BY-NC-ND 3.0
    Except where otherwise noted, images are subject to copyright. Any reuse without express permission from the copyright owner is prohibited.


  • Revolutionizing therapeutic protein design with synthetic biology


    In medicine and biotechnology, the ability to evolve proteins with new or improved functions is crucial, but current methods are often slow and laborious. Now, Scripps Research scientists have developed a synthetic biology platform that accelerates evolution itself, enabling researchers to evolve proteins with useful, new properties thousands of times faster than nature. The system, named T7-ORACLE, was described in Science on August 7, 2025, and represents a breakthrough in how researchers can engineer therapeutic proteins for cancer, neurodegeneration, and essentially any other disease area.

    “This is like giving evolution a fast-forward button. You can now evolve proteins continuously and precisely inside cells without damaging the cell’s genome or requiring labor-intensive steps.”


    Pete Schultz, co-senior author, the President and CEO of Scripps Research, where he also holds the L.S. “Sam” Skaggs Presidential Chair

    Directed evolution is a laboratory process that involves introducing mutations and selecting variants with improved function over multiple cycles. It’s used to tailor proteins with desired properties, such as highly selective, high-affinity antibodies or enzymes with new specificities or catalytic properties, or to investigate the emergence of resistance mutations in drug targets. However, traditional methods often require repeated rounds of DNA manipulation and testing, with each round taking a week or more. Systems for continuous evolution, in which proteins evolve inside living cells without manual intervention, aim to streamline this process by enabling simultaneous mutation and selection with each round of cell division (roughly 20 minutes for bacteria). But existing approaches have been limited by technical complexity or modest mutation rates.
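    The speed advantage described above can be sketched with back-of-the-envelope arithmetic; the 20-minute doubling time is the figure given in the text, and treating every cell division as one full round of mutation and selection is a simplification:

```python
# Rough comparison of evolution "rounds" per week: one manual cycle of
# mutagenesis and selection versus one round per bacterial cell division.
MINUTES_PER_WEEK = 7 * 24 * 60
DIVISION_TIME_MIN = 20  # typical E. coli doubling time, per the article

traditional_rounds = 1  # roughly one manual round per week
continuous_rounds = MINUTES_PER_WEEK // DIVISION_TIME_MIN

print(continuous_rounds)  # 504 rounds of division (and thus selection) per week
```

    On this crude accounting, a week of continuous evolution packs in about five hundred times as many mutation-selection cycles as a single manual round.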

    T7-ORACLE circumvents these bottlenecks by engineering E. coli bacteria, a standard model organism in molecular biology, to host a second, artificial DNA replication system derived from bacteriophage T7, a virus that infects bacteria that has been widely studied for its simple, efficient replication system. T7-ORACLE enables continuous hypermutation and accelerated evolution of biomacromolecules and is designed to be broadly applicable to many protein targets and biological challenges. Conceptually, T7-ORACLE builds on and extends existing orthogonal replication systems (meaning they operate separately from the cell’s own machinery) such as OrthoRep in Saccharomyces cerevisiae (baker’s yeast) and EcORep in E. coli. In comparison to these systems, T7-ORACLE benefits from the combination of high mutagenesis, fast growth, high transformation efficiency, and the ease with which both the E. coli host and the circular replicon plasmid can be integrated into standard molecular biology workflows.

    The T7-ORACLE orthogonal system targets only plasmid DNA (small, circular pieces of genetic material), leaving the host cell’s genome untouched. By engineering T7 DNA polymerase (a viral enzyme that replicates DNA) to be error-prone, the researchers introduced mutations into target genes at a rate 100,000 times higher than normal without damaging the host cells.

    “This system represents a major advance in continuous evolution,” says co-senior author Christian Diercks, an assistant professor of chemistry at Scripps Research. “Instead of one round of evolution per week, you get a round each time the cell divides, so it really accelerates the process.”

    To demonstrate the power of T7-ORACLE, the research team inserted a common antibiotic resistance gene, TEM-1 β-lactamase, into the system and exposed the E. coli cells to escalating doses of various antibiotics. In less than a week, the system evolved versions of the enzyme that could resist antibiotic levels up to 5,000 times higher than the original. This proof-of-concept demonstrated not only T7-ORACLE’s speed and precision, but also its real-world relevance by replicating how resistance develops in response to antibiotics.

    “The surprising part was how closely the mutations we saw matched real-world resistance mutations found in clinical settings,” notes Diercks. “In some cases, we saw new combinations that worked even better than those you would see in a clinic.”

    But Diercks emphasizes that the study isn’t focused on antibiotic resistance per se.

    “This isn’t a paper about TEM-1 β-lactamase,” he explains. “That gene was just a well-characterized benchmark to show how the system works. What matters is that we can now evolve virtually any protein, like cancer drug targets and therapeutic enzymes, in days instead of months.”

    The broader potential of T7-ORACLE lies in its adaptability as a platform for protein engineering. Although the system is built into E. coli, the bacterium serves primarily as a vessel for continuous evolution. Scientists can insert genes from humans, viruses or other sources into plasmids, which are then introduced into E. coli cells. T7-ORACLE mutates these genes, generating variant proteins that can be screened or selected for improved function. Because E. coli is easy to grow and widely used in labs, it provides a convenient, scalable system for evolving virtually any protein of interest.

    This could help scientists more rapidly evolve antibodies to target specific cancers, evolve more effective therapeutic enzymes, and design proteases that target proteins involved in cancer and neurodegenerative disease.

    “What’s exciting is that it’s not limited to one disease or one kind of protein,” says Diercks. “Because the system is customizable, you can drop in any gene and evolve it toward whatever function you need.”

    Moreover, T7-ORACLE works with standard E. coli cultures and widely used lab workflows, avoiding the complex protocols required by other continuous evolution systems.

    “The main thing that sets this apart is how easy it is to implement,” adds Diercks. “There’s no specialized equipment or expertise required. If you already work with E. coli, you can probably use this system with minimal adjustments.”

    T7-ORACLE reflects Schultz’s broader goal: to rebuild key biological processes, such as DNA replication, RNA transcription, and protein translation, so they function independently of the host cell. This separation allows scientists to reprogram these processes without disrupting normal cellular activity. By decoupling fundamental processes from the genome, tools like T7-ORACLE help advance synthetic biology.

    “In the future, we’re interested in using this system to evolve polymerases that can replicate entirely unnatural nucleic acids: synthetic molecules that resemble DNA and RNA but with novel chemical properties,” says Diercks. “That would open up possibilities in synthetic genomics that we’re just beginning to explore.”

    Currently, the research team is focused on evolving human-derived enzymes for therapeutic use, and on tailoring proteases to recognize specific cancer-related protein sequences.

    “The T7-ORACLE approach merges the best of both worlds,” says Schultz. “We can now combine rational protein design with continuous evolution to discover functional molecules more efficiently than ever.”

    Source:

    Scripps Research Institute

    Journal reference:

    Diercks, C. S., et al. (2025) An orthogonal T7 replisome for continuous hypermutation and accelerated evolution in E. coli. Science. doi.org/10.1126/science.adp9583.


  • James Webb discovers universe’s oldest black hole


    Scientists have found the earliest known black hole in the universe, one that existed just 500 million years after the Big Bang. The discovery was revealed in a new paper published in The Astrophysical Journal Letters. The ancient black hole was spotted by the James Webb Space Telescope in the distant galaxy CAPERS-LRD-z9, shrouded by gas. Its light has traveled for around 13.3 billion years, from a time when the universe was just three per cent of its current age. CAPERS-LRD-z9 is one of the “little red dot” galaxies that Webb captured in its first year of observations. They looked very different from the galaxies Hubble was used to seeing. Steven Finkelstein, co-author of the new study and director of the Cosmic Frontier Center at the University of Texas at Austin, said in a statement, “The discovery of Little Red Dots was a major surprise from early JWST data, as they looked nothing like galaxies seen with the Hubble Space Telescope.”

    Galaxies born right after Big Bang

    These galaxies appeared as scarlet pinpricks in Webb’s vision and had never been detected by any telescope before it. Their discovery changed scientists’ beliefs about the universe. NASA stated that if these shining objects were galaxies, it would imply that some of them had grown really big, really fast, in ways that did not fit current theories. Finkelstein and his team compiled data on these little red dot galaxies and published their findings in January, covering all such galaxies identified in the first 1.5 billion years after the Big Bang. They wrote that these galaxies likely had supermassive black holes that were still growing. Finkelstein then collaborated with a team led by Anthony Taylor, a postdoctoral researcher at the Cosmic Frontier Center.

    They checked the spectroscopy data from Webb’s CAPERS (CANDELS-Area Prism Epoch of Reionization Survey) program. They noticed a distinct spectroscopic signature that is produced when black holes interact with gas clouds. As gas rapidly swirls around and falls into a black hole, light from the gas moving away from us stretches into redder wavelengths, while light from the gas moving toward us compresses into bluer wavelengths. Taylor says, “There aren’t many other things that create this signature.” This one turned out to be the earliest such example.
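    The red-and-blue-wing signature described above can be illustrated with the classical Doppler formula. The hydrogen-alpha line and the 2,000 km/s gas speed below are illustrative assumptions, not values from the study:

```python
C_KM_S = 299_792.458   # speed of light in km/s
H_ALPHA_NM = 656.28    # rest wavelength of the hydrogen-alpha line, nm

def doppler_shift(rest_nm, v_km_s):
    """Classical Doppler shift: positive velocity = receding gas (redshift),
    negative velocity = approaching gas (blueshift)."""
    return rest_nm * (1.0 + v_km_s / C_KM_S)

v = 2000.0  # assumed speed of the swirling gas, km/s
red_wing = doppler_shift(H_ALPHA_NM, +v)   # stretched to longer wavelengths
blue_wing = doppler_shift(H_ALPHA_NM, -v)  # compressed to shorter wavelengths
print(round(red_wing, 2), round(blue_wing, 2))
```

    Gas on opposite sides of the black hole thus smears a single emission line into a characteristically broadened profile, which is hard to produce any other way.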


    Black holes and brightness from galaxies

    Their findings further cement the belief that supermassive black holes are the likely source of the brightness emitted by these galaxies. “We’ve seen these clouds in other galaxies,” Taylor said. “When we compared this object to those other sources, it was a dead ringer.” The newly found black hole at the centre of this galaxy is up to 300 million times more massive than the Sun. “Finding a black hole of this size that existed so early on in the universe adds to growing evidence that early black holes grew much faster than we thought possible,” Finkelstein said.


  • “It’s astonishing”: animal in India sings when sun is precisely 3.8° below horizon every dawn, say scientists


    Cicadas in southern India have been found to start their dawn chorus with remarkable accuracy, timing their song to a precise level of light during the pre-dawn hours.

    Researchers discovered that the insects begin singing when the sun is exactly 3.8 degrees below the horizon – a moment that falls during civil twilight.

    Recording cicada songs

    The study, published in the journal Physical Review E, involved several weeks of field recordings from two sites near Bengaluru (formerly Bangalore), the capital of India’s southern Karnataka state.

    Site one was a shrubland area with scattered grasses and site two was a bamboo forest. The researchers focused their study on choruses produced by the species Platypleura capitata.

    Using specialised tools, the team were able to reveal just how closely cicada singing is linked to light changes.

    “We’ve long known that animals respond to sunrise and seasonal light changes,” says co-author professor Raymond Goldstein from Cambridge’s Department of Applied Mathematics and Theoretical Physics. “But this is the first time we’ve been able to quantify how precisely cicadas tune in to a very specific light intensity – and it’s astonishing.”

    The researchers found that the insects’ loud chorus takes just 60 seconds to reach full volume each morning. The midpoint of that build-up happens at almost the same solar angle every day, regardless of when sunrise occurs. On the ground, the light level at that moment varies by only about 25% from day to day, underscoring the insects’ precision.


    To understand more about this phenomenon, the team created a mathematical model inspired by magnetic materials, where tiny units align with both an external field and each other. In the cicadas’ case, individuals appear to base their decision to sing on both the light level and the sound of nearby insects.
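    A minimal mean-field sketch, loosely in the spirit of the magnetic analogy described above, might look like the following; the coupling strengths, thresholds, and sigmoid response here are invented for illustration and are not taken from the paper:

```python
import math

def singing_fraction_update(light, f, a=8.0, J=4.0, light_threshold=1.0):
    """One mean-field update: an individual's probability of singing rises
    with the ambient light (the 'external field') and with the fraction of
    neighbours already singing (the coupling J), like spins aligning."""
    drive = a * (light - light_threshold) + J * (f - 0.5)
    return 1.0 / (1.0 + math.exp(-drive))

def chorus_onset(light_levels, relax_steps=50):
    """Relax the singing fraction at each light level as dawn brightens."""
    f = 0.0
    history = []
    for light in light_levels:
        for _ in range(relax_steps):
            f = singing_fraction_update(light, f)
        history.append(f)
    return history

# Ramp the light from well below to well above the threshold, as at dawn:
levels = [0.2, 0.6, 1.0, 1.4, 1.8]
fractions = chorus_onset(levels)
# The fraction singing jumps sharply near the threshold, mimicking the
# abrupt, well-timed onset of the chorus.
```

    The key design point is that neither term alone suffices: the light term sets *when* the transition happens, while the neighbour coupling makes the onset collective and sharp rather than gradual.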

    “This kind of collective decision-making shows how local interactions between individuals can produce surprisingly coordinated group behaviour,” says co-author professor Nir Gov from the Weizmann Institute.

    The recordings were made by Bengaluru-based engineer Rakesh Khanna, who studies cicadas as a personal passion. Khanna worked with Goldstein and Dr Adriana Pesci at Cambridge to turn his observations into a formal analysis.

    “Rakesh’s observations have paved the way to a quantitative understanding of this fascinating type of collective behaviour,” says Goldstein. “There’s still much to learn, but this study offers key insights into how groups make decisions based on shared environmental cues.”



  • Is Mining Asteroids That Impacted The Moon Easier Than Mining Asteroids Themselves?


    The resources tucked away in asteroids promise to provide the building blocks of humanity’s expansion into space. However, accessing those resources can prove tricky. There’s the engineering challenge of landing a spacecraft on one of the low-gravity targets and essentially dismantling it while still remaining attached to it. But there’s also a challenge in finding ones that make economic sense to do that to, both in terms of the amount of material they contain as well as the ease of getting to them from Earth. A much easier solution might be right under our noses, according to a new paper from Jayanth Chennamangalam and his co-authors – mine the remnants of asteroids that hit the Moon.

    Asteroids hitting the Moon is a relatively common occurrence, given the number of craters visible on its surface. According to the paper, larger craters (i.e., those 1 km or more in diameter) can hold a significant amount of the material left over from the asteroid that caused them. Simulations back this up, showing that if the asteroid is moving slowly enough, at around 12 km/s, a significant fraction of it survives the impact and is scattered around the crater. In some cases it’s dispersed amongst the breccia on the crater surface, but in others it’s concentrated in the middle as a solid chunk of valuable material.

    To estimate the number of craters that might have valuable resources hiding in them, the authors modified an equation for estimating the number of ore-bearing asteroids originally developed by one of their own, Martin Elvis of Harvard. In the original equation, Dr. Elvis defined five terms that, when multiplied together, gave the total number of near-Earth asteroids that could be economically mined for resources.


    First is the likelihood that the asteroid is of a type that actually contains the valuable material. For platinum group metals (PGMs), that would be an M-type asteroid – typically considered to be about 4% of the total asteroid population, whereas for water it could be a C-type, which is slightly more common at 10%. A second factor is what percentage of those asteroids are rich enough to hold a significant amount of material – which the paper estimated at 50% for the M-types and 31% for the C-types.

    In Dr. Elvis’ original formula, the next factor is the probability that the asteroid is accessible in space, given the delta-v required to reach it. However, the new paper modifies this factor, since every crater on the Moon is accessible with about the same amount of delta-v. The new factor represents the probability that the asteroid survives impact with the Moon, which is calculated at about 25% for the M-types and a lower 8.3% for C-types, given that the water that holds C-types together may be lost to impact heating.

    Another factor, near and dear to any engineer’s heart, is the likelihood that the engineering challenges of recovering the material are achievable. Every engineer will be glad to hear that, in both papers, this factor is treated as 100% – the astrophysicists and planetary scientists obviously have faith that engineers can overcome the difficulty of mining both on asteroids and on the Moon.


    The last factor again differs between the original and the new paper, but serves a similar purpose. In the original, it estimated the total number of asteroids large enough to be profitable to mine; in the new paper, it represents the number of craters large enough to hold a significant amount of recoverable material. For the water-bearing C-types, only smaller craters were considered, as these asteroids are known to create smaller craters.

    With these estimates of factors, the new paper calculates that there are orders of magnitude more ore-bearing craters on the Moon than there are recoverable near-Earth asteroids with significant amounts of mineable material. So, it sounds like it might be better to focus on lunar mining than on asteroid mining.
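    Since the estimate is a straight product of factors, the per-impactor odds can be sketched by multiplying the percentages quoted above. Note this covers only the per-impactor probabilities; the paper's full estimate also folds in the actual counts of craters and near-Earth asteroids:

```python
def ore_bearing_fraction(p_type, p_rich, p_survive, p_engineering=1.0):
    """Multiply Elvis-style factors to get the fraction of impactors
    expected to leave an economically interesting deposit."""
    return p_type * p_rich * p_survive * p_engineering

# Factor values quoted in the article:
m_type = ore_bearing_fraction(0.04, 0.50, 0.25)    # platinum-group metals
c_type = ore_bearing_fraction(0.10, 0.31, 0.083)   # water-bearing

print(m_type)  # ~0.005, about 1 in 200 impactors
print(c_type)  # ~0.0026, about 1 in 390 impactors
```

    Even at these small per-impactor odds, the Moon's enormous inventory of craters is what drives the paper's "orders of magnitude more" conclusion.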

    However, there are some caveats. Any engineer will tell you that economically recovering material from 100% of craters is unrealistic, so the assumed 100% engineering success rate won’t hold in practice. Factors like impact dispersal and the speed at which the asteroid hits the lunar surface also strongly affect how economically those materials can be recovered. And there’s the consideration of lunar gravity: while it makes the engineering of recovery easier, it is a hindrance when trying to get the mined material back into space.

    Ultimately, the paper’s suggestion is to put a remote-sensing satellite with a high-resolution camera in lunar orbit and check whether some of the assumptions about the availability of material in craters are true. If they are, and if there’s enough economic demand for that material, mining the Moon rather than asteroids begins to look like an even more exciting proposition.

    Learn More:

    J. Chennamangalam et al. – On ore-bearing asteroid remnants in lunar craters

    UT – Could You Find What A Lunar Crater Is Made Of By Shooting It?

    UT – NASA to Probe the Secrets of the Lunar Regolith

    UT – One Crater on the Moon is Filled with Ice and Gas that Came from a Comet Impact
