Category: 7. Science

  • Simpler models can outperform deep learning at climate prediction | MIT News

    Environmental scientists are increasingly using enormous artificial intelligence models to make predictions about changes in weather and climate, but a new study by MIT researchers shows that bigger models are not always better.

    The team demonstrates that, in certain climate scenarios, much simpler, physics-based models can generate more accurate predictions than state-of-the-art deep-learning models.

    Their analysis also reveals that a benchmarking technique commonly used to evaluate machine-learning techniques for climate predictions can be distorted by natural variations in the data, like fluctuations in weather patterns. This could lead someone to believe a deep-learning model makes more accurate predictions when that is not the case.

    The researchers developed a more robust way of evaluating these techniques, which shows that, while simple models are more accurate when estimating regional surface temperatures, deep-learning approaches can be the best choice for estimating local rainfall.

    They used these results to enhance a simulation tool known as a climate emulator, which can rapidly simulate the effects of human activities on future climate.

    The researchers see their work as a “cautionary tale” about the risk of deploying large AI models for climate science. While deep-learning models have shown incredible success in domains such as natural language, climate science contains a proven set of physical laws and approximations, and the challenge becomes how to incorporate those into AI models.

    “We are trying to develop models that are going to be useful and relevant for the kinds of things that decision-makers need going forward when making climate policy choices. While it might be attractive to use the latest, big-picture machine-learning model on a climate problem, what this study shows is that stepping back and really thinking about the problem fundamentals is important and useful,” says study senior author Noelle Selin, a professor in the MIT Institute for Data, Systems, and Society (IDSS) and the Department of Earth, Atmospheric and Planetary Sciences (EAPS), and director of the Center for Sustainability Science and Strategy.

    Selin’s co-authors are lead author Björn Lütjens, a former EAPS postdoc who is now a research scientist at IBM Research; senior author Raffaele Ferrari, the Cecil and Ida Green Professor of Oceanography in EAPS and co-director of the Lorenz Center; and Duncan Watson-Parris, assistant professor at the University of California at San Diego. Selin and Ferrari are also co-principal investigators of the Bringing Computation to the Climate Challenge project, out of which this research emerged. The paper appears today in the Journal of Advances in Modeling Earth Systems.

    Comparing emulators

    Because the Earth’s climate is so complex, running a state-of-the-art climate model to predict how pollution levels will impact environmental factors like temperature can take weeks on the world’s most powerful supercomputers.

    Scientists often create climate emulators, simpler approximations of a state-of-the-art climate model, which are faster and more accessible. A policymaker could use a climate emulator to see how alternative assumptions on greenhouse gas emissions would affect future temperatures, helping them develop regulations.

    But an emulator isn’t very useful if it makes inaccurate predictions about the local impacts of climate change. While deep learning has become increasingly popular for emulation, few studies have explored whether these models perform better than tried-and-true approaches.

    The MIT researchers performed such a study. They compared a traditional technique called linear pattern scaling (LPS) with a deep-learning model using a common benchmark dataset for evaluating climate emulators.

    Their results showed that LPS outperformed deep-learning models on predicting nearly all parameters they tested, including temperature and precipitation.
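
    For readers unfamiliar with the baseline, the sketch below illustrates linear pattern scaling on synthetic data: each grid cell gets its own linear fit against global-mean temperature, and the fitted pattern is then scaled to a projected warming level. The variable names and numbers are illustrative assumptions, not the benchmark dataset or the authors' code.

    ```python
    # Minimal linear pattern scaling (LPS) sketch on synthetic data (illustrative only).
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy training data: global-mean temperature anomaly (K) per year, and the
    # corresponding local temperature anomaly at each grid cell.
    n_years, n_cells = 100, 500
    global_mean_T = np.linspace(0.0, 3.0, n_years) + rng.normal(0, 0.1, n_years)
    true_pattern = rng.uniform(0.5, 2.0, n_cells)              # hypothetical local scaling factors
    local_T = np.outer(global_mean_T, true_pattern) + rng.normal(0, 0.3, (n_years, n_cells))

    # Fit: one least-squares slope and intercept per grid cell.
    X = np.column_stack([global_mean_T, np.ones(n_years)])     # shape (n_years, 2)
    coeffs, *_ = np.linalg.lstsq(X, local_T, rcond=None)       # shape (2, n_cells)

    # Emulate: scale the fitted pattern to a projected global-mean warming level.
    projected_warming = 2.5                                    # K, hypothetical scenario value
    predicted_local_T = coeffs[0] * projected_warming + coeffs[1]
    print(predicted_local_T[:5])
    ```

    In the benchmark, the deep-learning emulator plays the role of a far more flexible replacement for these simple per-cell linear fits.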

    “Large AI methods are very appealing to scientists, but they rarely solve a completely new problem, so implementing an existing solution first is necessary to find out whether the complex machine-learning approach actually improves upon it,” says Lütjens.

    Some initial results seemed to fly in the face of the researchers’ domain knowledge. The powerful deep-learning model should have been more accurate when making predictions about precipitation, since those data don’t follow a linear pattern.

    They found that the high amount of natural variability in climate model runs can cause the deep learning model to perform poorly on unpredictable long-term oscillations, like El Niño/La Niña. This skews the benchmarking scores in favor of LPS, which averages out those oscillations.

    Constructing a new evaluation

    From there, the researchers constructed a new evaluation that uses more data to account for natural climate variability. With this new evaluation, the deep-learning model performed slightly better than LPS for local precipitation, but LPS was still more accurate for temperature predictions.
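
    The distortion described above can be made concrete with a toy example: scoring an emulator against a single climate-model realization penalizes it for unpredictable internal variability, while scoring against an ensemble average largely does not. This is only a hedged sketch of the general idea, not the paper's evaluation protocol.

    ```python
    # Why internal variability can distort emulator benchmarks (illustrative only).
    import numpy as np

    rng = np.random.default_rng(1)
    n_years, n_members = 100, 10

    forced_signal = np.linspace(0.0, 3.0, n_years)                       # the predictable, forced response
    ensemble = forced_signal + rng.normal(0, 0.5, (n_members, n_years))  # runs with internal variability

    emulator_prediction = forced_signal.copy()   # a "perfect" emulator of the forced response

    rmse_vs_one_run = np.sqrt(np.mean((emulator_prediction - ensemble[0]) ** 2))
    rmse_vs_ens_mean = np.sqrt(np.mean((emulator_prediction - ensemble.mean(axis=0)) ** 2))

    print(f"RMSE vs single run:    {rmse_vs_one_run:.3f}")   # inflated by internal variability (~0.5)
    print(f"RMSE vs ensemble mean: {rmse_vs_ens_mean:.3f}")  # smaller by roughly sqrt(n_members)
    ```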

    “It is important to use the modeling tool that is right for the problem, but in order to do that you also have to set up the problem the right way in the first place,” Selin says.

    Based on these results, the researchers incorporated LPS into a climate emulation platform to predict local temperature changes in different emission scenarios.

    “We are not advocating that LPS should always be the goal. It still has limitations. For instance, LPS doesn’t predict variability or extreme weather events,” Ferrari adds.

    Rather, they hope their results emphasize the need to develop better benchmarking techniques, which could provide a fuller picture of which climate emulation technique is best suited for a particular situation.

    “With an improved climate emulation benchmark, we could use more complex machine-learning methods to explore problems that are currently very hard to address, like the impacts of aerosols or estimations of extreme precipitation,” Lütjens says.

    Ultimately, more accurate benchmarking techniques will help ensure policymakers are making decisions based on the best available information.

    The researchers hope others build on their analysis, perhaps by studying additional improvements to climate emulation methods and benchmarks. Such research could explore impact-oriented metrics like drought indicators and wildfire risks, or new variables like regional wind speeds.

    This research is funded, in part, by Schmidt Sciences, LLC, and is part of the MIT Climate Grand Challenges team for “Bringing Computation to the Climate Challenge.”

  • Scientists Trace Jupiter’s Origins Using Ancient Rock Raindrop

    Chondrules are tiny, round droplets of once-molten rock found in many meteorites. They formed when silicate liquids cooled and hardened in space, but how exactly they came to be has puzzled scientists for years.

    About 4.5 billion years ago, Jupiter grew rapidly, becoming a gravitational giant. Its pull stirred up chaos among nearby space rocks and icy bodies, called planetesimals, causing them to crash into each other at incredible speeds. These violent impacts melted the debris, creating floating droplets of molten rock: chondrules.

    Now, researchers from Nagoya University and INAF have cracked the mystery. By studying the size and cooling patterns of chondrules, they discovered that water inside the crashing planetesimals played a key role in shaping them. This not only explains what we see in meteorites today, but it also helps pinpoint when Jupiter formed.

    Co-lead author Professor Sin-iti Sirono from Nagoya University’s Graduate School of Earth and Environmental Sciences explained, “When planetesimals collided with each other, water instantly vaporized into expanding steam. This acted like tiny explosions and broke apart the molten silicate rock into the tiny droplets we see in meteorites today.”

    “Previous formation theories couldn’t explain chondrule characteristics without requiring particular conditions, while this model requires conditions that naturally occurred in the early solar system when Jupiter was born.”

    By developing and using computer simulations of Jupiter’s growth, the researchers tracked how its gravity caused high-speed collisions between rocky and water-rich planetesimals in the early solar system.  

    “We compared the characteristics and abundance of simulated chondrules to meteorite data and found that the model spontaneously generated realistic chondrules. The model also shows that chondrule production coincides with Jupiter’s intense accumulation of nebular gas to reach its massive size. As meteorite data tell us that peak chondrule formation took place 1.8 million years after the solar system began, this is also the time at which Jupiter was born.”

    This study helps us better understand how our solar system came to be. When Jupiter formed, it sparked the creation of molten rock droplets called chondrules. But that event alone was too short to explain why meteorites contain chondrules from many different time periods.

    The likely reason? Other giant planets, like Saturn, also caused chondrule formation when they were born. Each planetary arrival added a new wave of molten droplets to the mix.

    By examining chondrules of different ages, scientists can map out the birth order of the planets and see how the solar system evolved. Even more exciting, this research hints that similar violent processes may happen around other stars, giving us clues about how other planetary systems might form.

    Journal Reference:

    1. Sirono, Si., Turrini, D. Chondrule formation by collisions of planetesimals containing volatiles triggered by Jupiter’s formation. Sci Rep 15, 30919 (2025). DOI: 10.1038/s41598-025-12643-x

  • NASA’s Earth-observing satellites are crucial — commercial missions cannot replace them

    Companies have made impressive progress in measuring Earth’s environmental changes from space. GHGSat, an emissions-monitoring company in Montreal, Canada, tracks methane leaks from landfill sites and oil rigs. Earth-imaging firm Planet in San Francisco, California, uses more than 200 satellites to record land and infrastructure for the energy, insurance and maritime sectors. Data-analytics company Spire in San Francisco converts radio signals from navigation satellites into estimates of ocean height and wind speed to support weather forecasts. European aerospace firm Airbus operates radar satellites that can be used to study volcanoes, wetlands and sea ice.

    Space agencies are taking note, and several, including the European Space Agency and NASA, are incorporating commercial data into their portfolios to make them available to researchers. Both agencies have defined processes for evaluating externally produced data, providing science-based assessments of the accuracy, geographical targeting and usability of the observations.

    As an academic researcher, I have been excited to participate in efforts to increase the adoption of data from commercial satellites to complement publicly provided information. For example, supported by NASA, I have begun to apply GHGSat data to estimate methane emissions from a landfill site in Brazil. I am also exploring how to use data from companies such as Spire to support hurricane risk-reduction efforts in Puerto Rico and Mexico.

    I have found that data gathered by commercial organizations are innovative and useful. But I also know that private companies alone cannot provide all the Earth-observation data that the world needs. Nor should they.

    As governments debate science budgets and consider the role of the public and private sectors in environmental monitoring, it can be tempting to look for ways to increase efficiency and move public-sector operations to the private sector. The progress of commercial satellite operators might seem to provide evidence that NASA will not need to operate as many satellites in the future as it does now. Indeed, US President Donald Trump’s budget request for the 2026 fiscal year proposes to cancel NASA funding for several government-operated Earth-observation missions. But this is the wrong lesson to learn from private-sector progress.

    Instead, governments and researchers should continue to pursue a balance between the contributions of the commercial and public sectors to environmental monitoring. Satellite-based Earth-observation missions operated by the public sector remain relevant, because they have several unique features.

    First, such missions are set up to answer scientific questions or to maintain public services, such as weather forecasts or flood-response systems. Although commercial Earth-observation companies can contribute, governmental entities should take the lead to ensure that publicly controlled and validated data, models and forecasts are produced — and trusted.

  • How did a planet this big form around a star this small?

    The host star, TOI-6894, is a red dwarf with only 20% of the Sun’s mass, typical of the most common stars in our galaxy. Until now, such low-mass stars were not thought capable of forming or retaining giant planets. But as published recently in Nature Astronomy, the unmistakable signature of a giant planet — TOI-6894b — has been detected in orbit around this tiny star.

    This exceptional system was first identified in data from NASA’s Transiting Exoplanet Survey Satellite (TESS), as part of a large search for giant planets around small stars, led by Dr. Edward Bryant from UCL’s Mullard Space Science Laboratory.

    The planetary nature of the signal was then confirmed by an extensive ground-based observation campaign, involving several telescopes — including those of the SPECULOOS and TRAPPIST projects, both led by the University of Liège.

    Dr. Khalid Barkaoui, researcher on the SPECULOOS and TRAPPIST teams, oversaw these crucial follow-up observations. He explained: “The transit signal was unambiguous in our data. Our analysis ruled out all alternative explanations — the only viable scenario was that this tiny star hosts a Saturn-sized planet with an orbital period of just over three days. Additional observations confirmed that its mass is about half that of Saturn. This is clearly a giant planet.”

    TOI-6894 is now the smallest star known to host a transiting giant planet, with a radius 40% smaller than that of any previous such host.
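
    To get a feel for why the transit signal was so unambiguous, the fractional dip in starlight during a transit is roughly (R_planet / R_star)^2. The stellar radius below is an assumed illustrative value for a small red dwarf, not the measured radius of TOI-6894.

    ```python
    # Back-of-the-envelope transit depth for a Saturn-sized planet crossing a small red dwarf.
    R_SUN_KM = 695_700.0
    R_SATURN_KM = 58_232.0

    r_star = 0.25 * R_SUN_KM       # hypothetical small red dwarf (assumed value)
    r_planet = R_SATURN_KM         # a Saturn-sized planet, as reported for TOI-6894b

    transit_depth = (r_planet / r_star) ** 2
    print(f"Transit depth ≈ {transit_depth:.1%} of the star's light")  # ~11%, a very deep transit
    ```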

    Prof. Jamila Chouquar, who was an astronomer at ULiège at the time of the discovery, added: “We previously believed that stars this small couldn’t form or hold on to giant planets. But stars like TOI-6894 are the most common type in the Milky Way — so our discovery suggests there may be far more giant planets out there than we thought.”

    A Challenge to Planet Formation Models

    According to current planet formation models, giant planets are rare around small stars. This is because their protoplanetary disks — the gas and dust reservoirs from which planets form — are thought to lack the material needed to build massive cores and accrete thick gas envelopes.

    Dr. Mathilde Timmermans, member of the SPECULOOS team and ULiège astronomer at the time of the discovery, noted: “The existence of TOI-6894b is hard to reconcile with existing models. None can fully explain how it formed. This shows that our understanding is incomplete, and underscores the need to find more such planets. That’s exactly the goal of MANGO, a SPECULOOS sub-program led by myself and Dr. Georgina Dransfield at the University of Birmingham.”

    Prof. Michaël Gillon, Research Director of the Fund for Scientific Research (FNRS) at ULiège and head of the SPECULOOS and TRAPPIST programs, concluded: “This giant planet orbiting a tiny star reveals that planetary diversity in the galaxy is even greater than we imagined. Most of the targets observed by SPECULOOS and TRAPPIST are similar stars, or even smaller — so we’re well positioned to uncover more cosmic outliers in the years ahead.”

  • El Capitan transforms complex physics into jaw-dropping detail

    El Capitan, the fastest supercomputer in the world, can now simulate extreme events like shock waves or fluid mixing at high speed in a way that looks much closer to reality than ever before. The supercomputer was built for scientists at Lawrence Livermore National Laboratory (LLNL) in the U.S.

    Conventional supercomputers typically render such events as blurred approximations. El Capitan, however, produces high-resolution simulations that resolve tiny features, helping researchers analyze the real physics.

    The researchers used the supercomputer to simulate what happens to a tin surface when powerful shock waves and high-speed impacts hit it.

    “The shocks were strong enough to melt the metal and throw a spray of hot liquified tin, known as ejecta, ahead of the surface,” said LLNL physicist Kyle Mackay.

    “The simulation was noteworthy for its high fidelity, employing advanced physics models for mechanisms like surface tension, detailed equations-of-state, and especially its sub-micron mesh resolution,” he added.

    When shock waves hit metal, it tends to melt and spray out tiny liquid droplets called ejecta. The simulations captured the effect of tiny scratches in the metal’s surface – details that other computer models cannot easily resolve.

    This advancement is significant because such precise detail is essential for advancing real-world applications in physics, national defense, and fusion energy research.

    Observing the Kelvin-Helmholtz instability

    The researchers used LLNL’s multiphysics code MARBL to study the physical phenomenon called the Kelvin-Helmholtz instability – a phenomenon that occurs when two fluids of different densities rub against each other, similar to wind blowing over water and creating waves.

    In extreme conditions like shockwaves or explosions, this effect becomes very turbulent and chaotic, making it hard to capture accurately in experiments.

    The researchers developed a model in which a shockwave struck a minute ripple at the boundary between two materials, triggering intense mixing and forming vortex-like patterns.

    These turbulent flows, resembling whirlpools, are notoriously complex and have long posed significant challenges for accurate modeling.
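
    For context, classical linearized theory gives a growth rate for a shear-layer ripple between two incompressible fluids; the sketch below evaluates that textbook formula with made-up numbers. It is only background for the phenomenon: the MARBL simulations on El Capitan solve the full nonlinear, multiphysics problem, not this idealized model.

    ```python
    # Classical linear Kelvin-Helmholtz growth rate (no gravity or surface tension):
    #     sigma = k * sqrt(rho1 * rho2) * |U1 - U2| / (rho1 + rho2)
    import numpy as np

    def kh_growth_rate(k, rho1, rho2, u1, u2):
        """Linear growth rate (1/s) of a shear-layer perturbation with wavenumber k (1/m)."""
        return k * np.sqrt(rho1 * rho2) * abs(u1 - u2) / (rho1 + rho2)

    # Illustrative (made-up) numbers: a 1 mm ripple between fluids with a 2:1
    # density contrast and a 1 km/s velocity jump.
    k = 2 * np.pi / 1e-3    # wavenumber of a 1 mm wavelength ripple
    sigma = kh_growth_rate(k, rho1=2000.0, rho2=1000.0, u1=1000.0, u2=0.0)
    print(f"e-folding time ≈ {1/sigma:.2e} s")  # shorter wavelengths grow even faster
    ```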

    Exploring the results

    El Capitan used 107 billion calculation points to track the physics. More than 8,000 AMD GPUs worked together to crunch the numbers.
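
    A quick division of the figures quoted above gives a sense of the per-device workload (illustrative arithmetic only; the actual domain decomposition is not described here).

    ```python
    # Illustrative arithmetic: spreading the quoted 107 billion calculation points
    # across the quoted 8,000+ GPUs.
    total_points = 107e9
    gpus = 8000
    print(f"~{total_points / gpus / 1e6:.0f} million points per GPU")  # ~13 million
    ```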

    The result was a time-lapse of fluid behavior under intense energy conditions, revealing intricate shear and shock patterns that mirror — and in some cases go beyond — what’s possible to observe in experiments.

    “Experiments are the ultimate arbiter of physical truth, but can be difficult to extract necessary data from,” said Rob Rieben, a team researcher.

    “High-fidelity simulations let us probe aspects of an experiment in a virtual manner that would not be possible to access in a real experiment. El Capitan is a powerful scientific instrument for exploring physics via simulation at fidelities never seen before,” he continued.

    Breaking the barriers

    El Capitan enables researchers to run high-resolution simulations that capture complex physical processes directly, reducing dependence on simplified models and assumptions.

    With 20 times more power than its predecessor, Sierra, El Capitan lets researchers run simulations far more often – about once an hour instead of once a day – and study details that are twenty times smaller.

    El Capitan’s expanded capabilities will allow researchers to conduct more precise studies, accelerate testing, and generate insights that could benefit fields such as physics, defense, and energy research in the years ahead.

  • Hidden messengers tell plants when to eat and breathe

    Plants appear still, yet inside they are alive with communication. They must continuously balance how to capture energy without losing too much water. For decades, scientists suspected there were internal messengers guiding this process, but the exact molecules remained unidentified.

    A new study led by Penn State researchers finally reveals the nature of these signals. This work not only solves a long-standing mystery but also suggests new directions for agriculture and plant resilience research.

    How plants balance food and water


    “This discovery significantly advances our understanding of how plants coordinate their internal metabolism – the chemical reactions they use to make energy – with their external environment, a fundamental process for plant growth and survival,” said Professor Sarah Assmann.

    “Our findings open doors for future research into improving plant resilience and crop yields.”

    Guard cells, located on the leaf surface, control stomata, the tiny pores that regulate the intake of carbon dioxide and the release of water vapor.

    These pores act like microscopic mouths. When they open, plants can “eat” by absorbing CO2, but they risk losing vital water at the same time.

    Mysterious plant messengers

    “There is always a tradeoff for terrestrial plants between maximizing CO2 intake, which is needed for photosynthesis, and letting out water vapor, which can dry out the plant and ultimately kill it if it loses too much water,” explained Professor Assmann.

    “The stomata are the pores where that tradeoff takes place. When they open, they let in CO2 that allows the plant to feed, but they also let out water vapor, which dehydrates the plant. We knew there had to be some kind of messenger telling the guard cells how to regulate that life-or-death decision.”

    The new findings show that sugars, including sucrose, glucose, and fructose, as well as malic acid, act as these critical messengers. These metabolites form a feedback loop, linking the plant’s energy production to its stomatal control.
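
    To make the feedback loop concrete, here is a deliberately simplified toy model in which photosynthesis exports sugar, sugar promotes stomatal opening, and the opening feeds back on photosynthesis. The equations and rate constants are invented for illustration and are not taken from the Nature Plants study.

    ```python
    # Toy feedback loop (illustrative only, not the study's model): photosynthesis
    # exports sugars into the apoplast, sugars promote stomatal opening, and open
    # stomata admit the CO2 that sustains photosynthesis (while also losing water).

    def simulate(hours=48.0, dt=0.01):
        aperture, sugar = 0.1, 0.0                                # relative opening, apoplastic sugar (a.u.)
        for _ in range(int(hours / dt)):
            photosynthesis = aperture / (aperture + 0.5)          # CO2 uptake saturates as pores open
            sugar += (0.5 * photosynthesis - 0.2 * sugar) * dt    # sugar export minus turnover
            aperture += (0.3 * sugar - 0.6 * aperture) * dt       # sugars drive opening, with relaxation
            aperture = min(max(aperture, 0.0), 1.0)
        return aperture, sugar

    # The loop settles at an intermediate opening (~0.75 here) rather than running
    # away, mimicking a self-limiting tradeoff between CO2 gain and water loss.
    print(simulate())
    ```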

    Tracking signals in plant leaves

    The researchers worked with the model plant Arabidopsis thaliana, or thale cress, and fava beans to uncover the energy production system in plants.

    By extracting apoplastic fluid from leaves exposed to either red light or darkness, they were able to isolate chemical compounds. Red light stimulates photosynthesis, making it easier to detect active signals.

    Through this process, the team identified 448 chemical compounds in the fluid. “We identified hundreds of metabolites in apoplastic fluid, which no one had analyzed to this extent before,” noted Professor Assmann.

    “That, on its own, is an important contribution to the field, independent of the research question that we specifically were addressing, because it gives a lot of leads on other potential signaling molecules for processes throughout the plant.”

    Sugars control plant water use

    Further experiments revealed sugars directly promoted stomatal opening under red light. When tested in intact leaves, these compounds increased carbon dioxide uptake and also altered water release, confirming their messenger role.

    Cell-level experiments explained the mechanism: sugars stimulate molecular machinery inside guard cells, activating them to open the stomata.

    This research provides the first full picture of the internal dialogue between photosynthesis and water regulation.

    Studying what makes plants resilient

    The Nature Plants study also emphasizes that stomatal control does not rely solely on hormones, as once thought. Instead, metabolic products themselves can serve as powerful signaling agents.

    This finding adds a new dimension to how scientists view plant-environment interactions. According to Professor Assmann, the team is focused on understanding how plants sense and respond to environmental conditions.

    “Plants can’t uproot themselves and find somewhere else to live; they have to deal with whatever the environment throws at them – increasingly drought and heat stress,” she noted.

    “So we study what makes plants resilient, from the very specific molecular level all the way up to whole plant physiology and field experiments, with the goal of improving crop productivity.”

    The project brought together researchers from Penn State, The Hebrew University of Jerusalem, Nagoya University, RIKEN Center for Sustainable Resource Science, and the University of Mississippi.

    The research was funded, in part, by the National Science Foundation.

    The study is published in the journal Nature Plants.

  • Satellite constellations fall short of meeting brightness goals

    WASHINGTON — Developers of large satellite constellations say it may be impossible to meet brightness goals established by astronomers to minimize the impact of those satellites on astronomical observations.

  • Teaching A.I. to Detect Life: Carnegie Scientist Co-Leads NASA-Funded Effort

    Washington, D.C. — A multi-institutional team co-led by Carnegie Science’s Michael L. Wong and Caleb Scharf of the NASA Ames Research Center has received a prestigious $5 million, five-year NASA Interdisciplinary Consortia for Astrobiology Research (ICAR) grant to develop A.I. tools for enhancing the search for signs of life on other planets.

    The cross-disciplinary project brings together experts in chemistry, geoscience, machine learning, and planetary science to address one of astrobiology’s biggest challenges—reliably distinguishing life from non-life in planetary data.

    At the heart of the project is a massive, curated dataset. Carnegie researchers—including Wong, Anirudh Prabhu, Robert Hazen, and George Cody—will lead the effort to generate highly detailed profiles of at least 1,000 samples, ranging from meteorites to fossils to living organisms. They will employ a suite of advanced techniques to analyze the molecular and chemical signatures across this broad sample set.

    “A.I. will help us identify patterns in these massive multidimensional datasets that no human, or team of humans, could sift through in one lifetime,” said Wong. “It’s a tool we can use to detect the subtle biosignatures we might otherwise miss in the noise. It may even help us illuminate the fundamental differences between life and non-life.”

    Partner institutions across the U.S.—including NASA Ames Research Center, Johns Hopkins University, Rutgers University, Caltech, Howard University, Purdue University, and NASA’s Goddard Space Flight Center—will provide additional instrumentation and laboratory expertise, transforming this effort into a national-scale, data-generation engine. Once the data collection is complete, the team will develop and train machine learning models on this expansive dataset to find patterns that consistently indicate life. 
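
    As a rough illustration of that final step, the sketch below trains a standard classifier on per-sample chemical feature vectors labeled biotic or abiotic. The data here are random placeholders; the consortium's actual features, labels, and model choices are not specified in this announcement.

    ```python
    # Hedged sketch: supervised life-detection as a binary classification problem.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(42)

    n_samples, n_features = 1000, 200             # e.g., ~1,000 curated samples, many measured signatures
    X = rng.normal(size=(n_samples, n_features))  # placeholder molecular/chemical measurements
    y = rng.integers(0, 2, size=n_samples)        # placeholder labels: 1 = biotic, 0 = abiotic

    model = RandomForestClassifier(n_estimators=300, random_state=0)
    scores = cross_val_score(model, X, y, cv=5)   # cross-validation guards against overfitting
    print(f"Cross-validated accuracy: {scores.mean():.2f} (chance level here, since the data are random)")
    ```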

    “Carnegie has a rich legacy of planetary science and cosmochemistry,” noted Carnegie Science Earth and Planets Laboratory Director Michael Walter. “Few places are better equipped to handle such a wide range of Earth and planetary samples.”

    This isn’t just about developing A.I. tools—it’s about putting that intelligence to work. Wong and his team will use their findings to recommend the most effective scientific instruments for future missions, ensuring we send the most promising tools to the most promising extraterrestrial locations in our search for life.

    “For NASA, this is incredibly valuable,” says Scharf, “exploring Mars, or an icy moon in the outer Solar System, is hugely challenging and we’re going to need to rely more and more on intelligent machines that carry an optimal collection of tools to seek out other life.”

    During data collection, the team aims to create an open-source sample library and data repository. This resource will enable future research by providing scientists with open access to these extremely rich datasets while building a shared foundation for life detection efforts across the planetary science community.

    “We’re at the edge of a new era in astrobiology,” Wong concluded. “We’ve never had more data or more computing power. Now is the moment to bring it all together and finally ask—and maybe answer—the biggest question of all: Are we alone?”

  • The Sun’s smallest loops ever seen in stunning new images

    The highest-resolution images of a solar flare captured at the H-alpha wavelength (656.28 nm) may reshape how we understand the Sun’s magnetic architecture — and improve space weather forecasting. Using the U.S. National Science Foundation (NSF) Daniel K. Inouye Solar Telescope, built and operated by the NSF National Solar Observatory (NSO), astronomers captured dark coronal loop strands with unprecedented clarity during the decay phase of an X1.3-class flare on August 8, 2024, at 20:12 UT. The loops averaged 48.2 km in width — perhaps as thin as 21 km — the smallest coronal loops ever imaged. This marks a potential breakthrough in resolving the fundamental scale of solar coronal loops and pushing the limits of flare modeling into an entirely new realm.

    Coronal loops are arches of plasma that follow the Sun’s magnetic field lines, and they often precede solar flares: sudden releases of energy triggered when some of those field lines twist and snap. These bursts of energy fuel solar storms that can impact Earth’s critical infrastructure. Astronomers at the Inouye observe sunlight at the H-alpha wavelength (656.28 nm) to view specific features of the Sun, revealing details not visible in other types of solar observations.

    “This is the first time the Inouye Solar Telescope has ever observed an X-class flare,” says Cole Tamburri, the study’s lead author, who is supported by the Inouye Solar Telescope Ambassador Program while completing his Ph.D. at the University of Colorado Boulder (CU). The program is funded by the NSF and is designed to support Ph.D. students as they create a well-networked cohort of early-career scientists at U.S. universities, who will bring their expertise in Inouye data reduction and analysis to the broader solar community. “These flares are among the most energetic events our star produces, and we were fortunate to catch this one under perfect observing conditions.”

    The team — which includes scientists from the NSO, the Laboratory for Atmospheric and Space Physics (LASP), the Cooperative Institute for Research in Environmental Sciences (CIRES), and CU — focused on the razor-thin magnetic field loops (hundreds of them) woven above the flare ribbons. On average, the loops measured about 48 km across, but some were right at the telescope’s resolution limit. “Before Inouye, we could only imagine what this scale looked like,” Tamburri explains. “Now we can see it directly. These are the smallest coronal loops ever imaged on the Sun.”

    The Inouye’s Visible Broadband Imager (VBI) instrument, tuned to the H-alpha filter, can resolve features down to ~24 km. That is over two and a half times sharper than the next-best solar telescope, and it is that leap in resolution that made this discovery possible. “Knowing a telescope can theoretically do something is one thing,” notes Maria Kazachenko, a co-author of the study and an NSO scientist. “Actually watching it perform at that limit is exhilarating.”
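
    As a sanity check on the quoted ~24 km figure, a diffraction-limited resolution of roughly lambda / D for a 4-meter aperture at H-alpha, projected to the Earth-Sun distance, lands in the same ballpark. This is only an order-of-magnitude sketch; the instrument's real performance depends on its design and adaptive optics.

    ```python
    # Order-of-magnitude check: diffraction-limited resolution projected onto the Sun.
    WAVELENGTH_M = 656.28e-9      # H-alpha
    APERTURE_M = 4.0              # Inouye primary mirror diameter (approximate)
    EARTH_SUN_M = 1.496e11        # 1 au

    angular_res_rad = WAVELENGTH_M / APERTURE_M
    resolution_on_sun_km = angular_res_rad * EARTH_SUN_M / 1e3
    print(f"~{resolution_on_sun_km:.0f} km per resolution element on the Sun")  # ≈ 25 km
    ```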

    While the original research plan involved studying chromospheric spectral line dynamics with the Inouye’s Visible Spectropolarimeter (ViSP) instrument, the VBI data revealed unexpected treasures — ultra-fine coronal structures that can directly inform flare models built with complex radiative-hydrodynamic codes. “We went in looking for one thing and stumbled across something even more intriguing,” Kazachenko admits.

    Theories have long suggested coronal loops could be anywhere from 10 to 100 km in width, but confirming this range observationally has been impossible — until now. “We’re finally peering into the spatial scales we’ve been speculating about for years,” says Tamburri. “This opens the door to studying not just their size, but their shapes, their evolution, and even the scales where magnetic reconnection — the engine behind flares — occurs.”

    Perhaps most tantalizing is the idea that these loops might be elementary structures — the fundamental building blocks of flare architecture. “If that’s the case, we’re not just resolving bundles of loops; we’re resolving individual loops for the first time,” Tamburri adds. “It’s like going from seeing a forest to suddenly seeing every single tree.”

    The imagery itself is breathtaking: dark, threadlike loops arching in a glowing arcade, bright flare ribbons etched in almost impossibly sharp relief — a compact triangular one near the center, and a sweeping arc-shaped one across the top. Even a casual viewer, Tamburri suggests, would immediately recognize the complexity. “It’s a landmark moment in solar science,” he concludes. “We’re finally seeing the Sun at the scales it works on.” It is an achievement made possible only by the NSF Daniel K. Inouye Solar Telescope’s unprecedented capabilities.

    The paper describing this study, titled “Unveiling Unprecedented Fine Structure in Coronal Flare Loops with the DKIST,” is now available in The Astrophysical Journal Letters.
