Category: 7. Science

  • Rare breed of exploding star discovered by citizen scientists in cataclysmic find

    Astronomers have teamed up with citizen scientists to discover a brand-new exploding star that’s greedily feeding on a stellar companion.

    The newly observed binary system features a cataclysmic variable star, designated GOTO0650, which is in a rarely seen late stage of its evolution. This was also the first major discovery for the citizen astronomy project Kilonova Seekers.

  • New Google AI Will Work Out What 98% of Our DNA Actually Does for the Body

    Vast swathes of the human genome remain a mystery to science. A new AI from Google DeepMind is helping researchers understand how these stretches of DNA impact the activity of other genes.

    While the Human Genome Project produced a complete map of our DNA, we still know surprisingly little about what most of it does. Roughly 2 percent of the human genome encodes specific proteins, but the purpose of the other 98 percent is much less clear.

    Historically, scientists called this part of the genome “junk DNA.” But there’s growing recognition these so-called “non-coding” regions play a critical role in regulating the expression of genes elsewhere in the genome.

    Teasing out these interactions is a complicated business. But now a new Google DeepMind model called AlphaGenome can take long stretches of DNA and make predictions about how different genetic variants will affect gene expression, as well as a host of other important properties.

    “We have, for the first time, created a single model that unifies many different challenges that come with understanding the genome,” Pushmeet Kohli, a vice president for research at DeepMind, told MIT Technology Review.

    The so-called “sequence to function” model uses the same transformer architecture as the large language models behind popular AI chatbots. The model was trained on public databases of experimental results testing how different sequences impact gene regulation. Researchers can enter a DNA sequence of up to one million letters, and the model will then make predictions about a wide range of molecular properties impacting the sequence’s regulatory activity.

    These include things like where genes start and end, which sections of the DNA are accessible or blocked by certain proteins, and how much RNA is being produced. RNA is the messenger molecule responsible for carrying the instructions contained in DNA to the cell’s protein factories, or ribosomes, as well as regulating gene expression.

    AlphaGenome can also assess the impact of mutations in specific genes by comparing variants, and it can make predictions about RNA “splicing”—a process where RNA molecules are chopped up and packaged before being sent off to a ribosome. Errors in this process are responsible for rare genetic diseases, such as spinal muscular atrophy and some forms of cystic fibrosis.
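
    The article describes AlphaGenome's interface only at a high level: DNA sequences of up to one million letters go in, predicted regulatory properties ("tracks") come out, and variant effects are scored by comparing predictions for reference and mutated sequences. Purely as an illustration of that comparison pattern, here is a hypothetical sketch; `predict_tracks` and all surrounding names are invented for this example and are not the published AlphaGenome API.

    ```python
    # Hypothetical sketch of the variant-comparison workflow described
    # above. `predict_tracks` and these types are invented for
    # illustration; they are not the real AlphaGenome interface.

    from dataclasses import dataclass

    @dataclass
    class Variant:
        position: int   # 0-based offset within the input sequence
        ref: str        # reference base(s), e.g. "A"
        alt: str        # alternate base(s), e.g. "G"

    def apply_variant(sequence: str, v: Variant) -> str:
        """Return the input sequence with the variant substituted in."""
        assert sequence[v.position:v.position + len(v.ref)] == v.ref
        return sequence[:v.position] + v.alt + sequence[v.position + len(v.ref):]

    def score_variant(model, sequence: str, v: Variant) -> dict:
        """Score a variant as the change in each predicted track
        (e.g. RNA output, accessibility) between reference and variant."""
        ref_tracks = model.predict_tracks(sequence)                    # hypothetical call
        alt_tracks = model.predict_tracks(apply_variant(sequence, v))
        return {name: alt_tracks[name] - ref_tracks[name] for name in ref_tracks}
    ```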

    Predicting the impact of different genetic variants could be particularly useful. In a blog post, the DeepMind researchers report they used the model to predict how mutations other scientists had discovered in leukemia patients probably activated a nearby gene known to play a role in cancer.

    “This system pushes us closer to a good first guess about what any variant will be doing when we observe it in a human,” Caleb Lareau, a computational biologist at Memorial Sloan Kettering Cancer Center who was granted early access to AlphaGenome, told MIT Technology Review.

    The model will be free for noncommercial purposes, and DeepMind has committed to releasing full details of how it was built in the future. But it still has limitations. The company says the model can’t make predictions about the genomes of individuals, and its predictions don’t fully explain how genetic variations lead to complex traits or diseases. Further, it can’t accurately predict how non-coding DNA impacts genes that are located more than 100,000 letters away in the genome.

    Anshul Kundaje, a computational genomicist at Stanford University in Palo Alto, California, who had early access to AlphaGenome, told Nature that the new model is an exciting development and significantly better than previous models, but not a slam dunk. “This model has not yet ‘solved’ gene regulation to the same extent as AlphaFold has, for example, protein 3D-structure prediction,” he says.

    Nonetheless, the model is an important breakthrough in the effort to demystify the genome’s “dark matter.” It could transform our understanding of disease and supercharge synthetic biologists’ efforts to re-engineer DNA for our own purposes.

  • Wellcome backs ‘moonshot’ project to recreate human genome in the lab that could unlock new medical treatments

    A team of researchers is beginning work on creating new tools that could eventually lead to the synthesis of the human genome in the lab. Wellcome is providing £10 million to the Synthetic Human Genome Project, which it expects will unlock new medical treatments.

    Making the whole genome of three billion base pairs of nucleotides is the ‘moonshot’, says Tom Ellis, one of the project leads, who researches synthetic chromosomes at Imperial College London.

    The scientists will first try to create a small chromosome, comprising about 2% of total human DNA. Along the way, they’ll also develop tools to design DNA and deliver it into human cells, which could enable targeted treatments and better tools for screening drugs.

    ‘If we’re making huge progress in understanding health from reading and then editing [DNA], then logically, it makes sense that we’ll learn a lot more if we can do writing as well,’ says Ellis. Improving and standardising technologies so they can be routinely used to write whole genes or regions of multiple genes should help researchers understand how mutations in those genes lead to disease.

    Two of the groups involved in the new project, at Imperial and the University of Manchester, were involved in synthesising the yeast genome; another group synthesised the Escherichia coli genome, which consists of 4 million base pairs of nucleotides. In theory, says Ellis, scaling up to 50 million base pairs could be done with 10 times as many people working in parallel, were it not for the practicalities.

    Compared with a yeast or bacterial genome, human DNA is ‘more full of junk, and that junk is a lot harder to work with because it contains a lot of the same sequence repeated many, many times’. A great number of those sequences are there for structural reasons rather than encoding information. ‘Those bits of DNA are much harder to work with in terms of synthesising them and linking them together,’ explains Ellis.

    And unlike fast-growing microbes that will accept DNA, ‘human cells are much harder to get big pieces of DNA into and it can take you weeks before you know whether it’s worked or not’, he points out.

    The project will rely on the commercial sector to synthesise sections of DNA. At present, says Ellis, biotech companies are chemically synthesising DNA up to about 300 bases at a time. Those sections are then linked together, getting to 10,000 to 20,000 bases by cloning the DNA using bacteria. ‘Where there’s room for innovation is if chemistry can do it all with very good accuracy – up to 20,000 bases or longer – then this huge effort of parallelised building can be dramatically reduced.’ The synthesis project will then focus on the means to assemble those long DNA sections.
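
    For a sense of the scale Ellis describes, here is a back-of-envelope sketch using only the figures quoted above (300-base chemical synthesis, cloned assemblies of up to 20,000 bases, and a roughly 50-million-base chromosome):

    ```python
    # Back-of-envelope scale of the synthesis pipeline described above,
    # using only figures quoted in the article.

    chromosome_bp = 50_000_000     # a small human chromosome, per Ellis
    chem_fragment_bp = 300         # routine chemical synthesis length
    cloned_assembly_bp = 20_000    # upper end of bacteria-cloned sections

    print(f"~{chromosome_bp // chem_fragment_bp:,} chemically synthesised fragments")        # ≈167,000
    print(f"assembled into ~{chromosome_bp // cloned_assembly_bp:,} cloned 20 kb sections")  # 2,500

    # If chemistry could reach ~20,000 bases directly and accurately, the
    # entire first stage of parallelised building would disappear, which
    # is the innovation Ellis points to.
    ```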

    Screening for accuracy and isolating accurately synthesised DNA gets costlier the longer the sections are. And the cost of chemicals to custom-make synthetic DNA could swallow up half the project budget. ‘We don’t want to spend it on the DNA, we want to spend it on people innovating. So we really need to push the chemistry community to longer DNA, cheaper DNA,’ adds Ellis.

  • Big saving on the Celestron NexStar 8SE this early Prime Day telescope deal — cheapest since January

    Save $200 this Amazon Prime Day on the Celestron NexStar 8SE. This telescope appears in several of our guides, ranking as the best overall telescope for seeing the planets as well as the best overall telescope for deep space and the best motorized telescope. Now you can get it at the cheapest price we’ve seen since January, reduced from $1,699 to $1,499 on Amazon.

    Get the Celestron NexStar 8SE on sale right now at Amazon for $1,499.

    The Celestron NexStar 8SE received four and a half stars out of five in our review. We loved how accessible it is for everyone from beginners to advanced skywatchers, as well as how portable it is. Thanks to its catadioptric construction, it is also one of the most compact telescopes for deep-space viewing.

  • New Centaur AI model aims to mimic human decision-making

    Researchers said they have developed an artificial intelligence system that can predict and simulate people’s decisions across a wide variety of situations.

    Dubbed Centaur, the model was trained on 160 psychology studies involving 60,000 participants making more than 10 million choices while completing different tasks, like memory games, gambling, and problem solving.

    Researchers found that Centaur was able to capture human behavior across several language-based scenarios, including ones it hadn’t been trained on.

    Some experts not involved with the project argued that Centaur doesn’t meaningfully mimic human cognition.

    Still, Centaur’s creators ultimately hope the model could be used to run experiments faster than conventional cognitive science studies.

  • 156-foot-long solid rocket motor produces 4 million pounds of thrust

    A Virginia-based propulsion-focused company conducted a full-scale static fire of NASA’s Booster Obsolescence and Life Extension (BOLE) solid rocket booster.

    The 156-foot-long five-segment solid rocket motor produced upwards of 4 million pounds of thrust during the test conducted by Northrop Grumman.

    This was the first demonstration test of the world’s largest and most powerful segmented solid rocket motor built for human spaceflight. 

    Booster features updated propellant formulation

    The company revealed that the booster features a composite case design, updated propellant formulation, and advanced components to increase booster performance by more than 10 percent compared with the current five-segment Space Launch System (SLS) booster design.

    The booster, claimed to be more efficient than its predecessor, provides an additional five metric tons of payload capacity to lunar orbit, a capability critical to supporting deep space missions.

    Produces more than 4 million pounds of thrust from a single booster

    More than 700 data channels assessed the motor as it fired for just over two minutes, producing more than 4 million pounds of thrust from a single booster.
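
    For scale, the quoted figures convert as follows (a sketch that treats thrust as constant across the roughly two-minute burn, so the total impulse is an upper bound):

    ```python
    # Rough scale of the static fire, from figures quoted in the article.
    # Thrust is treated as constant, so total impulse is an upper bound.

    LBF_TO_N = 4.448222            # newtons per pound-force

    thrust_lbf = 4_000_000         # "more than 4 million pounds of thrust"
    burn_time_s = 120              # "just over two minutes"

    thrust_n = thrust_lbf * LBF_TO_N
    impulse = thrust_n * burn_time_s

    print(f"thrust: {thrust_n / 1e6:.1f} MN")          # ~17.8 MN
    print(f"total impulse: {impulse / 1e9:.1f} GN·s")  # ~2.1 GN·s, upper bound
    ```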

    Leveraging Northrop Grumman’s industry-leading experience in solid rocket motor manufacturing, BOLE improves on previous designs by replacing key components that are no longer in production.

    “Today’s test pushed the boundaries of large solid rocket motor design to meet rigorous performance requirements,” said Jim Kalberer, vice president, propulsion systems, Northrop Grumman. 

    “While the motor appeared to perform well through the most harsh environments of the test, we observed an anomaly near the end of the two-plus minute burn. As a new design, and the largest segmented solid rocket booster ever built, this test provides us with valuable data to iterate our design for future developments.”

    Carbon fiber composite case enables better booster performance

    The company revealed that the carbon fiber composite case enables better booster performance and faster manufacturing, and aligns with commercial standards by providing commonality among the company’s infrastructure, supply chain, and manufacturing operations. Other aspects of the BOLE design, including metallic components, allow the company to support a U.S.-based supply chain of American manufacturers.

    The BOLE booster development, awarded in 2017, represents a significant step towards more sustainable commercial practices and incorporates commonality in design and construction standards from across all of Northrop Grumman’s production programs, according to a press release.

    The company supplied rocket propulsion for NASA’s Apollo and Space Shuttle programs and developed the five-segment SLS solid rocket booster based on the flight-proven design of the space shuttle boosters. The five-segment booster, BOLE’s predecessor, generates 25 percent more power than its space shuttle predecessor and provided over 75 percent of the SLS rocket’s initial thrust during the Artemis I mission on November 16, 2022.

    The BOLE booster is designed to power the Space Launch System for Artemis missions returning to the moon, with capabilities that could eventually support Mars exploration. The Artemis program currently has shuttle-era boosters that will last through Artemis 8, but after that, the SLS launch system carrying the Orion capsule will need a new, more advanced motor.

  • Researchers reveal how coral dispersal strengthens reef populations

    SYDNEY, July 3 (Xinhua) — Researchers have discovered that the ability of coral larvae to disperse over long distances plays a critical role in strengthening Great Barrier Reef coral populations.

    Researchers from the University of Queensland (UQ), Australia, revealed that well-connected coral communities are better equipped to adapt to climate change and recover from environmental disturbances, offering new hope for the future of the Great Barrier Reef, the world’s largest coral reef system, according to a UQ statement released on Thursday.

    “Species that don’t disperse or breed as far are more likely to form isolated populations, reducing their capacity to recover from bleaching events or habitat degradation,” said UQ PhD candidate Zoe Meziere.

    Researchers examined the genetics of two coral species, Stylophora pistillata and Pocillopora verrucosa, across reefs from Far North Queensland to Flinders Reef, a small isolated reef near Brisbane, Queensland.

    The study found that S. pistillata larvae settle just 23 to 102 meters from their parent coral, while P. verrucosa larvae can disperse up to 52 kilometers, leading to greater genetic diversity and more resilient populations.

    The study, detailed in Science Advances, published by the American Association for the Advancement of Science, highlights that supporting natural coral connectivity is vital for effective conservation, as greater genetic exchange boosts reefs’ ability to recover and adapt.

  • When rainforests died, the planet caught fire: New clues from Earth’s greatest extinction

    The collapse of tropical forests during Earth’s most catastrophic extinction event was the primary cause of the prolonged global warming which followed, according to new research.

    The Permian-Triassic Mass Extinction, sometimes referred to as the “Great Dying,” happened around 252 million years ago, leading to the massive loss of marine species and significant declines in terrestrial plants and animals.

    The event has been attributed to intense global warming triggered by a period of volcanic activity in Siberia, known as the Siberian Traps, but scientists have been unable to pinpoint why super-greenhouse conditions persisted for around five million years afterwards.

    Now a team of international researchers led by the University of Leeds and the China University of Geosciences in Wuhan has gathered new data which supports the theory that the demise of tropical forests, and their slow recovery, limited carbon sequestration – a process where carbon dioxide is removed from the atmosphere and held in plants, soils or minerals.

    During extensive field studies, the team used a new type of analysis of fossil records as well as clues about past climate conditions found in certain rock formations to reconstruct maps of changes in plant productivity during the Permian-Triassic Mass Extinction.

    Their results, published on July 2 in Nature Communications, show that vegetation loss during the event led to greatly reduced levels of carbon sequestration, resulting in a prolonged period of high atmospheric CO2 levels.
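
    The underlying argument is a mass balance: if volcanic CO2 input continues while vegetation-driven sequestration has collapsed, atmospheric CO2 stays elevated long after the eruptions end. A minimal one-box illustration of that logic, with all rate constants invented for illustration rather than taken from the paper:

    ```python
    # One-box illustration of the paper's argument: weaken the vegetation
    # sink and CO2 stays high long after the volcanic pulse ends.
    # All numbers are invented for illustration.

    def simulate(sink_strength: float, steps: int = 500) -> list[float]:
        co2, history = 1.0, []                        # 1.0 = pre-event baseline
        for t in range(steps):
            volcanic_input = 0.05 if t < 50 else 0.0  # short eruptive pulse
            sequestration = sink_strength * max(co2 - 1.0, 0.0)
            co2 += volcanic_input - sequestration
            history.append(co2)
        return history

    healthy = simulate(sink_strength=0.05)     # intact tropical forests
    collapsed = simulate(sink_strength=0.005)  # collapsed forest biosphere

    print(f"long after the pulse, healthy sink:   {healthy[-1]:.2f}x baseline")    # ~1.00x
    print(f"long after the pulse, collapsed sink: {collapsed[-1]:.2f}x baseline")  # ~1.2x
    ```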

    The paper’s lead author, Dr Zhen Xu, from the School of Earth and Environment, University of Leeds, said: “The causes of such extreme warming during this event have been long discussed, as the level of warming is far beyond any other event.

    “Critically, this is the only high temperature event in Earth’s history in which the tropical forest biosphere collapses, which drove our initial hypothesis. Now, after years of fieldwork, analysis and simulations, we finally have the data which supports it.”

    The researchers believe their results reinforce the idea that thresholds, or ‘tipping points’, exist in Earth’s climate-carbon system which, when reached, mean that warming can be amplified.

    China is home to the most complete geological record of the Permian-Triassic Mass Extinction, and this work leverages an incredible archive of fossil data that has been gathered over decades by three generations of Chinese geologists.

    The lead author Dr Zhen Xu is the youngest of these and is continuing the work begun by Professor Hongfu Yin and Professor Jianxin Yu, who are also authors of the study. Since 2016, Zhen and her colleagues have travelled throughout China from subtropical forests to deserts, including visiting areas accessible only by boat or on horseback.

    Zhen came to the University of Leeds in 2020 to work with Professor Benjamin Mills on simulating the extinction event and assessing the climate impacts of the loss of tropical vegetation which is shown by the fossil record. Their results confirm that the change in carbon sequestration suggested by the fossils is consistent with the amount of warming that occurred afterwards.

    Professor Mills added: “There is a warning here about the importance of Earth’s present day tropical forests. If rapid warming causes them to collapse in a similar manner, then we should not expect our climate to cool to preindustrial levels even if we stop emitting CO2.

    “Indeed, warming could continue to accelerate in this case even if we reach zero human emissions. We will have fundamentally changed the carbon cycle in a way that can take geological timescales to recover, which has happened in Earth’s past.”

    Reflecting on the study’s broader mission, Professor Hongfu Yin and Professor Jianxin Yu of the China University of Geosciences, underscored the urgency of blending tradition with innovation: “Paleontology needs to embrace new techniques — from numerical modelling to interdisciplinary collaboration — to decode the past and safeguard the future,” explained Professor Yin.

    Professor Yu added: “Let’s make sure our work transcends academia: it is a responsibility to all life on Earth, today and beyond. Earth’s story is still being written, and we all have a role in shaping its next chapter.”

    This research is primarily funded by UK Research and Innovation (UKRI) and the National Natural Science Foundation of China (NSFC), with additional funding for collaborators provided by UKRI, ETH+, and the Australian Research Council. The work was conducted in collaboration with the following institutions:

    • School of Earth and Environment, University of Leeds, Leeds, LS2 9JT, UK
    • State Key Laboratory of Geomicrobiology and Environmental Changes, School of Earth Sciences, China University of Geosciences, Wuhan, 430074, P.R. China
    • School of Physics, Chemistry and Earth Science, University of Adelaide, Adelaide, SA 5005, Australia
    • Birmingham Institute of Forest Research, University of Birmingham, Edgbaston, Birmingham, B15 2TT, UK
    • Department of Biosystems Science and Engineering, ETH Zürich, Basel, 4056, Switzerland
    • Computational Evolution Group, Swiss Institute of Bioinformatics, Lausanne, 1015, Switzerland
    • State Key Laboratory of Geological Processes and Mineral Resources, China University of Geosciences, Wuhan, 430074, P.R. China
    • Department of Biology, Howard University, Washington DC, USA
    • Géosciences Environnement Toulouse, CNRS-Université de Toulouse III, Toulouse, France
    • CEREGE, Aix Marseille Université, CNRS, IRD, INRA, Coll France, Aix-en-Provence, France

  • Years-Old Groundwater Dominates Spring Mountain Streams

    As winter gives way to spring, seasonal snowpack in the American West begins to melt.

    Though some of that melt flows over and through shallow alpine soil, new research shows that much of it sinks into bedrock where it percolates for years before resurfacing. Fresh snowmelt makes up less than half of the water in the region’s gushing spring streams, according to the study.

    The new finding could improve water resources forecasts. Hydrologic models, which inform the forecasts, largely overlook groundwater contributions and assume the spring’s heavy flows come directly from seasonal snowmelt.

    The authors of the study, published in Communications Earth & Environment, used a radioactive isotope of hydrogen known as tritium to measure when the water in 42 western U.S. catchments fell as precipitation.

    They found that during late winter, when rain and snowmelt were scarce and streams were fed primarily by groundwater, the water fell as precipitation an average of 10.4 years ago. Even during spring, when the same streams were overflowing with fresh runoff, their chilly waters had an average age of 5.7 years, still indicating significant contributions from groundwater.
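
    The decay relationship behind these ages is standard, though the study's actual analysis must also account for how atmospheric tritium levels have varied over time. Below is the idealized core of the calculation, plus a two-endmember mixing check using the article's own numbers (taking fresh snowmelt as age ≈ 0, an assumption made here for illustration):

    ```python
    # Idealized tritium dating plus a mixing check on the quoted ages.
    # Real tritium hydrology corrects for historical variation in
    # atmospheric tritium; this sketch assumes pure radioactive decay.

    import math

    TRITIUM_HALF_LIFE_YR = 12.32

    def tritium_age(fraction_remaining: float) -> float:
        """Years since the water fell as precipitation, from the
        fraction of its initial tritium still present."""
        return TRITIUM_HALF_LIFE_YR / math.log(2) * math.log(1 / fraction_remaining)

    # Water retaining ~56% of its initial tritium dates to ~10.4 years,
    # the study's average for groundwater-fed late-winter streams.
    print(f"{tritium_age(0.557):.1f} years")

    # Two-endmember mixing: spring streams averaging 5.7 years, fed by
    # ~10.4-year-old groundwater and ~0-year-old snowmelt, imply a
    # groundwater fraction of 5.7 / 10.4 ≈ 55%, consistent with fresh
    # snowmelt making up less than half of spring streamflow.
    print(f"groundwater fraction: {5.7 / 10.4:.0%}")
    ```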

    A Subterranean Bucket

    Hydrologic models typically simulate mountains as impermeable masses covered with a thin sponge of alpine soil, said the study’s first author, Paul Brooks, a hydrologist at the University of Utah. The sponge can absorb some water, but anything extra will quickly drain away.

    However, over the past few decades, scientists have uncovered a steady stream of hints that mountains may store huge volumes of water outside their spongy outer layer. Many high-elevation creeks carry dissolved minerals similar to those found in groundwater, suggesting a subterranean origin. Scientists studying healthy alpine ecosystems in arid conditions have wondered whether plants were tapping into a hidden reservoir of water.

    Though snowmelt and rainfall immediately increase streamflow, the relationship is not intuitive. “What appears to be happening is that snowmelt is being recharged into groundwater and is mobilizing groundwater that has been stored over much longer [periods],” said James Kirchner, a hydrologist at Eidgenössische Technische Hochschule Zürich who was not involved in the research.

    In areas where the mountains were made of porous sandstone, waters monitored in the new study were much older. In one such stream, the average age of water in winter was 14 years.

    The authors were able to convincingly demonstrate the age of the flows because they used tritium, Kirchner said. Though scientists have previously used tritium to date water from individual streams and large bodies such as oceans and lakes, this study is the first to use tritium to date alpine groundwater and snowmelt across multiple catchments, Brooks said.

    On the basis of historic flows, annual precipitation, and the ages of the stream water, the mountains could store an order of magnitude more water than accounted for in current models, Brooks said. As opposed to the impermeable masses in traditional models, he explained, mountains are “more like a bucket with a sponge on top.”
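
    Brooks's "bucket with a sponge on top" maps naturally onto a two-store model: a thin, fast soil store draining over a large, slow bedrock store. A minimal sketch with invented parameters (the study itself does not publish such a model here):

    ```python
    # Illustrative two-store "bucket with a sponge on top". All
    # parameters are invented; the point is only that streamflow ends up
    # a mix of fresh, fast runoff and old, slowly released groundwater.

    soil, rock = 0.0, 100.0            # bedrock starts with years of stored water
    for day in range(60):              # one melt season, one unit of melt per day
        soil += 1.0
        fast = 0.5 * soil              # quick shallow runoff (the "sponge")
        recharge = 0.3 * soil          # melt sinking into the bedrock store
        soil -= fast + recharge
        baseflow = 0.01 * rock         # slow release of old, stored water
        rock += recharge - baseflow

    # Even at the height of the melt, over half of streamflow is old water:
    print(f"old-water share of peak flow: {baseflow / (fast + baseflow):.0%}")  # ~53%
    ```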

    This finding could change how scientists think about the alpine water cycle. “If precipitation takes, on average, years to exit as streamflow, that means that streamflow in any one year is a function of years of climate and weather,” Brooks said. That means forecasters should consider more than just the most recent snowpack when estimating spring flows and potential flooding.

    But further research is needed to unearth the role mountains play in water storage. The current study is limited because it covers only snowmelt-driven streams in the arid western United States, Kirchner said. Things might work differently in wetter places, he added.

    —Mark DeGraff (@markr4nger.bsky.social), Science Writer


  • Warming Gulf of Maine Buffers Ocean Acidification—For Now

    1. Warming Gulf of Maine Buffers Ocean Acidification—For Now (Eos)
    2. Rising ocean acidification prompts urgent calls for marine protection (Eco-Business)
    3. Trevor Hancock: As the cliff edge looms, governments hit the accelerator (Times Colonist)
    4. Bad news – the ocean is becoming increasingly acidic and is already threatening marine life and our food security (Unión Rayo)
    5. As ocean acidification ramps up, experts call for speedy ocean protection (Mongabay)
