Category: 7. Science

  • Fighting extinction, coral reefs show signs of adapting to warming seas

    Fighting extinction, coral reefs show signs of adapting to warming seas

    As coral reefs decline at unprecedented rates, new research has revealed that some coral species may be more resilient to warming temperatures than others. 

    By studying how six months of elevated ocean temperatures would affect a species of coral from the northern Red Sea called Stylophora pistillata, scientists found that although these organisms can certainly survive in conditions that mimic future warming trends, they don’t thrive.

    Stylophora pistillata tend to be tolerant of high ocean temperatures, but when continuously exposed to temperatures of 27.5 and 30 degrees Celsius (81.5 and 86 degrees Fahrenheit) — baseline warming expected in tropical oceans by 2050 and 2100 — scientists saw various changes in coral growth, metabolic rates, and even energy reserves. For instance, coral in 27.5 degrees Celsius waters survived, but were 30% smaller than their control group; those placed in 30 degrees Celsius waters wound up being 70% smaller. 

    “In theory, if corals in the wild at these temperatures are smaller, reefs might not be as diverse and may not be able to support as much marine life,” said Ann Marie Hulver, lead author of the study and a former graduate student and postdoctoral scholar in earth sciences at The Ohio State University. “This could have adverse effects on people that depend on the reef for tourism, fishing or food.”

Overall, the team’s results suggest that even the most thermally tolerant coral species may struggle to overcome the consequences of warming seas.

    The study was published today in the journal Science of the Total Environment.

    While current predictions for coral reefs are dire, there is some good news. During the first 11 weeks of the experiment, researchers saw that corals were only minimally affected by elevated baseline temperatures. Instead, it was the cumulative impact of chronic high temperatures that compromised coral growth and caused them to experience a higher metabolic demand. 

The coral later recovered after a month in 25 degrees Celsius waters, but had darker pigmentation than corals that were never heated. This discovery implies that despite facing ever longer periods of threat from high ocean temperatures in the summer months, resilient coral like S. pistillata can bounce back when waters cool in the winter, researchers say.

Still, with ocean temperatures expected to rise by 3 degrees Celsius by 2100, the researchers caution that it is difficult to predict how closely coral reefs will follow projected climate models.

This team’s research does paint a more detailed picture of how coral reefs may look and function in the next 50 years, said Andrea Grottoli, co-author of the study and a professor in earth sciences at Ohio State.

    “Survival is certainly the No. 1 important thing for coral, but when they’re physiologically compromised, they can’t do that forever,” said Grottoli. “So there’s a limit to how long these resilient corals can cope with an ever increasing warming ocean.”

    Gaining a more complex understanding of how warming waters can alter coral growth and feeding patterns may also better inform long-term conservation efforts, said Grottoli. 

    “Conservation efforts could focus on areas where resilient coral are present and create protected sanctuaries so that there are some ecosystems that grow as high-probability-success reefs for the future,” she said. 

    For now, all coral reefs are still in desperate need of protection, researchers note. To that end, Hulver imagines future work could be aimed at investigating the resilience of similar species of coral, including replicating this experiment to determine if sustained warming might cause trade-offs in other biological processes, such as reproduction. 

    “For coral, six months is still a very small snapshot of their lives,” said Hulver. “We’ll have to keep on studying them.”

Other Ohio State co-authors include Shannon Dixon and Agustí Muñoz-Garcia; they were joined by Éric Béraud and Christine Ferrier-Pagès of the Centre Scientifique de Monaco, and by Aurélie Moya, Rachel Alderdice and Christian R. Voolstra of the University of Konstanz. The study was supported by the National Science Foundation and the German Research Foundation.


    Continue Reading

  • Everything everywhere all at once: Decision-making signals engage entire brain

    Everything everywhere all at once: Decision-making signals engage entire brain

    Activity associated with choice showed up in cortical areas, in line with previous findings, but it also occurred in subcortical areas such as the hindbrain and cerebellum, challenging the notion that only a few select areas encode information about decision-making and supporting the idea that it is widespread.

    “It is interesting how much choice selectivity is everywhere,” says Long Ding, research associate professor of neuroscience at the University of Pennsylvania, who was not involved in the work.

Movement- and feedback-related signals were also widespread across the brain: 81 percent of recorded brain regions contained information that could predict the animal’s wheel speed, and activity from nearly all recorded brain regions—including those outside the areas typically associated with reward—accurately predicted whether the mouse had received a reward or not, with stronger activity in the thalamus, the midbrain and the hindbrain.
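The decoding methods themselves are described in the papers; as a rough, hypothetical sketch of the kind of analysis behind a statement like “81 percent of regions predict wheel speed,” one can cross-validate a regression from a region’s binned spike counts to the wheel-speed trace. The arrays below are synthetic stand-ins, not IBL data.

```python
# Hypothetical illustration of region-by-region decoding of wheel speed
# from neural activity; not the IBL's actual analysis pipeline.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Stand-in data: 1,000 time bins and 40 neurons from one recorded region,
# plus a wheel-speed trace that partly depends on that activity.
spike_counts = rng.poisson(2.0, size=(1000, 40)).astype(float)
wheel_speed = spike_counts @ rng.normal(size=40) * 0.1 + rng.normal(size=1000)

# Cross-validated R^2: how well this region's activity predicts wheel speed.
scores = cross_val_score(Ridge(alpha=1.0), spike_counts, wheel_speed,
                         cv=5, scoring="r2")
print(f"mean cross-validated R^2: {scores.mean():.2f}")

# Repeating this per region and counting the regions whose score beats a
# shuffle control is one way to arrive at a number like "81 percent of
# recorded regions carry wheel-speed information."
```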

    If the mice saw the circle more often on one side of the screen than the other, they eventually integrated that prior information into their next decision. This information was represented broadly across 20 to 30 percent of the brain, including in sensory processing areas, such as the dorsal lateral geniculate, that are located early in the visual pathway, the team reported in the second study.

    The findings contradict the long-standing idea that prior information is integrated into the process only in higher-order cortical or decision-making regions “at the very last step,” Churchland says. Instead, priors shape decisions all along, the new findings suggest.

Altogether, the studies suggest that the current model of decision-making and the brain regions that control it might be limited in scope and that other, unexplored brain areas might also be important, Churchland says.

    And although the analyses show that a distributed network of brain regions contains information about decision-making even at early stages of sensory processing, the results do not show causality, so future studies need to determine how that information is used, Ding says. “Yes, [the information is] reflected everywhere, but where is it actually used for the next decision, for learning?”

    The comprehensive map sets the stage for those next experiments and could even act as a “library” to help neuroscientists double-check results in their own labs, de Lange says, and ultimately, these studies underscore the importance of large-scale, multilab efforts, particularly for studying brain activity.

    The global consortium has since expanded to include 21 experimental and theoretical neuroscience labs and has established a new group called IBL 2.0 that plans to share the tools and expertise it has amassed with new partners, Churchland says. “I hope that our work makes clear that when larger groups of folks team up, they can accomplish things that are beyond the scale of a single laboratory and that really generate critical insights for the field.”

    Continue Reading

  • James Webb telescope discovers ‘exceptionally rare’ 5-galaxy crash in the early universe

    James Webb telescope discovers ‘exceptionally rare’ 5-galaxy crash in the early universe

    Astronomers have discovered an incredibly rare system in which at least five galaxies from the early universe are merging — just 800 million years after the Big Bang. The remarkable discovery was made using data from the James Webb Space Telescope (JWST) and the Hubble Space Telescope.

    Galaxy mergers play a key role in galaxy formation in the early universe. While not commonly seen, merging systems do occur, typically involving two galaxies. However, the newly identified merger, nicknamed JWST’s Quintet, contains at least five galaxies and 17 galaxy clumps.

    Continue Reading

  • Earth science’s future at NASA hangs in the balance

    Earth science’s future at NASA hangs in the balance

    • Acting NASA Administrator Sean Duffy’s statements indicated a potential shift in NASA’s focus away from Earth science and climate research, prioritizing human space exploration instead, aligning with proposed budget cuts significantly reducing Earth science funding.
    • This proposed shift contrasts with NASA’s historical mandate, enshrined in the 1958 National Aeronautics and Space Act, which includes the expansion of human knowledge of atmospheric and space phenomena, and with the agency’s established collaboration with other agencies like NOAA.
    • Critics argue that NASA’s Earth observation capabilities are crucial for national security, disaster preparedness, economic benefit, and informing planetary science research, and that a complete transfer of these functions to the private sector is neither feasible nor desirable.
    • The debate over NASA’s future direction involves not only budgetary considerations but also questions about the agency’s fundamental mission, the role of government in scientific research, and the potential consequences of prioritizing human space exploration at the expense of Earth science.

    During an Aug. 14 appearance on Fox Business, Acting NASA Administrator Sean Duffy declared that the agency’s mission is “to explore, not to do all of these earth sciences,” signaling a potential shift away from NASA’s decades-long role in Earth observation and climate research.

    Duffy later softened his stance during an Aug. 18 visit to Johnson Space Center, saying that NASA would still adhere to its congressional directives. But he still suggested that other agencies could take the lead on climate science. 

    “Listen, you can go other places for your climate change science. This is the only civil agency in government that does human space exploration. No one else does it, just us, and so that is, that’s the focus, and that’s what I meant by that,” Duffy said, according to a transcript of the comments provided to Astronomy by the NASA press office. 

This vision for NASA largely aligns with the White House’s 2026 budget proposal, which cuts NASA’s overall budget by 24 percent; while human exploration receives an increase, the agency’s science funding is slashed by nearly half, including a 53 percent cut to earth science.

    The comments from Duffy are some of the most explicit yet from NASA leadership about the Trump administration’s downsized view of NASA’s role — and many scientists have pushed back vigorously against it. They argue that earth science is essential to NASA’s mission, not a distraction from it, pointing to the agency’s legal charter and historical relationships, and contending that its unique capabilities cannot be simply offloaded to other government agencies or the private sector.

    And, they point out, an agency that seeks to lead the world in planetary science but ignores Earth would be missing out on studying the most unique planet yet known in the universe — our own.

    “Earth is a planet,” Camille Bergin, an aerospace engineer and science communicator, tells Astronomy. “And I think that’s what people forget.”

    Defining the mission

    At the heart of the matter is the interpretation of NASA’s core function. Duffy’s comment that NASA is “meant to explore” frames the agency’s purpose as looking outward. The subsequent suggestion that “other agencies” could handle earth science points to a vision of a more streamlined NASA, free to focus on the unique challenge of sending humans to the Moon, Mars, and beyond.

    However, critics argue that it overlooks the agency’s foundational charter. “It’s the National Aeronautics and Space Administration,” Bergin says. “So much of what NASA does is within our atmosphere.” This perspective is rooted in the 1958 National Aeronautics and Space Act, which explicitly lists as a primary objective “The expansion of human knowledge of phenomena in the atmosphere and space.” 

    Duffy may be unaware of this history, suggests Robert Kopp, a climate scientist at Rutgers University. “To my knowledge, Sean Duffy is the first person to serve as administrator or acting administrator of NASA without any relevant experience,” Kopp says. “He might be unaware that [this] has been part of NASA’s statutory mission since its establishment.”

    According to a historical overview from the agency’s science division, this dual mandate has historically created a division of labor between NASA and other agencies, particularly the National Oceanic and Atmospheric Administration (NOAA). The early model established in the 1960s saw NASA’s role as pioneering new technology — developing and launching novel satellites and instruments. In this partnership, NOAA and the U.S. Geological Survey would then analyze the data for their operational missions, such as daily weather forecasting.

    Proponents of this model argue that it is why a simple handover of the mission is not a straightforward solution. Bergin is direct in her assessment: “I don’t think that any other agency can do what NASA does.” She explains that NASA’s “holistic view” of the solar system, which includes Earth, allows the agency “to do this research in ways that I don’t think other agencies can.”

    The commercial question 

    If NASA’s role in earth science is reduced, some private sector companies are eager to fill the void. In an article for EMSNow, which covers the global electronics manufacturing services industry, European commercial space executives framed a potential NASA pullback as a significant opportunity. “The potential rollback of NASA’s Earth-observation programmes should not be seen as a loss, but as a turning point,” said Anthony Baker, CEO of SatVu.

    Thomas Grübler, co-founder of OroraTech, echoed this, saying, “Private Earth-observation firms already offer a broad range of data and intelligence to governments and taxpayers, often at a much lower cost and with greater flexibility.”

    However, the enthusiasm from the private sector is tempered by a more balanced perspective among scientists. In a recent article for Nature, Danielle Wood, director of the Space Enabled Research Group at MIT, argues that while commercial data is innovative and useful, a balance is essential. “Private companies alone cannot provide all the Earth-observation data that the world needs. Nor should they,” she writes. Wood points out that public missions are set up to answer scientific questions and maintain public services, providing a trusted benchmark for data quality. Commercial missions, in contrast, are more likely to collect data based on customer requests or market opportunities.

    This aligns with the perspective of the NASA employees who signed a statement called the Voyager Declaration, arguing that “Basic research … and the stewardship of the Earth are inherently governmental functions that cannot and will not be taken up by the private sector.” 

    Planetary scientist Michael Battalio of Yale University says that the most important question isn’t whether private companies can take over, but whether they would. He argues that a private company has a “fiduciary responsibility to only spend on infrastructure that provides a return on investment,” which may not align with the long-term research and maintenance required. “Separately,” he adds, “companies may be financially incentivized to not observe our planet,” citing fossil fuel companies as an example. 

Even if a transition to commercial earth science were to take place, says Bergin, there is a risk that when NASA steps back, a gap opens up before the private sector steps in. “We cannot afford to have that gap in the current political climate that we’re in,” she says.

    The view from the high ground

    In addition to its scientific and commercial value, Earth-observing capabilities are also deeply entwined with national security, says Bergin. “It protects people and it protects power,” Bergin explains, framing Earth observation as a dual-use technology, with “power” referring to military and geopolitical advantage. Bergin warns that in a transition to commercial imagery, any gap in access could present a direct security risk to the U.S.

    As polar ice melts, new shipping and military routes are opening in the Arctic. “If Russia and China have a clearer picture of the Arctic, for example, than we do, that’s not just a science gap, it’s a security risk,” Bergin warns. 

    A NASA brief on its Earth Science at Work initiative supports this view, stating its missions support national security by “enhancing situational awareness of ice cover and other conditions around Arctic seas.” Losing this capability, Bergin argues, means losing the upper hand. “We’re completely blind to not only what they’re doing, but we lose our decision advantage.”

    But protecting people isn’t just about military advantage; it’s also about safeguarding against infrastructure failure and natural disasters. Battalio says that government agencies from the USDA to the departments of Energy and Commerce depend on NASA’s observations. “NASA observations help farmers plan for droughts and floods so that we can feed ourselves, even with increasingly devastating natural disasters from our warming climate,” he says. “FEMA organizes aid based on NASA satellite imagery. The EPA uses NASA observations to monitor pollution and keep American citizens healthy.” 

    Bergin likens this to essential infrastructure, like the power grid or a bridge. “You don’t think about your power grid until you can’t turn your lights on,” she says. “Space is infrastructure. It’s critical infrastructure. It just happens to be above our planet.” This unseen infrastructure underpins modern life in countless ways. The ability to predict solar storms, a key function of Earth and space science, is crucial for protecting the GPS satellites that enable navigation and credit card transactions. 

    This support for the U.S. economy also extends to resource management. According to NASA, its data provides a “competitive advantage to American businesses” by aiding in tasks like “mapping rare Earth minerals” for the energy and technology sectors and helping farmers with “continuous measurements of water resources, crop health, and global production.” 

    And the data from Earth-observing satellites is essential for logistics. As Bergin notes, “Earth observation is critical to you getting your Amazon package.”

    Look to what you know

    The argument for shifting NASA’s focus to exploration rests on the premise that studying Earth and exploring space are two separate, competing missions. However, many scientists contend that the two are fundamentally intertwined. “We have no hope of understanding other planets if we do not understand the planet that we inhabit,” says Battalio.

    He explains that our knowledge of Earth provides the essential baseline for all planetary science. “When Mars rovers discover minerals that point to the presence of liquid water in the past, we know that is the case because we study Earth,” he says. “Everything we know about every planet and exoplanet is informed or interpreted against our knowledge of Earth.” 

    He cites his own work on martian dust storms, which was directly inspired by research on Earth’s climate patterns made possible by NASA observations. “Without NASA observing Earth, climatologists would not have discovered this variability, and I would not have known to look for it on Mars.”

    This synergy is at the heart of the scientific pushback against Duffy’s comments. Bergin points out the irony of searching for other worlds while deemphasizing our own. “People always ask me what’s your favorite planet? Earth obviously, right? It’s like so unique, like seriously we haven’t found anything like it,” she says. The scientific value of such a unique planet is immeasurable. “Why are we exploring if not to benefit life on Earth?” she asks. “It’s all to improve humanity, and humanity is never going to leave Earth. And so it all funnels back to Earth.”

    More than budget cuts

    Ultimately, the conversation sparked by Duffy’s comments reveals a fundamental choice about NASA’s identity. The path forward pits a vision of a streamlined agency, singularly focused on the outward push of exploration, against the view that NASA’s mission begins at home — that studying Earth is a foundational part of its mandate, a national security imperative, and a scientific necessity for the very exploration it seeks to champion. While the administration has proposed deep cuts, Congress has signaled resistance, leaving the final budget — and thus, the agency’s direction — in a state of negotiation.

    For some scientists, this debate extends beyond NASA’s budget, reflecting a broader pattern. Kopp sees a parallel between the proposed shift at NASA and what he calls “’science’ being manufactured to serve a political end,” arguing that “shutting down research to slow the growth of scientific understanding would be in line with that.” This perspective frames the choice facing NASA not just as a strategic decision, but as a political one with implications for the role of independent science in public policy.

    For observers like Bergin, the outcome is not predetermined. She argues that public awareness and engagement are crucial. “Even though you are one person, one voice, your voice does matter,” she says, encouraging people to stay informed and talk to others in their community. The resolution of this debate, which will be decided in the halls of Congress but influenced by public sentiment, will define not just NASA’s priorities, but its very purpose for a generation to come.

    Continue Reading

  • UAH researchers use X-rays from quasars to answer one of the three major questions in cosmology: where are the missing baryons?

    UAH researchers use X-rays from quasars to answer one of the three major questions in cosmology: where are the missing baryons?

    BYLINE: Russ Nelson

    Researchers at The University of Alabama in Huntsville (UAH), a part of The University of Alabama System, have published a series of two papers in the Monthly Notices of the Royal Astronomical Society that resolve one of three major outstanding puzzles in cosmology: the “missing baryon problem,” a discrepancy between the amount of baryonic matter detected from shortly after the Big Bang when compared with recent epochs. Dr. Massimiliano “Max” Bonamente, a professor of physics and astronomy, along with Dr. David Spence and international colleagues, used X-ray radiation from quasars to determine the “missing” particles reside in the warm-hot intergalactic medium, or WHIM, a state of matter characterized by low density and high temperatures.

    “This is the result of over 10 years of work at UAH, primarily by myself and a recent graduate of ours, David Spence, also in collaboration with several scientists worldwide,” Bonamente says. “With three major problems in modern cosmology – missing baryons, identification of dark matter, identification of dark energy – it’s case closed on the first.”

    Baryons are subatomic particles, most commonly protons and neutrons, that comprise the bulk of visible matter in the universe. The WHIM is a crucial component of the cosmic web, the large-scale structure of the universe composed of enormous filaments of dark matter and gas that connect galaxies to one another.

    “The universe is believed to start off as a ‘ball of fire,’ the Big Bang; it then cools and forms structures on various scales: stars, galaxies, filaments of galaxies,” Bonamente notes. “The gas is attracted by the gravity of these filaments that stretch hundreds of millions of light-years, and the gas heats back up as it falls towards the WHIM. This is something any graduate student in physics would learn in their classical dynamics course, so astronomers were confident in this simple picture.”

    Current observations suggest baryons make up about five percent of the total energy density of the universe. Yet, a significant fraction of present-day baryons remained unaccounted for in deep far-ultraviolet (FUV) searches. A census of baryons in the recent observable universe found that the observed baryonic matter accounts for about one half the expected amount.

    “The location of the missing baryons near these WHIM structures was proposed back in 1999 in a seminal paper by Princeton scientists, and then consistently seen ever since in all subsequent simulations,” the researcher says. “But simulations aren’t real, and we needed to look in the real, uppercase Universe.”

    To accomplish this, Bonamente, Spence and their colleagues analyzed X-ray sources from the European Space Agency’s orbiting X-ray telescope XMM-Newton and NASA’s Chandra Observatory to augment the FUV findings to make the breakthrough.

The core of the discovery lies in studying the cosmological density of missing baryons by analyzing X-ray absorption lines in quasars, focusing on the warm-hot intergalactic medium. In quasars, X-ray absorption lines arise when X-rays emitted from the quasar’s central black hole pass through intervening gas clouds, either within the quasar’s host galaxy or even farther away in the intergalactic medium.

    Staying grounded

The study made a systematic search for the absorption lines of highly ionized oxygen atoms in the spectra of 51 XMM-Newton and Chandra background quasars. These show up as “dark lines” in the X-ray spectrum, created when specific wavelengths are absorbed by atoms in the gas. By measuring the properties of these absorption lines, such as their strength and velocity, astronomers can learn about the physical conditions and the amount of the absorbing gas.
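The papers’ actual spectral modeling is not detailed here; as a hedged, self-contained sketch of the basic idea, the snippet below fits a Gaussian absorption line (placed near the O VII line at about 21.6 angstroms) on a flat quasar continuum and reports its equivalent width, the quantity that, combined with atomic physics, translates into a column density of absorbing gas. The synthetic spectrum and parameter values are illustrative assumptions, not data from the study.

```python
# Hedged sketch of measuring an X-ray absorption line in a quasar spectrum;
# illustrative only, not the analysis pipeline used in the UAH papers.
import numpy as np
from scipy.optimize import curve_fit

def continuum_with_line(wavelength, continuum, depth, center, width):
    """Flat continuum minus one Gaussian absorption line."""
    return continuum - depth * np.exp(-0.5 * ((wavelength - center) / width) ** 2)

# Synthetic spectrum around the O VII resonance line at ~21.6 angstroms
# (rest frame); real data would come from XMM-Newton or Chandra gratings.
rng = np.random.default_rng(1)
wl = np.linspace(21.0, 22.2, 300)
flux = continuum_with_line(wl, 1.0, 0.15, 21.6, 0.02) + rng.normal(0.0, 0.02, wl.size)

popt, _ = curve_fit(continuum_with_line, wl, flux, p0=[1.0, 0.1, 21.6, 0.03])
continuum, depth, center, width = popt

# Equivalent width of a Gaussian line: (depth / continuum) * width * sqrt(2*pi).
ew = depth * width * np.sqrt(2.0 * np.pi) / continuum
print(f"line center: {center:.3f} A, equivalent width: {abs(ew) * 1e3:.1f} mA")
```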

    “For nearly 20 years, astronomers used a single source – or a few at most – to study this effect. This results in a strong bias of results, and the problem that one source (or few) are not representative of the whole universe,” Bonamente notes. “So, we wanted to amend that problem, and the only way to do it is to go ‘big’ with the largest sample we could come up with. Statistics demands a large sample in order to make the best possible estimates, so we did just that.”

The results of this analysis contribute to the characterization of the missing baryons, indicating that they are associated with the high-temperature portion of the WHIM, and possibly with large-scale WHIM filaments traced by galaxies, as predicted by numerical simulations and by other independent probes.

    “It is only in X-rays that the relevant absorption lines occur,” Bonamente says. “This is something that is dictated by the atomic structures. The calculations of the wavelengths where the absorption lines occur – precisely in X-rays – are reliable, and make use of standard quantum-mechanics calculations that have been confirmed in our laboratories. So, we had no other choice: hot gas at those temperatures are only ‘visible’ in X-rays.”

    Looking to the future of this research, Bonamente says there are still plenty of questions to answer.

    “There is work to do to improve the characterization of the WHIM and the missing baryons,” the researcher says. “What is exactly the temperature of the WHIM? How do these baryons distribute in the cosmos – closer to galaxy clusters or more in the intergalactic space? Are they rich in ‘metals’ or mostly hydrogen and helium? Answers to these questions will reduce the error bars on our measurements.”

    With regards to solving the big challenges in cosmology in general, Bonamente likes to take a well-grounded approach.

    “The other problems in cosmology are more speculative, in my opinion, and it may even be that there is no dark matter or dark energy after all. But it’s good to know that the ordinary matter is as it should be. Cosmologists – and many other scientists – like to look at headline-grabbing, but speculative topics such as dark energy, when in fact there are problems more down-to-earth – pun intended – that need to be solved first. Wouldn’t you want to make sure your shoelaces are tied before you run the 100 meters at the Olympics?”

    Bonamente adds that an effort of this magnitude could not be done without the support of many other dedicated scientists who bought into the idea and shared in the hard work.

    “Throughout the years, I’ve had the good fortune of being surrounded by great colleagues who made this project happen,” the researcher concludes. “It’s my pleasure to thank and acknowledge Drs. Jussi Ahoranta and Kimmo Tuominen from Helsinki University, Dr. Natasha Wijers of Northwestern University and Dr. Jelle de Plaa of SRON Utrecht, who co-authored the papers describing these findings. And my good friend Dr. Jukka Nevalainen, who has also been a long-term collaborator on this and many other projects.”



    Continue Reading

  • A spine-tingling discovery: This dinosaur had spiked body armor

    A spine-tingling discovery: This dinosaur had spiked body armor

    Fossils of the creature Spicomellus revealed elaborate body armor used to attract mates and deter rivals. Image: Matt Dempsey/The Natural History Museum, London

A dinosaur that roamed modern-day Morocco more than 165 million years ago had a neck covered in three-foot-long spikes, a weapon on its tail and bony body armor, according to researchers who unearthed the curious beast’s remains.

    The discovery of the animal Spicomellus in the Moroccan town of Boulemane painted a clearer picture of the bizarre, spiked ankylosaur, which was first described in 2021 based on the discovery of a single rib bone.

    Researchers now understand that the four-legged herbivore, which was about the size of a small car, was much more elaborately armored than originally believed, according to research published last month in the journal Nature.

    “Spicomellus had a diversity of plates and spikes extending from all over its body, including metre-long neck spikes, huge upwards-projecting spikes over the hips, and a whole range of long, blade-like spikes, pieces of armour made up of two long spikes, and plates down the shoulder,” research co-lead Susannah Maidment said in a statement to London’s Natural History Museum.

    “We’ve never seen anything like this in any animal before.”

The Spicomellus’ ribs were lined with fused spikes projecting outward — a feature never witnessed before in any other vertebrate, living or extinct.

    Co-lead of the project Richard Butler, a paleobiology professor at the University of Birmingham, described seeing the fossil for the first time as “spine-tingling.”

    “We just couldn’t believe how weird it was and how unlike any other dinosaur, or indeed any other animal we know of, alive or extinct,” Butler told the Natural History Museum.

    “It turns much of what we thought we knew about ankylosaurs and their evolution on its head and demonstrates just how much there still is to learn about dinosaurs,” he added.

Researchers suggest that the Spicomellus’ complex bone structure was used both to attract mates and deter rivals.

The discovery that the dinosaur had such elaborate armor, possibly prioritizing form as much as function, set the animal apart from its predecessors, which carried sparser, more strictly defensive covering on their bodies.

In addition to showy barbs along Spicomellus’ exterior, remains of the animal’s tail also provided a stunning new detail for scientists.

Fused vertebrae going down into its tail formed a “handle,” likely leading to a club-like weapon at the end — a feature that scientists had previously believed did not evolve in ankylosaurs until the Cretaceous period, millions of years later.

    “To find such elaborate armour in an early ankylosaur changes our understanding of how these dinosaurs evolved,” Maidment said.

    “It shows just how significant Africa’s dinosaurs are, and how important it is to improve our understanding of them,” she said.

    Continue Reading

  • The Butterfly Star And Its Planet-Forming Disk

    The Butterfly Star And Its Planet-Forming Disk

    The Taurus star-forming region is only a few hundred light-years away, and it may be the nearest star formation region to Earth. It’s a stellar nursery with hundreds of young stars, and attracts a lot of astronomers’ attention. One of the young stars in Taurus is named IRAS 04302. IRAS 04302 is sometimes called the “Butterfly Star” because of its appearance when viewed edge-on.

    The JWST image of IRAS 04302 is the latest ESA/Webb Picture of the Month.

    Astronomers are intensely interested in the details of planet formation, and one of the JWST’s science goals is the study of planets forming in protoplanetary disks around young stars like IRAS 04302. Many of the images we have of planets forming in protoplanetary disks come from ALMA, the Atacama Large Millimeter/submillimeter Array. These images show the disks in a “top down” orientation. In those images, astronomers can spot the rings and gaps that signal planet formation.

    ALMA has captured many images of protoplanetary disks around young stars. These three are typical, and show gaps and rings where planets are likely forming. As young planets take shape, they sweep up gas and dust in the disk, creating the gaps. Image Credit: ALMA/ESO

    IRAS 04302 is oriented so that we see its protoplanetary disk from the side. IRAS 04302 is a fine example of a young star that is still accreting mass while planets could be forming in its protoplanetary disk, and the edge-on view provides more than just a pretty picture. This viewpoint gives astronomers a different look at disks. It shows the disk’s vertical structure and can reveal how thick the dusty disk is.

    In this image, the dust disk acts almost like a coronagraph, blocking out some of the star’s light and making detail in the disk stand out. Reflection nebulae on either side of the disk are illuminated by the star, giving IRAS 04302 its nickname Butterfly Star.

    The image is created from the JWST’s Mid-Infrared Instrument (MIRI) and Near-Infrared Camera (NIRCam), and the Hubble also contributed optical data. The Webb shows how dust grains are distributed and how dust extending out from the disk reflects near-infrared light. The Hubble shows the dust lane itself, as well as clumps and streaks, evidence that the star is still gathering mass. It also shows jets and outflows, more evidence of its ongoing growth.

IRAS 04302 as imaged with the JWST. The disk is about 65 billion km across, making it several times larger than our Solar System. Image Credit: ESA/Webb, NASA & CSA, M. Villenave et al. LICENCE: CC BY 4.0 INT

There’s no scientific journal devoted solely to protoplanetary disks, but there could be, considering how much research goes into them. These JWST images are more than just pictures; they’re associated with a study published in The Astrophysical Journal titled “JWST Imaging of Edge-on Protoplanetary Disks. II. Appearance of Edge-on Disks with a Tilted Inner Region: Case Study of IRAS04302+2247.” The lead author is Marion Villenave from NASA’s Jet Propulsion Laboratory.

    “Because planet formation occurs in the protoplanetary disk phase, studying protoplanetary disk evolution can allow us to better understand planet formation,” the article’s authors write. The main thrust of this type of research is to understand how tiny dust particles gradually form kilometer-sized bodies that eventually form planetesimals and then planets. It can take only a few million years, or even less, for these kilometer-size rocks to form. One of the big questions is sometimes called the “Bouncing Barrier.” The problem is that once dust grains reach a certain size, their collisions are more energetic. Instead of sticking to one another, they bounce off each other. For planetesimals to form, some force has to overcome the Bouncing Barrier.

This figure from the research is an image gallery of the JWST observations of IRAS04302. Image Credit: M. Villenave et al. 2025. ApJ

    “In the current paradigm, high dust concentrations are thought to accelerate grain growth by promoting disk instabilities that lead to planetesimal formation (e.g., streaming instability), and subsequently allowing efficient growth via pebble accretion,” the authors write.

    Answers to the Bouncing Barrier and other questions regarding planet formation can only be found in protoplanetary disks. In this research, the scientists examined IRAS 04302’s edge-on disk hoping to find clues. One of the answers to planet formation questions may lie in dust settling.

    “Dust vertical settling in the disk is the result of gas drag on dust grains subject to stellar gravity and gas turbulence,” the authors write. “This mechanism leads large dust grains to fall into the disk midplane and accumulate there, which is favorable for planet formation.” The authors note that this mechanism is poorly constrained by observations.
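The paper’s quantitative treatment is not reproduced in the article; as an illustrative assumption, a standard estimate balances settling against turbulent stirring, giving a dust scale height of roughly H_dust ≈ H_gas · sqrt(α / (α + St)) for grains of Stokes number St in a disk with turbulence parameter α. A minimal numerical sketch:

```python
# Back-of-the-envelope dust settling, assuming the standard relation
# H_dust / H_gas ~ sqrt(alpha / (alpha + St)); all values are illustrative.
import numpy as np

def dust_to_gas_scale_height(alpha, stokes):
    """Ratio of dust to gas scale height for turbulence alpha and Stokes number St."""
    return np.sqrt(alpha / (alpha + stokes))

alpha = 1e-3  # assumed turbulence strength
for stokes in (1e-4, 1e-2, 1e-1):
    ratio = dust_to_gas_scale_height(alpha, stokes)
    print(f"St = {stokes:.0e}: H_dust/H_gas ~ {ratio:.2f}")

# Small grains (low St) stay well mixed with the gas, while larger grains
# settle toward the midplane -- exactly the vertical structure an edge-on
# view like that of IRAS 04302 lets observers test directly.
```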

    This is why IRAS 04302 is such a desirable target.

    “Highly inclined protoplanetary disks are favorable targets to investigate this mechanism because they allow a direct view of the disk’s vertical structure,” the researchers explain.

    The authors observed that IRAS 04302’s inner disk is tilted and asymmetrical, as are 15 out of 20 other observed edge-on disks. If tilt and asymmetry are this common, it has implications. It affects how disks evolve and how their dynamics play out. In turn, it must affect how planets form, and what the eventual architecture of a solar system will be.

This figure shows 20 observed edge-on disks. 15 of them show clear asymmetry, while five do not. Though the five that are considered symmetrical have some curves, they’re not curved enough to be considered asymmetrical. Image Credit: M. Villenave et al. 2025. ApJ

    The researchers didn’t reach a clear conclusion for how all of this works. No single study can answer all of our questions, but each one nudges us toward a greater understanding. They note that further observations will deepen their understanding of tilted disks and how they affect planet formation.

    Continue Reading

  • A new generative AI approach to predicting chemical reactions | MIT News

    A new generative AI approach to predicting chemical reactions | MIT News

    Many attempts have been made to harness the power of new artificial intelligence and large language models (LLMs) to try to predict the outcomes of new chemical reactions. These have had limited success, in part because until now they have not been grounded in an understanding of fundamental physical principles, such as the laws of conservation of mass. Now, a team of researchers at MIT has come up with a way of incorporating these physical constraints on a reaction prediction model, and thus greatly improving the accuracy and reliability of its outputs.

    The new work was reported Aug. 20 in the journal Nature, in a paper by recent postdoc Joonyoung Joung (now an assistant professor at Kookmin University, South Korea); former software engineer Mun Hong Fong (now at Duke University); chemical engineering graduate student Nicholas Casetti; postdoc Jordan Liles; physics undergraduate student Ne Dassanayake; and senior author Connor Coley, who is the Class of 1957 Career Development Professor in the MIT departments of Chemical Engineering and Electrical Engineering and Computer Science.

“The prediction of reaction outcomes is a very important task,” Joung explains. For example, if you want to make a new drug, “you need to know how to make it. So, this requires us to know what product is likely” to result from a given set of chemical inputs to a reaction. But most previous efforts to carry out such predictions look only at a set of inputs and a set of outputs, without examining the intermediate steps or enforcing the constraint that no mass is gained or lost along the way, something that cannot happen in actual reactions.

    Joung points out that while large language models such as ChatGPT have been very successful in many areas of research, these models do not provide a way to limit their outputs to physically realistic possibilities, such as by requiring them to adhere to conservation of mass. These models use computational “tokens,” which in this case represent individual atoms, but “if you don’t conserve the tokens, the LLM model starts to make new atoms, or deletes atoms in the reaction.” Instead of being grounded in real scientific understanding, “this is kind of like alchemy,” he says. While many attempts at reaction prediction only look at the final products, “we want to track all the chemicals, and how the chemicals are transformed” throughout the reaction process from start to end, he says.

    In order to address the problem, the team made use of a method developed back in the 1970s by chemist Ivar Ugi, which uses a bond-electron matrix to represent the electrons in a reaction. They used this system as the basis for their new program, called FlowER (Flow matching for Electron Redistribution), which allows them to explicitly keep track of all the electrons in the reaction to ensure that none are spuriously added or deleted in the process.

    The system uses a matrix to represent the electrons in a reaction, and uses nonzero values to represent bonds or lone electron pairs and zeros to represent a lack thereof. “That helps us to conserve both atoms and electrons at the same time,” says Fong. This representation, he says, was one of the key elements to including mass conservation in their prediction system.
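The article describes the bond-electron matrix only in words; the toy example below (my own illustration in the Dugundji-Ugi convention, not FlowER’s actual data structures) shows how such a matrix encodes a molecule and how the conservation check works: a mechanistic step only redistributes electrons if its change matrix sums to zero.

```python
# Toy bond-electron (BE) matrix in the Dugundji-Ugi convention;
# an illustration only, not FlowER's actual implementation.
import numpy as np

# Atoms of a water molecule: O, H, H.
atoms = ["O", "H", "H"]

# Diagonal entries: free (lone-pair) electrons on each atom.
# Off-diagonal entries: formal bond order between atoms i and j.
be_water = np.array([
    [4, 1, 1],   # O: two lone pairs, single bonds to each H
    [1, 0, 0],   # H bonded to O
    [1, 0, 0],   # H bonded to O
])

# With this convention, the sum of all entries equals the total number of
# valence electrons (8 for H2O: 6 from oxygen plus 1 from each hydrogen).
assert int(be_water.sum()) == 8

# A mechanistic step is written as BE_products = BE_reactants + R.
# Conserving electrons means the entries of the change matrix R sum to zero.
def conserves_electrons(reaction_matrix):
    return int(reaction_matrix.sum()) == 0

# Example step: heterolytic cleavage of one O-H bond, parking both bonding
# electrons as a new lone pair on oxygen (giving OH- and a bare proton).
r_step = np.array([
    [ 2, -1, 0],
    [-1,  0, 0],
    [ 0,  0, 0],
])
be_products = be_water + r_step
print(conserves_electrons(r_step), int(be_products.sum()))  # True 8
```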

    The system they developed is still at an early stage, Coley says. “The system as it stands is a demonstration — a proof of concept that this generative approach of flow matching is very well suited to the task of chemical reaction prediction.” While the team is excited about this promising approach, he says, “we’re aware that it does have specific limitations as far as the breadth of different chemistries that it’s seen.” Although the model was trained using data on more than a million chemical reactions, obtained from a U.S. Patent Office database, those data do not include certain metals and some kinds of catalytic reactions, he says.

    “We’re incredibly excited about the fact that we can get such reliable predictions of chemical mechanisms” from the existing system, he says. “It conserves mass, it conserves electrons, but we certainly acknowledge that there’s a lot more expansion and robustness to work on in the coming years as well.”

    But even in its present form, which is being made freely available through the online platform GitHub, “we think it will make accurate predictions and be helpful as a tool for assessing reactivity and mapping out reaction pathways,” Coley says. “If we’re looking toward the future of really advancing the state of the art of mechanistic understanding and helping to invent new reactions, we’re not quite there. But we hope this will be a steppingstone toward that.”

    “It’s all open source,” says Fong. “The models, the data, all of them are up there,” including a previous dataset developed by Joung that exhaustively lists the mechanistic steps of known reactions. “I think we are one of the pioneering groups making this dataset, and making it available open-source, and making this usable for everyone,” he says.

    The FlowER model matches or outperforms existing approaches in finding standard mechanistic pathways, the team says, and makes it possible to generalize to previously unseen reaction types. They say the model could potentially be relevant for predicting reactions for medicinal chemistry, materials discovery, combustion, atmospheric chemistry, and electrochemical systems.

    In their comparisons with existing reaction prediction systems, Coley says, “using the architecture choices that we’ve made, we get this massive increase in validity and conservation, and we get a matching or a little bit better accuracy in terms of performance.”

    He adds that “what’s unique about our approach is that while we are using these textbook understandings of mechanisms to generate this dataset, we’re anchoring the reactants and products of the overall reaction in experimentally validated data from the patent literature.” They are inferring the underlying mechanisms, he says, rather than just making them up. “We’re imputing them from experimental data, and that’s not something that has been done and shared at this kind of scale before.”

    The next step, he says, is “we are quite interested in expanding the model’s understanding of metals and catalytic cycles. We’ve just scratched the surface in this first paper,” and most of the reactions included so far don’t include metals or catalysts, “so that’s a direction we’re quite interested in.”

    In the long term, he says, “a lot of the excitement is in using this kind of system to help discover new complex reactions and help elucidate new mechanisms. I think that the long-term potential impact is big, but this is of course just a first step.”

    The work was supported by the Machine Learning for Pharmaceutical Discovery and Synthesis consortium and the National Science Foundation.

    Continue Reading

  • The cling of doom: How staph bacteria latch onto human skin

    The cling of doom: How staph bacteria latch onto human skin

Image: Illustration of the molecular handshake driving Staphylococcus aureus adhesion to human skin. The bacterial adhesin SdrD (purple) binds tightly to the host receptor desmoglein-1 (DSG-1, orange) on keratinocytes, with calcium ions (Ca²⁺) stabilizing the interaction. This calcium-dependent bond enables S. aureus to attach strongly to the skin surface, providing a molecular explanation for the bacterium’s ability to resist mechanical forces and establish infection. Credit: Department of Physics – Auburn University

    (Auburn, AL) Imagine a child with eczema who scratches a patch of irritated skin. A tiny opening forms, invisible to the eye. Into that breach slips a common bacterium, Staphylococcus aureus. For many people, the bacteria would remain harmless. But in someone with a weakened skin barrier, the microbe can cling tightly, multiply, and trigger an infection that is difficult to control. In severe cases, staph spreads beyond the skin and becomes life-threatening. Resistant strains such as MRSA turn what should be a treatable infection into a medical nightmare, one that claims tens of thousands of lives each year in the United States alone.

    The question that has puzzled researchers for years is why staph bacteria cling so tenaciously to human skin. A new study, co-led by Auburn University’s Department of Physics alongside scientists in Belgium and the United Kingdom, has uncovered the answer. Published in Science Advances, the research shows that staph locks onto human skin with the strongest biological grip ever measured, stronger than superglue and nearly unmatched in nature.

    At the center of this discovery is a bacterial protein called SdrD, which the pathogen uses like a grappling hook to attach itself to a human protein called desmoglein-1. The bond between the two is unlike anything seen before. It withstands forces so powerful that they rival the strength of some chemical bonds. This helps explain why staph bacteria remain attached to the skin even after scratching, washing, or sweating. “It is the strongest non-covalent protein-protein bond ever reported,” says Rafael Bernardi, Associate Professor of Physics at Auburn University and one of the senior authors. “This is what makes staph so persistent, and it helps us understand why these infections are so difficult to get rid of.”

    The study also revealed that calcium, an element better known for strengthening bones, plays a key role in fortifying this bacterial grip. When calcium levels were reduced in laboratory experiments, the bond between SdrD and desmoglein-1 weakened significantly. When calcium was added back, the bond became even stronger. This finding is particularly relevant for patients with eczema, where calcium balance in the skin is disrupted. Instead of protecting the skin, these irregular levels may actually make staph’s grip tighter. “We were surprised to see how much calcium contributed to the strength of this interaction,” explains Priscila Gomes, a researcher in Auburn’s Department of Physics and co-author of the study. “It not only stabilized the bacterial protein, it made the whole complex much more resistant to breaking.”

    To uncover these details, the team combined single-molecule experiments with advanced computational simulations. Using atomic force microscopy, researchers in Europe measured the force of a single staph bacterium attaching to human skin proteins. Meanwhile, Auburn physicists modeled the interaction atom by atom on powerful supercomputers. The two approaches converged on the same remarkable conclusion: SdrD’s grip on desmoglein-1 is stronger than any other protein bond known in biology.
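The release does not give the numbers or the model used to interpret the pulling experiments; as a hedged illustration of how single-molecule rupture forces are commonly analyzed, the Bell-Evans relation predicts how the most probable rupture force grows with the loading rate. All parameter values below are placeholders, not those of the SdrD:desmoglein-1 study.

```python
# Hedged illustration of the Bell-Evans model commonly used to interpret
# single-molecule rupture forces from atomic force microscopy experiments.
# Parameter values are placeholders, not those of the SdrD:DSG-1 complex.
import numpy as np

kBT = 4.11e-21       # thermal energy at ~298 K, in joules
x_beta = 0.3e-9      # assumed distance to the transition state, in meters
k_off = 1e-4         # assumed zero-force dissociation rate, in 1/s

def most_probable_rupture_force(loading_rate):
    """Bell-Evans estimate of rupture force (N) at a given loading rate (N/s)."""
    return (kBT / x_beta) * np.log(loading_rate * x_beta / (k_off * kBT))

for rate in (1e-9, 1e-8, 1e-7):
    force = most_probable_rupture_force(rate)
    print(f"loading rate {rate:.0e} N/s -> rupture force ~{force * 1e12:.0f} pN")

# Typical receptor-ligand bonds rupture at tens to hundreds of piconewtons;
# a bond that rivals chemical-bond strength, as reported here, sits far
# above that, in the nanonewton regime.
```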

    This discovery opens the door to new strategies for combating antibiotic-resistant infections. Instead of trying to kill bacteria directly, which often drives the evolution of resistance, scientists could design therapies that block or weaken bacterial adhesion. If staph cannot cling to the skin, the immune system has a better chance of clearing it before infection takes hold. “By targeting adhesion, we are looking at a completely different way to fight bacterial infections,” Bernardi says. “We are not trying to destroy the bacteria, but to stop them from latching on in the first place.”

    For the Department of Physics at Auburn University, the study highlights the growing role of biophysics in addressing urgent problems in human health. By combining physical measurements, biological insights, and international teamwork, the researchers have solved a long-standing mystery of staph pathogenesis and uncovered a potential weakness that could be exploited in future therapies. As Gomes reflects, “This project shows how much can be achieved when different fields and different countries come together to answer questions that none of us could solve alone.”

    The discovery of the strongest protein bond in nature not only sets a new benchmark in biophysics but also provides a fresh perspective on how to outsmart one of the most stubborn pathogens in medicine.

    —-

    About Auburn’s Computational Biophysics Group

    The Computational Biophysics Group at Auburn University’s Department of Physics uses advanced computer simulations to uncover how biological molecules behave under force. By blending physics, biology, and high-performance computing, the team develops cutting-edge software, collaborates with experimental partners worldwide, and trains the next generation of scientists at the interface of physics, chemistry, and life sciences.



    Continue Reading

  • Bigger is Better for Interstellar Spacecraft | by Avi Loeb | Sep, 2025

    Bigger is Better for Interstellar Spacecraft | by Avi Loeb | Sep, 2025


    (Credit: Adrian Mann)

    All spacecraft ever built by humans are smaller than 100 meters, the scale of a football field. They were all designed to explore the solar system and maintain their technological functions for less than a century.

However, exploration of interstellar space requires long journeys, lasting millions to billions of years. Since heat loss and the damage from impacts by interstellar radiation, micrometeorites, dust or gas particles scale with surface area (size squared), while precious commodities and power supply scale with volume (size cubed), it is advantageous to make interstellar spacecraft bigger. Another reason to go big is that one can create artificial gravity through the centrifugal acceleration of a rotating spacecraft. For a given rotation period, the centrifugal acceleration grows in proportion to the radius. The larger the spacecraft, the smaller the variation of the artificial gravity across a human body, a variation equivalent to a tidal force that could otherwise rip it apart.
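To make the scaling concrete, here is a minimal numerical sketch (my own assumed numbers, not from the essay) of spin gravity and its head-to-foot variation for habitats of different radii rotating with a fixed period:

```python
# Illustrative numbers for spin gravity on a rotating habitat; the radii,
# rotation period, and body height below are my own assumptions.
import math

def spin_gravity(radius_m, period_s):
    """Centrifugal acceleration a = omega^2 * r at radius r from the spin axis."""
    omega = 2.0 * math.pi / period_s
    return omega ** 2 * radius_m

period = 60.0    # assumed rotation period of one minute
height = 1.8     # height of a person standing on the rim, in meters

for radius in (50.0, 500.0, 5000.0):
    g_rim = spin_gravity(radius, period)
    head_to_foot = g_rim - spin_gravity(radius - height, period)
    print(f"r = {radius:6.0f} m: a = {g_rim:5.2f} m/s^2, "
          f"head-to-foot variation = {100.0 * head_to_foot / g_rim:.2f}%")

# For a fixed rotation period the rim acceleration grows linearly with radius,
# while the relative head-to-foot variation (~height/radius) shrinks, so a
# bigger spacecraft gives smoother artificial gravity.
```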

The reflection of sunlight from the interstellar object 3I/ATLAS suggests a diameter of 20–46 kilometers for the reflecting object or the dust cloud around it. This range of scales is similar to the dimensions of the interstellar spacecraft Rama in the science fiction novel “Rendezvous with Rama” by Arthur C. Clarke. The possibility that 3I/ATLAS is a giant spacecraft, as suggested by its fine-tuned alignment with the planets around the Sun, was discussed in an email that I received today from the co-founder of the Galileo Project and CEO of the Bruker Corporation, Dr. Frank Laukien, who wrote:

    Hi Avi,

    As I have been following your essays on 3I/ATLAS, I realized that for future biological space exploration by humans (not just AI machines) and the settlement of habitable-life zone exoplanets or exomoons, or perhaps human life on space stations, will require megastructures.

    3I/ATLAS with 20–46 km potential core diameter (unless there is a dense dust cloud) might be the right order of magnitude, … much, much larger than spaceships that we may have conceived of so far at NASA, SpaceX or in SciFi movies. These megastructures would have to be assembled in orbit, and large Starship SpaceX rockets can help with lift-off from Earth to assemble them.

    Why so big?

    A first analogy was the thick fur and body size of large mammals during the last glacial period, aka the ‘Ice Age’, which had to contend with bitter cold, and a lower surface-to-volume ratio was beneficial to preserve warmth. During space travel, we have to contend with high-energy radiation … so we need a thick outer layer for shielding, and that can be destroyed by cosmic dust and radiation over time, while preserving the interior. For this, small surface-to-volume ratio and a thick ‘skin’ are favorable, which makes very large structures or megastructures the preferred architecture over small spaceships or International-Space-Station type small structures.

    In addition, we will need small modular reactor fission or large (3 gigawatts) fusion power plants with full tritium breeding blanket and tritium full cycle. Such a nearly inexhaustible fusion energy reactor and power plant tend to be big and heavy (of the order of 100 meters in diameter, and a mass of ~50 kilotons), and again only compatible with megastructures.

    Such megastructures with a vast amount of energy could generate the lighting conditions and atmospheres we humans need from our evolutionary history on Earth, and could support synthetic agriculture, water and oxygen renewal, cryogenics to freeze space travelers, or enough space for perhaps 10–100 thousand humans on such a space megastructure with a habitable-zone conditions inside its protective skin, and with enough space that passengers don’t all go totally claustrophobic and insane.

    In that sense, just the speculations about 3I/ATLAS perhaps being technological has already been eye-opening for me, in that megastructures make so much sense for human space exploration or as star-orbiting living platforms…

    Best wishes,

    Frank

    In order to acquire more inspiration from 3I/ATLAS, we need to image it. On October 3, 2025, 3I/ATLAS will pass at a distance of 29 million kilometers from Mars. At that time, several Mars orbiters will have the opportunity to image it. First, NASA’s HiRISE camera onboard the Mars Reconnaissance Orbiter will be able to image 3I/ATLAS with a resolution of 30 kilometers per pixel. China also has its Tianwen-1 orbiter, which carries a comparable high-resolution camera. The European Space Agency (ESA) is planning to use spectrometers as well as the High-Resolution Stereo Camera onboard the Mars Express and the Color and Stereo Surface Imaging System onboard the ExoMars Trace Gas Orbiter.
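As a quick sanity check on that figure (my own arithmetic, assuming HiRISE’s roughly one-microradian pixel scale, the value implied by its ~0.3 m per pixel performance from a ~300 km Mars orbit):

```python
# Back-of-the-envelope check of the quoted HiRISE resolution for 3I/ATLAS.
# The ~1 microradian per pixel scale is an assumption inferred from HiRISE's
# roughly 0.3 m/pixel imaging from about 300 km above Mars.
pixel_scale_rad = 0.3 / 300e3        # ~1e-6 radians per pixel
distance_m = 29e9                    # 29 million kilometers, in meters
print(pixel_scale_rad * distance_m / 1e3, "km per pixel")  # ~29 km per pixel
```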

    But with a traditional mindset of comet experts, the anomalies in the data might not be recognized as such. The manager of the Galileo Project, Dr. Zhenia Shmeleva, emailed me today a related insight:

    Dear Avi,

    Your latest essay made me think about how we approach the unknown, and it brought to mind one of my favorite books, Solaris by Stanislaw Lem, where he questions whether we seek other worlds or only ourselves.

    In the common-room scene in the spaceship over Solaris, Snaut says: “We have no need of other worlds. We need mirrors. We don’t know what to do with other worlds… We don’t want to conquer the cosmos; we simply want to extend the boundaries of Earth to the frontiers of the cosmos… We are only seeking Man. We have no need of other worlds. A single world, our own, suffices us; but we can’t accept it for what it is.”

    Snaut’s critique is harsh. When we set out into the cosmos, we claim to seek other civilizations, knowledge, and the alien. In truth, we tend to extend humanity outward and encounter ourselves again in different forms. We colonize conceptually and practically, imposing our own categories rather than engaging with what is genuinely other. When we meet something truly alien, like the Solaris-ocean, we recoil because it fails to mirror us and does not fit our frameworks.

    This is a real failure mode: treating the unknown as confirmation of ourselves. How should science fight this, how can we prevent anthropocentric biases from steering hypotheses and interpretations? A tricky question for me.

    Best wishes,

Zhenya

Speaking about anthropocentric biases, the visionary congresswoman Anna Paulina Luna, who chairs the Task Force on the “Declassification of Federal Secrets,” announced today a congressional hearing on September 9, 2025 to discuss the transparency of the U.S. government regarding Unidentified Anomalous Phenomena (UAP) — which motivate data collection by three Galileo Project observatories. Congressional Task Force members will hear from witnesses on concerns regarding the disclosure of UAP and information held by federal agencies.

    When reading the morning news, I often feel like being in a party where the attendees are misbehaving. All I can hope for is that new guests in the form of 3I/ATLAS or UAP will improve the situation.

    ABOUT THE AUTHOR


    (Image Credit: Chris Michel, National Academy of Sciences, 2023)

Avi Loeb is the head of the Galileo Project, founding director of Harvard University’s Black Hole Initiative, director of the Institute for Theory and Computation at the Harvard-Smithsonian Center for Astrophysics, and the former chair of the astronomy department at Harvard University (2011–2020). He is a former member of the President’s Council of Advisors on Science and Technology and a former chair of the Board on Physics and Astronomy of the National Academies. He is the bestselling author of “Extraterrestrial: The First Sign of Intelligent Life Beyond Earth” and a co-author of the textbook “Life in the Cosmos”, both published in 2021. The paperback edition of his new book, titled “Interstellar”, was published in August 2024.

    Continue Reading