Long before the first mammals walked the Earth, the oceans were home to strange, powerful fish. Some of these ancient fish eventually crawled onto land and kicked off a new chapter in evolution.
Now, thanks to new fossil research, we’re learning even more about how these early animals lived – and ate.
In a new study led by scientists from Flinders University, researchers examined jawbones from 380-million-year-old lungfish found in Western Australia’s remote Gogo fossil field.
The findings shed light on the evolution of feeding strategies in fish that are closely related to all land-dwelling vertebrates, including us.
Biting power of ancient fish
The researchers used 3D finite element modeling (FEM), a tool more commonly seen in engineering, to digitally test how different fossilized jaws handled biting stress.
By running simulations on fish jawbones from multiple species, the team revealed surprising variation in shape, strength, and eating behavior.
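The study’s actual finite element models are far more detailed, but the core intuition – that a jaw’s cross-sectional geometry, not its outward bulk, determines how it handles load – can be sketched with simple beam theory. Everything below (dimensions, forces, function names) is illustrative and not taken from the paper:

```python
# Illustrative only: a jaw idealized as a rectangular cantilever beam.
# All dimensions and forces here are invented for demonstration.

def bending_stress(force_n, length_m, width_m, height_m):
    """Peak bending stress (Pa) at the fixed end of a rectangular cantilever.

    sigma = M * c / I, with M = F * L, c = h / 2, I = w * h**3 / 12.
    """
    moment = force_n * length_m             # bending moment at the jaw joint
    c = height_m / 2                        # neutral axis to outer surface
    inertia = width_m * height_m ** 3 / 12  # second moment of area
    return moment * c / inertia

# A deep, narrow section resists bending better than a wide, shallow one of
# equal cross-sectional area, because stress falls with the square of depth.
wide_shallow = bending_stress(50, 0.05, 0.01, 0.005)
deep_narrow = bending_stress(50, 0.05, 0.005, 0.01)
print(f"wide jaw: {wide_shallow / 1e6:.0f} MPa, deep jaw: {deep_narrow / 1e6:.0f} MPa")
```

Here the deeper section carries half the peak stress of the wider one, echoing the finding that outwardly “robust” jaws are not automatically the best at resisting biting stress.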
“Lungfish are ‘sister taxa’ to the tetrapods – or all four-limbed animals with a backbone, including humans – which means they are our closest ‘fishy’ relatives,” noted Dr. Alice Clement.
“They have an extensive fossil history stretching back over 400 million years, with living representatives still around today. Their phylogenetic proximity to tetrapods gives insight into our long-distant ancestors who first made the move from water to land.”
That close relationship makes lungfish fossils especially valuable. The Gogo Formation in northern Western Australia is a goldmine of such specimens. With 11 known species, it holds the most diverse group of lungfish ever discovered from one place or time.
Fish bite force and eating style
Until now, scientists knew these fish had different jaw shapes. What they didn’t know was how those jaws worked. The new models changed that.
“We’re slowly teasing apart the details of how the bodies and lifestyles of these animals changed, as they moved from being fish that lived in water, to becoming tetrapods that moved about on land,” said Dr. Clement.
By scanning seven fossil species and applying FEM to five of them, the team was able to measure how the jawbones responded to stress during simulated bites.
Fossil bites defy expectations
The dataset is the most detailed look yet at biting performance in any fossil fish. It provides biomechanical evidence for diverse feeding adaptations and niche partitioning within Gogo lungfishes, noted Dr. Olga Panagiotopoulou, a functional anatomist at Touro University California.
Some of the findings contradicted assumptions. Jaws that looked sturdy on the outside didn’t always perform well in bite-force simulations. And some that looked more delicate turned out to be much stronger than expected.
“The results were somewhat surprising, with some ‘robust’-looking lower jaws appearing to not be all that well suited to biting stress, and some of the more gracile or slender jaws appeared to be able to withstand stress and strain very well,” said Professor John Long of Flinders University.
“This diversity of biomechanical function seen in the Gogo lungfishes suggests that there was niche partitioning and trophic differentiation among lungfishes, possibly accounting for their incredibly high species diversity at this site.”
When fish reinvented eating
The Devonian period, often called the “Age of Fishes,” was ruled by placoderms and other ancient species for about 60 million years. Many of their fossils were found decades ago, but new tools like FEM are finally revealing how these animals actually lived.
Joshua Bland, a researcher at the Flinders Palaeontology Lab, is the study’s lead author. He noted that the Gogo lungfish of the Late Devonian reefs were truly unique, with species possessing a host of different behaviors and abilities.
“To capture parts of that story, hidden in the bone, was extremely rewarding. It felt like we lifted the veil on some real functions behind the form. It was impressive to see the more complex morphology perform better in our tests,” said Bland.
Future science from old bones
All 3D models from the study are now available on Morphosource, allowing other researchers to explore and expand on this work.
By combining old fossils with new technology, the team has brought us one step closer to understanding the creatures that helped shape life on land.
The researchers have also shown that even after 380 million years, ancient bones can still have something new to say.
The full study was published in the journal iScience.
Image Credit: John Long, Flinders University
—–
Like what you read? Subscribe to our newsletter for engaging articles, exclusive content, and the latest updates.
Check us out on EarthSnap, a free app brought to you by Eric Ralls and Earth.com.
Disaster victims trapped beneath the rubble of a collapsed building or mine may one day be rescued by a tiny and unlikely savior: a beetle with a backpack.
Researchers have made major strides in cyborg technology, creating a breed of cyborg beetles that can climb walls, obstacles, and sloped surfaces while being remotely guided by a video game controller.
Called “ZoBorgs,” the cyborg beetles are a collaborative effort between The University of Queensland and the University of New South Wales, both in Australia, and Singapore’s Nanyang Technological University.
To imbue their darkling beetles (Zophobas morio) with remote control, the researchers equipped them with a microchip backpack that sends electrical signals to the beetles’ antennae or forewings (elytra), prompting them to move in different directions.
Darkling beetles are also known as ‘superworms’ for the worm-like form of their larvae. These creatures may help the world in multiple ways. Culinarily, they’re a rich source of fatty acids and protein, commonly consumed in countries like Mexico and Thailand.
The ZoBorg is able to navigate a complex environment by crossing obstacles, going up inclines, and climbing walls. (Fitzgerald et al., Adv. Sci., 2025)
The larvae also love dining on one of the world’s most prevalent plastics, polystyrene, which is used to make common conveniences like packing materials and disposable cutlery. This is not good for the beetles, but copying how they digest the substance could help us tackle the plastic waste problem.
At up to 32 millimeters (1.26 inches) in body length and about 8 millimeters (0.3 inches) in height, darkling beetles are small and nimble, possessing natural gifts that allow them to maneuver where robots cannot: within the tight confines of dense, jumbled rubble.
Featured in Advanced Science, the new study harnesses the beetles’ natural gifts and “adds programmable controls that allow for precise directional guidance, without affecting the lifespan of the beetle,” says engineer Thang Vo-Doan of the School of Mechanical and Mining Engineering at The University of Queensland.
These programmable controls are transmitted via a beetle-backpack with electrodes that act like electrical reins. Stimulating the antennae causes the beetle to turn, decelerate, or walk backwards. Stimulating both elytra causes acceleration or forward walking, while stimulating a single elytron causes sideways movement.
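The mapping from stimulation site to movement can be summarized as a lookup table. The sketch below is hypothetical – the names and structure are assumptions for illustration, not details from the ZoBorg control system:

```python
# Hypothetical sketch of the stimulus-to-movement mapping described above.

STIMULUS_RESPONSES = {
    "antennae": "turn_decelerate_or_back_up",  # antennal input steers or reverses
    "both_elytra": "accelerate_forward",       # both wing cases: forward drive
    "left_elytron": "sidestep",                # a single wing case: sideways movement
    "right_elytron": "sidestep",
}

def command(stimulus: str) -> str:
    """Translate an electrode stimulus into the expected beetle response."""
    if stimulus not in STIMULUS_RESPONSES:
        raise ValueError(f"unknown stimulation site: {stimulus}")
    return STIMULUS_RESPONSES[stimulus]

print(command("both_elytra"))  # accelerate_forward
```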
The ZoBorg’s major components. (Fitzgerald et al., Adv. Sci., 2025)
As a result, the ZoBorgs can cross obstacles equal to their body height with a success rate of 92 percent. They can also move from horizontal to vertical surfaces with a 71.2 percent success rate – a rate unmatched by previous cyborg insects or robots.
Lachlan Fitzgerald, an engineer at The University of Queensland, explains that while “robots at this scale have made strides in locomotion, the transition from horizontal surfaces to walls remains a formidable challenge for them.” But not so for the ZoBorgs.
The ZoBorg’s on-demand climbing protocol. (Fitzgerald et al., Adv. Sci., 2025)
Plus, using beetles means that researchers do not have to design actuators, sensors, or control systems – the beetles are already naturally equipped by many millions of years of evolutionary adaptations. These climbing adaptations include flexible, adhesive footpads, gripping claws, and rigid but agile body structures.
In combination with their antennae, insects use sensors in their legs and mechanoreceptors in their exoskeletons to sense physical stimuli, such as surface textures and vibrations.
Future advances may focus on improving the beetles’ climbing ability and autonomy by incorporating an inertial measurement unit (IMU) that provides real-time, non-visual data like acceleration and other forces.
The addition of a compact, lightweight visual camera can further boost control mechanisms, and will be necessary for identifying trapped individuals in search and rescue situations. Finally, cyborg advances described here could inspire innovations in robotics, such as the incorporation of beetle-like feelers to improve robots’ navigational abilities.
Notably, scientists maintained ethical practices to ensure the beetles’ well-being. Compared to other animals used in research, the beetles lived in relatively ritzy conditions, sleeping on wheat-bran bedding and eating fresh apple slices. Following the experiments, they received care for the remainder of their three-month lifespans.
This study demonstrates that cyborg science is making essential strides. It may not yet be the robotic organs promised by science fiction, but a cyborg beetle may be just as likely to save lives.
This article was originally published at The Conversation. The publication contributed the article to Space.com’s Expert Voices: Op-Ed & Insights.
About a century ago, scientists were struggling to reconcile what seemed a contradiction in Albert Einstein’s theory of general relativity.
Published in 1915, and already widely accepted by physicists and mathematicians, the theory was taken to describe a static universe – unchanging, unmoving and immutable. In short, Einstein believed the universe today was, more or less, the same size and shape it had always been.
But when astronomers looked into the night sky at faraway galaxies with powerful telescopes, they saw hints the universe was anything but that. These new observations suggested the opposite – that it was, instead, expanding.
Clouds of dust and gas mix among the stars of our universe. (Image credit: NASA/JPL-Caltech/S. Stolovy (Spitzer Science Center/Caltech))
Scientists soon realized Einstein’s theory didn’t actually say the universe had to be static; the theory could support an expanding universe as well. Indeed, by using the same mathematical tools provided by Einstein’s theory, scientists created new models that showed the universe was, in fact, dynamic and evolving.
I’ve spent decades trying to understand general relativity, including in my current job as a physics professor teaching courses on the subject. I know wrapping your head around the idea of an ever-expanding universe can feel daunting – and part of the challenge is overriding your natural intuition about how things work. For instance, it’s hard to imagine something as big as the universe not having a center at all, but physics says that’s the reality.
The space between galaxies
First, let’s define what’s meant by “expansion.” On Earth, “expanding” means something is getting bigger. And in regard to the universe, that’s true, sort of. Expansion might also mean “everything is getting farther from us,” which is also true with regard to the universe. Point a telescope at distant galaxies and they all do appear to be moving away from us.
What’s more, the farther away they are, the faster they appear to be moving. Those galaxies also seem to be moving away from each other. So it’s more accurate to say that everything in the universe is getting farther away from everything else, all at once.
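This “farther away, faster” pattern is the Hubble–Lemaître law: apparent recession velocity grows in direct proportion to distance, v = H0 × d. A minimal sketch, using a commonly quoted round value for the Hubble constant (my assumption; the article itself quotes no numbers):

```python
H0 = 70.0  # Hubble constant, km/s per megaparsec (a round, commonly used value)

def recession_velocity(distance_mpc):
    """Apparent recession velocity (km/s) of a galaxy at a given distance."""
    return H0 * distance_mpc

# Double the distance, double the apparent speed.
for d in (10, 100, 1000):  # distances in megaparsecs
    print(f"{d:>5} Mpc -> {recession_velocity(d):>8.0f} km/s")
```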
An illustration that shows the timeline of our universe, from the Big Bang to today. (Image credit: RubinObs/NOIRLab/SLAC/NSF/DOE/AURA)
This idea is subtle but critical. It’s easy to think about the creation of the universe like exploding fireworks: Start with a big bang, and then all the galaxies in the universe fly out in all directions from some central point.
But that analogy isn’t correct. Not only does it falsely imply that the expansion of the universe started from a single spot, which it didn’t, but it also suggests that the galaxies are the things that are moving, which isn’t entirely accurate.
It’s not so much the galaxies that are moving away from each other – it’s the space between galaxies, the fabric of the universe itself, that’s ever-expanding as time goes on. In other words, it’s not really the galaxies themselves that are moving through the universe; it’s more that the universe itself is carrying them farther away as it expands.
A common analogy is to imagine sticking some dots on the surface of a balloon. As you blow air into the balloon, it expands. Because the dots are stuck on the surface of the balloon, they get farther apart. Though they may appear to move, the dots actually stay exactly where you put them, and the distance between them gets bigger simply by virtue of the balloon’s expansion.
Now think of the dots as galaxies and the balloon as the fabric of the universe, and you begin to get the picture.
Unfortunately, while this analogy is a good start, it doesn’t get the details quite right either.
An artist’s representation of what the early expansion of the universe looked like. (Image credit: Johanwikipedia1028 via Wikimedia Commons)
The 4th dimension
Important to any analogy is an understanding of its limitations. Some flaws are obvious: A balloon is small enough to fit in your hand – not so the universe. Another flaw is more subtle. The balloon has two parts: its latex surface and its air-filled interior.
These two parts of the balloon are described differently in the language of mathematics. The balloon’s surface is two-dimensional. If you were walking around on it, you could move forward, backward, left, or right, but you couldn’t move up or down without leaving the surface.
Now it might sound like we’re naming four directions here – forward, backward, left and right – but those are just movements along two basic paths: side to side and front to back. That’s what makes the surface two-dimensional – length and width.
The inside of the balloon, on the other hand, is three-dimensional, so you’d be able to move freely in any direction, including up or down – length, width and height.
This is where the confusion lies. The thing we think of as the “center” of the balloon is a point somewhere in its interior, in the air-filled space beneath the surface.
But in this analogy, the universe is more like the latex surface of the balloon. The balloon’s air-filled interior has no counterpart in our universe, so we can’t use that part of the analogy – only the surface matters.
So asking, “Where’s the center of the universe?” is somewhat like asking, “Where’s the center of the balloon’s surface?” There simply isn’t one. You could travel along the surface of the balloon in any direction, for as long as you like, and you’d never once reach a place you could call its center because you’d never actually leave the surface.
In the same way, you could travel in any direction in the universe and would never find its center because, much like the surface of the balloon, it simply doesn’t have one.
The “hole in the universe” captured by the Atacama Large Millimeter/submillimeter Array (ALMA) shows a dark spot created by one of the most massive known galaxy clusters, RX J1347.5–1145. (Image credit: ALMA (ESO/NAOJ/NRAO)/T. Kitayama (Toho University, Japan)/ESA/Hubble & NASA)
Part of the reason this can be so challenging to comprehend is the way the universe is described in the language of mathematics. The surface of the balloon has two dimensions, and the balloon’s interior has three, but the universe exists in four dimensions – because it’s not just about how things move in space, but also how they move in time.
Our brains are wired to think about space and time separately. But in the universe, they’re interwoven into a single fabric, called “space-time.” That unification changes the way the universe works relative to what our intuition expects.
And this explanation doesn’t even begin to answer the question of how something can be expanding indefinitely – scientists are still trying to puzzle out what powers this expansion.
So in asking about the center of the universe, we’re confronting the limits of our intuition. The answer we find – everything, expanding everywhere, all at once – is a glimpse of just how strange and beautiful our universe is.
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Astronomers have found a fresh way to look for planets, and it starts by searching for binary stars – stars that come in pairs and keep their orbits tidily aligned.
A new study shows that when two sibling suns wheel around each other edge‑on as seen from Earth, any planets they host may be far easier to spot than usual.
“This could be an unprecedented avenue for examining how deterministic, or orderly, the process of planet formation is,” said Malena Rice of Yale University who led the work, which lays out a practical map for planet hunters.
Why binary stars hold promise
Most Sun‑like stars live with at least one stellar companion, forming what astronomers call binary stars.
When that duo circles in a flat plane that happens to face us, telescopes see the stars move directly toward and away from Earth. This presents an edge‑on orientation that magnifies every wobble caused by orbiting planets.
Earlier surveys of Kepler and TESS data found that planets in binaries whose stars are separated by less than about 74 billion miles (800 astronomical units) often share the same orbital plane as the twin suns, suggesting a natural alignment during birth.
That discovery hinted that the companion star might act like a gyroscope, steadying the protoplanetary disk and locking everything into one orderly sheet, instead of a random tilt.
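The 800 AU separation quoted above converts to miles directly, since one astronomical unit (the mean Earth–Sun distance) is about 93 million miles. A quick check of the figure:

```python
AU_IN_MILES = 92_955_807.3  # one astronomical unit in miles

separation_au = 800
separation_miles = separation_au * AU_IN_MILES
print(f"{separation_miles:.2e} miles")  # ~7.44e10, i.e. about 74 billion miles
```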
Turning alignment into a search map
Rice’s team mined the Gaia DR3 catalog, filtering 20 million entries down to nearly 600 bright, nearby binaries whose motion angles scream “edge‑on.”
Because the European Space Agency (ESA) satellite records minute changes in position and proper motion, its data let the group calculate the orbital tilt of each pair with degree‑level precision.
For every qualified system the researchers ran computer simulations, populating each star with thousands of hypothetical planets that follow the size and period statistics measured around single stars.
They then asked how many of those worlds could be recovered with today’s detectors. An aligned orientation boosts both radial velocity signals and the chance of a planet passing in front of its star, the transit method that astronomers use to see dips in light.
Building the 591 star shortlist
The final catalog listed 591 binaries, all brighter than magnitude 14 and separated by less than two arcseconds on the sky.
That narrow spacing matters because most high‑precision spectrographs collect starlight through fibers that are about one arcsecond wide, so the companion’s glare stays outside the slit, keeping the measurements clean.
Nearly 90 percent of the stars identified fall into the FGK temperature class, meaning they are close cousins of the Sun and rotate slowly enough for stable spectroscopy.
Removing some hotter, broad‑lined stars leaves 940 individual suns that are suitable for velocity work. About two thirds of those show low magnetic jitter, which is an extra help for teasing out planet signals.
What binary star hunters may find
At a precision threshold of 1 meter per second, simulations predict that 74 percent of the target stars should reveal at least one planet within 3 years of monitoring.
Even when the detection bar is raised to 10 meters per second, 1 percent of the stars still host worlds whose tugs are big enough to see from Earth.
Transits are rarer but still rewarding. With a typical 200 parts‑per‑million dip and a 3‑hour crossing time, roughly 1 in 100 modeled planets could be tracked by a 1‑meter class ground telescope.
A handful of binaries are likely to show 2 separate planetary systems eclipsing in the same field of view.
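A transit depth expressed in parts per million fixes the planet-to-star radius ratio, since depth ≈ (R_planet / R_star)². Assuming a Sun-sized star (my assumption; the study models a range of stars), the 200 ppm dip quoted above corresponds to a planet of roughly 1.5 Earth radii:

```python
import math

SUN_RADIUS_KM = 695_700.0
EARTH_RADIUS_KM = 6_371.0

def planet_radius_from_depth(depth_ppm, star_radius_km=SUN_RADIUS_KM):
    """Planet radius (km) implied by a transit depth given in parts per million."""
    ratio = math.sqrt(depth_ppm * 1e-6)  # R_planet / R_star
    return ratio * star_radius_km

r = planet_radius_from_depth(200)
print(f"{r:.0f} km, about {r / EARTH_RADIUS_KM:.1f} Earth radii")
```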
“We outline how this could, for the first time, be used to conduct comparative studies of planet formation where we have a control sample,” said Rice.
Having two planets born side by side around different stars lets astronomers test whether chemistry, mass, or disk turbulence drives the final architecture.
Next steps for binary star mapping
Because the catalog covers the whole sky, observers in both hemispheres can assign targets to unused nights or piggyback on existing exoplanet surveys.
The list also offers prime candidates for the upcoming Thirty Meter Telescope and ESO’s Extremely Large Telescope, whose adaptive optics imaging could pick out wide giants that are missed by current techniques.
Follow‑up teams plan to measure stellar rotation periods and projected spin speeds, a trick that can confirm whether the stars themselves tilt the same way as their orbit.
If both stellar equators lie edge‑on, the case strengthens that any detected planets kept their original alignment and avoided later gravitational chaos.
Gaia’s next data release may even show the subtle side‑to‑side wobble of massive outer planets directly, providing a mass estimate that pairs neatly with radial velocity curves.
Meanwhile, citizen‑science projects like the Eclipsing Binary Patrol keep flagging new, edge‑on pairs, thus feeding the pipeline with fresh targets every year.
The edge‑on binary approach will not catch every type of planet, especially those in wildly tilted orbits or circling lone stars.
Yet by focusing on where nature already lines up the cue ball, astronomers can rack up discoveries faster and, for the first time, compare sister worlds that were born in the same stellar nursery.
The study is published in The Astrophysical Journal Letters.
Humans have built so many dams around the world that the Earth’s poles have wandered away from the planet’s rotational axis, new research suggests.
Over the last 200 years, humans have constructed nearly 7,000 massive dams, impounding enough water to nudge the Earth’s poles by about three feet (one meter) and cause a 0.83-inch (21-millimeter) drop in global sea levels, according to a new study in Geophysical Research Letters.
This drift is possible because Earth’s solid crust forms a hard shell around the hot, slowly flowing rock of the mantle. This means that whenever a significant amount of mass is redistributed across the planet’s surface, the outermost rock layer wobbles, shifting relative to Earth’s interior. When this happens, different areas on the Earth’s surface end up directly over the planet’s rotational axis. As a result, the planet’s poles pass through different surface locations than before, a phenomenon known as true polar wander.
“As we trap water behind dams, not only does it remove water from the oceans—thus leading to a global sea level fall—it also redistributes mass around the world,” Natasha Valencic, a graduate student at Harvard University and lead author of the new study, said in a statement.
In the study, Valencic and her team analyzed a previously published global database of dams to figure out their locations, the volume of water they store, and how that stored water has impacted Earth’s mass distribution. Previously, the database revealed that 6,862 large dams built between 1835 and 2011 contributed to a decrease in sea levels. Collectively, these dams hold enough water to fill the Grand Canyon twice.
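Those two figures – a 21-millimeter sea-level drop and “two Grand Canyons” of stored water – can be cross-checked with back-of-the-envelope arithmetic. The ocean area and canyon volume below are commonly cited approximations of my own choosing, not values from the study:

```python
# Back-of-the-envelope cross-check: does a 21 mm global sea-level drop
# match roughly "two Grand Canyons" of impounded water?

OCEAN_AREA_M2 = 3.6e14     # approximate global ocean surface area
SEA_LEVEL_DROP_M = 0.021   # 21 mm, as reported
GRAND_CANYON_M3 = 4.17e12  # commonly cited estimate of the canyon's volume

impounded_m3 = OCEAN_AREA_M2 * SEA_LEVEL_DROP_M
print(f"{impounded_m3:.1e} m^3, about {impounded_m3 / GRAND_CANYON_M3:.1f} Grand Canyons")
```

The estimate comes out near two Grand Canyons, so the reported figures are mutually consistent.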
The results showed that global dam-building has caused Earth’s poles to shift in two phases. The first phase, from 1835 to 1954, coincided with a boom in dam construction in North America and Europe. These areas shifted toward the equator, and as a result, the North Pole moved about 8 inches (25 centimeters) toward the 103rd meridian east, a line that passes through Russia, Mongolia, and China.
During the second wave of dam construction, between 1954 and 2011, most dams were built in Asia and East Africa. As a result, the North Pole shifted 22 inches (57 centimeters) toward the 117th meridian west, which passes through western South America and the South Pacific. Polar wander is not linear; instead, it follows a wobbly path, which is why the total shift does not add up precisely to 3.7 feet.
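One reason the phase magnitudes don’t simply add is that the two shifts point in nearly opposite directions: polar wander is a displacement with direction, not a running total. Treating each reported shift as a straight-line vector on a map centered on the pole (an illustration of the geometry, not the study’s calculation):

```python
import math

def shift_vector(magnitude_cm, longitude_deg_east):
    """A pole shift as an (x, y) vector on a map centered on the pole."""
    rad = math.radians(longitude_deg_east)
    return (magnitude_cm * math.cos(rad), magnitude_cm * math.sin(rad))

phase1 = shift_vector(25, 103)        # 1835-1954: toward the 103rd meridian east
phase2 = shift_vector(57, 360 - 117)  # 1954-2011: toward the 117th meridian west
net = (phase1[0] + phase2[0], phase1[1] + phase2[1])
print(f"net straight-line displacement: {math.hypot(*net):.0f} cm")
```

The net straight-line displacement (about 41 cm here) is much smaller than the 82 cm obtained by adding the magnitudes, which is why the pole’s total wobbly path and its net shift are different numbers.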
While the results are relatively subtle, they highlight the need for researchers to account for water stored in dams when predicting future sea level rise. In the 20th century, global sea levels rose 4.7 to 6.7 inches (12 to 17 centimeters), but humans trapped about a quarter of that volume behind dams, according to Valencic.
“Depending on where you place dams and reservoirs, the geometry of sea level rise will change,” she said. “That’s another factor we need to consider, because these changes can be quite large and significant.”
Cannabis use may leave lasting fingerprints on the human body, a study of over 1,000 adults suggests – not in our DNA code itself, but in how that code is expressed.
US researchers found it may cause changes in the epigenome, which acts like a set of switches that activate or deactivate genes involved in how our bodies function.
“We observed associations between cumulative marijuana use and multiple epigenetic markers across time,” explained epidemiologist Lifang Hou from Northwestern University when the research was published in 2023.
Cannabis is a commonly used substance in the US, with nearly half of Americans having tried it at least once, Hou and team report in their published paper.
Some US states and other countries have made cannabis use legal, but we still don’t fully understand its effects on our health.
To investigate this, the researchers analyzed data from a long-running health study that had tracked around 1,000 adults over two decades.
Participants, who were between 18 and 30 years old when the study began, were surveyed about their cannabis use over the years and gave blood samples at the 15- and 20-year marks.
Using these blood samples from five years apart, Hou and her team looked at the epigenetic changes, specifically DNA methylation levels, of people who had used cannabis recently or for a long time.
The addition or removal of methyl groups from DNA is one of the most studied epigenetic modifications.
When epigenetic factors, which can come from other genes or the environment inside a cell or beyond, recruit a methyl group, it changes the expression of our genes. (ttsz/iStock/Getty Images)
Without changing the genomic sequence, DNA methylation affects how easily cells ‘read’ and interpret genes, much like someone covering up key lines in your set of instructions.
Environmental and lifestyle factors can trigger these methylation changes, which can be passed to future generations, and blood biomarkers can provide information about both recent and historical exposures.
“We previously identified associations between marijuana use and the aging process as captured through DNA methylation,” Hou said.
“We wanted to further explore whether specific epigenetic factors were associated with marijuana and whether these factors are related to health outcomes.”
Environmental and lifestyle factors can trigger methylation changes, which can be passed to future generations. (Monkey Business Images/Canva)
The comprehensive data on the participants’ cannabis use allowed the researchers to estimate cumulative use over time as well as recent use and compare it with DNA methylation markers in their blood for analysis.
They found numerous DNA methylation markers in the 15-year blood samples: 22 associated with recent use and 31 with cumulative cannabis use.
In the samples taken at the 20-year point, they identified 132 markers linked to recent use and 16 linked to cumulative use.
The buds of a cannabis plant contain the most cannabinoids. (Esteban López/Unsplash)
“Interestingly, we consistently identified one marker that has previously been associated with tobacco use,” Hou explained, “suggesting a potential shared epigenetic regulation between tobacco and marijuana use.”
Multiple epigenetic changes associated with cannabis use had previously been linked to things like cellular proliferation, hormone signaling, infections, neurological disorders like schizophrenia and bipolar disorder, and substance use disorders.
It’s important to note that this study doesn’t prove that cannabis directly causes these changes or causes health problems.
“This research has provided novel insights into the association between marijuana use and epigenetic factors,” said epidemiologist Drew Nannini from Northwestern University.
“Additional studies are needed to determine whether these associations are consistently observed in different populations. Moreover, studies examining the effect of marijuana on age-related health outcomes may provide further insight into the long-term effect of marijuana on health.”
The study has been published in Molecular Psychiatry.
An earlier version of this article was published in July 2023.
In brilliant new images, the James Webb Space Telescope has captured a rare glimpse at the gaseous “shrouds” that surround dying stars before they go supernova.
Known as Wolf-Rayet stars – discovered nearly 160 years ago by astronomers Charles Wolf and Georges Rayet at the Paris Observatory and named in their honor – these ancient stars are, as Space.com notes, surrounded by a “shroud” of cosmic dust that will eventually explode outward and lay the foundations for new stars.
These aged stars, as Space.com explains, have burned off most of their hydrogen. According to Noel Richardson, the leader of the team at Florida’s Embry-Riddle Aeronautical University that found four new Wolf-Rayet systems, the hydrogen burn-off signals that the stars are dying. As they do, powerful winds that pump out of the star system roughly every eight years create the concentric rings that make up these ghastly “shrouds.”
While these star systems’ existence has been known for the better part of two centuries, their dusty veils have only been observed once before, when the Webb telescope caught similar imagery around WR 140, an aging binary star system located about 5,000 light-years from Earth in the constellation Cygnus.
As explained in an Embry-Riddle statement, this discovery from Richardson and his students not only affirms that other Wolf-Rayet stars form those beautiful, dusty shrouds in the harsh void of space, but also could contribute to our understanding of the stellar life cycle.
Astronomer Ryan Lau — who works at the National Science Foundation’s NOIRLab in Tucson and helped Richardson’s team with the new Wolf-Rayet images — said in the university’s statement that he’s looking forward to seeing what else these strange shells can teach us.
“Where does this dust go?” Lau posited. “We want to learn what exactly the chemistry of this dust is. To do that, we need to take spectra to identify specific grain composition — the physical properties — to get an idea of the chemical contribution to the interstellar medium.”
To capture such poignant and awe-inspiring moments in the lives of star systems is already an incredible feat — and with the Webb telescope’s sophisticated equipment, there will likely be more where that came from.
Behold! The 2025 ZWO Astronomy Photographer of the Year Awards shortlist has been released, showcasing a spectacular array of astrophotography images ranging from solar prominences and auroras to distant galaxies and beguiling nebulas.
For the past 17 years, the Royal Observatory Greenwich — supported by astronomy camera maker ZWO — has called on the global photography community to compete in an open competition celebrating the majesty and variety of our night sky.
The 2025 competition saw photographers from 69 countries submit over 5,500 entries to compete in a plethora of diverse categories to gain recognition and, naturally, prize money. The overall winner of the ZWO Astronomy Photographer of the Year will bag a £10,000 (about $13,560 U.S.) grand prize, while the photographers who come out on top in each individual category receive £1,500 (about $2,030 U.S.) for their valiant efforts.
“At ZWO, we believe that astrophotography is not only a way to record the cosmos, but also a way to inspire curiosity, foster education and build communities that transcend borders,” said ZWO founder Sam Wen in a press release revealing the shortlisted images. “Everyone deserves a chance to connect with the universe — and through our support, we hope to bring that experience to more people.”
The winners of each category will be announced in an awards ceremony in September later this year, with the victorious entries — and select runners-up — later being exhibited at the National Maritime Museum in London.
Read on to see the spectacular images shortlisted for the 2025 ZWO Astronomy Photographer of the Year awards!
Auroras
This image of a swirling green aurora was captured from a remote location on the Senja Peninsula in northern Norway by Filip Brebenda on Sept. 12, 2024. Silvery birch trees dominate the foreground, while a rocky outcrop frames the aurora dancing through the sky above, which is reflected in a placid pool of water between the trunks.
Photographer Daniel Zafra was able to capture a rare occurrence of a magenta and green aurora reflecting off the waters of California’s Mono Lake in October 2024, alongside protruding rocky formations.
Vincent Beudez imaged a breathtaking auroral display reminiscent of an arctic flower unfolding in the skies over Tromsø in northern Norway on April 4, 2024, framed by the snowy peaks of nearby mountains. He used a Sony Alpha 7S III camera to get the shot.
An image of the Triangulum Galaxy (M33), as captured by astrophotographers Bence Tóth, Péter Feltóti and Bertalan Kecskés from Hungary over several sessions in November and December 2024. The galaxy can be seen undergoing a burst of star formation thanks to the tidal influence of a galactic neighbor, with the glowing red form of an emission nebula visible throughout, giving the impression of a cosmic firework display.
The Andromeda Galaxy (M31) can be seen shining with the light of countless stars and nebulas in this image taken from the Tibetan Autonomous Prefecture in Sichuan, China in late 2024. It took around 216 hours to capture the ancient light used in the creation of the image, which shows the bright central bulge and spiral arms of the Milky Way’s closest galactic neighbor in phenomenal detail.
The barred spiral galaxy NGC 2997 — also known as the Antlia Cabbage Galaxy — is pictured surrounded by glowing red cosmic clouds in this shot by Xinran Li taken from Río Hurtado, Chile in January and February earlier this year. The galaxy exists at a distance of 35 million light-years from Earth in the constellation Antlia and took around 10 hours of observing time to capture using a range of filters.
A distorted moon can be seen rising over the French château of Villebois-Lavalette in this shot by Flavien Beauvais. The shot was taken during the full moon phase in November 2024 using a Canon EOS R7 camera in conjunction with a Sigma 150-600 mm lens.
This composite image captured by Chayaphon Phanitloet from the Nakhon Ratchasima region of Thailand depicts a period in October 2024 when the moon slid in front of Saturn — visible to the left of the image — blocking its light.
Photographer Karthik Easvur created this portrait of a supermoon looming large in the sky over Delhi, India, in November 2024, by stitching together 24 separate images into a seamless mosaic. The so-called “Beaver Moon” was captured using a 6-inch aperture telescope in conjunction with a ZWO camera along with several more helpful peripherals and filters.
Zhang Yanguang was able to capture this perfectly timed composite shot of the International Space Station (ISS) sweeping across the face of the sun from the Fujian region of China on Jan. 24 of this year. The photographer deftly maintained the crisp profile of the space station when combining the images during post-processing, while revealing phenomenal detail on the surface of our parent star.
This image of a 311,000-mile-long (500,000 kilometers) solar prominence erupting from the sun was captured on Nov. 7, 2024 from Guangdong province in China by astrophotographer PengFei Chou. The image is constructed from 20 stacked data sets captured over the course of the hour-long eruption.
An artistic view of the sun captured by photographer Damien Cannane, depicting the different phases of a solar eclipse. The arcs in between the eclipsed suns represent a phenomenon known as “Baily’s Beads,” which arise as sunlight shines through valleys on the lunar surface in the moments before and after totality.
Tianyao Yang took this picture of the July 2024 full moon behind skyscrapers in the Lujiazui district of Shanghai. The shot was the culmination of five years of planning and was taken from a distance of 16.5 miles (26.5 km) using a long lens, allowing the photographer to give the moon an outsized appearance compared to the foreground buildings.
This picture, taken from Songyang County, China by photographer Yujie Zhang in August 2024, shows the bright ribbon of the Milky Way tumbling toward a collection of geometric buildings reflected in a foreground body of water.
This placid scene taken with a Canon R6 Mark II camera by photographer Paul Joels captures the Milky Way in the sky over a boathouse in Lulworth Cove in the U.K. A multitude of stars can be seen shining down on the seaside vista, twinkling above a lone boat resting on the roadside.
Comet C/2023 A3 (Tsuchinshan-ATLAS) is seen streaking through the sky above Honolulu, Hawaii, as captured from the Pu’u O Kaimukī Park by photographer Ran Shen on Oct. 12, 2024.
Chester Hall-Fernandez captured this view of the Milky Way setting parallel to the horizon over the Mount John Observatory in New Zealand on July 21, 2024. The MOA-II telescope — the largest telescope on New Zealand’s South Island — can be seen to the right of the image, observing the countless stars populating the southern hemisphere night sky.
An image of Comet Tsuchinshan-ATLAS captured from Namibia in southern Africa by photographers Gerald Rhemann and Michael Jäger in September last year, featuring dust (grey) and ion (blue) tails. The “kinks” in the more tenuous ion trail are created as the solar wind pouring from our parent star impacts the particle trail shed by the wandering comet.
This family portrait of solar system planets — excluding Earth for obvious reasons — was captured with the aid of a 20-inch Dobsonian telescope in Bavaria, Germany, between September 2023 and December 2024 by astrophotographer Sophie Paulin. The planets — Mercury, Venus, Mars, Jupiter, Saturn, Uranus and Neptune — can be seen parading in a line from left to right in the composite piece.
Benjamin Barakat captured this image of a lone tree standing in front of star trails from the Hidaybu district of Yemen on March 13, 2024 using a Sony Alpha 7 IV camera.
A full moon is pictured rising over the Dolomite mountains in Italy by photographer Fabian Dalpiaz in November 2024, as the last of the sunlight catches the upper slopes on a cloudless evening.
This composite shot of the Milky Way was captured by Yoshiki Abe from the mouth of a remote cave in the coastal region of Yamaguchi, Japan on Oct. 12, 2024. The foreground image was snapped during a brief window known to photographers as the “blue hour,” which occurs around the time that the sun sets, infusing the environment with a blueish hue. Abe captured his image of the Milky Way later that same night.
A 23,000 pixel-wide panorama of the Utah Desert, imaged at night by astrophotographer Jim Hildreth with the Milky Way arcing high overhead amongst a sea of stars.
Photographer Andreas Karaolis captured this panorama of the Milky Way’s Cygnus region streaking over a verdant hillside in Cyprus in October 2024. Karaolis also made use of the blue hour to capture the foreground image, snapping a series of 30- and 120-second exposures to capture it and the cosmic scene above, before combining them in the post-processing step.
This composite view of the Christmas Tree Nebula and Rosetta Nebula was captured from the Deep Sky Chile Observatory in November and December last year. The colorful view is the result of 150 hours of observation, during which the ancient light of the nebulas was collected using a wide range of filters.
A portrait of the “Running Chicken Nebula” (IC 2944) — an enormous stellar nursery located in the constellation Centaurus — captured by astrophotographer Rod Prazeres from Queensland, Australia over the course of several nights in March and April 2024.
Shaoyu Zhang took this electric view of the “Spaghetti Nebula” (Simeis 147) from Chile and Sichuan, China between December 2024 and February 2025. Over 148 hours of exposure time were used to obtain a “full-spectrum” image of the vast supernova remnant to reveal structures ordinarily hidden behind a veil of cosmic dust.
An image of the Abell 85 supernova remnant captured in the skies above China by Deqian Li. Li used 23.4 hours of light data to create the image, which was captured over the course of a six-day camping trip in Hongyuan county, China with a Takahashi Epsilon-160ED telescope paired with a ZWO astronomy camera.
A 22-megapixel panorama showing the different stages of a total solar eclipse captured during the April 8, 2024 event by photographer Louis Egan from Quebec, Canada. The final piece was created using around 200 individual images.
Peter Ward’s “neon sun” effect was created using ultraviolet data from NASA’s Solar Dynamics Observatory, which was remapped to colors visible to the naked eye, and turned “inside out” to surround the sun.
Editor’s Note: If you would like to share your astrophotography with Space.com’s readers, then please send your photo(s), comments, and your name and location to spacephotos@space.com.
NASA has officially chosen three new scientific instruments to study the moon, specifically its south polar region, as part of the upcoming Artemis mission. Two of these instruments will be mounted on a new Lunar Terrain Vehicle (LTV), and one will fly on a future moon-orbiting satellite.
The LTV, or rover, is expected to resemble a high-tech sports utility vehicle (SUV). It will carry two astronauts across the lunar surface, but it can also be driven remotely when no one is aboard. This mission will mark the first time a rover has been on the moon in over 50 years.
Three private companies are building rover designs, including Texas-based Intuitive Machines, Lunar Outpost from Colorado, and California-headquartered Venturi Astrolab. NASA will choose one for a demonstration mission by late 2025. NASA’s Artemis program aims to send humans back to the moon for the first time since the Apollo missions.
NASA’s new moon toys
As for the instruments in question, the first is called the Artemis Infrared Reflectance and Emission Spectrometer (AIRES). This will be mounted directly on the LTV and will be used to detect minerals and volatiles (like water or carbon dioxide) by analyzing how sunlight reflects off the moon’s surface. According to NASA, AIRES will also create detailed maps showing what materials are present, especially around the lunar south pole.
The second instrument, Lunar Microwave Active-Passive Spectrometer (L-MAPS), will also be mounted on the rover. This will use ground-penetrating radar and temperature sensors to scan up to 40 meters underground. It will help locate buried ice and gain a deeper understanding of the moon’s subsurface structure.
The third and final piece of kit, the Ultra-Compact Imaging Spectrometer for the Moon (UCIS-Moon), will be mounted on a future moon-orbiting satellite, not on the rover. This device will capture high-resolution images and scans of surface water and minerals, and assess how human activity (such as landings) may be affecting the moon. It will also help guide astronauts to areas rich in resources or scientific value.
Together, these instruments will help map resources for future missions. They will also support astronaut safety and planning by characterizing the terrain and environment. The tools will further contribute to science by revealing how the moon evolved and what it tells us about other rocky planets.
Giant leap for mankind
Overall, the move marks a critical step in NASA’s effort to build infrastructure on and around the moon to support long-term exploration. The development is part of a broader effort to return humans to the moon, explore more deeply than ever before, and eventually prepare for missions to Mars.
“The Artemis Lunar Terrain Vehicle will transport humanity farther than ever before across the lunar frontier on an epic journey of scientific exploration and discovery,” said Nicky Fox, associate administrator, Science Mission Directorate at NASA Headquarters in Washington.
“By combining the best of human and robotic exploration, the science instruments selected for the LTV will make discoveries that inform us about Earth’s nearest neighbor as well as benefit the health and safety of our astronauts and spacecraft on the Moon,” she added.
“Together, these three scientific instruments will make significant progress in answering key questions about what minerals and volatiles are present on and under the surface of the Moon,” said Joel Kearns, deputy associate administrator for exploration, Science Mission Directorate at NASA Headquarters.
“With these instruments riding on the LTV and in orbit, we will be able to characterize the surface not only where astronauts explore, but also across the south polar region of the Moon, offering exciting opportunities for scientific discovery and exploration for years to come,” Kearns stated.
“Very early in Mars’ history, maybe 4 billion years ago, the planet was warm enough to support lakes and river networks,” Kite told Ars. “There were seas, and some of those seas were as big as the Caspian Sea, maybe bigger. It was a wet place.” This wet period, though, didn’t last long—it was too short to make the landscape deeply weathered and deeply eroded.
Kite’s team used their model to focus on what happened as the planet got colder, when the era of salts started. “Big areas of snowmelt created huge salt flats, which eventually built up over time, accumulating into a thick sedimentary deposit the Curiosity rover is currently exploring,” Kite said. But the era of salts did not mark the end of liquid water on the Martian surface.
Flickering habitability
The landscape turned arid, by Earth’s standards, roughly 3.5 billion years ago. “There were long periods when the planet was entirely dry,” Kite said. During these dry periods, Mars was almost as cold as it is today. But once in a while, small areas with liquid water appeared on the Martian surface like oases amidst an otherwise unwelcoming desert. It was a sterile planet with flickering, transient habitable spots fed by melted snow.
This rather bleak picture of the Martian landscape’s evolution makes questions about our chances of finding traces of life there tricky.
“You can do a thought experiment where you take a cup of water from the Earth’s ocean and pour it into one of those transient lakes on Mars,” Kite said. “Some microbes in this cup of water would do fine in such conditions.” The bigger question, he thinks, is whether life could originate (rather than just survive) on ancient Mars. And, perhaps more critically, whether hypothetical life that originated even before the salts era, when the planet was warm and wet, could persist in the oases popping up in Kite’s model.