SYDNEY, Sept. 3 (Xinhua) — Rare rocks buried deep beneath central Australia reveal the origins of a promising deposit of niobium, a critical metal essential for clean energy and advanced steelmaking.
The study found the newly discovered niobium-rich carbonatites were emplaced more than 800 million years ago, rising from deep within the Earth through pre-existing fault zones during a tectonic rifting event that ultimately tore apart the supercontinent Rodinia, said a statement released Wednesday by Australia’s Curtin University.
These carbonatites contain significant concentrations of niobium, a strategic metal used to make lighter, stronger steel for aircraft, pipelines and electric vehicles, and a key component in some next-generation battery and superconducting technologies, according to the research published in the British journal Geological Magazine.
The findings reveal how rare, metal-rich magmas reach the surface, and why this particular deposit is so interesting, said the study’s lead author Maximilian Drollner from the Timescales of Mineral Systems Group within Curtin’s Frontier Institute for Geoscience Solutions and Germany’s University of Göttingen.
Using multiple isotope-dating techniques on drill core samples, the team found that these carbonatites were emplaced between 830 and 820 million years ago, during a period of continental rifting that preceded the breakup of Rodinia.
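The article does not name the specific isotopic systems used, but ages of this kind rest on the standard radiometric decay relation, where λ is the decay constant of the parent isotope and N_D/N_P is the ratio of daughter atoms produced to parent atoms remaining:

\[
t = \frac{1}{\lambda}\,\ln\!\left(1 + \frac{N_D}{N_P}\right)
\]

Cross-checking several such chronometers on the same drill core samples is a common way to tighten an emplacement age to a narrow window like 830 to 820 million years.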
“This tectonic setting allowed carbonatite magma to rise through fault zones that had remained open and active for hundreds of millions of years, delivering metal-rich melts from deep in the mantle up into the crust,” Drollner said. ■
Climate models are complex, just like the world they mirror. They simultaneously simulate the interacting, chaotic flow of Earth’s atmosphere and oceans, and they run on the world’s largest supercomputers.
Critiques of climate science, such as the report written for the Department of Energy by a panel in 2025, often point to this complexity to argue that these models are too uncertain to help us understand present-day warming or tell us anything useful about the future.
But the history of climate science tells a different story.
The earliest climate models made specific forecasts about global warming decades before those forecasts could be proved or disproved. And when the observations came in, the models were right. The forecasts weren’t just predictions of global average warming – they also predicted geographical patterns of warming that we see today.
Syukuro Manabe was awarded the Nobel Prize in physics in 2021. Johan Nilsson/TT News Agency/AFP
These early predictions, starting in the 1960s, came largely from a single, somewhat obscure government laboratory outside Princeton, New Jersey: the Geophysical Fluid Dynamics Laboratory. And many of the discoveries bear the fingerprints of one particularly prescient and persistent climate modeler, Syukuro Manabe, who was awarded the 2021 Nobel Prize in physics for his work.
Manabe’s models, grounded in the physics of the atmosphere and ocean, forecast the world we now see while also drawing a blueprint for today’s climate models and their ability to simulate our large-scale climate. While models have limitations, it is this track record of success that gives us confidence in interpreting the changes we’re seeing now, as well as predicting changes to come.
Forecast No. 1: Global warming from CO2
Manabe’s first assignment in the 1960s at the U.S. Weather Bureau, in a lab that would become the Geophysical Fluid Dynamics Laboratory, was to accurately model the greenhouse effect – to show how greenhouse gases trap radiant heat in Earth’s atmosphere. Since the oceans would freeze over without the greenhouse effect, this was a key first step in building any kind of credible climate model.
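For intuition only, a textbook one-layer “gray atmosphere” energy balance, which is far simpler than Manabe’s radiative-convective scheme and not taken from his papers, shows why an absorbing layer matters: treating the atmosphere as a single layer that is transparent to sunlight but absorbs a fraction of the surface’s outgoing radiant heat is enough to lift the surface temperature from below freezing to roughly Earth’s observed average.

```python
# Textbook one-layer greenhouse sketch, for intuition only; this is not Manabe's
# radiative-convective model. The atmosphere is a single layer, transparent to
# sunlight, that absorbs a fraction eps of the surface's outgoing radiant heat
# and re-emits half of it back downward.
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0       # solar constant, W m^-2
ALBEDO = 0.3      # fraction of incoming sunlight reflected back to space

def surface_temp(eps):
    """Equilibrium surface temperature (K) for a longwave absorptivity eps in [0, 1]."""
    absorbed_sunlight = S0 * (1 - ALBEDO) / 4.0  # global-average absorbed sunlight
    return (absorbed_sunlight / (SIGMA * (1 - eps / 2))) ** 0.25

print(f"No greenhouse effect (eps = 0.00): {surface_temp(0.0):.0f} K")   # ~255 K, below freezing
print(f"With absorbing layer (eps = 0.78): {surface_temp(0.78):.0f} K")  # ~288 K, near Earth's average
```

With no absorbing layer the surface sits near 255 K, about 0 degrees Fahrenheit and cold enough to freeze the oceans; a single absorbing layer raises it to roughly the observed 288 K.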
To test his calculations, Manabe created a very simple climate model. It represented the global atmosphere as a single column of air and included key components of climate, such as incoming sunlight, convection from thunderstorms, and his greenhouse effect model.
Results from Manabe’s 1967 single-column global warming simulations show that as carbon dioxide (CO2) increases, the surface and lower atmosphere warm, while the stratosphere cools. Syukuro Manabe and Richard Wetherald, 1967
Despite its simplicity, the model reproduced Earth’s overall climate quite well. Moreover, it showed that doubling carbon dioxide concentrations in the atmosphere would cause the planet to warm by about 5.4 degrees Fahrenheit (3 degrees Celsius).
This estimate of Earth’s climate sensitivity, published in 1967, has remained essentially unchanged in the many decades since and captures the overall magnitude of observed global warming. Right now the world is about halfway to doubling atmospheric carbon dioxide, and the global temperature has warmed by about 2.2 F (1.2 C) – right in the ballpark of what Manabe predicted.
Other greenhouse gases such as methane, as well as the ocean’s delayed response to global warming, also affect temperature rise, but the overall conclusion is unchanged: Manabe got Earth’s climate sensitivity about right.
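As a rough back-of-the-envelope check, using round concentration values assumed here for illustration rather than taken from the article, the “halfway to doubling” figure follows from the fact that CO2’s warming effect grows with the logarithm of its concentration:

```python
# Back-of-the-envelope check of the "halfway to doubling" claim. The CO2 values
# are round numbers assumed for illustration (~280 ppm preindustrial, ~420 ppm
# today), not figures taken from the article.
import math

C_PREINDUSTRIAL = 280.0   # ppm
C_TODAY = 420.0           # ppm
SENSITIVITY = 3.0         # Manabe's estimate: deg C of warming per CO2 doubling

# Radiative forcing from CO2 scales with the logarithm of concentration, so the
# fraction of a full doubling realized so far is log2(C_today / C_preindustrial).
fraction_of_doubling = math.log2(C_TODAY / C_PREINDUSTRIAL)
equilibrium_warming = SENSITIVITY * fraction_of_doubling

print(f"Fraction of a CO2 doubling so far: {fraction_of_doubling:.2f}")    # ~0.58
print(f"Implied equilibrium warming:       {equilibrium_warming:.1f} C")   # ~1.8 C
```

The implied equilibrium warming of roughly 1.8 C sits above the observed 1.2 C, which is what you would expect while the oceans are still catching up, as noted above.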
Forecast No. 2: Stratospheric cooling
The surface and lower atmosphere in Manabe’s single-column model warmed as carbon dioxide concentrations rose, but in what was a surprise at the time, the model’s stratosphere actually cooled.
Temperatures in this upper region of the atmosphere, between roughly 7.5 and 31 miles (12 and 50 km) in altitude, are governed by a delicate balance between the absorption of ultraviolet sunlight by ozone and release of radiant heat by carbon dioxide. Increase the carbon dioxide, and the atmosphere traps more radiant heat near the surface but actually releases more radiant heat from the stratosphere, causing it to cool.
IPCC 6th Assessment Report
This cooling of the stratosphere has been detected over decades of satellite measurements and is a distinctive fingerprint of carbon dioxide-driven warming, as warming from other causes such as changes in sunlight or El Niño cycles do not yield stratospheric cooling.
Forecast No. 3: Arctic amplification
Manabe used his single-column model as the basis for a prototype quasi-global model, which simulated only a fraction of the globe. It also simulated only the upper 100 meters or so of the ocean and neglected the effects of ocean currents.
In 1975, Manabe published global warming simulations with this quasi-global model and again found stratospheric cooling. But he also made a new discovery – that the Arctic warms significantly more than the rest of the globe, by a factor of two to three times.
Map from IPCC 6th Assessment Report
This “Arctic amplification” turns out to be a robust feature of global warming, occurring in present-day observations and subsequent simulations. A warming Arctic furthermore means a decline in Arctic sea ice, which has become one of the most visible and dramatic indicators of a changing climate.
Forecast No. 4: Land-ocean contrast
In the early 1970s, Manabe was also working to couple his atmospheric model to a first-of-its-kind dynamical model of the full world ocean built by oceanographer Kirk Bryan.
Around 1990, Manabe and Bryan used this coupled atmosphere-ocean model to simulate global warming over realistic continental geography, including the effects of the full ocean circulation. This led to a slew of insights, including the observation that land generally warms more than ocean, by a factor of about 1.5.
As with Arctic amplification, this land-ocean contrast can be seen in observed warming. It can also be explained from basic scientific principles and is roughly analogous to the way a dry surface, such as pavement, warms more than a moist surface, such as soil, on a hot, sunny day.
The contrast has consequences for land-dwellers like ourselves, as every degree of global warming will be amplified over land.
Forecast No. 5: Delayed Southern Ocean warming
Perhaps the biggest surprise from Manabe’s models came from a region most of us rarely think about: the Southern Ocean.
This vast, remote body of water encircles Antarctica and has strong eastward winds whipping across it unimpeded, due to the absence of land masses in the southern midlatitudes. These winds continually draw up deep ocean waters to the surface.
Winds around Antarctica contribute to upwelling of cold deep water that keeps the Southern Ocean cool while also raising nutrients to the surface waters. NOAA
Manabe and colleagues found that the Southern Ocean warmed very slowly when atmospheric carbon dioxide concentrations increased because the surface waters were continually being replenished by these upwelling abyssal waters, which hadn’t yet warmed.
This delayed Southern Ocean warming is also visible in the temperature observations.
What does all this add up to?
Looking back on Manabe’s work more than half a century later, it’s clear that even early climate models captured the broad strokes of global warming.
Manabe’s models simulated these patterns decades before they were observed: Arctic amplification was simulated in 1975 but only observed with confidence in 2009, while stratospheric cooling was simulated in 1967 but definitively observed only recently.
Climate models have their limitations, of course. For instance, they cannot predict regional climate change as well as people would like. But the fact that climate science, like any field, has significant unknowns should not blind us to what we do know.
Ice does not usually show up in conversations about electricity. A new study reports that ordinary frozen water generates electric charge when it bends, and the measured response is on the same order as that of benchmark electroceramics such as titanium dioxide and strontium titanate.
The research also links this behavior to how storms build up charge, offering a fresh way to think about why lightning starts inside clouds. It adds a surface twist at extremely low temperatures that could matter in special environments.
Ice and flexo-electricity
Scientists call this effect flexo-electricity, the coupling between electric polarization and strain gradients in an insulator. A comprehensive review explains why any solid can show some flexoelectric response when it is bent unevenly or shaped with strong curvature.
This is not the same as being piezoelectric, which requires a crystal structure that lacks inversion symmetry and creates charge directly under uniform compression or tension. Flexo-electricity does not need that symmetry break, so it can appear in materials that fail the piezoelectric test.
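In standard textbook notation, which is not necessarily the notation used in the paper, the distinction comes down to what drives the polarization P:

\[
P_i = d_{ijk}\,\sigma_{jk} \quad \text{(piezoelectricity: polarization from uniform stress)}
\qquad
P_i = \mu_{ijkl}\,\frac{\partial \varepsilon_{jk}}{\partial x_l} \quad \text{(flexo-electricity: polarization from a strain gradient)}
\]

For a thin slab bent to a curvature κ, the strain varies linearly through the thickness, so the strain gradient is simply κ and the flexoelectric polarization scales as P ≈ μκ. That proportionality is why a bending test can read the effect directly off the voltage.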
Dr. Xin Wen of the Catalan Institute of Nanoscience and Nanotechnology (ICN2), located on the Universitat Autònoma de Barcelona campus, helped lead the experiments and modeling. The team combined precise bending tests with theory to tie the electrical signal to the mechanical shape of the ice.
Testing a slab of ice
The researchers shaped an ice slab, placed it between metal plates, then bent it in a controlled way while monitoring the voltage that appeared. The signal tracked how strongly the slab curved, which is exactly what flexo-electricity predicts.
“We discovered that ice generates electric charge in response to mechanical stress at all temperatures,” said Dr. Wen.
The tests showed that ice keeps producing a strong electrical signal across the whole range of temperatures where it stays solid, right up until it melts. That puts frozen water in the same league as some engineered materials, like certain oxides, that are commonly used in electronic sensors and capacitors.
At extremely low temperatures, the researchers also noticed a very thin surface layer of ice that could flip its electrical orientation when an outside electric field was applied. This layer acts like a ferroelectric, but only on the surface and not throughout the entire block of ice.
Ice interacting with its environment
Surface structure can dramatically change how ice interacts with its surroundings. In thunderclouds, tiny ice crystals crash into soft hailstones known as graupel, and those collisions shift electric charge from one particle to another.
Studies in the lab and in real storms have shown that these encounters separate charge in ways that depend on temperature, building up the electric fields that allow lightning to form.
Flexo-electricity offers an additional microphysical pathway for those particles to charge up during bouncy, irregular impacts that bend and twist their surfaces. The new measurements match the scale of charge transfers inferred for real collisions, which helps knit lab physics to storm electrification without requiring piezoelectricity.
A clear overview from NOAA outlines how separate charge regions form in a storm, build an electric field, and finally trigger a lightning discharge. The present work slides a mechanical bending effect into that picture, adding a way for collisions to do electrical work when particles deform unevenly.
This matters most in the mixed-phase region of a storm where supercooled droplets coat graupel and ice crystals ricochet through updrafts. Nonuniform stresses there are normal, so a bending-driven mechanism is a natural candidate.
Ice powers new electricity tech
Ice is cheap to make, it molds into shapes easily, and it is abundant in cold places. Flexoelectric transduction could let engineers build simple sensors or pressure-to-voltage converters in situ, using water and metal contacts without high temperature processing or rare elements.
Devices would not be limited to extreme cold, since the flexoelectric response persists up to the melting point. Designs would focus on geometry, because stronger curvature and sharper gradients usually drive larger signals in flexoelectric systems.
The ferroelectric surface layer at about -171°F (-113°C) raises interesting options for switching behavior in deep cold. It could enable memory-like responses in polar regions or high altitude labs, where a modest electric field flips the surface polarization while the interior remains nonpolar.
Electricity lessons from ice
Flexo-electricity turns uneven bending into electrical charge, even in a material long treated as electromechanically quiet under uniform pressure.
Ice now joins the small set of everyday materials proven to convert mechanical shape changes into measurable voltage. In storm physics, it emerges as a credible new factor working alongside well-known non-inductive charging processes.
Charge generated by bending fits naturally with the chaotic collisions of particles, linking lab findings to the electric dynamics of real clouds.
The study is published in the journal Nature Physics.
Glow-in-the-dark plants bright enough to light up streets at night may sound like the stuff of science fiction or fantasy.
But scientists have already made plants that emit a greenish glow. They are even commercially available in the United States.
A group of Chinese researchers has just gone even further, creating what they say are the first multicolored and brightest-ever luminescent plants.
“Picture the world of Avatar, where glowing plants light up an entire ecosystem,” biologist Shuting Liu, a researcher at South China Agricultural University in Guangzhou and co-author of the study published August 27 in the journal Matter, said in a statement.
“We wanted to make that vision possible using materials we already work with in the lab. Imagine glowing trees replacing streetlights,” she added.
To make the plants glow, Liu and her fellow researchers injected the leaves of the succulent Echeveria “Mebina” with strontium aluminate, a material often used in glow-in-the-dark toys that absorbs light and gradually releases it over time.
This method marks a departure from the traditional gene-editing technique that scientists use to achieve this effect, following a model pioneered by a team at the Massachusetts Institute of Technology.
Injecting a plant with nanoparticles instead of editing its genes allowed the researchers to create plants that glow red, blue and green. With the gene-editing approach, scientists are normally constrained by the plant’s natural color and can only create a green glow.
“Gene editing is an excellent approach,” Liu told CNN in an email Tuesday, but added: “We were particularly inspired by inorganic afterglow materials that can be ‘charged’ by light and then release it slowly as afterglow, as well as by prior efforts on glowing plants that hinted at plant-based lighting — even concepts like plant streetlights.”
“Our goal was therefore to integrate multicolor, long-afterglow materials with plants to move beyond the usual color limits of plant luminescence and provide a photosynthesis-independent way for plants to store and release light — essentially, a light-charged, living plant lamp,” she added.
The research team attempted to show the practical application of their idea by constructing a green wall made of 56 plants that produced enough light to see text, images and a person located up to 10 centimeters (four inches) away, according to the study.
Once injected and placed under direct sunlight for a couple of minutes, the plants continued to glow for up to two hours.
While the brightness of the afterglow gradually weakened during that time period, “plants can be recharged repeatedly by exposure to sunlight,” Liu said, replenishing the plants’ stored energy and “allowing the plants to continue glowing after the sunlight is removed.”
The plants maintain the ability to emit the afterglow effect 25 days after treatment, Liu said, and older leaves injected with the afterglow particles continue to emit light under UV stimulation “even after wilting.”
Strontium aluminate can readily decompose inside plants, harming plant tissue, Liu said, so the scientists developed a chemical coating for the material that acts as a protective barrier.
The researchers said in the paper that they see their findings as highlighting “the potential of luminescent plants as sustainable and efficient lighting systems, capable of harvesting sunlight during the day and emitting light at night.”
However, other scientists are skeptical about the practicality. “I like the paper, it’s fun, but I think it’s a little beyond current technology, and it might be beyond what plants can bear,” biochemist John Carr, a professor of plant sciences at the University of Cambridge, who was not involved in the study, told CNN.
“Because of the limited amount of energy that these plants can emit, I don’t really see them as streetlights anytime soon,” he added.
Liu acknowledged that the plants “are still far from providing functional illumination, as their luminescence intensity remains too weak for practical lighting applications. Additionally, the safety assessment of afterglow particles for both plants and animals is still ongoing.”
She said the luminescent plants currently “can primarily serve as decorative display pieces or ornamental night lights.”
However, Liu added, “Looking ahead, if we can significantly enhance the brightness and extend the duration of luminescence — and once safety is conclusively demonstrated — we could envision gardens or public spaces being softly illuminated at night by glowing plants.”
“When I joined Colossal, there were three projects that I was really passionate about from my history of losing elephants in human care, and one of them was EEHV,” says Matt James, the chief animal officer at Colossal and a former director of animal care at zoos in Miami and Dallas. According to James, part of joining the biogenetics company was pushing to use its technology to combat EEHV, the elephant endotheliotropic herpesvirus.
“We went and found [virologist Paul Ling], the smartest guy that we knew working in that space,” James notes, “and it took us 13 months from the moment we invested in the project to the moment when we had the first trial. That’s an incredible representation of the scale and pace at which Colossal can work, and that’s been one of the most meaningful projects for me because I personally lost elephants to EEHV.”
It also might be the first breakthrough in which the same technology that is coming under debate for bringing back the dire wolf (or at least a version of it) is used to change the fates of living species even more radically, though not without debate of its own.
“We work with 60 conservation partners around the world, and we’re doing more conservation projects than de-extinction projects, but like no one seems to–I shouldn’t say care–but it’s never a focus,” Colossal CEO Ben Lamm muses. The entrepreneur is clearly proud of the impact the dire wolf news had on the world last April—the conference room we chat in is decorated with the mythic creature’s profile painted on walls like a gnarly ‘80s rock album cover—but he also appears bemused by how much less fanfare the same news cycle had for data that stated Colossal had cloned several American red wolves with ancient, seemingly long-lost biodiversity. (An estimated 17 red wolves currently roam free in North Carolina, leaving the species on the verge of extinction.)
“What we’ve done so far is we’ve worked on the ghost wolf side,” James says, using a term applied to an admixed hybrid of coyotes and red wolves found along the Gulf Coast states. “We’ve cloned animals out of the ghost wolf population in Louisiana and Texas. What we’re doing now is once we have this historical analysis, we can understand what part of the ghost wolf genome is red wolf ancestry and which part is coyote ancestry, and then we can begin to use genomic editing tools to remove the coyote ancestry and replace it with historical red wolf ancestry.”
Yet even that aim has come under academic deliberation. While Colossal is partnered with several red wolf conservation groups, as well as Bridgett vonHoldt, a Princeton professor, geneticist, and lifelong advocate for preserving the red wolf, other groups reacted to the news of Colossal cloning four red wolves with wary skepticism. One conservation group posted on Facebook, “… the samples cloned were NOT from Red Wolves, but were from Gulf Coast canids. The samples, acquired from canids in LA and TX, were analyzed and taxonomically classified as coyotes.”
As part of its mission to explore the surface of Mars, NASA’s Perseverance rover continues its epic journey beyond Jezero Crater’s rim.
Recently, the rover captured a remarkable image from the summit of an outcrop named Soroya Ridge using its onboard Left Navigation Camera (Navcam).
What is it?
NASA’s Perseverance rover launched in July 2020 and landed in Jezero Crater in 2021. The car-sized robot has since become a key tool in studying our planetary neighbor. Perseverance was built upon previous rover designs like Opportunity and Curiosity, but with a sharper focus on astrobiology and the long-term goal of returning Martian samples to Earth.
Where is it?
This photo was taken at the Soroya Ridge, southeast of Jezero Crater.
An image taken at the Soroya Ridge by NASA’s Perseverance rover. (Image credit: NASA/JPL-Caltech)
Why is it amazing?
Perseverance’s primary mission is to search for signs of ancient microbial life and to collect samples of rock and soil that could one day be analyzed on Earth. To do this, it carries a suite of instruments for precise sample collection. These samples will eventually be retrieved by a proposed future mission, although that mission’s status is currently in jeopardy.
Want to learn more?
You can read more about Mars rovers and looking for life on Mars.
The most powerful computer might one day be made of living cells instead of silicon and wires.
A new project at Rice University in Texas is working to make that vision a reality.
With a $1.99 million grant from the National Science Foundation, the team will develop engineered bacterial systems that could serve as the foundation for biological computing systems.
This four-year project, which includes collaborators from the University of Houston, is set to turn traditional computing on its head.
The concept is simple: individual bacterial cells act as tiny processors. By joining them, scientists can construct a powerful biological computer network.
“Microbes are remarkable information processors, and we want to understand how to connect them into networks that behave intelligently,” said Professor Matthew Bennett, who leads the project from Rice University.
“By integrating biology with electronics, we hope to create a new class of computing platforms that can adapt, learn, and respond to their environments,” he added.
‘Living computer’
The idea of building computers from living biological matter has been around for many years.
This field, known as biocomputing, uses synthetic biology and living matter like lab-grown brain cells (organoids) to create computer architecture rather than standard silicon-based hardware.
It is driven by the knowledge that brains, whether human or animal, can perform enormous numbers of calculations per second while using very little energy.
Researchers believe this biological efficiency could solve the ballooning energy demands of artificial intelligence.
For example, a Swiss company called FinalSpark has already developed a computer platform powered by human-brain organoids, which scientists can rent over the internet. The company’s goal is to create an AI computing system that uses much less energy than current designs.
The new Rice University project, however, is unique in its focus on using microbes.
The project seeks to develop platforms for a new class of computing systems, which will be built from living cells by linking microbial sensing and communication with electronic networks.
Medical biosensors
The team’s work is based on the idea that each individual microbial cell can be treated as a processor.
Since these microorganisms naturally communicate with each other through chemical or electrical signals, they can be linked to form a parallel computing system.
Using continuous culture systems, the researchers will maintain these microbes and link them with electronics.
This will allow the microbial networks to learn and adapt over time, enabling them to recognize patterns. As a result, the system will be able to respond to real-world chemical inputs in ways that are impossible for conventional computers.
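As a purely illustrative sketch, not the Rice and Houston teams’ actual design, the cell-as-processor idea can be mocked up in a few lines: each “cell” below is a unit whose output follows a Hill curve, a standard saturating dose-response model of gene-circuit output, and two hypothetical “sensor” cells feed a “reporter” cell so that the three-cell network computes a logical AND over two chemical inputs.

```python
# Purely illustrative sketch; all cell types and parameters here are hypothetical
# and are not taken from the Rice project. Each "cell" is a tiny processor whose
# output follows a Hill (saturating dose-response) curve.

def hill(x, k=0.5, n=4):
    """Saturating response between 0 and 1 for a nonnegative input x."""
    return x**n / (k**n + x**n) if x > 0 else 0.0

def sensor_cell(chemical_conc):
    # Sensor cell: detects one chemical and secretes a signaling molecule in response.
    return hill(chemical_conc)

def reporter_cell(signal_a, signal_b):
    # Reporter cell: produces a readout (e.g., fluorescence) only when both incoming
    # signals are strong, so it acts as an AND gate over its two inputs.
    return hill(min(signal_a, signal_b))

# Sweep low and high concentrations of two chemicals through the three-cell network.
for conc_a in (0.05, 1.0):
    for conc_b in (0.05, 1.0):
        out = reporter_cell(sensor_cell(conc_a), sensor_cell(conc_b))
        print(f"chemical A={conc_a:4.2f}  B={conc_b:4.2f}  ->  reporter output {out:.2f}")
```

Linking many such units, and retuning their responses over time rather than hard-wiring them as above, is the kind of parallel, adaptive processing the project describes.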
If successful, the project could advance medical diagnostics, environmental monitoring, and next-generation computing.
A key application is the development of smart biosensors that can detect specific chemical markers, such as disease biomarkers or environmental contaminants, and transmit the information electronically.
“Beyond diagnostics and monitoring, living computers may one day adapt and evolve in ways that surpass the capabilities of traditional machines,” Bennett said.
The project will also examine the ethical, legal, and social implications of creating programmable living computers. This includes exploring how these technologies should be regulated and how the public will receive them.