Author: admin

  • The Lenovo ThinkBook G6 is a powerhouse for work and school, and it’s 70% off at Amazon

    Lenovo/ZDNET

    The Lenovo ThinkBook G6 is a great, high-end productivity laptop for students and professionals. It features a 16-inch touchscreen display, a fingerprint reader for enhanced security, and support for Wi-Fi 6 and Bluetooth 5.3 connectivity. If you’re looking to pick up a robust laptop before heading off to college, or to upgrade your work-from-home setup without breaking the bank, you can save 71% on the ThinkBook G6 at Amazon, bringing the price down to just $700.

    Also: The best Lenovo laptops you can buy

    Along with support for touch input, the 16-inch display offers up to 1920 x 1200 resolution for enhanced detail and contrast. It also has an anti-glare coating for better visibility under harsh overhead office or classroom lighting, and an integrated blue light filter to help reduce eye fatigue and strain during long days in front of the screen writing term papers or quarterly reports.

    Read more: I recommend this Lenovo ThinkPad to remote workers

    The build on offer from Amazon features 32GB of RAM, a 1TB SSD, and an AMD Ryzen 5 7430U processor for all the power and performance you need for everything from photo and video editing to streaming, web browsing, and multitasking in demanding programs like Photoshop. And if you need more memory and storage, the ThinkBook G6 supports up to 64GB of RAM and features dual M.2 SSD slots for quick and easy upgrades if and when you need them.

    The ThinkBook G6 supports Thunderbolt 4 connectivity for charging mobile devices, quickly transferring large files and data, and even setting up a second display if you need more room to work (or prefer having a second screen for streaming video or music while you work). All of this is backed up by a decent battery life for such power-hungry components, topping out at around 10 hours on a full charge. This means you can get through a typical work or school day without having to worry about where the nearest outlet is. 

    Looking for the next best product? Get expert reviews and editor favorites with ZDNET Recommends.

    How I rated this deal 

    The Lenovo ThinkBook G6 is a high-end productivity laptop worth its original price tag. Built with premium components like an AMD Ryzen 5 7000 series CPU, 32GB of RAM, and a 1TB SSD, it’s a solid foundation for office and class work and a robust laptop for handling casual home use like streaming and web browsing. With a discount of just over 70%, this is an almost unbelievably good value, so I gave it a 5/5 Editor’s rating.

    Deals are subject to sell out or expire at any time, though ZDNET remains committed to finding, sharing, and updating the best product deals so that you can score the best savings. 

    Our team of experts regularly checks in on the deals we share to ensure they are still live and obtainable. We’re sorry if you’ve missed out on this deal, but don’t fret — we’re constantly finding new chances to save and sharing them with you at ZDNET.com.

    Show more

    We aim to deliver the most accurate advice to help you shop smarter. ZDNET offers 33 years of experience, 30 hands-on product reviewers, and 10,000 square feet of lab space to ensure we bring you the best of tech. 

    In 2025, we refined our approach to deals, developing a measurable system for sharing savings with readers like you. Our editor’s deal rating badges are affixed to most of our deal content, making it easy to interpret our expertise to help you make the best purchase decision.

    At the core of this approach is a percentage-off-based system to classify savings offered on top-tech products, combined with a sliding-scale system based on our team members’ expertise and several factors like frequency, brand or product recognition, and more. The result? Hand-crafted deals chosen specifically for ZDNET readers like you, fully backed by our experts. 

    Also: How we rate deals at ZDNET in 2025

    Show more


    Continue Reading

  • Why plane turbulence is really becoming more frequent and severe

    Simon King

    Andrew Davies was on his way to New Zealand to work on a Doctor Who exhibition, for which he was project manager. The first leg of his flight from London to Singapore was fairly smooth. Then suddenly the plane hit severe turbulence.

    “Being on a rollercoaster is the only way I can describe it,” he recalls. “After being pushed into my seat really hard, we suddenly dropped. My iPad hit me in the head, coffee went all over me. There was devastation in the cabin with people and debris everywhere.

    “People were crying and [there was] just disbelief about what had happened.”

    Mr Davies was, he says, “one of the lucky ones”.

    Other passengers were left with gashes and broken bones. Geoff Kitchen, who was 73, died of a heart attack.

    Death as a consequence of turbulence is extremely rare. There are no official figures but there are estimated to have been roughly four deaths since 1981. Injuries, however, tell a different story.

    Image credit: Reuters/Stringer

    Severe turbulence on the Singapore Airlines flight caused the plane to drop 178ft (54m) in 4.6 seconds

    In the US alone, there have been 207 severe injuries – where an individual has been admitted to hospital for more than 48 hours – since 2009, official figures from the National Transportation Safety Board show. (Of these, 166 were crew and may not have been seated.)

    But as climate change shifts atmospheric conditions, experts warn that air travel could become bumpier: temperature changes and shifting wind patterns in the upper atmosphere are expected to increase the frequency and intensity of severe turbulence.

    “We can expect a doubling or tripling in the amount of severe turbulence around the world in the next few decades,” says Professor Paul Williams, an atmospheric scientist at the University of Reading.

    “For every 10 minutes of severe turbulence experienced now, that could increase to 20 or 30 minutes.”

    So, if turbulence does get more intense, could it become more dangerous too – or are there clever ways that airlines can better “turbulence-proof” their planes?

    The bumpy North Atlantic route

    Severe turbulence is defined as when the up and down movements of a plane going through disturbed air exert more than 1.5g-force on your body – enough to lift you out of your seat if you weren’t wearing a seatbelt.

    Estimates show that there are around 5,000 incidents of severe-or-greater turbulence every year, out of a total of more than 35 million flights that now take off globally.

    Of the severe injuries sustained by passengers flying throughout 2023, almost 40% were caused by turbulence, according to the annual safety report by the International Civil Aviation Organization.

    Image credit: Brazil Photos/LightRocket via Getty Images

    In 2023, nearly 40% of serious passenger injuries were caused by turbulence, according to the International Civil Aviation Organization

    The route between the UK and the US, Canada and the Caribbean is among the areas known to have been affected. Over the past 40 years, since satellites began observing the atmosphere, there has been a 55% increase in severe turbulence over the North Atlantic.

    But the frequency of turbulence is projected to increase in other areas too, according to a recent study – among them parts of East Asia, North Africa, the North Pacific, North America and the Middle East.

    The knock-on effect of climate change

    There are three main causes of turbulence: convective (clouds or thunderstorms), orographic (air flow around mountainous areas) and clear-air (changes in wind direction or speed).

    Each type could bring severe turbulence. Convective and orographic are often more avoidable – it is the clear-air turbulence that, as the name might imply, cannot be seen. Sometimes it seemingly comes out of nowhere.

    Image credit: Kirill Kudryavtsev/AFP via Getty

    Avoiding turbulence-producing storms can crowd airspace, as more planes are forced to change routes, according to experts

    Climate change is a major factor in driving up both convective and clear-air turbulence.

    While the relationship between climate change and thunderstorms is complex, a warmer atmosphere can hold more moisture – and that extra heat and moisture combine to make more intense thunderstorms.

    Linking this back to turbulence — convective turbulence is created by the physical process of air rising and falling in the atmosphere, specifically within clouds. And you won’t find more violent up and downdrafts than in cumulonimbus, or thunderstorm clouds.

    This was the cause of the severe turbulence on Andrew Davies’s journey back in 2024.

    A report by Singapore’s Transport Safety Investigation Bureau found that the plane was “likely flying over an area of developing convective activity” over south Myanmar, leading to “19 seconds of extreme turbulence that included a drop of 178 feet in just under five seconds”.

    Image credit: MediaNews Group/Boston Herald via Getty Images

    A 2014 US study found that for every 1°C rise in global temperature, lightning strikes increase by 12%

    One study from the US, published in the journal Science in 2014, showed that for every 1°C increase in global temperature, lightning strikes increase by 12%.

    Captain Nathan Davies, a commercial airline pilot, says: “I have noticed more large storm cells spreading 80 miles plus in diameter in the last few years, something you’d expect to be rare.”

    But he adds: “The large cumulonimbus clouds are easy to spot visually unless embedded within other clouds, so we can go around them.”

    Clear-air turbulence could also soon rise. It is caused by disturbed air in and around the jet stream, a fast-moving current of air about six miles up in the atmosphere, the same altitude at which planes cruise.

    Wind speeds in the jet stream travelling from west to east across the Atlantic can vary from 160mph to 250mph.

    There is colder air to the north of the jet stream and warmer air to the south: this temperature difference drives the jet stream’s winds, which airliners ride as a tailwind to save time and fuel. But it also creates the turbulent air.

    “Climate change is warming the air to the south of the jet stream more than the air to the north so that temperature difference is being made stronger,” explains Prof Williams. “Which in turn is driving a stronger jet stream.”

    ‘It should worry us all’

    The increase in severe turbulence – enough to lift you out of your seat – could potentially bring more incidents of injury, or possibly death in the most severe cases. And some passengers are concerned.

    For Mr Davies, the prospect of more turbulence is worrying. “A lot. Not just for me, but my children too,” he explains.

    “I’m pleased there hasn’t been an incident as severe as mine but I think it should worry us all”.

    More than a fifth of UK adults say they are scared of flying, according to a recent YouGov survey, and worsening turbulence could make journeys even more of a nightmare for these people.

    As Wendy Barker, a nervous flyer from Norfolk, told me: “More turbulence to me equals more chance of something going wrong and less chance of survival.”

    Aircraft wings are, however, designed to fly through turbulent air. As Chris Keane, a former pilot and now ground-school instructor says, “you won’t believe how flexible a wing is. In a 747 passenger aircraft, under ‘destructive’ testing, the wings are bent upwards by some 25 degrees before they snap, which is really extreme and something that will never happen, even in the most severe turbulence.”

    For airlines, however, there is a hidden concern: that is the economic costs of more turbulence.

    The hidden cost of turbulence

    AVTECH, a tech company that monitors climate and temperature changes – and works with the Met Office to help warn pilots of turbulence – suggests that the costs can range from £180,000 to £1.5 million per airline annually.

    This includes the costs of having to check and maintain aircraft after severe turbulence, compensation costs if a flight has to be diverted or delayed, and costs associated with being in the wrong location.

    Image credit: Kevin Carter/Getty

    Climate change is one factor in making turbulence worse, increasing both storm-related and clear-air turbulence

    Eurocontrol, a civil-military organisation that helps European aviation understand climate change risks, says that diverting around turbulence-producing storms can have a wider impact – for example, if lots of aircraft are having to change flight paths, airspace can get more crowded in certain areas.

    “[This] increases workload for pilots and air traffic controllers considerably,” says a Eurocontrol spokesperson.

    Having to fly around storms also means extra fuel and time.

    In 2019 for example, Eurocontrol says bad weather “forced airlines to fly one million extra kilometres, producing 19,000 extra tonnes of CO2.”

    With extreme weather predicted to increase, they expect flights will need to divert around bad weather such as storms and turbulence even more by 2050.

    “Further driving up the costs to airlines, passengers and [increasing] their carbon footprint.”

    How airlines are turbulence-proofing

    Forecasting turbulence has got better in recent years and while it is not perfect, Prof Williams suggests we can correctly forecast about 75% of clear-air turbulence.

    “Twenty years ago it was more like 60% so thanks to better research that figure is going up and up over time,” he says.

    Aircraft have weather radar that will pick up storms ahead. As Capt Davies explains, “Before a flight, most airlines will produce a flight plan that details areas of turbulence likely throughout the route, based on computer modelling.”

    It is not 100% accurate, but “it gives a very good idea combined with other aircraft and Air Traffic Control reports once we are en-route”.

    Image credit: Rungroj Yongrit/EPA-EFE/REX/Shutterstock

    An Austrian start-up, Turbulence Solutions, says it has developed turbulence-cancelling tech for light aircraft

    Southwest Airlines in the US recently decided to end cabin service earlier, at 18,000ft instead of the previous 10,000ft. By having the crew and passengers seated with belts on ready for landing at this altitude, Southwest Airlines suggests it will cut turbulence-related injuries by 20%.

    Also last year, Korean Air decided to stop serving noodles to its economy passengers, as it had reported a doubling of turbulence since 2019, which raised the risk of passengers getting burned.

    From owls to AI: extreme measures

    Some studies have taken turbulence-proofing even further, and looked at alternative ways to build wings.

    Veterinarians and engineers have studied how a barn owl flies so smoothly in gusty winds, and discovered that its wings act like a suspension system, stabilising the head and torso when flying through disturbed air.

    The study published in the Royal Society proceedings in 2020 concluded that “a suitably tuned, hinged-wing design could also be useful in small-scale aircraft…helping reject gusts and turbulence”.

    Separately, a start-up in Austria called Turbulence Solutions claims to have created turbulence cancelling technology for light aircraft, where a sensor detects turbulent air and sends a signal to a flap on the wing which counteracts that turbulence.

    These can reduce moderate turbulence by 80% in light aircraft, according to the company’s CEO.

    Image credit: NurPhoto via Getty

    Turbulence forecasting has improved in recent years, helping pilots avoid bumpy areas

    Then there are those arguing that AI could be a solution. Fourier Adaptive Learning and Control (FALCON) is a type of technology being researched at the California Institute of Technology that learns how turbulent air flows across a wing in real-time. It also anticipates the turbulence, giving commands to a flap on the wing which then adjusts to counteract it.

    However, Finlay Ashley, an aerospace engineer and member of Safe Landing, a community of aviation workers calling for a more sustainable future in aviation, explains that these types of technology are some time away.

    “[They’re] unlikely to appear on large commercial aircraft within the next couple of decades.”

    But even if turbulence does become more frequent, and more severe, experts argue this isn’t cause for worry. “It’s generally nothing more than annoying,” says Captain Davies.

    But it might mean more time sitting down, with the seat-belt fastened.

    Andrew Davies has already learnt this the hard way: “I do get a lot more nervous and don’t look forward to flying like I used to,” he admits. “But I won’t let it define me.

    “The moment I sit down, my seat belt goes on and if I do need to get up, I pick my moment – then I’m quickly back in my seat, buckled up again.”

    Top Image credit: Ivan-balvan via GETTY

    BBC InDepth is the home on the website and app for the best analysis, with fresh perspectives that challenge assumptions and deep reporting on the biggest issues of the day. And we showcase thought-provoking content from across BBC Sounds and iPlayer too. You can send us your feedback on the InDepth section by clicking on the button below.

    Continue Reading

  • Mondelez beats second-quarter estimates on strong international demand – Reuters

    1. Mondelez beats second-quarter estimates on strong international demand  Reuters
    2. Reporting Second Quarter and FY 2025 Earnings  Mondelēz International, Inc.
    3. Mondelēz International Reports Q2 2025 Results  GlobeNewswire
    4. Mondelez falls after profit guidance is impacted by ‘unprecedented’ cocoa cost inflation  MSN
    5. Mondelez Logs Higher Profit, Revenue in Second Quarter  The Wall Street Journal

    Continue Reading

  • 2025 One Hertz Challenge: Precise Time Ref Via 1 Pulse-Per-Second GPS Signal

    Our hacker [Wil Carver] has sent in his submission for the One Hertz Challenge: Precise Time Ref via 1 Pulse-Per-Second GPS Signal.

    This GPS Disciplined Oscillator (GPSDO) project uses a Piezo 2940210 10 MHz crystal oscillator which is both oven-controlled (OCXO) and voltage-controlled (VCXO). The GPSDO takes the precision 1 Pulse-Per-Second (PPS) GPS signal and uses it to adjust the 10 MHz crystal oscillator until it repeatedly produces 10,000,000 cycles within one second.

    [Wil] had trouble finding all the specs for the 2940210, particularly the EFC sensitivity (S), so after doing some research he did some experiments to fill in the blanks. You can get the gory details in his notes linked above.

    In a Voltage-Controlled Crystal Oscillator (VCXO), the EFC pin is the tuning-voltage input; EFC stands for Electronic Frequency Control. [Wil] found that he needed to push the EFC up to around 4.34V to get a 10 MHz output, which is a bit out of spec; usually the center of the tuning range sits around 2.5V. [Wil] put this discrepancy down to the age of the crystal oscillator. You can see a chart of this behavior in the notes.
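
    For readers curious what the disciplining step might look like, here is a minimal, hypothetical sketch (not [Wil]'s actual firmware): a proportional-integral loop that counts oscillator cycles between PPS edges and nudges the EFC voltage toward the value that yields exactly 10,000,000 cycles per second. The helpers read_cycle_count() and set_efc_voltage() are assumed placeholders, and the EFC sensitivity and loop gains are invented for illustration.

        # Hypothetical GPSDO disciplining loop (illustrative only).
        NOMINAL_CYCLES = 10_000_000   # expected 10 MHz cycles between PPS edges
        S_HZ_PER_VOLT = 5.0           # assumed EFC sensitivity; the real value is what [Wil] measured
        KP, KI = 0.3, 0.02            # loop gains, tuned on real hardware in practice

        def discipline(read_cycle_count, set_efc_voltage, efc_volts=4.34, seconds=3600):
            """Adjust the EFC voltage once per PPS edge until the counted cycles settle at 10 MHz."""
            integral = 0.0
            for _ in range(seconds):
                error_hz = read_cycle_count() - NOMINAL_CYCLES   # positive if the oscillator runs fast
                integral += error_hz
                # A fast oscillator needs its tuning voltage pulled the other way;
                # the sign convention depends on the particular VCXO.
                efc_volts -= (KP * error_hz + KI * integral) / S_HZ_PER_VOLT
                set_efc_voltage(efc_volts)
            return efc_volts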

    [Wil] had nice things to say about Tom Van Baak’s website, LeapSecond.com, where you can learn about timing accuracy, precision, and stability. He also suggested searching for “Allan Variance” if you’re interested in the measurement of stable timing sources.
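
    As a taste of what an Allan variance calculation involves, here is a short, self-contained sketch that computes the Allan deviation at a one-second averaging time from per-second fractional-frequency readings; the sample numbers are invented for illustration.

        import numpy as np

        def allan_deviation(y):
            """Allan deviation over adjacent samples: sqrt(0.5 * mean((y[k+1] - y[k])**2))."""
            y = np.asarray(y, dtype=float)
            return np.sqrt(0.5 * np.mean(np.diff(y) ** 2))

        # Fractional frequency from a 10 MHz source counted once per second,
        # e.g. (counted_cycles - 1e7) / 1e7 for each PPS interval.
        counts = np.array([10_000_001, 9_999_999, 10_000_000, 10_000_002])
        print(allan_deviation((counts - 1e7) / 1e7))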

    If you’re interested in OCXOs be sure to check out XOXO For The OCXO and Inside A Vintage Oven Controlled Crystal Oscillator.

    Continue Reading

  • Cash Cobain on Working With Justin Bieber on ‘SWAG’

    Cash Cobain served as a pioneer, ushering in the titillating subgenre of sexy drill, and Justin Bieber took note of the New York native’s ingenuity as the pop star recruited Cash for his SWAG album earlier in July.

    On Tuesday (July 29), Rolling Stone caught up with the “Fisherrr” rapper, who detailed how his relationship with JB went from a DM showing him love to a collaboration.

    “Everybody would just send me that sh–, so I just followed bro and I wrote, ‘Now we got to work. We got to get something in,’” Cash recalled after seeing Bieber post his “Trippin on a Yacht” track. “He was like, ‘All right, bet. Definitely got to do it, I f— with your sh–.’”

    Cash pulled up on Bieber at his home and they got to work on a few ideas. “We had, like, a laptop, speakers and instruments in this circle,” he recalled. “We was just vibing and sh–. … That n—a’s a movie. He was just doing ideas. I never got to really see him really locked in, like, on some Justin Bieber sh–. I’m pretty sure that sh– will be fire.”

    The rapper-producer continued to pepper Bieber with beats until landing on something they liked. “He’s swaggy. He’s been on the swag sh– for a little minute,” he added. “But we was talking regular sh–, too. We cool. He’d check on me like, ‘I’m just checking on you.’”

    The week of SWAG, Cash Cobain heard back from JB that he was “going to try to get” one of their collabs on the album. “I’m like, ‘All right, bet.’ So I fixed it up and then the next day or the day after the next day, that sh– was out.”

    “SWAG” ended up as track No. 18 on the album and also features Eddie Benjamin. The project debuted at No. 2 on the Billboard 200 with 163,000 album-equivalent units earned.

    “Man, who hasn’t been listening to Justin Bieber? Come on, it’s Bieber,” Cobain said of being a longtime fan of JB. “That boy into it, you know? He on top of his game. He knows what’s good. He knows what’s going on.”

    As for Cash, he’s gearing up for the Party With Slizzy tour, which kicks off in NYC on Sept. 7, and is set to release an album this fall.

    Continue Reading

  • Kraken seeks $500 million funding at $15 billion valuation, The Information reports

    (Reuters) - Cryptocurrency exchange Kraken is set to raise $500 million in funding at a $15 billion valuation, The Information reported on Tuesday, citing people familiar with the matter.

    A spokesperson for Kraken declined to comment on the report.

    Cryptocurrency-focused companies have been attracting increased investor interest as the digital asset class benefits from regulatory clarity and growing institutional adoption.

    This trend has also prompted several crypto firms, including custody startup BitGo and asset manager Grayscale, to pursue U.S. listings.

    Kraken has been actively investing capital to expand into various asset classes and grow its user base.

    In March, Kraken said it would acquire futures trading platform NinjaTrader in a $1.5 billion deal.

    (Reporting by Pritam Biswas and Ateev Bhandari in Bengaluru; Editing by Leroy Leo and Mohammed Safi Shamsi)

    Continue Reading

  • Visa beats quarterly estimates on resilient consumer spending but steady forecast drags shares – Reuters

    1. Visa beats quarterly estimates on resilient consumer spending but steady forecast drags shares  Reuters
    2. Visa Profit Beats Estimates on Gains in Cross-Border Spending  Bloomberg
    3. Visa Inc reports results for the quarter ended June 30 – Earnings Summary  TradingView
    4. Visa, Mastercard set for higher profits on solid spending trends  MSN
    5. Visa’s Earnings Topped Expectations. Its CEO Says the Credit Giant Is Exploring Stablecoins  Investopedia

    Continue Reading

  • West Gate’s 4th Cohort Innovators Pursue Advanced Energy Applications Throughout Energy Landscape

    Introducing West Gate's Cohort 4 selected innovators: Ian Brownstein, James Clegern, Kian Lopez, and Ying Sun.

    From technologies for extracting rare earth elements from plants to advanced membranes for water filtration, and from vertical-axis wind turbines to next-generation, long-lived flywheel energy storage, innovators bring exciting ideas to the fourth cohort of West Gate—NREL’s Lab-Embedded Entrepreneurship Program.

    “We are supporting the vast landscape of energy technologies that enable resilient, secure, and affordable energy for the country,” West Gate Program Director Shelly Curtiss said. “Innovation across that landscape opens more avenues to more advanced energy systems.”

    West Gate is one of four Lab-Embedded Entrepreneurship Program nodes and receives key support from the U.S. Department of Energy’s Advanced Materials and Manufacturing Technologies Office. Innovators embed at NREL for two years and receive access to researchers and resources to help de-risk their technologies while undergoing specialized entrepreneurial training.

    The fourth cohort includes:

    • James Clegern, president and founder of KineticCore Solutions
    • Kian Lopez, cofounder and CEO of OsmoPure Technologies
    • Ying Sun, cofounder of Rare Flora
    • Ian Brownstein, cofounder and CEO of XFlow Energy.

    KineticCore Solutions’ James Clegern

    Flywheel technology has been around for over 4,000 years—think of a potter’s wheel. Modern flywheels are the power-smoothing component of the internal combustion engine in every nonelectric vehicle.

    “The only technology that’s maybe older than ours is thermal energy storage with rocks being warmed by a fire,” Clegern said.

    Clegern said that if you want to add energy storage capacity to flywheels, you must make them spin faster or you have to add mass. The flywheel’s traditional cylindrical shape limits how fast you can spin it, so modern flywheels must add mass to make them viable “kinetic batteries,” increasing costs.

    Clegern wanted to bring down those costs and make flywheels competitive with chemical batteries, so he focused on optimizing a new flywheel structural design made with carbon composite that resembles a flying saucer (an ovoid shape). This new three-dimensional shape distributes stress better so that it can spin 450% faster than a traditional steel flywheel, increasing the energy stored while requiring up to 95% less mass. Less mass and an overall simpler design significantly reduce costs, making KineticCore kinetic batteries cost-effective for a wide variety of applications needing high power, long life and multiple battery cycles per day in a non-chemical package with greatly reduced fire, freezing, and explosive risks.
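
    To see why spinning faster pays off more than adding mass, a rough back-of-envelope calculation helps: a rotor's kinetic energy is E = ½Iω², and for a disc-like rotor the moment of inertia scales with mass times radius squared, so at a fixed radius and shape the stored energy scales with m·ω². The sketch below simply plugs in the percentages quoted above under that assumption; these are not KineticCore's own figures.

        # Back-of-envelope scaling check with assumed, illustrative numbers.
        baseline_mass, baseline_speed = 1.0, 1.0   # arbitrary units

        # "450% faster" read as 5.5x the angular speed; "up to 95% less mass" as 0.05x the mass,
        # holding radius and shape factor constant.
        new_mass, new_speed = 0.05 * baseline_mass, 5.5 * baseline_speed

        energy_ratio = (new_mass * new_speed**2) / (baseline_mass * baseline_speed**2)
        print(f"Stored energy vs. baseline: {energy_ratio:.2f}x")   # ~1.5x despite 95% less mass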

    West Gate presents a great opportunity for testing KineticCore’s product prototype.

    “NREL has great testing facilities and outstanding ‘third-party’ testing support,” Clegern said. “As a small business, that’s sometimes millions of dollars of equipment that a startup just cannot afford. Plus, NREL has a great reputation for supporting energy technologies and transportation, too.”

    Clegern’s vision for the technology is integration into transportation and commercial and industrial facilities, where behind-the-meter peak-power reduction can be actively managed.

    “We’re looking at applications across EV charging and advanced facility electrification where the needed grid high-power draw is expensive to implement and a higher cost to continuously use over a project’s lifetime. By allowing commercial and industrial customers to minimize their peak power demand, their electrical costs go down and local utility grids improve stability by having reduced power spikes,” he said.

    OsmoPure Technologies’ Kian Lopez

    Lopez’s technology takes inspiration from natural evaporation and condensation processes but departs from traditional thermal methods, which are rarely used in modern water treatment due to their high energy demands.

    Taking advantage of evaporation and condensation to purify water, the OsmoPure technology applies pressure to drive water vapor across a membrane, offering a far more energy-efficient way to remove impurities. This approach is particularly promising for the semiconductor industry where ultrapure water is essential for maintaining product quality and system integrity.

    The technology works by applying pressure across an air gap in the membrane, driving water vapor from the contaminated side to the clean side, where it condenses as purified water. Using pressure instead of heat to distill water makes the process more energy efficient—up to 10 times more efficient than state-of-the-art thermally driven distillation, Lopez said.
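
    A crude way to build intuition for why a pressure-driven scheme can beat simple thermal distillation on energy: the mechanical work needed to pressurize liquid water is tiny compared with the latent heat that must be supplied to boil it. The sketch below makes that comparison with an assumed 10 bar operating pressure; it deliberately ignores how much latent heat a real module recovers at the condensing side, which is where the actual efficiency figure comes from, so treat it as an intuition pump rather than OsmoPure's numbers.

        # Illustrative energy comparison with assumed values (not OsmoPure's figures).
        MOLAR_VOLUME_WATER = 1.8e-5    # m^3/mol of liquid water
        LATENT_HEAT = 40_700           # J/mol to vaporize water near its boiling point

        applied_pressure = 10e5        # assumed 10 bar of hydraulic pressure
        pumping_work = applied_pressure * MOLAR_VOLUME_WATER   # ~18 J/mol

        print(f"Pumping work per mole: {pumping_work:.0f} J")
        print(f"Latent heat per mole:  {LATENT_HEAT} J")
        print(f"Ratio: {LATENT_HEAT / pumping_work:,.0f}x")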

    Through the West Gate program, OsmoPure will work at NREL to scale up the membrane’s active area, a key step toward making the technology viable for industrial deployment.

    “We’re excited to work with NREL to build a pilot-ready membrane module by the end of the program,” Lopez said. “We have a pilot partner ready.”

    The main goal of the work with West Gate will be to increase the membrane’s active area, thereby boosting water output to meet industrial demands. But Lopez said they also need to define the specifications required for drop-in integration into industrial ultrapure water treatment systems.

    “It’s cool to see NREL’s holistic approach to technology development, where you start with the original idea for a material and see it through to large-scale manufacturing,” Lopez said. “It’s really exciting to be a part of that.”

    Rare Flora’s Ying Sun

    Rare Flora uses plant roots to recover valuable elements from low-grade ore, mine tailings, and contaminated soils. This process, known as phytomining, uses specialized plants called hyperaccumulators that naturally absorb metals through their roots and store them in their stems and leaves. Once harvested, these plants can be processed to extract metals like rare earth elements, nickel, and cobalt, all critical for technologies ranging from smartphones to electric vehicles.

    Sun said the West Gate partnership will help Rare Flora scale up its extraction process and improve efficiency, not just to recover more critical elements but also to make the technology compatible with existing ore-processing systems.

    “Plant biotechnology has already allowed us to develop entirely new crop varieties and innovative plant-based products,” Sun said. “Now, we have a unique opportunity to apply this technology to a new sector, using plants to meet a critical need in domestic security.”

    Sun is particularly excited about West Gate because of its proximity to key technical partners. The Colorado School of Mines brings deep expertise in mining engineering, while Colorado State University offers agricultural insight that can support Rare Flora in refining plant–soil interactions.

    “What’s exciting about West Gate is that it places us at the center of everything we need to build transformative partnerships and establish phytomining as a real solution in the United States,” Sun said. “Through this program, we’ll be able to take Rare Flora to the next level of technology readiness and better understand how our innovation fits into commercial metal supply chains.”

    XFlow Energy’s Ian Brownstein

    People have been harnessing the wind for energy for thousands of years. While horizontal-axis wind setups have been the dominant approach for the last half century, Brownstein built XFlow around the lesser-used vertical-axis wind energy.

    Vertical axis turbines do not need to turn into the wind like traditional turbines, removing the need for yaw systems. XFlow has also demonstrated these turbines can operate efficiently without pitch systems. Together, these lead to a mechanically simple design. Brownstein said that reducing components and complexity also reduces costs.

    XFlow has built multiple prototypes and has collected data to validate complex engineering models. Now, with NREL, they will validate their design and make sure that the design can handle predicted loads through accelerated fatigue testing.

    The physics behind vertical axis wind energy is complicated. Cyclical behavior produces a lot of fatigue, Brownstein added, so designing a turbine that is highly efficient and boasts a long lifetime is a challenge. “That’s why the validation step is so important,” he said.
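
    For a flavor of the bookkeeping behind fatigue-life estimates, the sketch below applies Miner's rule: each bin of stress cycles consumes a fraction of the component's life equal to the cycles applied divided by the cycles-to-failure from an S-N curve. The S-N constants and cycle counts here are invented placeholders, not XFlow's data; accelerated testing of the kind described above is what grounds such numbers in reality.

        # Miner's-rule damage sum with placeholder numbers (illustrative only).
        A, M_EXP = 1e15, 3.0   # assumed S-N curve: cycles to failure N(S) = A * S**(-M_EXP)

        def cycles_to_failure(stress_range_mpa):
            return A * stress_range_mpa ** (-M_EXP)

        def miner_damage(spectrum):
            """spectrum maps stress range (MPa) -> applied cycles; failure is expected as damage approaches 1."""
            return sum(n / cycles_to_failure(s) for s, n in spectrum.items())

        # Hypothetical lifetime load spectrum: a vertical-axis rotor sees at least one
        # load reversal per blade per revolution, so cycle counts accumulate quickly.
        spectrum = {40.0: 5e8, 80.0: 5e7, 120.0: 1e6}
        print(f"Accumulated damage index: {miner_damage(spectrum):.3f}")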

    “NREL leads in distributed-wind analysis and testing, so I’m excited to work with NREL on validating our rotor design through accelerated fatigue testing,” Brownstein said. “I’m also excited about working with market analysis folks to locate untapped markets for when we are ready to take our product live.”

    Brownstein hopes that validating the design at NREL will accelerate the turbine’s certification to international wind standards and XFlow’s path to market.

    Cohort 4 of the West Gate program allows innovators to utilize laboratory expertise and resources to refine their technologies on the path to market applications. As these technologies are refined, they could improve industry energy efficiency and amplify U.S. energy leadership.  

    Learn more about the West Gate program.

    Continue Reading

  • Lava Existed in the Moon’s Subsurface Longer than Previously Thought

    According to the prevailing theory of how the Moon formed, it all began roughly 4.5 billion years ago when a Mars-sized object (Theia) collided with a primordial Earth. This caused both bodies to become a molten mass that eventually coalesced to form the Earth-Moon System (aka the Giant Impact Hypothesis). This theory also states that the Moon gradually cooled from the top down, with the crust solidifying and arresting lava flows early in its history. However, recent findings from samples obtained by China’s Chang’e-5 probe indicate that lava existed at shallower depths longer than previously thought.

    These samples obtained by the Chang’e-5 lander were from the young mare basalt unit in the Oceanus Procellarum region, a vast lunar mare on the western edge of the near side of the Moon. The samples included 1.7 kilograms (3.7 pounds) of scooped and drilled material composed of basalt and igneous rock that formed roughly 2 billion years ago, making them the youngest samples obtained to date. These findings contradict the previous theory that the temperature of the outer layers of the Moon was too low for melting to occur in the shallow interior, and could revise theories about the Moon’s early evolution.

    The research was led by Stephen M. Elardo, an Assistant Professor from The Florida Planets Lab at the University of Florida. He was joined by researchers from the Colorado School of Mines, the University of Rochester, the Planetary Science Institute (PSI), the Hawaiʻi Institute of Geophysics and Planetology, the University of Hawaiʻi Manoa, and the University of Oxford. The paper describing their findings appeared on July 18th in the journal Science Advances.

    The Chang’e-5 samples are examples of rock formed from rapidly cooled lava, which is characteristic of the mare region from which they were obtained. To estimate how deep this lava came from, the team conducted high-pressure and high-temperature experiments on a lava simulant with an identical composition. Based on remote sensing from orbit, previous work by Chinese scientists showed it erupted in an area with very high abundances of radioactive, heat-producing elements, including potassium, thorium, and uranium.

    In large amounts, the researchers believe these elements could generate enough heat to keep the Moon hot near the surface, slowing the cooling process over time. Before this study, it was presumed that the upper mantle cooled first as the surface gradually lost heat to space, which was based largely on seismic data obtained by the Apollo astronauts. Per this theory, younger lavas like the samples obtained by the Chang’e-5 lander should have come from the deep mantle, where the Moon would still be hot. However, these findings suggest there must have been pockets in the shallow mantle that were hot enough to partially melt rock 2 billion years ago.
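
    The underlying arithmetic of that argument is straightforward: potassium, thorium and uranium each release heat as they decay, and because they were more abundant in the past, a source region enriched in them would have produced markedly more heat 2 billion years ago than it does today. The sketch below uses standard textbook heat-production constants and half-lives, but the element concentrations are purely illustrative stand-ins for an enriched source region, not values from the paper.

        import math

        # (present-day heat production in W per kg of element, half-life in Gyr);
        # natural U is approximated by the 238U half-life for simplicity.
        ISOTOPES = {
            "U":  (9.8e-5, 4.47),
            "Th": (2.6e-5, 14.0),
            "K":  (3.5e-9, 1.25),   # per kg of total potassium (40K is a small fraction)
        }

        # Hypothetical concentrations by mass in an enriched source region (illustrative only).
        CONCENTRATION = {"U": 0.5e-6, "Th": 2.0e-6, "K": 1000e-6}

        def heat_production(t_gyr_ago):
            """Heat output in W per kg of rock, t_gyr_ago billion years in the past."""
            total = 0.0
            for element, (h_now, half_life_gyr) in ISOTOPES.items():
                enrichment = math.exp(math.log(2) * t_gyr_ago / half_life_gyr)  # more isotope further back in time
                total += CONCENTRATION[element] * h_now * enrichment
            return total

        print(f"Today:      {heat_production(0.0):.2e} W/kg")
        print(f"2 Gyr ago:  {heat_production(2.0):.2e} W/kg")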

    As Prof. Elardo explained in a UF News release:

    Using our experimental results and thermal evolution calculations, we put together a simple model showing that an enrichment in radioactive elements would have kept the Moon’s upper mantle hundreds of degrees hotter than it would have been otherwise, even at 2 billion years ago.

    Lunar magmatism, which is the record of volcanic activity on the Moon, gives us a direct window into the composition of the Moon’s mantle, which is where magmas ultimately come from. We don’t have any direct samples of the Moon’s mantle like we do for Earth, so our window into the composition of the mantle comes indirectly from its lavas.

    Artist’s impression of the interior structure of the Moon. Credit: Hernán Cañellas/Benjamin Weiss/MIT

    This research is helping to establish a detailed timeline of the Moon’s evolution, which is critical to understanding how planets and smaller bodies form and evolve. The prevailing theory is that this process begins with accretion from a protoplanetary disk, where dust and gas coalesce due to angular momentum to form planetary bodies. Initially, these bodies are extremely hot and have molten surfaces, which gradually cool to form solid bodies composed of rock and metal, with some forming envelopes of gas or volatiles like water (depending on where they form around their host stars).

    The process of cooling and geological layer formation are key steps in the evolution of these bodies. Since the Moon is Earth’s closest celestial neighbor, studying lunar samples is the easiest way to learn more about these processes. Said Elardo:

    My hope is that this study will lead to more work in lunar geodynamics, which is a field that uses complex computer simulations to model how planetary interiors move, flow, and cool through time. This is an area, at least for the Moon, where there’s a lot of uncertainty, and my hope is that this study helps to give that community another important data point for future models.

    Further Reading: UF News, Science Advances

    Continue Reading

  • expert reaction to study looking at ultra-processed food consumption and lung cancer risk

    A study published in Thorax looks at ultra-processed food (UPF) consumption and the risk of lung cancer. 

     

    Prof Kevin McConway, Emeritus Professor of Applied Statistics, The Open University, said:

    “I have to confess that my heart rather sank when I got the request from SMC to comment on this study.  That’s not because it’s complicated, and not even because it’s a particularly bad study of its type.  It’s just that it’s yet another of a class of studies about ultra-processed foods (UPFs) that, in my view, are doing nothing much to advance what is known about associations between the consumption of UPFs and human health.  I’m well aware that studies with other kinds of methodology are going on – I just wish that researchers would concentrate more on those other types of research, instead of repeatedly cranking the handle of doing studies like this one.

    “The type of study that I’m getting sick of seeing goes like this.  The researchers find an existing large observational study, that involves a cohort of people, and for which the data have been made available to other researchers on application.  Many of the cohort studies involved took place in the UK or the USA.  Of course it’s generally a good thing that data from big studies is made available for other research, but that doesn’t mean that all the research you can do with this secondary data is necessarily very useful.

    “Typically, the researchers wouldn’t have been involved in the original study whose data they are using, and often are based in an entirely different country.  In this case, the original study was in the USA, and all but one of the researchers is based in China (with one being at Harvard).

    “To be useable for a study aiming to examine the association between consumption of UPFs and some health outcome, the original study has to have recorded information on what the participants eat, and then has to follow them up reasonably systematically, and record the health outcome.  It also has to record other potentially relevant information about the participants.

    “The outcome – in this case a diagnosis of lung cancer, but in other studies it might be death from a particular cause or from all causes – typically might take a long time to develop, so the participants have to be followed up for a long time.  (In the case of this new study, the average length of follow-up was over 12 years.)

    “Make no mistake – although the cohort used in this new study comes from a randomised trial of cancer screening, the so-called PLCO Cancer Screening Trial, the new study is observational.  That’s because the exposure of interest – how many UPFs the participants said they consumed – wasn’t allocated to them at random.  The participants themselves chose what to eat and drink.

    “The trouble with observational studies like this is that each one of them, on its own, might find an association, that is, a correlation between what people eat in the way of UPFs, and the risk of the health outcome of interest.  But it can’t tell you whether that association is one of cause and effect.  That’s because there will typically be many other factors that differ between the groups that consume different amounts of UPFs.  These are called potential confounders or potential confounding factors – if one or more of them happens to be associated also with people’s risk of the health outcome, then the potential confounders might be the cause of the association with the health outcome, and not the participants’ UPF consumption at all.

    “It’s possible to make statistical adjustments to try to allow for potential confounders, but that process isn’t definitive – you can never be sure that you have adjusted for all the potential confounders, so you can never be sure about cause and effect.  That’s why this study, like any decent study of this general type, points out that it can’t establish what is causing what.

    “In this particular new study, for instance, although the researchers made adjustments for participants’ smoking status (that is, whether they were current smokers, past smokers, or never smoked), they did not adjust for the amount that people smoked, and they point out that this is a limitation of the study and may potentially lead to biases.  (Actually it seems strange that they did not make that adjustment, since my reading of the information about the underlying clinical trial (at https://cdas.cancer.gov/plco/) indicates that information on the amount people smoked was collected and available.)  And we certainly can’t assume that no other potential confounders were omitted from the adjustments.  (See also point 1 in Further Information below.)

    “If there were lots of other studies looking at associations between UPF consumption and lung cancer risk, including studies on the actual mechanisms by which the cancer might be caused by the foods, we might eventually get to a conclusion about cause and effect.  But the researchers on the new study are proud to point out its novelty, so we’re clearly nowhere near that position yet.

    “The new study certainly doesn’t rule out the possibility that eating larger quantities of UPFs increases lung cancer risk, but it doesn’t come near to establishing that this cause and effect really exists.  That’s why I feel it has told us rather little.

    “Studies of this general type very often have two other important limitations, to do with the measurement of people’s consumption of different foods and drinks, and they both apply to this new study (and indeed are mentioned as limitations by the researchers).  Because a long follow-up period is needed, and because the definition of ultra-processed foods did not emerge until 2009, the questionnaires used to measure people’s consumption of foods and drinks used in the underlying cohorts were very often not designed to measure UPF consumption.  That certainly applies to the PLCO Cancer Screening Trial that provided data for this new study.  It recruited participants between 1993 and 2001, and recorded their diets between 1998 and 2001.  (See also point 2 in Further Information below.)  Also, often, and certainly in this case, the participants’ diets were recorded only once during their participation, so the new study cannot take account of any changes of diet.

    “I have done some calculations (using https://realrisk.wintoncentre.uk/) to give what, I hope, is a clearer view of the actual size of the risk associated with consuming greater amounts of UPFs.  A relevant point is that, despite the fact that the new study uses data from over 100,000 participants in the underlying PLCO Trial, most of them were not diagnosed with lung cancer during the 12 years of follow-up. Somewhere between 1 and 2 in every 100 of them got lung cancer during follow-up.  Thus the estimates of difference in lung cancer risk between the groups that consumed different amounts of UPFs are subject to quite large statistical margins of error.

    “In a group of 1,000 people who are just like the quarter of participants that consumed the lowest levels of UPFs, less than one serving per day on average, about 11 would get a lung cancer diagnosis in a 10-year period.  Now consider another group of 1,000 people, who are just like the first 1,000 in terms of all the factors used for statistical adjustment in the researchers’ calculations, except that they all consume UPFs at a rate like those of the highest quarter in the study (so more than 3.7 servings per day, on average).  If the differences in lung cancer risk are due to the higher UPF consumption, and I repeat that this study just can’t establish that, then we’d expect about 15 cases of lung cancer in the second, high-UPF group of 1,000.  That’s just 4 more than in the low-UPF group, despite the large difference in UPF consumption between the groups.  The statistical uncertainty means that that number could plausibly be greater or smaller – between 3 and 6 more lung cancer cases in the high-UPF group than the low-UPF group of 1,000 people.  So even if the higher UPF consumption was somehow known to be what causes the increased risk (and, remember, this can’t be known from this type of study), arguably the increase isn’t all that large anyway.

    “Studies of this kind also usually can do little or nothing directly to investigate how high UPF consumption might possibly cause an increase in health risk, and that’s the case in this new study.  The authors make suggestions on how high UPF consumption could potentially increase the risk of lung cancer, but these suggestions are based on other previous studies at best, and don’t come from the new data analysis.

    “The press release repeats the suggestion in the research paper that the compound acrolein is known to be a toxic component of cigarette smoke, and that (of course) it is well known that cigarette smoke increases lung cancer risk.  It also mentions that acrolein is found in grilled sausages and caramel sweets, which would generally be counted as UPFs.  But it appears* that acrolein is found in many food substances, including some that would not be counted as UPFs.  In any case, the data behind the new research did not make any measurements of acrolein.”
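
    For readers who want to retrace Prof McConway's absolute-risk arithmetic, here is a rough reconstruction. The roughly 41% relative increase for the top versus bottom UPF quartile is the figure quoted elsewhere in these comments; applied to his baseline of about 11 cases per 1,000 over 10 years, it lands within the 3 to 6 extra cases he describes. This is a simplification for illustration, not the study's own calculation.

        # Back-of-envelope conversion of a relative risk to an absolute difference.
        baseline_per_1000 = 11      # ~10-year lung cancer cases in the low-UPF group
        relative_increase = 0.41    # ~41% higher risk in the top UPF quartile (hazard ratio 1.41)

        high_upf_per_1000 = baseline_per_1000 * (1 + relative_increase)
        extra_cases = high_upf_per_1000 - baseline_per_1000

        print(f"High-UPF group: ~{high_upf_per_1000:.1f} cases per 1,000 over 10 years")
        print(f"Extra cases vs. the low-UPF group: ~{extra_cases:.1f} per 1,000")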

     

    Further information

    The following points are more technical.

    1. The researchers calculated what’s called an E-value, to give a measure of how strong the effect of a single potential confounder, not included in their adjustments, would have to be in order to wipe out the observed association between UPF consumption and lung cancer risk.  They point out that the E-value implies that a single potential confounder would need to have a stronger effect than smoking status to overturn their observed association.  However, you have to remember that they left out smoking amount as a factor to adjust for, and that might have a strong effect in addition to smoking status.  Also E-values look only at the effect of one omitted confounder at a time – there could be several of them, whose combined effect could be great enough to overturn the observed association.  We simply can’t tell.
    2. The new research paper reports that the Diet History Questionnaire (DHQ) used in the underlying study to measure diet was validated against four 24-hour dietary recalls.  But the study on the 24-hour recalls, reference 16 in the new research paper, was not part of either the underlying PLCO Cancer Screening Trial or the new research.  It was based on research carried out in 1997, before the PLCO trial began to collect diet information from the DHQ, and its aim was to investigate how closely DHQ results matched those from detailed 24-hour food recalls, which were considered more accurate.  That study found that the correlation between food components measured by the DHQ and by the dietary recalls was only about 0.5, meaning that the DHQ results explained only roughly a quarter of the variability in the more accurate dietary recalls.  However, I am no expert on dietary measurement and you’d need to check this concern with dietitians.

     

    * ‘Origin and fate of acrolein in foods’, Jiang et al. https://www.mdpi.com/2304-8158/11/13/1976 , though my own expertise does not extend to knowing how dangerous (if at all) these amounts of acrolein would be to health.

     

    Prof Sam Hare, Consultant Chest Radiologist, Royal Free London NHS Trust, said:

    “A quarter of lung cancer cases occur in non-smokers so we do need research exploring whether other factors are associated with lung cancer.  We also know immunity is linked to cancer biology so it is a good idea to do research into factors like diet.

    “However, further work is needed to establish direct causation between UPFs and lung cancer.  Crucially, whilst the study does make some adjustments for smoking status, the amount of smoking is not factored in, which is known to be directly related to lung cancer development.  Dietary habits also change considerably over the course of such long term studies.  As such, it is difficult to directly conclude that lung cancer is related to the level of UPF consumption alone given it was only declared at the start of the study.

    “That said, given the relative dearth of information on non-smoking related risk factors in lung cancer, it is important that the scientific community conducts more studies like this – we need genuine evidence-based advancement in the early diagnosis of lung cancer in non-smokers, but this study isn’t quite able to give us the answers yet.”

     

    Rachel Richardson, Acting Head of Methods Support, The Cochrane Collaboration, said:

    “This is a well-conducted observational study, but there are some important factors to bear in mind when considering any wider implications.

    “Firstly, this study uses data from a series of trials of screening conducted amongst older adults living in the USA.  It is very likely that regulations governing food (for example, permitted additives) are different from those in the UK, which means that American diets are likely to be different from ours in the UK.  The fact that this study only included people aged 55-74 also limits the extent to which the findings may be relevant to younger people.

    “Secondly, because this study uses data that were originally collected for a different purpose, the information they have is limited in terms of the research question they are trying to answer.  A significant limitation is that information on food intake was only collected once which does not reflect the fact that diets change over time.  The authors also do not seem to have had information on other factors that are related to lung cancer, for example, exposure to air pollution or second-hand smoke – these are likely to be confounding factors.”

     

    Dr Adam Jacobs, Executive Director and Strategic Consultant, Biostatistics, Ergomed, said:

    “Wang et al found a statistically significant association between ultra-processed food (UPF) consumption and lung cancer.  This association was strong enough to be clinically relevant, with a 41% increased risk of lung cancer in the highest quartile of UPF consumption compared with the lowest.  However, the important question is whether this association is causal, in other words whether it is the UPF consumption itself that drives the risk of lung cancer or whether people with high UPF consumption are more likely to be at increased risk of lung cancer for other reasons.

    “The most obvious confounding factor here is smoking, which is well known to cause a greatly increased risk of lung cancer.  If people with high UPF consumption smoked more than people with low UPF consumption, then that difference in smoking could easily lead to the observed results.  Although Wang et al attempted to adjust for smoking in their analysis, their adjustment was very crude: they categorised participants into only 2 categories: current or former smokers, and non-smokers (assessed by self report).  If people in the high UPF consumption group were more likely to be current smokers rather than former smokers, or smoked more cigarettes per day, or were more likely to have claimed to be non-smokers when in fact they were smokers, then none of these factors would have been captured in the analysis.  Wang et al correctly identify this limitation themselves in the paper.

    “Without more detailed adjustment of their statistical model for smoking intensity, it seems unsafe to conclude that the observed association between UPF consumption and lung cancer risk was a causal effect of the UPF consumption, and confounding by smoking also seems a plausible explanation of the results.”

     

    Prof Tom Sanders, Professor emeritus of Nutrition and Dietetics, King’s College London, said:

    “This is an analysis of data collected in the United States comparing diet and lifestyle and subsequent development of lung cancer.  The authors claim to find about a 0.4-fold (40%) increase in risk of lung cancer associated with the highest ultraprocessed food consumption, after adjusting for other risk factors.  This could be entirely due to residual confounding due to poorly recorded exposure to tobacco as well as occupational exposure to inhaled carcinogens.

    “Other studies show that smoking is an enormous cause of lung cancer, increasing risk 12-fold.  The statistical analysis only includes current/previous smokers without detail in terms of number of pack years of smoking.  It also does not properly ascertain current smokers, because individuals are known to lie about their smoking habits.  A test such as plasma or urinary cotinine is needed to check for smoking status.  Unhealthy diets often go hand in hand with smoking habit and low socioeconomic status.  But there appears to be no plausible mechanism to explain why ultraprocessed food should affect risk of lung cancer.”

     

     

     

    ‘Association between ultra-processed food consumption and lung cancer risk: a population-based cohort study’ by Kanran Wang et al. was published in Thorax at 23:30 UK time on Tuesday 29 July 2025. 

     

    DOI: 10.1136/thorax-2024-222100

     

     

     

    Declared interests

    Prof Kevin McConway: “I don’t have any relevant interests to declare.”

    Prof Sam Hare: “No conflicts of interest directly relevant to this work.  I am a past National Specialty Adviser to NHS England for Imaging.  I am CEO and co-founder of the HLH Imaging Group Limited.”

    Rachel Richardson: “I have no interests to declare.”

    Dr Adam Jacobs: “No competing interests to declare.

    Ergomed is a contract research organization: https://ergomedcro.com/.”

    Prof Tom Sanders: “I have received grant funding for research on vegans in the past. I have been retired for 10 years but during my career at King’s College London, I formerly acted as consultant for companies that made artificial sweeteners and sugar substitutes.

    “I am a member of the Programme Advisory Committee of the Malaysia Palm Oil Board which involves the review of research projects proposed by the Malaysia government.

    “I also used to be a member of the Scientific Advisory Committee of the Global Dairy Platform up until 2015.

    “I did do some consultancy work on GRAS affirmation of high oleic palm oil for Archer Daniel Midland more than ten years ago.

    “My research group received oils and fats free of charge from Unilever and Archer Daniel Midland for our Food Standards Agency Research.

    “I was a member of the FAO/WHO Joint Expert Committee that recommended that trans fatty acids be removed from the human food chain.

    “Member of the Science Committee British Nutrition Foundation.  Honorary Nutritional Director HEART UK.

    “Before my retirement from King’s College London in 2014, I acted as a consultant to many companies and organisations involved in the manufacture of what are now designated ultraprocessed foods.

    “I used to be a consultant to the Breakfast Cereals Advisory Board of the Food and Drink Federation.

    “I used to be a consultant for aspartame more than a decade ago.

    “When I was doing research at King’ College London, the following applied: Tom does not hold any grants or have any consultancies with companies involved in the production or marketing of sugar-sweetened drinks.  In reference to previous funding to Tom’s institution: £4.5 million was donated to King’s College London by Tate & Lyle in 2006; this funding finished in 2011. This money was given to the College and was in recognition of the discovery of the artificial sweetener sucralose by Prof Hough at the Queen Elizabeth College (QEC), which merged with King’s College London. The Tate & Lyle grant paid for the Clinical Research Centre at St Thomas’ that is run by the Guy’s & St Thomas’ Trust, it was not used to fund research on sugar. Tate & Lyle sold their sugar interests to American Sugar so the brand Tate & Lyle still exists but it is no longer linked to the company Tate & Lyle PLC, which gave the money to King’s College London in 2006.”

    Continue Reading