Blog

  • In reversal, Trump will pay for key rockets and space stations for Europe | International


    Buried in the 870 pages of the One Big Beautiful Bill, there is a concession to Republican representatives from states closely linked to the manned space exploration program, which was facing massive layoffs and unprecedented cuts. Another victim of this concession is the tycoon Elon Musk, as Trump’s decision takes away lucrative contracts for his space rockets. Europe, on the other hand, is breathing a sigh of relief.

    The law recently approved by Congress revives Gateway, the future space station that will orbit the Moon. The federal government will spend $2.6 billion on this manned base, the construction of which also involves the European Space Agency (ESA) along with Canada, Japan and the United Arab Emirates. The measure is a sharp U-turn from what Trump was proposing just over a month ago: to completely cancel the project and leave all its international partners in the lurch.

    Something similar is happening with the Space Launch System (SLS), with which the U.S. government aims to send the first woman astronaut to the Moon in 2027. The BBB will ultimately include more than $4 billion to fund at least two additional flights with this launch vehicle, beyond those already planned for the Artemis 2 and 3 missions. Trump’s original idea was to eliminate it after these two flights and perhaps resort to the Starship being developed by SpaceX, Elon Musk’s company. But this latter system is far from ready to carry astronauts, and has already accumulated several spectacular explosions, the last of them before even attempting takeoff.

    Furthermore, the clash between the U.S. president and the tycoon is increasingly evident. Musk has announced that he will create a political party in the United States to steal voters from Trump. The president has called it “ridiculous” and described Musk as a “train wreck.” The president has also threatened to withdraw important public contracts held by Musk’s companies such as SpaceX and the electric car manufacturer Tesla.

    Trump had bought into Musk’s idea of sending astronauts to Mars as soon as possible, a very different plan from the one the U.S. space agency had been pursuing for years. The new funding returns to the original vision: first send astronauts to the Moon, build an orbital station there, and restore the Orion capsules for the Artemis 4 and 5 missions, formally planned for 2028 and 2030 and also to be launched on SLS rockets, which both Trump and Musk had criticized as too expensive and obsolete.

    One of the big winners from these measures is Europe, which had watched with horror as many of the joint programs it had invested the most money and effort in were threatened with cancellation. The European Space Agency (ESA), for example, is responsible for building a habitation module for the future Gateway lunar station, as well as a storage module, a fuel depot, and the only place in the entire facility with windows, through which astronauts can look out to contemplate the lunar surface and outer space. Europe will also benefit from the extended life of the Orion spacecraft, for which it manufactures the service module that provides power and propulsion.

    Trump’s mega-law also includes a significant injection of $1.2 billion in funding for the International Space Station (ISS) until 2029, before its retirement the following year. This is also key for Europe, as it could ensure that European astronauts, including the Spaniard Pablo Álvarez, can travel to space before 2030.

    An uninhabited space station

    “If the United States pulled out of Gateway, the project would be dead,” acknowledges an executive from one of Europe’s leading aerospace companies. “Europe could have completed the station on its own, but for the time being, it doesn’t have access to space for its astronauts; it relies on Russian Soyuz spacecraft or commercial U.S. spacecraft, so what we were facing was having an uninhabited space station on the Moon,” he explains.

    The main reason the United States has decided to increase funding for the ISS is geopolitical, these sources point out. China has an Earth orbital station, and it would be a complete defeat if the “Western world” didn’t have a similar facility, the ISS, or a larger one, the Gateway, when it’s ready later this decade.

    On the other hand, nobody can save NASA’s robotic exploration missions and other scientific programs, which are facing unprecedented cuts. Trump’s budget only includes increases for manned exploration programs, but in return, he will cut the science budget in half. This will force the cancellation of 41 projects, including 19 active space missions.

    The person responsible for this major change of tack is Texas Republican Senator Ted Cruz, who chairs the Senate Committee on Commerce, Science, and Transportation and who is believed to have pushed for the new funds to prevent the hundreds of layoffs that Trump’s policies would have caused in his state.

    “It’s been a lightning-fast change of direction, perhaps the fastest I’ve ever seen in this field,” Casey Dreier, head of space policy at the Planetary Society of the United States, told this newspaper. However, the expert at this non-profit organization founded by Carl Sagan in 1980 believes the scope of the new law is “disappointing” because it doesn’t include any relief from the planned cuts to science, education, and other areas. This is also explained by politics. “By an accident of history, NASA’s human space exploration centers are all in Republican-governed states. No Democratic congressman was going to support this law, so only the highest priorities of the Republican Party were considered,” Dreier explains. The specialist believes this situation opens the possibility that NASA’s science sector will fare somewhat better in the congressional debate on the budget, which must conclude before October 1. Since the BBB has set amounts for human exploration, perhaps this will leave some more money for other NASA projects and other federal agencies, which are facing brutal cuts.

    The outlook for the coming months remains highly uncertain. NASA has been in a state of disarray since Donald Trump unexpectedly decided to remove Jared Isaacman, a billionaire he himself had nominated to head the agency. Isaacman is very close to Musk and had to perform a balancing act in the Senate, defending the country’s goal of reaching the Moon before China while also prioritizing missions to Mars, as Musk wanted. Ultimately, the rift between the president and the magnate left him without a position, and with no successor in sight. “I also found it inappropriate that a very close friend of Elon’s, who was involved in the Space Business, should run NASA, given that NASA is such a significant part of Elon’s corporate life,” Trump wrote on his Truth Social network.


    Continue Reading

  • Govt committed to consultative process with business stakeholders: Aurangzeb – RADIO PAKISTAN

    1. Govt committed to consultative process with business stakeholders: Aurangzeb – RADIO PAKISTAN
    2. Business leaders praise PM Shehbaz’s economic leadership – Ptv.com.pk
    3. Pakistan aims to boost Gwadar port with new shipping lines, ferry – Nikkei Asia
    4. Aurangzeb reaffirms govt’s commitment to consultative policy framework – Associated Press of Pakistan
    5. PM emphasizes local resource use to boost exports – Mettis Global

    Continue Reading

  • Trade mark owners alert: Scammers misuse law firm details for e-commerce verification


    Continue Reading

  • Saudi Arabia and Iran hold talks after Tehran’s truce with Israel – World


    Iran’s foreign minister has held talks with Saudi Arabia’s de facto leader, the Saudi foreign ministry said, two weeks after a ceasefire between regional rivals Iran and Israel began.

    Saudi Crown Prince Mohammed bin Salman said his country hoped the truce would contribute to regional stability, and emphasised Riyadh’s position “in supporting dialogue through diplomatic means as a path to resolving disputes,” the ministry said in a post on X early on Wednesday.

    According to the Saudi ministry, Iranian Foreign Minister Abbas Araghchi “expressed his gratitude” to Riyadh for its condemnation of Israel’s attacks on Iran last month.

    Israel launched its unprecedented bombing campaign on Iran on June 13, targeting military and nuclear facilities as well as residential areas. The strikes killed more than 1,000 people in Iran, including senior military commanders and nuclear scientists, according to Tehran.

    Israel, in turn, was hit by waves of drone and missile fire from Iran, which Israeli authorities said left at least 28 people dead.

    The United States, which had been in talks with Iran about its nuclear programme since April, joined Tel Aviv’s war and carried out its own strikes on Iran on June 22, targeting several nuclear sites.

    The talks between Tehran and Washington have since stalled, but the ceasefire between Iran and Israel has been in place since June 24.

    Iran and Saudi Arabia have often been on opposing sides of regional conflicts, including in Syria and Yemen. The two regional heavyweights broke off diplomatic relations in 2016 before re-establishing them in 2023 under a rapprochement deal brokered by China.

    It amounted to a diplomatic achievement for Prince Mohammed, who has taken a more conciliatory approach to regional diplomacy in recent years.

    Saudi Arabia condemned the Israeli strikes on Iran last month, calling them “aggressions” and a “clear violation of international laws”.

    Riyadh also expressed its “great concern” following the US strikes on Iranian nuclear facilities.

    Iranian foreign ministry spokesman Esmaeil Baqaei said Araghchi held “fruitful conversations” with Prince Mohammed, as well as Foreign Minister Prince Faisal bin Farhan and Defence Minister Prince Khaled bin Salman about bilateral relations and developments in the region.

    Continue Reading

  • Destination Scholarships: Navigating Funding Opportunities by Region

    Led by Sakina Babar, this session offers insights into what leading scholarships in each region typically look for, how applicants can align their academic and leadership profiles with regional expectations, and how to avoid common application pitfalls.

    About Sakina Babar

    Sakina is a Commonwealth Scholar with a Master’s in Education from the UK, and has also been shortlisted for both Fulbright and Chevening. Through Ed Advisory, she has supported countless students in securing fully funded scholarships like Chevening, Fulbright, and Erasmus Mundus, with a mission to make global education more accessible.



    Continue Reading

  • Scientists hail rapid estimate of climate change’s role in heat deaths


    Ten days of extreme heat killed 2,305 people in a sample of 12 European cities last month, with almost two-thirds of those deaths caused by climate change’s intensifying effect on heatwaves, new research estimated on Wednesday.

    The early summer heatwave, which sparked wildfires and health warnings from Spain to Turkey, was between 2 and 4 degrees Celsius hotter than it would have been without climate change, according to the study by the Grantham Institute at Imperial College London and the London School of Hygiene and Tropical Medicine (LSHTM).

    “These numbers represent real people who have lost their lives in the last days due to the extreme heat”, said Imperial College London climate scientist Friederike Otto.

    “If we continue to follow the wishes of the fossil fuel industry and delay serious mitigation [emissions-cutting] further, more and more people will lose their lives for the financial benefit of only a tiny rich influential minority,” she told reporters during a conference call.

    Separately, a report by the European Union’s Copernicus Climate Change Service said last month was the hottest June on record in Western Europe.

    Otto highlighted the researchers’ rapid work in calculating the role of climate change in the overall death toll, which she hailed as a first.

    Rapid attribution study

    Previously, such research has taken months. A study into Europe’s 2022 heatwave, which found that climate change was responsible for just over half of the 68,000 deaths, was published a year later.

    The new study has not been peer-reviewed, a sometimes lengthy process where other scientists evaluate the research, Otto said, adding that the methods it used to attribute deaths had undergone peer review and been approved.

    She said publishing studies quickly is important because the immediate aftermath of a heatwave is “when people talk about it”. That is also why the researchers focused on a sample of just 12 cities, she said, making their analysis more manageable.


    People hold umbrellas to protect themselves from the sun during an ongoing heat wave with temperatures reaching 40 degrees, in Rome, Italy, on July 6, 2025, at the Colosseo area. (Photo by Massimo Valicchia/NurPhoto)

    Previous studies from the World Weather Attribution group, which Otto co-leads, have only estimated how much hotter climate change has made a heatwave. Otto said she wanted to translate this into numbers of additional deaths because a temperature increase of a few degrees Celsius “might not sound very much”.
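
    The step from “a few degrees hotter” to “additional deaths” typically runs through an epidemiological exposure-response curve: a relative-risk function linking temperature to mortality, evaluated once at the observed temperatures and once at the cooler counterfactual ones. The sketch below is purely illustrative and is not the study’s code; the exponential risk curve, its parameters and the toy numbers are assumptions chosen only to show the shape of the calculation.

    ```python
    # Illustrative sketch (not the study's actual method or data): translating a
    # temperature increase into heat-attributable deaths via a relative-risk curve.

    import numpy as np

    def relative_risk(temp_c, mmt=21.0, beta=0.08):
        """Hypothetical risk curve: mortality risk rises exponentially above the
        minimum-mortality temperature (mmt)."""
        excess = np.maximum(temp_c - mmt, 0.0)
        return np.exp(beta * excess)

    def attributable_deaths(observed_deaths, temp_actual, temp_counterfactual):
        """Deaths attributable to the extra warming, using the attributable
        fraction AF = (RR_actual - RR_counterfactual) / RR_actual."""
        rr_actual = relative_risk(temp_actual)
        rr_counter = relative_risk(temp_counterfactual)
        af = (rr_actual - rr_counter) / rr_actual
        return observed_deaths * af

    # Toy numbers: 300 heat deaths in a city during a 38 C heatwave that climate
    # change made 3 C hotter than it would otherwise have been.
    print(attributable_deaths(300, 38.0, 35.0))
    ```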

    Otto said the reason the first study like this was carried out in Europe is because scientists have established the relationship between heat and deaths better in Europe than elsewhere. But there are parts of southern Africa, Asia and the USA where this relationship has been established by scientists, she said, so “we will probably do this again in other parts of the world”.

    But LSHTM climate professor Malcolm Mistry warned that carrying out this kind of study across the world would be “very challenging because not every public health authority wants to give out the mortality record reports for research purposes”. This data on deaths is key to establishing how many people are killed by a certain increase in temperature.

    Silent killer

    The study did not attribute any individual death to climate change and heat is generally not listed on death certificates. Most people who died had health problems exacerbated by the heat, and more than half of them were aged over 85.


    Construction workers use an umbrella on their boom lift to cover from the sun during a heatwave in the city center in Vienna, Austria, July 2, 2025. REUTERS/Lisa Leutner

    Heatwaves are a “silent killer” because the deaths mostly take place in homes and hospitals, away from public view, and are rarely reported, said Pierre Masselot from the LSHTM.

    But media reports have blamed last month’s soaring temperatures in some specific cases, such as the death of a 48-year-old builder who collapsed while laying concrete in 35C heat in the Italian city of Bologna, and that of a 53-year-old woman with a heart condition who died in Palermo. Climate Home has spoken to relatives of people who died during extreme heat in Saudi Arabia and the Gaza Strip.

    Otto said that too many media reports about heatwaves include photographs of children eating ice cream and happy people playing on the beach. “That’s a massive problem”, she said, although she added that more articles were now referring to the role of climate change in driving heatwaves.

    The researchers behind the study said ways to cope with extreme heat included installing air conditioning, improving government heatwave warnings, planting more trees, building more parks, insulating buildings and painting roofs white.

    “But at the end of the day,” said Masselot, “all these measures won’t probably be as efficient as just reducing climate change altogether [by] reducing our greenhouse gas emissions.”

    Continue Reading

  • Apple eyes US Formula 1 broadcast rights after ‘F1: The Movie’ success – World


    Apple is in talks to acquire the US broadcast rights to screen Formula 1 when the contract becomes available next year, the Financial Times reported on Wednesday, following the success of Brad Pitt-starrer “F1: The Movie”.

    The report follows the strong box office performance of Apple’s high-octane racing film “F1: The Movie”, which has grossed $293 million in its first 10 days, according to Variety and other outlets.

    The iPhone maker is challenging current US broadcaster ESPN, owned by Disney, for the Formula 1 rights next year, the FT report said, citing sources familiar with the matter.

    Reuters could not immediately confirm the report.

    Netflix’s “Formula 1: Drive to Survive” series helped boost the sport’s popularity in the United States, a momentum Apple now hopes to capitalise on.

    Several media outlets reported in February that Netflix is among the contenders for Formula 1’s US broadcasting rights from the 2026 season as ESPN’s exclusivity period to negotiate a new contract with F1 expired.

    Apple and F1 did not immediately respond to Reuters’ request for comment.

    Continue Reading

  • Sensing at quantum limits – CERN Courier


    Quantum sensors have become important tools in low-energy particle physics. Michael Doser explores opportunities to exploit their unparalleled precision at higher energies.

    Quantum sensitivity: The Axion Dark Matter Experiment (ADMX) searches for ultra-light bosonic dark matter in the 1 to 40 μeV mass range by detecting possible conversions of axions into microwave photons inside a high–quality-factor superconducting cavity. Quantum-limited amplifiers, cooled to millikelvin temperatures, push the detector’s sensitivity toward the limits set by quantum measurement noise. Credit: T Luong

    Atomic energy levels. Spin orientations in a magnetic field. Resonant modes in cryogenic, high-quality-factor radio-frequency cavities. The transition from superconducting to normal conducting, triggered by the absorption of a single infrared photon. These are all simple yet exquisitely sensitive quantum systems with discrete energy levels. Each can serve as the foundation for a quantum sensor – instruments that detect single photons, measure individual spins or record otherwise imperceptible energy shifts.

    Over the past two decades, quantum sensors have taken on leading roles in the search for ultra-light dark matter and in precision tests of fundamental symmetries. Examples include the use of atomic clocks to probe whether Earth is sweeping through oscillating or topologically structured dark-matter fields, and cryogenic detectors to search for electric dipole moments – subtle signatures that could reveal new sources of CP violation. These areas have seen rapid progress, as challenges related to detector size, noise, sensitivity and complexity have been steadily overcome, opening new phase space in which to search for physics beyond the Standard Model. Could high-energy particle physics benefit next?

    Low-energy particle physics

    Most of the current applications of quantum sensors are at low energies, where their intrinsic sensitivity and characteristic energy scales align naturally with the phenomena being probed. For example, within the Project 8 experiment at the University of Washington, superconducting sensors are being developed to tackle a longstanding challenge: to distinguish the tiny mass of the neutrino from zero (see “Quantum-noise limited” image). Inward-looking phased arrays of quantum-noise-limited microwave receivers allow spectroscopy of cyclotron radiation from beta-decay electrons as they spiral in a magnetic field. The shape of the endpoint of the spectrum is sensitive to the mass of the neutrino and such sensors have the potential to be sensitive to neutrino masses as low as 40 meV.
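
    The principle can be made concrete with the relativistic cyclotron frequency, f = eB / (2π γ m_e): the emitted frequency drops as the electron’s kinetic energy (and hence γ) rises, so a precise frequency measurement is a precise energy measurement near the spectrum endpoint. The short sketch below uses illustrative values (an electron near the 18.6 keV tritium endpoint in a 1 T field) and is not drawn from Project 8’s actual analysis.

    ```python
    # Why cyclotron radiation spectroscopy measures electron energy: the emission
    # frequency depends on the relativistic gamma factor. Field and energy values
    # here are illustrative only.

    import math

    E_REST_KEV = 510.999           # electron rest energy (keV)
    E_CHARGE = 1.602176634e-19     # C
    M_E = 9.1093837015e-31         # kg

    def cyclotron_frequency_hz(kinetic_energy_kev, b_field_tesla):
        gamma = 1.0 + kinetic_energy_kev / E_REST_KEV
        return E_CHARGE * b_field_tesla / (2.0 * math.pi * gamma * M_E)

    # Electron near the tritium beta-decay endpoint (~18.6 keV) in a ~1 T field:
    f = cyclotron_frequency_hz(18.6, 1.0)
    print(f"{f / 1e9:.2f} GHz")    # roughly 27 GHz; tiny energy shifts move this frequency
    ```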

    Quantum-noise limited

    Beyond the Standard Model, superconducting sensors play a central role in the search for dark matter. At the lowest mass scales (peV to meV), experiments search for ultralight bosonic dark-matter candidates such as axions and axion-like particles (ALPs) through excitations of the vacuum field inside high–quality–factor microwave and millimetre-wave cavities (see “Quantum sensitivity” image). In the meV range, light-shining-through-wall experiments aim to reveal brief oscillations into weakly coupled hidden-sector particles such as dark photons or ALPs, and may employ quantum sensors for detecting reappearing photons, depending on the detection strategy. In the MeV to sub-GeV mass range, superconducting sensors are used to detect individual photons and phonons in cryogenic scintillators, enabling sensitivity to dark-matter interactions via electron recoils. At higher masses, reaching into the GeV regime, superfluid helium detectors target nuclear recoils from heavier dark matter particles such as WIMPs.

    These technologies also find broad application beyond fundamental physics. For example, in superconducting and other cryogenic sensors, the ability to detect single quanta with high efficiency and ultra-low noise is essential. The same capabilities are the technological foundation of quantum communication.

    Raising the temperature

    While many superconducting quantum sensors require ultra-low temperatures of a few mK, some spin-based quantum sensors can function at or near room temperature. Spin-based sensors, such as nitrogen-vacancy (NV) centres in diamonds and polarised rubidium atoms, are excellent examples.

    NV centres are defects in the diamond lattice where a missing carbon atom – the vacancy – is adjacent to a lattice site where a carbon atom has been replaced by a nitrogen atom. The electronic spin states in NV centres have unique energy levels that can be probed by laser excitation and detection of spin-dependent fluorescence.

    Researchers are increasingly exploring how quantum-control techniques can be integrated into high-energy-physics detectors

    Rubidium is promising for spin-based sensors because it has unpaired electrons. In the presence of an external magnetic field, its atomic energy levels are split by the Zeeman effect. When optically pumped with laser light, spin-polarised “dark” sublevels – those not excited by the light – become increasingly populated. These aligned spins precess in magnetic fields, forming the basis of atomic magnetometers and other quantum sensors.
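
    The magnetometry underlying these sensors boils down to Larmor precession: polarised spins precess about an external field at a frequency proportional to that field, set by the gyromagnetic ratio. A minimal sketch, assuming an approximate literature value of about 7 Hz per nanotesla for the 87Rb ground state and illustrative field values:

    ```python
    # Larmor precession sketch: precession frequency is proportional to the
    # magnetic field. The gyromagnetic ratio is an approximate value for the
    # 87Rb ground state; the fields below are illustrative.

    GAMMA_RB87_HZ_PER_NT = 7.0   # ~7 Hz of precession per nanotesla (approximate)

    def larmor_frequency_hz(b_field_nt):
        return GAMMA_RB87_HZ_PER_NT * b_field_nt

    print(larmor_frequency_hz(50_000))   # Earth-like field (~50 uT): ~350 kHz
    print(larmor_frequency_hz(0.001))    # a 1 pT anomaly: a ~0.007 Hz shift to resolve
    ```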

    Being exquisite magnetometers, both devices make promising detectors for ultralight bosonic dark-matter candidates such as axions. Fermion spins may interact with spatial or temporal gradients of the axion field, leading to tiny oscillating energy shifts. The coupling of axions to gluons could also show up as an oscillating nuclear electric dipole moment. These interactions could manifest as oscillating energy-level shifts in NV centres, or as time-varying NMR-like spin precession signals in the atomic magnetometers.

    Large-scale detectors

    The situation is completely different in high-energy physics detectors, which require numerous interactions between a particle and a detector. Charged particles cause many ionisation events, and when a neutral particle interacts it produces charged particles that result in similarly numerous ionisations. Even if quantum control were possible within individual units of a massive detector, the number of individual quantum sub-processes to be monitored would exceed the possibilities of any realistic device.

    Increasingly, however, researchers are exploring how quantum-control techniques – such as manipulating individual atoms or spins using lasers or microwaves – can be integrated into high-energy-physics detectors. These methods could enhance detector sensitivity, tune detector response or enable entirely new ways of measuring particle properties. While these quantum-enhanced or hybrid detection approaches are still in their early stages, they hold significant promise.

    Quantum dots

    Quantum dots are nanoscale semiconductor crystals – typically a few nanometres in diameter – that confine charge carriers (electrons and holes) in all three spatial dimensions. This quantum confinement leads to discrete, atom-like energy levels and results in optical and electronic properties that are highly tunable with size, shape and composition. Originally studied for their potential in optoelectronics and biomedical imaging, quantum dots have more recently attracted interest in high-energy physics due to their fast scintillation response, narrow-band emission and tunability. Their emission wavelength can be precisely controlled through nanostructuring, making them promising candidates for engineered detectors with tailored response characteristics.

    Chromatic calorimetry

    While their radiation hardness is still under debate and needs to be resolved, engineering their composition, geometry, surface and size can yield very narrow-band (20 nm) emitters across the optical spectrum and into the infrared. Quantum dots such as these could enable the design of a “chromatic calorimeter”: a stack of quantum-dot layers, each tuned to emit at a distinct wavelength; for example red in the first layer, orange in the second and progressing through the visible spectrum to violet. Each layer would absorb higher energy photons quite broadly but emit light in a narrow spectral band. The intensity of each colour would then correspond to the energy absorbed in that layer, while the emission wavelength would encode the position of energy deposition, revealing the shower shape (see “Chromatic calorimetry” figure). Because each layer is optically distinct, hermetic isolation would be unnecessary, reducing the overall material budget.
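
    A rough sketch of how such a readout might be decoded is shown below: each narrow emission band is assigned to a layer, and the photon count in that band, divided by an assumed light yield, gives the energy deposited at that depth. The layer ordering, wavelengths and calibration constants are invented for illustration and do not come from the CERN QTI studies.

    ```python
    # Illustrative chromatic-calorimeter decoding: per-colour photon counts map to
    # energy per layer, giving a longitudinal shower profile. All constants are
    # invented for illustration.

    LAYERS = [
        # (layer depth index, emission wavelength in nm, photons detected per MeV)
        (0, 630, 120.0),   # red, front layer
        (1, 600, 115.0),   # orange
        (2, 550, 110.0),   # green
        (3, 470, 105.0),   # blue
        (4, 410, 100.0),   # violet, rear layer
    ]

    def shower_profile(photon_counts_by_wavelength):
        """Convert per-colour photon counts into energy per layer (MeV)."""
        profile = []
        for depth, wavelength, photons_per_mev in LAYERS:
            n = photon_counts_by_wavelength.get(wavelength, 0)
            profile.append((depth, n / photons_per_mev))
        return profile

    counts = {630: 6000, 600: 11500, 550: 8800, 470: 3150, 410: 500}
    for depth, energy in shower_profile(counts):
        print(f"layer {depth}: {energy:.1f} MeV")
    ```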

    Rather than improving the energy resolution of existing calorimeters, quantum dots could provide additional information on the shape and development of particle showers if embedded in existing scintillators. Initial simulations and beam tests by CERN’s Quantum Technology Initiative (QTI) support the hypothesis that the spectral intensity of quantum-dot emission can carry information about the energy and species of incident particles. Ongoing work aims to explore their capabilities and limitations.

    Beyond calorimetry, quantum dots could be formed within solid semiconductor matrices, such as gallium arsenide, to form a novel class of “photonic trackers”. Scintillation light from electronically tunable quantum dots could be collected by photodetectors integrated directly on top of the same thin semiconductor structure, such as in the DoTPiX concept. Thanks to a highly compact, radiation-tolerant scintillating pixel tracking system with intrinsic signal amplification and minimal material budget, photonic trackers could provide a scintillation-light-based alternative to traditional charge-based particle trackers.

    Living on the edge

    Low temperatures also offer opportunities at scale – and cryogenic operation is a well-established technique in both high-energy and astroparticle physics, with liquid argon (boiling point 87 K) widely used in time projection chambers and some calorimeters, and some dark-matter experiments using liquid helium (boiling point 4.2 K) to reach even lower temperatures. A range of solid-state detectors, including superconducting sensors, operate effectively at these temperatures and below, and offer significant advantages in sensitivity and energy resolution.

    Single-photon phase transitions

    Magnetic microcalorimeters (MMCs) and transition-edge sensors (TESs) operate in the narrow temperature range where a superconducting material undergoes a rapid transition from zero resistance to finite values. When a particle deposits energy in an MMC or TES, it slightly raises the temperature, causing a measurable increase in resistance. Because the transition is extremely steep, even a tiny temperature change leads to a detectable resistance change, allowing precise calorimetry.
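
    In its simplest form the calorimetry is two linear steps: a deposited energy E warms the absorber by ΔT = E/C, and the steep transition converts that into a resistance change ΔR ≈ (dR/dT)·ΔT. A minimal sketch with invented, order-of-magnitude device parameters (not those of any real TES):

    ```python
    # Transition-edge-sensor readout sketch: energy -> temperature rise ->
    # resistance change on the steep superconducting transition. All device
    # parameters are illustrative order-of-magnitude values.

    EV_TO_J = 1.602176634e-19        # joules per electronvolt

    heat_capacity_j_per_k = 1e-12    # ~1 pJ/K absorber at millikelvin temperatures
    dr_dt_ohm_per_k = 50.0           # slope of R(T) on the transition edge

    def resistance_change_ohm(energy_ev):
        delta_t = energy_ev * EV_TO_J / heat_capacity_j_per_k
        return dr_dt_ohm_per_k * delta_t

    # A 6 keV X-ray photon gives an easily measurable change of roughly 0.05 ohm:
    print(f"{resistance_change_ohm(6000):.3e} ohm")
    ```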

    Functioning at millikelvin temperatures, TESs provide much higher energy resolution than solid-state detectors made from high-purity germanium crystals, which work by collecting electron–hole pairs created when ionising radiation interacts with the crystal lattice. TESs are increasingly used in high-resolution X-ray spectroscopy of pionic, muonic or antiprotonic atoms, and in photon detection for observational astronomy, despite the technical challenges associated with maintaining ultra-low operating temperatures.

    By contrast, superconducting nanowire and microwire single-photon detectors (SNSPDs and SMSPDs) register only a change in state – from superconducting to normal conducting – allowing them to operate at higher temperatures than traditional low-temperature sensors. When made from high–critical-temperature (Tc) superconductors, operation at temperatures as high as 10 K is feasible, while maintaining excellent sensitivity to energy deposited by charged particles and ultrafast switching times on the order of a few picoseconds. Recent advances include the development of large-area devices with up to 400,000 micron-scale pixels (see “Single-photon phase transitions” figure), fabrication of high-Tc SNSPDs and successful beam tests of SMSPDs. These technologies are promising candidates for detecting milli-charged particles – hypothetical particles arising in “hidden sector” extensions of the Standard Model – or for high-rate beam monitoring at future colliders.

    Rugged, reliable and reproducible

    Quantum sensor-based experiments have vastly expanded the phase space that has been searched for new physics. This is just the beginning of the journey, as larger-scale efforts build on the initial gold rush and new quantum devices are developed, perfected and brought to bear on the many open questions of particle physics.

    Partnering with neighbouring fields such as quantum computing, quantum communication and manufacturing is of paramount importance

    To fully profit from their potential, a vigorous R&D programme is needed to scale up quantum sensors for future detectors. Ruggedness, reliability and reproducibility are key – as well as establishing “proof of principle” for the numerous imaginative concepts that have already been conceived. Challenges range from access to test infrastructures, to standardised test protocols for fair comparisons. In many cases, the largest challenge is to foster an open exchange of ideas given the numerous local developments that are happening worldwide. Finding a common language to discuss developments in different fields that at first glance may have little in common, builds on a willingness to listen, learn and exchange.

    The European Committee for Future Accelerators (ECFA) detector R&D roadmap provides a welcome framework for addressing these challenges collaboratively through the Detector R&D (DRD) collaborations established in 2023 and now coordinated at CERN. Quantum sensors and emerging technologies are covered within the DRD5 collaboration, which ties together 112 institutes worldwide, many of them leaders in their particular field. Only a third stem from the traditional high-energy physics community.

    These efforts build on the widespread expertise and enthusiastic efforts at numerous institutes and tie in with the quantum programmes being spearheaded at high-energy-physics research centres, among them CERN’s QTI. Partnering with neighbouring fields such as quantum computing, quantum communication and manufacturing is of paramount importance. The best approach may prove to be “targeted blue-sky research”: a willingness to explore completely novel concepts while keeping their ultimate usefulness for particle physics firmly in mind.

    Further reading

    C Peña et al. 2025 JINST 20 P03001.
    G Hallais et al. 2023 Nucl. Instrum. Methods Phys. Res. A 1047 167906.
    B G Oripov et al. 2023 Nature 622 730.
    L Gottardi and S Smith 2022 arXiv:2210.06617.

    Continue Reading

  • Quantum simulators in high-energy physics – CERN Courier


    From black-hole evaporation to neutron-star interiors, extreme environments and complex dynamics often outpace even the most powerful supercomputers. Enrique Rico Ortega and Sofia Vallecorsa explain how quantum computing will change that.

    High fidelity: A researcher peers into the vacuum chamber of a trapped-ion quantum computer at PSI. Credit: E Brucke/PSI

    In 1982 Richard Feynman posed a question that challenged computational limits: can a classical computer simulate a quantum system? His answer: not efficiently. The complexity of the computation increases rapidly, rendering realistic simulations intractable. To understand why, consider the basic units of classical and quantum information.

    A classical bit can exist in one of two states: |0> or |1>. A quantum bit, or qubit, exists in a superposition α|0> + β|1>, where α and β are complex amplitudes with real and imaginary parts. This superposition is the core feature that distinguishes quantum bits and classical bits. While a classical bit is either |0> or |1>, a quantum bit can be a blend of both at once. This is what gives quantum computers their immense parallelism – and also their fragility.

    The difference becomes profound with scale. Two classical bits have four possible states, and are always in just one of them at a time. Two qubits simultaneously encode a complex-valued superposition of all four states.

    Resources scale exponentially. N classical bits encode N boolean values, but N qubits encode 2^N complex amplitudes. Simulating 50 qubits with double-precision real numbers for each part of the complex amplitudes would require more than a petabyte of memory, beyond the reach of even the largest supercomputers.
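
    The arithmetic behind that memory estimate is easy to reproduce: an N-qubit state vector stores 2^N complex amplitudes, each of which needs two double-precision floats. A small back-of-envelope check:

    ```python
    # Memory cost of storing an N-qubit state vector: 2^N complex amplitudes,
    # each taking two double-precision floats (16 bytes in total).

    def state_vector_bytes(n_qubits):
        return (2 ** n_qubits) * 16

    print(state_vector_bytes(30) / 2**30, "GiB for 30 qubits")   # 16.0 GiB
    print(state_vector_bytes(50) / 2**50, "PiB for 50 qubits")   # 16.0 PiB, well over a petabyte
    ```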

    Direct mimicry

    Feynman proposed a different approach to quantum simulation. If a classical computer struggles, why not use one quantum system to emulate the behaviour of another? This was the conceptual birth of the quantum simulator: a device that harnesses quantum mechanics to solve quantum problems. For decades, this visionary idea remained in the realm of theory, awaiting the technological breakthroughs that are now rapidly bringing it to life. Today, progress in quantum hardware is driving two main approaches: analog and digital quantum simulation, in direct analogy to the history of classical computing.

    Optical tweezers

    In analog quantum simulators, the physical parameters of the simulator directly correspond to the parameters of the quantum system being studied. Think of it like a wind tunnel for aeroplanes: you are not calculating air resistance on a computer but directly observing how air flows over a model.

    A striking example of an analog quantum simulator traps excited Rydberg atoms in precise configurations using highly focused laser beams known as “optical tweezers”. Rydberg atoms have one electron excited to an energy level far from the nucleus, giving them an exaggerated electric dipole moment that leads to tunable long-range dipole–dipole interactions – an ideal setup for simulating particle interactions in quantum field theories (see “Optical tweezers” figure).

    The positions of the Rydberg atoms discretise the space inhabited by the quantum fields being modelled. At each point in the lattice, the local quantum degrees of freedom of the simulated fields are embodied by the internal states of the atoms. Dipole–dipole interactions simulate the dynamics of the quantum fields. This technique has been used to observe phenomena such as string breaking, where the force between particles pulls so strongly that the vacuum spontaneously creates new particle–antiparticle pairs. Such quantum simulations model processes that are notoriously difficult to calculate from first principles using classical computers (see “A philosophical dimension” panel).

    Universal quantum computation

    Digital quantum simulators operate much like classical digital computers, though using quantum rather than classical logic gates. While classical logic manipulates classical bits, quantum logic manipulates qubits. Because quantum logic gates obey the Schrödinger equation, they preserve information and are reversible, whereas most classical gates, such as “AND” and “OR”, are irreversible. Many quantum gates have no classical equivalent, because they manipulate phase, superposition or entanglement – a uniquely quantum phenomenon in which two or more qubits share a combined state. In an entangled system, the state of each qubit cannot be described independently of the others, even if they are far apart: the global description of the quantum state is more than the combination of the local information at every site.

    A philosophical dimension

    The discretisation of space by quantum simulators echoes the rise of lattice QCD in the 1970s and 1980s. Confronted with the non-perturbative nature of the strong interaction, Kenneth Wilson introduced a method to discretise spacetime, enabling numerical solutions to quantum chromodynamics beyond the reach of perturbation theory. Simulations on classical supercomputers have since deepened our understanding of quark confinement and hadron masses, catalysed advances in high-performance computing, and inspired international collaborations. It has become an indispensable tool in particle physics (see “Fermilab’s final word on muon g-2”).

    In classical lattice QCD, the discretisation of spacetime is just a computational trick – a means to an end. But in quantum simulators this discretisation becomes physical. The simulator is a quantum system governed by the same fundamental laws as the target theory.

    This raises a philosophical question: are we merely modelling the target theory or are we, in a limited but genuine sense, realising it? If an array of neutral atoms faithfully mimics the dynamical behaviour of a specific gauge theory, is it “just” a simulation, or is it another manifestation of that theory’s fundamental truth? Feynman’s original proposal was, in a sense, about using nature to compute itself. Quantum simulators bring this abstract notion into concrete laboratory reality.

    By applying sequences of quantum logic gates, a digital quantum computer can model the time evolution of any target quantum system. This makes them flexible and scalable in pursuit of universal quantum computation – logic able to run any algorithm allowed by the laws of quantum mechanics, given enough qubits and sufficient time. Universal quantum computing requires only a small subset of the many quantum logic gates that can be conceived, for example Hadamard, T and CNOT. The Hadamard gate creates a superposition: |0> → (|0> + |1>)/√2. The T gate applies a 45° phase rotation: |1> → e^(iπ/4)|1>. And the CNOT gate entangles qubits by flipping a target qubit if a control qubit is |1>. These three suffice to prepare any quantum state from a trivial reference state: |ψ> = U_1 U_2 U_3 … U_N |0000…000>.
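
    As an illustration of these three gates, the sketch below writes them as explicit matrices and prepares a Bell state, the simplest entangled state, from |00> by applying a Hadamard followed by a CNOT. This is a generic textbook construction, not code from any of the simulators described here.

    ```python
    # Hadamard, T and CNOT as explicit matrices, plus a two-qubit example:
    # H on the first qubit followed by CNOT turns |00> into (|00> + |11>)/sqrt(2).

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)          # Hadamard
    T = np.array([[1, 0], [0, np.exp(1j * np.pi / 4)]])   # 45-degree phase rotation (shown for completeness)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])                        # flip target if control qubit is |1>

    I = np.eye(2)
    ket00 = np.array([1, 0, 0, 0], dtype=complex)          # |00>

    state = CNOT @ np.kron(H, I) @ ket00
    print(state)   # [0.707, 0, 0, 0.707] -> the Bell state (|00> + |11>)/sqrt(2)
    ```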

    Trapped ions

    To bring frontier physics problems within the scope of current quantum computing resources, the distinction between analog and digital quantum simulations is often blurred. The complexity of simulations can be reduced by combining digital gate sequences with analog quantum hardware that aligns with the interaction patterns relevant to the target problem. This is feasible as quantum logic gates usually rely on native interactions similar to those used in analog simulations. Rydberg atoms are a common choice. Alongside them, two other technologies are becoming increasingly dominant in digital quantum simulation: trapped ions and superconducting qubit arrays.

    Trapped ions offer the greatest control. Individual charged ions can be suspended in free space using electromagnetic fields. Lasers manipulate their quantum states, inducing interactions between them. Trapped-ion systems are renowned for their high fidelity (meaning operations are accurate) and long coherence times (meaning they maintain their quantum properties for longer), making them excellent candidates for quantum simulation (see “Trapped ions” figure).

    Superconducting qubit arrays promise the greatest scalability. These tiny superconducting circuit materials act as qubits when cooled to extremely low temperatures and manipulated with microwave pulses. This technology is at the forefront of efforts to build quantum simulators and digital quantum computers for universal quantum computation (see “Superconducting qubits” figure).

    The noisy intermediate-scale quantum era

    Despite rapid progress, these technologies are at an early stage of development and face three main limitations.

    The first problem is that qubits are fragile. Interactions with their environment quickly compromise their superposition and entanglement, making computations unreliable. Preventing “decoherence” is one of the main engineering challenges in quantum technology today.

    The second challenge is that quantum logic gates have low fidelity. Over a long sequence of operations, errors accumulate, corrupting the result.

    Finally, quantum simulators currently have a very limited number of qubits – typically only a few hundred. This is far fewer than what is needed for high-energy physics (HEP) problems.

    Superconducting qubits

    This situation is known as the noisy “intermediate-scale” quantum era: we are no longer doing proof-of-principle experiments with a few tens of qubits, but neither can we control thousands of them. These limitations mean that current digital simulations are often restricted to “toy” models, such as QED simplified to have just one spatial and one time dimension. Even with these constraints, small-scale devices have successfully reproduced non-perturbative aspects of the theories in real time and have verified the preservation of fundamental physical principles such as gauge invariance, the symmetry that underpins the fundamental forces of the Standard Model.

    Quantum simulators may chart a similar path to classical lattice QCD, but with even greater reach. Lattice QCD struggles with real-time evolution and finite-density physics due to the infamous “sign problem”, wherein quantum interference between classically computed amplitudes causes exponentially worsening signal-to-noise ratios. This renders some of the most interesting problems unsolvable on classical machines.

    Quantum simulators do not suffer from the sign problem because they evolve naturally in real-time, just like the physical systems they emulate. This promises to open new frontiers such as the simulation of early-universe dynamics, black-hole evaporation and the dense interiors of neutron stars.

    Quantum simulators will powerfully augment traditional theoretical and computational methods, offering profound insights when Feynman diagrams become intractable, when dealing with real-time dynamics and when the sign problem renders classical simulations exponentially difficult. Just as the lattice revolution required decades of concerted community effort to reach its full potential, so will the quantum revolution, but the fruits will again transform the field. As the aphorism attributed to Mark Twain goes: history never repeats itself, but it often rhymes.

    Quantum information

    One of the most exciting and productive developments in recent years is the unexpected, yet profound, convergence between HEP and quantum information science (QIS). For a long time these fields evolved independently. HEP explored the universe’s smallest constituents and grandest structures, while QIS focused on harnessing quantum mechanics for computation and communication. One of the pioneers in studying the interface between these fields was John Bell, a theoretical physicist at CERN.

    Just as the lattice revolution needed decades of concerted community effort to reach its full potential, so will the quantum revolution

    HEP and QIS are now deeply intertwined. As quantum simulators advance, there is a growing demand for theoretical tools that combine the rigour of quantum field theory with the concepts of QIS. For example, tensor networks were developed in condensed-matter physics to represent highly entangled quantum states, and have now found surprising applications in lattice gauge theories and “holographic dualities” between quantum gravity and quantum field theory. Another example is quantum error correction – a vital QIS technique to protect fragile quantum information from noise, and now a major focus for quantum simulation in HEP.

    This cross-disciplinary synthesis is not just conceptual; it is becoming institutional. Initiatives like the US Department of Energy’s Quantum Information Science Enabled Discovery (QuantISED) programme, CERN’s Quantum Technology Initiative (QTI) and Europe’s Quantum Flagship are making substantial investments in collaborative research. Quantum algorithms will become indispensable for theoretical problems just as quantum sensors are becoming indispensable to experimental observation (see “Sensing at quantum limits”).

    The result is the emergence of a new breed of scientist: one equally fluent in the fundamental equations of particle physics and the practicalities of quantum hardware. These “hybrid” scientists are building the theoretical and computational scaffolding for a future where quantum simulation is a standard, indispensable tool in HEP. 

    Further reading

    M C Bañuls et al. 2020 Eur. Phys. J. D 74 165.
    Y Alexeev et al. 2021 PRX Quantum 2 017001.
    C W Bauer et al. 2023 PRX Quantum 4 027001.
    A Di Meglio et al. 2024 PRX Quantum 5 037001.
    T A Cochran et al. 2025 Nature 642 315.
    D González-Cuadra et al. 2025 Nature 642 321.

    Continue Reading

  • Hidden DNA-sized crystals in cosmic ice could rewrite water—and life itself


    “Space ice” contains tiny crystals and is not, as previously assumed, a completely disordered material like liquid water, according to a new study by scientists at UCL (University College London) and the University of Cambridge.

    Ice in space is different to the crystalline (highly ordered) form of ice on Earth. For decades, scientists have assumed it is amorphous (without a structure), with colder temperatures meaning it does not have enough energy to form crystals when it freezes.

    In the new study, published in Physical Review B, researchers investigated the most common form of ice in the Universe, low-density amorphous ice, which exists as the bulk material in comets, on icy moons and in clouds of dust where stars and planets form.

    They found that computer simulations of this ice best matched measurements from previous experiments if the ice was not fully amorphous but contained tiny crystals (about three nanometers wide, slightly wider than a single strand of DNA) embedded within its disordered structures.

    In experimental work, they also re-crystallized (i.e. warmed up) real samples of amorphous ice that had formed in different ways. They found that the final crystal structure varied depending on how the amorphous ice had originated. If the ice had been fully amorphous (fully disordered), the researchers concluded, it would not retain any imprint of its earlier form.

    Lead author Dr Michael B. Davies, who did the work as part of his PhD at UCL Physics & Astronomy and the University of Cambridge, said: “We now have a good idea of what the most common form of ice in the Universe looks like at an atomic level.

    “This is important as ice is involved in many cosmological processes, for instance in how planets form, how galaxies evolve, and how matter moves around the Universe.”

    The findings also have implications for one speculative theory about how life on Earth began. According to this theory, known as Panspermia, the building blocks of life were carried here on an ice comet, with low-density amorphous ice the space shuttle material in which ingredients such as simple amino acids were transported.

    Dr Davies said: “Our findings suggest this ice would be a less good transport material for these origin of life molecules. That is because a partly crystalline structure has less space in which these ingredients could become embedded.

    “The theory could still hold true, though, as there are amorphous regions in the ice where life’s building blocks could be trapped and stored.”

    Co-author Professor Christoph Salzmann, of UCL Chemistry, said: “Ice on Earth is a cosmological curiosity due to our warm temperatures. You can see its ordered nature in the symmetry of a snowflake.

    “Ice in the rest of the Universe has long been considered a snapshot of liquid water — that is, a disordered arrangement fixed in place. Our findings show this is not entirely true.

    “Our results also raise questions about amorphous materials in general. These materials have important uses in much advanced technology. For instance, glass fibers that transport data long distances need to be amorphous, or disordered, for their function. If they do contain tiny crystals and we can remove them, this will improve their performance.”

    For the study, the researchers used two computer models of water. They froze these virtual “boxes” of water molecules by cooling to -120 degrees Centigrade at different rates. The different rates of cooling led to varying proportions of crystalline and amorphous ice.

    They found that ice that was up to 20% crystalline (and 80% amorphous) appeared to closely match the structure of low-density amorphous ice as found in X-ray diffraction studies (that is, where researchers fire X-rays at the ice and analyse how these rays are deflected).
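
    The logic of that comparison can be illustrated with a toy fit: mix a sharp “crystalline” pattern and a broad “amorphous” one in varying proportions and ask which mixture best reproduces a target diffraction pattern in a least-squares sense. The synthetic Gaussian peaks and noise below are invented for illustration and are not the study’s models or data:

    ```python
    # Toy mixture fit: which crystalline fraction best matches a target pattern?
    # Patterns are synthetic Gaussians, not real diffraction data.

    import numpy as np

    q = np.linspace(1.0, 4.0, 400)   # scattering-vector axis (arbitrary units)

    def peak(center, width, height):
        return height * np.exp(-((q - center) ** 2) / (2 * width ** 2))

    crystalline = peak(1.7, 0.03, 1.0) + peak(2.9, 0.04, 0.6)   # sharp Bragg-like peaks
    amorphous = peak(1.7, 0.30, 0.5) + peak(2.9, 0.35, 0.3)     # broad halos

    # Pretend "experiment": 20% crystalline plus a little noise.
    rng = np.random.default_rng(0)
    experiment = 0.2 * crystalline + 0.8 * amorphous + rng.normal(0, 0.01, q.size)

    fractions = np.linspace(0, 1, 101)
    scores = [np.sum((f * crystalline + (1 - f) * amorphous - experiment) ** 2)
              for f in fractions]
    best = fractions[int(np.argmin(scores))]
    print(f"best-fit crystalline fraction: {best:.2f}")   # ~0.20
    ```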

    Using another approach, they created large “boxes” containing many small ice crystals packed closely together. The simulation then disordered the regions between the crystals, producing structures very similar to those from the first approach, with about 25% crystalline ice.

    In additional experimental work, the research team created real samples of low-density amorphous ice in a range of ways, from depositing water vapor on to an extremely cold surface (how ice forms on dust grains in interstellar clouds) to warming up what is known as high-density amorphous ice (ice that has been crushed at extremely cold temperatures).

    The team then gently heated these amorphous ices so they had the energy to form crystals. They noticed differences in the ices’ structure depending on their origin — specifically, there was variation in the proportion of molecules stacked in a six-fold (hexagonal) arrangement.

    This was indirect evidence, they said, that low-density amorphous ice contained crystals. If it was fully disordered, they concluded, the ice would not retain any memory of its earlier forms.

    The research team said their findings raised many additional questions about the nature of amorphous ices — for instance, whether the size of crystals varied depending on how the amorphous ice formed, and whether a truly amorphous ice was possible.

    Amorphous ice was first discovered in its low-density form in the 1930s when scientists condensed water vapor on a metal surface cooled to -110 degrees Centigrade. Its high-density state was discovered in the 1980s when ordinary ice was compressed at nearly -200 degrees Centigrade.

    The research team behind the latest paper, based both at UCL and the University of Cambridge, discovered medium-density amorphous ice in 2023. This ice was found to have the same density as liquid water (and would therefore neither sink nor float in water).

    Co-author Professor Angelos Michaelides, from the University of Cambridge, said: “Water is the foundation of life but we still do not fully understand it. Amorphous ices may hold the key to explaining some of water’s many anomalies.”

    Dr Davies said: “Ice is potentially a high-performance material in space. It could shield spacecraft from radiation or provide fuel in the form of hydrogen and oxygen. So we need to know about its various forms and properties.”

    Continue Reading