AL-S Pharma’s lead ALS asset, AP-101, is heading to Phase III after the drug demonstrated disease-modifying capacity in a Phase II study.
AL-S Pharma is planning a registrational study for its amyotrophic lateral sclerosis (ALS) monoclonal antibody (mAb), AP-101, after the drug met its efficacy and safety endpoints in a mid-stage trial.
During the Phase II study (NCT05039099), ALS patients treated early with AP-101 showed prolonged survival and a delayed need for ventilatory support after 12 months of treatment, compared with those who received placebo for six months followed by AP-101 for six months. This benefit was observed in both the sporadic ALS and superoxide dismutase 1 (SOD1) mutation carrier cohorts.
The SOD1-targeting mAb also stabilised disease, as measured by King’s staging, which qualitatively categorises disease progression based on the condition of the bulbar, upper-limb, lower-limb and diaphragm regions.
AP-101 also reduced functional decline in patients with elevated misfolded SOD1 levels at baseline, as well as in those with SOD1 mutations.
Alongside the drug’s impact on disease progression and stabilisation, patients receiving AP-101 showed changes in neurofilament biomarkers, which AL-S claims “aligns with AP-101’s clinical benefit”.
While AL-S did not provide specifics on the drug’s safety profile, the Swiss biotech noted that adverse events (AEs) were similar between the placebo and treatment groups, and no antibody responses to AP-101 were observed during the 12-month study.
AL-S presented the data at the 36th International Symposium on ALS/MND, which was held from 5 to 7 December in San Diego; however, the company had previously confirmed the drug met the trial endpoints.
According to the CEO of AL-S Pharma, Michael Salzmann, the biotech now plans to discuss next steps for AP-101 with regulators in the coming months, as the company prepares to initiate a confirmatory Phase III study.
Homing in on SOD1
While ALS has typically been a challenging indication to treat, there are now four therapies that have gained approval from the US Food and Drug Administration (FDA).
This includes Biogen and Ionis Pharmaceuticals’ SOD1-targeting antisense oligonucleotide, Qalsody (tofersen), which gained accelerated approval from the FDA in 2023 despite the Phase III VALOR trial missing its primary endpoint of improved Revised ALS Functional Rating Scale (ALSFRS-R) scores over 28 weeks. The FDA granted the approval based on reductions in plasma neurofilament light (NfL), a biomarker of neuronal injury.
While Qalsody’s results have been mixed thus far, SOD1 has been a target of interest across the ALS research space, with four drugs, including AP-101, in active development, according to GlobalData’s Pharmaceutical Intelligence Center.
GlobalData is the parent company of Clinical Trials Arena.
The ALS Association currently estimates that SOD1 gene mutations are observed in 10-20% of familial ALS cases, while 1-2% of sporadic ALS cases are linked to this mutation.
If AL-S’s SOD1-directed AP-101 were to get the FDA go-ahead, it would offer an alternative dosing option to Qalsody, as AP-101 can be administered intravenously, while Qalsody requires patients to undergo an intrathecal injection.
According to GlobalData’s patient-based forecast, Qalsody will generate $68m in sales for Biogen and Ionis in 2029.
Quantum Source released a comprehensive technical report, developed with The Quantum Insider, that compares all major qubit modalities and outlines engineering pathways toward fault-tolerant quantum computing.
The report highlights that the field has moved from theoretical exploration to practical engineering, with recent demonstrations from Google and Quantinuum showing logical qubits outperforming physical ones.
It introduces a unified framework comparing qubit systems by qubit carrier and computational model, showing that while no modality currently dominates, hybrid approaches have the potential to overcome key scalability limits.
A case study on Quantum Source’s deterministic atom–photon platform shows how their design replaces probabilistic photonic entanglement with efficient, repeatable atom-mediated processes, reducing hardware overhead and enabling scalable, modular architectures for future fault-tolerant systems.
Quantum Source, a pioneering company developing scalable photonic quantum computing systems, has released a new industry report, From Qubits to Logic: Engineering Fault-Tolerant Quantum Systems, offering one of the most comprehensive technical assessments to date of the global push toward fault-tolerant quantum computing (FTQC).
The report, developed in partnership with The Quantum Insider, synthesizes progress across all major qubit modalities and introduces a comparative framework linking physical qubit performance, quantum-error-correction (QEC) strategies, and scalability. It emphasizes that the path to fault tolerance has shifted from a theoretical goal to an engineering challenge, defined by how well systems scale and how they integrate control, architecture, and error-correction design.
The Transition from Theory to Demonstration
The report defines fault tolerance as the capability of a quantum computer to perform arbitrarily long computations reliably, even when each underlying physical gate or measurement is prone to error. Achieving this requires encoding logical qubits across many physical qubits and applying continuous error detection and correction.
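To make the encoding idea concrete, here is a minimal Python sketch (an illustration of ours, not code from the report) of the classical intuition behind a three-qubit repetition code: one logical bit is stored redundantly across three physical bits, and a majority vote detects and corrects any single flip. Real QEC codes such as the surface code operate on quantum states via stabilizer measurements, but the redundancy-plus-correction logic is analogous.

```python
import random

def encode(logical_bit):
    """Encode one logical bit redundantly across three physical bits."""
    return [logical_bit] * 3

def apply_noise(physical_bits, flip_prob):
    """Flip each physical bit independently with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in physical_bits]

def correct_and_decode(physical_bits):
    """Majority vote: tolerates any single bit flip."""
    return int(sum(physical_bits) >= 2)

# The logical error rate falls below the physical rate when flip_prob is
# small enough -- the essence of operating "below threshold".
random.seed(0)
trials, flip_prob = 100_000, 0.05
logical_errors = sum(
    correct_and_decode(apply_noise(encode(0), flip_prob)) != 0
    for _ in range(trials)
)
print(f"physical error rate: {flip_prob}")
print(f"logical error rate:  {logical_errors / trials:.4f}")  # ~3p^2(1-p)+p^3 ~ 0.0073
```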
As explained in the report, recent milestones such as Google’s Willow processor achieving error suppression below the surface-code threshold and Quantinuum’s demonstration of logical gates outperforming physical ones confirm that the field has entered a new phase. Logical qubits are now capable of surpassing physical fidelity, an essential step toward scalable, useful quantum machines.
“For more than two decades, the theoretical foundations of quantum error correction have matured,” said Michael Slutsky, Head of Theory at Quantum Source. “In recent years, the first functional logical elements have been experimentally demonstrated across a broad range of hardware platforms, showing steadily improving performance and marking real progress toward the fault-tolerant era. We’re not there yet—but the future is coming into focus.”
A Unified Framework for Comparing Qubit Modalities
The report organizes today’s quantum hardware landscape along two fundamental axes:
The physical nature of the qubit carrier (matter-based vs. photon-based), and
The computational model (circuit-based vs. measurement-based quantum computing, MBQC).
This two-axis perspective clarifies both the constraints and opportunities inherent to each modality (a toy encoding of the classification follows the list):
Superconducting qubits – Fast gate speeds and mature fabrication, but cryogenic wiring and variability limit scaling.
Trapped-ion qubits – Record-setting fidelities and all-to-all connectivity, yet scaling is constrained by mode crowding and control complexity.
Neutral-atom qubits – Large, reconfigurable arrays with second-scale coherence, but two-qubit fidelities must exceed 99.9%.
Semiconductor spin qubits – CMOS compatibility and density advantages offset by device variability and cryogenic control challenges.
Photonic qubits – Operate at room temperature and excel at networking, but photon loss and probabilistic entanglement limit scalability.
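As a rough illustration (our own toy encoding, not an artifact of the report), the two-axis classification can be captured in a small lookup table; the carrier and model assignments below are a simplified reading of the summaries above:

```python
# Toy encoding of the report's two-axis framework. The assignments are a
# simplified reading of the modality summaries above, not an authoritative
# taxonomy from the report itself.
MODALITIES = {
    # modality:            (qubit carrier, typical computational model)
    "superconducting":     ("matter", "circuit-based"),
    "trapped-ion":         ("matter", "circuit-based"),
    "neutral-atom":        ("matter", "circuit-based"),
    "semiconductor-spin":  ("matter", "circuit-based"),
    "photonic":            ("photon", "measurement-based (MBQC)"),
}

def by_carrier(carrier):
    """List modalities sharing a qubit carrier, e.g. 'photon'."""
    return [m for m, (c, _) in MODALITIES.items() if c == carrier]

print(by_carrier("matter"))  # the four matter-based platforms
print(by_carrier("photon"))  # ['photonic']
```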
The comparative framework reveals that no modality yet leads the path to fault tolerance. Each platform carries its own engineering trade-offs, from coherence limits to fabrication challenges, making progress uneven and interdependent. While hybrid approaches remain unproven, they represent a promising area of exploration, particularly for addressing bottlenecks that no single technology can overcome alone. It is within this emerging space that Quantum Source is positioning its deterministic atom–photon architecture.
At the center of the report is a case study on Quantum Source’s hybrid atom–photon platform, which replaces probabilistic two-photon fusion with deterministic atom-mediated entanglement.
In conventional measurement-based photonic computing, millions of synchronized photon sources and switches are needed to compensate for low entangling-gate success rates. Quantum Source’s design solves this by using single trapped atoms as reusable entanglement mediators (a toy simulation of the sequence follows the list):
A photon is first entangled with an atom inside a high-finesse optical cavity.
The atomic state is then mapped onto a second photon, entangling the two photons deterministically through the shared atomic state.
The same atom can repeat this process, efficiently generating large photonic cluster states.
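To see why the shared atomic state does the work, here is a minimal state-vector sketch in Python/NumPy (our own idealized, lossless model, not Quantum Source’s implementation): photon 1 is entangled with the atom, then the atomic state is transferred onto a fresh photon 2 via two CNOTs, leaving the photons in a Bell state and the atom back in its ground state for reuse.

```python
import numpy as np

# Qubit ordering: photon1 (q0), atom (q1), photon2 (q2); all start in |0>.
I = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

def kron(*ops):
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

state = np.zeros(8)
state[0] = 1.0  # |000>

# Step 1: entangle photon1 with the atom in the cavity (modeled here as
# H on photon1 followed by CNOT photon1 -> atom).
state = kron(H, I, I) @ state
state = kron(CNOT, I) @ state   # CNOT on (photon1, atom)

# Step 2: map the atomic state onto photon2 (which starts in |0>).
# Two CNOTs transfer the state and disentangle the atom.
cnot_atom_p2 = kron(I, CNOT)    # CNOT on (atom, photon2)

# CNOT photon2 -> atom (control q2, target q1), built as a permutation:
# when q2 = 1, flip q1, i.e. swap |001><->|011| and |101><->|111>.
cnot_p2_atom = np.eye(8)
for a, b in [(0b001, 0b011), (0b101, 0b111)]:
    cnot_p2_atom[[a, b]] = cnot_p2_atom[[b, a]]

state = cnot_p2_atom @ (cnot_atom_p2 @ state)

# Result: photons 1 and 2 share the Bell state (|00> + |11>)/sqrt(2),
# while the atom is back in |0> and ready to mediate the next step.
print(np.round(state, 3))  # amplitudes ~0.707 at |000> and |101>
```

In this toy model the atom ends each round disentangled in |0⟩, which is what allows a single atom to be reused to grow a large photonic cluster state photon by photon.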
This deterministic atom–photon mechanism reduces hardware overhead, requiring fewer photon sources, switches, and detectors. It also maintains full compatibility with room-temperature photonic systems.
“By harnessing deterministic photon–atom interactions on a chip, we can generate entangled photonic states with unprecedented efficiency, at room temperature, in a compact and scalable architecture,” said Oded Melamed, CEO of Quantum Source.
The report concludes that this hybrid approach “directly addresses the primary photonic bottleneck of two-photon entanglement” and could enable modular, distributed FTQC architectures where matter qubits handle deterministic logic and photons manage long-distance communication.
Implications for Industry and Policy
The paper frames FTQC as both a technological and strategic inflection point.
For industry, success will depend on co-optimizing hardware, software, and error-correction stacks to minimize overhead. For investors and policymakers, diversification across hardware modalities is essential: each contributes unique value to the developing ecosystem.
The report forecasts that within the next decade, logical qubits will likely outperform physical ones and million-qubit systems will become a realistic engineering target. Hybrid innovations such as Quantum Source’s atom–photon platform may play an essential role in achieving those goals.
About the Report
From Qubits to Logic: Engineering Fault-Tolerant Quantum Systems is a 2025 technical white paper by Quantum Source, developed in partnership with The Quantum Insider.
The report presents a comprehensive comparative analysis of major qubit modalities and introduces a framework for evaluating fault-tolerant scalability across hardware classes. It includes expert commentary from leading researchers and references to recent experimental breakthroughs spanning superconducting, ion-trap, neutral-atom, spin, and photonic platforms.
For more information or to access the full report, visit this link.