Blog

  • Global Virus Network announces the addition of three new Centers of Excellence

    The Global Virus Network (GVN), a coalition of leading medical virologists representing 80+ Centers of Excellence and Affiliates in 40+ countries, today announced the addition of three new Centers of Excellence: the USF Health…

    Continue Reading

  • ‘It’s dark in the US right now. But I turn on a light, you know?’: Mavis Staples on Prince, Martin Luther King and her 75-year singing career | Mavis Staples

    Can you speak about the array of songs and artists on your new record? What kind of message and lyrics do you want to sing at this point in your life? steve_bayley
    The first song I got for the album was Human Mind, written by Hozier and Allison…

    Continue Reading

  • OpenSpliceAI provides an efficient modular implementation of SpliceAI enabling easy retraining across nonhuman species

    We developed OpenSpliceAI, a modular Python toolkit designed as an open-source implementation of SpliceAI, to which we added several key enhancements. The framework replicates the core logic of the SpliceAI model while optimizing prediction efficiency and variant effect analysis, such as acceptor and donor gains or losses, using pre-trained models. Our benchmarks show substantial computational advantages over SpliceAI, with faster processing, lower memory usage, and improved GPU efficiency (Figure 2B, Figure 2—figure supplement 6). These improvements are driven by our optimized PyTorch implementation, which employs dynamic computation graphs and on-demand GPU memory allocation – allowing memory to be allocated and freed as needed – in contrast to SpliceAI’s static, Keras-based TensorFlow approach, which pre-allocates memory for the worst-case input size. In SpliceAI, this rigid memory allocation leads to high memory overhead and frequent out-of-memory errors when predictions are looped over large datasets. Additionally, OpenSpliceAI leverages streamlined data handling and enhanced parallelization through batch prediction and multiprocessing, automatically distributing tasks across available threads. Together, these features prevent the memory pitfalls common in SpliceAI and make OpenSpliceAI a more scalable and efficient solution for large-scale genomic analysis.
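
    As a rough illustration of the batched, on-demand prediction style described above, the sketch below runs a stand-in PyTorch model over mini-batches inside torch.no_grad(), allocating GPU memory per batch rather than for a worst-case input. The model, tensor shapes, and names here are placeholders, not OpenSpliceAI’s actual API.

    ```python
    # Batched, on-demand GPU prediction sketch (illustrative only; the Conv1d
    # stand-in and tensor shapes are assumptions, not OpenSpliceAI's actual API).
    import torch
    from torch.utils.data import DataLoader, TensorDataset

    device = "cuda" if torch.cuda.is_available() else "cpu"

    # Pretend input: 256 one-hot-style sequences, 4 channels (A/C/G/T), length 15,000
    x = torch.randn(256, 4, 15000)
    loader = DataLoader(TensorDataset(x), batch_size=32)

    model = torch.nn.Conv1d(4, 3, kernel_size=1).to(device)  # stand-in for the splice-site CNN
    model.eval()

    preds = []
    with torch.no_grad():                          # no autograd graph -> lower memory use
        for (batch,) in loader:
            out = model(batch.to(device))          # GPU memory is allocated per batch...
            preds.append(out.softmax(dim=1).cpu()) # ...and freed once results move to the CPU
    preds = torch.cat(preds)                       # (256, 3, 15000): neither/acceptor/donor scores
    ```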

    It is important to note that even though OpenSpliceAI and SpliceAI share the same model architecture, the released trained models are not identical. The variability observed between our models and the original SpliceAI – and even among successive training runs using the same code and data – can be attributed to several sources of inherent randomness. First, weight initialization is performed randomly for many layers, which means that different initial weights can lead to distinct convergence paths and final model parameters. Second, the process of data shuffling alters the composition of mini-batches during training, impacting both the training dynamics and the statistics computed in batch normalization layers. Although batch normalization is deterministic for a fixed mini-batch, its reliance on batch statistics introduces variability due to the random sampling of data. Finally, OpenSpliceAI employs the AdamW optimizer (Loshchilov and Hutter, 2019), which incorporates exponential moving averages of the first and second moments of the gradients. This mechanism serves a momentum-like role, contributing to an adaptive learning process that is inherently stochastic. Moreover, subtle differences in the order of operations or floating-point arithmetic, particularly in distributed computing environments, can further amplify this stochastic behavior. Together, these factors contribute to the observed nondeterministic behavior, resulting in slight discrepancies between our trained models and the original SpliceAI, as well as among successive training sessions under identical conditions.
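
    For readers who want to pin down these sources of randomness in their own runs, the sketch below shows the usual PyTorch controls (seeding, deterministic cuDNN kernels) together with an AdamW optimizer. The settings are illustrative assumptions, not the training configuration used for the released models.

    ```python
    # Pinning the randomness sources mentioned above (sketch; settings are illustrative).
    import random
    import numpy as np
    import torch

    def seed_everything(seed: int = 42) -> None:
        random.seed(seed)                  # Python-level shuffling
        np.random.seed(seed)               # NumPy sampling
        torch.manual_seed(seed)            # weight initialization
        torch.cuda.manual_seed_all(seed)
        torch.backends.cudnn.deterministic = True   # avoid nondeterministic cuDNN kernels
        torch.backends.cudnn.benchmark = False

    seed_everything()

    model = torch.nn.Linear(10, 2)         # stand-in model; its init now depends on the seed
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=1e-2)
    # AdamW keeps exponential moving averages of the gradient moments, so the exact
    # trajectory still depends on batch order and floating-point accumulation order.
    ```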

    OpenSpliceAI empowers researchers to adapt the framework to many other species by including modules that enable easy retraining. For closely related species such as mice, our retrained model demonstrated comparable or slightly better precision than the human-based SpliceAI model. For more distant species such as A. thaliana, whose genomic structure differs substantially from humans, retraining OpenSpliceAI yielded much greater improvements in accuracy. Our initial release includes models trained on the human MANE genome annotation and four additional species: mouse, zebrafish, honeybee, and A. thaliana. We also evaluated pre-training on mouse (OSAIMouse), honeybee (OSAIHoneybee), zebrafish (OSAIZebrafish), and Arabidopsis (OSAIArabidopsis) followed by fine-tuning on the human MANE dataset. While cross-species pre-training substantially accelerated convergence during fine-tuning, the final human splicing prediction accuracy was comparable to that of a model trained from scratch on human data. This result suggests that our architecture captures the relevant splicing features from human training data alone and thus gains little or no benefit from cross-species transfer learning in this context (see Figure 4—figure supplement 5).
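
    A hedged sketch of what such retraining or fine-tuning can look like in plain PyTorch is shown below: pretrained weights are loaded (when available) and training continues on data from another species. The checkpoint path, the toy model, and the toy mouse data are all hypothetical stand-ins rather than OpenSpliceAI’s actual interface.

    ```python
    # Cross-species fine-tuning sketch: load human-trained weights (if present) and
    # keep training on another species. Checkpoint path, model, and data are hypothetical.
    import os
    import torch

    def build_model() -> torch.nn.Module:
        # stand-in for the SpliceAI-style dilated CNN
        return torch.nn.Sequential(
            torch.nn.Conv1d(4, 32, kernel_size=11, padding=5),
            torch.nn.ReLU(),
            torch.nn.Conv1d(32, 3, kernel_size=1),
        )

    model = build_model()
    ckpt = "osai_mane_weights.pt"                      # hypothetical human-MANE checkpoint
    if os.path.exists(ckpt):
        model.load_state_dict(torch.load(ckpt, map_location="cpu"))

    # toy "mouse" batch: 8 one-hot-ish sequences with per-base labels in {0, 1, 2}
    x = torch.randint(0, 2, (8, 4, 1000)).float()
    y = torch.randint(0, 3, (8, 1000))
    mouse_loader = [(x, y)]

    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)   # smaller LR for fine-tuning
    loss_fn = torch.nn.CrossEntropyLoss()

    for seqs, labels in mouse_loader:
        optimizer.zero_grad()
        logits = model(seqs)               # (batch, 3, length)
        loss = loss_fn(logits, labels)     # labels: (batch, length) class indices
        loss.backward()
        optimizer.step()
    ```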

    OpenSpliceAI also includes modules for transfer learning, allowing researchers to initialize models with weights learned on other species. In our transfer learning experiments, models transferred from human to other species displayed faster convergence and higher stability, with potential for increased accuracy. We also incorporate model calibration via temperature scaling, providing better alignment between predicted probabilities and empirical distributions.
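
    Temperature scaling itself is a small, generic procedure: a single scalar T is fit on held-out logits so that softmax(logits / T) better matches the observed labels. The sketch below is an illustrative implementation of that general technique, not necessarily the exact routine used in OpenSpliceAI.

    ```python
    # Temperature-scaling sketch: fit one scalar T on held-out logits so that
    # softmax(logits / T) is better calibrated. Illustrative, not OpenSpliceAI's code.
    import torch

    def fit_temperature(logits: torch.Tensor, labels: torch.Tensor, iters: int = 200) -> float:
        # logits: (N, 3) per-site scores; labels: (N,) class indices from a validation set
        log_t = torch.zeros(1, requires_grad=True)          # optimize log(T) so T stays positive
        opt = torch.optim.LBFGS([log_t], lr=0.1, max_iter=iters)
        nll = torch.nn.CrossEntropyLoss()

        def closure():
            opt.zero_grad()
            loss = nll(logits / log_t.exp(), labels)
            loss.backward()
            return loss

        opt.step(closure)
        return log_t.exp().item()

    # toy example: deliberately over-confident logits typically yield T > 1
    logits = torch.randn(1000, 3) * 5
    labels = torch.randint(0, 3, (1000,))
    T = fit_temperature(logits, labels)
    calibrated = torch.softmax(logits / T, dim=1)
    ```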

    The ISM study revealed that OSAIMANE and SpliceAI made predictions using very similar sets of motifs (Figure 6B). Across several experiments, we note that SpliceAI exhibits an inherent bias near the starts and ends of transcripts, which are padded with flanking Ns (as was done in the original study): it predicts donor and acceptor sites in these boundary regions with an extremely high signal that disappears when the sequence is padded with the actual genomic sequence instead. For example, the model correctly predicted the first donor site of the CFTR gene when the gene’s boundaries were flanked with Ns; however, when we replaced those Ns with the actual DNA sequence upstream of the gene boundary, the signal all but disappeared, as seen in Figure 6D. This suggests a bias resulting from the way the model is trained. In our ISM benchmarks, we therefore chose not to use flanking Ns unless explicitly recreating a study from the original SpliceAI paper.
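
    The padding effect is easiest to see at the encoding level: in the SpliceAI-style one-hot scheme, an ‘N’ becomes an all-zero column, so an N-padded flank carries no sequence information at all. The minimal sketch below (with a made-up gene snippet, and an encoding that may differ in detail from OpenSpliceAI’s internals) contrasts the two flanking strategies.

    ```python
    # One-hot encoding sketch: 'N' maps to an all-zero column, so an N-padded flank
    # carries no upstream information, unlike the real genomic sequence.
    import torch

    BASE_INDEX = {"A": 0, "C": 1, "G": 2, "T": 3}

    def one_hot(seq: str) -> torch.Tensor:
        x = torch.zeros(4, len(seq))
        for i, base in enumerate(seq.upper()):
            if base in BASE_INDEX:         # 'N' (or any ambiguity code) stays all-zero
                x[BASE_INDEX[base], i] = 1.0
        return x

    gene_start = "ATGGCACTT"                     # made-up snippet standing in for a gene's first bases
    n_padded   = one_hot("N" * 5 + gene_start)   # flank of Ns, as in the original study
    real_flank = one_hot("GATTC" + gene_start)   # actual upstream genomic context (invented here)

    print(n_padded[:, :5].sum(dim=0))      # zeros: the model "sees" nothing upstream
    print(real_flank[:, :5].sum(dim=0))    # ones: real context the model can use
    ```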

    Additionally, we note that both the SpliceAI and OSAIMANE ‘models’ are the averaged result of five individual models, each initialized with slightly different weights. During the prediction process, each individual model was found to have discernibly different performance. By averaging their outputs – the deep-ensemble approach (Fort et al., 2019; Lakshminarayanan et al., 2017) – both SpliceAI and OpenSpliceAI achieve better overall performance and reduced sensitivity to local variations. In essence, this method normalizes the inherent randomness of the individual models, resulting in predictions that are more robust and better represent the expected behavior, ultimately yielding improved average performance across large datasets. OpenSpliceAI’s ‘predict’ submodule averages across all five models by default, but it also supports prediction using a single model.
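
    The averaging step itself is straightforward: run all five models on the same input and average their per-base probabilities. The sketch below mirrors that deep-ensemble idea with randomly initialized stand-in models, not the released SpliceAI/OSAIMANE checkpoints.

    ```python
    # Deep-ensemble sketch: average per-base probabilities from five models.
    # The models here are random stand-ins, not the released SpliceAI/OSAIMANE checkpoints.
    import torch

    def make_model() -> torch.nn.Module:
        return torch.nn.Sequential(
            torch.nn.Conv1d(4, 16, kernel_size=11, padding=5),
            torch.nn.ReLU(),
            torch.nn.Conv1d(16, 3, kernel_size=1),
        )

    models = [make_model().eval() for _ in range(5)]   # five independently initialized models

    x = torch.randn(1, 4, 1000)                        # one input sequence (one-hot in practice)
    with torch.no_grad():
        probs = torch.stack([m(x).softmax(dim=1) for m in models])  # (5, 1, 3, 1000)

    ensemble_pred = probs.mean(dim=0)                  # averaged neither/acceptor/donor scores
    single_pred = probs[0]                             # prediction from a single model
    ```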

    In summary, OpenSpliceAI is a fully open-source, accessible, and computationally efficient deep learning system for splice site prediction. Its modular architecture, enhanced performance, and adaptability to diverse species make it a powerful tool for advancing research on gene regulation and splicing across diverse species.

    Continue Reading

  • CEF Energy: Connecting Europe through CO₂ infrastructure

    CEF Energy is contributing to the development of Europe’s CO₂ networks by funding key transport infrastructure, a central element of the EU’s Industrial Carbon Management Strategy. Since 2019, the programme has invested over €978 million in 28 projects, covering both studies and works across the full CO₂ transport chain – including pipelines, liquefaction terminals, buffer storage sites, and compressor facilities. By linking industrial emitters to permanent storage locations, these projects play a crucial role in reducing industrial emissions and advancing towards climate neutrality by 2050. 

    New supported CO₂ projects 

    Following the 2024 CEF Energy call for proposals for Projects of Common Interest (PCI) and Projects of Mutual Interest (PMI), the promoters of ten CO₂ projects have signed Grant Agreements with CINEA in 2025, further expanding Europe’s carbon dioxide transport networks. Together, these actions represent an EU investment of around €240 million, covering three construction works projects and seven preparatory studies. They aim to advance detailed design studies, strengthen cross-border connections and facilitate access to underground storage, accelerating the development of new infrastructure that will enable the safe transport of captured CO₂ from industrial clusters to permanent storage sites.

    Examples of these new projects include the Prinos project in Northern Greece, which received nearly €120 million to develop a CO₂ import terminal and upgrade offshore facilities to create the first carbon capture and storage value chain in the South-Eastern Mediterranean region; the North Sea L10 CO₂ facility on the Dutch continental shelf, awarded €55 million for the construction of an offshore spur line connecting to the Aramis project; and the Norne CO₂ facility in Denmark, granted almost €12 million for the construction of an extension of the quay walls in the Port of Aalborg within the first implementation phase of the PCI. For studies, the Baltic CCS project is preparing the development of a cross-border CO₂ transport network linking industrial emitters in Latvia and Lithuania to a liquid CO₂ terminal in Klaipėda (Lithuania). CEF support contributes to technical, environmental and economic studies to assess the feasibility and design of the terminal and the wider CO₂ value chain.

    Together, these ten projects represent an important step towards building the European CO₂ infrastructure needed to support the 2030 target of 50 million tonnes of annual CO₂ injection capacity set out in the Net Zero Industry Act. They complement earlier initiatives, extend the reach of the carbon dioxide network to new regions, and highlight the EU’s firm commitment to advancing industrial decarbonisation.

    Success story paving the way

    Several CEF Energy-supported projects are already demonstrating how EU funding is turning CO₂ infrastructure plans into reality. Among them, Porthos stands out for its maturity, progress and impact, showing how coordinated European action is building a connected CO₂ transport and storage system.

    The Porthos project, coordinated by the Port of Rotterdam and implemented together with Gasunie and EBN, is developing an open-access, cross-border network to transport CO₂ from industrial sources in the port areas of Rotterdam, Antwerp and Ghent to offshore storage locations in the North Sea. CEF supports the construction of a 33 km onshore pipeline connecting emitters in the port of Rotterdam, a 20 MW compressor station at Aziëweg, and a 20 km offshore pipeline that will transport the compressed captured CO₂ to depleted gas fields for storage in the Dutch section of the North Sea. Implemented as part of the PCI CO2 TransPorts, Porthos is expected to be operational in 2026, and illustrates how public-private investments and cooperation can drive large-scale climate solutions.

    Building synergies across EU programmes

    The deployment of CO₂ transport infrastructure in Europe relies on strong complementarities between EU funding programmes managed by CINEA. While Horizon Europe supports research and innovation for new or improved technologies and the Innovation Fund finances large-scale industrial decarbonisation projects that generate the captured CO₂ to be transported and stored, CEF Energy focuses on developing networks and infrastructure with a cross-border dimension to allow the transport of CO₂ from emitters towards permanent geological storage. Together, all three funding programmes support a coherent value chain – from carbon capture to transport and permanent storage – essential to achieving climate neutrality.

    One clear example of this complementarity can be seen between the Northern Lights project (supported by CEF Energy) and Beccs Stockholm (supported by the Innovation Fund). Beccs Stockholm is building one of the world’s largest facilities for capturing and permanently storing biogenic CO₂ in Sweden. This CO₂ needs to be safely stored, which is where Northern Lights comes in: it will enable the storage of up to 900,000 tonnes of biogenic CO₂ annually from Stockholm Exergi, while also offering additional CO₂ storage capacity (up to 5 Mtpa in total) for other European emitters. A positive Final Investment Decision (FID) was reached by the promoters of these projects in March 2025.

    CINEA promotes close coordination and knowledge exchange between project promoters and programme teams, helping to identify synergies, avoid overlaps and accelerate progress across funding instruments. This collaborative approach reinforces Europe’s Industrial Carbon Management ecosystem, ensuring that EU investments deliver maximum impact for a competitive, connected, and climate-neutral Europe.

    More information

    Interactive publication on EU funding for Industrial Carbon Management

    Continue Reading

  • Vote for your greatest race from F1’s 75 years of history

    On May 13, F1.com started counting down and celebrating the 25 greatest races throughout the sport’s 75 years of history. Today, October 30, our No. 1 race has been revealed – the 2011 Canadian Grand Prix.

    Jenson Button’s victory in that…

    Continue Reading

  • ‘Fresh Prince of Bel-Air’ Actor Was 42

    Floyd Roger Myers Jr., who played younger versions of Will Smith and Marlon Jackson, respectively, on episodes of The Fresh Prince of Bel-Air and The Jacksons: An American Dream, has died. He was 42.

    Myers died Wednesday after suffering a…

    Continue Reading

  • Israel receives coffins Hamas says contain two Gaza hostages’ bodies

    Israel has received via the Red Cross in Gaza two coffins which the Palestinian armed group Hamas says contain the bodies of deceased hostages, according to the Israeli prime minister’s office.

    Israeli forces will now transfer the bodies to the…

    Continue Reading

  • 15 of the best night creams for mature skin that target fine lines and boost collagen

    How does menopause affect skin changes?

    Perimenopause and menopause bring complex shifts. “After the menopause, oestrogen levels fall significantly and this can…

    Continue Reading

  • Influential Firestone Racing Leader Al Speyer Dies at 75

    Al Speyer, who helped lead Firestone’s triumphant return to open-wheel racing in the mid-1990s and became an influential motorsports executive during a pivotal period in INDYCAR history, died Oct. 27 in Hendersonville, Tennessee….

    Continue Reading
