The Global Virus Network (GVN), a coalition of leading medical virologists representing 80+ Centers of Excellence and Affiliates in 40+ countries, today announced the addition of three new Centers of Excellence: the USF Health…
-

Grokipedia vs. Ruwiki: Elon Musk’s Wikipedia rival uses AI to push its creator’s views — something Moscow already tried. Meduza compares the results.
This week saw the launch of Grokipedia, a large language model-powered online encyclopedia created by Elon Musk. The U.S. billionaire claims the new product is a less “biased” alternative to Wikipedia, the decades-old reference site widely seen as one of the last surviving relics of a healthier, more democratic Internet. However, users have found much of Grokipedia’s content to be less than neutral, often promoting the same right-wing views that became more prevalent on X after Musk bought Twitter. If the concept of an AI-“enhanced” Wikipedia alternative with a heavy editorial hand sounds familiar, that’s because the Kremlin already launched one months ago. Meduza compares Grokipedia to Russia’s homegrown reference site, Ruwiki, and examines how they each treat certain politically charged topics.
Drawing from the OG
Ruwiki, first launched in beta in the summer of 2023, is essentially a fork of the Russian-language Wikipedia, meaning it’s largely based on its predecessor’s articles. The main difference — besides its AI capabilities, which were added later — is that Ruwiki articles about topics that are politically sensitive in Russia have been heavily censored. In 2024, the outlet T-invariant reported that most of Ruwiki’s articles about apolitical topics “are copied word-for-word from Wikipedia.”
Grokipedia also appears to be something of a Wikipedia fork. Many of its entries include the disclaimer: “The content is adapted from Wikipedia, licensed under Creative Commons Attribution-ShareAlike 4.0 License.” According to NBC News, some entries are copied verbatim from Wikipedia.
Much like Ruwiki, the Grokipedia articles that differ most from their Wikipedia counterparts are the ones about its creator’s pet issues. For example, as NBC notes, while the Wikipedia article for U.S. President Donald Trump includes a section on potential conflicts of interest, the Grokipedia entry omits many of his highest-profile corruption allegations.
AI Integration
Ruwiki is integrated with YandexGPT, the AI chatbot created by Russian tech giant Yandex. The encyclopedia’s homepage looks more like that of ChatGPT than Wikipedia, consisting of a text field underneath the question “What do you want to learn?” After the user enters a question, Ruwiki provides an AI-generated response based on its body of content, linking to specific articles that it cites.
YandexGPT is itself heavily censored, giving vague and evasive responses to user questions about politically sensitive topics such as the war in Ukraine or the late Russian opposition leader Alexey Navalny.
Grokipedia’s home page is also minimalistic, with a dark color scheme and a single text field. However, rather than an AI-generated response, queries there return a list of related Grokipedia articles, which have themselves been generated and “fact-checked” by Grok, Musk’s AI chatbot.
Grok has repeatedly made headlines this year for pushing conspiracy theories, praising Hitler, and denying the Holocaust in its interactions with users on X (previously Twitter). At one point, it began mentioning “white genocide” in South Africa in responses to unrelated user posts throughout the platform, explaining in one case that it had been “instructed” to do so.
Ukraine war coverage
Grokipedia’s article on Russia’s full-scale war in Ukraine is far less blatantly propagandistic than Ruwiki’s. However, unlike Wikipedia, Grokipedia features common Russian propaganda talking points more prominently and generally gives them the same weight as evidence-based claims.
The Ruwiki entry, which is titled “Hostilities in Ukraine,” adheres closely to Moscow’s official narratives surrounding the war in Ukraine. Its opening sentence defines the conflict as “an indirect military confrontation between Russia and the United States and NATO.” Subsequent paragraphs suggest the war is a direct consequence of NATO expansion in Eastern Europe; Ukraine’s Maidan Revolution, which it refers to as a Western-backed coup; and a “military operation” by Kyiv “against the population of Donbas.” Overall, the article consistently frames Russia’s invasion as a defensive operation against the aggression of Western countries.
Grokipedia’s entry, by contrast, acknowledges that Russia “initiated a full-scale invasion” and attempted to instigate regime change in Ukraine in 2022. At the same time, for all its purported neutrality, Grokipedia’s framing of the conflict often amounts to false balance, presenting easily refutable Russian disinformation as merely another “perspective” on the war.
For example, the second sentence of the more than 11,000-word article notes that Russia argues its “special military operation” is “aimed at demilitarizing and denazifying Ukraine” and “protecting ethnic Russians and Russian speakers from alleged persecution in Donbas,” but fails to mention that there was no systematic persecution of Russians or Russian speakers in the Donbas or that there are no “Nazis” in power in Ukraine.
‘Gender ideology’
While both Elon Musk and the Kremlin have framed the existence of transgender people as an unnatural phenomenon or conspiracy pushed by U.S. left-wing elites, Grokipedia’s entry for “Transgender” editorializes much more and differs more dramatically from Wikipedia than Ruwiki’s.
The Ruwiki article is largely copied directly from Russian-language Wikipedia, though some sentences appear to have been removed, such as: “Transgender identity is not a disease or a disorder.” Grokipedia’s entry, on the other hand, repeatedly suggests that transgender identity is a “social contagion” and that gender-affirming care is more harmful than the medical establishment claims. It also devotes more attention to comorbidities with gender dysphoria, pointing to studies that have found a higher prevalence of autism, depression, and anxiety among transgender people than among cisgender people. While it lists “critiques of innate gender identity models” among multiple “theories of causation” for “transgender identification,” it asserts that “philosophically,” this theory “invites circularity” and “conflates belief with biology.”
Both encyclopedias include sections about religious views on transgender identity. However, the Ruwiki article does not make overarching statements about the overall attitude of major religions towards transgender people, instead quoting statements from religious bodies and leaders criticizing the concept of gender identity. In contrast, the corresponding section in Grokipedia asserts: “Major world religions predominantly view transgender identity and transitions as incompatible with divine creation of binary biological sex.”
Dear leaders
Compared to Wikipedia, both Grokipedia’s and Ruwiki’s articles for their countries’ respective presidents omit significant negative information about them. Most of Ruwiki’s entry about Russian President Vladimir Putin reads like an article from Russian state media or from the Kremlin’s official website. However, it does briefly mention the International Criminal Court’s arrest warrant for Putin, noting that Russia does not recognize the court’s jurisdiction. It also includes two sentences about the Kursk submarine disaster in 2000, noting that the incident “prompted criticism not only toward the leadership of the Russian Navy but also toward the president himself.”
Even the “criticism” section of the Ruwiki article on Putin consists largely of compliments from foreign leaders, such as U.K. politician Nigel Farage’s statement that he dislikes the Russian president as a person but admires him “as a political operator.”
Grokipedia’s entry on Putin is less fawning but repeatedly takes the Kremlin’s own statements at face value. For example, the article lists a number of past political assassinations of Putin’s critics and enemies, but describes Russia’s failure to prosecute “alleged organizers” as “fueling debates over higher-level complicity.” Notably, the Grokipedia article does not mention the International Criminal Court’s arrest warrant for Putin over the illegal deportation of Ukrainian children.
Ruwiki is more willing to criticize Donald Trump, noting (unlike Russian-language Wikipedia) in its second sentence that Trump is “the first former U.S. president in history to be convicted of a criminal offense.” Also, unlike Wikipedia, the Ruwiki article includes an entire section on Trump’s relationship with the late financier and convicted sex offender Jeffrey Epstein.
Grokipedia avoids mentioning many of Trump’s major scandals, including his relationship with Epstein, the 2024 court ruling that he defamed E. Jean Carroll in comments denying her accusations of sexual assault, and the numerous corruption allegations against him.
Continue Reading
-

‘It’s dark in the US right now. But I turn on a light, you know?’: Mavis Staples on Prince, Martin Luther King and her 75-year singing career
Can you speak about the array of songs and artists on your new record? What kind of message and lyrics do you want to sing at this point in your life? steve_bayley
The first song I got for the album was Human Mind, written by Hozier and Allison…
-

OpenSpliceAI provides an efficient, modular implementation of SpliceAI, enabling easy retraining across non-human species
We developed OpenSpliceAI, a modular Python toolkit that provides an open-source implementation of SpliceAI with several key enhancements. The framework replicates the core logic of the SpliceAI model while improving the efficiency of prediction and of variant effect analysis, such as detecting acceptor and donor gains or losses, using pre-trained models. Our benchmarks show substantial computational advantages over SpliceAI, with faster processing, lower memory usage, and improved GPU efficiency (Figure 2B, Figure 2—figure supplement 6). These improvements are driven by our optimized PyTorch implementation, which employs dynamic computation graphs and on-demand GPU memory allocation – allowing memory to be allocated and freed as needed – in contrast to SpliceAI’s static, Keras-based TensorFlow approach, which pre-allocates memory for the worst-case input size. In SpliceAI, this rigid allocation leads to high memory overhead and frequent out-of-memory errors when predictions are looped over large datasets. Additionally, OpenSpliceAI streamlines data handling and improves parallelization through batch prediction and multiprocessing, automatically distributing tasks across available threads. Together, these features prevent the memory pitfalls common in SpliceAI and make OpenSpliceAI a more scalable and efficient solution for large-scale genomic analysis.
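As a rough illustration of this on-demand pattern (the model, batch size, and function name below are placeholders, not OpenSpliceAI’s actual API), a PyTorch prediction loop allocates GPU memory one batch at a time and releases it as soon as the results are moved off the device:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

def predict_in_batches(model, sequences, batch_size=32, device="cuda"):
    """Minimal sketch of batched, on-demand GPU prediction in PyTorch."""
    model = model.eval().to(device)
    loader = DataLoader(TensorDataset(sequences), batch_size=batch_size)
    outputs = []
    with torch.no_grad():                    # no gradient buffers are retained
        for (batch,) in loader:
            batch = batch.to(device)         # memory allocated for this batch only...
            probs = torch.softmax(model(batch), dim=1)
            outputs.append(probs.cpu())      # ...and freed once results move off the GPU
    return torch.cat(outputs)
```

Because nothing is pre-allocated for a worst-case input, peak memory in a loop like this scales with the batch size rather than with the largest sequence in the dataset.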
It is important to note that even though OpenSpliceAI and SpliceAI share the same model architecture, the released trained models are not identical. The variability observed between our models and the original SpliceAI – and even among successive training runs using the same code and data – can be attributed to several sources of inherent randomness. First, weight initialization is performed randomly for many layers, which means that different initial weights can lead to distinct convergence paths and final model parameters. Second, the process of data shuffling alters the composition of mini-batches during training, impacting both the training dynamics and the statistics computed in batch normalization layers. Although batch normalization is deterministic for a fixed mini-batch, its reliance on batch statistics introduces variability due to the random sampling of data. Finally, OpenSpliceAI employs the AdamW optimizer (Loshchilov and Hutter, 2019), which incorporates exponential moving averages of the first and second moments of the gradients. This mechanism serves a momentum-like role, contributing to an adaptive learning process that is inherently stochastic. Moreover, subtle differences in the order of operations or floating-point arithmetic, particularly in distributed computing environments, can further amplify this stochastic behavior. Together, these factors contribute to the observed nondeterministic behavior, resulting in slight discrepancies between our trained models and the original SpliceAI, as well as among successive training sessions under identical conditions.
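The snippet below sketches, in generic PyTorch rather than OpenSpliceAI’s training code, where each of these sources of randomness enters: random weight initialization, shuffled mini-batches (which also shift batch-normalization statistics), and AdamW’s moving averages of gradient moments. Fixing the seed narrows but does not eliminate run-to-run variation.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

torch.manual_seed(0)  # controls weight initialization and shuffling order for this run

# Dummy one-hot sequences and per-position labels standing in for real splice-site data.
train_dataset = TensorDataset(torch.randn(100, 4, 500), torch.randint(0, 3, (100, 500)))

model = torch.nn.Conv1d(4, 32, kernel_size=11, padding=5)        # weights drawn at random
loader = DataLoader(train_dataset, batch_size=16, shuffle=True)  # shuffling alters mini-batch
                                                                 # composition and batch statistics

# AdamW keeps exponential moving averages of the first and second gradient
# moments, so its updates depend on the (random) order in which batches arrive.
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=1e-2)
```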
OpenSpliceAI empowers researchers to adapt the framework to many other species by including modules that enable easy retraining. For closely related species such as mice, our retrained model demonstrated comparable or slightly better precision than the human-based SpliceAI model. For more distant species such as A. thaliana, whose genomic structure differs substantially from that of humans, retraining OpenSpliceAI yields much greater improvements in accuracy. Our initial release includes models trained on the human MANE genome annotation and four additional species: mouse, zebrafish, honeybee, and A. thaliana. We also evaluated pre-training on mouse (OSAIMouse), honeybee (OSAIHoneybee), zebrafish (OSAIZebrafish), and Arabidopsis (OSAIArabidopsis) followed by fine-tuning on the human MANE dataset. While cross-species pre-training substantially accelerated convergence during fine-tuning, the final human splicing prediction accuracy was comparable to that of a model trained from scratch on human data. This result suggests that our architecture captures the relevant splicing features from human training data alone and thus gains little or no benefit from cross-species transfer learning in this context (see Figure 4—figure supplement 5).
OpenSpliceAI also includes modules for transfer learning, allowing researchers to initialize models with weights learned on other species. In our transfer learning experiments, models transferred from human to other species displayed faster convergence and higher stability, with potential for increased accuracy. We also incorporate model calibration via temperature scaling, providing better alignment between predicted probabilities and empirical distributions.
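Temperature scaling itself is a standard, single-parameter calibration step: a scalar T is fitted on held-out logits so that softmax(logits / T) better matches observed label frequencies. The sketch below shows the usual formulation, assuming pre-computed validation logits and labels; it is illustrative rather than OpenSpliceAI’s exact calibration module.

```python
import torch
import torch.nn.functional as F

def fit_temperature(val_logits, val_labels, max_iter=200):
    """Fit a single temperature T on held-out logits (N, C) and labels (N,)."""
    log_t = torch.zeros(1, requires_grad=True)            # optimize log T so T stays positive
    optimizer = torch.optim.LBFGS([log_t], lr=0.05, max_iter=max_iter)

    def closure():
        optimizer.zero_grad()
        loss = F.cross_entropy(val_logits / log_t.exp(), val_labels)
        loss.backward()
        return loss

    optimizer.step(closure)
    return log_t.exp().item()   # divide future logits by this value before softmax
```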
The ISM study revealed that OSAIMANE and SpliceAI made predictions using very similar sets of motifs (Figure 6B). Across several experiments, we note that SpliceAI exhibits an inherent bias near the starts and ends of transcripts that are padded with flanking Ns (as was done in the original study), predicting donor and acceptor sites at these boundaries with an extremely high signal that disappears when the sequence is instead padded with the actual genomic sequence. For example, the model correctly predicted the first donor site of the CFTR gene when the gene’s boundaries were flanked with Ns; however, when we replaced those Ns with the actual DNA sequence upstream of the gene boundary, the signal all but disappeared, as seen in Figure 6D. This suggests a bias resulting from the way the model is trained. In our ISM benchmarks, we therefore chose not to use flanking Ns unless explicitly recreating a study from the original SpliceAI paper.
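As a rough illustration of the padding effect (hypothetical helper names; the exact windowing used in the paper may differ), one can compare predictions for the same sequence when the upstream flank is encoded as Ns, which become all-zero one-hot columns, versus the actual genomic flank:

```python
import torch

BASES = {"A": 0, "C": 1, "G": 2, "T": 3}

def one_hot(seq):
    """One-hot encode a DNA string; 'N' maps to an all-zero column."""
    x = torch.zeros(4, len(seq))
    for i, base in enumerate(seq.upper()):
        if base in BASES:
            x[BASES[base], i] = 1.0
    return x

def compare_flanks(model, core_seq, genomic_flank, flank_len=5000):
    """Predict with N-padding vs. the real upstream flank (illustrative only)."""
    with torch.no_grad():
        n_padded = model(one_hot("N" * flank_len + core_seq).unsqueeze(0))
        real_flank = model(one_hot(genomic_flank[-flank_len:] + core_seq).unsqueeze(0))
    return n_padded, real_flank
```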
Additionally, we note that both the SpliceAI and OSAIMANE ‘models’ are the averaged result of five individual models, each initialized with slightly different weights. During the prediction process, each individual model was found to have discernibly different performance. Averaging their outputs using the deep-ensemble approach (Fort et al., 2019; Lakshminarayanan et al., 2017) improved the overall performance of both SpliceAI and OpenSpliceAI while reducing sensitivity to local variations. In essence, this method averages out the inherent randomness of the individual models, resulting in predictions that are more robust, better represent the expected behavior, and ultimately yield improved average performance across large datasets. OpenSpliceAI’s ‘predict’ submodule averages across all five models by default, but it also supports prediction using a single model.
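A minimal sketch of that averaging step, in generic PyTorch and assuming each checkpoint returns per-position logits over the three splice classes:

```python
import torch

def ensemble_predict(models, batch):
    """Average per-position splice-site probabilities over an ensemble.

    `models` is a list of independently trained networks (five, in the setup
    described above); averaging their softmax outputs smooths out run-to-run
    variability, following the deep-ensemble approach.
    """
    with torch.no_grad():
        probs = [torch.softmax(m.eval()(batch), dim=1) for m in models]
    return torch.stack(probs).mean(dim=0)
```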
In summary, OpenSpliceAI is a fully open-source, accessible, and computationally efficient deep learning system for splice site prediction. Its modular architecture, enhanced performance, and easy retraining make it a powerful tool for advancing research on gene regulation and splicing across diverse species.
Continue Reading
-
CEF Energy: Connecting Europe through CO₂ infrastructure
CEF Energy is contributing to the development of Europe’s CO₂ networks by funding key transport infrastructure, a central element of the EU’s Industrial Carbon Management Strategy. Since 2019, the programme has invested over €978 million in 28 projects, covering both studies and works across the full CO₂ transport chain – including pipelines, liquefaction terminals, buffer storage sites, and compressor facilities. By linking industrial emitters to permanent storage locations, these projects play a crucial role in reducing industrial emissions and advancing towards climate neutrality by 2050.
New supported CO₂ projects
Following the 2024 CEF Energy call for proposals for Projects of Common Interest (PCI) and Projects of Mutual Interest (PMI), the promoters of ten CO₂ projects signed Grant Agreements with CINEA in 2025, further expanding Europe’s carbon dioxide transport networks. Together, these actions represent an EU investment of around €240 million, covering three construction works projects and seven preparatory studies. They will advance detailed design studies, strengthen cross-border connections, and facilitate access to underground storage, accelerating the development of new infrastructure that will enable the safe transport of captured CO₂ from industrial clusters to permanent storage sites.
Examples of these new projects include the Prinos project in Northern Greece, which received nearly €120 million to develop a CO₂ import terminal and upgrade offshore facilities to create the first carbon capture and storage value chain in the South-Eastern Mediterranean region; the North Sea L10 CO₂ facility on the Dutch continental shelf, awarded €55 million for the construction of an offshore spurline connecting to the Aramis project; and the Norne CO₂ facility in Denmark, granted almost €12 million to extend the quay walls in the Port of Aalborg within the first implementation phase of the PCI. On the studies side, the Baltic CCS project is preparing the development of a cross-border CO₂ transport network linking industrial emitters in Latvia and Lithuania to a liquid CO₂ terminal in Klaipėda (Lithuania). CEF support contributes to technical, environmental and economic studies to assess the feasibility and design of the terminal and the wider CO₂ value chain.
Together, these ten projects represent an important step towards building the European CO₂ infrastructure needed to support the 2030 target of 50 million tonnes of annual CO₂ injection capacity set out in the Net Zero Industry Act. They complement earlier initiatives, extend the reach of the carbon dioxide network to new regions, and highlight the EU’s firm commitment to advancing industrial decarbonisation.
Success story paving the way
Several CEF Energy-supported projects are already demonstrating how EU funding is turning CO₂ infrastructure plans into reality. Among them, Porthos stands out for its maturity, progress and impact, showing how coordinated European action is building a connected CO₂ transport and storage system.
The Porthos project, coordinated by the Port of Rotterdam and implemented together with Gasunie and EBN, is developing an open-access, cross-border network to transport CO₂ from industrial sources in the port areas of Rotterdam, Antwerp and Ghent to offshore storage locations in the North Sea. CEF supports the construction of a 33 km onshore pipeline connecting emitters in the port of Rotterdam, a 20 MW compressor station at Aziëweg, and a 20 km offshore pipeline that will carry the compressed, captured CO₂ to depleted gas fields in the Dutch section of the North Sea for storage. Implemented as part of the PCI CO2 TransPorts, Porthos is expected to be operational in 2026, and it illustrates how public-private investment and cooperation can drive large-scale climate solutions.
Building synergies across EU programmes
The deployment of CO₂ transport infrastructure in Europe relies on strong complementarities between EU funding programmes managed by CINEA. While Horizon Europe supports research and innovation for new or improved technologies, and the Innovation Fund finances large-scale industrial decarbonisation projects that generate the captured CO₂ to be transported and stored, CEF Energy focuses on developing networks and infrastructure with a cross-border dimension to allow the transport of CO₂ from emitters and sources to permanent geological storage. Together, the three funding programmes support a coherent value chain – from carbon capture to transport and permanent storage – that is essential to achieving climate neutrality.
One clear example of this complementarity can be seen in the projects Northern Lights (supported by CEF Energy) and Beccs Stockholm (supported by the Innovation Fund). Beccs Stockholm is building one of the world’s largest facilities for capturing biogenic CO₂ in Sweden. This CO₂ needs to be safely and permanently stored, which is where Northern Lights comes in: it will enable the storage of up to 900,000 tonnes of biogenic CO₂ annually from Stockholm Exergi, while also offering additional CO₂ storage capacity (up to 5 Mtpa in total) for other European emitters. A positive Final Investment Decision (FID) was reached by the promoters of these projects in March 2025.
CINEA promotes close coordination and knowledge exchange between project promoters and programme teams, helping to identify synergies, avoid overlaps and accelerate progress across funding instruments. This collaborative approach reinforces Europe’s Industrial Carbon Management ecosystem, ensuring that EU investments deliver maximum impact for a competitive, connected, and climate-neutral Europe.
More information
Interactive publication on EU funding for Industrial Carbon Management
Continue Reading
-

Vote for your greatest race from F1’s 75 years of history
On May 13, F1.com started counting down and celebrating the 25 greatest races throughout the sport’s 75 years of history. Today, October 30, our No. 1 race has been revealed – the 2011 Canadian Grand Prix.
Jenson Button’s victory in that…
Continue Reading
-

‘Fresh Prince of Bel-Air’ Actor Was 42
Floyd Roger Myers Jr., who played younger versions of Will Smith and Marlon Jackson, respectively, on episodes of The Fresh Prince of Bel-Air and The Jacksons: An American Dream, has died. He was 42.
Myers died Wednesday after suffering a…
Continue Reading
-

Israel receives coffins Hamas says contain two Gaza hostages’ bodies
Israel has received via the Red Cross in Gaza two coffins which the Palestinian armed group Hamas says contain the bodies of deceased hostages, according to the Israeli prime minister’s office.
Israeli forces will now transfer the bodies to the…
Continue Reading
-

15 of the best night creams for mature skin that target fine lines and boost collagen
How does menopause affect skin changes?
Perimenopause and menopause bring complex shifts. “After the menopause, oestrogen levels fall significantly and this can…
Continue Reading
-

Influential Firestone Racing Leader Al Speyer Dies at 75
Al Speyer, who helped lead Firestone’s triumphant return to open-wheel racing in the mid-1990s and became an influential motorsports executive during a pivotal period in INDYCAR history, died Oct. 27 in Hendersonville, Tennessee….
Continue Reading
