Blog

  • Free Play Days – Call of Duty Black Ops 6 (MP & Zombies), Dragon Ball Xenoverse 2, Dead by Daylight, CATAN, and More – Xbox Wire

    1. Free Play Days – Call of Duty Black Ops 6 (MP & Zombies), Dragon Ball Xenoverse 2, Dead by Daylight, CATAN, and More  Xbox Wire
    2. Activision Announces Call of Duty: Black Ops 6 Multiplayer and Zombies Go Free Again Today, Just as EA Launches…

    Continue Reading

  • Quantify Boards YA Adventure ‘The Queen’s Jewels’ For AFM Sales Push

    EXCLUSIVE: Quantify has acquired worldwide sales rights to the YA adventure thriller The Queen’s Jewels, starring Carson Rowland, Katherine McNamara, Michael Evans Behling and Natalie Martinez, for an AFM launch.

    It will mark the first…

    Continue Reading

  • Xbox shoots for the moon with Snapchat activation amid pricing furor

    Continue Reading

  • Comet Lemmon will be a Halloween treat – Astronomy Magazine

    1. Comet Lemmon will be a Halloween treat  Astronomy Magazine
    2. ‘Miracle’ photo captures Comet Lemmon and meteor seemingly entwined over Earth  Live Science
    3. Comet Tracker For Wednesday: See Comets For Final Time As Moon Grows  Forbes
    4. Starwatchers capture…

    Continue Reading

  • Prince William and Catherine win legal battle over ski holiday pictures

    The Prince and Princess of Wales have won a legal battle against a French magazine which published photographs of their private ski holiday in April, Kensington Palace has said.

    The images of the royal couple and their children Prince George,…

    Continue Reading

  • The Strad news – Hilary Hahn cancels three Berlin Philharmonic dates in December

    Hilary Hahn has had to cancel further concerts in December as she continues to recover from her ongoing double pinched nerve injury. The violinist was due to play Dvořák’s Violin Concerto in A minor with the…

    Continue Reading

  • Global Virus Network announces the addition of three new Centers of Excellence

    The Global Virus Network (GVN), a coalition of leading medical virologists representing 80+ Centers of Excellence and Affiliates in 40+ countries, today announced the addition of three new Centers of Excellence: the USF Health…

    Continue Reading

  • Grokipedia vs. Ruwiki Elon Musk’s Wikipedia rival uses AI to push its creator’s views — something Moscow already tried. Meduza compares the results.

    This week saw the launch of Grokipedia, a large language model-powered online encyclopedia created by Elon Musk. The U.S. billionaire claims the new product is a less “biased” alternative to Wikipedia, the decades-old reference site widely seen as one of the last surviving relics of a healthier, more democratic Internet. However, users have found much of Grokipedia’s content to be less than neutral, often promoting the same right-wing views that became more prevalent on X after Musk bought Twitter. If the concept of an AI-“enhanced” Wikipedia alternative with a heavy editorial hand sounds familiar, that’s because the Kremlin already launched one months ago. Meduza compares Grokipedia to Russia’s homegrown reference site, Ruwiki, and examines how they each treat certain politically charged topics.

    Drawing from the OG

    Ruwiki, first launched in beta in the summer of 2023, is essentially a fork of the Russian-language Wikipedia, meaning it’s largely based on its predecessor’s articles. The main difference — besides its AI capabilities, which were added later — is that Ruwiki articles about topics that are politically sensitive in Russia have been heavily censored. In 2024, the outlet T-invariant reported that most of Ruwiki’s articles about apolitical topics “are copied word-for-word from Wikipedia.”

    Grokipedia also appears to be something of a Wikipedia fork. Many of its entries include the disclaimer: “The content is adapted from Wikipedia, licensed under Creative Commons Attribution-ShareAlike 4.0 License.” According to NBC News, some entries are copied verbatim from Wikipedia.

    Much like Ruwiki, the Grokipedia articles that differ most from their Wikipedia counterparts are the ones about its creators’ pet issues. For example, as NBC notes, while the Wikipedia article for U.S. President Donald Trump includes a section on potential conflicts of interest, the Grokipedia entry omits many of his highest-profile corruption allegations.

    AI Integration

    Ruwiki is integrated with YandexGPT, the AI chatbot created by Russian tech giant Yandex. The encyclopedia’s homepage looks more like that of ChatGPT than Wikipedia, consisting of a text field underneath the question “What do you want to learn?” After the user enters a question, Ruwiki provides an AI-generated response based on its body of content, linking to specific articles that it cites.

    YandexGPT is itself heavily censored, giving vague and evasive responses to user questions about politically sensitive topics such as the war in Ukraine or the late Russian opposition leader Alexey Navalny.

    Grokipedia’s home page is also minimalistic, with a dark color scheme and a single text field. However, rather than an AI-generated response, queries there return a list of related Grokipedia articles, which have themselves been generated and “fact-checked” by Grok, Musk’s AI chatbot.

    Grok has repeatedly made headlines this year for pushing conspiracy theories, praising Hitler, and denying the Holocaust in its interactions with users on X (previously Twitter). At one point, it began mentioning “white genocide” in South Africa in responses to unrelated user posts throughout the platform, explaining in one case that it had been “instructed” to do so.

    ‘Commitment to providing facts without bias’ Russia’s flagship AI chatbot recommends reading Meduza and other ‘foreign agents’

    Ukraine war coverage

    Grokipedia’s article on Russia’s full-scale war in Ukraine is far less blatantly propagandistic than Ruwiki’s. However, unlike Wikipedia, Grokipedia features common Russian propaganda talking points more prominently and generally gives them the same weight as evidence-based claims.

    The Ruwiki entry, which is titled “Hostilities in Ukraine,” adheres closely to Moscow’s official narratives surrounding the war in Ukraine. Its opening sentence defines the conflict as “an indirect military confrontation between Russia and the United States and NATO.” Subsequent paragraphs suggest the war is a direct consequence of NATO expansion in Eastern Europe; Ukraine’s Maidan Revolution, which it refers to as a Western-backed coup; and a “military operation” by Kyiv “against the population of Donbas.” Overall, the article consistently frames Russia’s invasion as a defensive operation against the aggression of Western countries. 

    Grokipedia’s entry, by contrast, acknowledges that Russia “initiated a full-scale invasion” and attempted to instigate regime change in Ukraine in 2022. At the same time, for all its purported neutrality, Grokipedia’s framing of the conflict often amounts to false balance, presenting easily refutable Russian disinformation as merely another “perspective” on the war.

    For example, the second sentence of the more than 11,000-word article notes that Russia argues its “special military operation” is “aimed at demilitarizing and denazifying Ukraine” and “protecting ethnic Russians and Russian speakers from alleged persecution in Donbas,” but fails to mention that there was no systematic persecution of Russians or Russian speakers in the Donbas or that there are no “Nazis” in power in Ukraine.

    ‘Gender ideology’

    While both Elon Musk and the Kremlin have framed the existence of transgender people as an unnatural phenomenon or conspiracy pushed by U.S. left-wing elites, Grokipedia’s entry for “Transgender” editorializes much more and differs more dramatically from Wikipedia than Ruwiki’s.

    The Ruwiki article is largely copied directly from Russian-language Wikipedia, though some sentences appear to have been removed, such as: “Transgender identity is not a disease or a disorder.” Grokipedia’s entry, on the other hand, repeatedly suggests that transgender identity is a “social contagion” and that gender-affirming care is more harmful than the medical establishment claims. It also devotes more attention to comorbidities with gender dysphoria, pointing to studies that have found a higher prevalence of autism, depression, and anxiety among transgender people than among cisgender people. While it lists “critiques of innate gender identity models” among multiple “theories of causation” for “transgender identification,” it asserts that “philosophically,” this theory “invites circularity” and “conflates belief with biology.”

    Both encyclopedias include sections about religious views on transgender identity. However, the Ruwiki article does not make overarching statements about the overall attitude of major religions towards transgender people, instead quoting statements from religious bodies and leaders criticizing the concept of gender identity. In contrast, the corresponding section in Grokipedia asserts: “Major world religions predominantly view transgender identity and transitions as incompatible with divine creation of binary biological sex.”

    Dear leaders

    Compared to Wikipedia, both Grokipedia’s and Ruwiki’s articles for their countries’ respective presidents omit significant negative information about them. Most of Ruwiki’s entry about Russian President Vladimir Putin reads like an article from Russian state media or from the Kremlin’s official website. However, it does briefly mention the International Criminal Court’s arrest warrant for Putin, noting that Russia does not recognize the court’s jurisdiction. It also includes two sentences about the Kursk submarine disaster in 2000, noting that the incident “prompted criticism not only toward the leadership of the Russian Navy but also toward the president himself.”

    Even the “criticism” section of the Ruwiki article on Putin consists largely of compliments from foreign leaders, such as U.K. politician Nigel Farage’s statement that he dislikes the Russian president as a person but admires him “as a political operator.”

    A useless add-on Russia’s Wikipedia replacement is touting its integrated AI — but the results are underwhelming

    Grokipedia’s entry on Putin is less fawning but repeatedly takes the Kremlin’s own statements at face value. For example, the article lists a number of past political assassinations of Putin’s critics and enemies, but describes Russia’s failure to prosecute “alleged organizers” as “fueling debates over higher-level complicity.” Notably, the Grokipedia article does not mention the International Criminal Court’s arrest warrant for Putin over the illegal deportation of Ukrainian children.

    Ruwiki is more willing to criticize Donald Trump, noting (unlike Russian-language Wikipedia) in its second sentence that Trump is “the first former U.S. president in history to be convicted of a criminal offense.” Also, unlike Wikipedia, the Ruwiki article includes an entire section on Trump’s relationship with the late financier and convicted sex offender Jeffrey Epstein.

    Grokipedia avoids mentioning many of Trump’s major scandals, including his relationship with Epstein, the 2024 court ruling that he defamed E. Jean Carroll in comments denying her accusations of sexual assault, and the numerous corruption allegations against him. 

    Continue Reading

  • ‘It’s dark in the US right now. But I turn on a light, you know?’: Mavis Staples on Prince, Martin Luther King and her 75-year singing career | Mavis Staples

    Can you speak about the array of songs and artists on your new record? What kind of message and lyrics do you want to sing at this point in your life? steve_bayley
    The first song I got for the album was Human Mind, written by Hozier and Allison…

    Continue Reading

  • OpenSpliceAI provides an efficient modular implementation of SpliceAI enabling easy retraining across nonhuman species

    We developed OpenSpliceAI as a modular, open-source Python reimplementation of SpliceAI, to which we added several key enhancements. The framework replicates the core logic of the SpliceAI model while optimizing prediction efficiency and variant effect analysis, such as acceptor and donor gains or losses, using pre-trained models. Our benchmarks show substantial computational advantages over SpliceAI, with faster processing, lower memory usage, and improved GPU efficiency (Figure 2B, Figure 2—figure supplement 6). These improvements are driven by our optimized PyTorch implementation, which employs dynamic computation graphs and on-demand GPU memory allocation – allowing memory to be allocated and freed as needed – in contrast to SpliceAI’s static, Keras-based TensorFlow approach, which pre-allocates memory for the worst-case input size. In SpliceAI, this rigid memory allocation leads to high memory overhead and frequent out-of-memory errors when predicting over large datasets in long loop iterations. Additionally, OpenSpliceAI leverages streamlined data handling and enhanced parallelization through batch prediction and multiprocessing, automatically distributing tasks across available threads. Together, these features prevent the memory pitfalls common in SpliceAI and make OpenSpliceAI a more scalable and efficient solution for large-scale genomic analysis.
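
    To make the batched, on-demand prediction style described above concrete, here is a minimal PyTorch sketch; it is not OpenSpliceAI’s actual API, and the model, encoding, and batch size are hypothetical placeholders:

    ```python
    import torch

    def predict_in_batches(model, onehot_seqs, batch_size=32, device="cuda"):
        """Run splice-site prediction batch by batch so GPU memory is
        allocated and released per batch instead of for the whole dataset.
        `onehot_seqs` is assumed to be a float tensor of shape (N, 4, L)."""
        model.eval().to(device)
        outputs = []
        with torch.no_grad():                              # no gradient buffers needed
            for start in range(0, onehot_seqs.size(0), batch_size):
                batch = onehot_seqs[start:start + batch_size].to(device)
                probs = torch.softmax(model(batch), dim=1) # per-position class probabilities
                outputs.append(probs.cpu())                # move results off the GPU
                del batch, probs                           # let PyTorch reclaim the memory
        return torch.cat(outputs, dim=0)
    ```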

    It is important to note that even though OpenSpliceAI and SpliceAI share the same model architecture, the released trained models are not identical. The variability observed between our models and the original SpliceAI – and even among successive training runs using the same code and data – can be attributed to several sources of inherent randomness. First, weight initialization is performed randomly for many layers, which means that different initial weights can lead to distinct convergence paths and final model parameters. Second, the process of data shuffling alters the composition of mini-batches during training, impacting both the training dynamics and the statistics computed in batch normalization layers. Although batch normalization is deterministic for a fixed mini-batch, its reliance on batch statistics introduces variability due to the random sampling of data. Finally, OpenSpliceAI employs the AdamW optimizer (Loshchilov and Hutter, 2019), which incorporates exponential moving averages of the first and second moments of the gradients. This mechanism serves a momentum-like role, contributing to an adaptive learning process that is inherently stochastic. Moreover, subtle differences in the order of operations or floating-point arithmetic, particularly in distributed computing environments, can further amplify this stochastic behavior. Together, these factors contribute to the observed nondeterministic behavior, resulting in slight discrepancies between our trained models and the original SpliceAI, as well as among successive training sessions under identical conditions.
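
    For readers who want to reduce (though not fully eliminate) this run-to-run variability, a common PyTorch recipe is to seed every random number generator and request deterministic kernels. The sketch below is a generic illustration, not a setting that OpenSpliceAI itself prescribes:

    ```python
    import random

    import numpy as np
    import torch

    def seed_everything(seed: int = 42) -> None:
        """Pin the main sources of randomness discussed above: weight
        initialization, data shuffling, and CUDA kernel selection. Some
        floating-point nondeterminism on GPUs may still remain."""
        random.seed(seed)
        np.random.seed(seed)
        torch.manual_seed(seed)
        torch.cuda.manual_seed_all(seed)
        torch.backends.cudnn.deterministic = True
        torch.backends.cudnn.benchmark = False

    seed_everything(42)

    # The AdamW optimizer mentioned above keeps exponential moving averages of the
    # first and second gradient moments (a momentum-like mechanism) and decouples
    # weight decay from the gradient update; `model` here is a hypothetical module:
    # optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=1e-2)
    ```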

    OpenSpliceAI empowers researchers to adapt the framework to many other species by including modules that enable easy retraining. For closely related species such as mice, our retrained model demonstrated precision comparable to, or slightly better than, the human-based SpliceAI model. For more distant species such as A. thaliana, whose genomic structure differs substantially from that of humans, retraining OpenSpliceAI yields much greater improvements in accuracy. Our initial release includes models trained on the human MANE genome annotation and four additional species: mouse, zebrafish, honeybee, and A. thaliana. We also evaluated pre-training on mouse (OSAIMouse), honeybee (OSAIHoneybee), zebrafish (OSAIZebrafish), and Arabidopsis (OSAIArabidopsis) followed by fine-tuning on the human MANE dataset. While cross-species pre-training substantially accelerated convergence during fine-tuning, the final human splicing prediction accuracy was comparable to that of a model trained from scratch on human data. This result suggests that our architecture captures the relevant splicing features from human training data alone and thus gains little or no benefit from cross-species transfer learning in this context (see Figure 4—figure supplement 5).

    OpenSpliceAI also includes modules for transfer learning, allowing researchers to initialize models with weights learned on other species. In our transfer learning experiments, models transferred from human to other species displayed faster convergence and higher stability, with potential for increased accuracy. We also incorporate model calibration via temperature scaling, providing better alignment between predicted probabilities and empirical distributions.
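
    As a rough sketch of the two ideas in this paragraph, transfer learning amounts to loading weights trained on another species before fine-tuning, and temperature scaling fits a single scalar T on held-out predictions. The model class and checkpoint path below are hypothetical placeholders, not OpenSpliceAI’s actual interfaces:

    ```python
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    # Placeholder network standing in for the real splice-site model.
    class TinySpliceModel(nn.Module):
        def __init__(self):
            super().__init__()
            # 3 output classes per position: non-splice, acceptor, donor
            self.conv = nn.Conv1d(4, 3, kernel_size=11, padding=5)

        def forward(self, x):          # x: (batch, 4, seq_len)
            return self.conv(x)        # raw logits

    # Transfer learning: initialize from weights learned on another species,
    # then continue training on the target species (checkpoint path is hypothetical).
    model = TinySpliceModel()
    # model.load_state_dict(torch.load("weights_trained_on_human.pt"))

    def fit_temperature(logits: torch.Tensor, labels: torch.Tensor) -> float:
        """Temperature scaling: learn one scalar T > 0 so that softmax(logits / T)
        is better calibrated on held-out data; prediction ranking is unchanged."""
        log_t = torch.zeros(1, requires_grad=True)        # T = exp(log_t) stays positive
        opt = torch.optim.LBFGS([log_t], lr=0.1, max_iter=100)

        def closure():
            opt.zero_grad()
            loss = F.cross_entropy(logits / log_t.exp(), labels)
            loss.backward()
            return loss

        opt.step(closure)
        return log_t.exp().item()
    ```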

    The ISM study revealed that OSAIMANE and SpliceAI made predictions using very similar sets of motifs (Figure 6B). Across several experiments, we note that SpliceAI exhibits an inherent bias near the starts and ends of transcripts when they are padded with flanking N’s (as was done in the original study), predicting donor and acceptor sites at these boundaries with an extremely high signal that disappears when the sequence is padded with the actual genomic sequence. For example, the model correctly predicted the first donor site of the CFTR gene when the gene’s boundaries were flanked with N’s; however, when we replaced those N’s with the actual DNA sequence upstream of the gene boundary, the signal all but disappeared, as seen in Figure 6D. This suggests a bias resulting from the way the model is trained. In our ISM benchmarks, we thus chose not to use flanking N’s unless explicitly recreating a study from the original SpliceAI paper.
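
    For illustration, the padding choice discussed above comes down to what the one-hot encoder feeds the model at the transcript boundaries; in the usual encoding an ‘N’ becomes an all-zeros column. The sequence and flank length below are placeholders, not the actual CFTR coordinates:

    ```python
    import numpy as np

    BASE_INDEX = {"A": 0, "C": 1, "G": 2, "T": 3}

    def one_hot(seq: str) -> np.ndarray:
        """One-hot encode a DNA string; 'N' (or any ambiguity code) is left
        as an all-zeros column, which is exactly what flanking-N padding
        presents to the model at transcript boundaries."""
        enc = np.zeros((4, len(seq)), dtype=np.float32)
        for i, base in enumerate(seq.upper()):
            if base in BASE_INDEX:
                enc[BASE_INDEX[base], i] = 1.0
        return enc

    gene = "ATGGTAAGTCCTGA"                 # placeholder transcript sequence
    flank = 5000                            # flanking context required by the model

    # Padding used in the original SpliceAI evaluation: runs of N's (zero columns).
    padded_with_ns = one_hot("N" * flank + gene + "N" * flank)

    # Padding used in our ISM benchmarks: the real genomic neighborhood instead
    # (upstream_seq and downstream_seq would be the actual flanking DNA).
    # padded_with_genome = one_hot(upstream_seq + gene + downstream_seq)
    ```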

    Additionally, we note that both the SpliceAI and OSAIMANE ‘models’ are the averaged result of five individual models, each initialized with slightly different weights. During the prediction process, each individual model was found to have discernibly different performance. Averaging their outputs using the deep-ensemble approach (Fort et al., 2019; Lakshminarayanan et al., 2017) improved the overall performance of both SpliceAI and OpenSpliceAI while reducing sensitivity to local variations. In essence, this method normalizes the inherent randomness of the individual models, resulting in predictions that are more robust and better represent the expected behavior, ultimately yielding improved average performance across large datasets. OpenSpliceAI’s ‘predict’ submodule averages across all five models by default, but it also supports prediction using a single model.
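
    A minimal sketch of this deep-ensemble averaging is shown below; the list of models and the input batch are placeholders, and OpenSpliceAI’s ‘predict’ submodule performs the equivalent step internally:

    ```python
    import torch

    def ensemble_predict(models, x):
        """Average softmax outputs across independently trained models
        (e.g. the five released checkpoints). `x` is a one-hot batch of
        shape (N, 4, L); the result has shape (N, 3, L)."""
        probs = []
        with torch.no_grad():
            for m in models:
                m.eval()
                probs.append(torch.softmax(m(x), dim=1))
        return torch.stack(probs).mean(dim=0)   # element-wise mean over the ensemble
    ```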

    In summary, OpenSpliceAI is a fully open-source, accessible, and computationally efficient deep learning system for splice site prediction. Its modular architecture, enhanced performance, and adaptability make it a powerful tool for advancing research on gene regulation and splicing across diverse species.

    Continue Reading