TikTok and other social media platforms are hosting AI-generated deepfake videos of doctors whose words have been manipulated to help sell supplements and spread health misinformation.
The factchecking organisation Full Fact has uncovered hundreds of such videos featuring impersonated versions of doctors and influencers directing viewers to Wellness Nest, a US-based supplements firm.
All the deepfakes involve real footage of a health expert taken from the internet. However, the pictures and audio have been reworked so that the speakers are encouraging women going through menopause to buy products such as probiotics and Himalayan shilajit from the company’s website.
The revelations have prompted calls for social media giants to be much more careful about hosting AI-generated content and quicker to remove content that distorts prominent people’s views.
“This is certainly a sinister and worrying new tactic,” said Leo Benedictus, the factchecker who undertook the investigation, which Full Fact published on Friday.
He added that the creators of deepfake health videos deploy AI so that “someone well-respected or with a big audience appears to be endorsing these supplements to treat a range of ailments”.
Prof David Taylor-Robinson, an expert in health inequalities at Liverpool University, is among those whose image has been manipulated. In August, he was shocked to find that TikTok was hosting 14 doctored videos purporting to show him recommending products with unproven benefits.
Though Taylor-Robinson is a specialist in children’s health, in one video the cloned version of him was talking about an alleged menopause side-effect called “thermometer leg”.
The fake Taylor-Robinson recommended that women in menopause should visit a website called Wellness Nest and buy what it called a natural probiotic featuring “10 science-backed plant extracts, including turmeric, black cohosh, Dim [diindolylmethane] and moringa, specifically chosen to tackle menopausal symptoms”.
Female colleagues “often report deeper sleep, fewer hot flushes and brighter mornings within weeks”, the deepfake doctor added.
The real Taylor-Robinson discovered that his likeness was being used only when a colleague alerted him. “It was really confusing to begin with – all quite surreal,” he said. “My kids thought it was hilarious.
“I didn’t feel desperately violated, but I did become more and more irritated at the idea of people selling products off the back of my work and the health misinformation involved.”
The footage of Taylor-Robinson used to make the deepfake videos came from a talk on vaccination he gave at a Public Health England (PHE) conference in 2017 and a parliamentary hearing on child poverty at which he gave evidence in May this year. In one misleading video, he was depicted swearing and making misogynistic comments while discussing menopause.
TikTok took down the videos six weeks after Taylor-Robinson complained. “Initially, they said some of the videos violated their guidelines but some were fine. That was absurd – and weird – because I was in all of them and they were all deepfakes. It was a faff to get them taken down,” he said.
Full Fact found that TikTok was also carrying eight deepfakes featuring doctored statements by Duncan Selbie, the former chief executive of PHE. Like Taylor-Robinson, he was falsely shown talking about menopause, using video taken from the same 2017 event where Taylor-Robinson spoke.
One, also about “thermometer leg”, was “an amazing imitation”, Selbie said. “It’s a complete fake from beginning to end. It wasn’t funny in the sense that people pay attention to these things.”
Full Fact also found similar deepfakes on X, Facebook and YouTube, all linked to Wellness Nest or an associated British outlet called Wellness Nest UK. It has posted apparent deepfakes of high-profile doctors such as Prof Tim Spector and another diet expert, the late Dr Michael Mosley.
Wellness Nest told Full Fact that deepfake videos encouraging people to visit the firm’s website were “100% unaffiliated” with its business. It said it had “never used AI-generated content”, but “cannot control or monitor affiliates around the world”.
Helen Morgan, the Liberal Democrat health spokesperson, said: “From fake doctors to bots that encourage suicide, AI is being used to prey on innocent people and exploit the widening cracks in our health system.
“Liberal Democrats are calling for AI deepfakes posing as medical professionals to be stamped out, with clinically approved tools strongly promoted so we can fill the vacuum.
“If these were individuals fraudulently pretending to be doctors, they would face criminal prosecution. Why is the digital equivalent being tolerated?
“Where someone seeks health advice from an AI bot they should be automatically referred to NHS support so they can get the diagnosis and treatment they actually need, with criminal liability for those profiting from medical disinformation.”
A TikTok spokesperson said: “We have removed this content [relating to Taylor-Robinson and Selbie] for breaking our rules against harmful misinformation and behaviours that seek to mislead our community, such as impersonation.
“Harmfully misleading AI-generated content is an industry-wide challenge, and we continue to invest in new ways to detect and remove content that violates our community guidelines.”
The Department of Health and Social Care was approached for comment.