Victims of deepfake image abuse have called for stronger protection against AI-generated explicit images, as the law criminalising the creation of non-consensual intimate images comes into effect.
Campaigners from Stop Image-Based Abuse delivered a petition to Downing Street with more than 73,000 signatures, urging the government to introduce civil routes to justice such as takedown orders for abusive imagery on platforms and devices.
“Today’s a really momentous day,” said Jodie, a victim of deepfake abuse who uses a pseudonym.
“We’re really pleased the government has put these amendments into law that will definitely protect more women and girls. They were hard-fought victories by campaigners, particularly the consent-based element of it,” she added.
In the petition, campaigners are also calling for improved relationships and sex education, as well as adequate funding for specialist services, such as the Revenge Porn Helpline, which support intimate image abuse victims.
Jodie, who is in her 20s, discovered in 2021 that images of her were being used in deepfake pornography. She and 15 other women testified against the perpetrator, 26-year-old Alex Woolf, who had taken images of women from social media and posted them to porn websites. He was convicted and sentenced to 20 weeks in prison.
“I had a really difficult route to getting justice because there simply wasn’t a law that really covered what I felt had been done to me,” said Jodie.
The offence of creating explicit deepfake images was introduced as an amendment to the Data (Use and Access) Act 2025. While the law received royal assent last July, the offence did not come into force until Friday.
Many campaigners, including Jodie, were frustrated by delays to the law coming into effect. “We had these amendments ready to go with royal assent before Christmas,” said Jodie. “They should have brought them in immediately. The delay has caused millions more women to become victims, and they won’t be able to get the justice they desperately want.”
In January, Leicestershire police opened an investigation into a case involving sexually explicit deepfake images created using Grok AI.
Madelaine Thomas, a sex worker and founder of tech forensics company Image Angel, who has waived her right to anonymity, said it was “a very emotional day” for her and other victims. However, she said the law falls short of protecting sex workers from intimate image abuse.
“When commercial sexual images are misused, they’re only seen as a copyright breach. I respect that,” Thomas said. “However, the proportion of available responses doesn’t match the harm that occurs when you experience it. By discounting commercialised intimate image abuse, you are not giving people who are going through absolute hell the opportunity to get the help they need.”
For the last seven years, intimate images of her have been shared without her consent almost every day. “When I first found out that my intimate images were shared, I felt suicidal, frankly, and it took a long time to recover from that.”
One in three women in the UK have experienced online abuse, according to domestic abuse organisation Refuge.
Stop Image-Based Abuse is a movement composed of the End Violence Against Women Coalition, the victim campaign group #NotYourPorn, Glamour UK and Clare McGlynn, a professor of law at Durham University.
A Ministry of Justice spokesperson said: “Weaponising technology to target and exploit people is completely abhorrent. It’s already illegal to share intimate deepfakes – and as of yesterday, creating them is a criminal offence too.
“But we’re not stopping there. We’re going after the companies behind these ‘nudification’ apps, banning them outright so we can stop this abuse at source.
“The technology secretary has also confirmed that creating non-consensual sexual deepfakes will be made a priority offence under the Online Safety Act, placing extra duties on platforms to proactively prevent this content from appearing.”