UK Bans 'Nudification' AI Apps in £1bn Plan to Protect Women and Girls

The UK government has launched a major new offensive against violence targeting women and girls, pledging a £1 billion investment and announcing a pioneering ban on harmful artificial intelligence applications.

New Tech Safeguards and AI Crackdown

Central to the new strategy is a commitment to prevent children in the UK from taking, sharing, or viewing explicit images on their phones. The government will work with technology companies to develop and implement nudity detection filters on smartphones. It will also outlaw so-called 'nudification' apps, which use AI to generate fake sexually explicit images of real people without their consent.

Safeguarding Minister Jess Phillips declared the move a fundamental shift in approach. "For too long, on violence against women and girls, we have treated the symptoms and not the cause. No more," she stated. "The prevention measures we have announced today will save the lives of our next generation of girls."

Addressing a Pervasive Online Threat

The ban targets a disturbingly widespread problem. Research cited by the government revealed that in just one month during 2023, more than 24 million people accessed nudification websites. An overwhelming 96% of sexual deepfake images were found to feature women.

Technology Secretary Liz Kendall emphasised the government's stance: "Women and girls deserve to be safe online as well as offline. We will not stand by while technology is weaponised to abuse, humiliate and exploit them." She warned that those profiting from such software would face the full force of the law.

Victims and Campaigners Welcome 'Safety Net'

The announcement has been welcomed by campaigners and those directly affected by these technologies. Lisa Squire, who has campaigned for action since her 21-year-old daughter Libby was murdered in 2019, said it finally felt like a "safety net" was being created for young people. "Until today it's felt like nothing's been done, but now it feels there is," she told The Mirror.

Roxy Longworth, founder of the Behind Our Screens campaign, was manipulated into sharing intimate photos at age 13, leading to harassment and a mental health crisis. She praised the planned device controls: "If device controls like these had existed when I was 13, my life would have been completely different... It's so important that technology is used to protect young people, not harm them."

The comprehensive Violence Against Women and Girls (VAWG) strategy also includes:

  • Up to £50 million in specialist funding for NHS services supporting survivors of sexual violence.
  • Specialist training for teachers to address misogyny and teach young people to challenge harmful behaviour.
  • New forensic techniques and cutting-edge technology for police to clamp down on abusers and help reopen cold cases.

While charities including Refuge and Women's Aid commended the cross-government approach, they cautioned that frontline support services are already stretched beyond capacity, and that the strategy must be backed by sufficient funding to ensure survivors receive the help they need.