Intimate Image Abuse: Software

(asked on 29th October 2025)

Question to the Home Office:

To ask the Secretary of State for the Home Department, what assessment her Department has made of the potential impact of nudification apps on boys and girls under 18.


Answered by
Jess Phillips
Parliamentary Under-Secretary (Home Office)
This question was answered on 5th November 2025

The Government is aware of concerns about the impacts of nudification apps on children and their role in facilitating violence against women and girls. AI-generated child sexual abuse material can have a direct impact on real children. Offenders use AI to create photorealistic abuse imagery that often features real children, for example children known to the offender or existing victims. We also know that offenders are using AI imagery to groom and blackmail children.

We are taking action on non-consensual intimate image abuse, having criminalised the creation of intimate images without consent (or a reasonable belief in consent) in the Data (Use and Access) Act. This built on the existing offences introduced by the Online Safety Act for sharing, or threatening to share, intimate images, including deepfakes.

Furthermore, in the Crime and Policing Bill, this Government is protecting children from the growing threat of online predators by becoming the first country in the world to criminalise AI tools that generate child sexual abuse images.

We are going even further in the Crime and Policing Bill by introducing offences of taking an intimate image without consent (or a reasonable belief in consent), and of installing equipment with the intent of taking such an image.

Regarding a prohibition of ‘nudification’ apps, the Government is actively considering what action is needed to ensure that any intervention in this area is effective, and will provide an update in due course.
