Asked by: Sarah Pochin (Reform UK - Runcorn and Helsby)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, when regulated online service providers will be required to pre-screen for known child sexual abuse material, in the context of the recommendations of the Independent Inquiry into Child Sexual Abuse.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
The Online Safety Act contains provisions to address the Independent Inquiry into Child Sexual Abuse’s recommendation. Under Section 121 of the Act, Ofcom has the power, where necessary and proportionate, to require regulated services to use accredited technology to detect and remove child sexual exploitation and abuse content, including in private or encrypted channels.
Ofcom will be able to issue a technology notice once minimum standards for accredited technologies have been published and its accreditation scheme is in place. Ofcom will submit its advice on minimum standards to the Secretary of State by April 2026.
Asked by: Sarah Pochin (Reform UK - Runcorn and Helsby)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, when enhanced age verification and online safety measures to protect children from online facilitated sexual abuse will be implemented, in the context of the recommendations of the Independent Inquiry into Child Sexual Abuse.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
The Online Safety Act already meets the Inquiry’s recommendations on age verification and online safety measures. The child safety duties require regulated services to implement highly effective age assurance to prevent children from accessing the most harmful content, including pornography, and to implement age-appropriate measures to protect children from other legal but harmful material, such as bullying or violent content.
The illegal content safety duties go beyond age verification. Child sexual exploitation and abuse material is a priority offence, and under the duties, services must take proactive steps to prevent it appearing and remove it swiftly if it does.