Question to the Department for Science, Innovation & Technology:
To ask His Majesty's Government what plans they have to require platforms (1) to assess and mitigate the risks of hosting or organising illegal sexual harm communities, and (2) to respond promptly to credible notifications of such communities.
The Online Safety Act requires user‑to‑user services to assess the risks of different kinds of illegal harm on their platforms, including child sexual exploitation and abuse, grooming, intimate image abuse and extreme pornography, and to take proportionate steps to mitigate those risks, including where such harms are organised or facilitated through groups.
Services must also have effective systems and processes in place to prevent, detect and act against illegal content and activity, both proactively and in response to notifications. Ofcom, as the independent regulator, sets out the measures it expects services to take in statutory codes of practice, which came into force in July 2025, including measures on proactive technologies such as hash‑matching.