
Written Question
Gaming: Internet
Tuesday 10th February 2026

Asked by: Mike Reader (Labour - Northampton South)

Question to the Department for Science, Innovation & Technology:

To ask the Secretary of State for Science, Innovation and Technology, what assessment her Department has made of the effectiveness of the Online Safety Act 2023 in protecting children from harm on online gaming platforms, including Roblox; and whether she plans to undertake a review of the Act’s application to such platforms.

Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)

Gaming platforms that allow users to post or interact, such as Roblox, are in scope of the Online Safety Act. They are required to protect children from illegal and harmful content on their service, including using highly effective age assurance to prevent children encountering the most harmful types of content.

Ofcom is the regulator of the Act and has powers to take robust enforcement action. Ofcom has already used these powers, effectively enforcing against non-compliant services.

We will continue to monitor the effectiveness of the Act. On 20 January, the government announced a short, swift consultation on further measures to enhance children's wellbeing and ensure they have a healthy relationship with social media, accompanied by a national conversation.


Written Question
Radicalism: Internet
Tuesday 10th February 2026

Asked by: Mark Pritchard (Conservative - The Wrekin)

Question to the Ministry of Defence:

To ask the Secretary of State for Defence, what steps he is taking to counter engagement with extreme online political content by members of the armed forces.

Answered by Louise Sandher-Jones - Parliamentary Under-Secretary (Ministry of Defence)

The Ministry of Defence remains vigilant to the risks associated with Service personnel engaging with extremist or extreme online political content and treats such matters with the utmost seriousness. Such behaviour is wholly incompatible with the values and standards of the Armed Forces.

Defence maintains clear expectations of conduct, requiring all personnel to uphold the core values of respect, integrity and commitment, and to adhere to strict rules on political impartiality. It also voluntarily applies the Government’s Prevent Duty. Service regulations set out clear restrictions on online and public activity to ensure personnel do not engage in behaviour that could undermine the reputation, neutrality or operational effectiveness of the Armed Forces. Through a combination of clear behavioural standards, mandatory training, counter-terrorism intelligence, vetting and robust personnel policies, Defence works to reduce the risk of Armed Forces personnel engaging with extreme online political content.


Written Question
Subscriptions: Internet
Monday 9th February 2026

Asked by: Vikki Slade (Liberal Democrat - Mid Dorset and North Poole)

Question to the Department for Business and Trade:

To ask the Secretary of State for Business and Trade, what assessment his Department has made of the potential impacts of a 14-day cooling off period for all online subscriptions on the number of people that cancel their subscription after visiting a site run by a charity in a 14-day period.

Answered by Kate Dearden - Parliamentary Under Secretary of State (Department for Business and Trade)

The requirement for a 14-day cooling-off period for distance contracts is an existing requirement under the Consumer Contracts Regulations 2013. The government has consulted on the implementation of the subscriptions regime in the Digital Markets, Competition and Consumer Act 2024. The consultation received over 70 responses, including 15 from charitable organisations, and the government is engaging closely with the sector to understand the impacts on both consumers and these bodies.

The impact assessment for the subscriptions chapter in the Digital Markets, Competition and Consumer Act can be found here: Subscription traps: annex 2 impact assessment. Together, the subscription measures are anticipated to provide £400m of consumer benefits per year and the estimated net direct cost to businesses is £171m per year. Sector-specific analysis has not been conducted.


Written Question
Internet: Data Protection
Monday 9th February 2026

Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)

Question to the Department for Science, Innovation & Technology:

To ask His Majesty's Government what steps they are taking to ensure that online platforms operating in the UK comply with data access and privacy requirements, in light of recent regulatory scrutiny of messaging services.

Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)

All organisations in the UK that provide online and messaging services to their customers have to comply with the requirements of the UK’s data protection and privacy framework, as set out in the UK General Data Protection Regulation (UK GDPR), the Data Protection Act 2018 (DPA) and the Privacy and Electronic Communications Regulations 2003 (PECR). The UK GDPR and the DPA also apply to online platforms based outside of the UK that are processing UK residents’ data for the purposes of providing goods and services or monitoring behaviour.

As such, the handling of people’s data by online platforms should be lawful, fair, transparent and secure. The data protection legislation gives people the right to be informed about the collection and use of their personal data, as well as rights to request access to their data, object to its processing or seek its erasure.

The Information Commissioner, the UK’s independent regulator for data protection, publishes a range of guidance to help organisations comply with the legislation and has the power to investigate and impose penalties for non-compliance.


Written Question
Offences against Children: Internet
Friday 6th February 2026

Asked by: Jim Shannon (Democratic Unionist Party - Strangford)

Question to the Home Office:

To ask the Secretary of State for the Home Department, how many online child sexual abuse offences have been recorded in England and Wales in the last 3 years.

Answered by Jess Phillips - Parliamentary Under-Secretary (Home Office)

Online child sexual abuse offences are captured in police recorded crime via an online crime flag being applied to a series of offences deemed most likely to be child sexual abuse. This includes contact sexual offences and obscene publications offences which act as a proxy for indecent images of children (IIOC) offences.

In April 2015, it became mandatory for all forces to return quarterly information on the number of crimes flagged as being committed online as part of the Annual Data Requirement (ADR). Since April 2024 this has been supported by the National Data Quality Improvement Service (NDQIS) which aims to improve the quality and consistency of flagging. Data released prior to 2024 are not directly comparable due to the move to NDQIS.

The online crime flag refers to any crime committed either in full, or in part, through use of online methods or platforms. The online crime flag helps provide a national and local picture of how internet and digital communications technology are being used to commit crimes, and an understanding of the prominence of certain crimes that are happening online, compared to offline.

An offence should be flagged where online methods or internet-based activities were used to facilitate the offence (e.g. through email, social media, websites, messaging platforms, gaming platforms, or smart devices). In April 2024, recording guidelines were amended to clarify that offences committed via SMS text messages or online-platform-enabled phone calls should also be flagged.

These data are published quarterly via the Office for National Statistics (ONS), originally in ‘Other related tables’ and now in ‘Appendix tables’ as per links below.

Period                                              Child sexual offences   Proportion   Obscene publications offences   Proportion
Year to September 2025 – Appendix Table C5          14,515                  23%          32,191                          75%
Year to September 2024 – Appendix Table C5          13,987                  23%          28,269                          71%
Year to September 2023 – Other related tables, F11  12,568                  20%          26,024                          64%

Note: Data across the years are not comparable due to continued improvements to the processing of online flags.

The Government is committed to tackling all forms of child sexual abuse and exploitation, to taking robust action to better safeguard children, to ensuring victims and survivors receive appropriate care and support, and to pursuing offenders and bringing them to justice.


Written Question
Internet: Data Protection
Thursday 5th February 2026

Asked by: Peter Bedford (Conservative - Mid Leicestershire)

Question to the Department for Science, Innovation & Technology:

To ask the Secretary of State for Science, Innovation and Technology, whether her Department plans to review the requirement for platforms to implement client-side scanning and other automated content analysis tools under the Online Safety Act 2023 in the context of the scanning of private cloud storage and encrypted communications.

Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)

The Online Safety Act does not require platforms to implement client-side scanning or other automated content analysis tools on content communicated privately. The Act states that Ofcom may not recommend the use of proactive technology, such as client-side scanning, to analyse user-generated content communicated privately.

This means that Ofcom’s codes cannot recommend that service providers deploy proactive technology in private or encrypted communications. The Department has no plans to review this section of the Act.


Written Question
Prescription Drugs: Sales
Thursday 5th February 2026

Asked by: Stuart Anderson (Conservative - South Shropshire)

Question to the Department of Health and Social Care:

To ask the Secretary of State for Health and Social Care, what steps he is taking to help restrict the sale of illegal prescription drugs online.

Answered by Zubir Ahmed - Parliamentary Under-Secretary (Department of Health and Social Care)

The Medicines and Healthcare products Regulatory Agency (MHRA) is responsible for the regulation of medicines for human use, medical devices, and blood products for transfusion in the United Kingdom. This includes applying the legal controls on the retail sale, supply, and advertising of medicines which are set out in the Human Medicines Regulations 2012.

Sourcing medicines from unregulated suppliers significantly increases the risk of getting a product which is either falsified or not authorised for use. Products purchased in this way will not meet the MHRA’s strict quality and safety standards and could expose patients to incorrect dosages or dangerous ingredients. The MHRA’s Criminal Enforcement Unit works to prevent, detect, and investigate illegal activity involving medicines and medical devices. It works closely with other health regulators, customs authorities, law enforcement agencies, and private sector partners, including the e-commerce and internet industries, to identify, remove, and block online content promoting the illegal sale of medicines and medical devices.

The MHRA seeks to identify and, where appropriate, prosecute online sellers responsible for putting public health at risk. In 2025, the MHRA and its partners seized almost 20 million doses of illegally traded medicines with a street value of nearly £45 million.

During the same period, it disrupted over 1,500 websites and posts on social media accounts selling medicinal products illegally. Additionally, collaboration with one well-known online marketplace led to the successful identification and blocking of more than two million unregulated prescription medicines, over-the-counter medicines, and medical devices before they could be offered for sale to the public.

The MHRA is continually developing new and innovative ways to combat the illegal trade in medicines and to raise public awareness. These measures include:

- publication of a #Fakemeds campaign which explains how to access medicines through safe and legitimate online sources, with further information available at the following link:
https://fakemeds.campaign.gov.uk/;

- public guidance on how to safely access and use GLP-1 medications, available at the following link:
https://www.gov.uk/government/publications/glp-1-medicines-for-weight-loss-and-diabetes-what-you-need-to-know/glp-1-medicines-for-weight-loss-and-diabetes-what-you-need-to-know;

- implementation of a web-based reporting scheme allowing users to report suspicious online sellers to the MHRA;

- rollout of an online service which will allow users to check if a website has been deemed ‘Not Recommended’ by the MHRA; and

- extensive work with media outlets to raise awareness of the dangers of illegal medicines.


Written Question
Gambling: Internet
Wednesday 4th February 2026

Asked by: Lord Foster of Bath (Liberal Democrat - Life peer)

Question to the Department for Digital, Culture, Media & Sport:

To ask His Majesty's Government what recent discussions they have had with Ofcom regarding the regulation of gambling content and advertising online for young people, including the interaction of those regulations with the framework set out in the Online Safety Act 2023.

Answered by Baroness Twycross - Baroness in Waiting (HM Household) (Whip)

Gambling is regulated by the Gambling Commission under the Gambling Act 2005. Rules on gambling advertising content are regulated by the Advertising Standards Authority. Gambling advertising is not covered under the Online Safety Act, and as such no discussions with Ofcom have taken place.

The Government recognises that more work needs to be done to ensure that gambling advertising does not exacerbate harm. We engage regularly with stakeholders across government and with industry to ensure the most vulnerable are protected.


Written Question
Digital Technology and Internet: Abuse
Tuesday 3rd February 2026

Asked by: Gideon Amos (Liberal Democrat - Taunton and Wellington)

Question to the Department for Science, Innovation & Technology:

To ask the Secretary of State for Science, Innovation and Technology, what plans her Department has for cross-government working to encourage safety by design of smart and connected technology to help protect victims and survivors of technology-facilitated abuse.

Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)

Tackling Violence Against Women and Girls (VAWG) in all its forms, including online, is a priority for this Government. That is why, in December, we published the cross-government VAWG Strategy.

Within the Strategy, we commit to working across departments to explore what more we can do to encourage safety‑by‑design in smart and connected technologies. This work aims to better protect victims and survivors, and to prevent perpetrators from misusing these technologies to facilitate abuse.


Written Question
Internet: Disinformation
Tuesday 3rd February 2026

Asked by: Jim Shannon (Democratic Unionist Party - Strangford)

Question to the Department for Science, Innovation & Technology:

To ask the Secretary of State for Science, Innovation and Technology, what steps her Department is taking to tackle deepfakes online.

Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)

The Online Safety Act requires services to tackle illegal content and protect children from harmful content, including AI-generated ‘deepfakes’.

Building on this, the offence of creating intimate images without consent, including using AI, will come into force in the coming weeks, and it will be made a priority offence under the Act, giving users the strongest protections from such content. We are also criminalising nudification apps, making it illegal for companies to supply such tools.

We are also running the Deepfake Detection Challenge 2026, a programme aimed at strengthening capabilities to detect and mitigate synthetic media threats.