Asked by: Freddie van Mierlo (Liberal Democrat - Henley and Thame)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what steps she is taking to improve media literacy.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
The Government is taking a cross‑government approach to improving media literacy, as set out in ‘A Safe, Informed Digital Nation’, published on 16 March.
This includes strengthening coordination across policy areas and working with civil society and industry to help people build the skills, confidence and critical thinking needed to navigate the online world safely and effectively.
Initiatives include the ‘You Won’t Know Until You Ask’ campaign, which encourages people to pause and question online content, alongside trusted guidance on the new Kids Online Safety Hub and funding for innovative projects through the Digital Inclusion Innovation Fund.
Asked by: Freddie van Mierlo (Liberal Democrat - Henley and Thame)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what steps she is taking to ensure that the Information Commissioner's Office is adequately resourced to carry out digital age enforcement cases against tech companies.
Answered by Ian Murray - Minister of State (Department for Science, Innovation and Technology)
The government increased the data protection fee in 2025 to provide the ICO with the necessary resources to carry out its functions effectively. As the ICO is an independent regulator, it is at the Commissioner's discretion how this funding is used to enforce the digital age of consent under UK GDPR. To fulfil these responsibilities and respond to rising public and business demand, the ICO has hired additional specialist capacity.
Asked by: Freddie van Mierlo (Liberal Democrat - Henley and Thame)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what discussions she has had with Ofcom and the Information Commissioner's Office on the adequacy of protections relating to (a) generative AI and (b) chatbots in the Online Safety Act 2023.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
Following public consultation, the Information Commissioner’s Office (ICO) issued and updated guidance on how data protection law applies to generative AI. The Government supports the ICO’s role in providing guidance to organisations to help their compliance.
While some AI chatbots are covered by the Online Safety Act, this Government is determined to close loopholes and has tabled an amendment to the Crime and Policing Bill to protect users from illegal content on chatbots.
The Department will continue to meet regularly with Ofcom, the ICO and industry, to address emerging risks and uphold strong online safety protections.
Asked by: Freddie van Mierlo (Liberal Democrat - Henley and Thame)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what assessment she has made of the suitability of (a) live location sharing and (b) addictive content features such as autoplay for social media apps used by children.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
The Online Safety Act requires services to mitigate and manage risks to children from online features and functionalities. Ofcom recommends in its Codes of Practice that services with specific risks should disable live location sharing for children by default. Services must also consider how specific features and functionalities, such as autoplay, can increase children’s exposure to illegal or harmful content, and mitigate these risks.
Additionally, our landmark consultation launched earlier this month seeks views on whether the government should further restrict risky functionalities such as location sharing, and ‘addictive’ functionalities including autoplay, to further protect children online.
Asked by: Freddie van Mierlo (Liberal Democrat - Henley and Thame)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what steps she is taking to ensure that Ofcom is adequately resourced to (a) monitor and (b) regulate the algorithms of online platforms.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
Ofcom’s online safety budget and expert team ensure its duties are performed effectively. Ofcom has spent approximately £281.3 million on online safety since 2020, including a projected spend of £92 million for 2025/26.
As part of its information gathering powers, Ofcom can remotely view information about a service’s processes, including conducting tests of algorithmic systems. Ofcom also has the power to seek information from categorised services about the design and operation of their algorithms in annual transparency reports.
Asked by: Freddie van Mierlo (Liberal Democrat - Henley and Thame)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what proportion of UKRI and other research council funding was spent on (a) dementia, (b) cancer, (c) stroke and (d) coronary heart disease research in each year between 2019 and 2025.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
The Medical Research Council (MRC), which is part of UK Research and Innovation (UKRI), supports world‑leading research to accelerate diagnosis, develop treatments and prevent disease.
Details of funding from MRC, as well as from other research councils within UKRI, on specific areas are provided in the tables below:
(a) Dementia*

| | 2019/20 | 2020/21 | 2021/22 | 2022/23 | 2023/24 | 2024/25 | Total |
| --- | --- | --- | --- | --- | --- | --- | --- |
| MRC | £44m | £54m | £50m | £56m | £65m | £56m | £334m |
| Rest of UKRI | | £29m | £30m | £31m | £32m | £23m | £145m |
| Total | £44m | £83m | £81m | £87m | £97m | £88m | £479m |

(b) Cancer

| | 2019/20 | 2020/21 | 2021/22 | 2022/23 | 2023/24 | 2024/25 | Total |
| --- | --- | --- | --- | --- | --- | --- | --- |
| MRC | £68m | £70m | £71m | £106m | £73m | £74m | £462m |
| Rest of UKRI | £61m | £81m | £69m | £128m | £143m | £125m | £607m |
| Total | £129m | £151m | £140m | £234m | £216m | £199m | £1,069m |

(c) Stroke

| | 2019/20 | 2020/21 | 2021/22 | 2022/23 | 2023/24 | 2024/25 | Total |
| --- | --- | --- | --- | --- | --- | --- | --- |
| MRC | £47m | £9m | £21m | £10m | £15m | £20m | £121m |
| Rest of UKRI | £6m | £30m | £12m | £31m | £50m | £30m | £148m |
| Total | £53m | £39m | £33m | £41m | £65m | £50m | £269m |

(d) Coronary heart disease

| | 2019/20 | 2020/21 | 2021/22 | 2022/23 | 2023/24 | 2024/25 | Total |
| --- | --- | --- | --- | --- | --- | --- | --- |
| MRC | £73m | £18m | £29m | £44m | £32m | £64m | £260m |
| Rest of UKRI | £23m | £24m | £25m | £49m | £84m | £55m | £260m |
| Total | £96m | £42m | £54m | £93m | £116m | £119m | £520m |
*'Rest of UKRI' spend figure for 2019/20 is unavailable. For 2024/25, ‘Rest of UKRI’ figure does not include funding from the Natural Environment Research Council (NERC).
Asked by: Freddie van Mierlo (Liberal Democrat - Henley and Thame)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, with reference to the Online Safety Act 2023, how the department is ensuring that the voices of children are considered in the implementation of the Act, to help ensure that their concerns and experiences are heard and acted on.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
On 2 March, the government launched a landmark consultation on how to give young people the childhood they deserve in an online world. Alongside the formal consultation, we have launched a child and parent-friendly version, ensuring these important voices are properly heard.
As part of the National Conversation running alongside the consultation, we will be hosting events across the UK to hear directly from young people. Families, young people, and communities from all over the UK are encouraged to discuss this vital topic in community events, MP-led local conversations, and engagement through schools and civil society organisations.
Asked by: Freddie van Mierlo (Liberal Democrat - Henley and Thame)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what steps the department is taking to ensure the implementation of the Online Safety Act 2023 does not inappropriately impact on individual rights to privacy.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
The right to privacy is central to our online safety work. The Online Safety Act has cross-cutting duties to ensure that users’ rights and privacy are protected. All providers are required to give particular regard to the importance of protecting users’ rights when implementing measures to comply with their new safety duties.
As the independent regulator of the Online Safety Act, Ofcom may refer matters to the Information Commissioner's Office if it has concerns that a provider has not complied with its obligations under data protection law.
Asked by: Freddie van Mierlo (Liberal Democrat - Henley and Thame)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what steps the department is taking to ensure that age verification measures implemented by social media apps such as Snapchat in response to the Online Safety Act 2023 are effective.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
As the regulator, Ofcom is responsible for the implementation and enforcement of the Online Safety Act. Ofcom has set out in guidance that age assurance technologies should fulfil the four criteria of technical accuracy, robustness, reliability, and fairness to be considered highly effective.
Ofcom is set to publish reports on age assurance and the use of app stores by children by July 2026 and January 2027 respectively. The public consultation on protecting children online will also seek views on strengthening age assurance measures. Where evidence demonstrates further action is necessary to protect children online, we will not hesitate to act.
Asked by: Freddie van Mierlo (Liberal Democrat - Henley and Thame)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what steps the Department is taking to limit exposure of children to harmful content on (a) self-harm and (b) eating disorders through social media algorithms.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
The Online Safety Act requires services, including social media, to protect children from illegal, harmful, and age-inappropriate content.
In-scope services that are likely to be accessed by children must use highly effective age assurance to prevent children from encountering the most harmful types of content, such as content that encourages, promotes or provides instructions for self-harm and eating disorders.
The Act requires services to consider, as part of their risk assessments, how algorithms could impact children’s exposure to illegal content and content which is harmful to children on their service.
Ofcom can take robust enforcement action against services failing to comply with their duties.