Asked by: Ian Lavery (Labour - Blyth and Ashington)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what further action the Government plans to take to help reduce the negative impacts of social media on the mental health of young people and wider society.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
Protecting children from harm online is a priority for the Secretary of State and the Government. One of the Secretary of State’s first actions in the job was to criminalise intimate image abuse and cyberflashing. We have legislated to make content that promotes self-harm and suicide priority offences in the Online Safety Act. The Secretary of State and I have acted to prevent platforms hosting child sexual abuse material and material that contributes to violence against women and girls: we have banned AI nudification apps, required platforms to take down non-consensual intimate images within 48 hours of their being reported, ensured that women need only report non-consensual intimate images once, and required platforms to act faster to address intimate images, strangulation pornography, and pornography depicting adults role-playing as children. We have always been clear that there is still more to do.
On 2 March we published a consultation and national conversation seeking views and evidence on a range of measures that could further protect children online and enhance their wider wellbeing.
The consultation explores banning social media and gaming for children below a certain age and restricting access to risky and ‘addictive’ features and functionalities.
Asked by: Ian Lavery (Labour - Blyth and Ashington)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, whether the Government plans to introduce legislation to limit the creation of echo chambers and the use of harmful algorithms that promote hate for financial gain.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
Under the Online Safety Act, platforms must tackle illegal content, including terrorist content and religious or race-based hatred. They must also protect children from additional forms of legal content, including hate or abuse.
Services must ensure their algorithms do not promote this content, and Ofcom has robust enforcement powers to ensure compliance. The Government has met Ofcom to encourage enforcement on this issue.
In response to the Science, Innovation and Technology Committee’s report, Social media, misinformation and harmful algorithms, the Government committed to ensuring individuals have a say over the content algorithms present to them. DSIT further committed to exploring options to require platforms to give users greater control over their algorithms in Protecting What Matters, the Government’s plan to improve social cohesion.
Asked by: Ian Lavery (Labour - Blyth and Ashington)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what consideration the Government has given to introducing further regulations on social media companies to address the prevalence of hate speech online.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
Under the Online Safety Act, platforms must tackle illegal content, including terrorist content and religious or race-based hatred. They must also protect children from additional forms of legal content, including hate or abuse.
Services must ensure their algorithms do not promote this content, and Ofcom has robust enforcement powers to ensure compliance. The Government has met Ofcom to encourage enforcement on this issue.
In response to the Science, Innovation and Technology Committee’s report, Social media, misinformation and harmful algorithms, the Government committed to ensuring individuals have a say over the content algorithms present to them. DSIT further committed to exploring options to require platforms to give users greater control over their algorithms in Protecting What Matters, the Government’s plan to improve social cohesion.
Asked by: Ian Lavery (Labour - Blyth and Ashington)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what steps the Government intends to take to regulate social media algorithms that promote or amplify hateful content.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
Under the Online Safety Act, platforms must tackle illegal content, including terrorist content and religious or race-based hatred. They must also protect children from additional forms of legal content, including hate or abuse.
Services must ensure their algorithms do not promote this content, and Ofcom has robust enforcement powers to ensure compliance. The Government has met Ofcom to encourage enforcement on this issue.
In response to the Science, Innovation and Technology Committee’s report, Social media, misinformation and harmful algorithms, the Government committed to ensuring individuals have a say over the content algorithms present to them. DSIT further committed to exploring options to require platforms to give users greater control over their algorithms in Protecting What Matters, the Government’s plan to improve social cohesion.
Asked by: Ian Lavery (Labour - Blyth and Ashington)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, whether the Government plans to bring forward further legislation to help prevent and hold online platforms accountable for the monetisation of hate-driven engagement.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
The Online Safety Act places duties on online platforms to tackle illegal content that stirs up hatred and to protect children from legal content that is hateful or abusive. Platforms must ensure their algorithms do not promote these types of content.
In March, MHCLG published Protecting What Matters, in which DSIT, in partnership with DCMS, committed to engaging the advertising industry and platforms to better understand how advertising can inadvertently fund legal but harmful content, and to consider potential solutions to this issue.
Asked by: Ian Lavery (Labour - Blyth and Ashington)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what assessment the Government has made of the level of requirement for additional legislation to help prevent social media companies from promoting extreme ideologies through their platforms.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
The government is committed to tackling extremism in all its forms, and we constantly review our understanding of emerging radicalising threats to our society.
Under the Online Safety Act, platforms must tackle illegal content, including terrorist content and that which stirs up hatred based on religion or race. They must also protect children from additional forms of legal content, including hateful or abusive content.
We are committed to keeping our online safety regime under review to ensure it keeps up with rapidly evolving harms.
Asked by: Ian Lavery (Labour - Blyth and Ashington)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, how the cyber risk to Government has changed in the last 5 years; how his Department's approach to cyber security has changed in that time; and what assessment he has made of how the Government's level of cyber resilience has changed in that time.
Answered by Ian Murray - Minister of State (Department for Science, Innovation and Technology)
Our approach to tackling Government cyber risk is driven by the 2022 Government Cyber Security Strategy, which sets a clear target for critical functions to be hardened against cyber attack by 2025.
We have taken important steps in understanding and mitigating risk: GovAssure has dramatically improved our understanding of cyber resilience levels across government and of the systemic issues preventing departments from achieving targets. The Government Cyber Coordination Centre enables us to respond as one government to cyber incidents, threats and vulnerabilities.
However, the threat picture is the most sophisticated it has ever been, and the UK's resilience is poorer than previously estimated. In January 2025, the NAO report into Government cyber resilience confirmed that, since 2022, the Government has not improved its cyber resilience quickly enough to meet its 2025 target. We welcome the report and are taking immediate action to address its recommendations.
We are accelerating our response through a more interventionist approach, which will address the long-standing shortage of cyber skills; strengthen accountability for cyber risks; provide greater support for delivery in the form of cyber services, guidance, and hands-on technical support; and bolster our response capabilities to fast-moving cyber incidents.
Asked by: Ian Lavery (Labour - Blyth and Ashington)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, whether her Department plans to be able to meet its target for the Government to be cyber resilient by the end of 2025.
Answered by Ian Murray - Minister of State (Department for Science, Innovation and Technology)
Our approach to tackling Government cyber risk is driven by the 2022 Government Cyber Security Strategy, which sets a clear target for critical functions to be hardened against cyber attack by 2025.
We have taken important steps in understanding and mitigating risk: GovAssure has dramatically improved our understanding of cyber resilience levels across government and of the systemic issues preventing departments from achieving targets. The Government Cyber Coordination Centre enables us to respond as one government to cyber incidents, threats and vulnerabilities.
However, the threat picture is the most sophisticated it has ever been, and the UK's resilience is poorer than previously estimated. In January 2025, the NAO report into Government cyber resilience confirmed that, since 2022, the Government has not improved its cyber resilience quickly enough to meet its 2025 target. We welcome the report and are taking immediate action to address its recommendations.
We are accelerating our response through a more interventionist approach, which will address the long-standing shortage of cyber skills; strengthen accountability for cyber risks; provide greater support for delivery in the form of cyber services, guidance, and hands-on technical support; and bolster our response capabilities to fast-moving cyber incidents.