(5 days, 6 hours ago)
Westminster Hall
Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.
Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
This information is provided by Parallel Parliament and does not comprise part of the official record.
It is a pleasure to serve under your chairmanship, Ms Jardine. I thank my hon. Friend the Member for North Ayrshire and Arran (Irene Campbell) for opening this important debate. I also thank the thousands of petitioners—some of whom are in the Public Gallery, which indicates the strength of public views on this matter—and all hon. Members here today, who have made powerful contributions.
I do not have a dog, so I will not enter the competition about whose dog is the cutest, but I do have two little children who try to touch every single dog we come across when we go around parks; they at least now know that they have to ask permission before they do that. I feel that I am not too far from having one of those cute dogs, or one like Frank, in our household.
The Government fully appreciate that the use of dogs for scientific and regulatory procedures stirs strong emotional feelings for many people across the UK, including myself as a dog lover. In my previous role in local government, I was responsible for the environment, including stray dogs, as part of a service shared with many surrounding authorities. In a bid to avoid having to put healthy dogs down, we set up London’s first dog hotel, which Peter Egan opened. We had a system where staff could come and take dogs out for walks. Every role I have had has involved looking after dogs, and I must say that I found the preparation for this debate very difficult.
Along with other Members present, I long for the day when we can finally bring an end to animal testing and the use of dogs in scientific research; it cannot come soon enough, but sadly it is not yet here. The UK is world leading in the development of non-animal methods, and the Government are keen to ensure that those are utilised wherever possible. That is why our manifesto commits us to partner with scientists, industry and civil society as we work towards phasing out animal testing. Colleagues mentioned the changes that the FDA has brought forward. To be clear, those bring it in line with us regarding the protection of animals, but where there is new learning to be done, we will absolutely look at that.
As part of our commitment to phasing out the use of animals in science, we will publish a strategy to support the development, validation and uptake of alternative methods. It will set out how we can build on our support by creating a research and innovation system that replaces animal testing with alternative methods wherever possible. However, for now, the carefully regulated use of animals, including dogs, in scientific research remains necessary to protect humans and the wider environment.
I will now expand on why, given the current state of science, we are unfortunately not yet ready to ban the use of dogs for testing and research purposes in the UK. The use of animals in science lies at the intersection of two vital public goods: the benefits to humans, animals and the environment, and the UK’s proud history of support for the highest possible standards of animal welfare.
The balance between these two public goods is reflected in the UK’s robust regulation of the use of animals in science through the Animals (Scientific Procedures) Act 1986, known as ASPA. The Act specifies that animals can be used in science only for specific limited purposes where there are no alternatives, where the number of animals used is the minimum needed to achieve the scientific benefit and where the potential harm to animals is limited to the absolute minimum. As has been mentioned, this is known as the three Rs: replacement, reduction and refinement.
The use of animals in science is therefore highly regulated, including through a three-tier system of licensing, which licenses each establishment, project and individual involved in performing regulated procedures involving animals. All establishments are required to have dedicated individuals, including veterinary surgeons with legal responsibilities for the care and welfare of animals, and an ethical review body that reviews any proposals for the use of animals and promotes the three Rs of animal use.
Our manifesto commitment stands in recognition of the fact that the phasing out of animal testing has to be in lockstep with the development of alternatives. As yet, the reality is that the technology is not advanced enough for alternative methods to completely replace the use of animals. For now, animal testing and research play an important role in supporting the development of new medicines and cutting-edge medical technologies for humans and animals, and they support the safety and sustainability of our environment.
Animal research has helped us to make life-changing discoveries, from new vaccines and medicines to transplant procedures, anaesthetics and blood transfusions. The development of the covid-19 vaccine, as with all vaccines, was made possible only because of the use of animals in research. Animals are used to assess how potential new medicines affect biological systems, ensuring that drugs are safe and effective before human trials. Many products that would be unsafe or ineffective, or that could cause harm to humans, are detected through animal testing, ensuring the safety of the healthy volunteers who take part in clinical trials, as well as of future patients.
We have heard from a number of Members today—some of them very learned Members of Parliament who have professional backgrounds in this area—about the serious doubts regarding the efficacy of some of the tests the Minister is referring to. Would she be willing after the debate to share with me the sources she is using to support her claims regarding the value of this testing?
I thank the hon. Member for that intervention, and I am happy to share the research and reasons behind my arguments.
For the reasons I have given, animal testing is required by the international agreements followed by all global medicines regulators, including the UK’s Medicines and Healthcare products Regulatory Agency. Although the MHRA does not require all medicines to be tested on two species, safety testing in a second species is required for most drugs, with dogs being one of the species that can be used.
The key proposal in the petition is for an immediate ban on the use of dogs in scientific and regulatory procedures. None of us wants dogs to be used in research, despite how carefully animal welfare is regulated. However, I regret to say that forbidding the use of dogs in medical research—without alternatives at the moment—would likely have catastrophic effects on the UK’s medical research system. We would be unable to meet international regulatory requirements for drug safety testing, preventing virtually all first-in-human trials in the UK and vastly reducing the number of subsequent clinical trials. A significant proportion of basic research would cease, preventing new insight into disease and treatments that save lives and improve people’s health. Forbidding the use of animals in medical research would also likely have a negative impact on animal welfare. Animal testing would move overseas, to countries where the regulations on the use of animals in science are less stringent than they are here.
I am proud to say that the UK is world leading in the development of alternative methods, and we are keen to utilise that technology as much as possible. As much as we can, we are striving to partner with regulators to see how advances in technology can phase out animal use where we are able to do that.
The Minister is making an interesting speech, because the Labour party manifesto commitment is very clear: we are looking to ban animal testing. We have talked about a road map, which Labour has committed to, so when will that be published and when will the strategy be published? I ask because those are vital things that people in the Public Gallery want to know today.
My hon. Friend intervenes at the right time, as I was about to say that in publishing our road map, we will be setting out how we can go even further in supporting alternative methods and working towards a world where the use of animals in science is eliminated in all but exceptional circumstances. That will be achieved by creating a research and innovation system that replaces animals with alternatives wherever possible.
Currently, through UKRI, the Government support the development and dissemination of the three Rs. That is achieved primarily through funding for the National Centre for the Replacement, Refinement and Reduction of Animals in Research, which works nationally and internationally to drive the uptake of alternative technologies and to ensure that advances are reflected in policy, practice and regulations on animal research.
I failed to catch the Minister’s eye on her previous mention of the three Rs. Does she agree that the number of procedures using specially protected species—cats, dogs, horses and non-human primates—has actually increased over recent years, from about 15,000 in 2022 to about 17,000, and that that increase was driven by a 38.9% rise in procedures using horses? Does she also agree that our hon. Friend the Minister for Security confirmed that in the period from 1 January 2023 to 30 September 2024, no applications for a project licence under the Animals (Scientific Procedures) Act 1986 were refused? Does she see reductions in the number of animals being used in testing, or are they actually increasing as part of the strategy?
The stats that I have say that in 2023 the use of dogs in procedures reduced by 9%. On overall animal testing, I will have to get back to my hon. Friend. I am sure that my colleagues from the Home Office will be able to explain the stringent licensing process—the procedure that everyone has to go through to be able to obtain a licence.
We want to replace the use of animals in scientific procedures with alternatives where we can. That is why our current approach is to support and fund the development and dissemination of techniques that replace, reduce and refine the use of animals in research, and to ensure that the UK has a robust regulatory system for licensing animal studies and enforcing legal standards, which will drive their uptake. We have a commitment in our manifesto to do all we can to phase out the use of animals—including dogs—in science, and we will be publishing a road map before the end of the year to lay out how we can give increased impetus to the support and validation of alternative methods.
Colleagues asked about ensuring that we are consulting animal welfare organisations, and there is a roundtable meeting with the Office for Life Sciences and animal welfare organisations to do precisely that. The hon. Member for Winchester (Dr Chambers) requested a meeting to discuss issues around the benefits of testing on animals. I am happy to agree to that and will be in contact with his office to arrange one.
I conclude by again thanking Members for their insightful contributions to today’s debate, and I look forward to working together as we go forward.
(1 month ago)
Written Statements
Today the Government have published a policy statement on proposed legislative measures to bolster the UK’s cyber-security and resilience.
Our digital economy and essential services are increasingly being attacked by cyber-criminals and state actors, threatening essential public services and infrastructure. This poses a serious risk not only to UK citizens, with core services like hospitals being targeted, but also to the performance of our economy. UK businesses lost around £87 billion from cyber-attacks between 2015 and 2019—that is £87 billion taken from our economy, much of which went into the hands of cyber-criminals.
Enhanced cyber-security is an essential pillar not only of our national security, but of the UK’s economic growth. We cannot have economic growth without stability, and we cannot have stability without national security.
The UK’s only existing cross-sector cyber legislation—the Network and Information Systems (NIS) Regulations—was introduced in 2018 when the UK was still an EU member state. The rapidly evolving threat landscape and changing nature of digital services mean that these regulations need to be updated, and we no longer have powers in primary legislation to make the amendments needed.
That is why we committed to introduce a cyber-security and resilience Bill in the King’s Speech in July last year. As set out in the policy statement published today, the Bill will strengthen the UK’s cyber-defences and make sure that the critical infrastructure and digital services UK citizens and business rely on are more secure. This will enhance the UK’s level of cyber-security and resilience at a time when similar steps are being taken by our international counterparts, such as the EU, which has updated the NIS framework through its own updated directive.
The policy statement provides more detail to the Bill’s measures announced in the King’s Speech:
Expanding the scope of regulations to protect more digital services and supply chains. The Bill will bring managed IT service providers that provide digital services into the scope of the regulatory framework. The Bill will allow individual regulators to designate a small number of important suppliers to regulated entities as “critical suppliers”, including those that would otherwise be exempted from regulation as SMEs. This, in addition to embedding supply chain security requirements directly into our regulatory framework, will address supply chain vulnerabilities and reduce the threat of significant disruptions to critical services. This will build a better picture of the threats facing our critical national infrastructure and protect a broader range of services from cyber-attacks.
Empowering regulators and enhancing oversight. Regulators will be better equipped with the tools they need to perform their duties effectively, including enhanced oversight of cyber-incidents affecting regulated entities and improved cost recovery powers. The Information Commissioner’s information gathering powers will be strengthened, to improve its understanding of the landscape of cyber-security threats affecting the expanded portfolio of digital service providers that it will oversee.
Ensuring the regulatory framework can keep pace with the ever-changing cyber-landscape. The Bill will allow the Government to update the regulatory framework in the future via secondary legislation, if necessary. For example, the Government would be able to bring new sectors into scope of the regulations, if necessary to do so. The Bill will enable the Government to update the security requirements for regulated services in line with best practice, improving clarity for service providers in terms of what is expected of them.
In addition to the policy proposals outlined in the King’s Speech for inclusion in the Bill, we have identified a number of additional cyber-security and resilience proposals, as set out in the policy statement. The appropriate legislative vehicle for these has yet to be determined.
The Government propose bringing data infrastructure into the scope of the regulatory framework, recognising its new status as critical national infrastructure and its essential role in ensuring the stability and growth of our digital economy. Additionally, to ensure our regulatory framework is implemented with a consistent understanding of the Government’s cyber-security and resilience objectives, we propose enabling the Secretary of State to publish a statement of strategic priorities. This will establish a unified set of objectives and expectations for regulators. Finally, we intend to provide new powers to the Secretary of State to direct a regulator, or regulated entities, to take action when it is necessary for national security. This will be invaluable in responding to the constant evolution of both the cyber-landscape and the changes in tactics used by cyber threat actors.
The Government have listened to the views expressed to the previous Government in the 2022 consultation on cyber-security to develop the Bill’s measures. The measures set out in the policy statement build on what we have learned from our engagement with key international partners, including learnings from the European Union on the implementation of the NIS2 directive (Directive (EU) 2022/2555) and the 2023 data infrastructure consultation. We will continue to engage with and learn from the actions taken by other nations to improve cyber-security.
These cyber-security and resilience measures represent a significant step forward in our efforts to protect the UK from the growing threats of cyber-attacks. Cyber-security is a critical enabler of economic growth, and by protecting our digital assets and ensuring the resilience of our critical services we are creating a stable environment that fosters innovation and attracts investment.
My officials and I will engage with parliamentarians, regulators and industry groups to thoroughly test the proposals before the Bill is introduced to Parliament this year.
[HCWS572]
(1 month, 1 week ago)
Commons Chamber
We work closely with the Department of Health and Social Care to support research into this terrible disease. UK Research and Innovation invested £10 million in MND research in 2023-24, and it also plays a key role in funding the underpinning research that benefits medical research more generally. Since 2022, the Medical Research Council and the National Institute for Health and Care Research have awarded £2.8 million to MND projects led by Scottish research organisations.
My hon. Friend will know of the important work of MND campaigners, including my constituent Mark Sommerville, who are seeking more Government investment in MND research. I recognise that any further plans for research and development investment would be outlined after the spending review in June, but can my hon. Friend give some reassurance to those with MND and their families, for whom time matters so much, that the Department is giving consideration to boosting investment in MND research, working with key partners to accelerate the development of new treatments?
I pay tribute to my hon. Friend for drawing attention to the work of the Mark Sommerville Foundation in this important area. Government funders are investing in MND research to accelerate progress. Let me give just one example. Through UK Research and Innovation and the National Institute for Health and Care Research, the Government are investing £6 million in the MND translational accelerator, led by Dementias Platform UK. The aim of the funded projects is to accelerate the development of treatment for MND.
It is good to hear that there is continuing investment in the search for therapies and indeed cures to deal with this horrific disease, but even if therapies do emerge, one of the frustrations in getting them to patients may be the inability of scientists to obtain access to clinical trials. In “Life Sciences Vision”, published in 2021, a number of groups combined to look into access to clinical trials in the UK and the possibility of increasing the number of such trials, but acceleration has not been good. I chair the all-party parliamentary group for life sciences, and one of the comments I hear most frequently in the industry is about the need for a more focused effort in this regard. Would the Minister consider establishing a clinical trials taskforce in her Department to drive this important work forward?
The Government are doing and have done a great deal. We have continued to support this work through both UKRI and the NIHR, and a large amount of funding has gone into clinical research. However, I should be happy to discuss the issue further with the right hon. Gentleman, and to let him know what more work could be done on clinical trials.
(1 month, 1 week ago)
General Committees
It may help the Committee if I clarify from the Chair what we are debating. The motion in the name of the Secretary of State for Science, Innovation and Technology is listed in the “Future Business” section of the Order Paper, and the House will be asked to pass the motion without debate after the text has been agreed by this Committee.
I beg to move,
That this House authorises the Secretary of State to undertake to pay, and to pay by way of financial assistance under section 8 of the Industrial Development Act 1982, a grant or grants exceeding £30 million and up to a total of £129 million to BioNTech UK Limited to support their planned expansion of research and development and artificial intelligence activities in the UK over the next 10 years.
It is an honour to serve under your chairmanship, Ms Jardine.
This investment comes at an important time for the UK’s thriving life sciences sector, which forms a key pillar of two of the Government’s missions: to kick-start economic growth; and to build an NHS fit for the future. The sector is responsible for over £100 billion of turnover in the UK, and it supports over 304,000 jobs in 6,850 businesses. In addition to supporting our economy, the sector also delivers for patients by providing the medicines and technologies that people need to live longer, healthier lives.
As we will set out in the life sciences sector plan, we must build on our world-leading R&D ecosystem and double down on rebuilding an internationally competitive business environment so that innovative companies can start, scale and stay here in the UK. To deliver that plan, we will continue to work in partnership with industry, our life sciences ecosystem and the NHS to seize opportunities that will foster innovation across the UK. To that end, through this proposed grant, we have an opportunity for the UK to secure international investment in innovative, cutting-edge R&D in the face of increasing global competition.
As the right hon. and hon. Members present know, BioNTech is an international leader in the biotechnology industry, and the developer of the first licensed mRNA covid-19 vaccine. Building on the vaccine’s success and global impact, BioNTech has applied for a Government grant of £129 million to support its transformation and UK expansion, which will see it invest circa £1 billion over 10 years. Supported by the grant, BioNTech research activities will focus on structural biology, regenerative medicine, oncology and AI-driven drug discovery, spanning three locations and creating about 460 new, directly employed, highly skilled jobs.
In Cambridge, BioNTech will set up a new centre of excellence to focus on drug discovery and development of new treatments for cancer and other serious diseases. That directly supports the Government’s ambition to boost the Oxford-Cambridge growth corridor. In London, BioNTech intends to establish a major hub, including a centre of AI expertise to leverage this game-changing technology and to enhance our understanding of diseases, their causes and drug targeting. At a third site—to be announced shortly—BioNTech plans to undertake R&D into vaccines, including for diseases with high pandemic potential.
BioNTech’s decision to invest in the UK and to expand its R&D activities builds on the Government’s existing strategic partnership with the company. That includes BioNTech’s work to provide up to 10,000 NHS patients with personalised immunotherapies by 2030, which is already transforming health outcomes by enabling UK patients to be among the first in the world to benefit from cancer vaccines. That support for BioNTech is further evidence of the Government’s backing of a world-leading life sciences sector. Working together, we are driving growth, creating jobs and fostering innovation that will translate into positive outcomes for patients. Supporting BioNTech’s investment is another signal of our commitment to this crucial sector ahead of launching our ambitious life sciences sector plan in the spring.
I commend the motion to the Committee.
I thank the Opposition and Liberal Democrat spokespeople. The funding we have discussed today will unlock around £1 billion to further boost the UK’s life sciences sector and, in turn, support the Government’s missions to kick-start economic growth and build an NHS fit for the future. It will also build on our significant progress and commitments to date, including the life sciences innovative manufacturing fund of up to £520 million announced by the Chancellor in October 2024, and our landmark partnerships with Oxford Nanopore and Eli Lilly.
The hon. Member for Runnymede and Weybridge (Dr Spencer) commented on the investment environment. I am sure he did not miss the fact that this Government attracted £63 billion-worth of investment at the last international investment summit. We have done the hard work to make that investment a reality. He may be interested to hear that, according to the latest CEO survey by PricewaterhouseCoopers, the UK is the second best country in the world in which to invest. However, we are not complacent, and we are fully committed to making the UK the best place to invest. The life sciences are an area of huge UK expertise, and they are key to that commitment. Securing this investment will send a clear message to innovative companies that the UK is open for business.
The hon. Member for Harpenden and Berkhamsted (Victoria Collins) asked about monitoring. The financial assistance will be monitored through the normal procedures used for any investment made by the Government. I am happy to send her details of that process and the timeline for this investment.
Working together with industry, this Government are delivering better patient outcomes and driving economic growth. I look forward to continuing that work, and to building on that momentum through the publication and rapid delivery of the life sciences sector plan and industrial strategy in the spring.
I commend the motion to the Committee.
Question put and agreed to.
(2 months ago)
Written Statements
I am repeating the following written ministerial statement made today in the other place by the Minister for the Future Digital Economy and Online Safety, my noble Friend Baroness Jones of Whitchurch.
In 2023, the previous Government appointed Baroness Bertin as the independent lead reviewer to explore issues surrounding the regulation, legislation and enforcement of online pornography. Throughout the review, she reviewed evidence submitted by the public, academics and civil society, as well as by stakeholders in law enforcement, the pornography sector and health service providers. The final report provided to the Government is insightful and timely.
The report has been laid before Parliament today and it will also be available on gov.uk.
Baroness Bertin’s report highlights some of the harms caused by unregulated access to some online pornography. The review finds that online pornography can impact people’s health and mental wellbeing, and is potentially fuelling violence against women and girls offline.
Baroness Bertin’s review makes a case for bringing the regulation of pornography online into parity with offline regulation. In the time she has had to do the review, she has considered the existing evidence on the topic, but she has also highlighted where some issues are still poorly understood and more research is needed to understand the potential harms from pornographic content and how to mitigate those.
The review acknowledges the important protections that the Online Safety Act 2023 will put in place to protect young people from seeing harmful content online, including pornographic content. It also notes that the Act has made it a priority for in-scope services to proactively tackle the most harmful illegal content, which includes intimate image abuse, extreme pornography and child sexual abuse material.
This review has revealed shocking detail about the prevalence of violent and misogynistic pornography online, and the extent to which it is influencing dangerous offline behaviours, including in young relationships. Graphic strangulation pornography is illegal but is not always being treated as such and instead remains widely accessible on mainstream pornography platforms. There is increasing evidence that “choking” is becoming a common part of real-life sexual encounters, despite the significant medical dangers associated with it. The Government will take urgent action to ensure that pornography platforms, law enforcement and prosecutors are taking all necessary steps to tackle this increasingly prevalent harm.
Additionally, the review’s findings have noted that as technologies such as artificial intelligence continue to evolve and become increasingly sophisticated and accessible, they are reshaping the online pornography landscape. Individuals can now create sexual content, consensually and non-consensually, with nudification applications and other forms of software. Baroness Bertin has found that more needs to be done to protect those online from being victimised by non-consensual sexual content.
The Government are delivering our manifesto commitment to ban sexually explicit deepfakes: the Data (Use and Access) Bill introduces a new offence that will criminalise the creation of a purported intimate image, or deepfake, of an adult without their consent. It will also criminalise asking someone to create a purported intimate image, or deepfake, regardless of where that person is based or whether the image is actually created.
We are introducing a package of offences in the Crime and Policing Bill to tackle the taking of intimate images without consent and the installation of equipment with intent to enable the taking of intimate images without consent. Through the offences at section 66B of the Sexual Offences Act 2003, the law already captures situations where intimate images, including deepfakes, are shared without consent.
Together these measures will ensure that law enforcement can effectively tackle this abusive behaviour. This demeaning and disgusting form of chauvinism must not become normalised, and as part of our plan for change we are bearing down on violence against women, whatever form it takes. We are putting offenders on notice: they will face the full force of the law.
The review has also made several recommendations related to the education system. This Government consider healthy relationships a key part of RSHE—relationships, sex and health education—and relationships education will support our mission to halve violence against women and girls in the next decade. This Government will support schools to tackle misogyny and promote healthy relationships and positive masculinity.
The relationships, sex and health education statutory guidance is currently being reviewed following a public consultation last year. As part of this, we are working with stakeholders and teachers to ensure that the curriculum covers all content that pupils need to keep themselves and others safe and to be respectful in their relationships.
This Government are equipping teachers with the information, resources and training to teach young people about healthy relationships and behaviour, which plays a significant role in preventing harmful sexual behaviours. We have recently published a new guide for teachers on incel culture on the Department’s Education Against Hate website. Teacher training covers the teachers’ standards, including high expectations of behaviour, and we are working with schools on what more we can do to support them to root out misogyny and ensure that young people treat each other with respect.
This Government have set out an unprecedented mission to halve violence against women and girls within a decade, and this will require a renewed focus on prevention—including ensuring that online content is not encouraging offline violence and abuse. We will therefore take forward the findings of Baroness Bertin’s review, which will help to inform the cross-Government violence against women and girls strategy to be published in the next few months.
I thank Baroness Bertin for her efforts in bringing this report together and shedding light on a complex yet deeply important topic. The Government will provide a further update on how they are tackling the issues raised in the review as part of their mission to tackle VAWG in due course.
[HCWS479]
(2 months, 1 week ago)
It is a pleasure to serve under your chairmanship, Mr Stringer. I thank the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) for securing this debate on the implementation of the Online Safety Act. I know that he has been following the Bill throughout its passage and has been a critic of every Minister, even his Government’s Ministers, whenever the Bill was watered down or delayed, so I expect him to hold all of us to account. I am grateful to him and all the hon. Members who have spoken this morning. The Government share their commitment to keeping users safe online. It is crucial that we continue to have conversations about how best to achieve that goal.
The Online Safety Act lays the foundations for strong protections against illegal content and harmful material online. It addresses the complex nature of online harm, recognising that harm is not limited to explicit content and extending to the design and functionality of online services. We know that the legislation is not perfect. I hear that at every such debate, but we are committed to supporting Ofcom to ensure that the Act is implemented quickly, as this is the fastest way to protect people online. 2025 is the year of action for online safety, and the Government have already taken a number of steps to build on Ofcom’s implementation of the Act. In November last year, the Secretary of State published the draft “Statement of Strategic Priorities for online safety”. That statement is designed to deliver a comprehensive, forward-looking set of online safety priorities for the full term of this Government. It will give Ofcom the backing to be bold on specific areas, such as embedding safety by design, through considering all aspects of a service’s business model, including functionalities and algorithms.
We are also working to build further on the evidence base to inform our next steps on online safety, and I know that this issue was debated earlier this week. In December, we announced a feasibility study to understand the impact of smartphones and social media on children, and in the Data (Use and Access) Bill, we have included provisions to allow the Secretary of State to create a new researcher access regime for online safety data. That regime is intended to fix a systemic issue that has historically prevented researchers from understanding how platforms operate, and it will help to identify and mitigate new and preventable harms. We have also made updates to the framework, such as strengthening measures to tackle intimate image abuse under the Online Safety Act, and we are following up on our manifesto commitment to hold perpetrators to account for the creation of explicit, non-consensual deepfake images through amendments to the Data Bill.
We are also building on the measures in the Online Safety Act that allow Ofcom to take information on behalf of coroners. Through the Data Bill, we are bringing in additional powers to allow coroners to request Ofcom to issue a notice requiring platforms to preserve children’s data, which can be crucial for investigations into a child’s tragic death. My hon. Friend the Member for Darlington (Lola McEvoy) raised Jools’ law, of which I am very aware, and I believe that she is meeting Ministers this week to discuss it further.
Finally, we recently announced that, in the upcoming Crime and Policing Bill, we are introducing multiple offences to tackle AI sexual abuse, including a new offence for possessing, creating or supplying AI tools designed to generate child sexual abuse material.
Members have raised the issue of the Act’s implementation being too slow. We are aware of the frustrations over the amount of time that it has taken to implement the Online Safety Act, not least because of the importance of the issues at hand. We are committed to working with Ofcom to ensure that the Online Safety Act is implemented as quickly and effectively as possible.
On implementation, would the Minister give clarity about the watermark for re-consultation and the point of delay of implementing the children’s codes under the Act? Amendments could be made to the children’s codes and I do not think they would trigger an automatic re-consultation with platforms. Could the Minister elaborate on where the delay would come from and how much scope Parliament has to amend those codes, which will be published in April?
Ofcom has had to spend a long time consulting on the codes to ensure that they are as proofed against judicial review as possible. Any re-consultation or review of the codes will result in a delay, and the best way to ensure that we can protect children is to implement the Act as soon as possible. My hon. Friend referred to the fact that both Ofcom and the Secretary of State have said that this is not a done deal; it is an iterative process, so of course we expect those codes to be reviewed.
As I said, Ofcom is moving forward with implementation of the Act. In a matter of weeks we will start to see, for the first time, safety duties making a material difference to online experiences for adults and children. Platforms are already duty-bound to assess the risk of illegal content and, with a deadline of 16 March, to complete risk assessments. Once the illegal harms codes come into effect from 17 March, Ofcom will be able to enforce the illegal content safety duties. Shortly following that in April, Ofcom will publish the child safety codes and associated guidance, starting the clock for services to assess the risk of content harmful to children on their platforms. The child safety duties should be fully in effect by the summer.
My hon. Friend the Member for Darlington also raised the issue of dynamic risk assessment. I understand that she is in conversation with Ofcom and Ministers on that. I will await the outcome of those discussions. The implementation of the Act will bring in long overdue measures, such as preventing children from accessing pornography and legal content encouraging suicide, self-harm or eating disorders.
I have heard concerns raised by hon. Members regarding Ofcom’s approach, particularly to harmful functionalities and safety by design. We understand there is still a lot of work to be done, which is why the Secretary of State’s statement of strategic priorities places a high importance on safety by design. However, it is important not to lose sight of the positive steps we expect to see this year under the Act. For instance, Ofcom’s draft child codes already include specific measures to address harmful algorithms, among other safety recommendations. We expect Ofcom will continue to build on those important measures in the codes.
Questions were asked about whether the Government have plans to water down the Act. I can categorically state that there are no plans to water down the measures. The Secretary of State has made it very clear that any social media company that wants to operate in our society will have to comply with the law of the land. Whatever changes are made in other jurisdictions, the law of the land will remain.
The Minister might be about to come to the point I want to raise with her, which is about proportionality. Will she say something about that? I am keen to understand whether the Government accept Ofcom’s understanding of the term—that proportional measures are those measures that can be evidenced as effective. I gave reasons why I am concerned about that. I want to understand whether the Government believe that that is the correct interpretation of proportionality.
I was about to come to the point that the right hon. and learned Member raised about the digital regulation Committee. I have had a brief conversation with him about that, and agree about the importance of parliamentary scrutiny of the implementation of the Online Safety Act. I welcome the expertise that Members of both Houses bring. Select Committees are a matter for the House, as he is aware.
We will continue to work with the House of Lords Communications and Digital Committee and the House of Commons Science, Innovation and Technology Committee to support their ongoing scrutiny, as well as other parliamentary Committees that may have an interest in the Act. The Act requires the Secretary of State to review the effectiveness of the regime, two to five years after the legislation comes into force. We will ensure that Parliament is central to that process. I encourage the right hon. and learned Member to continue to raise the matter with the right people.
Most hon. Members raised the issue of apps. Ofcom will have a duty to publish a report on the role of app stores in children accessing harmful content through the apps of regulated services. The report is due between January ’26 and January ’27. Once it is published, the Secretary of State may, if appropriate, make regulations to bring app stores into the scope of the Act. The timing will ensure that Ofcom can prioritise the implementation of child safety duties. I will write to the right hon. and learned Member for Kenilworth and Southam on the issue of proportionality, as I want to ensure that I give him the full details about how that is being interpreted by Ofcom.
We fully share the concerns of hon. Members over small platforms that host incredibly harmful content, such as hate forums. These dark corners of the internet are often deliberately sought out by individuals who are at risk of being radicalised.
If the Government fully support our concerns about small but harmful sites, will the statutory instrument be reworked to bring them back into category 1, as the Act states?
The Government are confident that the duties to tackle illegal content and, where relevant, protect children from harmful content will have a meaningful impact on the small but risky services to which the hon. Gentleman refers. Ofcom has created a dedicated supervision taskforce for small but high-risk services, recognising the need for a bespoke approach to securing compliance. The team will focus on high-priority risks, such as CSAM, suicide and hate offences directed at women and girls. Where services do not engage with Ofcom and where there is evidence of non-compliance, Ofcom will move quickly to enforcement action, starting with illegal harm duties from 17 March, so work is being done on that.
The comprehensive illegal content safety duties will be applied to all user-to-user services, and child safety duties will be applied to all user-to-user services likely to be accessed by children, including the small but high-risk sites. These duties will have the most impact in holding the services to account. Because of the deep concerns about these forums, Ofcom has, as I said, created the small but risky supervision taskforce. For example, Ofcom will be asking an initial set of firms that pose a particular risk, including smaller sites, to disclose their illegal content risk assessment by 31 March.
The Government have been clear that we will act where there is evidence that harm is not being adequately addressed despite the duties being in effect, and we have been clear to Ofcom that it has the Government’s and Parliament’s backing to be bold in the implementation of the Online Safety Act. We are in clear agreement that the Act is not the end of the road, and Ofcom has already committed to iterating on the codes of practice, with the first consultation on further measures being launched this spring. The Government remain open minded as to how we ensure that users are kept safe online, and where we need to act, we will. To do so, we must ensure that the actions we take are carefully considered and rooted in evidence.
Will the consultation this spring for the next iterations of the codes include consultation with parliamentarians, or is it solely with platforms?
I expect any consultation will have to go through the Secretary of State, and I am sure it will be debated and will come to the House for discussion, but I will happily provide my hon. Friend with more detail on that.
I am grateful to all Members for their contributions to the debate. I look forward to working with the right hon. and learned Member for Kenilworth and Southam, and hopefully he can secure the Committee that he has raised.
Can the Minister explain what she meant when she said that Ofcom had to ensure that the codes were as judicial review-proofed as possible? Surely Ofcom’s approach should be to ensure that the codes protect vulnerable users, rather than be judicial review-proofed.
The point I was trying to make was that Ofcom is spending time ensuring that it gets the codes right and can implement them as soon as possible, without being delayed by any potential challenge. To avoid any challenge, it must ensure that it gets the codes right.
(2 months, 4 weeks ago)
General Committees
I beg to move,
That the Committee has considered the draft Online Safety Act 2023 (Category 1, Category 2A and Category 2B Threshold Conditions) Regulations 2025.
Thank you for coming to save the day, Sir Christopher; it is an honour to serve under your chairmanship. These regulations were laid before Parliament on 16 December 2024. As the Online Safety Act 2023 sets out, the Secretary of State must set thresholds for three categories of service: category 1, category 2A and category 2B. The services that fall into each of those categories will be required to comply with additional duties, with category 1 services having the most duties placed on them. The duties are in addition to the core duties that apply to all user-to-user and search services in scope.
The 2023 Act requires that specific factors must be taken into account by the Secretary of State when deciding thresholds for each category. The threshold conditions for user-to-user services must be set by reference to user numbers and functionalities, as well as any other characteristics or factors relating to the user-to-user part of the service that the Secretary of State deems relevant.
For category 1, the key consideration is the likely impact of the number of users of the user-to-user part of the service and its functionalities, on how quickly, easily and widely regulated user-generated content is disseminated by means of the service. For category 2A, the key consideration is the likely impact of the number of users of the search engine on the level of risk of harm to individuals from search content that is illegal or harmful to children. For category 2B, the key consideration is the likely impact of the number of users of the user-to-user part of the service and its functionalities on the level of risk of harm to individuals from illegal content or content that is harmful to children disseminated by means of the service.
Those conditions form the basis of Ofcom’s independent research and advice, as published in March 2024, which the Secretary of State was required to consider when setting threshold conditions. In laying these regulations before Parliament, the Secretary of State has considered the research carried out and the advice from Ofcom and agreed to its recommendations.
I understand that this decision will not please everyone. In particular, I recognise that the thresholds are unlikely to capture so-called “small but risky services”, as per Baroness Morgan’s successful amendment, which made it possible to create a threshold condition by reference only to functionalities and any other factors or characteristics. However, it is important to note that all regulated user-to-user and search services, no matter their size, will be subject to existing illegal content duties and, where relevant, child safety duties. The categories do not change that fact.
If the codes on illegal content duties currently laid before Parliament pass without objection, the duties will be in effect by this spring. They will force services to put in place systems and processes to tackle illegal content. If a service is likely to be accessed by children, the child safety duties will require services to conduct a child safety risk assessment and provide safety measures for child users. We expect that those will come into effect this summer, on the basis that the codes for the duties will have passed by then.
Together, the illegal content and child safety duties will mark the biggest material change in online safety for UK citizens since the internet era began. We expect the Online Safety Act to cover more than 100,000 services of various sizes, showing that the legislation goes far and wide to ensure important protections for users, particularly children, online.
The instrument before us will enable additional duties for categorised services. All categorised services must comply with transparency reporting duties. They must also have terms on the ability of parents to access information about children’s use of a service in the event of a child’s death. Category 1 services will have the most additional requirements. They will have to give adults more choice about the content they see and the people they interact with, and they must protect journalistic and news publisher content and content of democratic importance. The duties will also ensure that we can hold these companies to account over their terms of service, ensuring that they keep the promises they make to their users.
Once in force, the regulations will enable Ofcom to establish a public register of categorised services, which it expects to publish this summer. Ofcom will then consult on the draft codes of practice and guidance where relevant for additional duties. Ofcom will also do additional work to tackle small but risky services.
Ofcom’s work to tackle egregious content and enhance accountability does not stop with this instrument, which takes me back to the small but risky services that I mentioned. The horrifying stories I have heard about these sites during a number of debates recently are truly heartbreaking; we must do everything in our power to prevent vulnerable people from falling victim to such circumstances. I was pleased to see Ofcom set out in September 2024 its targeted approach to tackling small but risky services, which includes a dedicated supervision taskforce and a commitment to move to rapid enforcement action where necessary. That followed a letter from the Secretary of State to Ofcom inquiring about those services.
I am confident that the regulatory framework, combined with the bespoke taskforce, will work to keep all UK citizens safe online, but I must stress that the Secretary of State will keep the thresholds under review. If there is evidence that the categories have become outdated or that they inadequately protect users, he will not shy away from updating them or reviewing the legislation, as he has made clear recently.
Finally, the online world that we are looking to govern is complex and ever-changing. The Act will not solve every problem, but it will bring real benefit to children and adults who have had to contend with an unsafe online world for far too long. We should see the instruments we are debating as a step in that process and a first iteration, not as something fixed or set in stone, because there is much more to do. Our foremost priority is the timely implementation of the Act to enforce the additional duties as soon as possible. Years of delay and indecision have already come at a heartbreaking cost for vulnerable children and adults. Now it is time to deliver, but that relies on Parliament approving the categorisation thresholds without delay.
I thank all Members for their very powerful contributions to the debate. This instrument will bring us one step closer to a safer online world for our citizens. It is clearer than ever that it is desperately needed: transparency, accountability and user empowerment matter now more than ever.
The Opposition spokesperson, the hon. Member for Huntingdon, asked whether we agree on the need for companies not to wait for the duties in the Act to be implemented, but to ensure that safety is baked in from the start. I absolutely agree, and he will be aware that the Secretary of State has made that point on many occasions. He also raised the issue of proportionality. I confirm that many of the duties on categorised services are subject to the principle of proportionality, which requires Ofcom to consider measures that are technically feasible for providers of a certain size or capacity, and in some cases duties are based on the assessment of risk of harm presented by the service.
For example, in determining what is proportionate for the user empowerment duties on content for category 1 services, the findings of the most recent user empowerment assessments are relevant. They include the incidence of relevant content on the service in addition to the size and capacity of the provider. Where a code of practice is relevant to a duty, Ofcom must have regard to the principles on proportionality, and what is proportionate for one kind of service might not be for another.
The hon. Member for Huntingdon is absolutely right that the pornography review has been completed. The Government are reviewing that at the moment and will publish it in due course.
In response to the hon. Members for Newton Abbot and for Aberdeen North (Kirsty Blackman) and to the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright), when the Online Safety Act was introduced, category 1 thresholds were due to be assessed based on the level of risk and harm for adults—as the Members read out very clearly. That was removed during the passage of the Bill by the previous Government.
As things stand, although Baroness Morgan’s successful amendment made it possible for threshold conditions to be based solely on functionalities, it did not change the basis of Ofcom’s research, which for category 1 is easy, quick and wide dissemination of content. The Secretary of State had to consider that. I will repeat that for all Members to hear again: the Secretary of State has to act within the powers given to him in schedule 11 when setting out the threshold and conditions. The powers do not allow for thresholds to be determined by another body, as per the amendment.
Although the hon. Member for Aberdeen North very powerfully read out the Act, it very clearly sets out that it does not actually do what she is asking for it to do. We absolutely agree that small but risky sites need to be covered, but as it stands, the Secretary of State does not have the powers to include them.
Sorry, I have lots of points to cover. If I have not covered the hon. Member’s concerns in my response, she is more than welcome to intervene later.
These small but risky services are of significant concern to the Government, and they will still have to protect against illegal content and, where relevant, content that is harmful to children. Ofcom also has a dedicated taskforce to go after them. I hope that answers the hon. Member’s question.
The hon. Member for Newton Abbot also raised the review of Ofcom’s approach. The regulator has already trialled an approach of targeting small but risky services through its regulation of video-sharing platforms. Indeed, a number of those services improved their policies and content moderation in response. All the adult platforms under the VSP regime, large and small, have implemented age verification through this route to ensure that under-18s cannot access pornography on their services. In instances where services fail to make necessary changes, they will face formal enforcement action from Ofcom. Ofcom has a proven track record and the Government have every faith in its ability to take action against non-compliant services.
The hon. Member also raised issues around how Ofcom will enforce action against small but risky services. Ofcom will have robust enforcement powers available to use against companies that fail to fulfil their duties and it will be able to issue enforcement decisions. Action can include fines of up to £18 million or 10% of qualifying worldwide revenue in the relevant year, whichever is higher, and Ofcom can direct companies to take specific steps to comply with its regulation.
The Minister raised the issue of age verification, which is good. However, she did not say how “harmful to adults”, “harmful to vulnerable minorities” and “harmful to women” are categorised. Children are protected in this case, but those other groups are not.
Also, in response to the answer that the Minister just gave, the difficulty is not the Ofcom powers; it is the obligation on the provider. If we have not put a provider into category 1, it does not have the same level of obligation as category 1 companies do. No matter what powers Ofcom has and no matter what fines it imposes, it cannot get such companies to give those commitments to a category 1 level if they are not in that category.
Removing the section is not giving Ofcom the tools it needs. The Minister was absolutely right earlier when she said that there is much more to do. Why drop this ability to put other sites in category 1?
I think the hon. Member missed it when I said that, as things stand, the Secretary of State does not have the power to include them. It is not about removing them; it is about not currently having the powers to include them.
I will conclude. In extreme cases, Ofcom, with the agreement of the courts, uses business disruption measures, which are court orders that mean third parties have to withdraw non-compliant services, or restrict or block access to non-compliant services in the UK.
The hon. Member for Newton Abbot also asked whether the Act will be reviewed to address the gaps in it. As I said at the start, our immediate focus is getting the Act implemented quickly and effectively. It was designed to tackle illegal content and protect children, and we want those protections in place as soon as possible. It is right that the Government continually assess the ability of the framework to keep us safe, especially given that technology develops so quickly. We will look, of course, at how effective these protections are and build on the Online Safety Act, based on evidence. However, our message to social media companies remains clear: there is no need to wait. As the Opposition spokesperson said, those companies can and should take immediate action to protect their users.
On the use of business disruption measures, the Act provides Ofcom with powers to apply to court for such measures, as I have said, including where there is continued failure and non-compliance. We expect Ofcom to use all available enforcement mechanisms.
The hon. Member for Huntingdon asked how Parliament can scrutinise the delivery of the legislation. Ongoing parliamentary scrutiny is absolutely crucial; indeed, the Online Safety Act requires Ofcom codes to be laid before Parliament for scrutiny. The Science, Innovation and Technology Committee and the Communications and Digital Committee of the House of Lords will play a vital role in scrutinising the regime. Ofcom’s codes of practice for illegal content duties were laid before Parliament in December. Subject to their passing without objection, we expect them to be in force by spring 2025, and the child safety codes are expected to be laid before Parliament in April, in order to be in effect by summer 2025. Under section 178 of the Act, the Secretary of State is required to review the effectiveness of its regulatory framework between two and five years after key provisions of the Act come into force. That will be published as a report and laid before Parliament.
Letters were sent in advance of laying these regulations to the House of Lords Communications and Digital Committee and the House of Commons Science, Innovation and Technology Committee. Hon. Members have asked about user numbers. Ofcom recommended user number thresholds of 34 million or 7 million for category 1. Services must exceed the user number thresholds. The Government are not in a position to confirm who will be categorised. That will be the statutory role of Ofcom once the regulations have passed.
I am going to make some progress. On livestreaming, Ofcom considered that functionality, but concluded that the key functionalities that spread content easily, quickly and widely are content recommender systems and forwarding or resharing user-generated content.
Services accessed by children must still be safe by design, regardless of whether they are categorised. Small but risky services will also still be required to comply with illegal content duties. The hon. Member for Aberdeen North should be well aware of that as she raised concerns on that issue.
On child safety, there were questions about how online safety protects children from harmful content. The Act requires all services in scope to proactively remove and prevent users from being exposed to priority illegal content, such as illegal suicide content and child sexual exploitation and abuse material. That is already within the remit.
In addition, companies that are likely to be accessed by children will need to take steps to protect children from harmful content and behaviour on their services, including content that is legal but none the less presents a risk of harm to children. The Act designates content that promotes suicide or self-harm as in the category of primary priority content that is harmful to children. Parents and children will also be able to report pro-suicide or pro-self-harm content to the platform and the reporting mechanism will need to be easy to navigate for child users. On 8 May, Ofcom published its draft children’s safety codes of conduct, in which it proposed measures that companies should employ to protect children from suicide and self-harm content, as well as other content.
Finally, on why category 1 is not based on risk, such as the risk of hate speech, when the Act was introduced, category 1 thresholds were due to be assessed on the level of risk of harm to adults from priority content disseminated by means of that service. As I said earlier, that was removed during the Act’s passage by the then Government and replaced with consideration of the likely functionalities and how easily, quickly and widely user-generated content is disseminated, which is a significant change. Although the Government understand that that approach has its critics, who argue that the risk of harm is the most significant factor, that is the position under the Act.
The Minister is making the case that the Secretary of State’s hands are tied by the Act —that it requires stuff in relation to the number of users. Can she tell us in which part of the Act it says that, because it does not say that? If she can tell us where it is in the Act, I am quite willing to sit down and shut up about this point, but it is not in the Act.
The legislation allows the Secretary of State to deviate from Ofcom’s advice and to publish a statement explaining why. However, the core consideration for category 1 under schedule 11 is—I repeat for the third time—how easily, quickly and widely regulated user-generated content is disseminated by means of a service. As a result, for category 1, Ofcom concluded that the content is disseminated with increased breadth as the number of users increases.
The decision to proceed with the threshold combination recommended by Ofcom, rather than discounting user-number thresholds, reflects that any threshold condition created by the Government should consider the factors as set out in the Act, including easy, quick and wide dissemination for category 1, and the evidence base. That is what the Act says. As a result, the Government decided not to proceed with an approach that deviated from Ofcom's recommendation, particularly considering the risk of unintended consequences.
I am more than happy to write to the hon. Member for Aberdeen North with the full details. I understand that she feels very passionately about this point, but the Act is the Act. Although I am grateful for her contribution, I have to follow what the Act says, based on the legal advice that I get.
I am extremely grateful to the Minister for giving way, and I have sympathy with her position, especially in relation to legal advice, having both received it and given it. I suggest that the Minister is talking about two different things, and they need to be separated. The first is the question of whether legal but harmful content was removed from the Bill, which it undoubtedly was. Measures in relation to content that is neither unlawful nor harmful to children were largely removed from the Bill—the Minister is right to say that.
What we are discussing, however, are the tools available to Ofcom to deal with those platforms that it is still concerned about in relation to the remaining content within the ambit of the Bill. The worry of those of us who have spoken in the debate is that the Government are about to remove one of the tools that Ofcom would have had to deal with smaller, high-harm platforms when the harm in question remains in the ambit of the Bill—not that which was taken out during its passage. Would the Minister accept that?
I will again set out what the Secretary of State's powers are. The Government have considered the suggestion of Baroness Morgan and others to categorise "small but risky" services based on the coroner or Ofcom linking a service to a death. The Government were grateful for that suggestion. However, there were issues with that approach, including with what the Act allows the Secretary of State to consider when setting the categories. The Secretary of State is not allowed to consider anything other than the factors set out in the Act, which says that it has to include easy, quick and wide dissemination for category 1, and has to be evidence based.
I hope that the hon. Member for Aberdeen North will accept that I will write to her in great detail, and include a letter from Government lawyers setting out what I am saying in relation to the powers of the Secretary of State in setting the categories. I hope that she will be satisfied with that. I want to make it clear that we are not taking anything out; the Secretary of State is proceeding with the powers that he has been given.
I am going to proceed. I think I have covered the main points raised by hon. Members. I hope that the Committee agrees with me on the importance of enacting these thresholds and implementing the Online Safety Act as swiftly as possible. I made it clear that Ofcom has set up a taskforce that will review the small but risky sites, in response to the Secretary of State’s letter to it in September.
It is an honour to serve under your chairmanship, Sir Christopher. My right hon. and learned Friend the Member for Kenilworth and Southam was Attorney General for four years. It is just possible that his interpretation of the Act is correct, and that of the Minister's officials is incorrect. I do not have detailed knowledge of this legislation, but I wonder whether the Minister and her Whip want to take some further time and pause before putting these regulations to a vote—that would be perfectly acceptable to us. We will not oppose the regulations, but if the Minister wants more time, she is welcome to take it.
Although I thank the hon. Member for his contribution, I am sure that he will appreciate that this issue has been looked into and discussed in debates and with officials. With that, I commend these regulations to the Committee.
The comments made by the hon. Member for Aberdeen North are absolutely outrageous, but I would not expect anything less from the SNP. I have made it very clear that I will share legal advice with Members. I also made it clear that the small but risky sites that Members have been talking about were raised by the Secretary of State in a letter to Ofcom in September, and Ofcom has set up a taskforce to look at those services.
The key thing for the Government is to get on with implementing the Online Safety Act. I know that the hon. Lady would like us to spend lots of time delaying, but we are interested in getting on with implementing the Act so that we can keep children safe online. With that, I commend the regulations to the House.
For the benefit of people watching, only Committee members can cast votes in a Division.
Question put.
(3 months, 2 weeks ago)
It is a pleasure to serve under your chairmanship, Sir Desmond. I start by paying tribute to Ellen Roome both for launching this petition and for all the campaigning she has done in this area. Let us take a moment to remember her son, Jools. As a parent, I know that we do everything to keep our children safe. We teach them how to cross a road and why it matters not to talk to strangers—we do all we can, but it can still be terrifying to think about what our children are exposed to, even in the safety of our own homes. I can only imagine how it would feel for a parent not to know how or why their child lost their life. I know that parents across the country feel the same way.
As we have heard, Ellen’s petition received over 120,000 signatures between 10 May and the dissolution of Parliament on 30 May. That shows the strength of feeling on this issue, and I am grateful to the brave parents, including Ellen, Ian and others who campaigned on this issue during the passage of the Online Safety Act, who continue to shine a light on it. The Secretary of State has met them a number of times, and their views are absolutely crucial to the work we are doing in this area. Finally, I thank my hon. Friend the Member for Sunderland Central (Lewis Atkinson) for securing a debate on this e-petition on behalf of the Petitions Committee, along with other hon. and right hon. Members for their powerful contributions.
I know how long it has taken to get the Online Safety Act across the line. It is not a perfect piece of legislation, and the delay in delivering it has come at a heartbreaking human cost. As the Secretary of State has set out numerous times, we are working to implement the Act as quickly as we possibly can so that the protections it puts in place can begin to change the online world that our children experience.
The Act has two provisions relevant to this debate. First, section 101 seeks to address problems faced when there is uncertainty over the circumstances leading to the death of a child. The provision supports coroners and procurators fiscal in their investigations by giving Ofcom the power to require information about a child’s online activity following a request from the investigating coroner. It is already in force, and the coroners have begun to make use of the powers available to them.
Secondly, section 75 imposes additional duties on categorised services to be transparent with parents regarding a company’s data disclosure processes following the death of a child. We have been clear that we plan to build on the Online Safety Act where it does not go far enough, and the Secretary of State only yesterday set out how the Online Safety Act is uneven and, in some cases, unsatisfactory. He also set out the need for Parliament to learn to legislate much faster—we cannot wait another 10 years to make changes to the legislation.
At the end of last year, the Secretary of State decided to use his powers to issue a statement of strategic priorities to Ofcom, asking them to ensure that safety is embedded in our online world from the very start. That is why the Government will also seek to establish a data preservation process through clause 122 of the Data (Use and Access) Bill. The proposed clause will require Ofcom to issue a data preservation notice to specified companies at the request of the coroner or, in Scotland, the procurator fiscal. That will require these companies to preserve information relating to the use of their services by the child who has died. This proposal fulfils a manifesto commitment to further strengthen powers, and will help coroners understand the tragic circumstances surrounding a child’s death.
Let me turn to the matter of coroners sharing information with families. Interested persons, including bereaved families, have the right to receive evidence from coroners, subject to their judicial discretion. The chief coroner has provided detailed guidance on this. Coroners have a statutory duty to issue a prevention of future deaths report if their investigation reveals that future deaths could be prevented by one or more measures. Evidence accessed via Ofcom powers will help to inform a decision on whether a report should be issued.
I know from parents and children just how complex this issue is. The Secretary of State recently visited the NSPCC, where he met a group of young people to understand more about their lives online. The NSPCC was concerned that giving parents complete access to their children’s social media accounts could raise complex issues around children’s rights to privacy and, in extreme cases—as we have heard today—safeguarding. For example, as raised earlier, if a child is exploring their sexuality online, they may not want their parents to know and they would be right to expect that privacy.
All Members raised the retrospective application of section 101 of the Act. Ofcom’s powers to require information from companies on behalf of coroners can still be used where a second coroner’s inquest is ordered. Ofcom can use these powers on the instruction of a coroner. Ofcom will also be able to use data preservation notices in the event that a second coroner’s inquest is ordered. Any personal data that is captured by the data preservation notice, and held by the online service at the time of issue, will still be in scope and must be retained upon receipt of notice. However, I have heard very powerfully from all Members today about the lengths parents have to go to request a second inquest and about the associated costs. As I have said, the legislation is not perfect and there is room for improvement, and I would like to meet Members and parents to explore this matter further. We need to continue to review the legislation.
When it comes to age limits, a smartphone and social media ban for under-16s has been raised. We are aware of the ongoing debate as to what age children should have smartphones or access to social media. As the Secretary of State for Science, Innovation and Technology has previously said, there are no current plans to implement a smartphone or social media ban for children. We will continue to do what is necessary to keep our children safe online.
On that note, we have heard from several Members today about their concerns for children's mental health, when their expectations are often measured against heavily doctored images they see online. Will the Minister commit to use and/or amend legislation that commits hosts, as is common with regulated news outlets, to clearly identify doctored imagery, and the accounts and pages that spread them?
I will come to that point.
On the issue of a ban on smartphones and social media for under-16s, we are focused on building the evidence base to inform any future action. We have launched a research project looking at the links between social media and children’s wellbeing. I heard from the hon. Member for Esher and Walton (Monica Harding) that that needs to come forward and I will pass that on to my colleagues in the Department.
My hon. Friend the Member for Lowestoft (Jess Asato) mentioned the private Member’s Bill in the name of my hon. Friend the Member for Whitehaven and Workington (Josh MacAlister). We are aware of his Bill and share his commitment to keeping children safe online. We are aware of the ongoing discussion around children’s social media and smartphone use, and it is important that we allocate sufficient time to properly debate the issue. We are focused on implementing the Online Safety Act and building the evidence base to inform any future action. Of course, we look forward to seeing the detail of my hon. Friend’s proposal and the Government will set out their position on that in line with the parliamentary process.
My hon. Friend the Member for Darlington (Lola McEvoy) raised the issue of Ofcom’s ambitions. Ofcom has said that its codes will be iterative, and the Secretary of State’s statement will outline clear objectives for it to require services to improve safety for their users.
The hon. Member for Twickenham (Munira Wilson) and my hon. Friend the Member for Bournemouth West (Jessica Toale) mentioned engagement with children, and we know how important that is. Ofcom engaged with thousands of children when developing its codes, and the Children’s Commissioner is a statutory consultee on those codes, but of course we must do more.
The hon. Member for Huntingdon (Ben Obese-Jecty) raised the matter of mental health services and our commitment in that regard. He is right that the Government’s manifesto commits to rolling out Young Futures hubs. That national network is expected to bring local services together to deliver support for not only teenagers at risk of being drawn into crime, but those facing mental health challenges, and, where appropriate, to deliver universal youth provision. As he rightly said, that is within the health portfolio, but I am happy to write to him with more detail on where the programme is.
We want to empower parents to keep their children safe online. We must also protect children’s right to express themselves freely, and safeguard their dignity and autonomy online.
The Minister spoke earlier about age limits. I was not sure if she had finished responding to Members’ comments and questions, and whether she would be able to comment on not only what the various age thresholds should be, but what they mean. In particular, if the GDPR age is 13, does that mean that parental controls can effectively be switched off by somebody of age 13, 14 or 15?
I am sure the right hon. Gentleman’s party would have discussed the issue of the age limit and why it was 13 during the passage of the Online Safety Act.
I am more than happy to write to him in detail on why the age limit has been set at 13. As I said, there is currently a live discussion about raising the age and evidence is being collated.
The challenge of keeping our children safe in a fast-moving world is one that we all—Government, social media platforms, parents and society at large—share. As we try to find the solutions, we are committed to working together and continuing conversations around access to data in the event of the tragic death of a child.
I will finish by again thanking Ellen for her tireless campaigning. I also thank all the speakers for their thoughtful contributions. I know that Ellen has waited a long time for change and we still have a long way to go. Working with Ellen, the Bereaved Families for Online Safety group, other parents and civil society organisations, we will build a better online world for our children.
(3 months, 3 weeks ago)
Commons Chamber

In case the House has not heard, this Government are driving innovation, with a record £20.4 billion of research and development investment for 2025-26, powering an innovation-led economy across the UK. In Staffordshire, UK Research and Innovation is backing more than £29 million for 70 cutting-edge research and innovation projects. A stand-out example is Innovate UK's support for the Staffordshire net zero skills for growth project, which is equipping the county to seize opportunities in the net zero transition.
Towns such as those in my constituency are key to the economy, but can face unique challenges in accessing innovation opportunities. Please could the Minister tell me how she plans to ensure that towns such as Stafford and Eccleshall are able to access new jobs, skills, investment and growth opportunities?
The Department has a clear vision to ensure that the UK remains at the forefront of global innovation—a place where cutting-edge businesses of all sizes can start and grow, and where local people have high-quality jobs, building on local strengths. I am delighted to hear about the new multimillion-pound facility being built at Newcastle and Stafford colleges’ Stafford campus in my hon. Friend’s constituency, supported by £15 million of Government investment. It will welcome learners from September and will help to provide the technical skills that businesses need, both now and in the future, to support regional and national productivity.
DSIT is leading the charge by establishing the digital centre of Government to harness technology and transform our public services. We are committed to improving digital inclusion and accessibility to ensure high-quality online services that are available to everyone. In the coming months, the Department will outline its plans and priorities for the digital centre and for advancing digital inclusion.
Sunderland was recently named the UK’s smartest city by The Times. It was a pleasure to welcome the Secretary of State when he visited recently. More than 5,000 homes in our city now have assistive technology installed, supporting the independence of older and disabled people and improving their access to care. How do the Government plan to build on the example of Sunderland to improve access to public services across the UK?
The Government recognise the potential for digital technology to support people to live independently. We will set new national standards for care technologies and develop trusted guidance so that people who draw on care, their families and care providers can confidently buy what works and get the safest, most effective tech into their homes or services. In addition, we will take forward a range of initiatives in 2025-26, including funding more home adaptations and promoting the better use of care technology.
What steps is her Department taking to help older people who do not feel comfortable utilising technology to access public services?
The hon. Gentleman will be happy to hear that the Public Sector Bodies (Websites and Mobile Applications) (No. 2) Accessibility Regulations 2018 require most public sector organisations to ensure their services are accessible to disabled and older people by meeting the requirements of the web content accessibility guidelines and by publishing an accessibility statement in the prescribed format. The Government Digital Service’s accessibility monitoring team reviews public sector websites to ensure compliance with the accessibility regulations and supports Departments to improve their services.
I welcome the Minister’s approach to improving access through technology. However, the majority of the concerns that colleagues and I receive are from those who cannot use technology. Rather than improving access, for some, technology can act as a barrier. What is her assessment of the impact of digital exclusion in the UK? Will the digital inclusion strategy that she has announced include digital exclusion at all levels of Government?
Digital inclusion is a priority for this Government. We have set up the digital inclusion and skills unit to ensure that everyone has the access, skills, support and confidence to participate in modern digital society, whatever their circumstances. Work is ongoing to develop our approach to digital inclusion and co-ordinate across Departments, and we hope to announce more on that soon. We will work closely with the third sector, the industry, devolved Governments and local authorities to ensure that future interventions are targeted and based on individuals’ needs.
(4 months, 2 weeks ago)
Commons Chamber

I thank the hon. Member for Leeds East (Richard Burgon) for opening the debate and all other colleagues who have contributed. I know that this issue will be close to the hearts of many of us, because it is about protecting the safety of everyone, including our children and young people.
This evening I want to talk about why this issue matters and what the Online Safety Act will do about it. First, I would like to share my deepest sympathies with the family and friends of Joe Nihill—a 23-year-old man who ended his life after finding suicide-related content online. Unfortunately, stories such as Joe's are not uncommon—we have heard about Tom, a 22-year-old young man, who also died from suicide. As part of our work in online safety we speak to groups that have campaigned for years for a safer internet, often led by bereaved families. I thank Joe's mother Catherine, his sister-in-law Melanie and all the bereaved families for their tireless work. We continue to listen to their expertise in this conversation.
People who are thinking about ending their lives or hurting themselves might turn to the internet as a place of refuge. All too often, what they find instead is content encouraging them not to seek help. That deluge of content has a real-world impact. Suicide-related internet use is a factor in around a quarter of deaths by suicide among people aged 10 to 19 in the UK—at least 43 deaths a year. Lots of research in this area focuses on children, but it is important to recognise that suicide-related internet use can be a factor in suicide in all age groups. These harms are real, and tackling them must be a collective effort.
On the hon. Member's first point, we welcome efforts by all companies, including internet service providers, to tackle illegal content so that no more lives are tragically lost to suicide. Online safety forms a key pillar of the Government's suicide prevention strategy. However, we are clear that the principal responsibility sits squarely with those who post such hateful content, and the sites where it is allowed to fester—sites that, until now, have not been made to face the consequences. The Online Safety Act has been a long time coming. A decade of delay has come at a tragic human cost, but change is on its way. On Monday, Ofcom published its draft illegal harms codes under the Online Safety Act, which are a step change.
On the hon. Member's second point, I can confirm that from next spring, for the first time, social media platforms and search engines will have to look proactively for and take down illegal content. These codes will apply to sites big and small. If services do not comply they could be hit by massive fines, or Ofcom could, with the agreement of the courts, use business disruption measures—court orders that mean that third parties have to withdraw their services or restrict or block access to non-compliant services in the UK. We have made intentionally encouraging or assisting suicide a priority offence under the Act. That means that all providers, no matter their size, will have to show that they are taking steps to stop their sites being used for such content.
The strongest protections in the Act's framework are for children, so on the hon. Member's third point, I assure him that under the draft child safety codes, any site that allows content that promotes self-harm, eating disorders or suicide will now have to use highly effective age limits to stop children from accessing such content. Some sites will face extra duties. We have laid the draft regulations setting out the threshold conditions for category 1, 2A and 2B services under the Act. Category 1 sites are those that have the ability to spread content easily, quickly and widely. They will have to take down content if it goes against their terms of service, such as posts that could encourage self-harm or eating disorders. They will also have to give adult users the tools to make it less likely that they will see content that they do not want to see, or that will alert them to the nature of potentially harmful content.
A suicide forum will be unlikely to have terms of service that restrict legal suicide content, and users of these sites are unlikely to want to use tools that make it less likely they will see such content. However, that absolutely does not mean that such forums—what people call "small but risky" sites—can go unnoticed.
Every site, whether it has five users or 500 million users, will have to proactively remove illegal content, such as content where there is proven intent of encouraging someone to end their life. Ofcom has also set up a “small but risky” supervision taskforce to ensure that smaller forums comply with new measures, and it is ready to take enforcement action if they do not do so. The Government understand that just one person seeing this kind of content could mean one body harmed, one life ended, and one family left grieving.
The problem is that the sites that the hon. Member for Leeds East (Richard Burgon) referred to—and there are many others like them—do not necessarily fall into the illegal category, although they still have extremely dangerous and harmful content. Despite a cross-party vote in Parliament to include in the Online Safety Act these very small and very dangerous sites in category 1, there has been a proactive decision to leave them out of the illegal harms codes, which were published yesterday. Can the Minister put on record exactly why that is? Why can these sites not be included in that category? There is all sorts of content glamourising suicide, self-harm, eating disorders and other hate speech that is being promoted by these small sites. They should be regulated to a high level.
Based on research regarding the likely impact of user numbers and functionalities, category 1 is about easy, quick and wide dissemination of regulated user-generated content. As Melanie Dawes set out in her letter to the Secretary of State in September, Ofcom has established a "small but risky" supervision taskforce, as I mentioned, to manage and enforce compliance among smaller services. It has the power to impose significant penalties and, as I say, to take remedial action against non-compliant services. As the hon. Member for Leeds East mentioned earlier, the Online Safety Act is one of the biggest steps that the Government have taken on online safety, but it is imperfect. It is an iterative process, and it will be kept under review.
I thank the hon. Gentleman for raising this matter, and for bringing to our memory Joe Nihill and those like him, who turned to the internet for help and were met with harm. On his final point, on the effective implementation of the Online Safety Act, we will continue to engage with all providers in this space. I am confident that these measures are a big step in making tech companies play their part in wiping out those harms and making the internet a safer place for us all. The hon. Gentleman raised the matter of an outstanding question. I do not know whether he has gone to the wrong Department, but I will commit to looking up that question and ensuring that he receives a response to it.
With that, I thank you, Madam Deputy Speaker, and wish you and the whole House a very happy Christmas.
Question put and agreed to.