Motion to Regret
12:24
Moved by
Lord Clement-Jones

That this House regrets that the draft Protection of Children Codes of Practice for search services does not fully deliver the level of protection for children envisaged by the Online Safety Act 2023 due to regulatory gaps, accessibility challenges, and the consultation process failing adequately to address feedback from civil society organisations and victims’ groups.

Relevant document: 25th Report from the Secondary Legislation Scrutiny Committee (special attention drawn to the instrument).

Lord Clement-Jones (LD)

My Lords, this is a regret Motion, and one of my regrets today is that we are debating it so long after it was tabled back in May this year. The Online Safety Act 2023 was born from tireless campaigning over a long period, and when I look around the Chamber, I see a number of those who were heavily engaged on that Act. The clear parliamentary intent was to create a safer digital environment. This House passed landmark legislation with the clear ambition to compel online platforms to take proportional measures to safeguard children from accessing or being exposed to harmful and inappropriate content and behaviour.

One of the key questions today, which many have continued to raise since I first put down the regret Motion, is: does the implementation of the Act match that ambition? The children’s codes of practice were intended to translate Parliament’s intent into practical reality; yet following scrutiny by the Secondary Legislation Scrutiny Committee, extensive feedback from civil society organisations and analysis of emerging online harms, it is clear that in their current form these codes present significant shortcomings, hence this regret Motion. For example, the Molly Rose Foundation, founded following the death of 14 year-old Molly Russell, is deeply dismayed by the lack of ambition in these codes and states explicitly that it does not have confidence that the Online Safety Act will prevent a repeat of Molly’s death.

The Online Safety Act explicitly mandates that a higher standard of protection is provided for children than for adults. It demands that services are safe by design, yet the codes recommend only a limited number of measures that do little to address the fundamental design features and functionalities that facilitate or exacerbate harm to children. Specifically, the codes fail to address the harmful design features that platforms have embedded in their business models, features that prioritise engagement and monetisation over safety. These include scroll mechanisms that trap children in continuous content consumption, push notifications that constantly pull them back to platforms, loot boxes that exploit addictive behaviours, and algorithmic amplification that prioritises content designed to maximise engagement rather than well-being.

Ofcom will require platforms only to reduce the frequency with which children are shown certain forms of harmful content, such as dangerous stunts, rather than demanding they stop recommending it altogether. Several platforms have persuaded the regulator that content moderation is not technically feasible, leading Ofcom to require only “proportionate alternatives” such as preventing access to group chats where primary priority content has been identified, which the Molly Rose Foundation anticipates is highly likely to be gamed by the industry. Measures that could have helped, such as enabling children to provide feedback on algorithmic recommendations, appear to have been watered down and are now effectively left to the platform’s discretion.

The codes fail adequately to require safety by design or to require companies to take specific actions to address high-risk functionalities such as live streaming, despite Ofcom highlighting them in its register of risks. Civil society organisations such as Internet Matters have expressed disappointment that key recommendations on parental controls were not included as specific duties. There is a notable lack of reference to media literacy, which is essential for equipping families to support children’s safety. Concerns surrounding complex issues such as child-on-child harms were raised in consultation, yet these recommendations were not taken forward. The fundamental problem regarding pornography is not just access, but that the pornography itself is extreme, depicting acts that could not be legally published in offline formats such as DVDs. The regulator’s proposed measures for recommender systems are seen as having misdiagnosed the core problem, focusing narrowly on demotion of illegal content rather than addressing the amplification of lawful but cumulatively harmful content.

The second key issue is the failure of process. It is a matter of great concern that civil society organisations and victims’ groups felt that they were not listened to during consultation. These groups draw on the lived and often traumatic experience of victims and survivors, and they report that fundamental issues that they flagged remain unaddressed. There is a suggestion that Ofcom may have given greater weight to industry concerns than to the voices of safety advocates. Ofcom has explicitly confirmed that it has made no quantitative assessment or modelling of the societal costs and impacts of harmful online content. The quantified financial costs to businesses of compliance are given disproportionate weight compared to the immense potential impact of harm on individuals and the wider economic and societal costs.

12:30
The third issue is the question of structural flaws—the safe harbour problem. Perhaps the most urgent concern relates to the fundamental architecture of these codes. Ofcom initially described them as transformational yet subsequently characterised them as merely a first iteration. Was this what we envisaged when we passed the Online Safety Act? Ofcom’s codes provide a safe harbour to platforms that effectively means that the codes act as a ceiling, not a floor, for their safety standards. This gives platforms exactly the incentive we would wish to discourage—the incentive to do precisely what is mandated and nothing more.
The measures risk baking in the current industry response rather than incentivising safety by design or steps beyond current practice. The regulatory mechanism mirrors what the largest platforms are already doing, setting a ceiling that disincentivises innovation beyond the status quo. Ofcom’s iterative approach has been criticised as being too slow and reactive, leaving a big gap in outcomes. Despite claiming that the protection provided by the children’s codes is transformative and a game-changer, Ofcom has not produced any impact assessment to set out their likely effects, nor does it face any requirement to meet or report against specified harm reduction targets.
Ofcom acknowledges that not all individual risks have specific measures in the codes. A platform may identify new harmful trends through its mandatory risk assessments: new forms of grooming, emerging harms in virtual reality and the exploitation of artificial intelligence. Yet these risks are not covered by specific measures in the current iteration of the codes, and there is no immediate legal obligation for the platform to address them. This means that vulnerable children are left unprotected in the interim period until the next iteration is consulted on. This is unacceptable; we need urgent assurance that the Government are prepared to address this fundamental flaw, potentially by amending the Online Safety Act to close this loophole and ensure that safe by design is delivered in practice.
Then there is the specific decision to reduce protective measures for smaller services. At the risk of returning to our previous regret Motion on categorisation, we have all been clear that there should be the highest duty on sites that pose the greatest risks. Critics rightly ask why small sites should become a safe harbour where harmful activity flourishes.
Fourthly, there are the practical and accessibility challenges raised by the SLSC. It pointed out that the regulatory framework is complex to the point of opacity. Key documents run to over 600 pages, and this complexity makes the framework difficult to navigate and undermines accessibility for those trying to understand service providers’ duties. This is not merely an administrative inconvenience for platforms; it makes it harder for parents, victims and smaller organisations to understand their rights and effectively report concerns. The SLSC questioned how practical it is to expect children themselves to complain about harmful content, and it is unclear what further action children could take if a service provider simply rejects their complaint. These systems must be designed with a child-first approach to guarantee that they are truly accessible and effective.
These practical challenges include immediate concerns about implementation. The widespread use of virtual private networks by children and young people risks rendering age assurance measures ineffective. What is the Government’s response to that? Concerns remain that age assurance systems may pose a data protection or privacy threat to users. Crucially, civil society organisations are concerned that both Ofcom and the ICO have not stipulated a clear approach as to how age assurance methods will be evaluated for compliance with data protection. There are concerns also that important content—political debate, educational sites, information sites such as Wikipedia, and support forums dealing with LGBTQ+ rights or sexual health—is being inappropriately age-gated on social media. We raised this as a major risk during the passage of the Act. What is the Government’s response?
I return to the harms and the scale of emerging threats that these regulatory gaps leave unaddressed. The Child Online Harms Policy Think Tank has highlighted emerging threats that the codes do not adequately address. These are not hypothetical harms—they are happening now. In virtual reality environments, children as young as nine are entering 18-plus virtual rooms. Instances of virtual groping and sexual assault have been documented. The VIRAC project found that grooming is a top risk, with half of young participants reporting strangers asking for personal images or details. Anonymity features in these immersive platforms encourage offenders and complicate identification, policing and moderation, and the high level of immersion makes harassment, hate speech and exposure to harmful adult content far more impactful than in traditional online spaces. Yet it remains unclear how the Online Safety Act will adequately address these challenges, particularly given its focus on content rather than the contact and conduct risks that dominate in virtual reality platforms.
I could mention the threat from artificial intelligence. It is projected that 90% of all online content will be AI-generated by the end of 2025. AI-powered systems can inadvertently or deliberately expose children to inappropriate content, encourage risky behaviours and enable new forms of targeted exploitation. This represents a fundamental shift in how children are being exploited online. Young people are increasingly vulnerable to becoming both victims and perpetrators of cybercrime.
The draft codes form an essential part of the new online safety regime, but the criticisms that have been made are completely justified. These codes are too cautious; they fail to incorporate civil society expertise; and they are undermined by the safe harbour provision and an incremental approach that leaves gaps, which leave children vulnerable. Can the Minister confirm what timescale has been agreed with Ofcom for the revision of these codes? We must have assurance on the approach to future consultations and scrutiny, ensuring that Parliament is kept fully informed of progress in closing these dangerous regulatory gaps. We also need confirmation that future risk assessments will be required to consider not just content but also contact, conduct and contract risks—the full range of harms that children face.
The ambition of the Online Safety Act demands robust, comprehensive and urgent action. We cannot afford to leave children exposed to known risks in the digital world while we wait for a gradual, reactive regulatory process to catch up. I beg to move.
Viscount Colville of Culross (CB)

My Lords, I thank the noble Lord, Lord Clement-Jones, for initiating this debate, and I agree with almost everything he has just said.

I applaud the enormous work that Ofcom has put into creating and implementing the children’s codes. I am pleased to hear that they have already led to a huge reduction in children accidentally stumbling on pornography and other harmful materials online. However, I fear, as the noble Lord has just said, that the rules-based nature of the codes specifies narrow recommended measures rather than incentivising desired outcomes and encouraging the platforms to implement mitigations of harms to children which go beyond these codes. This is particularly the case with live-streaming, which, according to Ofcom’s own findings, is a risky functionality. The regulator’s register of risk says that live-streaming can be a risk for several kinds of harm to children; it specifies the real-time sharing of suicide and self-harm content.

When Dame Melanie Dawes came before the Communications and Digital Committee, on which I have the privilege to serve, she said that Ofcom had implemented mitigations to live-streaming for under-18s. The measures stop them from using likes, switch off screen capture and prohibit comments on their feeds. This has the beneficial effect of stopping any adult who might consider grooming a child from interacting with them and encouraging the child user to take further action. However, it still exposes children to potential harms from adult predators. Surely the best option would be to stop children from using the functionality, or at least to introduce some age-appropriate design that limits usage to 16 to 18 year-olds. I know that Ofcom regards such a ban, or even age-appropriate design, as too punitive for a service that is used by under-18s, but it would achieve the aim of the Online Safety Act, which is to protect children from harm.

In addition, I would ask the regulator to address established pathways to harm that end in live streams, even if they do not begin there, in particular the specific threat profile of “com groups”, where children are identified and contacted via other functionalities and then moved to live streams, where they are often coerced into horrific actions. These and other upstream measures will protect children from these harms. It may be a good idea to look at introducing time delays between an account being set up and being allowed to start a live stream. Some services, such as LiveMe, have already banned children from live-streaming on their apps. My additional fear is that, even when services go beyond the thresholds set out in the Act, there is no rollback provision to stop them reneging on such beneficial actions.

My other area of concern is the use of VPNs by children, which the noble Lord, Lord Clement-Jones, just raised. A huge rise in their use was reported when the codes were first introduced. Internet Matters estimated that one in 10 under-18s was using a VPN. The fear was that they were going on to VPNs to access harmful content which the codes had prevented them from reaching. Ofcom has said that it is uncertain why there is a big increase in use. Many children claim that they need a VPN because the internet connection at their school is bad and it is a way of improving access to the internet. I wonder why, if this is the case, the rise in VPN use should coincide with the introduction of the children’s codes. If there had been a problem with school connections, surely that issue would have been raised prior to the codes’ adoption.

The Children’s Commissioner, in her August report, called for the Government to

“explore options to ensure children aren’t able to use VPNs to avoid the age assurance process”.

This could be achieved by

“amending the Online Safety Act to bring in an additional provision which would require VPN providers in the UK to put in place Highly Effective Age Assurance … and prevent them from accessing pornographic sites”.

Can the Minister tell the House whether any such measures are being considered?

At the very least, there should be an education programme for parents, who, in many cases, could enhance the policing of their children’s use of VPNs by understanding their possible misuses. For instance, when they are asked to pay for children’s access to a VPN app, they should interrogate the need for this access. Surely general advice on safety protection could be given to parents, as happens with parental controls for video games.

I know that Ofcom is carrying out research into why children are using VPNs. It is a welcome step, but I must ask why this was not anticipated and research carried out earlier. I am pleased with the greatly improved safety environment for children introduced with these codes, but the internet is a dangerous place. I therefore ask the Minister to ensure that it is a safe place for our children in all its functionalities.

Baroness Harding of Winscombe (Con)

My Lords, I thank the noble Lord, Lord Clement-Jones, for bringing this regret Motion. He gave a tour de force of all the reasons why we should regret that these codes are not more ambitious. I too wholeheartedly support the Online Safety Act and, once again, it is a privilege to be with the tech team across the aisles that has worked on this legislation for a very long time. I do not in any way want to diminish the substantial work that Ofcom has done on this. It is a ground-breaking piece of legislation, as the noble Lord, Lord Clement-Jones, said. There is a huge amount of work to implement it and I would not want in any way to slow down that implementation. I regret, however, that these codes are not more ambitious.

My remarks will, very briefly, focus on the first group of concerns that the noble Lord, Lord Clement-Jones, raised: insufficient protections and the lack of ambition in them. I will specifically focus on whether these codes really allow for age-appropriate experiences. Any parent or grandparent knows that what is appropriate for a 13 year-old is very different from what is appropriate for a 17 year-old. Yet, sadly, although Ofcom recognises that user-to-user services should

“consider children in different age groups”,

there is little or no guidance on what they should actually consider. As we are learning, unless those things are specified in detail, the safe harbour provision just means that the user-to-user services do not really need to do it at all. As a result, it is highly unlikely that these codes will produce user-to-user services that are age appropriate for 13 year-olds relative to 17 year-olds. Even more fundamentally, they will not address the millions of under-13s using social media platforms that even those providers themselves admit are only appropriate for 13 year-olds and above.

12:45
It is a weakness in the Act that user-to-user services are required only to ensure that their minimum age limits are enforced consistently. If you go into any primary school today, you will discover that they are indeed enforced consistently—consistently, most children aged under 13 are on the platforms. They are therefore not effective at all. It is a huge yawning gap that there is nothing in these codes that will change whether children aged under 13 access products that are definitely not appropriate for them. The NSPCC estimates that 2.5 million children are bypassing self-declaration age checks. That is your nine year-old grandchild saying that they are okay to use Facebook, Snapchat, Insta or, God forbid, much worse platforms.
I have one question for the Minister: what will the Government do to protect our youngest children—those whom even the social media platforms recognise should not be on these platforms—and ensure that they are not on them? Sadly, that will not happen through these codes.
Lord Russell of Liverpool (CB)

My Lords, I also thank the noble Lord, Lord Clement-Jones, for introducing this regret Motion. I am very familiar with it because, as a member of the Secondary Legislation Scrutiny Committee, I was part of the team scrutinising it when it came in front of us. I welcome the Minister to her post. This is one of her early baptisms in the world of online safety and it will be the precursor, I suspect, to many more. I suspect that she will be on a fairly steep learning curve, and I wish her well.

Many noble Lords have spoken about the perception that many of us share: we thought we were being very explicit about our hopes and ambitions for the Online Safety Bill as it went through Parliament—a Bill which took up a huge amount of time in this House in particular. If she has not yet been able to, I suggest that the Minister could benefit from sitting down over a suitable libation with the noble Lords, Lord Parkinson and Lord Clement-Jones, the noble Viscounts, Lord Camrose and Lord Colville, the noble Baronesses, Lady Harding and Lady Kidron, and others to understand what we thought we were being very clear about in terms of Parliament’s expectations when this Act passed and what we are now experiencing in terms of its enactment. That would be really helpful in understanding where we are coming from when we repeatedly raise some of these issues. This really comes under the heading of an insufficiency of ambition and of clarity of understanding about what we thought we had made very clear.

There is a failure of process in certain areas. I will not go into great detail, but the fact that smaller, high-risk sites are, to a large extent, excluded is madness. It is exactly on some of those smaller, high-risk sites that you have incidents of people being encouraged to self-harm, of people being encouraged to end their lives and of radicalisation. That is going on in plain sight. At the moment, Ofcom does not appear to feel that it has enough resources to do anything about it. I am also not sure that it feels it is entirely clear, under the auspices of the Act, whether this should indeed be a priority for it.

There are also structural flaws: the noble Lord, Lord Clement-Jones, mentioned the safe harbour. There are three key questions that I will pose to the Minister— I do not expect her to be able to give a magic answer at the Dispatch Box—to really focus on trying to get an understanding of what is going on and some answers. I am sure she will be asked some of these questions in the future.

The first is: does Ofcom have sufficient resources and knowledge at its disposal to do what we very clearly intended it to do in the Act? Given the evidence at the moment of what it is able to do, I am not sure the resources are adequate. If the resources are adequate, they are not being tactically and strategically deployed in the best way to achieve what we were trying to do.

The second point was referred to briefly. We tried very hard, during the passage of the Act, to find a place for parents to go. If, under the terms of the Act, they are meant to go to the platform with which they have a problem—perhaps their child was harmed or, God forbid, even died—and the platform is unable to satisfy them and give them an adequate response, they have nowhere to go. We talked about that at length during the passage of the Act, and it is still the case. I do not think, in all conscience, that is adequate or appropriate. I encourage the Government to look carefully at that and how it might be mitigated. Talking to people such as Ian Russell and the Molly Rose Foundation would be a very good way of understanding what those families, who are not getting an adequate response, are going through and will continue to go through.

The third area is the level of scrutiny that the Act is undergoing. We fought in vain to encourage the then Government to agree to set up a Joint Committee of both Houses of Parliament to scrutinise the Online Safety Act on a continuing basis; to establish a dialogue with Ofcom in a direct and relatively open way, but also for it to be possible to do it, if needs be, more discreetly, away from the limelight and publicity; to try to understand some of the issues and problems that Ofcom may be having; and to see how we can help, rather than being slightly outside it, as it is currently constructed. I do not feel comfortable being critical of Ofcom without necessarily being in full receipt of the facts and understanding what is really going on inside. I think all those of us involved in the passage of the Act would like to help Ofcom do its job, not castigate it for not doing what we think it should have done. Trying to see whether there is a way in which we can have a more regular dialogue between Parliament and Ofcom, for each to understand where the other is coming from and to be better informed, would be a good step forward.

The day before yesterday, in our Secondary Legislation Scrutiny Committee, we had yet another statutory instrument on online safety, in this case from the Home Office. Again, I am afraid it was slightly disappointing news. This statutory instrument has a particularly catchy title. It is called the Online Safety (CSEA Content Reporting by Regulated User-to-User Service Providers) (Revocation) Regulations. For those at the Dispatch Box, it is Statutory Instrument 2025 No. 1066, like the Battle of Hastings. In this case, an online portal to enable all reports of child sexual exploitation and abuse to be aggregated in one place was meant to go live, I think, next month. For reasons probably to do with poor design and project planning, it will not go live. It is effectively having to be rebuilt and will hopefully go online, if it works, at some point in the spring. We will publish our report and noble Lords will be able to read it and see that the committee was not exactly happy. In this case, the Home Office provided an inadequate Explanatory Memorandum and has agreed to go back and do a better job. I can see the chair of our committee sitting behind the Minister; he will be well aware of that.

In conclusion, I think the status quo is untenable. Until and unless the group of us who were particularly closely involved in the passage of the Act are more confident that the victims who are suffering in the online world, particularly children, are better protected—until we feel that their concerns and experiences are being responded to more robustly, succinctly and accurately—we will continue to keep on raising this issue again and again.

Baroness Barran (Con)

My Lords, I apologise: I came to listen to this debate from the steps of the throne, but the more I listened, the more I thought I would make a very short contribution. I join others in thanking the noble Lord, Lord Clement-Jones, for his Motion. The noble Lords, Lord Storey and Lord Watson, and others in the House, will know that, as part of the Children’s Wellbeing and Schools Bill, the noble Lord, Lord Nash, and I and others have introduced a number of amendments that are relevant to our debate today. One would raise the age of access to social media for children from 13 to 15. Another would prohibit the use of VPNs by children. A third would ban the use of smartphones in schools during the school day.

The Department for Education and the noble Baroness, Lady Smith of Malvern, in their rejection of our proposed amendments in Committee, cited as reasons for waiting the lack of convincing evidence and the fact that these codes were about to be implemented, and said that it was premature to act. I hope there is some way of making sure that the noble Baroness is briefed on today’s debate, because I think she might feel, if she listened to some of the comments around the House, somewhat less reassured. She would also have been less reassured had she been present earlier this week at the round table we hosted, across parties and with Cross-Bench support, which took evidence from medical experts including the noble Baroness, Lady Cass, academic experts and safeguarding experts. What we heard was deeply troubling.

The Minister may be aware that there are a number of ongoing campaigns about aspects of this and the way in which social media has led to tragic deaths of children. The noble Lord, Lord Russell, referred to Ian Russell and his daughter Molly, but Esther Ghey, mother of Brianna Ghey, and Ellen Roome, mother of Jools, also lost their children tragically as a result of their involvement with social media. This is an opportunity for the Government to be on the right side of history. All the evidence seems to be going in one direction and one direction only in terms of harm to children. If there is ever a time to adopt the precautionary principle, surely this is it.

Lord Watson of Invergowrie (Lab)

My Lords, the noble Baroness, Lady Barran, began with an apology and I must do the same, because I did not leave my office soon enough and I missed the first few paragraphs of the speech by the noble Lord, Lord Clement-Jones, to whom I personally apologise, and I apologise to the House in general for that. As the noble Lord, Lord Russell, said, I am the chair of the Secondary Legislation Scrutiny Committee, but I speak today in an entirely personal capacity.

The noble Lord, Lord Clement-Jones, has actually left very little to say—so I will say very little. I certainly agreed with the important points he highlighted and went into in some detail. The gaps remaining in those codes are a genuine concern. The Department for Science, Innovation and Technology and Ofcom have pointed to the fact that the codes are simply a first iteration. That may well be the case, but both will need to ensure that any shortcomings that emerge are addressed at the earliest opportunity, and I hope it may be possible for my noble friend, whom I welcome to her post on the Front Bench, to offer an assurance that the necessary legislative changes that result from those shortcomings will be implemented as a matter of priority. Anything else would be entirely inappropriate, and indeed perhaps even unforgivable.

13:00
I have been critical of Ofcom in the past and I remain unconvinced that it has shown the necessary urgency, given the speed at which we all know that online platforms develop their content. In Ofcom’s defence, I am not sure that it has adequate resources to make sure that these codes and the ones that will follow are adequately implemented. I question whether it has—or will have—enough staff with the appropriate expertise, given the way the platforms are developing, to carry out the responsibilities stemming from the Online Safety Act, and in particular these codes.
Finally, I have a word to say about the complaints process. There are questions around the practicality of expecting children to complain to a service provider in situations where they come into contact with harmful content. The noble Baroness, Lady Harding, highlighted the difference between a 13 year-old and a 17 year-old. They are both children, but they are obviously significantly different in terms of their development and ability to deal with what they see as harmful content. Can my noble friend the Minister say what would happen in situations where complaints from children are rejected by service providers, and what recourse children have thereafter?
In a world where we increasingly rely on technology as part of our day-to-day lives, ensuring that children are safe when they are online is of paramount importance and should be a key priority. Successive Governments have, I acknowledge, moved to effectively tackle this issue and protect children from harmful online activity, but I do not think any of us can get away from the speed of development. As the noble Lord, Lord Clement-Jones, said, there are many issues that need to be addressed in order to ensure that the draft codes of practice operate in an effective manner. That is the least we should be seeking in protecting young people from the risks to their safety online.
Viscount Camrose (Con)

My Lords, not much we debate in your Lordships’ House unites us so thoroughly as our shared recognition that children must be protected from harmful online content and behaviours. I am delighted that we are as one when it comes to the importance of shielding young people from extreme pornography, content promoting self-harm or suicide, or other serious risks.

This makes it all the more important to scrutinise how the Government and Ofcom have chosen to implement these protections. The role of the draft codes of practice, laid in April this year and brought into effect in July, is to translate Parliament’s intentions into practical rules for service providers. As the noble Lord, Lord Russell, set out so clearly, there are some serious concerns about whether these codes are achieving their stated objectives, and I thank the noble Lord, Lord Clement-Jones, for bringing this important Motion to the House today and for giving us the chance to air our views.

There is some evidence that the codes are being applied in a way that risks overreach and unintended consequences. Some platforms, such as X and Reddit, in attempting to comply, blocked wide-ranging content, including parliamentary debates on grooming gangs and posts relating to the wars in Ukraine and Gaza. Several experts have warned that such overapplication risks stifling legitimate public debate. It has even been suggested that some platforms deliberately overapply some rules as a way to influence government towards weakening them.

The Act was always designed to respect freedom of expression—political and otherwise—while protecting internet users, especially children, from harm. The Government’s own guidance confirms this, but clearly the practical effect has not always to date reflected that intent.

There are also concerns about the complexity and accessibility of the codes. Platforms, parents and, of course, children themselves may in some instances struggle to understand what duties are required and how to enforce them. The guidance is hundreds of pages long and, while Ofcom has issued advice on risk assessments and age-verification measures, there is a real danger that the practical realities of compliance, particularly for smaller providers, leave gaps in protection. Complexity should not become a barrier to the very protections these codes are meant to provide.

We have also been discussing the iterative approach taken by Ofcom. Presenting the codes as a first step, to be refined over time, is in principle essential, for two reasons. The first is that, as we know, this is a pioneering piece of legislation and we must remain open to adapting it. The second is that I am afraid that the people we are up against are inventive users of fast-moving technology.

However, the iterative approach is also clearly creating uncertainty. Civil society organisations have reported that their concerns were not fully addressed during consultation. Children face immediate risks and it is imperative that the Government ensure that these gaps are closed without delay. The noble Lord, Lord Clement-Jones, cited the statistic that, every week, the life of a young person aged between 10 and 19 is lost to suicide in cases where technology has been a factor. The codes should not act or be viewed as a ceiling for safety standards. Rather, they must set a floor for safety standards and be subject to firm and measurable enforcement.

Enforcement and proportionality are, of course, critical. The Act grants Ofcom significant powers, including fines, criminal liability and restrictions on financial and commercial arrangements. Yet there are practical challenges to ensuring that these powers are applied in a proportionate and evidence-based way. The critical challenge facing the Government as they operate the Act’s machinery is to protect children while avoiding excessive interference with legitimate content and adult access to lawful material.

All that said, we on these Benches do have questions over the Government’s handling of these codes. Our purpose is to challenge the Government to deliver children’s online safety effectively and proportionately. While I welcome the Minister to her place and wish her the very best for her very important role, particularly in this respect, I ask her for some greater clarity, if she is able to provide it, on three strands of Ofcom’s work. First, how will Ofcom monitor implementation by platforms? Secondly, how will it ensure that civil society is genuinely incorporated, and of course that consultees recognise that they have been listened to? Thirdly, how will it address current gaps in coverage without delay?

I am delighted to be participating in this important debate and to have the opportunity to seek these assurances from the Government. We must see rapid action to ensure that the codes protect children in practice, do not inadvertently suppress legitimate debate, and are accessible and enforceable in the real world. I support the scrutiny behind this regret Motion and hope that, when the Minister rises, she will provide answers that reassure us all that the protection of children online is being delivered with both effectiveness and proportionality.

The Parliamentary Under-Secretary of State, Department for Business and Trade and Department for Science, Innovation and Technology (Baroness Lloyd of Effra) (Lab)

My Lords, I thank noble Lords for their valuable contributions today, and I thank the noble Lord, Lord Clement-Jones, for initiating the debate. I absolutely acknowledge the huge expertise in the Chamber today. I thank the noble Lord, Lord Russell, for his suggestion of further discussions with individual Members.

I found reading the Secondary Legislation Scrutiny Committee’s report an excellent basis for this discussion. That committee plays a very important role, as do other committees, such as the House of Lords Communications and Digital Committee and the House of Commons Science, Innovation and Technology Committee. The role of ongoing scrutiny by all these bodies is absolutely essential. On the matter of the specific committee that the noble Lord, Lord Russell, mentioned, it would be for the House to decide whether that would be set up to monitor this legislation and the codes.

As others have mentioned, we are working closely with Ofcom to monitor the effectiveness of the Online Safety Act. While the early signs are encouraging, the true test will be whether adults and children are having a safer online experience. Ofcom has put in place a robust monitoring and evaluation programme, tracking the changes firms are making in response to regulation, gathering data from the supervised services and commissioning research to measure impact. Some of that research has been mentioned in the course of the debate. It is quite extensive and provides a lot of information to civil society organisations, Members of this House and others.

What binds us together is the determination to do everything we need to do to keep children safe online, built on the evidence. That is a priority. The previous Secretary of State, in issuing his statement of strategic priorities, made it clear that the first priority was safety by design. That builds on the safety by design measures within the codes, such as the safer design of algorithms to filter out harmful content from children’s feeds. On 25 July, Ofcom published its statement, setting out what it proposes to do in consequence of that statement of strategic priorities. Under the Act, it must publish further annual reviews of what action it has taken as a result of the statement of strategic priorities, including on safety by design.

We have taken action to strengthen the regulatory framework by making further offences priority offences under the Online Safety Act, reflecting the most serious and prevalent illegal content and online activity—for example, laying an SI to make cyberflashing, encouraging self-harm and the sharing of intimate images without consent priority offences under the Act.

Others have mentioned the importance of basing our decisions on good evidence of what is happening. Recognising that further research was required to improve the evidence base, the Government have commissioned a feasibility study to explore the impact of smartphones and social media use on children.

Baroness Barran (Con)

On the point about evidence, I am absolutely not an expert in this, but the noble Baroness, Lady Cass, definitely is. I think it would be a very good use of the Minister’s time to meet her. She described a situation where the research that is being done is at a population level, where changes and attribution will be difficult to discern. I understood the noble Baroness to be making the case—I do not want to misrepresent her—that what clinicians are seeing has a lot of parallels with her review of the Tavistock. One can wait for great population-level surveys, but one also needs to act on what is being seen. It is important that the Government look at both.

Baroness Lloyd of Effra (Lab)

I thank the noble Baroness for that suggestion. I would be very happy to speak with the noble Baroness, Lady Cass, and leverage her experience in drawing up the right models of evidence-gathering and research.

To come back to the core of the points that the noble Lord, Lord Clement-Jones, and others were making about the implementation of the Act through the codes, Ofcom has met the 18-month statutory timeline set by Parliament to finalise the guidance and codes of practice relating to illegal harms and the protection of children. The illegal content safety duties came into force in March this year, meaning that all companies in scope must protect all users, including children, from illegal content and criminal behaviour on their services. On 24 April this year, Ofcom submitted to the Secretary of State the final draft protection of children codes of practice. That regime came into force on 25 July, following parliamentary scrutiny.

13:15
The protection of children codes are a significant step forward, with the largest social media companies having to keep children safe online by law. This means that services are required to implement age checks to stop children being exposed to the most harmful content, such as pornography and content that promotes, encourages or provides instructions for eating disorders, suicide or self-harm. Services must implement age-appropriate measures to protect children from other types of harmful content, including abusive or hateful material, bullying content and content that depicts serious violence or injury. Additionally, services are now required by law to design algorithms in ways that will protect children from being served harmful content.
I will address the question that the noble Baroness, Lady Harding, raised about age limits. As she knows, probably more so than anyone, providers have age restrictions, and these are part of their terms of service. That is something that Ofcom and others will be looking at carefully in their supervisory procedures. Providers can therefore be held to account for everything they say is in their terms of service.
I come to the question about consultation with civil society and others on the codes. In line with the statutory duties, Ofcom consulted widely on the proposals and has spoken to or heard from over 100 child safety organisations, including the Children’s Commissioners and civil society organisations, some of which have been mentioned here today. Ofcom heard from over 27,000 children and 13,000 parents when undertaking its research and conducted in-depth engagement with around 100 children across the UK during its consultation. This research and engagement identified additional ways to strengthen the codes, and that has fed into the first iteration of the codes.
In April 2025, Ofcom published a statement documenting its research and the consultation responses. This statement explains how the draft codes were changed to reflect feedback from civil society stakeholders—examples include changing the recommender system measures to provide stronger protections for children and strengthening the expectation for providers to consider children’s ages.
Noble Lords raised issues around potential regulatory gaps. It is important to note that, for the first time, the industry cannot decline to take steps to protect children because it is too expensive or inconvenient. Protecting children is a priority, and we can see this in the codes. As others have mentioned, Ofcom has taken an iterative approach to the codes. This was to ensure that the initial codes were put in place to protect children as soon as possible and to meet the 18-month statutory deadline.
Since the summer, 6,000 services have already implemented effective age assurance to prevent children seeing harmful content online. We see the codes as the foundation, not the limit, when it comes to children’s online safety. Ofcom has always made it clear that additional measures would be required to build safer online experiences for children, and it is now working to strengthen future codes as online harms, technology and the evidence base evolve.
The noble Viscount, Lord Colville, mentioned live-streaming. On 13 June, the regulator announced a consultation on additional measures for the codes, which included measures on live-streaming, tackling intimate image abuse through hash matching and additional steps to ensure services are safer by design. These measures aim to stop illegal content going viral, tackling harms at source and providing further protections to children online.
Many in the House also raised monitoring and evaluation, and how successful the Online Safety Act and the codes will be in protecting children. Ofcom will continue to monitor the implementation of the codes’ measures to identify gaps in protections and in the risk assessment process for platforms. It gathers evidence through research and intelligence activity, and through engagement with civil society, and it regularly publishes its own research into online harms and relevant reports. Ofcom has the flexibility to establish other mechanisms for conducting research about users’ experiences. Examples include working with the Children’s Commissioner to establish a panel of children who regularly feed back on their online experiences and how they are changing.
The process of monitoring and research will identify areas for future consultation with stakeholders. In doing so, Ofcom is fulfilling its duties to ensure that the codes of practice are compatible with the pursuit of online safety objectives outlined in the Act, evaluating the codes and keeping them under review. It has also stated that this dynamic regulatory approach is an example of how it intends to fulfil the Government’s strategic priority of agile regulation and the need for monitoring, risk-assessing and mitigating new online harms. As the noble Viscount, Lord Camrose, said, this is intended to strike the right balance. In parallel, the Government and Ofcom are actively monitoring the regime’s impact through a programme of evaluation work, with findings feeding into the Secretary of State’s statutory post-implementation review of the effectiveness of the Act. This review must take place two to five years after the legislation comes into force.
Noble Lords also raised concerns about the ease with which services are able to access and understand Ofcom’s guidance and codes. The Act requires that measures within the codes are clear and detailed. That is why Ofcom has made a significant effort to produce accessible materials alongside the codes in order to make compliance more effective. Examples include a digital safety toolbox that supports small and medium-sized services to comply with the codes. It has also made efforts to make the codes accessible to parents and children by publishing at-a-glance versions of the codes for parents and age-appropriate videos. These efforts to raise awareness of the codes reflect Ofcom’s regard for the Government’s strategic priority of inclusivity and resilience—in particular, the need for parents, carers and children to understand the risks and be supported to stay safe against online harm. On the point about media literacy raised by the noble Viscount, Lord Colville, the draft curriculum review raised the issue of media literacy, and the Government will publish their response in due course.
Ofcom has built on these efforts by engaging directly with different groups of stakeholders, including through webinars delivered to 187 organisations and over 400 meetings with providers, offering further support to help with compliance. The Government are not saying that the codes are perfect or that more does not need to be done, but the child safety and illegal content codes are a really positive step forward.
I turn to some other points raised in the debate. On the complaints procedure under the Online Safety Act, services likely to be accessed by children have a duty to implement processes that allow parents or children to easily report harmful content that is present in parts of the service that children can access. That is something that Ofcom will look at in its supervisory activities.
Many have mentioned Ofcom’s importance and the resources that it has available to it. The resources available to Ofcom have increased significantly in the area of online safety to a projected £92 million in 2025-26, which is an uplift on previous years—something that the noble Lord, Lord Russell, and my noble friend Lord Watson of Invergowrie highlighted in the debate.
On what some noble Lords have described as a safe harbour, we believe it would not be desirable or effective to remove aspects of the framework that require Ofcom to publish pre-emptive guidance about how far providers need to go to fulfil their duties. That would be likely to create an uncertain and unclear operating environment, reducing legal certainty for services, Ofcom and users. It could also lead to more legal challenges and obstacles to delivering the vital safety benefits that this legislation provides.
A number of noble Lords mentioned fast-growing technological advances, in particular AI. Ofcom published an open letter to all service providers in November 2024 outlining the scope of the regime and its expectations of those services in protecting children online. We are seeing some activity as a result, with one service reducing the services it offers to children.
On virtual private networks, the Government will continue to monitor the use of circumvention techniques, including VPNs, and any future interventions will be informed by the evidence. At the moment, there is limited evidence on children’s use of VPNs, and the Government are looking at ways of addressing this evidence gap. There are no current plans to ban the use of VPNs, as there are legitimate reasons for using them. The initial findings indicate that age verification is being implemented effectively. The Age Verification Providers Association reported that an additional 5 million age checks were conducted daily during the first few days after the child safety duties came into effect. If a provider was not complying with its duties by promoting or encouraging VPN usage to bypass age-assurance methods, Ofcom could apply any of its enforcement powers.
To address the question of small and low-risk services, it is important that we think about the risk here. Ofcom has the statutory duty to have regard to the principles of proportionality. With many services carrying a low risk of harm, the risk assessment duties that apply to all services are key to ensure that risky services of all sizes do not slip through the net of regulation.
I will make my final remarks. The protection of children codes mark a positive shift in how children will experience the online world. Ofcom’s enforcement programme has already resulted in investigations into companies responsible for 69 services. In response to one of these investigations, a prominent suicide discussion forum has chosen to restrict access to UK users. However, I repeat that these codes are a starting point. Ofcom’s recent consultations on additional measures to strengthen both the illegal harms code of practice and the protection of children code of practice show that the regulator is committed to strengthening these codes as new harms emerge. The Government have made it clear that nothing is off the table when it comes to keeping children safe, and we will continue to monitor and assess the effectiveness of the Online Safety Act in robustly protecting children online.
Lord Clement-Jones (LD)

My Lords, I thank the Minister for her response and add my welcome to her to the Front Bench: you cannot have enough south Londoners on the Front Bench. I also thank her very much for the serious and comprehensive way in which she answered many of the points raised—and, indeed, some of the points that we did not raise—during the debate.

There is an essential issue running all the way through most of the speeches, which is this question of oversight and scrutiny. I very much hope the Minister will take a leaf out of her predecessor’s book—the noble Baroness, Lady Jones, who I am glad to see is also on the Benches today—in engaging with those Members across the House who have strong views about online safety, who helped take the Bill through, and who genuinely want to see Ofcom succeed in regulating social media platforms. It is not just about formal engagement through the SLSC or other mechanisms, valuable though that is; it is important that we get to grips with a lot of the new information in what she had to say, which I thought was extremely helpful.

13:30
The Minister may well have read the letter that was sent to the noble Baroness, Lady Jones, on 21 May from the chairs of two Select Committees, Chi Onwurah MP and the noble Baroness, Lady Keeley, complaining about the scrutiny process. This is powerful stuff coming from the chairs of two Select Committees. We really have to find a better way of doing this, particularly in this implementation period of the Online Safety Act, which so many of us helped to get on the statute book. We cannot just rely on regret Motions. This is my second regret Motion, and I do not want to be in a position of laying a regret Motion about the SI that the noble Lord, Lord Russell, mentioned, and to keep doing this on a continual basis.
It was good to hear that resources are available. The fact that further research is being carried out by the department is helpful, and no doubt will be helpful through the passage of the children’s Bill. The issue of safety by design is not going to go away. The Minister was reassuring to some extent, but I am not convinced that we have yet made it as explicit as we should. For instance, we heard—I think when we had the categorisation debate—that addiction is not covered sufficiently in the Act as a harm for children, so there are gaps. Does that require amendment of the existing Act? Is Ofcom’s interpretation of the existing Act correct? There are a number of issues there. Are we going far enough and fast enough with the existing Act, or do we need already to start thinking about changing it?
I welcome what the Minister said about evaluating the work and that this is a foundation and not a limit. Her response merits careful consideration, and I dare say many of us will want to come back to her and have further discussions about some of the detail in due course. I thank her for taking this debate seriously—the first of many, I am sure—and I very much welcome the approach she has taken. I beg leave to withdraw the Motion.
Motion withdrawn.