Lords Chamber

Lord Clement-Jones
That this House regrets that the draft Protection of Children Codes of Practice for search services does not fully deliver the level of protection for children envisaged by the Online Safety Act 2023 due to regulatory gaps, accessibility challenges, and the consultation process failing adequately to address feedback from civil society organisations and victims’ groups.
Relevant document: 25th Report from the Secondary Legislation Scrutiny Committee (special attention drawn to the instrument).
Lord Clement-Jones (LD)
My Lords, this is a regret Motion, and one of my regrets today is that we are debating it so long after it was tabled back in May this year. The Online Safety Act 2023 was born from tireless campaigning over a long period, and when I look around the Chamber, I see a number of those who were heavily engaged on that Act. The clear parliamentary intent was to create a safer digital environment. This House passed landmark legislation with the clear ambition to compel online platforms to take proportionate measures to safeguard children from accessing or being exposed to harmful and inappropriate content and behaviour.
One of the key questions today, which many have continued to raise since I first put down the regret Motion, is: does the implementation of the Act match that ambition? The children’s codes of practice were intended to translate Parliament’s intent into practical reality; yet following scrutiny by the Secondary Legislation Scrutiny Committee, extensive feedback from civil society organisations and analysis of emerging online harms, it is clear that in their current form these codes present significant shortcomings, hence this regret Motion. For example, the Molly Rose Foundation, founded following the death of 14 year-old Molly Russell, is deeply dismayed by the lack of ambition in these codes and states explicitly that it does not have confidence that the Online Safety Act will prevent a repeat of Molly’s death.
The Online Safety Act explicitly mandates that a higher standard of protection is provided for children than for adults. It demands that services are safe by design, yet the codes recommend only a limited number of measures that do little to address the fundamental design features and functionalities that facilitate or exacerbate harm to children. Specifically, the codes fail to address the harmful design features that platforms have embedded in their business models, features that prioritise engagement and monetisation over safety. These include scroll mechanisms that trap children in continuous content consumption, push notifications that constantly pull them back to platforms, loot boxes that exploit addictive behaviours, and algorithmic amplification that prioritises content designed to maximise engagement rather than well-being.
Ofcom will require platforms only to reduce the frequency with which children are shown certain forms of harmful content, such as dangerous stunts, rather than demanding they stop recommending it altogether. Several platforms have persuaded the regulator that content moderation is not technically feasible, leading Ofcom to require only “proportionate alternatives” such as preventing access to group chats where primary priority content has been identified, which the Molly Rose Foundation anticipates is highly likely to be gamed by the industry. Measures that could have helped, such as enabling children to provide feedback on algorithmic recommendations, appear to have been watered down and are now effectively left to the platform’s discretion.
The codes fail adequately to require safety by design or to require companies to take specific actions to address high-risk functionalities such as live streaming, despite Ofcom highlighting them in its register of risks. Civil society organisations such as Internet Matters have expressed disappointment that key recommendations on parental controls were not included as specific duties. There is a notable lack of reference to media literacy, which is essential for equipping families to support children’s safety. Concerns surrounding complex issues such as child-on-child harms were raised in consultation, yet these recommendations were not taken forward. The fundamental problem regarding pornography is not just access, but that the pornography itself is extreme, depicting acts that could not be legally published in offline formats such as DVDs. The regulator’s proposed measures for recommender systems are seen as having misdiagnosed the core problem, focusing narrowly on demotion of illegal content rather than addressing the amplification of lawful but cumulatively harmful content.
The second key issue is the failure of process. It is a matter of great concern that civil society organisations and victims’ groups felt that they were not listened to during consultation. These groups draw on the lived and often traumatic experience of victims and survivors, and they report that fundamental issues that they flagged remain unaddressed. There is a suggestion that Ofcom may have given greater weight to industry concerns than to the voices of safety advocates. Ofcom has explicitly confirmed that it has made no quantitative assessment or modelling of the societal costs and impacts of harmful online content. The quantified financial costs to businesses of compliance are given disproportionate weight compared to the immense potential impact of harm on individuals and the wider economic and societal costs.
Viscount Colville of Culross (CB)
My Lords, I thank the noble Lord, Lord Clement-Jones, for initiating this debate, and I agree with almost everything he has just said.
I applaud the enormous work that Ofcom has put into creating and implementing the children’s codes. I am pleased to hear that they have already led to a huge reduction in children accidentally stumbling on pornography and other harmful materials online. However, I fear, as the noble Lord has just said, that the rules-based nature of the codes specifies narrow recommended measures rather than incentivising desired outcomes and encouraging the platforms to implement mitigations of harms to children which go beyond these codes. This is particularly the case with live-streaming, which, according to Ofcom’s own findings, is a risky functionality. The regulator’s register of risk says that live-streaming can be a risk for several kinds of harm to children; it specifies the real-time sharing of suicide and self-harm content.
When Dame Melanie Dawes came before the Communications and Digital Committee, on which I have the privilege to serve, she said that Ofcom had implemented mitigations for live-streaming by under-18s. The measures stop them from using likes, switch off screen capture and prohibit comments on their feeds. This has the beneficial effect of stopping any adult who might consider grooming a child from interacting with and encouraging the child user to take further action. However, it still exposes children to potential harms from adult predators. Surely, the best option would be to stop children from using the functionality, or at least introduce some age-appropriate design that limits usage to 16 to 18 year-olds. I know that Ofcom regards such a ban, or even age-appropriate design, as too punitive for a service that is used by under-18s, but it would achieve the aim of the Online Safety Act, which is to protect children from harm.
In addition, I would ask the regulator to address established pathways to harm that end in live streams, even if they do not begin there, in particular the specific threat profile of “com groups”, where children are identified and contacted via other functionalities and then moved to live streams, where they are often coerced into horrific actions. These and other upstream measures will protect children from these harms. It may be a good idea to look at introducing time delays between an account being set up and being allowed to start a live stream. Some services, such as LiveMe, have already banned children from live-streaming on their apps. My additional fear is that, even when services go beyond the thresholds set out in the Act, there is no rollback provision to stop them reneging on such beneficial actions.
My other area of concern is the use of VPNs by children, an issue which the noble Lord, Lord Clement-Jones, has just raised. A huge rise in their use was reported when the codes were first introduced. Internet Matters estimated that one in 10 of the under-18s was using VPNs. The fear was that they were going on to VPNs to access harmful content which the codes had prevented them reaching. Ofcom has said that it is uncertain why there is a big increase in use. Many children claim that they need the VPN because the internet connection at their school is bad and it is a way of improving access to the internet. I wonder why, if this is the case, the rise in VPN use should coincide with the introduction of the children’s code. If there had been a problem with school connections, surely that issue would have been raised prior to the code’s adoption.
The Children’s Commissioner, in her August report, called for the Government to
“explore options to ensure children aren’t able to use VPNs to avoid the age assurance process”.
This could be achieved by
“amending the Online Safety Act to bring in an additional provision which would require VPN providers in the UK to put in place Highly Effective Age Assurance … and prevent them from accessing pornographic sites”.
Can the Minister tell the House whether any such measures are being considered?
At the very least, there should be an education programme for parents, who in many cases could improve the policing of their children’s use of VPNs by understanding their possible misuses. For instance, when they are asked to pay for children’s access to the VPN app, they should interrogate the need for this access. Surely general advice for safety protection could be given to parents, as happens with parental controls for video games.
I know that Ofcom is carrying out research into why children are using VPNs. It is a welcome step, but I must ask why this was not anticipated and research carried out earlier. I am pleased with the greatly improved safety environment for children introduced with these codes, but the internet is a dangerous place. I therefore ask the Minister to ensure that it is a safe place for our children in all its functionalities.
Baroness Harding of Winscombe (Con)
My Lords, I thank the noble Lord, Lord Clement-Jones, for bringing this regret Motion. He gave a tour de force of all the reasons why we should regret that these codes are not more ambitious. I too wholeheartedly support the Online Safety Act and, once again, it is a privilege to be with the tech team across the aisles that has worked on this legislation for a very long time. I do not in any way want to diminish the substantial work that Ofcom has done on this. It is a ground-breaking piece of legislation, as the noble Lord, Lord Clement-Jones, said. There is a huge amount of work to implement it and I would not want in any way to slow down that implementation. I regret, however, that these codes are not more ambitious.
My remarks will, very briefly, focus on the first group of concerns that the noble Lord, Lord Clement-Jones, focused on: insufficient protections and the lack of ambition in them. I will specifically focus on whether these codes really allow for age-appropriate experiences. Any parent or grandparent knows that what is appropriate for a 13 year-old is very different from what is appropriate for a 17 year-old. Yet, sadly, although Ofcom recognises that user-to-user services should
“consider children in different age groups”,
there is little or no guidance on what they should actually consider. As we are learning, unless those things are specified in detail, the safe harbour provision just means that the user-to-user services do not really need to do it at all. As a result, it is highly unlikely that these codes will produce user-to-user services that are age appropriate for 13 year-olds relative to 17 year-olds. Even more fundamentally, they will not address the millions of under-13s using social media platforms that even those providers themselves admit are only appropriate for 13 year-olds and above.
Lord Russell of Liverpool (CB)
My Lords, I also thank the noble Lord, Lord Clement-Jones, for introducing this regret Motion. I am very familiar with it because, as a member of the Secondary Legislation Scrutiny Committee, I was part of the team scrutinising it when it came in front of us. I welcome the Minister to her post. This is one of her early baptisms in the world of online safety and it will be the precursor, I suspect, to many more. I suspect that she will be on a fairly steep learning curve, and I wish her well.
Many people have spoken about the perception that many of us have that we thought we were being very explicit about our hopes and ambitions for the Online Safety Bill as it went through Parliament—with, in particular, a huge amount of time in this House. If she has not yet been able to, I suggest that the Minister could benefit from sitting down over a suitable libation with the noble Lords, Lord Parkinson and Lord Clement-Jones, the noble Viscounts, Lord Camrose and Lord Colville, the noble Baronesses, Lady Harding and Lady Kidron, and others to understand what we thought we were being very clear about in terms of Parliament’s expectations when this Act passed and what we are now experiencing in terms of its enactment. That would be really helpful in understanding where we are coming from when we repeatedly raise some of these issues. That really comes under the heading of an insufficiency of ambition and of clarity of understanding about what it was that we thought we were being very clear about.
There is a failure of process in certain areas. I will not go into great detail, but the fact that smaller, high-risk sites are, to a large extent, excluded is madness. It is exactly on some of those smaller, high-risk sites where you have incidents of people being encouraged to self-harm, of people being encouraged to end their lives and of radicalisation. That is going on in plain sight. At the moment, Ofcom does not appear to feel that it has enough resources to do anything about it. I am also not sure that it feels it is entirely clear, under the auspices of the Act, whether this should indeed be a priority for it.
There are also structural flaws: the noble Lord, Lord Clement-Jones, mentioned the safe harbour. There are three key questions that I will pose to the Minister—I do not expect her to be able to give a magic answer at the Dispatch Box—to really focus on trying to get an understanding of what is going on and some answers. I am sure she will be asked some of these questions in the future.
The first is: does Ofcom have sufficient resources and knowledge at its disposal to do what we very clearly intended it to do in the Act? Given the evidence at the moment of what it is able to do, I am not sure the resources are adequate. If the resources are adequate, they are not being tactically and strategically deployed in the best way to achieve what we were trying to do.
The second point was referred to briefly. During the passage of the Act, we tried very hard to find a place for parents to go. If, under the terms of the Act, they are meant to go to the platform with which they have a problem—perhaps their child was harmed or, God forbid, even died—and the platform is unable to satisfy them and give them an adequate response, they have nowhere to go. We talked about that at length during the passage of the Act, and it is still the case. I do not think, in all conscience, that is adequate or appropriate. I encourage the Government to look carefully at that and how it might be mitigated. Talking to people such as Ian Russell and the Molly Rose Foundation would be a very good way of understanding what those families, who are not getting an adequate response, are going through and will continue to go through.
The third area is the level of scrutiny that the Act is undergoing. We fought in vain to encourage the then Government to agree to set up a Joint Committee of both Houses of Parliament to scrutinise the Online Safety Act on a continuing basis; to establish a dialogue with Ofcom in a direct and relatively open way, but also for it to be possible to do it, if needs be, more discreetly, away from the limelight and publicity; to try to understand some of the issues and problems that Ofcom may be having; and to see how we can help, rather than being slightly outside it, as it is currently constructed. I do not feel comfortable being critical of Ofcom without necessarily being in full receipt of the facts and understanding what is really going on inside. I think all those of us involved in the passage of the Act would like to help Ofcom do its job, not castigate it for not doing what we think it should have done. Trying to see whether there is a way in which we can have a more regular dialogue between Parliament and Ofcom, for each to understand where the other is coming from and to be better informed, would be a good step forward.
The day before yesterday, in our Secondary Legislation Scrutiny Committee, we had yet another statutory instrument on online safety, in this case from the Home Office. Again, I am afraid it was slightly disappointing news. This statutory instrument has a particularly catchy title. It is called the Online Safety (CSEA Content Reporting by Regulated User-to-User Service Providers) (Revocation) Regulations. For those at the Dispatch Box, it is Statutory Instrument 2025 No. 1066, like the Battle of Hastings. In this case, an online portal to enable all reports of child sexual exploitation and abuse to be aggregated in one place was meant to go live, I think, next month. For reasons probably to do with poor design and project planning, it will not go live. It is effectively having to be rebuilt and will hopefully go online, if it works, at some point in the spring. We will publish our report and noble Lords will be able to read it and see that the committee was not exactly happy. In this case, the Home Office provided an inadequate Explanatory Memorandum and has agreed to go back and do a better job. I can see the chair of our committee sitting behind the Minister; he will be well aware of that.
In conclusion, I think the status quo is untenable. Until and unless the group of us who were particularly closely involved in the passage of the Act are more confident that the victims who are suffering in the online world, particularly children, are better protected—until we feel that their concerns and experiences are being responded to more robustly, succinctly and accurately—we will continue to keep on raising this issue again and again.
Baroness Barran (Con)
My Lords, I apologise: I came to listen to this debate from the steps of the Throne, but the more I listened, the more I thought I would make a very short contribution. I join others in thanking the noble Lord, Lord Clement-Jones, for his Motion. The noble Lords, Lord Storey and Lord Watson, and others in the House will know that, as part of the Children’s Wellbeing and Schools Bill, the noble Lord, Lord Nash, and I and others have introduced a number of amendments that are relevant to our debate today. One would raise the age of access to social media for children from 13 to 15. Another would prohibit the use of VPNs by children. A third would ban the use of smartphones in schools during the school day.
The Department for Education and the noble Baroness, Lady Smith of Malvern, in their rejection of our proposed amendments in Committee, cited as reasons for waiting the lack of convincing evidence and the fact that these codes were going to be implemented, and said it was premature to act. I hope there is some way of making sure that the noble Baroness is briefed on today’s debate, because I think she might feel, if she listened to some of the comments around the House, somewhat less reassured. She would also have been less reassured if she had been present earlier this week at the round table we hosted, across parties and with Cross-Bench support, which took evidence from medical experts including the noble Baroness, Lady Cass, academic experts and safeguarding experts. What we heard was deeply troubling.
The Minister may be aware that there are a number of ongoing campaigns about aspects of this and the way in which social media has led to tragic deaths of children. The noble Lord, Lord Russell, referred to Ian Russell and his daughter Molly, but Esther Ghey, mother of Brianna Ghey, and Ellen Roome, mother of Jools, also lost their children tragically as a result of their involvement with social media. This is an opportunity for the Government to be on the right side of history. All the evidence seems to be going in one direction and one direction only in terms of harm to children. If there is ever a time to adopt the precautionary principle, surely this is it.
Lord Watson of Invergowrie (Lab)
My Lords, the noble Baroness, Lady Barran, began with an apology and I must do the same, because I did not leave my office soon enough and I missed the first few paragraphs of the speech by the noble Lord, Lord Clement-Jones, to whom I personally apologise, and I apologise to the House in general for that. As the noble Lord, Lord Russell, said, I am the chair of the Secondary Legislation Scrutiny Committee, but I speak today in an entirely personal capacity.
The noble Lord, Lord Clement-Jones, has actually left very little to say—so I will say very little. I certainly agreed with the important points he highlighted and went into in some detail. The gaps remaining in those codes are a genuine concern. The Department for Science, Innovation and Technology and Ofcom have pointed to the fact that they are simply the first iteration. That may well be the case, but both will need to ensure that any shortcomings that emerge are addressed at the earliest opportunity, and I hope it may be possible for my noble friend, whom I welcome to her post on the Front Bench, to offer an assurance that the necessary legislative changes that result from the shortcomings will be implemented as a matter of priority. Anything else would be entirely inappropriate, and indeed perhaps even unforgivable.
Viscount Camrose (Con)
My Lords, not much we debate in your Lordships’ House unites us so thoroughly as our shared recognition that children must be protected from harmful online content and behaviours. I am delighted that we are as one when it comes to the importance of shielding young people from extreme pornography, content promoting self-harm or suicide, or other serious risks.
This makes it all the more important to scrutinise how the Government and Ofcom have chosen to implement these protections. The role of the draft codes of practice, laid in April this year and brought into effect in July, is to translate Parliament’s intentions into practical rules for service providers. As the noble Lord, Lord Russell, set out so clearly, there are some serious concerns about whether these codes are achieving their stated objectives, and I thank the noble Lord, Lord Clement-Jones, for bringing this important Motion to the House today and for giving us the chance to air our views.
There is some evidence that the codes are being applied in a way that risks overreach and unintended consequences. Some platforms, such as X and Reddit, in attempting to comply, blocked wide-ranging content, including parliamentary debates on grooming gangs and posts relating to the wars in Ukraine and Gaza. Several experts have warned that such overapplication risks stifling legitimate public debate. It has even been suggested that some platforms deliberately overapply some rules as a way to influence government towards weakening them.
The Act was always designed to respect freedom of expression—political and otherwise—while protecting internet users, especially children, from harm. The Government’s own guidance confirms this, but clearly the practical effect to date has not always reflected that intent.
There are also concerns about the complexity and accessibility of the codes. Platforms, parents and, of course, children themselves may in some instances struggle to understand what duties are required and how to enforce them. The guidance is hundreds of pages long and, while Ofcom has issued advice on risk assessments and age-verification measures, there is a real danger that the practical realities of compliance, particularly for smaller providers, leave gaps in protection. Complexity should not become a barrier to the very protections these codes are meant to provide.
We have also been discussing the iterative approach taken by Ofcom. Presenting the codes as a first step, to be refined over time, is in principle essential, for two reasons. The first is that, as we know, this is a pioneering piece of legislation and we must remain open to adapting it. The second is that I am afraid that the people we are up against are inventive users of fast-moving technology.
However, the iterative approach is also clearly creating uncertainty. Civil society organisations have reported that their concerns were not fully addressed during consultation. Children face immediate risks and it is imperative that the Government ensure that these gaps are closed without delay. The noble Lord, Lord Clement-Jones, cited the statistic that, every week, a young person aged between 10 and 19 loses their life to suicide where technology has been a factor. The codes should not act or be viewed as a ceiling for safety standards. Rather, they must set a floor for safety standards and be subject to firm and measurable enforcement.
Enforcement and proportionality are, of course, critical. The Act grants Ofcom significant powers, including fines, criminal liability and restrictions on financial and commercial arrangements. Yet there are practical challenges to ensuring that these powers are applied in a proportionate and evidence-based way. The critical challenge facing the Government as they operate the Act’s machinery is to protect children while avoiding excessive interference with legitimate content and adult access to lawful material.
All that said, we on these Benches do have questions over the Government’s handling of these codes. Our purpose is to challenge the Government to deliver children’s online safety effectively and proportionately. While I welcome the Minister to her place and wish her the very best for her very important role, particularly in this respect, I ask her for some greater clarity, if she is able to provide it, on three strands of Ofcom’s work. First, how will Ofcom monitor implementation by platforms? Secondly, how will it ensure that civil society is genuinely incorporated, and of course that consultees recognise that they have been listened to? Thirdly, how will it address current gaps in coverage without delay?
I am delighted to be participating in this important debate and to have the opportunity to seek these assurances from the Government. We must see rapid action to ensure that the codes protect children in practice, do not inadvertently suppress legitimate debate, and are accessible and enforceable in the real world. I support the scrutiny behind this regret Motion and hope that, when the Minister rises, she will provide answers that reassure us all that the protection of children online is being delivered with both effectiveness and proportionality.
The Parliamentary Under-Secretary of State, Department for Business and Trade and Department for Science, Innovation and Technology (Baroness Lloyd of Effra) (Lab)
My Lords, I thank noble Lords for their valuable contributions today, and I thank the noble Lord, Lord Clement-Jones, for initiating the debate. I absolutely acknowledge the huge expertise in the Chamber today. I thank the noble Lord, Lord Russell, for his suggestion of further discussions with individual Members.
I found the Secondary Legislation Scrutiny Committee’s report an excellent basis for this discussion. That committee plays a very important role, as do other committees, such as the House of Lords Communications and Digital Committee and the House of Commons Science, Innovation and Technology Committee. The role of ongoing scrutiny by all these bodies is absolutely essential. On the matter of the specific committee that the noble Lord, Lord Russell, mentioned, it would be for the House to decide whether that would be set up to monitor this legislation and the codes.
As others have mentioned, we are working closely with Ofcom to monitor the effectiveness of the Online Safety Act. While the early signs are encouraging, the true test will be whether adults and children are having a safer online experience. Ofcom has put in place a robust monitoring and evaluation programme, tracking the changes firms are making in response to regulation, gathering data from the supervised services and commissioning research to measure impact. Some of that research has been mentioned in the course of the debate. It is quite extensive and provides a lot of information to civil society organisations, Members of this House and others.
What binds us together is the determination to do everything we need to do to keep children safe online, built on the evidence. That is a priority. The previous Secretary of State, in issuing his statement of strategic priorities, made it clear that the first priority was safety by design. That builds on the safety by design measures within the codes, such as the safer design of algorithms to filter harmful content out of children’s feeds. On 25 July, Ofcom published its statement setting out what it proposes to do in consequence of that statement of strategic priorities. Under the Act, it must publish further annual reviews of the action it has taken as a result of the statement of strategic priorities, including on safety by design.
We have taken action to strengthen the regulatory framework by making further offences priority offences under the Online Safety Act, reflecting the most serious and prevalent illegal content and online activity—for example, laying an SI to make cyberflashing, encouraging self-harm and the sharing of intimate images without consent priority offences under the Act.
Others have mentioned the importance of basing our decisions on good evidence of what is happening. Recognising that further research was required to improve the evidence base, the Government have commissioned a feasibility study to explore the impact of smartphones and social media use on children.
Baroness Barran (Con)
On the point about evidence, I am absolutely not an expert in this but the noble Baroness, Lady Cass, definitely is. I think it would be a very good use of the Minister’s time to meet with her. She described a situation where the research that is being done is at a population level, where changes and attribution will be difficult to discern. I understood the noble Baroness to be making the case that—I do not want to misrepresent her—what clinicians are seeing has a lot of parallels with her review of the Tavistock. On the one hand, you wait for great population-level surveys, but you need to act on what is being seen. It is important that the Government look at both.
Baroness Lloyd of Effra (Lab)
I thank the noble Baroness for that suggestion. I would be very happy to speak with the noble Baroness, Lady Cass, and leverage her experience in drawing up the right models of evidence-gathering and research.
To come back to the core of some of the points that the noble Lord, Lord Clement-Jones, and others were making about the implementation of the Act through the codes, Ofcom has met the 18-month statutory timeline that was set by Parliament to finalise the guidance and codes of practice relating to illegal harms and the protection of children. The illegal content safety duties came into force in March this year, meaning that all companies in scope will need to protect all users, including children, from illegal content and criminal behaviour on their services. On 24 April this year, Ofcom submitted to the Secretary of State the final draft protection of children codes of practice. That regime came into force on 25 July, following parliamentary scrutiny.
Lord Clement-Jones (LD)
My Lords, I thank the Minister for her response and add my welcome to her to the Front Bench: you cannot have enough south Londoners on the Front Bench. I also thank her very much for the serious and comprehensive way in which she answered many of the points raised—and, indeed, some of the points that we did not raise—during the debate.
There is an essential issue running all the way through most of the speeches, which is this question of oversight and scrutiny. I very much hope the Minister will take a leaf out of her predecessor’s book—the noble Baroness, Lady Jones, who I am glad to see is also on the Benches today—in engaging with those Members across the House who have strong views about online safety, who helped take the Bill through, and who genuinely want to see Ofcom succeed in regulating social media platforms. It is not just about formal engagement through the SLSC or other mechanisms, valuable though that is; it is important that we get to grips with a lot of the new information in what she had to say, which I thought was extremely helpful.