Protection of Children Codes of Practice Debate
Lords Chamber

Lord Clement-Jones
        That this House regrets that the draft Protection of Children Codes of Practice for search services does not fully deliver the level of protection for children envisaged by the Online Safety Act 2023 due to regulatory gaps, accessibility challenges, and the consultation process failing adequately to address feedback from civil society organisations and victims’ groups.
Relevant document: 25th Report from the Secondary Legislation Scrutiny Committee (special attention drawn to the instrument).
Lord Clement-Jones (LD)
My Lords, this is a regret Motion, and one of my regrets today is that we are debating it so long after it was tabled back in May this year. The Online Safety Act 2023 was born from tireless campaigning over a long period, and when I look around the Chamber, I see a number of those who were heavily engaged on that Act. The clear parliamentary intent was to create a safer digital environment. This House passed landmark legislation with the clear ambition to compel online platforms to take proportionate measures to safeguard children from accessing or being exposed to harmful and inappropriate content and behaviour.
One of the key questions today, which many have continued to raise since I first put down the regret Motion, is: does the implementation of the Act match that ambition? The children’s codes of practice were intended to translate Parliament’s intent into practical reality; yet following scrutiny by the Secondary Legislation Scrutiny Committee, extensive feedback from civil society organisations and analysis of emerging online harms, it is clear that in their current form these codes present significant shortcomings, hence this regret Motion. For example, the Molly Rose Foundation, founded following the death of 14-year-old Molly Russell, is deeply dismayed by the lack of ambition in these codes and states explicitly that it does not have confidence that the Online Safety Act will prevent a repeat of Molly’s death.
The Online Safety Act explicitly mandates that a higher standard of protection is provided for children than for adults. It demands that services are safe by design, yet the codes recommend only a limited number of measures that do little to address the fundamental design features and functionalities that facilitate or exacerbate harm to children. Specifically, the codes fail to address the harmful design features that platforms have embedded in their business models, features that prioritise engagement and monetisation over safety. These include scroll mechanisms that trap children in continuous content consumption, push notifications that constantly pull them back to platforms, loot boxes that exploit addictive behaviours, and algorithmic amplification that prioritises content designed to maximise engagement rather than well-being.
Ofcom will require platforms only to reduce the frequency with which children are shown certain forms of harmful content, such as dangerous stunts, rather than demanding that they stop recommending it altogether. Several platforms have persuaded the regulator that content moderation is not technically feasible, leading Ofcom to require only “proportionate alternatives” such as preventing access to group chats where primary priority content has been identified, which the Molly Rose Foundation anticipates is highly likely to be gamed by the industry. Measures that could have helped, such as enabling children to provide feedback on algorithmic recommendations, appear to have been watered down and are now effectively left to the platforms’ discretion.
The codes fail adequately to require safety by design or to require companies to take specific actions to address high-risk functionalities such as live streaming, despite Ofcom highlighting them in its register of risks. Civil society organisations such as Internet Matters have expressed disappointment that key recommendations on parental controls were not included as specific duties. There is a notable lack of reference to media literacy, which is essential for equipping families to support children’s safety. Concerns surrounding complex issues such as child-on-child harms were raised in consultation, yet these recommendations were not taken forward. The fundamental problem regarding pornography is not just access, but that the pornography itself is extreme, depicting acts that could not be legally published in offline formats such as DVDs. The regulator’s proposed measures for recommender systems are seen as having misdiagnosed the core problem, focusing narrowly on demotion of illegal content rather than addressing the amplification of lawful but cumulatively harmful content.
The second key issue is the failure of process. It is a matter of great concern that civil society organisations and victims’ groups felt that they were not listened to during consultation. These groups draw on the lived and often traumatic experience of victims and survivors, and they report that fundamental issues that they flagged remain unaddressed. There is a suggestion that Ofcom may have given greater weight to industry concerns than to the voices of safety advocates. Ofcom has explicitly confirmed that it has made no quantitative assessment or modelling of the societal costs and impacts of harmful online content. The quantified financial costs to businesses of compliance are given disproportionate weight compared to the immense potential impact of harm on individuals and the wider economic and societal costs.
Lord Clement-Jones (LD)
        My Lords, I thank the Minister for her response and add my welcome to her to the Front Bench: you cannot have enough south Londoners on the Front Bench. I also thank her very much for the serious and comprehensive way in which she answered many of the points raised—and, indeed, some of the points that we did not raise—during the debate.
There is an essential issue running all the way through most of the speeches, which is this question of oversight and scrutiny. I very much hope the Minister will take a leaf out of her predecessor’s book—the noble Baroness, Lady Jones, who I am glad to see is also on the Benches today—in engaging with those Members across the House who have strong views about online safety, who helped take the Bill through, and who genuinely want to see Ofcom succeed in regulating social media platforms. It is not just about formal engagement through the SLSC or other mechanisms, valuable though that is; it is important that we get to grips with a lot of the new information in what she had to say, which I thought was extremely helpful.