It is a pleasure to follow my right hon. Friend the Member for Chelmsford (Vicky Ford), who made a very powerful speech, and I completely agree with her about the importance of treating eating disorders as being of the same scale of harm as the other harms addressed in the Bill.
I was the media analyst for Merrill Lynch about 22 years ago, and I made a speech about the future of media in which I mentioned the landscape changing towards one of self-generated media. However, I never thought we would get to where it is now and what the effect is. I was in the Pizza Express on Gloucester Road the other day at birthday party time, and an 11-year-old boy standing in the queue was doomscrolling TikTok videos rather than talking to his friends, which I just thought was a really tragic indication of where we have got to.
Digital platforms are also critical sources of information and of our public discourse. Across the country, people gather up to 80% of their information from such sources, but we should not place blind trust in them. Their algorithms, which promote and depromote, and their interfaces, which engage, are designed, as we have heard, to make people addicted to the peer validation and augmentation of particular points of view. They are driving people down tribal rabbit holes to the point where they cannot talk to each other or even listen to another point of view. It is no wonder that 50% of young people are unhappy or anxious when they use social media, and these algorithmic models are the problem. Trust in these platforms is misplaced: their promotion or depromotion of messages and ideas is opaque, often subjective and subject to inappropriate influence.
It is right that we tackle illegal activity and that harms to children and the vulnerable are addressed, and I support the attempt to do that in the Bill. Those responsible for the big platforms must be held to account for how they operate them, but trusting in those platforms is wrong, and I worry that compliance with their terms of service might become a tick-box absolution of their responsibility for unhappiness, anxiety and harm.
What about harm to our public sphere, our discourse, and our processes of debate, policymaking and science? To trust the platforms in all that would be wrong. We know they have enabled censorship. Elon Musk’s release of the Twitter files has shown incontrovertibly that the big digital platforms actively censor people and ideas, and not always according to reasonable moderation. They censor people according to their company biases, by political request, or with and on behalf of the three-letter Government agencies. They censor them at the behest of private companies, or to control information on their products and the public policy debate around them. Censorship itself creates mistrust in our discourse. To trust the big platforms always to do the right thing is wrong. It is not right that they should be able to hide behind their terms of service, bury issues in the Ofcom processes in the Bill, or potentially pay lip service to a tick-box exercise of merely “having regard” to the importance of freedom of expression. They might think they can just write a report, hire a few overseers, and then get away scot-free with their cynical accumulation, and the sale of the data of their addicted users and the manipulation of their views.
The Government have rightly acknowledged that addressing such issues of online safety is a work in progress, but we must not think that the big platforms are that interested in helping. They and their misery models are the problem. I hope that the Government, and those in the other place, will include in the Bill stronger duties to stop things that are harmful, to promote freedom of expression properly, to ensure that people have ready and full access to the full range of ideas and opinions, and to be fully transparent in public and real time about the way that content is promoted or depromoted on their platforms. Just to trust in them is insufficient. I am afraid the precedent has been set that digital platforms can be used to censor ideas. That is not the future; that is happening right now, and when artificial intelligence comes, it will get even worse. I trust that my colleagues on the Front Bench and in the other place will work hard to improve the Bill as I know it can be improved.
I strongly support the Bill. This landmark piece of legislation promises to put the UK at the front of the pack, and I am proud to see it there. We must tackle online abuse while protecting free speech, and I believe the Bill gets that balance right. I was pleased to serve on the Bill Committee in the last Session, and I am delighted to see it returning to the Chamber. The quicker it can get on to the statute book, the more children we can protect from devastating harm.
I particularly welcome the strengthened protections for children, which require platforms to clearly articulate in their terms of service what they are doing to enforce age requirements on their site. That will go some way to reassuring parents that their children’s developing brains will not be harmed by early exposure to toxic, degrading, and demeaning extreme forms of pornography. Evidence is clear that early exposure over time warps young girls’ views of what is normal in a relationship, with the result that they struggle to form healthy equal relationships. For boys, that type of sexual activity is how they learn about sex, and it normalises abusive, non-consensual and violent acts. Boys grow up into men whose neural circuits become habituated to that type of imagery. They actually require it, regardless of the boundaries of consent that they learn about in their sex education classes—I know this is a difficult and troubling subject, but we must not be afraid to tackle it, which is what we are doing with the Bill. It is well established that the rise of that type of pornography on the internet over time has driven the troubling and pernicious rise in violence against women and girls, perpetrated by men, as well as peer-on-peer child sexual abuse and exploitation.
During Committee we had a good debate about the need for greater criminal sanctions to hold directors individually to account and drive a more effective safety culture in the boardroom. I am proud to serve in the Chamber with my hon. Friends the Members for Stone (Sir William Cash) and for Penistone and Stocksbridge (Miriam Cates). I have heard about all their work on new clause 2 and commend them heartily for it. I listened carefully to the Minister’s remarks in Committee and thank him and the Secretary of State for their detailed engagement.
The evidence we have received is that it is parents who need the powers. I want to normalise the ability to turn off anonymised accounts. I think we will see children do that very naturally. We should also try to persuade their parents to take those stances and to have those conversations in the home. I obviously need to take up the matter with the hon. Lady and think carefully about it as matters proceed through the other place.
We know that parents are very scared about what their children see online. I welcome what the Minister is trying to do with the Bill and I welcome the legislation and the openness to change it. These days, we are all called rebels whenever we do anything to improve legislation, but the reality is that that is our job. We are sending this legislation to the other House in a better shape.
There is a lot to cover in the short time I have, but first let me thank Members for their contributions to the debate. We had great contributions from the hon. Member for Pontypridd (Alex Davies-Jones), my right hon. Friend the Member for Witham (Priti Patel) and the right hon. Member for Barking (Dame Margaret Hodge)—I have to put that right, having not mentioned her last time—as well as from my hon. Friend the Member for Gosport (Dame Caroline Dinenage); the hon. Member for Aberdeen North (Kirsty Blackman); the former Secretary of State, my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright); and the hon. Members for Plymouth, Sutton and Devonport (Luke Pollard), for Reading East (Matt Rodda) and for Leeds East (Richard Burgon).
I would happily meet the hon. Member for Plymouth, Sutton and Devonport to talk about incel content, as he requested, and the hon. Members for Reading East and for Leeds East to talk about Olly Stephens and Joe Nihill. Those are two really tragic examples and it was good to hear the tributes to them and their being mentioned in this place in respect of the changes in the legislation.
We had great contributions from my right hon. Friend the Member for South Northamptonshire (Dame Andrea Leadsom), the hon. Member for Strangford (Jim Shannon) and my hon. Friend the Member for Dover (Mrs Elphicke). I am glad that my hon. Friend the Member for Stone (Sir William Cash) gave a three-Weetabix speech—I will have to look in the Tea Room for the Weetabix he has been eating.
There were great contributions from my hon. Friends the Members for Penistone and Stocksbridge (Miriam Cates) and for Great Grimsby (Lia Nici), from my right hon. Friend the Member for Chelmsford (Vicky Ford) and from my hon. Friend the Member for Yeovil (Mr Fysh). The latter talked about doom-scrolling; I recommend that he speaks to my right hon. Friend the Member for South Holland and The Deepings (Sir John Hayes), whose quoting of G. K. Chesterton shows the advantages of reading books rather than scrolling through a phone. I also thank my hon. Friends the Members for Redditch (Rachel Maclean), for Watford (Dean Russell) and for Stroud (Siobhan Baillie).
I am also grateful for the contributions during the recommittal process. The changes made to the Bill during that process have strengthened the protections that it can offer.
We reviewed new clause 2 carefully, and I am sympathetic to its aims. We have demonstrated our commitment to strengthening protections for children elsewhere in the Bill by tabling a series of amendments at previous stages, and the Bill already includes provisions to make senior managers liable for failing to prevent a provider from committing an offence and for failing to comply with information notices. We are committed to ensuring that children are safe online, so we will work with those Members and others to bring to the other place an effective amendment that delivers our shared aims of holding people accountable for their actions in a way that is effective and targeted at child safety, while ensuring that the UK remains an attractive place for technology companies to invest and grow.
We need to take time to get this right. We intend to base our amendments on the Irish Online Safety and Media Regulation Act 2022, which, ironically, was largely based on our work here, and which introduces individual criminal liability for failure to comply with a notice to end a contravention. In line with that approach, the final Government amendment, at the end of the ping-pong between the other place and this place, will be carefully designed to capture instances in which senior managers, or those purporting to act in that capacity, have consented to or connived in ignoring enforceable requirements, risking serious harm to children. The criminal penalties, including imprisonment and fines, will be commensurate with those applying to similar offences. While the amendment will not affect those who have acted in good faith to comply in a proportionate way, it will give the Act additional teeth—as we have heard—to deliver the change that we all want, and ensure that people are held to account if they fail to protect children properly.
As was made clear by my right hon. Friend the Member for Witham, child protection and strong implementation are at the heart of the Bill. Its strongest protections are for children, and companies will be held accountable for their safety. I cannot guarantee the timings for which my right hon. Friend asked, but we will not dilute our commitment. We have already started to speak to companies in this sphere, and I will also continue to work with her and others.
My hon. Friend has rightly prioritised the protection of children. He will recall that throughout the debate, a number of Members have asked the Government to consider the amendment that will be tabled by Baroness Kidron, which will require coroners to have access to data in cases in which the tragic death of a child may be related to social media and other online activities. Is my hon. Friend able to give a commitment from the Dispatch Box that the Government will look favourably on that amendment?
Coroners already have some powers in this area, but we are aware of instances raised by my right hon. Friend and others in which that has not been the case. We will happily work with Baroness Kidron, and others, and look favourably on changes where they are necessary.
I entirely agree that our focus has been on protecting children, but is the Minister as concerned as I am about the information and misinformation, and about the societal impacts on our democracy, not just in this country but elsewhere? The hon. Member for Watford suggested a Committee that could monitor such impacts. Is that something the Minister will reconsider?
For the purpose of future-proofing, we have tried to make the Bill as flexible and as technologically neutral as possible so that it can adapt to changes. I think we will need to review it, and indeed I am sure that, as technology changes, we will come back with new legislation in the future to ensure that we continue to be world-beating—but let us see where we end up with that.
May I follow up my hon. Friend’s response to our right hon. Friend the Member for Bromsgrove (Sajid Javid)? If it is the case that coroners cannot access data and information that they need in order to go about their duties—which was the frustrating element in the Molly Russell case—will the Government be prepared to close that loophole in the House of Lords?
We will certainly work with others to address that, and if there is a loophole, we will seek to act, because we want to ensure—
I am grateful to the Minister for giving way. He was commenting on my earlier remarks about new clause 2 and the specifics around a timetable. I completely recognise that much of this work is under development. In my remarks, I asked for a timetable on engagement with the tech firms as well as transparency to this House on the progress being made on developing the regulations around criminal liability. It is important that this House sees that, and that we follow every single stage of that process.
I thank my right hon. Friend for that intervention. We want to have as many conversations as possible in this area with Members on all sides, and I hope we can be as transparent as possible in that operation. We have already started the conversation. The Secretary of State and I met some of the big tech companies just yesterday to talk about exactly this area.
My hon. Friend the Member for Dover, my right hon. Friends the Members for South Holland and The Deepings and for Maidenhead (Mrs May) and others are absolutely right to highlight concerns about illegal small boat crossings and the harm that can be caused to people crossing in dangerous situations. The use of highly dangerous methods to enter this country, including unseaworthy, small or overcrowded boats and refrigerated lorries, presents a huge challenge to us all. Like other forms of serious and organised crime, organised immigration crime endangers lives, has a corrosive effect on society, puts pressure on border security resources and diverts money from our economy.
As the Prime Minister has said, stopping these crossings is one of the Government’s top priorities for the next year. The situation needs to be resolved and we will not hesitate to take action wherever that can have the most effect, including through this Bill. Organised crime groups continue to facilitate most migrant journeys to the UK and have no respect for human life, exploiting vulnerable migrants, treating them as commodities and knowingly putting people in life-threatening situations. Organised crime gangs are increasingly using social media to facilitate migrant crossings and we need to do more to prevent and disrupt the crimes facilitated through these platforms. We need to share best practice, improve our detection methods and take steps to close illegal crossing routes as the behaviour and methods of organised crime groups evolve.
However, amendment 82 risks having unforeseen consequences for the Bill. It could bring into question the meaning of the term “content” elsewhere in the Bill, with unpredictable implications for how the courts and companies would interpret it. Following constructive discussions with my hon. Friend the Member for Dover and my right hon. Friend the Member for Maidenhead, I can now confirm that in order to better tackle illegal immigration encouraged by organised gangs, the Government will add section 2 of the Modern Slavery Act 2015 to the list of priority offences. Section 2 makes it an offence to arrange or facilitate the travel of another person, including through recruitment, with a view to their exploitation.
We will also add section 24 of the Immigration Act 1971 to the priority offences list in schedule 7. Although the offences in section 24 cannot be carried out online, paragraph 33 of the schedule states that priority illegal content includes the inchoate offences relating to the offences listed. Therefore, aiding, abetting, counselling or conspiring in those offences by posting videos of people crossing the channel that show the activity in a positive light could be an offence committed online, and would therefore fall within priority illegal content. The result of this change would therefore be that platforms would have to proactively remove that content. I am grateful to my hon. Friend the Member for Dover and my right hon. Friends the Members for South Holland and The Deepings and for Maidenhead for raising this important issue, and I would be happy to offer them a meeting with my officials to discuss the drafting of this amendment ahead of it being tabled in the other place.
We recognise the strength of feeling on the issue of harmful conversion practices and remain committed to protecting people from these practices and making sure that they can live their lives free from the threat of harm or abuse. We have had constructive engagement with my hon. Friend the Member for Rutland and Melton (Alicia Kearns) on her amendment 84, which seeks to prevent children from seeing harmful online content on conversion practices. It is right that this issue is tackled through a dedicated and tailored legislative approach, which is why we are announcing today that the Government will publish a draft Bill to set out a proposed approach to banning conversion practices. This will apply to England and Wales. The Bill will protect everybody, including those targeted on the basis of their sexuality or being transgender. The Government will publish the Bill shortly and will ask for pre-legislative scrutiny by a Joint Committee in this parliamentary Session.
This is a complex area and pre-legislative scrutiny exists to help ensure that any Bill introduced to Parliament does not cause unintended consequences. It will also ensure that the Bill benefits from stakeholder expertise and input from parliamentarians. The legislation must not, through a lack of clarity, harm the growing number of children and young adults experiencing gender-related distress through inadvertently criminalising or chilling legitimate conversations that parents or clinicians may have with children. This is an important issue, and it needs the targeted and robust approach that a dedicated Bill would provide.
I am afraid I have only three minutes, so I am not able to give way.
The Government cannot accept the Labour amendments that would re-add the adult safety duties and the concept of content that is harmful to adults. These duties and the definition of harmful content were removed from the Bill in Committee to protect free speech and to ensure that the Bill does not incentivise tech companies to censor legal content. It is not appropriate for the Government to decide whether legal content is harmful to adult users, and then to require companies to risk assess and set terms for such content. Many stakeholders and parliamentarians are justifiably concerned about the consequences of doing so, and I share those concerns. However, the Government recognise the importance of giving users the tools and information they need to keep themselves safe online, which is why we have introduced to the Bill a fairer, simpler approach for adults—the triple shield.
Members have talked a little about user empowerment. I will not have time to cover all of that, but the Government believe we have struck the right balance of empowering adult users on the content they see and engage with online while upholding the right to free expression. For those reasons, I am not able to accept these amendments, and I hope the hon. Members for Aberdeen North (Kirsty Blackman) and for Ochil and South Perthshire (John Nicolson) will not press them to a vote.
The Government amendments are consequential on removing the “legal but harmful” sections, which were debated extensively in Committee.
The Government recognise the concern of my hon. Friend the Member for Stroud about anonymous online abuse, and I applaud her important campaigning in this area. We expect Ofcom to recommend effective tools for compliance, with the requirement that these tools can be applied by users who wish to filter out non-verified users. I agree that the issue covered by amendment 52 is important, and I am happy to continue working with her to deliver her objectives in this area.
My right hon. Friend the Member for Chelmsford spoke powerfully, and we take the issue incredibly seriously. We are committed to introducing a new communications offence of intentional encouragement and assistance of self-harm, which will apply whether the victim is a child or an adult.
I do not have time, but I thank all Members who contributed to today’s debate. I pay tribute to my officials and to all the Ministers who have worked on this Bill over such a long time.
I beg to ask leave to withdraw the clause.
Clause, by leave, withdrawn.
I beg to move, That the Bill be now read the Third time.
It has been a long road to get here, and it has required a huge team effort that has included Members from across the House, the Joint Committee, Public Bill Committees, the Ministers who worked on this over the years in the Department for Digital, Culture, Media and Sport and my predecessors as Secretaries of State. Together, we have had some robust and forthright debates, and it is thanks to Members’ determination, expertise and genuine passion on this issue that we have been able to get to this point today. Our differences of opinion across the House have been dwarfed by the fact that we are united in one single goal: protecting children online.
I have been clear since becoming Secretary of State that protecting children is the very reason that this Bill exists, and the safety of every child up and down the UK has driven this legislation from the start. After years of inaction, we want to hold social media companies to account and make sure that they are keeping their promises to their own users and to parents. No Bill in the world has gone as far as this one to protect children online. Since this legislation was introduced last year, the Government have gone even further and made a number of changes to enhance and broaden the protections in the Bill while also securing legal free speech. If something should be illegal, we should have the courage of our convictions to make it illegal, rather than creating a quasi-legal category. That is why my predecessor’s change that will render epilepsy trolling illegal is so important, and why I was determined to ensure that the promotion of self-harm, cyber-flashing and intimate image abuse are also made illegal once and for all in this Bill.
Will my right hon. Friend make it clear, when the Bill gets to the other place, that content that glamorises eating disorders will be treated as seriously as content glamorising other forms of self-harm?
I met my right hon. Friend today to discuss that very point, which is particularly important and powerful. I look forward to continuing to work with her and the Ministry of Justice as we progress this Bill through the other place.
The changes are balanced with new protections for free speech and journalism—two of the core pillars of our democratic society. There are amendments to the definition of recognised news publishers to ensure that sanctioned outlets such as RT cannot benefit.
Since becoming Secretary of State I have made a number of my own changes to the Bill. First and foremost, we have gone even further to boost protections for children. Social media companies will face a new duty on age limits, so they can no longer turn a blind eye to the estimated 1.6 million underage children who currently use their sites. The largest platforms will also have to publish summaries of their risk assessments for illegal content and material that is harmful to children—finally putting transparency for parents into law.
I believe it is blindingly obvious and morally right that we should have a higher bar of protection when it comes to children. Things such as cyber-bullying, pornography and posts that depict violence do enormous damage. They scar our children and rob them of their right to a childhood. These measures are all reinforced by children and parents, who are given a real voice in the legislation by the inclusion of the Children’s Commissioner as a statutory consultee. The Bill already included provisions to make senior managers liable for failure to comply with information notices, but we have now gone further. Senior managers who deliberately fail children will face criminal liability. Today, we are drawing our line in the sand and declaring that the UK will be the world’s first country to comprehensively protect children online.
Those changes are completely separate to the changes I have made for adults. Many Members and stakeholders had concerns over the “legal but harmful” section of the Bill. They were concerned that it would be a serious threat to legal free speech and would set up a quasi-legal grey area where tech companies would be encouraged to take down content that is perfectly legal to say on our streets. I shared those concerns, so we have removed “legal but harmful” for adults. We have replaced it with a much simpler and fairer and, crucially, much more effective mechanism that gives adults a triple shield of protection. If it is illegal, it has to go. If it is banned under the company’s terms and conditions, it has to go.
Lastly, social media companies will now offer adults a range of tools to give them more control over what they see and interact with on their own feeds.
My right hon. Friend makes an important point about things that are illegal offline but legal online. The Bill has still not defined a lot of content that could be illegal and yet promoted through advertising. As part of their ongoing work on the Bill and the online advertising review, will the Government establish the general principle that content that is illegal will be regulated whether it is an ad or a post?
I completely agree with my hon. Friend on the importance of this topic. That is exactly why we have the online advertising review, a piece of work we will be progressing to tackle the nub of the problem he identifies. We are protecting free speech while putting adults in the driving seat of their own online experience. The result is today’s Bill.
I thank hon. Members for their hard work on this Bill, including my predecessors, especially my right hon. Friend the Member for Mid Bedfordshire (Ms Dorries). I thank all those I have worked with constructively on amendments, including my hon. Friends the Members for Penistone and Stocksbridge (Miriam Cates), for Stone (Sir William Cash), for Dover (Mrs Elphicke), for Rutland and Melton (Alicia Kearns), and my right hon. Friends the Members for South Holland and The Deepings (Sir John Hayes), for Chelmsford (Vicky Ford), for Basingstoke (Dame Maria Miller) and for Romsey and Southampton North (Caroline Nokes).
I would like to put on record my gratitude for the hard work of my incredibly dedicated officials—in particular, Sarah Connolly, Orla MacRae and Emma Hindley, along with a number of others; I cannot name them all today, but I note their tremendous and relentless work on the Bill. Crucially, I thank the charities and devoted campaigners, such as Ian Russell, who have guided us and pushed the Bill forward in the face of their own tragic loss. Thanks to all those people, we now have a Bill that works.
Legislating online was never going to be easy, but it is necessary. It is necessary if we want to protect our values—the values that we protect in the real world every single day. In fact, the NSPCC called this Bill “a national priority”. The Children’s Commissioner called it
“a once-in-a-lifetime opportunity to protect all children”.
But it is not just children’s organisations that are watching. Every parent across the country will know at first hand just how difficult it is to shield their children from inappropriate material when social media giants consistently put profit above children’s safety. This legislation finally puts that right.