All 9 Dean Russell contributions to the Online Safety Act 2023

Tue 19th Apr 2022 — Online Safety Bill, 2nd reading (Commons Chamber)
Tue 24th May 2022
Tue 24th May 2022
Tue 14th Jun 2022
Tue 14th Jun 2022
Tue 21st Jun 2022
Tue 28th Jun 2022
Mon 5th Dec 2022
Tue 17th Jan 2023

Online Safety Bill

Dean Russell Excerpts
2nd reading
Tuesday 19th April 2022

Commons Chamber
Dean Russell (Watford) (Con)

I had the great privilege of sitting on the Joint Committee on the draft Bill before Christmas and working with the Chair, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), fantastic Members from across both Houses and amazing witnesses.

We heard repeated stories of platforms profiting from pain and prejudice. One story that really affected me was that of Zach Eagling, a heroic young boy who has cerebral palsy and epilepsy and who was targeted with flashing images by cruel trolls seeking to trigger seizures. Seizures have been triggered in this way for other people with epilepsy too, affecting their lives and risking not just harm but, depending on their situation, potentially death. That is why I and my hon. Friend the Member for Stourbridge (Suzanne Webb)—and all members of the Joint Committee, actually, because this was in our report—backed Zach’s law.

Kim Leadbeater (Batley and Spen) (Lab)

Ten-year-old Zach is a child in my constituency who has, as the hon. Member said, cerebral palsy and epilepsy, and he has been subjected to horrendous online abuse. I hope that the Minister can provide clarity tonight and confirm that Zach’s law—which shows that not just psychological harm and distress, but physical harm can be created as a result of online abuse and trolling—will be covered in the Bill.

Dean Russell

My understanding—hopefully this will be confirmed from the Dispatch Box—is that Zach’s law will be covered by clause 150 in part 10, on communications offences, but I urge the Ministry of Justice to firm that up further.

One thing that really came through for me was the role of algorithms. The only real-world analogy I can find for the danger of algorithms is narcotics. These were organisations that targeted harmful content at people to make them more addicted to harm and to harmful content. By doing that, they numbed the senses of people using technology and social media, so that they engaged in practices that did them harm, turning them against not only others but themselves. We heard awful stories about people doing such things as barcoding—young girls cutting themselves—which was the most vile thing to hear, especially as a parent myself. There was also the idea that it was okay to be abusive to other people, and the fact that it became normalised to hurt oneself, including in ways that cannot be undone.

That leads on to a point about numbing the senses. I am really pleased that in debating the Bill today we have talked about the metaverse, because the metaverse is not just some random technology that we might talk about; it is about numbing the senses. It is about people putting on virtual reality headsets and living in a world that is not reality, even if it is for a matter of minutes or hours. As we look at these technologies and at virtual reality, my concern is that children and young people will be encouraged to spend more time in worlds that are not real and that could include more harmful content. Such worlds are increasingly accurate in their reality, in the impact that they can have and in their capability for user-to-user engagement.

I therefore think that although at the moment the Bill includes Meta and the metaverse, we need to look at it almost as a tech platform in its own right. We will not get everything right at first; I fully support the Bill as it stands, but as we move forward we will need to continue to improve it, test it and adapt it as new technologies come out. That is why I very much support the idea of a continuing Joint Committee specifically on online safety, so that as time goes by the issues can be scrutinised and we can look at whether Ofcom is delivering in its role. Ultimately, we need to use the Bill as a starting point to prevent harm now and for decades to come.

Online Safety Bill (First sitting)

Dean Russell Excerpts
Committee stage
Tuesday 24th May 2022

Public Bill Committees
The Chair

That is noted.

Dean Russell (Watford) (Con)

I refer Members to my entry in the Register of Members’ Financial Interests regarding work I did six months ago for a business called DMA.

The Chair

We will now hear oral evidence from Kevin Bakhurst, group director of broadcasting and online content at Ofcom, and Richard Wronka, director of Ofcom’s online harms policy. Before calling the first Member to ask a question, I remind all Members that questions should be limited to matters within the scope of the Bill, and we must stick to the timings in the programme motion that the Committee has agreed. For this witness panel, we have until 10.05 am. Could the witnesses please introduce themselves for the record?

Kevin Bakhurst: Good morning. I am Kevin Bakhurst, group director at Ofcom for broadcasting and online content.

Richard Wronka: I am Richard Wronka, a director in Ofcom’s online safety policy team.

--- Later in debate ---
Kim Leadbeater (Batley and Spen) (Lab)

Q Ofcom is required to produce certain codes, for example on terrorism, but others that were floated in the Green Paper are no longer in the Bill. Are you working on such codes, for example on hate crime and wider harm, and if not, what happens in the meantime? I guess that links to my concerns about the democratic importance and journalistic content provisions in the Bill, to which you have alluded. They are very vague protections and I am concerned that they could be exploited by extremists who suddenly want to identify as a journalist or a political candidate. Could you say a little about the codes and about those two particular clauses and what more you think we could do to help you with those?

Richard Wronka: I will cover the codes first. You are absolutely right that the Bill requires Ofcom to publish codes of practice, particularly on CSEA and on terror, as well as on fraudulent advertising and other areas. We are doing the work right now so that we are ready to progress with that process as soon as we get powers and duties, because it is really important that we are ready to move as quickly as possible. We will set out further detail on exactly how we plan to do that in a roadmap document that we are looking to publish before the summer break, so that will provide some of the detail.

A really important point here is that the Bill quite rightly covers a wide set of harms. We are mindful of the fact that the temptation of having a code that covers every single harm could be counterproductive and confusing for platforms, even for those that want to comply and do the right thing. One of the balancing acts for us as we produce that code framework will be to get the right coverage for all the issues that everyone is rightly concerned about, but doing that in a way that is streamlined and efficient, so that services can apply the provisions of those codes.

Kevin Bakhurst: Shall I pick up on the second bit very quickly? I think you are right; this is one of our central concerns about the definitions. As far as possible, this should be a matter for Parliament. It is really important that Parliament has a view on this. Ultimately, the regulator will take a view based on what Parliament says. We have some experience in this area, but as Richard said, we recognise the challenge—it is extremely complex. We can see the policy intent of doing it, quite rightly, and the importance of enshrining freedom of expression as far as possible, but Parliament can help to add clarity and, as you rightly say, be aware of some of the potential loopholes. At the moment, someone could describe themselves as a citizen journalist; where does that leave us? I am not quite sure. Parliament could help to clarify that, and we would be grateful.

Dean Russell

Q Do the powers in the Bill cover enough to ensure that people will not be sent flashing images if they have photosensitive epilepsy?

Richard Wronka: This picks up the point we discussed earlier, which is that I understand that the Government are considering proposals from the Law Commission to criminalise the sending of those kinds of images. It would not be covered by the illegal content duties as things stand, but if the Government conclude that it is right to criminalise those issues, it would automatically be picked up by the Bill.

Even so, the regime is not, on the whole, going to be able to pick up every instance of harm. It is about making sure that platforms have the right systems and processes. Where there is clear harm to individuals, we would expect those processes to be robust. We know there is work going on in the industry on that particular issue to try and drive forward those processes.

Dean Russell

Q But as the Bill stands, there is a very clear point about stopping harmful content being sent to people, so I imagine that would cover it at least in that sense, would it not?

Kevin Bakhurst: This is a really important point, which Richard just tried to make. The Bill gives us a great range of tools to try to prevent harm as far as possible; I just think we need to get expectations right here. Unfortunately, the Bill will not eliminate harm of every type, simply because of the nature of the internet and the task that we face. We are ambitious about driving constant improvement and stopping and addressing the main harms, but it is not going to stop every harm. We will absolutely focus on the ones that have a significant impact, but unfortunately that is the nature of the web.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

Q Just to continue the point made by my colleague, you are right to say that Ministry of Justice colleagues are considering the flashing image offence as a separate matter. But would you agree that clause 150, on harmful communications, does criminalise and therefore place into the scope of the Bill communications intended to cause harm to a “likely audience” where such harm is

“psychological harm amounting to serious distress”?

Therefore, sending somebody a flashing image with the intention of inducing an epileptic fit would likely be caught under this new harmful communications offence in clause 150, even before any separate future offence is introduced.

Richard Wronka: I think we can certainly understand the argument. I think it is important that the Bill is as clear as possible. Ultimately, it is for the courts to decide whether that offence would pick up these kinds of issues that we are talking about around flashing imagery.

--- Later in debate ---
Dean Russell

Q Before Christmas, I was involved in the Joint Committee that carried out pre-legislative work on the Bill. We heard platforms repeatedly state their belief that they are doing all they can to ensure safety and protect from harm. Actually, they do not even come close. My question to both platforms—and the others we are hearing from later today—is to what extent are you going to have to be dragged, kicking and screaming, to make sure these measures are put in place, or are you willing to work with Ofcom and other organisations to make sure that that is done?

Ben Bradley: Speaking for TikTok, we view ourselves as a second-generation platform. We launched in 2018, and at that time when you launched a product you had to make sure that safety was at the heart of it. I think the Secretary of State herself has said that the Bill process actually predates the launch of TikTok in the UK.

We view ourselves as an entertainment platform, and to express yourself, enjoy yourself and be entertained you have to feel safe, so I do not think we would be seen as kicking and screaming under this regime. It is something that we have supported for a long time, and we are regulated by Ofcom under the video-sharing platform, or VSP, regime. What the Bill will achieve is to raise the floor of industry standards, a bit like GDPR did for data, so that for all the companies in the future—to Alex’s point, this is about the next five and 10 years—there will be a baseline of standards that everyone must comply with and expectations that you will be regulated. It also takes a lot of these difficult decisions about the balance between safety and expression, privacy and security out of the hands of tech companies and puts them into the hands of a regulator that, of course, will have democratic oversight.

Katy Minshall: I do not have very much more to add. We already engage positively with Ofcom. I remember appearing before a Select Committee back in 2018 or 2019 and at that point saying that we were absolutely supportive of Ofcom taking in this role and regulation potentially being a game changer. We are supportive of the systems and processes approach and look forward to engaging constructively in the regulation.

Dean Russell

Q In terms of the timing, once the Bill becomes law, there may be a period before it is enforced, to set everything up. Are both your platforms already gearing up to make sure you fulfil the requirements of the Bill from day one?

Katy Minshall: I am glad you asked that question. The problem with the Bill is that it depends on so many things that do not exist yet. We are looking at the Bill and thinking about how we can prepare and what will be necessary, but in practice, content that is harmful to adults and harmful to children has not been set out yet. So much of the Bill depends on secondary legislation and codes of practice, and, as I described earlier in response to the question from Alex Davies-Jones, there are such real workability questions around exemptions and ID verification that I worry there would be a risk of substantial delays at the other end, which I do not think anyone wants to see.

Ben Bradley: It is the same from our perspective. We have our community guidelines and we are committed to enforcing those at the moment. A lot of the detail of the Bill will be produced in Ofcom’s codes of practice but I think it is important we think about operationalising the process, what it looks like in practice and whether it is workable.

The point Katy mentioned about the user empowerment duties—how prescriptive they would be and how they would work, not just for the platforms of today but for the future—is really important. For TikTok, to use a similar example on the user empowerment duties, the intent of the platform is that you discover content from all over the world. When you open the app, you are recommended content from all sorts of users, and there is no expectation that those users would be verified. If you have opted into this proposed user empowerment duty, there is a concern that it could exacerbate the risk of filter bubbles, because you would only be receiving content from users within the UK who have verified themselves, and we work very hard to make sure there is a diverse range of recommendations. I think it is a fairly easy fix: much as elsewhere in the Bill, where Ofcom has flexibility about whether to require specific recommendations, it could have that flexibility in this case as well, considering whether this type of power works for these types of platforms.

To use the example of the metaverse, how would it work once the metaverse is up and running? The whole purpose of the metaverse is a shared environment in which users interact, and because the Bill is so prescriptive at the minute about how this user empowerment duty needs to be achieved, it is not clear, if you were verified and I were unverified and you had opted not to see my content but I moved something in the shared environment, like this glass, whether that would move for everyone. It is a small point, but it just goes to the prescriptiveness of how it is currently drafted and the importance of giving Ofcom the flexibility that it has elsewhere in the Bill, but in this section as well.

Kirsty Blackman

Q I have a few questions, starting with Twitter, in relation to young people using the platform. How do you currently make sure that under-13s do not use the platform? What actions do you take to ensure that happens? Going forward, will that change?

Katy Minshall: At present, we follow the industry standard of age self-declaration. How you manage and verify identity—whether using a real-name system or emerging technologies like blockchain or documentation—is at the heart of a range of industries, not just ours.

Technology will change and new products that we cannot even envisage today will come on to the market. In terms of what we would do in relation to the Bill, as I said, until we see the full extent of the definitions and requirements, we cannot really say what exact approach we would take.

Online Safety Bill (Second sitting)

Dean Russell Excerpts
Committee stage
Tuesday 24th May 2022

Public Bill Committees
The Chair

Thank you very much. Don’t worry, ladies; I am sure other colleagues will have questions that they wish to pursue. Dean Russell, please.

Dean Russell (Watford) (Con)

Q Thank you, Chair. I guess this is for all three of you, but it is actually directed primarily at Richard—apologies. I do not mean to be rude—well, I am probably about to be rude.

One of the reasons why we are bringing in this Bill is that platforms such as Facebook—Meta, sorry—just have not fulfilled their moral obligations to protect children from harm. What commitment are you making within your organisation to align yourself to deliver on the requirements of the Bill?

To be frank, the track record up until now is appalling, and all I hear in these witness sessions, including before Christmas on the Joint Committee, is the big platforms talking as though they think they are doing a good job—that they are all fine. They have spent billions of pounds and it is not going anywhere, so I want to know what practical measures you are going to put in place once this Bill comes into law.

Richard Earley: Of course, I do not accept that we have failed in our moral obligation to our users, particularly our younger users. That is the most important obligation that we have. I work with hundreds of people, and there are thousands of people at our company who spend every single day talking to individuals who have experienced abuse online, people who have lived experience of working with victims of abuse, and human rights defenders—including people in public life such as yourself—to understand the impact that the use of our platform can have, and work every day to make it better.

Dean Russell

Q But do you accept that there is a massive gap between those who you perhaps have been protecting and those who are not protected, hence the need for us to put this law in place?

Richard Earley: Again, we publish this transparency report every quarter, which is our attempt to show how we are doing at enforcing our rules. We publish how many of the posts that break our rules we take down ourselves, and also our estimates of how likely you are to find a piece of harmful content on the platform—as I mentioned, it is around three in every 10,000 for hate speech right now—but we fully recognise that you will not take our word for it. We expect confidence in that work to be earned, not just assumed.

That is why last year, we commissioned EY to carry out a fully independent audit of these systems. It published that report last week when we published our most recent transparency report and, again, I am very happy to share it with you here. The reason we have been calling for many years for pieces of legislation like this Bill to come into effect is that we think having Ofcom, the regulator—as my colleagues just said—able to look in more detail at the work we are doing, assess the work we are doing, and identify areas where we could do more is a really important part of what this Bill can do.

Dean Russell

Q I am conscious of the time, sorry. I know colleagues want to come in, but what are the practical measures? What will you be doing differently once this Bill comes into law?

Richard Earley: To start with, as I said, we are not waiting for the Bill. We are introducing new products and new changes all the time.

Dean Russell

Q Which will do what, sorry? I do not mean to be rude, but what will they be?

Richard Earley: Well, I just spoke about some of the changes we made regarding young people, including defaulting them into private accounts. We have launched additional tools making it possible for people to put in lists of words they do not want to see. Many of those changes are aligned with the core objectives of the Bill, which are about assessing early the risks of any new tools that we launch and looking all the time at how the use of technology changes and what new risks that might bring. It is then about taking proactive steps to try to reduce the risk of those harms.

Dean Russell

Q May I ask you a specific question? Will that include enabling bereaved parents to see their children’s Facebook posts and profile?

Richard Earley: This is an issue we have discussed at length with DCMS, and we have consulted a number of people. It is, of course, one of the most sensitive, delicate and difficult issues we have to deal with, and we deal with those cases very regularly. In the process that exists at present, there are, of course, coronial powers. There is a process in the UK and other countries for coroners to request information.

When it comes to access for parents to individuals’ accounts, at present we have a system for legacy contacts on some of our services, where you can nominate somebody to have access to your account after you pass away. We are looking at how that can be expanded. Unfortunately, there are an awful lot of different obligations we have to consider, not least the obligations to a person who used our services and then passed away, because their privacy rights continue after they have passed away too.

Dean Russell

Okay, so there is a compassion element. I am conscious of time, so I will stop there.

The Chair

One moment, please. I am conscious of the fact that we are going to run out of time. I am not prepared to allow witnesses to leave without feeling they have had a chance to say anything. Ms Foreman, Ms O’Donovan, is there anything you want to comment on from what you have heard so far? If you are happy, that is fine; I just want to make sure you are not being short-changed.

Becky Foreman: No.

Katie O'Donovan: No, I look forward to the next question.

--- Later in debate ---
The Chair

I have four Members plus the Minister to get in, so please be brief. I call Dean Russell.

Dean Russell

Q Thank you, Sir Roger. My question builds on the future-proofing. Obviously, the big focus now is the metaverse and a virtual reality world. My question has two parts. First, is the Bill helping already by encouraging the new start-ups in that space to put safety first? Secondly, do you agree that a Joint Committee of the Houses of Parliament that continued to look at the Act and its evolution over the long term once it had been passed would be beneficial? I will come to you first, Lulu.

Lulu Freemont: On future-proofing, one of the real strengths of the Bill is its approach: it strives to rely on systems and processes, to be flexible and to adapt to future technologies. If the Bill sticks to that approach, it has the potential to be future-proof. Some points in the Bill raise a slight concern about how future-proof the regulation will be. There is a risk that mandating specific technologies—I know that is one of Ofcom’s powers under the Bill—would put a timestamp on the regulation, because those technologies will likely become outdated at some point. Ensuring that the regulation remains flexible enough to build on the levels of risk that individual companies have, and on the technologies that work for the development and innovation of those individual companies, will be a really important feature, so we do have some concerns around the mandating of specific technologies in the Bill.

On the point about setting up a committee, one of the things techUK has long called for is an independent committee that could keep the current definitions of harm under review. As companies put in place systems and processes that mitigate the risk of harm, will those harms still be harmful to the same degree? We need to constantly evolve the regime so that it is true to the harms and risks that are present today, and to evaluate it against human rights implications. Having some sort of democratically led body to think about those definitional points and evaluate them as times change and harm reduces through this regime would be very welcome.

Adam Hildreth: To add to that, are people starting to think differently? Yes, they definitely are. That ultimately, for me, is the purpose of the Bill. It is to get people to start thinking about putting safety as a core principle of what they do as an overall business—not just in the development of their products, but as the overall business. I think that will change things.

A lot of the innovation that comes means that safety is not there as the principal guiding aspect, so businesses do need some help. Once they understand how a particular feature can be exploited, or how it impacts certain demographics or particular age groups—children being one of them—they will look for solutions. A lot of the time, they have no idea before they create this amazing new metaverse, or this new metaverse game, that it could actually be a container for harmful content or new types of harm. I think this is about getting people to think. The risk assessment side is critical, for me—making sure they go through that process or can bring on experts to do that.

Ian Stevenson: I would split the future-proofing question into two parts. There is a part where this Bill will provide Ofcom with a set of powers, and the question will be: does Ofcom have the capacity and agility to keep up with the rate of change in the tech world? Assuming it does, it will be able to act fairly quickly. There is always a risk, however, that once a code of conduct gets issued, it becomes very difficult to update that code of conduct in a responsive way.

There is then a second piece, which is: are the organisations that are in scope of regulation, and the powers that Ofcom has, sufficient as things change? That is where the idea of a long-term committee to keep an eye on this is extremely helpful. That would be most successful if it did not compromise Ofcom’s independence by digging deeply into individual codes of conduct or recommendations, but rather focused on whether Ofcom has the powers and capacity that it needs to regulate as new types of company, platform and technology come along.

Dean Russell

Thank you.

Kirsty Blackman

Q My first question is for Lulu. Do small tech companies have enough staff with technical expertise to be able to fulfil their obligations under the Bill?

Lulu Freemont: It is a great question. One of the biggest challenges is capacity. We hear quite a lot from the smaller tech businesses within our membership that they will have to divert staff away from existing work to comply with the regime. They do not have compliance teams, and they probably do not have legal counsel. Even at this stage, trying to understand the Bill as currently drafted—there are lots of gaps—they come to us and ask, “What does this mean in practice?” They do not have the answers, or the capability to identify them. The attendant regulatory costs—thinking about the staff you have and the cost, and making sure the regulation is proportionate given the need to divert away from business development or whatever other work your business might be doing—are really fundamental.

Another real risk, and something in the Bill that smaller businesses are quite concerned about, is the potential proposal to extend the senior management liability provisions. We can understand them being in there to enable the regulators to do their job—information requests—but if there is any extension into individual pieces of content, coupled with a real lack of definitions, those businesses might find themselves in the position of restricting access to their services, removing too much content or feeling like they cannot comply with the regime in a proportionate way. That is obviously a very extreme case study. It will be Ofcom’s role to make sure that those businesses are being proportionate and understand the provisions, but the senior management liability does have a real, chilling impact on the smaller businesses within our membership.

Adam Hildreth: One of the challenges that we have seen over the last few years is that you can have a business that is small in revenue but has a huge global user base, with millions of users, so it is not really a small business; it just has not got to the point where it is getting advertisers and getting users to pay for it. I have a challenge on the definition of a small to medium-sized business. Absolutely, for start-ups with four people in a room—or perhaps even still just two—that do not have legal counsel or anything else, we need to make it simple for those types of businesses to ingest and understand what the principles are and what is expected of them. Hopefully they will be able to do quite a lot early on.

The real challenge comes when someone labels themselves as a small business but has millions of users across the globe—and sometimes quite a lot of people working for them. Some of the biggest tech businesses in the world that we all use had tens of people working for them at one point in time, when they had millions of users. That is the challenge, because there is an expectation for the big-tier providers to spend an awful lot of money, when the small companies are directly competing with them. There is a challenge in understanding the definition of a small business and whether that is revenue-focused, employee-focused or based on how many users it has—there may be other metrics.

Ian Stevenson: One of the key questions is how much staffing this will actually take. Every business in the UK that processes data is subject to GDPR from day one. Few of them have a dedicated data protection officer from day one; it is a role or responsibility that gets taken on by somebody within the organisation, or maybe somebody on the board who has some knowledge. That is facilitated by the fact that there are a really clear set of requirements there, and there are a lot of services you can buy and consume that help you deliver compliance. If we can get to a point where we have codes of practice that make very clear recommendations, then even small organisations that perhaps do not have that many staff to divert should be able to achieve some of the basic requirements of online safety by buying in the services and expertise that they need. We have seen with GDPR that many of those services are affordable to small business.

If we can get the clarity of what is required right, then the staff burden does not have to be that great, but we should all remember that the purpose of the Bill is to stop some of the egregiously bad things that happen to people as a result of harmful content, harmful behaviours and harmful contact online. Those things have a cost in the same way that implementing data privacy has a cost. To come back to Lulu’s point, it has to be proportionate to the business.

Online Safety Bill (Ninth sitting)

Committee stage
Tuesday 14th June 2022

Public Bill Committees
The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

Good morning, Ms Rees; it is a pleasure to serve under your chairmanship again. The SNP spokesman and the shadow Minister have already explained what these provisions do, which is to provide a power for the Secretary of State to make directions to Ofcom in relation to modifying a code of practice. I think it is important to make it clear that the measures being raised by the two Opposition parties are, as they said, envisaged to be used only in exceptional circumstances. Of course the Government accept that Ofcom, in common with other regulators, is rightly independent and there should be no interference in its day-to-day regulatory decisions. This clause does not seek to violate that principle.

However, we also recognise that although Ofcom has great expertise as a regulator, there may be situations in which a topic outside its area of expertise needs to be reflected in a code of practice, and in those situations, it may be appropriate for a direction to be given to modify the code. A recent and very real example would be a direction to reflect the latest medical advice during a public health emergency. Obviously, we saw in the last couple of years, during covid, some quite dangerous medical disinformation being spread—concerning, for example, the safety of vaccines or the “prudence” of ingesting bleach as a remedy to covid. There was also the purported and entirely false connection between 5G phone masts and covid. There were issues on public policy grounds—in this case, medical grounds—on which it might have been appropriate to make sure that a code of practice was appropriately modified.

Dean Russell (Watford) (Con)

It was mentioned earlier that some of us were on previous Committees that made broader recommendations that would perhaps be in line with the amendment. Since that time, there has been lots of discussion around this topic, and I have raised it with the Minister and colleagues. I feel reassured that there is a great need to keep the clause as is, because exceptional circumstances do arise. However, I would like reassurance that directions would be made only in exceptional circumstances and would not override Ofcom’s policy or remit, as has just been discussed.

Chris Philp

I can provide my hon. Friend with that reassurance on the exceptional circumstances point. The Joint Committee report was delivered in December, approximately six months ago. It was a very long report—I think it had more than 100 recommendations. Of course, members of the Committee are perfectly entitled, in relation to one or two of those recommendations, to have further discussions, listen further and adjust their views if they individually see fit.

--- Later in debate ---
Chris Philp

I am grateful. It is an important point.

Dean Russell

On the Joint Committee, we were concerned about future-proofing. Although I appreciate that it is not specifically included in the Bill, because it is a House matter, I urge the setting up of a separate Online Safety Act committee that runs over time, so that the legislation can continue to be improved upon and expanded, which would add value. We do not know what the next metaverse will be in 10 years’ time. However, I feel confident that the metaverse was included, and I am glad that the Minister has confirmed that.

Chris Philp

I thank my hon. Friend for his service on the Joint Committee. I heard the representations of my right hon. Friend the Member for Basingstoke about a Joint Committee, and I have conveyed them to the higher authorities.

Online Safety Bill (Tenth sitting)

Committee stage
Tuesday 14th June 2022

Public Bill Committees
Chris Philp

There are one or two points to pick up on. A question was raised about algorithms, and it is worth saying that the risk assessments that platforms must undertake will include consideration of the operation of algorithms. It is important to make it absolutely clear that that is the case.

The shadow Minister asked about the definition of harm, and whether all the harms that might concern Parliament, and many of us as parents, will be covered. It may be helpful to refer to the definition of harm provided in clause 187, at the top of page 153. Committee members will note that the definition is very wide and that subsection (2) defines it as “physical or psychological harm”, so I hope that partly answers the shadow Minister’s question.

Dean Russell (Watford) (Con)

I am jumping ahead a bit, but I know that we will discuss clause 150, Zach’s law and epilepsy in particular at some point. Given the definition that my hon. Friend has just cited, am I correct to assume that the physical harm posed to those with epilepsy who might be targeted online will be covered, and that it is not just about psychological harm?

Chris Philp

I admire my hon. Friend’s attention to the debate. The definition of harm for the harmful communications offence in clause 150 is set out in clause 150(4). In that context, harm is defined slightly differently, as

“psychological harm amounting to at least serious distress”.

The definition of harm in clause 187 that I read out is the definition of harm used elsewhere in the Bill. However, as I said before in the House and in the evidence session, the Government’s belief and intention is that epilepsy trolling would fall in the scope of clause 150, because giving someone an epileptic fit clearly does have a physical implication, as my hon. Friend said, but also causes psychological harm. Being given an epileptic fit is physically damaging, but it causes psychological harm as well.

Despite the fact that the definition of harm in clause 187 does not apply in clause 150, which has its own definition of harm, I am absolutely categoric that epilepsy trolling is caught by clause 150 because of the psychological harm it causes. I commend my hon. Friend the Member for Watford for being so attentive on the question of epilepsy, and also in this debate.

Returning to the definition of harm in clause 187, besides the wide definition covering physical and psychological harm, clause 187(4) makes it clear that harm may also arise not just directly but if the content prompts individuals to

“act in a way that results in harm to themselves or that increases the likelihood of harm to themselves”.

Clause 187(4)(b) covers content where the

“individuals do or say something to another individual that results in”

that individual suffering harm. I hope the shadow Minister is reassured that the definition of harm that applies here is extremely wide in scope.

There was a question about media literacy, which I think the hon. Member for Batley and Spen raised in an intervention. Media literacy duties on Ofcom already exist in the Communications Act 2003. The Government published a comprehensive and effective media literacy strategy about a year ago. In December—after the first version of the Bill was produced, but before the second and updated version—Ofcom updated its policy in a way that went beyond the duties contained in the previous version of the Bill. From memory, that related to the old clause 103, in the version of the Bill published in May last year, which is of course not the same clause in this version of the Bill, as it has been updated.

The hon. Member for Aberdeen North raised, as ever, some important points of detail. She asked about future-proofing. The concept of harm expressed in the clause is a general one: the definition covers whatever is harmful to children, which includes things that we do not know about at the moment and that may arise in the future. Secondly, primary priority content and priority content that is harmful can be updated from time to time by statutory instrument, so if some new thing arises that we think deserves to be designated as primary priority content or priority content that is harmful to children, we can update the designation accordingly.

The hon. Lady also asked about the exclusions in clause 53(5). The first exclusion, in subsection (5)(a), is illegal content, which is excluded because it is covered elsewhere in the Bill, in clause 52. The second limb, subsection (5)(b), covers some financial offences. Those are excluded because financial services are separately regulated. The hon. Lady used the example of gambling, which is separately regulated by the Gambling Act 2005, a review of which is imminent. There are already very strong provisions in that Act, enforced by the regulator, the Gambling Commission, including a hard-edged prohibition on gambling by under-18s.

--- Later in debate ---
Barbara Keeley

I very much agree. We cannot emphasise that enough, and it is useful that my hon. Friend has set that out, adding to what I was saying.

Amendment 55 sets out, in new paragraph 31A, the details of the information that Ofcom must request to be provided in a transparency report. First, transparency disclosures required by the Bill should include how large companies allocate resources to tackling harm in different languages—an issue that was rightly raised by the hon. Member for Ochil and South Perthshire. As we heard from Frances Haugen, many safety systems at Meta have only a subset of detection systems for languages other than English. Languages such as Welsh have almost no safety systems live on Facebook. That is neither fair nor safe.

When we consider that more than 250 languages are spoken in London alone, the inconsistency of safety systems becomes very concerning. Charities have warned that people accessing Facebook in different languages are being exposed to very different levels of risk, with some versions of Facebook having few or none of the safety systems that protect other versions of the site in different languages.

When giving evidence to the Committee last month, Richard Earley disclosed that Meta regulated only 70 languages. Given that around 3 billion people use Facebook on a monthly basis across the world, that is clearly inadequate.

Dean Russell

One of the things we found on the Joint Committee last year was the consistent message that we should not need to put this Bill in place. I want to put on the record my continued frustration that Meta and the other social media platforms are requiring us to put this Bill in place because they are not doing the monitoring, engaging in that way or putting users first. I hope that the process of going through the Bill has helped them to see the need for more monitoring. It is disappointing that we have had to get to this point. The UK Government are having to lead the world by putting this Bill in place—it should not be necessary. I hope that the companies do not simply follow what we are putting forward, but go much further and see that it is imperative to change the way they work and support their users around the world.

Barbara Keeley

I thank the hon. Gentleman and I agree. It is a constant frustration that we need this Bill. We do need it, though. In fact, amendment 55 would really assist with that, by requiring those services to go further in transparency reporting and to disclose

“the languages in which the service has safety systems or classifiers”.

We need to see what they are doing on this issue. It is an easily reported piece of information that will have an outsized impact on safety, even for English speakers. It will help linguistic groups in the multilingual UK and around the world.

Reporting on language would not be a big burden on companies. In her oral evidence, Frances Haugen told the Committee that large platforms can trivially produce this additional data merely by changing a single line of code when they do their transparency reports. We must not become wrapped up in the comfort of the language we all speak and ignore the gaping loophole left for other languages, which allows harms to slip through.

--- Later in debate ---
Chris Philp

It is a great shame that the hon. Member for Ochil and South Perthshire is occupied in the main Chamber, because I could have pointed to this change as one of the examples of the Government listening to the Joint Committee, on which he and many others served. However, I hope that the hon. Member for Aberdeen North will communicate my observation to him, which I am sure he will appreciate.

In seriousness, this is an example of the Government moving the Bill on in response to widespread parliamentary and public commentary. It is right that we extend the duties to cover commercial pornographic content as well as the user-to-user pornography covered previously. I thank the Opposition parties for their support for the inclusion of those measures.

Dean Russell

As a member of the Joint Committee, on which I worked with the hon. Member for Ochil and South Perthshire, I thank the Minister for including this clause on a point that was debated at length by the Joint Committee. Its inclusion is crucial to organisations in my constituency such as Dignify—a charity that works to raise awareness and campaign on this important point, to protect children but also wider society. As this is one of the 66 recommendations that the Minister took forward in the Bill, I would like to thank him; it is very welcome, and I think that it will make a huge difference to children and to society.

Chris Philp

I thank my hon. Friend for his intervention and for his work on the Joint Committee, which has had a huge impact, as we have seen. I hope that colleagues will join me in thanking the members of the Joint Committee for their work.

My final point on this important clause is in response to a question that the shadow Minister raised about clause 66(3), which makes reference to

“a person acting on behalf of the provider”.

That is just to ensure that the clause is comprehensively drafted without any loopholes. If the provider used an agent or engaged some third party to disseminate content on their behalf, rather than doing so directly, that would be covered too. We just wanted to ensure that there was absolutely no loophole—no chink of light—in the way that the clause was drafted. That is why that reference is there.

I am delighted that these clauses seem to command such widespread support. It therefore gives me great pleasure to commend them to the Committee.

Question put and agreed to.

Clause 66 accordingly ordered to stand part of the Bill.

Clause 67 ordered to stand part of the Bill.

Schedule 9 agreed to.

Clause 68

Duties about regulated provider pornographic content

Online Safety Bill (Fourteenth sitting)

Committee stage
Tuesday 21st June 2022

Public Bill Committees
Kim Leadbeater

I move the amendment in my name and will speak to amendment 113, which is in the name of the hon. Member for Blackpool North and Cleveleys (Paul Maynard).

The amendment would put into effect Zach’s law in full. Zach, as many Members know, is an amazing, energetic and bright young boy from my constituency. I had the absolute pleasure of visiting Zach and his mum Clare at their home in Hartshead a few weeks ago. We chatted about school and his forthcoming holiday, and he even invited me to the pub. However, Zach also has epilepsy.

Disgustingly, he was trolled online a few years ago and sent flashing images by bullies, designed to trigger his condition and give him an epileptic seizure, a seizure that would not only cause him and his family great distress, but could be extremely dangerous and cause Zach significant psychological and physical harm. I know that we are all united in our disgust at such despicable actions and committed to ensuring that this type of unbelievable online bullying is against the law under the Bill.

On Second Reading, I raised the matter directly with the Minister and I am glad that he pointed to clause 150 and stated very explicitly that subsection (4) will cover the type of online harm that Zach has encountered. However, we need more than just a commitment at the Dispatch Box by the Minister, or verbal reassurances, to protect Zach and the 600,000 other people in the UK with epilepsy.

The form of online harm that Zach and others with epilepsy have suffered causes more than just “serious distress”. Members know that the Bill as drafted lists

“psychological harm amounting to at least serious distress”

as a qualifying criterion of the offence. However, I believe that does not accurately and fully reflect the harm that epilepsy trolling causes, and that it leaves a significant loophole that none of us here wish to see exploited.

For many people with epilepsy, the harm caused by this vicious online trolling is not only psychological but physical too. Seizures are not benign events. They can result in broken bones, concussion, bruises and cuts, and in extreme cases can be fatal. It is simply not right to argue that physical harm is intrinsically intertwined with psychological harm. They are different harms with different symptoms. While victims may experience both, that is not always the case.

Professor Sander, medical director of the Epilepsy Society and professor of neurology at University College London Hospitals NHS Foundation Trust, who is widely considered one of the world’s leading experts on epilepsy, has said:

“Everyone experiences seizures differently. Some people may be psychologically distressed by a seizure and not physically harmed. Others may be physically harmed but not psychologically distressed. This will vary from person to person, and sometimes from seizure to seizure depending on individual circumstances.”

Amendment 112 will therefore expand the scope of clause 150 and insert on the face of the Bill that an offence will also be committed under the harmful communications clause when physical harm has occurred as a consequence of receiving a message sent online with malicious intent. In practical terms, if a person with epilepsy were to receive a harmful message online that triggers their epilepsy and they subsequently fall off their chair and hit their head, that physical harm will be proof of a harmful communication offence, without the need to prove any serious psychological distress that may have been caused.

This simple but effective amendment, supported by the Epilepsy Society, will ensure that the horrific trolling that Zach and others with epilepsy have had to endure will be covered in full by the Bill. That will mean that the total impact that such trolling has on the victims is reflected beyond solely psychological distress, so there can be no ambiguity and nowhere for those responsible for sending these images and videos to hide.

I am aware that the Minister has previously pointed to the possibility of a standalone Bill—a proposal that is under discussion in the Ministry of Justice. That is all well and good, but that should not delay our action when the Bill before us is a perfectly fit legislative vehicle to end epilepsy trolling, as the Law Commission report recommended.

I thank colleagues from across the House for the work they have done on this important issue. I sincerely hope that the amendment is one instance where we can be united in this Committee. I urge the Minister to adopt amendment 112, to implement Zach’s law in full and to provide the hundreds of thousands of people across the UK living with epilepsy the legal protections they need to keep them safe online. Nothing would give me greater pleasure than to call at Zach’s house next time I am in the area and tell him that this is the case.

Dean Russell (Watford) (Con)

May I praise the hon. Member for Batley and Spen for such an eloquent and heartfelt explanation of the reason why this amendment to the Bill is so important?

I have been campaigning on Zach’s law for the past nine months. I have spoken to Zach multiple times and have worked closely with my hon. Friend the Member for Stourbridge (Suzanne Webb) in engaging directly with Facebook, Twitter and the big platforms to try to get them to do something, because we should not need to have a law to stop them sending flashing images. We had got quite far a few months ago, but now that seems to have stalled, which is very frustrating.

I am stuck between my heart and my head on this amendment. My heart says we need to include the amendment right now, sort it out and get it finalised. However, my head says we have got to get it right. During the Joint Committee on the draft Bill before Christmas and in the evidence sessions for this Bill, we heard that if the platforms want to use a loophole and get around things, they will. I have even seen that with regard to the engagements and the promises we have had.

Kirsty Blackman

I wonder whether the hon. Gentleman would consider a belt and braces approach as the best way forward? We could have it in the Bill and have the other legislation, in order that this will definitely protect people and companies will not be able to wriggle out of it.

Dean Russell

That is an excellent point. I have yet to make up my mind which way to vote if the amendment is pressed to a vote; I do not know whether this is a probing amendment. Having spoken to the Epilepsy Society and having been very close to this issue for many months, for me to feel comfortable I want the Minister to go beyond saying, as he has on the Floor of the House, to me personally, in meetings and recently here, that the clause should cover epilepsy and that he is very confident of that; I want some assurance that we will change the law in some form.

Kim Leadbeater

I am incredibly grateful for the hon. Member’s comments and contribution. I agree wholeheartedly. We need more than a belief and an intention. There is absolutely no reason why we cannot have this in black and white in the Bill. I hope he can find a way to do the right thing today and vote for the amendment.

Dean Russell

The phrase “Do the right thing” is at the heart of this. My hon. Friend the Member for Ipswich (Tom Hunt) presented the Flashing Images Bill yesterday. A big part of this is about justice. I am conscious that we have got to get the balance right; stopping this happening has consequences for the people who choose to do it. I am keen to hear what the Minister says. We have got to get this right, and I am keen to get some assurances, which will very much sway my decision on the vote today.

Caroline Ansell (Eastbourne) (Con)

At the risk of following my earlier voting pattern, I am also very much with the hon. Member for Batley and Spen in spirit. I could not do the subject any more justice than she has, describing this appalling online behaviour and just how damaging it is. I am a member of the all-party parliamentary group on epilepsy and have lived experience myself.

I want to highlight the comments of the Epilepsy Society, which I am sure is following our work this afternoon. It welcomes many of the measures introduced in the Bill, but highlights something of a legislative no man’s land: clause 187 mentions physical harm, but does not apply to clause 150, and clause 150 covers only psychological harm when, as we have heard described, many seizures result in physical harm, some of it very serious. I know the Minister is equally committed to seeing this measure come about and recognises the points we have demonstrated. The hon. Lady is right that we are united; I suspect the only point on which there might be some difference is timing. I will be looking to support the introduction and the honouring in full of Zach’s law before the Bill is passed. There are many other stages.

My understanding is that many others wish to contribute, not least the Ministry of Justice. My hope, and my request to the Minister, is that those expert stakeholder voices will be part of the drafting, if supporting the amendment presented today turns out not to be the best and strongest way forward. I want to see recognition in law.

--- Later in debate ---
Chris Philp

No, it is not a bonus, because we cannot have two different laws that criminalise the same thing. We want to have laws that are, essentially, mutually exclusive. If a person commits a particular act, it should be clear which Act the offence is being committed under. Imagine that there were two different offences for the same act with different sentences—one is two years and one is 10 years. Which sentence does the judge then apply? We do not want to have law that overlaps, where the same act is basically a clear offence under two different laws. Just by using the term “physical harm”, amendment 112 creates that. I accept that it would cover epilepsy, but it would also cover a whole load of other things, which would then create duplication.

That is why the right way to do this is essentially through a better drafted version of amendment 113, which specifically targets epilepsy. However, it should be done with drafting that has been done properly—with respect to my hon. Friend the Member for Blackpool North and Cleveleys, who drafted the amendment—with definitions that are done properly, and so on. That is what we want to do.

Dean Russell

Having been involved with this Bill for quite a while now and having met Zach, I know the concerns that the Epilepsy Society has had. For me, we just need the Minister to tell us, which I think he has, that this will become law, whatever the vehicle for that is. If we know that this will be an offence by the end of this year—hopefully by summer, if not sooner—so that people cannot send flashing images to people with epilepsy, like Zach, then I will feel comfortable in not backing the amendment, on the premise that the Government will do something moving forward. Am I correct in that understanding?

Chris Philp

Yes. Just to be clear, in no world will a new law pass by the summer recess. However, I can say that the Government are committed, unequivocally, to there being a new offence in law that will criminalise epilepsy trolling specifically. That commitment is categoric. The only matter on which I need to come back to the House, which I will try to do on Report, is to confirm specifically which Bill that offence will go in. The commitment to legislate is made unequivocally today.

--- Later in debate ---
Chris Philp

The progress that the campaign has made, with the clear commitment from the Government that we are going to legislate for a specific epilepsy trolling offence, is a huge step forward. I entirely understand the hon. Lady’s impatience. I have tried to be as forthcoming as I can be about likely times, in answer to the question from the hon. Member for Aberdeen North, within the constraints of what is currently collectively agreed, beyond which I cannot step.

Amendment 112 would sort out the epilepsy issue, but unfortunately it would create duplicative criminal law. We cannot let our understandable sense of urgency end up creating a slightly dysfunctional criminal statute book. There is a path forward that is as clear as it reasonably can be. Members of the Committee will probably have inferred the plan from what I said earlier. This is a huge step forward. I suggest that we bank the win and get on with implementing it.

Dean Russell

I appreciate that there will be differences of opinion, but I feel that Zach should be smiling today whatever the outcome, whether there is a vote or this is a probing amendment. In my conversations about this over many months, it has been a real challenge. The Minister quite rightly said that the Bill already covered epilepsy, and I felt that to be true. This is a firming up of the agreement we had, and the first time I have heard it officially in any form. My message to Zach and the Epilepsy Society, who may well be watching the Committee, is that I hope they will see this as a win. With my head and my heart together, I feel that it is a win, but I forewarn the Minister that I will continue to be like a dog with a bone and make sure that those promises are delivered upon.

Chris Philp

I think that is probably a good place to leave my comments. I can offer public testimony of my hon. Friend’s tenacity in pursuing this issue.

I ask the hon. Member for Batley and Spen to withdraw the amendment. I have given the reasons why: because it would create duplicative criminal law. I have been clear about the path forward, so I hope that on that basis we can work together to get this legislated for as a new offence, which is what she, her constituent and my hon. Friends the Members for Watford and for Eastbourne and others have been calling for.

Online Safety Bill (Seventeenth sitting)

Committee stage
Tuesday 28th June 2022

Public Bill Committees
Chris Philp

That is a fairly standard clause in legislation. Clearly, for most legislation and most areas of Government activity, the relevant departmental Select Committee would be expected to provide the ongoing scrutiny, so ordinarily the DCMS Committee would do that. I hear the shadow Minister’s comments: she said that this proposal is not designed in any way to impugn or disrespect that Committee, but I listened to the comments of the Chair of that Committee on Second Reading, and I am not sure he entirely shares that view—he expressed himself in quite forthright terms.

On the proposal, we understand that the Joint Committee did valuable work. This is an unusual piece of legislation, in that it is completely groundbreaking. It is unlike any other, so the case for having a particular Committee look at it may have some merit. I am not in a position to give a definitive Government response to that because the matter is still under consideration, but if we were to establish a special Committee to look at a single piece of legislation, there are two ways to do it. It could either be done in statute, as the new clause seeks, or it could be done by Standing Orders.

Generally speaking, it is the practice of the House to establish Committees by Standing Orders of the House rather than by statute. In fact, I think the only current Committee of the House established by statute—Ms Rees, you will correct me if I am wrong, as you are more of an expert on these matters than me—is the Intelligence and Security Committee, which was established by the Intelligence Services Act 1994. That is obviously very unusual, because it has special powers. It looks into material that would ordinarily be classified as secret, and it has access to the intelligence services. It is a rather unusual Committee that has to be granted special powers because it looks into intelligence and security matters. Clearly, those considerations do not apply here. Were a particular Committee to be established, the right way of doing that would not be in statute, as the new clause proposes, but via the Standing Orders of the House, if that is something that Parliament wants to do.

Dean Russell (Watford) (Con)

As another member of the Joint Committee, I totally understand the reasoning. I want to put on record my support for setting up a Committee through the approach the Minister mentioned, using the Standing Orders of the House. I will not support the new clause, but I strongly support the Joint Committee continuing in some form to enable scrutiny. When we look ahead to the metaverse, virtual reality and all the things that are coming, it is important that that scrutiny continues. No offence to Opposition colleagues, but I do not think the new clause is the right way to do that. However, the subject is worth further exploration, and I would be very supportive of that happening.

Online Safety Bill

Dean Russell Excerpts
Paul Scully

A number of Members across the House have been pushing for Zach’s law, and I am really delighted that Zach’s family can see in Hansard that that campaigning has really made a direct change to the law.

Dean Russell (Watford) (Con)

I just want to echo the previous points. This has been a hard-fought decision, and I am so proud that the Government have done this, but may I echo the thanks to Zach for being a true hero? We talk about David and Goliath, the giant—the beast—who was taken down, but Zach has beaten the tech giants, and I think this is an incredible success.

Paul Scully

I absolutely echo my hon. Friend’s remarks, and I again thank him for his work.

We are also taking steps to strengthen Ofcom’s enforcement powers, which is why we are giving Ofcom a discretionary power to require non-compliant services to publish or notify their users of enforcement action that it has taken against the service. Ofcom will be able to use this power to direct a service to publish details or notify its UK users about enforcement notices it receives from Ofcom. I thank the Antisemitism Policy Trust for bringing this proposal to our attention and for its helpful engagement on the issue. This new power will promote transparency by increasing awareness among users about breaches of the duty in the Bill. It will help users make much more informed decisions about the services they use, and act as an additional deterrent factor for service providers.

--- Later in debate ---
Damian Collins

The hon. Member makes an important point. Baroness Kidron’s amendment has been referred to; I anticipate that future amendments in the House of Lords will also seek to address the issue, which our Joint Committee looked at carefully in our pre-legislative scrutiny.

It should be much easier than it has been for the Russell family and the coroner to gain access to such important information. However, depending on the nature of the case, there may well be times when it would be wrong for families to have access. I think there has to be an expedited and official process through which the information can be sought, rather than a general provision, because some cases are complicated. There should not be a general right in law, but it needs to be a lot easier than it is. Companies should make the information available much more readily than they have done. The Molly Russell inquest had to be delayed for four months because of the late release of thousands of pages of information from Meta to the coroner. That is clearly not acceptable either.

My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) has tabled an amendment relating to small and risky platforms. The categorisation of platforms on the basis of size was linked to duties under the “legal but harmful” provisions, which we expect now to change. The priority illegal harms apply to platforms of all sizes. Surely when illegal activity is taking place on any platform of any size—I hope that the Minister will clarify this later—Ofcom must have the right to intervene and start asking questions. I think that, in practice, that is how we should expect the system to work.

Like other Members who served on the Joint Committee—I am thinking particularly of my hon. Friends the Members for Watford (Dean Russell) and for Stourbridge (Suzanne Webb), both of whom spoke so passionately about this subject, as did the hon. Member for Ochil and South Perthshire (John Nicolson)—I was delighted to see that the Government had tabled amendments to cover Zach’s law. The fact that someone can deliberately seek out a person with epilepsy and target that person with flashing images with the intention of causing a seizure is a terrible example of the way in which systems can be abused. It is wrong for the platforms to be neutral and have no obligation to identify and stop that action, but the action is wrong in practice as well, and it demonstrates the need for us to ensure that the law keeps pace with the nature of new offences. I was very proud to meet Zach and his mother in October. I said to them then that their work had changed the law, and I am glad that the Government have tabled those amendments.

Dean Russell

May I pay tribute to my hon. Friend for his chairmanship of the Joint Committee last year? We covered a wide range of challenging ethical, moral and technical decisions, with work across both Houses, and I think that the decisions contained in our report informed many of the Government amendments, but it was my hon. Friend’s chairmanship that helped to guide us through that period.

Damian Collins

I am grateful to my hon. Friend for what he has said, and for his significant work on the Committee.

There is a great deal that we could say about this Bill, but let me end by touching on an important topic that I think my hon. Friend the Member for Dover (Mrs Elphicke) will speak about later: the way in which social media platforms are used by people trafficking gangs to recruit those who can help them with bringing people into the country in small boats. It was right that the Government included immigration offences in the list of priority illegal harms in schedule 7. It was also right that, following a recommendation from the Joint Committee, they included fraud and scam ads in the scope of the Bill.

We have already accepted, in principle, that advertising can be within the Bill’s scope in certain circumstances, and that priority illegal harms can be written into the Bill and identified as such. As I understand it, my hon. Friend’s amendment seeks to bring advertising services—not just organic posts on social media platforms—into the Bill’s scope as well. I know that the Government want to consider illegal activity in advertising as part of the online advertising review, but I hope that this could be an expedited process running in parallel with the Bill as it completes its stages. Illegal activity in advertising would not be allowed in the offline world. Newspaper editors are legally liable for what appears in their papers, and broadcasters can lose their licence if they allow illegal content to feature in advertising. We do not yet have the same enforcement mechanism through the advertising industry with the big online platforms, such as Google and Facebook, where the bulk of display advertising now goes. Their advertising market is bigger than the television advertising market. We are seeing serious examples of illegal activity, and it cannot be right that content which cannot be posted on a Facebook page can nevertheless appear there as an advertisement simply because money has been put behind it.

--- Later in debate ---
John Penrose

It is a pleasure to follow Zach’s MP, the hon. Member for Batley and Spen (Kim Leadbeater). I particularly want to pick up on her final comments about the difficulties of platforms—not just small platforms, but larger ones—hosting extremist content, be it incels, the alt-right, the radical left or any other kind.

I will speak to my new clauses 34 and 35, which seek to deal with both disinformation and misinformation. They are important amendments, because although the Bill has taken huge steps forward—we are led to believe that it may take a couple more in due course when the revised version comes back if the recommittal is passed—there are still whole categories of harm that it does not yet address. In particular, it focuses, rightly and understandably, on individual harms to children and illegal activities as they relate to adults, but it does not yet deal with anything to do with collective harms to our society and our democracy, which matter too.

We have heard from former journalists in this debate. Journalists know it takes time and money to come up with a properly researched, authoritatively correct, accurate piece of journalism, but it takes a fraction of that time and cost to invent a lie. A lie will get halfway around the world before the truth has got its boots on, as the saying rightly goes. Incidentally, the hon. Member for Rotherham (Sarah Champion) said that it is wonderful that we are all learning so much. I share that sentiment; it is marvellous that we are all comparing and sharing our particular areas of expertise.

One person who seems to have all areas of expertise under his belt is my hon. Friend the Member for Folkestone and Hythe (Damian Collins), who chaired the Joint Committee. He rightly pointed out that this is a systems Bill, and it therefore deals with trying to prevent some things from happening—and yet it is completely silent on misinformation and disinformation, and their effect on us collectively, as a society and a democracy. New clauses 34 and 35 are an attempt to begin to address those collective harms alongside some individual harms we face. One of them deals with a duty of balance; the other deals with factual accuracy.

The duty of balance is an attempt to address the problem as it relates to filter bubbles, because this is a systems Bill and because each of us has a tailored filter bubble, by which each of the major platforms, and some of the minor ones, work out what we are interested in and feed us more of the same. That is fine for people who are interested in fishing tackle; that is super. But if someone is interested in incels and they get fed more and more incel stuff, or they are vaguely left wing and get taken down a rabbit hole into the increasingly radical left—or alternatively alt-right, religious extremism or whatever it may be—pretty soon they get into echo chambers, and from echo chambers they get into radicalisation, and from radicalisation they can pretty soon end up in some very murky, dark and deep waters.

There are existing rules for other old-world broadcasters; the BBC, ITV and all the other existing broadcasters have a duty of balance and undue prominence imposed on them by Ofcom. My argument is that we should consider ways to impose a similar duty of balance on the people who put together the programs that create our own individual filter bubbles, so that when someone is shown an awful lot of stuff about incels, or alt-right or radical left politics, somewhere in that filter bubble they will be sent something saying, “You do know that this is only part of the argument, don’t you? Do you know that there is another side to this? Here’s the alternative; here’s the balancing point.” We are not doing that at the moment, which is one of the reasons we have an increasingly divided societal and political debate, and that our public square as a society is becoming increasingly more fractious—and dangerous, in some cases. New clause 35 would fix that particular problem.

New clause 34 would deal with the other point—the fact that a lie will get halfway around the world before the truth has got its boots on. It tries to deal with factual accuracy. Factual accuracy is not quite the same thing as truth. Truth is an altogether larger and more philosophical concept to get one’s head around. It is how we string together accurate and correct facts to create a narrative or an explanation. Factual accuracy is an essential building block for truth. We must at least try to ensure that we can all see when someone has made something up or invented something, whether it is that bleach is a good way to cure covid or whatever. When somebody makes something up, we need to know and it needs to be clear. In many cases that is clear, but in many cases, if it is a plausible lie, a deepfake or whatever it may be, it is not clear. We need to be able to see that easily, quickly and immediately, and say, “I can discount this, because I know that the person producing it is a serial liar and tells huge great big porkies, and I shouldn’t be trusting what they are sending me, or I can see that the actual item itself is clearly made up.”

The duty of achieving balance already exists in rules and law in other parts of our society and is tried and tested—it has stood us very well and done a good job for us for 40 or 50 years, since TV and radio became ubiquitous—and the same is true, although not for quite such a long time, for factual accuracy. There are increasingly good methods of checking the factual accuracy of individual bits of content, and if necessary, in some cases of doing so in real time, too. For example, Adobe is leading a very large global grouping producing something called the Content Authenticity Initiative, which can tell if something is a deepfake, because it has an audit trail of where the image, the item or whatever it may be came from and how it has been updated, modified or changed during the course of its life.

Dean Russell

On that point, I want to raise the work that my hon. Friend the Member for Bosworth (Dr Evans), who is not in the Chamber at the moment, has done on body image. When images are photoshopped and changed to present an idea of beauty that is very different from what is possible in the real world, that very much bears on the idea of truth. What are my hon. Friend’s thoughts on that point?

John Penrose

Addressing that is absolutely essential. That goes for any of the deepfake examples we have heard about, including from my right hon. Friend the Member for Basingstoke (Dame Maria Miller), because the whole point about a deepfake is that it is hard to tell that something has been changed. If we can tell easily, we can say, “I know that is not right, I know that is not true, I know that is false, and I can aim away from it and treat it accordingly”.

Just to make sure that everybody understands, this is not some piece of new tech magic; it is already established. Adobe, as I have said, is doing it with the Content Authenticity Initiative, which is widely backed by other very serious tech firms. Others in the journalism world are doing the same thing, with the Journalism Trust Initiative. There is NewsGuard, which produces trust ratings; the Trust Project, which produces trust indicators; and we of course have our own press regulators in this country, the Independent Press Standards Organisation and IMPRESS.

I urge the Government and all here present not to be satisfied with where this Bill stands now. We have all heard how it can be improved. We have all heard that this is a new, groundbreaking and difficult area in which many other countries have not even got as far as we have, but we should not be in any way satisfied with where we are now. My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) said earlier that we need to approach this Bill in a spirit of being humble, and this is an area in which humility is absolutely essential. I hope all of us realise how much further we have to go, and I hope the Minister will say how he proposes to address these important and so far uncovered issues in due course.

--- Later in debate ---
Jim Shannon

The Minister might be of the same mind himself.

Through speaking in these debates, my office has seen an increase in correspondence from parents who are thankful that these difficult issues are being talked about. The world is changing and progressing, and if we are going to live in a world where we want to protect our children and our grandchildren—I have six grandchildren—and all other grandchildren who are involved in social media, the least we can do is make sure they are safe.

I commend the hon. Member for Batley and Spen (Kim Leadbeater) and others, including the hon. Member for Watford (Dean Russell), who have spoken about Zach’s law. We are all greatly impressed that we have that in the Bill through constructive lobbying. New clause 28, which the hon. Member for Rotherham (Sarah Champion) referred to, relates to advocacy for young people. That is an interesting idea, but I feel that advocacy should be for the parents first and not necessarily young people.

Ahead of the debate, I was in contact with the Royal College of Psychiatrists. It published a report entitled “Technology use and the mental health of children and young people”—new clause 16 is related to that—which was an overview of research into the use of screen time and social media by children and young teenagers. It has been concluded that excessive use of phones and social media by a young person is detrimental to their development and mental health—as we all know and as Members have spoken about—and furthermore that online abuse and bullying has become more prevalent because of that. The right hon. Member for Witham (Priti Patel) referred to those who are susceptible to online harm. We meet them every day, and parents tell me that our concerns are real.

A recent report by NHS Digital found that one in eight 11 to 16-year-olds reported that they had been bullied online. When parents contact me, they say that bullying online is a key issue for them, and the statistics come from those who choose to be honest and talk about it. Although the Government’s role is to create a Bill that enables protection for our children, there is also an incredible role for schools, which can address bullying. My hon. Friend the Member for Upper Bann (Carla Lockhart) and I talked about some of the young people we know at school who have been bullied online. Schools have stepped in and stopped that, encouraging and protecting children, and they can play that role as well.

We have all read of the story of Molly Russell, who was only 14 years old when she took her life. Nobody in this House or outside it could fail to be moved by her story. Her father stated that he strongly believed that the images, videos and information that she was able to access through Instagram played a crucial part in her life being cut short. The Bill must complete its passage and focus on strengthening protections online for children. Ultimately, the responsibility is on large social media companies to ensure that harmful information is removed, but the Bill puts the onus on us to hold social media firms to account and to ensure that they do so.

Harmful and dangerous content for children comes in many forms—namely, online abuse and exposure to self-harm and suicidal images. In addition, any inappropriate or sexual content has the potential to put children and young people at severe risk. The Bill is set to put provisions in place to protect victims in the sharing of nude or intimate photos. That is increasingly important for young people, who are potentially being groomed online and do not understand the full extent of what they are doing and the risks that come with that. Amendments have been tabled to ensure that, should such cases of photo sharing go to court, provisions are in place to ensure complete anonymity for the victims—for example, through video links in court, and so on.

I commend the right hon. Member for Basingstoke (Dame Maria Miller), who is not in her place, for her hard work in bringing forward new clause 48. Northern Ireland, along with England and Wales, will benefit from new clause 53, and I welcome the ability to hand down sentences of between six months and potentially five years.

Almost a quarter of girls who have taken a naked image have had their image sent to someone else online without their permission. Girls face very distinct and increased risks on social media, with more than four in five online grooming crimes targeting girls, and 97% of child abuse material featuring the sexual abuse of girls—wow, we really need to do something to protect our children and to give parents hope. There needs to be increased emphasis and focus on making children’s use of the internet safer by design. Once established, all platforms and services need to have the capacity and capability to respond to emerging patterns of sexual abuse, which often stem from photo sharing.

The Minister referred to terrorism and how terrorism can be promoted online. I intervened on him to mention the glorification of IRA terrorism and how that encourages further acts of terrorism among people who are susceptible to becoming involved. I am quite encouraged by the Minister’s response, and I think that we need to take a significant step. Some in Northern Ireland, for instance, try to rewrite history and use the glorification of terrorism for that purpose. We would like to see measures strengthened to ensure that those involved in those acts across Northern Ireland are controlled.

In conclusion, there are many aspects of the Bill that I can speak in support of in relation to the benefits of securing digital protections for those on social media. This is, of course, about protecting not just children, but all of us from the dangers of social media. I have chosen to speak on these issues as they are often raised by constituents. There are serious matters regarding the glorification and encouragement of self-harm that the Bill needs to address. We have heard stories tonight that are difficult to listen to, because they are true stories from people we know, and we have heard horror stories about intimate photo sharing online. I hope that action on those issues, along with the many others that the Government are addressing, will be embedded in the Bill with the intent to finally ensure that we have regulations and protection for all people, especially our children—I think of my children and grandchildren, and like everybody else, my constituents.

Dean Russell

I welcome the Minister to his place; I know that he will be excellent in this role, and it is incredible that he is so across the detail in such a short time.

I will primarily talk about new clause 53—that may not be that surprising, given how often it has been spoken about today—which is, ultimately, about Zach’s law. Zach is a truly heroic figure, as has been said. He is a young child with cerebral palsy, autism and epilepsy who was cruelly trolled by sick individuals who sent flashing images purposely to trigger seizures and cause him harm. That was not unique to Zach, sadly; it happened to many people across the internet and social media. When somebody announced that they were looking for support, having been diagnosed with epilepsy, others would purposely identify that and target the person with flashing images to trigger seizures. That is absolutely despicable.

My hon. Friend the Member for Stourbridge (Suzanne Webb) has been my partner in crime—or in stopping the crime—over the past two years, and this has been a passion for us. Somebody said to me recently that we should perhaps do our victory lap in the Chamber today for the work that has been done to change the law, but Zach is the person who will get to go around and do that, as he did when he raised funds after he was first cruelly trolled.

My hon. Friend the Member for Folkestone and Hythe (Damian Collins) also deserves an awful lot of praise. My hon. Friend the Member for Stourbridge and I worked with him on the Joint Committee on the draft Online Safety Bill this time last year. It was incredible to work with Members of both Houses to look at how we could make the Bill better. I am pleased about the response to so many of the measures that we put forward, including our view that the phrase “legal but harmful” created too many grey areas and would not catch the people doing these awful things—what I often consider to be crimes—online to cause harm.

I want to highlight some of what has been done over the past two years to get Zach’s law to this point. If I ever write a memoir, I am sure that my diaries will not be as controversial as some in the bookshops today, but I would like to dedicate a chapter to Zach’s law, because it has shown the power of one individual, Zach, to change things through the democratic process in this House, to change the law for the entire country and to protect people who are vulnerable.

Not only was Zach’s case raised in the Joint Committee’s discussions, but afterwards my hon. Friend the Member for Stourbridge and I managed to get all the tech companies together on Zoom—most people will probably not be aware of this—to look at making technical changes to stop flashing images being sent to people. There were lots of warm words: lots of effort was supposedly put in so that we would not need a law to stop flashing images. We had Giphy, Facebook, Google, Twitter—all these billion-pound platforms that can do anything they want, yet they could not stop flashing images being sent to vulnerable people. I am sorry, but that is not the work of people who really want to make a difference. That is people who want to put profit over pain—people who want to ensure that they look after themselves before they look after the most vulnerable.

--- Later in debate ---
Suzanne Webb

Talking of Christmas, would not the best Christmas present for lovely Zach be to enshrine new clause 53, that amazing amendment, as Zach’s law? Somehow we should formalise it as Zach’s law—that would be a brilliant Christmas present.

Dean Russell

I wholeheartedly agree. Zach, if you are listening right now, you are an absolute hero—you have changed so much for so many people. Without your effort, this would not be happening today. In future, we can look back on this and say, “You know what? Democracy does work.”

I thank all hon. Members for their campaigning work to raise Zach’s law in the public consciousness. It even reached the US. I am sure many hon. Members dance along to Beyoncé of an evening or listen to her in the car when they are bopping home; a few months ago she changed one of her YouTube videos, which had flashing images in it, because the Epilepsy Society reached out to describe the dangers that it would cause. These campaigns work. They are about public awareness and about changing the law. We talk about the 15 minutes of shame that people face on social media, but ultimately the shame is on the platforms for forcing us to legislate to make them do the right thing.

I will end with one small point. The internet has evolved; the world wide web has evolved; social media is evolving; the metaverse, 3D virtual reality worlds and augmented reality are changing. I urge the Government or the House to look at creating a Committee specifically on the Bill. I know that there are lots of arguments that it should be a Sub-Committee of the Digital, Culture, Media and Sport Committee, but the truth is that the online world is changing dramatically. We cannot take snapshots every six months, every year or every two years and assume that they will pick up on all the changes happening in the world.

As the hon. Member for Pontypridd (Alex Davies-Jones) said, TikTok did not even exist when the Bill was first discussed. We now have an opportunity to ask what is coming next, keep pace with it and put ethics and morality at the heart of the Bill to ensure that it is fit for purpose for many decades to come. I thank the Minister for his fantastic work; my partner in crime, my hon. Friend the Member for Stourbridge, for her incredible work; and all Members across the House. Please, please, let us get this through tonight.

Laura Farris (Newbury) (Con)

It is a privilege to follow my hon. Friend the Member for Watford (Dean Russell) and so many hon. Members who have made thoughtful contributions. I will confine my comments to the intersection of new clauses 28 and 45 to 50 with the impact of online pornography on children in this country.

There has been no other time in the history of humanity when we have exposed children to the violent, abusive, sexually explicit material that they currently encounter online. In 2008, only 14% of children under 13 had seen pornography; three years later, that figure had risen to 49%, correlating with the rise in children owning smartphones. Online pornography has a uniquely pernicious impact on children. For very young children, there is an impact just from seeing the content. For older teenagers, there is an impact on their behaviour.

We are seeing more and more evidence of boys exhibiting sexually aggressive behaviour, with actions such as strangulation, which we have dealt with separately in this House, and misogynistic attitudes. Young girls are being conditioned into thinking that their value depends on being submissive or objectified. That is leading children down a pathway that leads to serious sexual offending by children against children. Overwhelmingly, the victims are young girls.

Hon. Members need not take my word for it: after Everyone’s Invited began documenting the nature and extent of the sexual experiences happening in our schools, an Ofsted review revealed that the most prevalent victims of serious sexual assaults among the under-25s are girls aged 15 to 17. In a recent publication in anticipation of the Bill, the Children’s Commissioner cited the example of a teenage boy arrested for his part in the gang rape of a 14-year-old girl. In his witness statement to the police, the boy said that it felt just like a porn film.

Dr John Foubert, the former White House adviser on rape prevention, has said:

“It wasn’t until 10 years ago when I came to the realization that the secret ingredient in the recipe for rape was not secret at all…That ingredient…is today’s high speed Internet pornography.”

The same view has been expressed, in one form or another, by the chief medical officers for England and for Wales, the Independent Inquiry into Child Sexual Abuse, the Government Equalities Office, the Children’s Commissioner, Ofsted and successive Ministers.

New clause 28 requests an advocacy body to represent and protect the interests of child users. I welcome the principle behind the new clause. I anticipate that the Minister will say that he is already halfway there by making the Children’s Commissioner a statutory consultee to Ofcom, along with the Domestic Abuse Commissioner and others who have been named in this debate. However, whatever the Government make of the Opposition’s new clause, they must surely agree that it alights on one important point: the online terrain in respect of child protection is evolving very fast.

By the time the Bill reaches the statute book, new providers will have popped up again. With them will come unforeseen problems. When the Bill was first introduced, TikTok did not exist, as my hon. Friend the Member for Watford said a moment ago, and neither did OnlyFans. That is precisely the kind of user-generated site that is likely to try to dodge its obligations to keep children safe from harm, partly because it probably does not even accept that it exposes them to harm: it relies on the fallacy that the user is in control, and operates an exploitative business model predicated on that false premise.

I think it important for someone to represent the issue of child protection on a regular basis because of the issue of age verification, which we have canvassed, quite lightly, during the debate. Members on both sides of the House have pointed out that the current system, which allows children to self-certify their date of birth, is hopelessly out of date. I know that Ministers envisage something much more ambitious with the Bill’s age assurance and age verification requirements, including facial recognition technology, but I think it is worth our having a constant voice reporting on the adequacy of whatever age assurance steps internet providers may take, because we know how skilful children can be in navigating the internet. We know that there are those who have the technological skills to IP shroud or to use a VPN.

I also think it important for there to be a voice to maintain the pressure on the Government—which is what I myself want to do tonight—for an official Government inquiry into pornography harms, akin to the one on gambling harms that was undertaken in 2019. That inquiry was extremely important in identifying all the harm that was caused by gambling. The conclusions of an equivalent inquiry into pornography would leave no wriggle room for user-generated services to deny the risk of harm.

My right hon. Friend the Member for Basingstoke (Dame Maria Miller) pointed out, very sensibly, that her new clauses 45 to 50 build on all the Law Commission’s recommendations. They align with so much work that has already been done in the House. We have produced, for instance, the Domestic Abuse Act 2021, which dealt with revenge porn, whether threatened or actual and whether genuine or fake, and with coercive control. Many Members recognise what was achieved by all our work a couple of years ago. However, given the indication from Ministers that they are minded to accept the new clauses in one form or another, I should like them to explain to the House how they think the Bill will capture the issue of sexting, if, indeed, it will capture that issue at all.

As the Minister will know, sexting means the exchanging of intimate images by, typically, children, sometimes on a nominally consensual basis. Everything I have read about it seems to say, “Yes, prima facie this is an unlawful act, but no, we do not seek to criminalise children, because we recognise that they make errors of judgment.” However, while I agree that it may be proportionate not to criminalise children for doing this, it remains the case that when an image is sent with the nominal consent of the child—it is nearly always a girl—it is often a product of duress, the image is often circulated far beyond the original recipient, and that often has devastating personal consequences for the young girl involved. All the main internet providers now have technology that can identify a nude image. It would be possible to require them to prevent nude images from being shared when, because of extended age-verification abilities, they know that the user is a child. If the Government are indeed minded to accept new clauses 45 to 50, I should like them to address that specific issue of sexting rather than letting it fall by the wayside as something separate, or outside the ambit of the Bill.

--- Later in debate ---
We have heard a great many tragic stories today about children who have been harmed through other people’s direct access to their lives over mobile phones, but, as my hon. Friend said, one of the overriding results of the internet is the sexualisation of children in a truly destructive way. As my hon. Friend also said, about 50% of 12-year-olds have now seen online pornography, and 1.4 million UK children access porn every month. There is nothing mainstream about this pornography. It is not the same as the dodgy magazines of old. Violence, degrading behaviour, abuse and addiction are all mainstream on pornography sites now.
Dean Russell

Does my hon. Friend agree that the work of charities such as Dignify in Watford, where Helen Roberts does incredible work in raising awareness of this issue, is essential to ensuring that people are aware of the harm that can be done?

Miriam Cates

I completely agree. Other charities, such as CEASE—the Centre to End All Sexual Exploitation—and Barnardo’s have been mentioned in the debate, and I think it so important to raise awareness. There are many harms on the internet, but pornography is an epidemic. It makes up a third of the material on the internet, and its impact on children cannot be overstated. Many boys who watch porn say that it gives them ideas about the kind of sex that they want to try. It is not surprising that a third of child sexual abuse is committed by other children. During puberty—that very important period of development—boys in particular are subject to an erotic imprint. The kind of sex that they see and the sexual ideas that they have during that time determine what they see as normal behaviour for the rest of their lives. It is crucial for children to be protected from harmful pornography that encourages the objectification and abuse of—almost always—women.

Online Safety Bill

We are in an era where our discussion forums have become polarised. We are crossing new frontiers, but we cannot accept the status quo. Our democracy depends on this.
Dean Russell (Watford) (Con)

I rise to talk broadly about new clause 2, which I am pleased that the Government are engaging on. My right hon. and hon. Friends have done incredible work to make that happen. I share their elation. As—I think—the only Member who was on the Joint Committee under the fantastic Chair, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), and on both Committees, I have seen the Bill’s passage over the past year or so and been happy with how the Government have engaged with it. That includes on Zach’s law, which will ensure that trolls cannot send flashing images to people with epilepsy. I shared my colleagues’ elation with my hon. Friend the Member for Stourbridge (Suzanne Webb) when we were successful in convincing the Government to make that happen.

May I reiterate the lessons from the Joint Committee and from the Committee earlier last year? When we took evidence from the tech giants—they are giants—it was clear that, as giants do, they could not see the damage underfoot and the harm that they were doing because they are so big. They were also blind to the damage they were doing because they chose not to see it. I remember challenging a witness from one of the big tech giants about whether they had followed the Committee’s work on the harms that they were causing to vulnerable children and adults. I was fascinated by how the witnesses just did not care. Their responses were, “Well, we are doing enough already. We are already trying. We are putting billions of pounds into supporting people who are being harmed.” They did not see the reality on the ground of young people being damaged.

When I interviewed my namesake, Ian Russell, I was heartbroken because we had children of a similar age. I just could not imagine having the conversations he must have had with his family and friends throughout that terrible tragedy.

Sir William Cash

Is my hon. Friend aware that Ian Russell has pointed out that 26% of young people who present at hospital with self-harm and suicide attempts have accessed such predatory, irresponsible and wilful online content?

Dean Russell

My hon. Friend is absolutely right. One of the real horrors is that, as I understand it, Facebook was not going to release—I do not want to break any rules here—the content that his daughter had been viewing, to help with the process of healing.

If I may, I want to touch on another point that has not been raised today, which is the role of a future Committee. I appreciate that it is not part of the Bill, but I feel strongly that this House should have a separate new Committee for the Online Safety Bill. The internet and the world of social media are changing dramatically. The metaverse is approaching very rapidly, and we are seeing the rise of virtual reality and augmented reality. Artificial intelligence is even changing the way we believe what we see online, at a rate that we cannot imagine. I have a few predictions. I anticipate that in the next few years we will probably have the first No. 1 book and song written by AI. We can now hear fake voices and AI impersonations of people online. We will have songs and so on created in ways that fool us and fool children even more. I have no doubt that in the coming months and years we will see the rise of children suing their parents for sharing content of them when they were younger without permission. We will see a changing dynamic in the way that young people engage with new content and what they anticipate from it.

Sir John Hayes

My hon. Friend is making a valuable contribution to the debate, as I expected he would, having discussed it with him from the very beginning. What he describes is not only the combination of heartlessness and carelessness on the part of the tech companies, but the curious marriage of an anarchic future coupled with the tyranny of their control of that future. He is absolutely right that if we are to do anything about that in this place, we need an ongoing role for a Committee of the kind he recommends.

Dean Russell

I thank my right hon. Friend for those comments. I will wrap up shortly, Mr Deputy Speaker. On that point, I have said before that the use of algorithms on platforms is in my mind very similar to addictive drugs: they get people addicted and get them to change their behaviours. They get them to cut off from their friends and family, and then they direct them in ways that we would not allow if we could wrap our arms around them and stop it. But they are doing that in their own bedrooms, classrooms and playgrounds.

I applaud the work on the Bill. Yes, there are ways it could be improved, and a committee that looks at ways to improve it as the dynamics of social media change will be essential. However, letting the Bill go to the other place will be a major step forward in protecting our young people both now and in the future.

Mr Deputy Speaker (Sir Roger Gale)

Thank you for your patience, Siobhan Baillie.