All 3 Eleanor Laing contributions to the Online Safety Act 2023

Tue 19th Apr 2022: Online Safety Bill, Commons Chamber (2nd reading)
Tue 12th Jul 2022: Online Safety Bill, Commons Chamber (Report stage, day 1)
Mon 5th Dec 2022: Online Safety Bill
Online Safety Bill

Eleanor Laing Excerpts

2nd reading
Tuesday 19th April 2022
Commons Chamber
Several hon. Members rose—

Madam Deputy Speaker (Dame Eleanor Laing)

Order. Before I call the shadow Secretary of State, it will be obvious to the House that we have approximately one hour for Back-Bench contributions and that a great many people want to speak. I warn colleagues that not everybody will have the opportunity and that there will certainly be a time limit, which will probably begin at five minutes.

--- Later in debate ---
Madam Deputy Speaker (Dame Eleanor Laing)

Order. The hon. Lady is not giving way. Let us get on with the debate.

Lucy Powell

The business managers have failed everybody on both sides given the time available.

A systems-based approach also has the benefit of tackling the things that platforms can control, such as how content spreads, rather than what they cannot control, such as what people post. We would avoid the cul-de-sac of arguing over the definitions of what content is or is not harmful, and instead go straight to the impact. I urge the Government to adopt the recommendations that have been made consistently to focus the Bill on systems and models, not simply on content.

Turning to other aspects of the Bill, key issues with its effectiveness remain. The first relates to protecting children. As any parent will know, children face significant risks online, from poor body image, bullying and sexist trolling to the most extreme grooming and child abuse, which is, tragically, on the rise. This Bill is an important opportunity to make the internet a safe place for children. It sets out duties on platforms to prevent children from encountering illegal, harmful or pornographic content. That is all very welcome.

However, despite some of the Government’s ambitious claims, the Bill still falls short of fully protecting children. As the National Society for the Prevention of Cruelty to Children argues, the Government have failed to grasp the dynamics of online child abuse and grooming—[Interruption.] Again, I am being heckled from the Front Bench, but if Ministers engage with the children’s charities they will find a different response. For example—[Interruption.] Yes, but they are not coming out in support of the Bill, are they? For example, it is well evidenced that abusers will often first interact with children on open sites and then move to more encrypted platforms. The Government should require platforms to collaborate to reduce harm to children, prevent abuse from being displaced and close loopholes that let abusers advertise to each other in plain sight.

The second issue is illegal activity. We can all agree that what is illegal offline should be illegal online, and all platforms will be required to remove illegal content such as terrorism, child sex abuse and a range of other serious offences. It is welcome that the Government have set out an expanded list, but they can and must go further. Fraud was the single biggest crime in the UK last year, yet the Business Secretary dismissed it as not affecting people’s everyday lives.

The approach to fraud in this Bill has been a bit like the hokey-cokey: the White Paper said it was out, then it was in, then it was out again in the draft Bill and finally it is in again, but not for the smaller sites or the search services. The Government should be using every opportunity to make it harder for scammers to exploit people online, backed up by tough laws and enforcement. What is more, the scope of this Bill still leaves out too many of the Law Commission’s recommendations of online crimes.

The third issue is disinformation. The war in Ukraine has unleashed Putin’s propaganda machine once again. That comes after the co-ordinated campaign by Russia to discredit the truth about the Sergei Skripal poisonings. Many other groups have watched and learned: from covid anti-vaxxers to climate change deniers, the internet is rife with dangerous disinformation. The Government have set up a number of units to tackle disinformation and claim to be working with social media companies to take it down. However, that is opaque and far from optimal. The only mention of disinformation in the Bill is that a committee should publish a report. That is far from enough.

Returning to my earlier point, it is the business models and systems of social media companies that create a powerful tool for disinformation and false propaganda to flourish. Being a covid vaccine sceptic is one thing, but being able to quickly share false evidence dressed up as science to millions of people within hours is a completely different thing. It is the power of the platform that facilitates that, and it is the business models that encourage it. This Bill hardly begins to tackle those societal and democratic harms.

The fourth issue is online abuse. From racism to incels, social media has become a hotbed for hate. I agree with the Secretary of State that that has poisoned public life. I welcome steps to tackle anonymous abuse. However, we still do not know what the Government will designate as legal but harmful, which makes it very difficult to assess whether the Bill goes far enough, or indeed too far. I worry that those definitions are left entirely to the Secretary of State to determine. A particularly prevalent and pernicious form of online hate is misogyny, but violence against women and girls is not mentioned at all in the Bill—a serious oversight.

The decision on which platforms will be regulated by the Bill is also arbitrary and flawed. Only the largest platforms will be required to tackle harmful content, yet smaller platforms, which can still have a significant, highly motivated, well-organised and particularly harmful user base, will not. Ofcom should regulate based on risk, not just on size.

The fifth issue is that the regulator and the public need the teeth to take on the big tech companies, with all the lawyers they can afford. It is a David and Goliath situation. The Bill gives Ofcom powers to investigate companies and fine them up to 10% of their turnover, and there are some measures to help individual users. However, if bosses in Silicon Valley are to sit up and take notice of this Bill, it must go further. It should include stronger criminal liability, protections for whistleblowers, a meaningful ombudsman for individuals, and a route to sue companies through the courts.

The final issue is future-proofing, which we have heard something about already. This Bill is a step forward in dealing with the likes of Twitter, Facebook and Instagram—although it must be said that many companies have already begun to get their house in order ahead of any legislation—but it will have taken nearly six years for the Bill to appear on the statute book.

Since the Bill was first announced, TikTok has emerged on the scene, and Facebook has renamed itself Meta. The metaverse is already posing dangers to children, with virtual reality chat rooms allowing them to mix freely with predatory adults. Social media platforms are also adapting their business models to avoid regulation; Twitter, for example, says that it will decentralise and outsource moderation. There is a real danger that when the Bill finally comes into effect, it will already be out of date. A duty of care approach, focused on outcomes rather than content, would create a much more dynamic system of regulation, able to adapt to new technologies and platforms.

In conclusion, social media companies are now so powerful and pervasive that regulating them is long overdue. Everyone agrees that the Bill should reduce harm to children and prevent illegal activity online, yet there are serious loopholes, as I have laid out. Most of all, the focus on individual content rather than business models, outcomes and algorithms will leave too many grey areas and black spots, and will not satisfy either side in the free speech debate.

Despite full prelegislative scrutiny, the Government have been disappointingly reluctant to accept those bigger recommendations. In fact, they are going further in the wrong direction. As the Bill progresses through the House, we will work closely with Ministers to improve and strengthen it, to ensure that it truly becomes a piece of world-leading legislation.

Several hon. Members rose—

Madam Deputy Speaker (Dame Eleanor Laing)

We will begin with a time limit of five minutes, but that is likely to reduce.

Julian Knight (Solihull) (Con)

Some colleagues have been in touch with me to ask my view on one overriding matter relating to this Bill: does it impinge on our civil liberties and our freedom of speech? I say to colleagues that it does neither, and I will explain how I have come to that conclusion.

In the mid-1990s, when social media and the internet were in their infancy, the forerunners of the likes of Google scored a major win in the United States. Effectively, they got the US Congress to agree to the greatest “get out of jail free” card in history: namely, to agree that social media platforms are not publishers and are not responsible for the content they carry. That has led to a huge flowering of debate, knowledge sharing and connections between people, the likes of which humanity has never seen before. We should never lose sight of that in our drive to fairly regulate this space. However, those platforms have also been used to cause great harm in our society, and because of their “get out of jail free” card, the platforms have not been accountable to society for the wrongs that are committed through them.

That is quite simplistic. I emphasise that as time has gone by, social media platforms have to some degree recognised that they have responsibilities, and that the content they carry is not without impact on society—the very society that they make their profits from, and that nurtured them into existence. Content moderation has sprung up, but it has been a slow process. It is only a few years ago that Google, a company whose turnover is higher than the entire economy of the Netherlands, was spending more on free staff lunches than on content moderation.

Content moderation is decided by algorithms, based on terms and conditions drawn up by the social media companies without any real public input. That is an inadequate state of affairs. Furthermore, where platforms have decided to act, there has been little accountability, and there can be unnecessary takedowns, as well as harmful content being carried. Is that democratic? Is it transparent? Is it right?

These masters of the online universe have a huge amount of power—more than any industrialist in our history—without facing any form of public scrutiny, legal framework or, in the case of unwarranted takedowns, appeal. I am pleased that the Government have listened in part to the recommendations published by the Digital, Culture, Media and Sport Committee, in particular on Parliament’s being given control through secondary legislation over legal but harmful content and its definition—an important safeguard for this legislation. However, the Committee and I still have queries about some of the Bill’s content. Specifically, we are concerned about the risks of cross-platform grooming and breadcrumbing—perpetrators using seemingly innocuous content to trap a child into a sequence of abuse. We also think that it is a mistake to focus on category 1 platforms, rather than extending the provisions to other platforms such as Telegram, which is a major carrier of disinformation. We need to recalibrate to a more risk-based approach, rather than just going by the numbers. These concerns are shared by charities such as the National Society for the Prevention of Cruelty to Children, as the hon. Member for Manchester Central (Lucy Powell) said.

On a systemic level, consideration should be given to allowing organisations such as the Internet Watch Foundation to identify where companies are failing to meet their duty of care, in order to prevent Ofcom from being influenced and captured by the heavy lobbying of the tech industry. There has been reference to the lawyers that the tech industry will deploy. If we look at any newspaper or LinkedIn, we see that right now, companies are recruiting, at speed, individuals who can potentially outgun regulation. It would therefore be sensible to bring in outside elements to provide scrutiny, and to review matters as we go forward.

On the culture of Ofcom, there needs to be greater flexibility. Simply reacting to a large number of complaints will not suffice. There needs to be direction and purpose, particularly with regard to the protection of children. We should allow for some forms of user advocacy at a systemic level, and potentially at an individual level, where there is extreme online harm.

On holding the tech companies to account, I welcome the sanctions regime and having named individuals at companies who are responsible. However, this Bill gives us an opportunity to bring about real culture change, as has happened in financial services over the past two decades. During Committee, the Government should actively consider the suggestion put forward by my Committee—namely, the introduction of compliance officers to drive safety by design in these companies.

Finally, I have concerns about the definition of “news publishers”. We do not want Ofcom to be effectively a regulator or a licensing body for the free press. However, I do not want in any way to do down this important and improved Bill. I will support it. It is essential. We must have this regulation in place.

John Nicolson (Ochil and South Perthshire) (SNP)

Thank you, Madam Deputy Speaker, but I was under the impression that I was to wind up for my party, rather than speaking at this juncture.

--- Later in debate ---
Madam Deputy Speaker

If the hon. Gentleman would prefer to save his slot until later—

John Nicolson

I would, Madam Deputy Speaker, if that is all right with you.

Madam Deputy Speaker

Then we shall come to that arrangement. I call Dame Margaret Hodge.

--- Later in debate ---
Several hon. Members rose—

Madam Deputy Speaker (Dame Eleanor Laing)

Order. After the next speaker, the time limit will be reduced to four minutes.

--- Later in debate ---
Several hon. Members rose—

Madam Deputy Speaker (Dame Eleanor Laing)

Order. I am reluctant to reduce the time limit, but I am receiving appeals for me to try to get more people in, so I will reduce it to three minutes. However, not everyone will have a chance to speak this evening.

--- Later in debate ---
Chris Philp

I have so many points to reply to that I have to make some progress.

The Bill also enshrines, for the first time, free speech—something that we all feel very strongly about—but it goes beyond that. As well as enshrining free speech in clause 19, it gives special protection, in clauses 15 and 16, for content of journalistic and democratic importance. As my right hon. Friend the Secretary of State indicated in opening the debate, we intend to table a Government amendment—a point that my right hon. Friends the Members for Maldon and for Ashford (Damian Green) asked me to confirm—to make sure that journalistic content cannot be removed until a proper right of appeal has taken place. I am pleased to confirm that now.

We have made many changes to the Bill. Online fraudulent advertisers are now banned. Senior manager liability will commence immediately. Online porn of all kinds, including commercial porn, is now in scope. The Law Commission communication offences are in the Bill. The offence of cyber-flashing is in the Bill. The priority offences are on the face of the Bill, in schedule 7. Control over anonymity and user choice, which was proposed by my hon. Friend the Member for Stroud (Siobhan Baillie) in her ten-minute rule Bill, is in the Bill. All those changes have been made because this Government have listened.

Let me turn to some of the points made from the Opposition Front Bench. I am grateful for the in-principle support that the Opposition have given. I have enjoyed working with the shadow Minister and the shadow Secretary of State, and I look forward to continuing to do so during the many weeks in Committee ahead of us, but there were one or two points made in the opening speech that were not quite right. This Bill does deal with systems and processes, not simply with content. There are risk assessment duties. There are safety duties. There are duties to prevent harm. All those speak to systems and processes, not simply content. I am grateful to the Chairman of the Joint Committee, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), for confirming that in his excellent speech.

If anyone in this House wants confirmation of where we are on protecting children, the Children’s Commissioner wrote a joint article with the Secretary of State in the Telegraph—I think it was this morning—confirming her support for the measures in the Bill.

When it comes to disinformation, I would make three quick points. First, we have a counter-disinformation unit, which is battling Russian disinformation night and day. Secondly, any disinformation that is illegal, that poses harm to children or that comes under the definition of “legal but harmful” in the Bill will be covered. And if that is not enough, the Minister for Security and Borders, who is sitting here next to me, intends to bring forward legislation at the earliest opportunity to cover counter-hostile state threats more generally. This matter will be addressed in the Bill that he will prepare and bring forward.

I have only four minutes left and there are so many points to reply to. If I do not cover them all, I am very happy to speak to Members individually, because so many important points were made. The right hon. Member for Barking asked who was going to pay for all the Ofcom enforcement. The taxpayer will pay for the first two years while we get ready—£88 million over two years—but after that Ofcom will levy fees on these social media firms, so they will pay for regulating their activities. I have already replied to the point she rightly raised about smaller but very harmful platforms.

My hon. Friend the Member for Meriden (Saqib Bhatti) has been campaigning tirelessly on the question of combating racism. This Bill will deliver what he is asking for.

The hon. Member for Batley and Spen (Kim Leadbeater) and my hon. Friend the Member for Watford (Dean Russell) asked about Zach’s law. Let me take this opportunity to confirm explicitly that clause 150—the harmful communication clause, for where a communication is intended to cause psychological distress—will cover epilepsy trolling. What happened to Zach will be prevented by this Bill. In addition, the Ministry of Justice and the Law Commission are looking at whether we can also have a standalone provision, but let me assure them that clause 150 will protect Zach.

My right hon. Friend the Member for Maldon asked a number of questions about definitions. Companies can move between category 1 and category 2, and different parts of a large conglomerate can be regulated differently depending on their activities. Let me make one point very clear—the hon. Member for Bristol North West (Darren Jones) also raised this point. When it comes to the provisions on “legal but harmful”, neither the Government nor Parliament are saying that those things have to be taken down. We are not censoring in that sense. We are not compelling social media firms to remove content. All we are saying is that they must do a risk assessment, have transparent terms and conditions, and apply those terms and conditions consistently. We are not compelling, we are not censoring; we are just asking for transparency and accountability, which is sorely missing at the moment. No longer will those in Silicon Valley be able to behave in an arbitrary, censorious way, as they do at the moment—something that Members of this House have suffered from, but from which they will no longer suffer once this Bill passes.

The hon. Member for Bristol North West, who I see is not here, asked a number of questions, one of which was about—[Interruption.] He is here; I do apologise. He has moved—I see he has popped up at the back of the Chamber. He asked about codes of practice not being mandatory. That is because the safety duties are mandatory. The codes of practice simply illustrate ways in which those duties can be met. Social media firms can meet them in other ways, but if they fail to meet those duties, Ofcom will enforce. There is no loophole here.

When it comes to the ombudsman, we are creating an internal right of appeal for the first time, so that people can appeal to the social media firms themselves. There will have to be a proper right of appeal, and if there is not, they will be enforced against. We do not think it appropriate for Ofcom to consider every individual complaint, because it will simply be overwhelmed, by probably tens of thousands of complaints, but Ofcom will be able to enforce where there are systemic failures. We feel that is the right approach.

I say to the hon. Member for Plymouth, Sutton and Devonport (Luke Pollard) that my right hon. Friend the Minister for Security and Borders will meet him about the terrible Keyham shooting.

The hon. Member for Washington and Sunderland West (Mrs Hodgson) raised a question about online fraud in the context of search. That is addressed by clause 35, but we do intend to make drafting improvements to the Bill, and I am happy to work with her on those drafting improvements.

I have been speaking as quickly as I can, which is quite fast, but I think time has got away from me. This Bill is groundbreaking. It will protect our citizens, it will protect our children—[Hon. Members: “Sit down!”]—and I commend it to the House.

Question put and agreed to.

Bill accordingly read a Second time.

Madam Deputy Speaker (Dame Eleanor Laing)

The Minister just made it. I have rarely seen a Minister come so close to talking out his own Bill.

Online Safety Bill (Programme)

Motion made, and Question put forthwith (Standing Order No. 83A(7)),

That the following provisions shall apply to the Online Safety Bill:

Committal

(1) The Bill shall be committed to a Public Bill Committee.

Proceedings in Public Bill Committee

(2) Proceedings in the Public Bill Committee shall (so far as not previously concluded) be brought to a conclusion on Thursday 30 June 2022.

(3) The Public Bill Committee shall have leave to sit twice on the first day on which it meets.

Consideration and Third Reading

(4) Proceedings on Consideration shall (so far as not previously concluded) be brought to a conclusion one hour before the moment of interruption on the day on which those proceedings are commenced.

(5) Proceedings on Third Reading shall (so far as not previously concluded) be brought to a conclusion at the moment of interruption on that day.

(6) Standing Order No. 83B (Programming committees) shall not apply to proceedings on Consideration and Third Reading.

Other proceedings

(7) Any other proceedings on the Bill may be programmed.—(Michael Tomlinson.)

Question agreed to.

Online Safety Bill (Money)

Queen’s recommendation signified.

Motion made, and Question put forthwith (Standing Order No. 52(1)(a)),

That, for the purposes of any Act resulting from the Online Safety Bill, it is expedient to authorise the payment out of money provided by Parliament of:

(1) any expenditure incurred under or by virtue of the Act by the Secretary of State, and

(2) any increase attributable to the Act in the sums payable under any other Act out of money so provided.—(Michael Tomlinson.)

Question agreed to.

Online Safety Bill (Ways and Means)

Motion made, and Question put forthwith (Standing Order No. 52(1)(a)),

That, for the purposes of any Act resulting from the Online Safety Bill, it is expedient to authorise:

(1) the charging of fees under the Act, and

(2) the payment of sums into the Consolidated Fund.—(Michael Tomlinson.)

Question agreed to.

Deferred Divisions

Motion made, and Question put forthwith (Standing Order No. 41A(3)),

That at this day’s sitting, Standing Order 41A (Deferred divisions) shall not apply to the Motion in the name of Secretary Nadine Dorries relating to Online Safety Bill: Carry-over.—(Michael Tomlinson.)

Question agreed to.

Madam Deputy Speaker (Dame Eleanor Laing)

Order. Really, people just ought to have more courtesy than to get up and, when there is still business going on in this House, to behave as if it is not sitting because it is after 10 o’clock. We really have to observe courtesy at all times in here.

Online Safety Bill (Carry-Over)

Motion made, and Question put forthwith (Standing Order No. 80A(1)(a)),

That if, at the conclusion of this Session of Parliament, proceedings on the Online Safety Bill have not been completed, they shall be resumed in the next Session.—(Michael Tomlinson.)

Question agreed to.

Online Safety Bill

Eleanor Laing Excerpts
Several hon. Members rose—

Madam Deputy Speaker (Dame Eleanor Laing)

Order. The House will see that a great many people still wish to speak. May I explain that there are two groups of amendments? We will finish debating this group at 4.30 pm, after which there will be some votes, and debate on the next group of amendments will last until 7 o’clock. By my calculations, there might be more time for speeches during the debate on the next group, so if anyone wishes to speak on that group rather than the current group, I would be grateful if they came and indicated that to me. Meanwhile, if everyone takes about eight minutes and no longer, everyone will have the opportunity to speak. I call Sir Jeremy Wright.

Sir Jeremy Wright

I shall speak to the amendments in my name and the names of other right hon. and hon. Members, to whom I am grateful for their support. I am also grateful to the organisations that helped me to work through some of the problems I am about to identify, including the Carnegie Trust, Reset and the Antisemitism Policy Trust.

On the first amendments I shall talk about, amendments 42 and 43, I have been able to speak to Lego, so I can honestly say that these amendments were put together with Lego. Let me explain. The focus of the Bill, quite rightly, is on safety, and there is no safety more important than the safety of children. In that respect, the Bill is clear: platforms must give the safety of children the utmost priority and pay close attention to ways to enhance it. In other parts of the Bill, however, there are countervailing duties—for example, in relation to freedom of speech and privacy—where, predominantly in relation to adults, we expect platforms to conduct a balancing exercise. It seems right to me to think about that in the context of children, too.

As I said, the emphasis is rightly on children’s safety, but the safest approach would be to prohibit children from any online activity at all. We would not regard such an approach as sensible, because there are benefits to children in being able to engage—safely, of course—in online activity and to use online products and services. It seems to me that we ought to recognise that in the language of the Bill. Amendment 42 would do that when consideration is given to the safety duties designed to protect children set out in clause 11, which requires that “proportionate measures” must be taken to protect children’s safety and goes on to explain what factors might be taken into account when deciding what is proportionate, by adding

“the benefits to children’s well-being”

of the product or service in that list of factors. Amendment 43 would do the same when consideration is given to the online safety objectives set out in schedule 4. Both amendments are designed to ensure that the appropriate balance is struck when judgments are taken by platforms.

Others have spoken about journalistic content, and I am grateful for what the Minister said about that, but my amendment 10 is aimed at the defect that I perceive in clause 16. The Bill gives additional protections and considerations to journalists, which is entirely justifiable, given the important role that journalism plays in our society, but those extra protections mean that it will be harder for platforms to remove potentially harmful content that is also journalistic content. We should be sure, therefore, that the right people get the benefit of that protection.

It is worth having look at what clause 16 says and does. It sets out that a platform—a user-to-user service—in category 1 will have

“A duty to operate a service using proportionate systems and processes designed to ensure that the importance of the free expression of journalistic content is taken into account when making decisions about…how to treat such content (especially decisions about whether to take it down or restrict users’ access to it), and…whether to take action against a user generating, uploading or sharing such content.”

So it is important, because of the significance of those protections, that we get right the definitions of those who should benefit from them. Amendment 10 would amend clause 16(8), which states that:

“For the purposes of this section content is “journalistic content”, in relation to a user-to-user service, if…the content is”

either

“news publisher content in relation to that service”—

the definition of which I will return to—

“or…regulated user-generated content in relation to that service”.

That is the crucial point. The content also has to be

“generated for the purposes of journalism”

and be linked to the UK.

The first problem here is that journalism is not defined in the Bill. There are definitions of journalism, but none appears in the text of this Bill. “UK-linked” does not narrow it down much, and “regulated user-generated content” is a very broad category indeed. Clause 16 as drafted offers the protection given to journalistic content not just to news publishers, but to almost everybody else who chooses to define themselves as a journalist, whether or not that is appropriate. I do not think that that is what the Bill is intended to do, or an approach that this House should endorse. Amendment 10 would close the loophole by removing the second limb, regulated user-generated content that is not news publisher content. Let me be clear: I do not think that that is the perfect answer to the question I have raised, but it is better than the Bill as it stands, and if the Government can come up with a way of reintroducing protections of this kind for types of journalistic content beyond news publisher content that clearly deserve them, I will be delighted and very much open to it. Currently, however, the Bill is defective and needs to be remedied.

That brings us to the definition of news publisher content, because it is important that if we are to give protection to that category of material, we are clear about what we mean by it. Amendments 11 and 12 relate to the definition of news publisher content that arises from the definition of a recognised news publisher in clauses 49 and 50. That matters for the same reason as I just set out: we should give these protections only to those who genuinely deserve them. That requires rigorous definition. Clause 50 states that if an entity is not named in the Bill, as some are, it must fulfil a set of conditions set out in subsection (2), which includes having a standards code and policies and procedures for handling and resolving complaints. The difficulty here is that in neither case does the Bill refer to any quality threshold for those two things, so having any old standards code or any old policy for complaints will apparently qualify. That cannot be right.

I entirely accept that inserting a provision that the standards code and the complaints policies and procedures should be both “suitable and sufficient” opens up the question of whose job it becomes to decide what is suitable and sufficient. I am familiar with all the problems that may ensue, so again, I do not say that the amendment is the final word on the subject, but I do say that the Government need to look more carefully at what the value of those two items on the list really is if the current definition stands. If we are saying that we want these entities to have a standards code and a complaints process that provide some reassurance that they are worthy of the protections the Bill gives, it seems to me that meaningful criteria must apply, which currently they do not.

The powers of the Secretary of State have also been discussed by others, but I perhaps differ from their view in believing that there should be circumstances in which the Secretary of State should hold powers to act in genuine emergency situations. However, being able to direct Ofcom, as the Bill allows the Secretary of State to do, to modify a code of practice

“for reasons of public policy”

is far too broad. Amendment 13 would simply remove that capacity, with amendment 14 consequential upon it.

I accept that on 7 July the Secretary of State issued a written statement that helps to some extent on that point—it was referred to by my hon. Friend the Member for Croydon South (Chris Philp). First, it states that the Secretary of State would act only in “exceptional circumstances”, although it does not say who defines what exceptional circumstances are, leaving it likely that the Secretary of State would do so, which does not help us much. Secondly, it states the intention to replace the phrase

“for reasons of public policy”

with a list of circumstances in which the Secretary of State might act. I agree with my hon. Friend the Member for Solihull (Julian Knight) that that is still too broad. The proposed list comprises

“national security, public safety, public health, the UK’s international relations and obligations, economic policy and burden to business.”—[Official Report, 7 July 2022; Vol. 717, c. 69WS.]

The platforms we are talking about are businesses. Are we really saying that a burden on them would give the Secretary of State reason to say to Ofcom, the independent regulator, that it must change a code of practice? That clearly cannot be right. This is still too broad a provision. The progress that has been made is welcome, but I am afraid that there needs to be more to further constrain this discretion. That is because, as others have said, the independence of the regulator is crucial not just to this specific part of the Bill but to the credibility of the whole regulatory and legislative structure here, and therefore we should not undermine it unless we have to.

--- Later in debate ---
Kevan Jones Portrait Mr Jones
- Hansard - - - Excerpts

I am grateful to the Minister and I will be keeping a beady eye to see how far things go. The proposal would make a difference. It is a simple but effective way of protecting people, especially young people.

Eleanor Laing Portrait Madam Deputy Speaker (Dame Eleanor Laing)
- Hansard - -

Very good, that was wonderfully brief.

Damian Hinds Portrait Damian Hinds (East Hampshire) (Con)
- View Speech - Hansard - - - Excerpts

May I join others in welcoming my hon. Friend the Member for Folkestone and Hythe (Damian Collins) to his place on the Front Bench? He brings a considerable amount of expertise. I also, although it is a shame he is not here to hear me say nice things about him, pay tribute, as others have, to my hon. Friend the Member for Croydon South (Chris Philp). I had the opportunity to work with him, his wonderful team of officials and wonderful officials at the Home Office on some aspects of this Bill, and it was a great pleasure to do so. As we saw again today, his passion for this subject is matched only by his grasp of its fine detail.

I particularly echo what my hon. Friend said about algorithmic promotion, because if we address that, alongside what the Government have rightly done on ID verification options and user empowerment, we would address some of the core wiring and underpinnings at an even more elemental level of online harm.

I want to talk about two subjects briefly. One is fraud, and the other is disinformation. Opposition amendment 20 refers to disinformation, but that amendment is not necessary because of the amendments that the Government are bringing to the National Security Bill to address state-sponsored disinformation. I refer the House in particular to Government amendment 9 to that Bill. That in turn amends this Bill—it is the link, or so-called bridge, between the two. Disinformation is a core part of state threat activity and it is one of the most disturbing, because it can be done at huge volume and at very low cost, and it can be quite hard to detect. For a state that has learned how to change the way people think, disinformation becomes an incredibly valuable part of its weaponry.

We often talk about this in the context of elections. I think we are actually pretty good—when I say “we”, I mean our country, some other countries and even the platforms themselves—at addressing disinformation in the context of the elections themselves: the process of voting, eligibility to vote and so on. However, first, that is often not the purpose of disinformation at election time and, secondly, most disinformation occurs outside election times. Although our focus on interference with the democratic process is naturally heightened coming up to big democratic events, it is actually a 365-day-a-year activity.

There are multiple reasons and multiple modes for foreign states to engage in that activity. In fact, in many ways, the word “disinformation” is a bit unsatisfactory because a much wider set of things comes under the heading of information operations. That can range from simple untruths to trying to sow many different versions of an event, particularly a foreign policy or wartime event, to confuse the audience, who are left thinking, “Oh well, whatever story I’m being told by the BBC, my newspaper, or whatever it is, they are all much of a muchness.” Those states are competing for truth, even though in reality, of course, there is one truth. Sometimes the aim is to big up their own country, or to undermine faith in a democracy like ours, or the effectiveness of free societies.

Probably the biggest category of information operations is when there is not a particular line to push at all, but rather the disinformer is seeking to sow division or deepen division in our society, often by telling people things that they already believe, but more loudly and more aggressively to try to make them dislike some other group in society more. The purpose, ultimately, is to destabilise a free and open society such as ours and that has a cancerous effect. We talk sometimes of disinformation being spread by foreign states. Actually, it is not spread by foreign states; it is seeded by foreign states and then spread usually by people here. So they create these fake personas to plant ideas and then other people, seeing those messages and personas, unwittingly pick them up and pass them on themselves. It is incredibly important that we tackle that for the health of our democracy and our society.

The other point I want to mention briefly relates to fraud and the SNP amendments in the following group, but also Government new clause 14 in this group. I strongly support what the Government have done, during the shaping of the Bill, on fraud; there have been three key changes on fraud. The first was to bring user-generated content fraud into the scope of the Bill. That is very important for a particularly wicked form of fraud known as romance fraud. The second was to bring fraudulent advertising into scope, which is particularly important for categories of fraud such as investment fraud and e-commerce. The third big change was to make fraud a priority offence in the Bill, meaning that it is the responsibility of the platforms not just to remove that content when they are made aware of it, but to make strenuous efforts to try to stop it appearing in front of their users in the first place. Those are three big changes that I greatly welcome.

There are three further things I think the Government will need to do on fraud. First, there is a lot of fraudulent content beyond categories 1 and 2A as defined in the Online Safety Bill, so we are going to have to find ways—proportionate ways—to make sure that that fraudulent content is suppressed when it appears elsewhere, but without putting great burdens on the operators of all manner of community websites, village newsletters and so on. That is where the DCMS online advertising programme has an incredibly important part to play.

The second thing is about the huge variety of channels and products. Telecommunications are obviously important, alongside online content, but even within online, as the so-called metaverse develops further, with the internet of things and the massive potential for defrauding people through deep fakes and so on, we need to be one step ahead of these technologies. I hope that in DCMS my hon. Friends will look to create a future threats unit that seeks to do that.

Thirdly, we need to make sure everybody’s incentives are aligned on fraud. At present, the banks reimburse people who are defrauded and I hope that rate of reimbursement will shortly be increasing. They are not the only ones involved in the chain that leads to people being defrauded and often they are not the primary part of that chain. It is only right and fair, as well as economically efficient, to make sure the other parts of the chain that are involved share in that responsibility. The Bill makes sure their incentives are aligned because they have to take proportionate steps to stop fraudulent content appearing in front of customers, but we need to look at how we can sharpen that up to make sure everybody’s incentives are absolutely as one.

This is an incredibly important Bill. It has been a long time coming and I congratulate everybody, starting with my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), my hon. Friend the Member for Croydon South (Chris Philp) and others who have been closely involved in creating it. I wish my hon. Friend the Minister the best of luck.

Online Safety Bill

Sarah Champion Portrait Sarah Champion (Rotherham) (Lab)
- View Speech - Hansard - - - Excerpts

I am learning so much sitting here. I am going to speak just on child protection, but all of us are vulnerable to online harms, so I am really grateful to hon. Members across the House who are bringing their specialisms to this debate with the sole aim of strengthening this piece of legislation to protect all of us. I really hope the Government listen to what is being said, because there seems to be a huge amount of consensus on this.

The reason I am focusing on child protection is that every police officer in this field that I talk to says that, in almost every case, abusers are now finding children first through online platforms. We cannot keep up with the speed or the scale of this, so I look to this Bill to try to do so much more. My frustration is that when the Bill first started, we were very much seen as a world leader in this field, but now the abuse has become so prolific, other countries have stepped in and we are sadly lagging behind, so I really hope the Minister does everything he can to get this into law as soon as possible.

Although there are aspects of the Bill that go a long way towards tackling child abuse online, it is far from perfect. I want to speak on a number of specific ways in which the Minister can hopefully improve it. The NSPCC has warned that over 100 online grooming and child abuse image crimes are likely to be recorded every day while we wait for this crucial legislation to pass. Of course, that is only the cases that are recorded. The number is going to be far greater than that. There are vital protections in the Bill, but there is a real threat that the use of virtual private networks—VPNs—could undermine the effectiveness of these measures. VPNs allow internet users to hide their private information, such as their location and data. They are commonly used, and often advertised, as a way for people to protect their data or watch online content. For example, on streaming services such as Netflix, people might be able to access something only in the US, so they could use a VPN to circumvent that restriction and watch it in this country.

During the Bill’s evidence sessions, Professor Clare McGlynn said that 75% of children aged 16 and 17 used, or knew how to use, a VPN, which means that they can avoid age verification controls. So if companies use age assurance tools, as listed in the safety duties of this Bill, there is no guarantee that they will provide the protections that are needed. I am also concerned that the use of VPNs could act as a barrier to removing indecent or illegal material from the internet. The Internet Watch Foundation uses a blocking list to remove this content from internet service providers, but users with a VPN usually bypass the protections it provides. It also concerns me that a VPN could be used to circumvent this legislation, which is very much based in the UK. Have the Government tested what will happen if someone uses a VPN to give the appearance of being overseas?

My new clause 54 would require the Secretary of State to publish, within six months of the Bill’s passage, a report on the effect of VPN use on Ofcom’s ability to enforce the requirements under clause 112. If VPNs cause significant issues, the Government must identify those issues and find solutions, rather than avoiding difficult problems.

New clause 28 would establish a user advocacy body to represent the interests of children in regulatory decisions. Children are not a homogeneous group, and an advocacy body could reflect their diverse opinions and experiences. This new clause is widely supported in the House, as we have heard, and the NSPCC has argued that it would be an important way to counterbalance the attempts of big tech companies to reduce their obligations, which place their interests over children’s needs.

I would like to see more third sector organisations consulted on the code of practice. The Internet Watch Foundation, which many Members have discussed, already has the necessary expertise to drastically reduce the amount of child sexual abuse material on the internet. The Government must work with the IWF and build on its knowledge of web page blocking and image hashing.

Girls in particular face increased risk on social media, with the NSPCC reporting that nearly a quarter of girls who have taken a nude photo have had their image sent to someone else online without their permission. New clauses 45 to 50 would provide important protections to women and girls from intimate image abuse, by making the non-consensual sharing of such photos illegal. I am pleased that the Government have announced that they will look into introducing these measures in the other place, but we are yet to see any measures to compare with these new clauses.

In the face of the huge increase in online abuse, victims’ services must have the necessary means to provide specialist support. Refuge’s tech abuse team, for example, is highly effective at improving outcomes for thousands of survivors, but the demand for its services is rapidly increasing. It is only right that new clause 23 is instated so that a good proportion of the revenue made from the Bill’s provisions goes towards funding these vital services.

The landmark report by the independent inquiry into child sexual abuse recently highlighted that, between 2017-18 and 2020-21, there was an approximately 53% rise in recorded grooming offences. With this crime increasingly taking place online, the report emphasised that internet companies will need more moderators to aid technology in identifying this complex type of abuse. I urge the Minister to also require internet companies to provide sufficient and meaningful support to those moderators, who have to view and deal with disturbing images and videos on a daily basis. They, as well as the victims of these horrendous crimes, deserve our support.

I have consistently advocated for increased prevention of abuse, particularly through education in schools, but we must also ensure that adults, particularly parents, are educated about the threats online. Internet Matters found that parents underestimate the extent to which their children are having negative experiences online, and that the majority of parents believe their 14 to 16-year-olds know more about technology than they do.

The example that most sticks in my mind was provided by the then police chief in charge of child protection, who said, “What is happening on a Sunday night is that the family are sitting in the living room, all watching telly together. The teenager is online, and is being abused online.” In his words, “You wouldn’t let a young child go and open the door without knowing who is there, but that is what we do every day by giving them their iPad.”

If parents, guardians, teachers and other professionals are not aware of the risks and safeguards, how are they able to protect children online? I strongly encourage the Government to accept new clauses 29 and 30, which would place an additional duty on Ofcom to promote media literacy. Minister, you have the potential—

Sarah Champion Portrait Sarah Champion
- Hansard - - - Excerpts

Thank you, Madam Deputy Speaker. The Minister has the potential to do so much with this Bill. I urge him to do it, and to do it speedily, because that is what this country really needs.

--- Later in debate ---
None Portrait Several hon. Members rose—
- Hansard -

Eleanor Laing Portrait Madam Deputy Speaker (Dame Eleanor Laing)
- Hansard - -

It will be obvious to everyone present that a great many Members wish to speak. Although we have a lot of time for this Bill, it is not infinite, and some speeches, so far, have been extremely long. I am trying to manage this without a formal time limit, because the debate flows better without one, but I hope that Members will now limit themselves to around eight minutes. If they do not do so, there will be a formal time limit of less than eight minutes.

John McDonnell Portrait John McDonnell (Hayes and Harlington) (Lab)
- View Speech - Hansard - - - Excerpts

The debate so far has been serious, and it has respected the views that have been expressed not only by Members from across the House, on a whole range of issues, but by the families joining us today who have suffered such a sad loss.

I wish to address one detailed element of the Bill, and I do so in my role as secretary of the National Union of Journalists’ cross-party parliamentary group. It is an issue to which we have returned time and again when we have been debating legislation of this sort. I just want to bring it to the attention of the House; I do not intend to divide the House on this matter. I hope that the Government will take up the issue, and then, perhaps, when it goes to the other place, it will be resolved more effectively than it has been in this place. I am happy to offer the NUJ’s services in seeking to provide a way forward on this matter.

Many investigative journalists base their stories on confidential information, disclosed often by whistleblowers. There has always been an historic commitment—in this House as well—to protect journalists’ right to protect their sources. It has been at the core of the journalists’ code of practice, promoted by the NUJ. As Members know, in some instances, journalists have even gone to prison to protect their sources, because they believe that it is a fundamental principle of journalism, and also a fundamental principle of the role of journalism in protecting our democracy.

The growth in the use of digital technology in journalism has raised real challenges in protecting sources. In the case of traditional material, a journalist has possession of it, whereas with digital technology a journalist does not own or control the data in the same way. Whenever legislation of this nature is discussed, there has been a long-standing, cross-party campaign in the House to seek to protect this code of practice of the NUJ and to provide protection for journalists to protect their sources and their information. It goes back as far as the Police and Criminal Evidence Act 1984. If Members can remember the operation of that Act, they will know that it requires the police or the investigatory bodies to produce a production order, and requires notice to be given to journalists of any attempt to access information. We then looked at it again in the Investigatory Powers Act 2016. Again, what we secured there were arrangements by which there should be prior approval by a judicial commissioner before an investigatory body can seek communications data likely to compromise a journalist’s sources. There has been a consistent pattern.

To comply with Madam Deputy Speaker’s attempt to constrain the length of our speeches, let me briefly explain to Members what amendment 204 would do. It is a moderate probing amendment, which seeks to ask the Government to look again at this matter. When Ofcom is determining whether to issue a notice to intervene or when it is issuing a notice to that tech platform to monitor user-to-user content, the amendment asks it to consider the level of risk of the specified technology accessing, retaining or disclosing the identity of any confidential journalistic source or confidential journalistic material. The amendment stands in the tradition of the other amendments that have been tabled in this House and that successive Governments have agreed to. It puts the onus on Ofcom to consider how to ensure that technologies can be limited to the purpose that was intended. It should not result in massive data harvesting operations, which were referred to earlier, or become a backdoor way for investigating authorities to obtain journalistic data or material without prior judicial approval.