All 19 Chris Philp contributions to the Online Safety Act 2023

Tue 19th Apr 2022: Online Safety Bill, Commons Chamber (2nd reading)
Tue 24th May 2022: Online Safety Bill (First sitting), Public Bill Committees (Committee stage, 1st sitting)
Tue 24th May 2022: Online Safety Bill (Second sitting), Public Bill Committees (Committee stage, 2nd sitting)
Thu 26th May 2022: Online Safety Bill (Third sitting), Public Bill Committees (Committee stage, 3rd sitting)
Thu 26th May 2022: Online Safety Bill (Fourth sitting), Public Bill Committees (Committee stage, 4th sitting)
Tue 7th Jun 2022: Public Bill Committees (two sittings)
Thu 9th Jun 2022: Public Bill Committees (two sittings)
Tue 14th Jun 2022: Public Bill Committees (two sittings)
Thu 16th Jun 2022: Public Bill Committees (two sittings)
Tue 21st Jun 2022: Online Safety Bill (Thirteenth sitting), Public Bill Committees (Committee stage, 13th sitting)
Tue 21st Jun 2022: Public Bill Committees
Thu 23rd Jun 2022: Public Bill Committees
Tue 28th Jun 2022: Public Bill Committees (two sittings)
Tue 12th Jul 2022: Online Safety Bill, Commons Chamber (Report stage, day 1)

Online Safety Bill

Chris Philp Excerpts
2nd reading
Tuesday 19th April 2022

Commons Chamber
Darren Jones (Bristol North West) (Lab):

In the interest of time, I will just pose a number of questions, which I hope the Minister might address in summing up. The first is about the scope of the Bill. The Joint Committee of which I was a member recommended that the age-appropriate design code, which is very effectively used by the Information Commissioner, be used as a benchmark in the Bill, so that any services accessed or likely to be accessed by children are regulated for safety. I do not understand why the Government rejected that suggestion, and I would be pleased to hear from the Minister why they did so.

Secondly, the Bill delegates lots of detail to statutory instruments, codes of practice from the regulator, or later decisions by the Secretary of State. Parliament must see that detail before the Bill becomes an Act. Will the Minister commit to those delegated decisions being published before the Bill becomes an Act? Could he explain why the codes of practice are not being set as mandatory? I do not understand why codes of practice, much of the detail of which the regulator is being asked to set, will not be made mandatory for businesses. How can minimum standards for age or identity verification be imposed if those codes of practice are not made mandatory? Perhaps the Minister could explain.

Many users across the country will want to ensure that their complaints are dealt with effectively. We recommended an ombudsman service to deal with complaints that had exhausted the complaints systems at the regulated companies, but the Government rejected it. Please could the Minister explain why?

I was pleased that the Government accepted the concept of the ability for a super-complaint to be brought on behalf of groups of users, but the decision as to who will be able to bring a super-complaint has been deferred, subject to a decision by the Secretary of State. Why, and when will that decision be taken? If the Minister could allude to who they might be, I am sure that would be welcome.

Lastly, there are a number of exemptions and more work to be done, which leaves significant holes in the legislation. There is much more work to be done on clauses 5, 6 and 50—on democratic importance, journalism and the definition of journalism, on the exemptions for news publishers, and on disinformation, which is mentioned only once in the entire Bill. I and many others recognise that these are not easy issues, but they should be considered fully before legislation is proposed that has gaping holes for people who want to get around it, and for those who wish to test the parameters of this law in the courts, probably for many years. All of us, on a cross-party basis in this House, support the Government’s endeavours to make it safe for children and others to be online. We want the legislation to be implemented as quickly as possible and to be as effective as possible, but there are significant concerns that it will be jammed up in the judicial system, where this House is unacceptably giving judges the job of fleshing out the definition of what many of the important exemptions will mean in practice.

The idea that the Secretary of State has the power to intervene with the independent regulator and tell it what it should or should not do obviously undermines the idea of an independent regulator. While Ministers might give assurances to this House that the power will not be abused, I believe that other countries, whether China, Russia, Turkey or anywhere else, will say, “Look at Great Britain. It thinks this is an appropriate thing to do. We’re going to follow the golden precedent set by the UK in legislating on these issues and give our Ministers the ability to decide what online content should be taken down.” That seems a dangerous precedent.

Darren Jones:

The Minister is shaking his head, but I can tell him that the legislation does do that, because we looked at this and took evidence on it. The Secretary of State would be able to tell the regulator that content should be “legal but harmful” and therefore should be removed as part of its systems design online. We also heard that the ability to do that at speed is very restricted and therefore the power is ineffective in the first place. Therefore, the Government should evidently change their position on that. I do not understand why, in the face of evidence from pretty much every stakeholder, the Government agree that that is an appropriate use of power or why Parliament would vote that through.

I look forward to the Minister giving his answers to those questions, in the hope that, as the Bill proceeds through the House, it can be tidied up and made tighter and more effective, to protect children and adults online in this country.

--- Later in debate ---
The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp):

The piece of legislation before the House this evening is truly groundbreaking, because no other jurisdiction anywhere in the world has attempted to legislate as comprehensively as we are beginning to legislate here. For too long, big tech companies have exposed children to risk and harm, as evidenced by the tragic suicide of Molly Russell, who was exposed to appalling content on Instagram that encouraged her to take her own life. For too long, large social media firms have allowed illegal content to go unchecked online.

Richard Burgon (Leeds East) (Lab):

I have spoken before about dangerous suicide-related content online. The Minister mentions larger platforms. Will the Government go away and bring back two amendments based on points made by the Samaritans? One would bring smaller platforms within the scope of sanctions, and the second would make the protective aspects of the Bill cover people who are over 18, not just those who are under 18. If the Government do that, I am sure that it will be cause for celebration and that Members on both sides of the House will give their support.

Chris Philp:

It is very important to emphasise that, regardless of size, all platforms in the scope of the Bill are covered if there are risks to children.

A number of Members, including the right hon. Member for Barking (Dame Margaret Hodge) and my hon. Friend the Member for Brigg and Goole (Andrew Percy), have raised the issue of small platforms that are potentially harmful. I will give some thought to how the question of small but high-risk platforms can be covered. However, all platforms, regardless of size, are in scope with regard to content that is illegal and to content that is harmful to children.

For too long, social media firms have also arbitrarily censored content just because they do not like it. With the passage of this Bill, all those things will be no more, because it creates parliamentary sovereignty over how the internet operates, and I am glad that the principles in the Bill command widespread cross-party support.

The pre-legislative scrutiny that we have gone through has been incredibly intensive. I thank and pay tribute to the DCMS Committee and the Joint Committee for their work. We have adopted 66 of the Joint Committee’s recommendations. The Bill has been a long time in preparation. We have been thoughtful, and the Government have listened and responded. That is why the Bill is in good condition.

Debbie Abrahams:

Will the Minister give way?

Chris Philp:

I must make some progress, because I am almost out of time and there are lots of things to reply to.

I particularly thank previous Ministers, who have done so much fantastic work on the Bill. With us this evening are my hon. Friend the Member for Gosport (Dame Caroline Dinenage) and my right hon. Friends the Members for Maldon (Mr Whittingdale) and for Basingstoke (Mrs Miller), but not with us this evening are my right hon. and learned Friend the Member for Kenilworth and Southam (Jeremy Wright), who I think is in America, and my right hon. Friends the Members for Hertsmere (Oliver Dowden) and for Staffordshire Moorlands (Karen Bradley), all of whom showed fantastic leadership in getting the Bill to where it is today. It is a Bill that will stop illegal content circulating online, protect children from harm and make social media firms be consistent in the way they handle legal but harmful content, instead of being arbitrary and inconsistent, as they are at the moment.

Chris Philp:

I have so many points to reply to that I have to make some progress.

The Bill also enshrines, for the first time, free speech—something that we all feel very strongly about—but it goes beyond that. As well as enshrining free speech in clause 19, it gives special protection, in clauses 15 and 16, for content of journalistic and democratic importance. As my right hon. Friend the Secretary of State indicated in opening the debate, we intend to table a Government amendment—a point that my right hon. Friends the Members for Maldon and for Ashford (Damian Green) asked me to confirm—to make sure that journalistic content cannot be removed until a proper right of appeal has taken place. I am pleased to confirm that now.

We have made many changes to the Bill. Online fraudulent advertisers are now banned. Senior manager liability will commence immediately. Online porn of all kinds, including commercial porn, is now in scope. The Law Commission communication offences are in the Bill. The offence of cyber-flashing is in the Bill. The priority offences are on the face of the Bill, in schedule 7. Control over anonymity and user choice, which was proposed by my hon. Friend the Member for Stroud (Siobhan Baillie) in her ten-minute rule Bill, is in the Bill. All those changes have been made because this Government have listened.

Let me turn to some of the points made from the Opposition Front Bench. I am grateful for the in-principle support that the Opposition have given. I have enjoyed working with the shadow Minister and the shadow Secretary of State, and I look forward to continuing to do so during the many weeks in Committee ahead of us, but there were one or two points made in the opening speech that were not quite right. This Bill does deal with systems and processes, not simply with content. There are risk assessment duties. There are safety duties. There are duties to prevent harm. All those speak to systems and processes, not simply content. I am grateful to the Chairman of the Joint Committee, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), for confirming that in his excellent speech.

If anyone in this House wants confirmation of where we are on protecting children, the Children’s Commissioner wrote a joint article with the Secretary of State in the Telegraph—I think it was this morning—confirming her support for the measures in the Bill.

When it comes to disinformation, I would make three quick points. First, we have a counter-disinformation unit, which is battling Russian disinformation night and day. Secondly, any disinformation that is illegal, that poses harm to children or that comes under the definition of “legal but harmful” in the Bill will be covered. Thirdly, if that is not enough, the Minister for Security and Borders, who is sitting here next to me, intends to bring forward legislation at the earliest opportunity to counter hostile state threats more generally. This matter will be addressed in the Bill that he will prepare and bring forward.

I have only four minutes left and there are so many points to reply to. If I do not cover them all, I am very happy to speak to Members individually, because so many important points were made. The right hon. Member for Barking asked who was going to pay for all the Ofcom enforcement. The taxpayer will pay for the first two years while we get ready—£88 million over two years—but after that Ofcom will levy fees on these social media firms, so they will pay for regulating their activities. I have already replied to the point she rightly raised about smaller but very harmful platforms.

My hon. Friend the Member for Meriden (Saqib Bhatti) has been campaigning tirelessly on the question of combating racism. This Bill will deliver what he is asking for.

The hon. Member for Batley and Spen (Kim Leadbeater) and my hon. Friend the Member for Watford (Dean Russell) asked about Zach’s law. Let me take this opportunity to confirm explicitly that clause 150—the harmful communication clause, for where a communication is intended to cause psychological distress—will cover epilepsy trolling. What happened to Zach will be prevented by this Bill. In addition, the Ministry of Justice and the Law Commission are looking at whether we can also have a standalone provision, but let me assure them that clause 150 will protect Zach.

My right hon. Friend the Member for Maldon asked a number of questions about definitions. Companies can move between category 1 and category 2, and different parts of a large conglomerate can be regulated differently depending on their activities. Let me make one point very clear—the hon. Member for Bristol North West (Darren Jones) also raised this point. When it comes to the provisions on “legal but harmful”, neither the Government nor Parliament is saying that those things have to be taken down. We are not censoring in that sense. We are not compelling social media firms to remove content. All we are saying is that they must do a risk assessment, have transparent terms and conditions, and apply those terms and conditions consistently. We are not compelling, we are not censoring; we are just asking for transparency and accountability, which is sorely missing at the moment. No longer will those in Silicon Valley be able to behave in an arbitrary, censorious way, as they do at the moment—something that Members of this House have suffered from, but from which they will no longer suffer once this Bill passes.

The hon. Member for Bristol North West, who I see is not here, asked a number of questions, one of which was about—[Interruption.] He is here; I do apologise. He has moved—I see he has popped up at the back of the Chamber. He asked about codes of practice not being mandatory. That is because the safety duties are mandatory. The codes of practice simply illustrate ways in which those duties can be met. Social media firms can meet them in other ways, but if they fail to meet those duties, Ofcom will enforce. There is no loophole here.

When it comes to the ombudsman, we are creating an internal right of appeal for the first time, so that people can appeal to the social media firms themselves. There will have to be a proper right of appeal, and if there is not, they will be enforced against. We do not think it appropriate for Ofcom to consider every individual complaint, because it would simply be overwhelmed by probably tens of thousands of complaints, but Ofcom will be able to enforce where there are systemic failures. We feel that is the right approach.

I say to the hon. Member for Plymouth, Sutton and Devonport (Luke Pollard) that my right hon. Friend the Minister for Security and Borders will meet him about the terrible Keyham shooting.

The hon. Member for Washington and Sunderland West (Mrs Hodgson) raised a question about online fraud in the context of search. That is addressed by clause 35, but we do intend to make drafting improvements to the Bill, and I am happy to work with her on those drafting improvements.

I have been speaking as quickly as I can, which is quite fast, but I think time has got away from me. This Bill is groundbreaking. It will protect our citizens, it will protect our children—[Hon. Members: “Sit down!”]—and I commend it to the House.

Question put and agreed to.

Bill accordingly read a Second time.

Madam Deputy Speaker (Dame Eleanor Laing):

The Minister just made it. I have rarely seen a Minister come so close to talking out his own Bill.

Online Safety Bill (Programme)

Motion made, and Question put forthwith (Standing Order No. 83A(7)),

That the following provisions shall apply to the Online Safety Bill:

Committal

(1) The Bill shall be committed to a Public Bill Committee.

Proceedings in Public Bill Committee

(2) Proceedings in the Public Bill Committee shall (so far as not previously concluded) be brought to a conclusion on Thursday 30 June 2022.

(3) The Public Bill Committee shall have leave to sit twice on the first day on which it meets.

Consideration and Third Reading

(4) Proceedings on Consideration shall (so far as not previously concluded) be brought to a conclusion one hour before the moment of interruption on the day on which those proceedings are commenced.

(5) Proceedings on Third Reading shall (so far as not previously concluded) be brought to a conclusion at the moment of interruption on that day.

(6) Standing Order No. 83B (Programming committees) shall not apply to proceedings on Consideration and Third Reading.

Other proceedings

(7) Any other proceedings on the Bill may be programmed.—(Michael Tomlinson.)

Question agreed to.

Online Safety Bill (Money)

Queen’s recommendation signified.

Motion made, and Question put forthwith (Standing Order No. 52(1)(a)),

That, for the purposes of any Act resulting from the Online Safety Bill, it is expedient to authorise the payment out of money provided by Parliament of:

(1) any expenditure incurred under or by virtue of the Act by the Secretary of State, and

(2) any increase attributable to the Act in the sums payable under any other Act out of money so provided.—(Michael Tomlinson.)

Question agreed to.

Online Safety Bill (Ways and Means)

Motion made, and Question put forthwith (Standing Order No. 52(1)(a)),

That, for the purposes of any Act resulting from the Online Safety Bill, it is expedient to authorise:

(1) the charging of fees under the Act, and

(2) the payment of sums into the Consolidated Fund.—(Michael Tomlinson.)

Question agreed to.

Deferred Divisions

Motion made, and Question put forthwith (Standing Order No. 41A(3)),

That at this day’s sitting, Standing Order No. 41A (Deferred divisions) shall not apply to the Motion in the name of Secretary Nadine Dorries relating to Online Safety Bill: Carry-over.—(Michael Tomlinson.)

Question agreed to.

Online Safety Bill (First sitting)

Chris Philp Excerpts
Committee stage
Tuesday 24th May 2022

Public Bill Committees
Dean Russell:

Q But as the Bill stands, there is a very clear point about stopping harmful content being sent to people, so I imagine that would cover it at least in that sense, would it not?

Kevin Bakhurst: This is a really important point, which Richard just tried to make. The Bill gives us a great range of tools to try and prevent harm as far as possible; I just think we need to get expectations right here. Unfortunately, this Bill will not result in the elimination of all harm, just because of the nature of the internet and the task that we face. We are ambitious about driving constant improvement and stopping and addressing the main harms, but it is not going to stop every harm. We will absolutely focus on the ones that have a significant impact, but unfortunately that is the nature of the web.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp):

Q Just to continue the point made by my colleague, you are right to say that Ministry of Justice colleagues are considering the flashing image offence as a separate matter. But would you agree that clause 150, on harmful communications, does criminalise and therefore place into the scope of the Bill communications intended to cause harm to a “likely audience” where such harm is

“psychological harm amounting to serious distress”?

Therefore, sending somebody a flashing image with the intention of inducing an epileptic fit would likely be caught under this new harmful communications offence in clause 150, even before any separate future offence is introduced.

Richard Wronka: I think we can certainly understand the argument. I think it is important that the Bill is as clear as possible. Ultimately, it is for the courts to decide whether that offence would pick up these kinds of issues that we are talking about around flashing imagery.

Chris Philp:

Q I would suggest that the definition in clause 150 would cover epilepsy trolling.

You mentioned that you met recently with European regulators. Briefly, because we are short of time, were there any particular messages, lessons or insights you picked up in those meetings that might be of interest to the Committee?

Kevin Bakhurst: Yes, there were a number, and liaising with European regulators and other global regulators in this space is a really important strand of our work. It is often said that this regime is a first globally. I think that is true. This is the most comprehensive regime, and it is therefore potentially quite challenging for the regulator. That is widely recognised.

The second thing I would say is that there was absolute recognition of how advanced we are in terms of the recruitment of teams, which I touched on before, because we have had the funding available to do it. There are many countries around Europe that have recruited between zero and 10 people and are imminently going to take on some of these responsibilities under the Digital Services Act, so I think they are quite jealous.

The last thing is that we see continued collaboration with other regulators around the world as a really important strand, and we welcome the information-sharing powers that are in the Bill. There are some parallels, and we want to take similar approaches on areas such as transparency, where we can collaborate and work together. I think it is important—

The Chair:

Order. I am afraid we have come to the end of the allotted time for questions. On behalf of the Committee, I thank our witnesses for their evidence.

Examination of Witnesses

Dame Rachel de Souza, Lynn Perry MBE and Andy Burrows gave evidence.

--- Later in debate ---
Kirsty Blackman:

Q I have a quick question on parental digital literacy. You mentioned the panel that you put together of 16 to 21-year-olds. Do you think that today’s parents have the experience, understanding, skills and tools to keep their children properly safe online? Even if they are pretty hands-on and want to do that, do you think that they have all the tools they need to be able to do that?

Dame Rachel de Souza: It is a massive concern to parents. Parents talk to me all the time about their worries: “Do we know enough?” They have that anxiety, especially as their children turn nine or 10; they are thinking, “I don’t even know what this world out there is.” I think that our conversations with 16 to 21-year-olds were really reassuring, and we have produced a pamphlet for parents. It has had a massive number of downloads, because parents absolutely want to be educated in this subject.

What did young people tell us? They told us, “Use the age controls; talk to us about how much time we are spending online; keep communication open; and talk to us.” Talk to children when they’re young, particularly boys, who are likely to be shown pornography for the first time, even if there are parental controls, around the age of nine or 10. So have age-appropriate conversations. There was some very good advice about online experiences, such as, “Don’t worry; you’re not an expert but you can talk to us.” I mean, I did not grow up with the internet, but I managed parenting relatively well—my son is 27 now. I think this is a constant concern for parents.

I do think that the tech companies could be doing so much more to assist parents in digital media literacy, and in supporting them in how to keep their child safe. We are doing it as the Office of the Children’s Commissioner. I know that we are all trying to do it, but we want to see everyone step up on this, particularly the tech companies, to support parents on this issue.

Chris Philp:

Q Can I start by thanking the NSPCC and you, Dame Rachel, and your office for the huge contribution that you have made to the Bill as it has developed? A number of changes have been made as a result of your interventions, so I would just like to start by putting on the record my thanks to both of you and both your organisations for the work that you have done so far.

Could you outline for the Committee the areas where you think the Bill, as currently drafted, contains the most important provisions to protect children?

Dame Rachel de Souza: I was really glad to see, in the rewrite of the Online Safety Bill, a specific reference to the role of age assurance to prevent children from accessing harmful content. That has come across strongly from children and young people, so I was very pleased to see that. It is not a silver bullet, but for too long children have been using entirely inappropriate services. The No. 1 recommendation from the 16 to 21-year-olds, when asked what they wish their parents had known and what we should do, was age assurance, if you are trying to protect a younger sibling or are looking at children, so I was pleased to see that. Companies cannot hope to protect children if they do not know who the children are on their platforms, so I was extremely pleased to see that.

Chris Philp:

Q Sorry to interject, Dame Rachel, but do you agree that it is not just about stopping under-18s viewing pornography; it also includes stopping children under 13 accessing social media entirely, as per those companies’ purported terms and conditions, which are frequently flouted?

Dame Rachel de Souza: Absolutely. I have called together the tech companies. I have met the porn companies, and they reassured me that as long as they were all brought into the scope of this Bill, they would be quite happy as this is obviously a good thing. I brought the tech companies together to challenge them on their use of age assurance. With their artificial intelligence and technology, they know the age of children online, so they need to get those children offline. This Bill is a really good step in that direction; it will hold them to account and ensure they get children offline. That was a critically important one for me.

I was also pleased to see the holding to account of companies, which is very important. On full coverage of pornography, I was pleased to see the offence of cyber-flashing in the Bill. Again, it is particularly about age assurance.

What I would say is that nudge is not working, is it? We need this in the Bill now, and we need to get it there. In my bit of work with those 2,000 young people, we asked what they had seen in the last month, and 40% of them have not had bad images taken down. Those aspects of the Bill are key.

Andy Burrows: This is a landmark Bill, so we thank you and the Government for introducing it. We should not lose sight of the fact that, although this Bill is doing many things, first and foremost it will become a crucial part of the child protection system for decades to come, so it is a hugely important and welcome intervention in that respect.

What is so important about this Bill is that it adopts a systemic approach. It places clear duties on platforms to go through the process of identifying the reasonably foreseeable harms and requiring that reasonable steps be taken to mitigate them. That is hugely important from the point of view of ensuring that this legislation is future-proofed. I know that many companies have argued for a prescriptive checklist, and then it is job done—a simple compliance job—but a systemic approach is hugely important because it is the basis upon which companies have very clear obligations. Our engagement is very much about saying, “How can we make sure this Bill is the best it can possibly be?” But that is on the bedrock of that systemic approach, which is fundamental if we are to see a culture shift in these companies and an emphasis on safety by design—designing out problems that do not have to happen.

I have engaged with companies where child safety considerations are just not there. One company told me that grooming data is a bad headline today and tomorrow’s chip shop wrapper. A systemic approach is the key to ensuring that we start to address that balance.

Chris Philp:

Q Thank you. I obviously strongly agree with those comments.

I would like to turn to one or two points that came up in questioning, and then I would like to probe a couple of points that did not. Dame Rachel mentioned advocacy and ensuring that the voice of particular groups—in this context, particularly that of children—is heard. In that context, I would like to have a look at clause 140, which relates to super-complaints. Subsection (4) says that the Secretary of State can, by regulations, nominate which organisations are able to bring super-complaints. These are complaints whereby you go to Ofcom and say that there is a particular company that is failing in its systemic duties.

Subsection (4) makes it clear that the entities nominated to be an authorised super-complainant would include

“a body representing the interests of users of regulated services”,

which would obviously include children. If an organisation such as the Office of the Children’s Commissioner or the NSPCC—I am obviously not prejudicing the future process—were designated as a super-complainant that was able to bring super-complaints to Ofcom, would that address your point about the need for proper advocacy for children?

Dame Rachel de Souza: Absolutely. I stumbled over that a bit when Maria asked me the question, but we absolutely need people who work with children, who know children and are trusted by children, and who can do that nationally in order to be the super-complainants. That is exactly how I would envisage it working.

Andy Burrows: The super-complaint mechanism is part of the well-established arrangements that we see in other sectors, so we are very pleased to see that that is included in the Bill. I think there is scope to go further and look at how the Bill could mirror the arrangements that we see in other sectors—I mentioned the energy, postal and water sectors earlier as examples—so that the statutory user advocacy arrangements for inherently vulnerable children, including children at risk of sexual abuse, mirror the arrangements that we see in those other sectors. That is hugely important as a point of principle, but it is really helpful and appropriate for ensuring that the legislation can unlock the positive regulatory outcomes that we all want to see, so I think it contributes towards really effective regulatory design.

Chris Philp:

Q Thank you, Andy. I am conscious of the time, so I will be brief with my final three questions. You made a valid point about large social media platforms receiving complaints generally, but in this case from children, about inappropriate content, such as photographs of them on a social media platform that do not get taken down—the complaint gets ignored, or it takes a very long time. In clause 18, we have duties on the complaints procedures that the big social media firms will now have to follow. I presume that you would join me in urging Ofcom, when it enforces the duties in clause 18, to ensure that big social media firms deal with complaints quickly and responsively. Children are specifically referred to in the clause—for example, in subsection (3) and elsewhere.

Dame Rachel de Souza: Yes, and I was so pleased to see that. The regulator needs to have teeth for it to have any effect—I think that is what we are saying. I want named senior managers to be held accountable for breaches of their safety duties to children, and I think that senior leaders should be liable to criminal sanctions when they do not uphold their duty of care to children.

Chris Philp:

Q Good—thank you. I want to say something about gaming, because Kirsty Blackman asked about it. If messages are being sent back and forth in a gaming environment, which is probably the concern, those are in scope of the Bill, because they are user-to-user services.

I will put my last two questions together. Are you concerned about the possibility that encryption in messaging services might impede the automatic scanning for child exploitation and abuse images that takes place, and would you agree that we cannot see encryption happen at the expense of child safety? Secondly, in the context of the Molly Russell reference earlier, are you concerned about the way that algorithms can promote and essentially force-feed children very harmful content? Those are two enormous questions, and you have only two minutes to answer them, so I apologise.

Dame Rachel de Souza: I am going to say yes and yes.

Andy Burrows: I will say yes and yes as well. The point about end-to-end encryption is hugely important. Let us be clear: we are not against end-to-end encryption. Where we have concerns is about the risk profile that end-to-end encryption introduces, and that risk profile, when we are talking about it being introduced into social networking services and bundled with other sector functionality, is very high and needs to be mitigated.

About 70% of child abuse reports could be lost if Meta goes ahead. That is 28 million reports in the past six months, so it is very important that the Bill can require companies to demonstrate that, if they are running such services, they can acquit themselves in terms of the risk assessment processes. We really welcome the simplified child sexual exploitation warning notices in the Bill that will give Ofcom the power to intervene when companies have not demonstrated that they have been able to introduce end-to-end encryption in a safe and effective way.

One area in which we would like to see the Bill—

The Chair:

Order. I am afraid that brings us to the end of the time allotted for the Committee to ask questions of this panel. On behalf of the Committee, I thank our witnesses for their evidence, and I am really sorry that we could not get Lynn Perry online. Could we move on to the last panel? Thank you very much.

Examination of Witnesses

Ben Bradley and Katy Minshall gave evidence.

--- Later in debate ---
The Chair:

Sorry, I have to interrupt you there. I call the Minister.

Chris Philp:

Q Thank you for coming to give evidence to the Committee. On the question about user choice around identity verification, is this not conceptually quite similar to the existing blue tick regime that Twitter operates successfully?

Katy Minshall: As I say, we share your policy objective of giving users more choice. For example, at present we are testing a tool where Twitter automatically blocks abusive accounts on your behalf. We make the distinction based on an account’s behaviour and not on whether it has verified itself in some way.

Chris Philp:

Q Well, I’d be grateful if you applied that to my account as quickly as possible!

I do not think that the concept would necessarily operate as you suggested at the beginning. You suggested that people might end up not seeing content posted by the Prime Minister or another public figure. The concept is that, assuming a public figure would choose to verify themselves, content that they posted would be visible to everybody because they had self-verified. The content in the other direction may or may not be, depending on whether the Prime Minister or the Leader of the Opposition chose to see all content or just verified content, but their content—if they verified themselves—would be universally visible, regardless of whatever choice anyone else exercised.

Katy Minshall: Yes, sorry if I was unclear. I totally accept that point, but it would mean that some people would be able to reply to Boris Johnson and others would not. I know we are short on time, but it is worth pointing out that in a YouGov poll in April, nearly 80% of people said that they would not choose to provide ID documents to access certain websites. The requirements that you describe are based on the assumption that lots of people will choose to do it, when in reality that might not be the case.

A public figure might think, “Actually, I really appreciate that I get retweets, likes and people replying to my tweets,” but if only a small number of users have taken the opportunity to verify themselves, that is potentially a disincentive even to use this system in the first place—and all the while we were creating such a system, we could have been investing in or trying to develop new solutions, such as safety mode, which I described and which tries to prevent abusive users from interacting with you.

Chris Philp:

Q I want to move on to the next question because we only have two minutes left.

Ben, you talked about the age verification measures that TikTok currently takes. For people who do not come via an age-protected app store, it is basically self-declared. All somebody has to do is type in a date of birth. My nine-year-old children could just type in a date of birth that was four years earlier than their real date of birth, and off they would go on TikTok. Do you accept that that is wholly inadequate as a mechanism for policing the age limit of 13?

Ben Bradley: That is not the end of our age assurance system; it is just the very start. Those are the first two measures we have in place to prevent under-age sign-up, but we are also proactive in surfacing and removing under-age accounts. As I said, we publish every quarter how many suspected under-13s get removed.

Chris Philp:

Q If I understood your answer correctly, that is only if a particular piece of content comes to the attention of your moderators. I imagine that only 0.01% or some tiny fraction of content on TikTok comes to the attention of your moderators.

Ben Bradley: It is based on a range of signals that they have available to them. As I said, we publish a number every quarter. In the last quarter, we removed 14 million users across the globe who were suspected to be under the age of 13. That is evidence of how seriously we take the issue. We publish that information because we think it is important to be transparent about our efforts in this space, so that we can be judged accordingly.

Chris Philp:

Q Thank you. Forgive me for moving on in the interests of time.

Earlier, we debated content of democratic importance and the protections that that and free speech have in the Bill. Do you agree that a requirement to have some level of consistency in the way that that is treated is important, particularly given that there are some glaring inconsistencies in the way in which social media firms treat content at the moment? For example, Donald Trump has been banned, while flagrant disinformation by the Russian regime, lying about what they are doing in Ukraine, is allowed to propagate—including the tweets that I drew to your attention a few weeks ago, Katy.

Katy Minshall: I agree that freedom of expression should be top of mind as companies develop safety and policy solutions. Public interest should always be considered when developing policies. From the perspective of the Bill, I would focus on freedom of expression for everyone, and not limit it to content that could be related to political discussions or journalistic content. As Ben said, there are already wider freedom of expression duties in the Bill.

Chris Philp:

Q To be clear, those freedom of expression duties in clause 19(2) do apply to everyone.

Katy Minshall: Sorry, but I do not know the Bill in those terms, so you would have to tell me the definition.

The Chair:

Order. I am afraid that that brings us to the end of the time allotted for the Committee to ask questions in this morning’s sitting. On behalf of the Committee, I thank our witnesses for their evidence. We will meet again at 2 pm in this room to hear further oral evidence.

Online Safety Bill (Second sitting)

Chris Philp Excerpts
Committee stage
Tuesday 24th May 2022

Public Bill Committees
The Chair:

I am sorry, but I must move on. Minister, I am afraid you only have five minutes.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp):

Q Welcome to the Committee’s proceedings and thank you for joining us this afternoon. I would like to start on the question of the algorithmic promotion of content. Last week, I met with the Facebook whistleblower, Frances Haugen, who spoke in detail about what she had found when working for Facebook, so I will start with you, Richard. On the question of transparency, which other Members of the Committee have touched on, would you have any objection to sharing all the information you hold internally with trusted researchers?

Richard Earley: What information are you referring to?

Chris Philp:

Data, in particular on the operation of algorithmic promotion of particular kinds of content.

Richard Earley: We already do things like that through the direct opportunity that anyone has to see why a single post has been chosen for them in their feed. You can click on the three dots next to any post and see that. For researcher access and support, as I mentioned, we have contributed to the publishing of more than 400 reports over the last year, and we want to do more of that. In fact, the Bill requires Ofcom to conduct a report on how to unlock those sorts of barriers, which we think should be done as soon as possible. Yes, in general we support that sort of research.

I would like to say one thing, though. I have worked at Facebook—now Meta—for almost five years, and nobody at Facebook has any obligation, any moral incentive, to do anything other than provide people with the best, most positive experience on our platform, because we know that if we do not give people a positive experience, through algorithms or anything else, they will leave our platform and will not use it. They tell us that and they do it, and the advertisers who pay for our services do not want to see that harmful content on our platforms either. All of our incentives are aligned with yours, which are to ensure that our users have a safe and positive experience on our platforms.

Chris Philp:

Q Yet the algorithms that select particular content for promotion are optimised for user engagement—views, likes and shares—because that increases user stickiness and keeps users on the site for longer. The evidence seems to suggest that, despite what people say in response to the surveys you have just referenced, what they actually interact with the most—or what a particular proportion of the population chooses to interact with the most—is content that would be considered in some way extreme or divisive, and that the algorithms, which are optimised for user engagement, notice that and therefore uprank that content. Do you accept that your algorithms are optimised for user engagement?

Richard Earley: I am afraid to say that that is not correct. We have multiple algorithms on our services. Many of them, in fact, do the opposite of what you have just described: they identify posts that might be violent, misleading or harmful and reduce the prevalence of them within our feed products, our recommendation services and other parts of the service.

We optimise the algorithm that shows people things for something called meaningful social interaction. That is not just pure engagement; in fact, its focus—we made a large change to our algorithms in 2018 to focus on this—is on the kinds of activities online that research shows are correlated with positive wellbeing outcomes. Joining a group in your local area or deciding to go to an event that was started by one of your friends—that is what our algorithms are designed to promote. In fact, when we made that switch in 2018, we saw a decrease of more than 50 million hours of Facebook use every day as a result of that change. That is not the action of a company that is just focused on maximising engagement; it is a company that is focused on giving our users a positive experience on our platform.

Chris Philp:

Q You have alluded to some elements of the algorithmic landscape, but do you accept that the dominant feature of the algorithm that determines which content is most promoted is based on user engagement, and that the things you have described are essentially second-order modifications to that?

Richard Earley: No, because as I just said, when we gave the algorithm the instruction to focus on meaningful social interaction, it actually decreased the amount of time people spent on our platform.

Chris Philp:

Q It might have decreased it, but the meaningful social interaction score is based principally, though not exclusively, as you said, on user engagement, isn’t it?

Richard Earley: As I said, it is about ensuring that people who spend time on our platform come away feeling that they have had a positive experience.

Chris Philp:

Q That does not quite answer the question.

Richard Earley: I think that a really valuable part of the Bill that we are here to discuss is the fact that Ofcom will be required, and we in our risk assessments will be required, to consider the impact on the experience of our users of multiple different algorithms, of which we have hundreds. We build those algorithms to ensure that we reduce the prevalence of harmful content and give people the power to connect with those around them and build community. That is what we look forward to demonstrating to Ofcom when this legislation is in place.

Chris Philp:

Q Yes, but in her testimony to, I think, the Joint Committee and the US Senate, in a document that she released to The Wall Street Journal, and in our conversation last week, Frances Haugen suggested that the culture inside Facebook, now Meta, is that measures that tend to reduce user engagement do not get a very sympathetic hearing internally. However, I think we are about to run out of time. I have one other question, which I will direct, again, to Richard. Forgive me, Katie and Becky, but it is probably most relevant for Meta.

The Chair:

Q Just one moment, please. Is there anything that the other witnesses need to say about this before we move on? It will have to be very brief.

Katie O'Donovan: I welcome the opportunity to address the Committee. It is so important that this Bill has parliamentary scrutiny. It is a Bill that the DCMS has spent a lot of time on, getting it right and looking at the systems and the frameworks. However, it will lead to a fundamentally different internet for UK users versus the rest of the world. It is one of the most complicated Bills we are seeing anywhere in the world. I realise that it is very important to have scrutiny of us as platforms to determine what we are doing, but I think it is really important to also look at the substance of the Bill. If we have time, I would welcome the chance to give a little feedback on the substance of the Bill too.

Becky Foreman: I would add that the Committee spent a lot of time talking to Meta, who are obviously a big focus for the Bill, but it is important to remember that there are numerous other networks and services that potentially will be caught by the Bill and that are very different from Meta. It is important to remember that.

Chris Philp:

While the Bill is proportionate in its measures, it is not designed to impose undue burdens on companies that are not high risk. I have one more question for Richard. I think Katie was saying that she wanted to make a statement?

The Chair:

We are out of time. I am sorry about this; I regard it as woefully unsatisfactory. We have got three witnesses here, a lot of questions that need to be answered, and not enough time to do it. However, we have a raft of witnesses coming in for the rest of the day, so I am going to have to draw a line under this now. I am very grateful to you for taking the trouble to come—the Committee is indebted to you. You must have the opportunity to make your case. Would you be kind enough to put any comments that you wish to make in writing so that the Committee can have them? Feel free to go as broad as you would like, because I feel very strongly that you have been short-changed this afternoon. We are indebted to you. Thank you very much indeed.

Richard Earley: We will certainly do that and look forward to providing comments in writing.

Examination of Witnesses

Professor Clare McGlynn, Jessica Eagelton and Janaya Walker gave evidence.

--- Later in debate ---
The Chair:

Minister?

Chris Philp:

Q Thank you, Sir Roger, and thank you to the witnesses for coming in and giving very clear, helpful and powerful evidence to the Committee this afternoon. On the question of age verification or age assurance that we have just spoken about, clause 11(14) of the Bill sets a standard in the legislation that will be translated into the codes of practice by Ofcom. It says that, for the purposes of the subsection before on whether or not children can access a particular set of content, a platform is

“only entitled to conclude that it is not possible for children to access a service…if there are systems or processes in place…that achieve the result that children are not normally able to access the service”.

Ofcom will then interpret in codes of practice what that means practically. Professor McGlynn, do you think that standard set out there—

“the result that children are not normally able to access the service or that part of it”

—is sufficiently high to address the concerns we have been discussing in the last few minutes?

Professor Clare McGlynn: At the moment, the wording with regard to age assurance in part 5—the pornography providers—is slightly different, compared with the other safety duties. That is one technicality that could be amended. As for whether the provision you just talked about is sufficient, in truth I think it comes down, in the end, to exactly what is required, and of course we do not yet know what the nature of the age verification or age assurance requirements will actually be and what that will actually mean.

I do not know what that will actually mean for something like Twitter. What will they have to do to change it? In principle, that terminology is possibly sufficient, but it kind of depends in practice what it actually means in terms of those codes of practice. We do not yet know what it means, because all we have in the Bill is about age assurance or age verification.

Chris Philp:

Q Yes, you are quite right that the Ofcom codes of practice will be important. As far as I can see, the difference between clauses 68 and 11(14) is that one uses the word “access” and the other uses the word “encounter”. Is that your analysis of the difference as well?

Professor Clare McGlynn: My understanding as well is that those terms are, at the moment, being interpreted slightly differently in terms of the requirements that people will be under. I am just making a point about it probably being easier to harmonise those terms.

Chris Philp:

Q Thank you very much. I wanted to ask you a different question—one that has not come up so far in this session but has been raised quite frequently in the media. It concerns freedom of speech. This is probably for Professor McGlynn again. I am asking you this in your capacity as a professor of law. Some commentators have suggested that the Bill will have an adverse impact on freedom of speech. I do not agree with that. I have written an article in The Times today making that case, but what is your expert legal analysis of that question?

Professor Clare McGlynn: I read your piece in The Times this morning, which was a robust defence of the legislation, in that it said that it is no threat to freedom of speech, but I hope you read my quote tweet, in which I emphasised that there is a strong case to be made for regulation to free the speech of many others, including women and girls and other marginalised people. For example, the current lack of regulation means that women’s freedom of speech is restricted because we fear going online because of the abuse we might encounter. Regulation frees speech, while your Bill does not unduly limit freedom of speech.

Chris Philp:

Q Okay, I take your second point, but did you agree with the point that the Bill as crafted does not restrict what you would ordinarily consider to be free speech?

Professor Clare McGlynn: There are many ways in which speech is regulated. The social media companies already make choices about what speech is online and offline. There are strengths in the Bill, such as the ability to challenge when material is taken offline, because that can impact on women and girls as well. They might want to put forward a story about their experiences of abuse, for example. If that gets taken down, they will want to raise a complaint and have it swiftly dealt with, not just left in an inbox.

There are lots of ways in which speech is regulated, and the idea of having a binary choice between free speech and no free speech is inappropriate. Free speech is always regulated, and it is about how we choose to regulate it. I would keep making the point that the speech of women and girls and other marginalised people is minimised at the moment, so we need regulation to free it. The House of Lords and various other reports about free speech and regulation, for example, around extreme pornography, talk about regulation as being human-rights-enhancing. That is the approach we need to take.

The Chair:

Thank you very much indeed. Once again, I am afraid I have to draw the session to a close, and once again we have probably not covered all the ground we would have liked. Professor McGlynn, Ms Walker, Ms Eagelton, thank you very much indeed. As always, if you have further thoughts or comments, please put them in writing and let us know. We are indebted to you.

Examination of Witnesses

Lulu Freemont, Ian Stevenson and Adam Hildreth gave evidence.

--- Later in debate ---
The Chair

Thank you. I call the Minister.

Chris Philp

Q Thank you, Sir Roger, and thank you very much indeed for joining us for this afternoon’s session. Adam, we almost met you in Leeds last October or November, but I think you were off with covid at the time.

Adam Hildreth: I had covid at the time, yes.

Chris Philp

Covid struck. I would like to ask Adam and Ian in particular about the opportunities provided by emerging and new technology to deliver the Bill’s objectives. I would like you both to give examples of where you think new tech can help deliver these safety duties. I ask you to comment particularly on what it might do on, first, age assurance—which we debated in our last session—and secondly, scanning for child sexual abuse images in an end-to-end encrypted environment. Adam, do you want to go first?

Adam Hildreth: Well, if Ian goes first, the second question would be great for him to answer, because we worked on it together.

Chris Philp

Fair enough. Ian?

Ian Stevenson: Yes, absolutely. The key thing to recognise is that there is a huge and growing cohort of companies, around the world but especially in the UK, that are working on technologies precisely to try to support those kinds of safety measures. Some of those have been supported directly by the UK Government, through the safety tech challenge fund, to explore what can be done around end-to-end encrypted messaging. I cannot speak for all the participants, but I know that many of them are members of the safety tech industry association.

Between us, we have demonstrated a number of different approaches. My own company, Cyacomb, demonstrated technology that could block known child abuse within encrypted messaging environments without compromising the privacy of users’ messages and communications. Other companies in the UK, including DragonflAI and Yoti, demonstrated solutions based on detecting nudity and looking at the ages of the people in those images, which are again hugely valuable in this space. Until we know exactly what the regulation is going to demand, we cannot say exactly what the right technology to solve it is.

However, I think that the fact that that challenge alone produced five different solutions looking at the problem from different angles shows just how vibrant the innovation ecosystem can be. My background in technology is long and mixed, but I have seen a number of sectors emerge—including cyber-security and fintech—where, once the foundations for change have been created, the ability of innovators to come up with answers to difficult questions is enormous. The capacity to do that is enormous.

There are a couple of potential barriers to that. The strength of the regulation is that it is future proof. However, until we start answering the question, “What do we need to do and when? What will platforms need to do and when will they need to do it?” we do not really create in the commercial market the innovation drivers for the technical solutions that will deliver this. We do not create the drivers for investment. It is really important to be as specific as we can about what needs to be done and when.

The other potential barrier is regulation. We have already had a comment about how there should be a prohibition of general monitoring. We have seen what has happened in the EU recently over concerns about safety technologies that are somehow looking at traffic on services. We need to be really clear that, while safety technologies must protect privacy, there needs to be a mechanism so that companies can understand when they can deploy safety technologies. At the moment there are situations where we talk to potential customers for safety technologies and they are unclear as to whether it would be proportionate to deploy those under, for example, data protection law. There are areas, even within the safety tech challenge fund work on end-to-end encrypted messaging, where it was unclear whether some of the technologies—however brilliant they were at preventing child abuse in those encrypted environments—would be deployable under current data protection and privacy of electronic communications regulations.

There are questions there. We need to make sure that when the Online Safety Bill comes through, it makes clear what is required and how it fits together with other regulations to enable that. Innovators can do almost anything if you give them time and space. They need the certainty of knowing what is required, and an environment where solutions can be deployed and delivered.

Chris Philp

Q Ian, thank you very much. I am encouraged by your optimism about what innovation can ultimately deliver. Adam, let me turn to you.

Adam Hildreth: I agree with Ian that the level of innovation is amazing. If we start talking about age verification and end-to-end encryptions, for me—I am going to say that same risk assessment phrase again—it absolutely depends on the type of service, who is using the service and who is exploiting the service, as to which safety technologies should be employed. I think it is dangerous to say, “We are demanding this type of technology or this specific technology to be deployed in this type of instance,” because that removes the responsibility from the people who are creating it.

Chris Philp

Q Sorry to interject, but to be clear, the Bill does not do that. The Bill specifies the objectives, but it is tech agnostic. The manner of delivering those is, of course, not specified, either in the Bill or by Ofcom.

Adam Hildreth: Absolutely. Sorry, I was saying that I agree with how it has been worded. We know what is available, but technology changes all the time and solutions change all the time—we can do things in really innovative ways. However, the risk assessment has to bring together freedom of speech versus the types at risk of abuse. Is it children who are at risk, and if so, what are they at risk from? That changes the space massively when compared with some adult gaming communities, where what is harmful to them is very different from what harms other audiences. That should dictate for them what system and technology is deployed. Once we understand what best of breed looks like for those types of companies, we should know what good is.

Chris Philp

Q Thank you, Adam. We only have one minute left, so what is your prediction for the possibilities that emerging tech presents to deal with the issues of age assurance, which are difficult, and CSEA scanning, given end-to-end encrypted environments?

Adam Hildreth: The technology is there. It exists and it is absolutely deployable in the environments that need it. I am sure Ian would agree; we have seen it and done a lot of testing on it.

Chris Philp

Q Including inside the end-to-end encrypted environment, rather than just at the device level? Quite a few of the safety challenge solutions that Ian mentioned are at the device level; they are not inside the encryption.

Adam Hildreth: There are ways that can work. Again, it brings in freedom of expression, global businesses and some other areas, so it is more about regulation and consumer concerns about the security of data, rather than whether technological solutions are available.

The Chair

Ms Freemont, Mr Hildreth and Mr Stevenson, thank you all very much indeed. We have run out of time. As ever, if you have any further observations that you wish to make, please put them in writing and let the Committee have them; we shall welcome them. Thank you for your time this afternoon. We are very grateful to you.

Examination of Witnesses

Jared Sine, Nima Elmi and Dr Rachel O’Connell gave evidence.

--- Later in debate ---
The Chair

Right. For once, we seem to have run out of questions. Minister, do you wish to contribute?

Chris Philp

Everything I was going to ask has already been asked by my colleagues, so I will not duplicate that.

The Chair

Q In that case, given that we have the time, rather than doing what I normally do and inviting you to make any further submissions in writing, if there are any further comments that you would like to make about the Bill, the floor is yours. Let us start with Mr Sine.

Jared Sine: I would just make one brief comment. I think it has been mentioned by everyone here. Everyone has a role to play. Clearly, the Government have a role in proposing and pushing forward the legislation. The platforms that have the content have an obligation and a responsibility to try to make sure that their users are safe. One of the things that Dr O’Connell mentioned is age verification and trying to make sure that we keep young kids off platforms where they should not be.

I think there is a big role to play for the big tech platforms—the Apples and Googles—who distribute our apps. Over the years, we have said again and again to both of those companies, “We have age-gated our apps at 18, yet you will allow a user you know is 15, 14, 16—whatever it is—to download that app. That person has entered that information and yet you still allow that app to be downloaded.” We have begged and pleaded with them to stop and they will not stop. I am not sure that that can be included in the Bill, but if it could be, it would be powerful.

If Apple and Google could not distribute any of our apps—Hinge, Match, Tinder—to anyone under the age of 18, that solves it right there. It is the same methodology that has been used at clubs with bouncers—you have a bouncer at the door who makes sure you are 21 before you go in and have a drink. It should be the same thing with these technology platforms. If they are going to distribute and have these app stores, the store should then have rules that show age-gated apps—“This is for 17-plus or 18-plus”—and should also enforce that. It is very unfortunate that our calls on this front have gone unanswered. If the Bill could be modified to include that, it would really help to address the issue.

Dr Rachel O'Connell: Absolutely. I 100% support that. There is a tendency for people to say, “It is very complex. We need a huge amount of further consultation.” I started my PhD in 1996. This stuff has been going on for all that time. In 2008, there was a huge push by the Attorneys General, which I mentioned already, which brought all of the industry together. That was 2008. We are in 2022 now. 2017 was the Internet Safety Strategy Green Paper. We know what the risks are. They are known; we understand what they are. We understand the systems and processes that facilitate them. We understand what needs to be done to mitigate those risks and harms. Let’s keep on the track that we are going on.

Regarding industry’s concerns, a lot of them will be ironed out when companies are required to conduct risk assessments and impact assessments. They might ask, what are the age bands of your users? What are the risks associated with the product features that you are making available? What are the behaviour modification techniques that you are using, like endless scroll and loot boxes that get kids completely addicted? Are those appropriate for those ages? Then you surface the decision making within the business that results in harms and also the mitigations.

I urge you to keep going on this; do not be deterred from it. Keep the timeframe within which it comes into law fairly tight, because there are children out there who are suffering. As for the harassment—I have experienced it myself; it is horrible.

Those would be my final words.

--- Later in debate ---
Kim Leadbeater

Q Thank you for your very powerful testimony, Rhiannon. I appreciate that could not have been easy. Going back to the digital literacy piece, it feels like we were talking about digital literacy in the Bill when it started coming through, and that has been removed now. How important do you think it is that we have a digital literacy strategy, and that we hold social media providers in particular to having a strategy on digital education for young people?

Rhiannon-Faye McDonald: It is incredibly important that we have this education piece. Like Susie said, we cannot rely on technology or any single part of this to solve child sexual abuse, and we cannot rely on the police to arrest their way out of the problem. Education really is the key. That is education in all areas—educating the child in an appropriate way and educating parents. We hold parenting workshops. Parents are terrified; they do not know what to do, what platforms are doing what, or what to do when things go wrong. They do not even know how to talk to children about the issue; it is embarrassing for them and they cannot bring it up. Educating parents is a huge thing. Companies have a big responsibility there. They should have key strategies in place on how they are going to improve education.

Chris Philp

Q Can I start by thanking both Rhiannon-Faye and Susie for coming and giving evidence, and for all the work they are doing in this area? I know it has been done over many years in both cases.

I would like to pick up on a point that has arisen in the discussion so far—the point that Susie raised about the risks posed by Meta introducing end-to-end encryption, particularly on the Facebook Messenger service. You have referenced the fact that huge numbers of child sexual exploitation images are identified by scanning those communications, leading to the arrests of thousands of paedophiles each year. You also referenced the fact that when this was temporarily turned off in Europe owing to the privacy laws there—briefly, thankfully—there was a huge loss of information. We will come on to the Bill in a minute, but as technology stands now, if Meta did proceed with end-to-end encryption, would that scanning ability be lost?

Susie Hargreaves: Yes. It would not affect the Internet Watch Foundation, but it would affect the National Centre for Missing and Exploited Children. Facebook, as a US company, has a responsibility to do mandatory reporting to NCMEC, which will be brought in with the Bill in this country. Those millions of images would be lost, as of today, if they brought end-to-end encryption in now.

Chris Philp

Q Why would it not affect the Internet Watch Foundation?

Susie Hargreaves: Because they are scanning Facebook—sorry, I am just trying to unpack the way it works. It will affect us, actually. When we provide our hash list to Facebook, it uses that list to scan Messenger, but the actual images that are found—the matches—are reported not to us but to NCMEC. For those of you who do not know about hashing, a hash list is a list of digital fingerprints of unique child sexual abuse images; we currently hold about 1.3 million of them. So yes, it does affect us: Facebook would still take our hash list to use on other platforms, but it would not use it on Messenger, and because the matches go to NCMEC, we do not know how many matches Facebook gets against our hash list.
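
To make the mechanism concrete for readers who are new to it, the sketch below shows hash-list matching in its simplest form. It is illustrative only: it uses exact SHA-256 hashes, whereas deployed systems such as PhotoDNA use perceptual hashes that still match after resizing or re-encoding, and every name in it is hypothetical rather than drawn from any real system.

    import hashlib

    # Illustrative sketch only: exact-match hashing. Real deployments use
    # perceptual hashing so that re-encoded copies of an image still match.

    def fingerprint(image_bytes: bytes) -> str:
        """Compute a digital fingerprint (hash) of an image file."""
        return hashlib.sha256(image_bytes).hexdigest()

    def build_hash_list(known_abuse_images: list[bytes]) -> set[str]:
        """The list shared with platforms contains only hashes, never images."""
        return {fingerprint(img) for img in known_abuse_images}

    def scan_upload(uploaded: bytes, hash_list: set[str]) -> bool:
        """A platform checks each upload against the hash list; a match is
        reported onward (in Facebook's case, to NCMEC), not back to the
        organisation that supplied the list."""
        return fingerprint(uploaded) in hash_list

The design point Ms Hargreaves makes falls out of this structure: the list provider only ever distributes fingerprints, so it never learns how many matches a platform finds unless the platform reports back.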

Chris Philp

Q But its ability to check images going across Messenger against your list would effectively terminate.

Susie Hargreaves: Yes, sorry—I was unclear about that. Yes, it would on Messenger.

Chris Philp

Q Clearly the Bill cannot compel the creation of technology that does not exist yet. It is hoped that there will be technology—we heard evidence earlier suggesting that it is very close to existing—that allows scanning in an end-to-end encrypted environment. Do you have any update on that that you can give the Committee? If there is no such technology, how do you think the Bill should address that? Effectively there would be a forced choice between end-to-end encryption and scanning for CSEA content.

Susie Hargreaves: As I said before, it is essential that we do not demonise end-to-end encryption. It is really important. There are lots of reasons why, from a security and privacy point of view, people want to be able to use end-to-end encryption.

In terms of whether the technology is there, we all know that there are things on the horizon. As Ian said in the previous session, the technology is there and is about to be tried out. I cannot give any update at this meeting, but in terms of what we would do if end-to-end encryption is introduced and there is no ability to scan, we could look at on-device scanning, which I believe you mentioned before, Minister.

Chris Philp

Yes.

Susie Hargreaves: That is an option. That could be a backstop position. I think that, at the moment, we should stand our ground on this and say, “No, we need to ensure that we have some form of scanning in place if end-to-end encryption is introduced.”

Chris Philp

Q For complete clarity, do you agree that the use of end-to-end encryption cannot be allowed at the expense of child safety?

Susie Hargreaves: I agree 100%.

Chris Philp

Good. Thank you.

The Chair

Thank you very much indeed, Ms McDonald and Ms Hargreaves. We are most grateful to you; thank you for your help.

Examination of Witnesses

Ellen Judson and Kyle Taylor gave evidence.

--- Later in debate ---
Barbara Keeley

Q I have a really simple question. You have touched on the balance between free speech rights and the rights of people who are experiencing harassment, but does the Bill do enough to protect human rights?

Ellen Judson: At the moment, no. The rights that are discussed in the Bill at the minute are quite limited: primarily, it is about freedom of expression and privacy, and the way that protections around privacy have been drafted is less strong than for those around freedom of expression. Picking up on the question about setting precedents, if we have a Bill that is likely to lead to more content moderation and things like age verification and user identity verification, and if we do not have strong protections for privacy and anonymity online, we are absolutely setting a bad precedent. We would want to see much more integration with existing human rights legislation in the Bill.

Kyle Taylor: All I would add is that if you look at the exception for content of democratic importance, and the idea of “active political issue”, right now, conversion therapy for trans people—that has been described by UN experts as torture—is an active political issue. Currently, the human rights of trans people are effectively set aside because we are actively debating their lives. That is another example of how minority and marginalised people can be negatively impacted by this Bill if it is not more human rights-centred.

Chris Philp

Q Let me start with this concept—this suggestion, this claim—that there is special protection for politicians and journalists. I will come to clause 50, which is the recognised news publisher exemption, in a moment, but I think you are referring to clauses 15 and 16. If we turn to those clauses and read them carefully, they do not specifically protect politicians and journalists, but “content of democratic importance” and “journalistic content”. It is about protecting the nature of the content, not the person who is speaking it. Would you accept that?

Ellen Judson: I accept that that is what the Bill currently says. Our point was thinking about how it will be implemented in practice. If platforms are expected to prove to a regulator that they are taking certain steps to protect content of democratic importance—in the explanatory notes, that is content related to Government policy and political parties—and they are expected to prove that they are taking a special consideration of journalistic content, the most straightforward way for them to do that will be in relation to journalists and politicians. Given that it is such a broad category and definition, that seems to be the most likely effect of the regime.

Kyle Taylor: It is potentially—

Chris Philp

Q Sorry, Kyle, do come in in a second, but I just want to come back on that point.

Is it not true that a member of the public or anyone debating a legitimate political topic would also benefit from these measures? It is likely that MPs would benefit near automatically, but a member of the public might equally benefit if the topic they are talking about is of democratic or journalistic importance.

Ellen Judson: Our concern is that defining what is a legitimate political debate is itself already privileging certain speech. As you said, an MP is very likely to benefit automatically.

Chris Philp

Well, it is likely; I would not say it is guaranteed.

Ellen Judson: A member of the public may be discussing something—for example, an active political debate that is not about the United Kingdom, which I believe would be out of scope of that protection. They would be engaged in political discussion and exercising freedom of expression, and if they were not doing so in a way that met the threshold for action based on harm, their speech should also come under those protections.

Kyle Taylor: I would add that the way in which you have described it would be so broad as to effectively be meaningless in the context of the Bill, and that instead we should be looking for universal free expression protections in that part of the Bill, and removing this provision. Because what is not, in a liberal democracy, speech of democratic importance? Really, that is everything. When does it reach the threshold where it is an active political debate? Is it when enough people speak about it or enough politicians bring it up? It is so subjective and so broad as effectively to mean that everything could qualify. Again, this is not taking a harms-based approach to online safety, because the question is not “Who is saying it?” or “In what context?”; the question is, “Does this have the propensity to cause harm at scale?”

Chris Philp

Q The harms are covered elsewhere in the Bill. This is saying what you have to take into account. In fact, at the very beginning of your remarks, Kyle, you said that some of the stuff in the US a week or two ago might have been allowed to stand under these provisions, but the provision does not provide an absolute protection; it simply says that the provider has to take it into account. It is a balancing exercise. Other parts of the Bill say, “You’ve got to look at the harm on a systemic basis.” This is saying, “You’ve got to take into account whether the content is of democratic or journalistic importance.” You made a point a second ago about general protection on free speech, which is in clause 19(2).

Kyle Taylor: Can I respond to that?

Chris Philp

Yes, sure.

Kyle Taylor: My point is that if there is a provision in the Bill about freedom of expression, it should be robust enough that this protection does not have to be in the Bill. To me, this is saying, “Actually, our free expression bit isn’t strong enough, so we’re going to reiterate it here in a very specific context, using very select language”. That may mean that platforms decide not to act for fear of reprisal, as opposed to pursuing online safety. I suggest strengthening the freedom of expression section so that it hits all the points that the Government intend to hit, and removing those qualifiers that create loopholes and uncertainty for a regime that, if it is systems-based, does not have loopholes.

Chris Philp

Q I understand the point you are making, logically. Someone mentioned the human rights element earlier. Of course, article 10 of the European convention on human rights expresses the right to freedom of speech. The case law deriving from that ECHR article provides an enhanced level of protection for freedom of the press relative to other forms of expression, so there is established case law which makes that point. You were talking about human rights earlier, weren’t you?

Ellen Judson: We absolutely recognise that. There is discussion in terms of meeting certain standards of responsible journalism in relation to those protections. Our concern is very much that the people and actors who would most benefit from the journalistic protections specifically would be people who do not meet those standards and cannot prove that they meet those standards, because the standards are very broad. If you intend your content to be journalistic, you are in scope, and that could apply to extremists as much as to people meeting standards of responsible journalism.

Chris Philp

Q If you are talking about clause 16, it is not that you intend it to be journalistic content; it is that it is journalistic content. You might be talking about clause 50, which is the general exemption for recognised news publishers from the provisions of the Bill. That of course does not prevent social media platforms from choosing to apply their terms and conditions to people who are recognised news publishers; it is just that the Bill is not compelling them. It is important to make that clear—that goes back to the point you made right at the beginning, Kyle. A couple of times in your testimony so far, you have said that you think the way the definition of “recognised news publisher” is drafted in clause 50 is too wide, and potentially susceptible to, basically, abuse by people who are in essence pretending to be news publishers, but who are not really. They are using this as a way to get a free pass from the provisions of the Bill. I completely understand that concern. Do you have any specific suggestions for the Committee about how that concern might be addressed? How could we change the drafting of the Bill to deal with that issue?

Kyle Taylor: Remove the exemption.

Chris Philp

Q You mean completely? Just delete it?

Kyle Taylor: Well, I am struggling to understand how we can look at the Bill and say, “If this entity says it, it is somehow less harmful than if this entity says it.” That is a two-tiered system and that will not lead to online safety, especially when those entities that are being given privilege are the most likely and largest sources and amplifiers of harmful content online. We sit on the frontlines of this every day, looking at social media, and we can point to countless examples from around the world that will show that, with these exemptions, exceptions and exclusions, you will actually empower those actors, because you explicitly say that they are special. You explicitly say that if they cause harm, it is somehow not as bad as if a normal user with six followers on Twitter causes harm. That is the inconsistency and incoherency in the Bill.

Chris Philp

We are talking here about the press, not about politicians—

Kyle Taylor: Yes, but the press and media entities spread a lot of disinformation—

Chris Philp

Q I get that. You have mentioned Victor Orbán and the press already in your comments. There is a long-standing western tradition of treating freedom of the press as something that is sacrosanct and so foundational to the functioning of democracy that you should not infringe or impair it in any way. That is the philosophy that underpins this exclusion.

Kyle Taylor: Except that that is inconsistent in the Bill, because you are saying that for broadcast, they must have a licence, but for print press, they do not have to subscribe to an independent standards authority or code. Even within the media, there is this inconsistency within the Bill.

Chris Philp

That is a point that applies regardless of the Bill. The fact is that UK broadcast is regulated whereas UK newspapers are not regulated, and that has been the case for half a century. You can debate whether that is right or wrong, but—

Kyle Taylor: We are accepting that newspapers are not regulated then.

Chris Philp

Q That matter stands outside the scope of the Bill. If one was minded to tighten this up—I know that you have expressed a contrary view to the thing just being deleted—and if you were to accept that the freedom of the press is something pretty sacrosanct, but equally you don’t want it to be abused by people using it as a fig leaf to cover malfeasant activity, do you have any particular suggestions as to how we can improve the drafting of that clause?

Kyle Taylor: I am not suggesting that the freedom of the press is not sacrosanct. Actually, I am expressing the opposite, which is that I believe that it is so sacrosanct that it should be essential to the freedom-of-expression portion of the Bill, and that the press should be set to a standard that meets international human rights and journalistic standards. I want to be really clear that I absolutely believe in freedom of the press, and it is really important that we don’t leave here suggesting that we don’t think that the press should be free—

Chris Philp

Q I got that, but as I say, article 10 case law does treat the press a little differently. We are about to run out of time. I wanted to ask about algorithms, which I will probably not have a chance to do, but are there any specific changes to the clause that you would urge us to make?

Ellen Judson: To the media exemption—

Chris Philp

To clause 50, “Recognised news publisher”.

Ellen Judson: One of the changes that the Government have indicated that they are minded to make—please correct me if I misunderstood—is to introduce a right to appeal.

Chris Philp

Correct.

Ellen Judson: I would very much urge that content not be required to stay online while an appeal is taking place, on the grounds that the content might then be found to be incredibly harmful, and by the time you have got through the appeals process, it will already have done the damage it was going to do. So, if there is a right to appeal—I would urge there not to be a particular right to appeal beyond what is already in the Bill—but if that is to be included, it would be important not to have a requirement that platforms must carry the content while the appeal process is ongoing.

Kyle Taylor: You could require an independent standards code as a benchmark at least.

The Chair

Order. I am afraid that brings us to the end of the time allotted for the Committee to ask questions. It also brings us to the end of the day’s sitting. On behalf of the Committee, I thank the witnesses for your evidence. As you ran out of time and the opportunity to frame answers, if you want to put them in writing and offer them to the Minister, I am sure they will be most welcome. The Committee will meet again on Thursday at 11.30 am in this room to hear further evidence on the Bill.

Ordered, That further consideration be now adjourned. —(Steve Double.)

Online Safety Bill (Third sitting)

Chris Philp Excerpts
Committee stage & Committee Debate - 3rd sitting
Thursday 26th May 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 26 May 2022
The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

Q Thank you to the witnesses for joining us and giving us such thorough and clear responses to the various questions. I want to start on a topic that William Perrin and William Moy touched on—the exemption for recognised news publishers, set out in clause 50. You both said you have some views on how that is drafted. As you said, I asked questions on Tuesday about whether there are ways in which it could be improved to avoid loopholes—not that I am suggesting there are any, by the way. Mr Perrin and Mr Moy, could you elaborate on the specific areas where you think it might be improved?

William Moy: Essentially, the tests are such that almost anyone could pass them. Without opening the Bill, you have to have a standards code, which you can make up for yourself, a registered office in the UK and so on. It is not very difficult for a deliberate disinformation actor to pass the set of tests in clause 50 as they currently stand.

Chris Philp

Q How would you change it to address that, if you think it is an issue?

William Moy: This would need a discussion. I have not come here with a draft amendment—frankly, that is the Government’s job. There are two areas of policy thinking over the last 10 years that provide the right seeds and the right material to go into. One is the line of thinking that has been done about public benefit journalism, which has been taken up in the House of Lords Communications and Digital Committee inquiry and the Cairncross review, and is now reflected in recent Charity Commission decisions. Part of Full Fact’s charitable remit is as a publisher of public interest journalism, which is a relatively new innovation, reflecting the Cairncross review. If you take that line of thinking, there might be some useful criteria in there that could be reflected in this clause.

I hate to mention the L-word in this context, but the other line of thinking is the criteria developed in the context of the Leveson inquiry for what makes a sensible level of self-regulation for a media organisation. Although I recognise that that is a past thing, there are still useful criteria in that line of thinking, which would be worth thinking about in this context. As I said, I would be happy to sit down, as a publisher of journalism, with your officials and industry representatives to work out a viable way of achieving your political objectives as effectively as possible.

William Perrin: Such a definition, of course, must satisfy those who are in the industry, so I would say that these definitions need to be firmly industry-led, not simply by the big beasts—for whom we are grateful, every day, for their incredibly incisive journalism—but by this whole spectrum of new types of news providers that are emerging. I have mentioned my experience many years ago of explaining what a blog was to DCMS.

The news industry is changing massively. I should declare an interest: I was involved in some of the work on public-benefit journalism in another capacity. We have national broadcasters, national newspapers, local papers, local broadcasters, local bloggers and local Twitter feeds, all of which form a new and exciting news media ecosystem, and this code needs to work for all of them. I suppose that you would need a very deep-dive exercise with those practitioners to ensure that they fit within this code, so that you achieve your policy objective.

Chris Philp

Q Okay, thank you. I am not sure that I can take anything specific away from that. Perhaps that illustrates the difficulty of legislating. The clause, as drafted, represents the best efforts thus far to deal with an obviously difficult and complicated issue.

We heard some commentary earlier—I think from Mr Moy—about the need to address misinformation, particularly in the context of a serious situation such as the recent pandemic. I think you were saying that there was a meeting, in March or April 2020, for the then Secretary of State and social media firms to discuss the issue and what steps they might take to deal with it. You said that it was a private meeting and that it should perhaps have happened more transparently.

Do you accept that the powers conferred in clause 146, as drafted, do, in fact, address that issue? They give the Secretary of State powers, in emergency situations—a public health situation or a national security situation, as set out in clause 146(1)—to address precisely that issue of misinformation in an emergency context. Under that clause, it would happen in a way that was statutory, open and transparent. In that context, is it not a very welcome clause?

William Moy: I am sorry to disappoint you, Minister, but no, I do not accept that. The clause basically attaches to Ofcom’s fairly weak media literacy duties, which, as we have already discussed, need to be modernised and made harms-based and safety-based.

However, more to the point, the point that I was trying to make is that we have normalised a level of censorship that was unimaginable in previous generations. A significant part of the pandemic response was, essentially, some of the main information platforms in all of our day-to-day lives taking down content in vast numbers and restricting what we can all see and share. We have started to treat that as a normal part of our lives, and, as someone who believes that the best way to inform debate in an open society is freedom of expression, which I know you believe, too, Minister, I am deeply concerned that we have normalised that. In fact, you referred to it in your Times article.

I think that the Bill needs to step in and prevent that kind of overreach, as well as the triggering of unneeded reactions. In the pandemic, the political pressure was all on taking down harmful health content; there was no countervailing pressure to ensure that the systems did not overreach. We therefore found ridiculous examples, such as police posts warning of fraud around covid being taken down by the internet companies’ automated systems because those systems were set to, essentially, not worry about overreach.

That is why we are saying that we need, in the Bill, a modern, open-society approach to misinformation. That starts with it recognising misinformation in the first place. That is vital, of course. It should then go on to create a modern, harms-based media literacy framework, and to prefer content-neutral and free-speech-based interventions over content-restricting interventions. That was not what was happening during the pandemic, and it is not what will happen by default. It takes Parliament to step in and get away from this habitual, content-restriction reaction and push us into an open-society-based response to misinformation.

William Perrin: Can I just add that it does not say “emergency”? It does not say that at all. It says “reasonable grounds” that “present a threat”—not a big threat—under “special circumstances”. We do not know what any of that means, frankly. With this clause, I get the intent—that it is important for national security, at times, to send messages—but this has not been done in the history of public communication before. If we go back through 50 or 60 years, even 70 years, of Government communication, the Government have bought adverts and put messages transparently in place. Apart from D-notices, the Government have never sought to interfere in the operations of media companies in quite the way that is set out here.

If this clause is to stand, it certainly needs a much higher threshold before the Secretary of State can act—such as who they are receiving advice from. Are they receiving advice from directors of public health, from the National Police Chiefs’ Council or from the national security threat assessment machinery? I should declare an interest; I worked in there a long time ago. It needs a higher threshold and greater clarity, but you could dispense with this by writing to Ofcom and saying, “Ofcom, you should have regard to these ‘special circumstances’. Why don’t you take actions that you might see fit to address them?”

Many circumstances, such as health or safety, are national security issues anyway if they reach a high enough level for intervention, so just boil it all down to national security and be done with it.

Professor Lorna Woods: If I may add something about the treatment of misinformation more generally, I suspect that if it is included in the regime, or if some subset such as health misinformation is included in the regime, it will be under the heading of “harmful to adults”. I am picking up on the point that Mr Moy made that the sorts of interventions will be more about friction and looking at how disinformation is incentivised and spread at an earlier stage, rather than reactive takedown.

Unfortunately, the measures that the Bill currently envisages for “harmful but legal” seem to focus more on the end point of the distribution chain. We are talking about taking down content and restricting access. Clause 13(4) gives the list of measures that a company could employ in relation to priority content harmful to adults.

I suppose that you could say, “Companies are free to take a wider range of actions”, but my question then is this: where does it leave Ofcom, if it is trying to assess compliance with a safety duty, if a company is doing something that is not envisaged by the Act? Consider, for example, taking bot networks offline, if that is thought a key factor in the spreading of disinformation—I see that Mr Moy is nodding. A rational response might be, “Let’s get rid of bot networks”, but that, as I read it, does not seem to be envisaged by clause 13(4).

I think that is an example of a more general problem. With “harmful but legal”, we would want to see less emphasis on takedown and more emphasis on friction, but the measures listed as envisaged do not go that far up the chain.

The Chair

Minister, we have just got a couple of minutes left, so perhaps this should be your last question.

Chris Philp

Q Yes. On clause 13(4), the actions listed there are quite wide, given that they include not just “taking down the content”—as set out in clause 13(4)(a)—but also

“(b) restricting users’ access to the content;

(c) limiting the recommendation or promotion of the content;

(d) recommending or promoting the content.”

I would suggest that those actions are pretty wide, as drafted.

One of the witnesses—I think it was Mr Moy—talked about what were essentially content-agnostic measures to impede virality, and used the word “friction”. Can you elaborate a little bit on what you mean by that in practical terms?

William Moy: Yes, I will give a couple of quick examples. WhatsApp put a forwarding limit on WhatsApp messages during the pandemic. We knew that WhatsApp was a vector through which misinformation could spread, because forwarding is so easy. They restricted it to, I think, six forwards, and then you were not able to forward the message again. That is an example of friction. Twitter has a note whereby if you go to retweet something but you have not clicked on the link, it says, “Do you want to read the article before you share this?” You can still share it, but it creates that moment of pause for people to make a more informed decision.
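
To make that concrete, here is a minimal sketch of content-neutral friction as a per-message forward counter. The limit of six follows Mr Moy's recollection, and the names and structure are hypothetical rather than any platform's real implementation.

    from dataclasses import dataclass
    from typing import Optional

    FORWARD_LIMIT = 6  # per Mr Moy's recollection; the real limit may differ

    @dataclass
    class Message:
        body: str
        forward_count: int = 0  # times this message has already been forwarded

    def forward(message: Message) -> Optional[Message]:
        """Content-neutral friction: the message body is never inspected,
        so virality is slowed without restricting what may be said."""
        if message.forward_count >= FORWARD_LIMIT:
            return None  # refuse further forwarding once the limit is hit
        return Message(message.body, message.forward_count + 1)

Twitter's read-before-retweet prompt is the same pattern: a pause inserted before sharing, with the user still free to proceed.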

Chris Philp

Q Thank you. Would you accept that the level of specificity that you have just outlined there is very difficult, if not impossible, to put in a piece of primary legislation?

William Moy: But that is not what I am suggesting you do. I am suggesting you say that this Parliament prefers interventions that are content-neutral or free speech-based, and that inform users and help them make up their own minds, to interventions that restrict what people can see and share.

Chris Philp

Q But a piece of legislation has to do more than express a preference; it has to create a statutory duty. I am just saying that that is quite challenging in this context.

William Moy: I do not think it is any more challenging than most of the risk assessments, codes of practice and so on, but I am willing to spend as many hours as it takes to talk through it with you.

The Chair

Order. I am afraid that we have come to the end of our allotted time for questions. On behalf of the Committee, I thank the witnesses for all their evidence.

Examination of Witnesses

Danny Stone MBE, Stephen Kinsella OBE and Liron Velleman gave evidence.

--- Later in debate ---
The Chair

Would any other witness like to contribute? No.

Chris Philp

Q Thank you again to the witnesses for joining us this morning. I will start with Stephen Kinsella. You have spoken already about some of the issues to do with anonymity. Can you share with the Committee your view on the amendments made to the Bill, when it was introduced a couple of months ago, to give users choices over self-verification and the content they see? Do you think they are useful and helpful updates to the Bill?

Stephen Kinsella: Yes. We think they are extremely helpful. We welcome what we see in clause 14 and clause 57. There is thus a very clear right to be verified, and an ability to screen out interactions with unverified accounts, which is precisely what we asked for. The Committee will be aware that we have put forward some further proposals. I would really hesitate to describe them as amendments; I see them as shading-in areas—we are not trying to add anything. We think that it would be helpful, for instance, when someone is entitled to be verified, that verification status should also be visible to other users. We think that should be implicit, because it is meant to act as a signal to others as to whether someone is verified. We hope that would be visible, and we have suggested the addition of just a few words into clause 14 on that.

We think that the Bill would benefit from a further definition of what it means by “user identity verification”. We have put forward a proposal on that. It is such an important term that I think it would be helpful to have it as a defined term in clause 189. Finally, we have suggested a little bit more precision on the things that Ofcom should take into account when dealing with platforms. I have been a regulatory lawyer for nearly 40 years, and I know that regulators often benefit from having that sort of clarity. There is going to be negotiation between Ofcom and the platforms. If Ofcom can refer to a more detailed list of the factors it is supposed to take into account, I think that will speed the process up.

One of the reasons we particularly welcomed the structure of the Bill is that there is no wait for detailed codes of conduct because these are duties that we will be executing immediately. I hope Ofcom is working on the guidance already, but the guidance could come out pretty quickly. Then there would be the process of—maybe negotiating is the wrong word—to-and-fro with the platforms. I would be very reluctant to take too much on trust. I do not mean on trust from the Government; I mean on trust from the platforms—I saw the Minister look up quickly then. We have confidence in Government; it is the platforms we are a little bit wary of. I heard the frustration expressed on Tuesday.

Alex Davies-Jones

indicated assent.

Stephen Kinsella: I think you said, “If platforms care about the users, why aren’t they already implementing this?” Another Member, who is not here today, said, “Why do they have to be brought kicking and screaming?” Yet, every time platforms were asked, we heard them say, “We will have to wait until we see the detail of—”, and then they would fill in whatever thing is likely to come last in the process. So we welcome the approach. Our suggestions are very modest and we are very happy to discuss them with you.

--- Later in debate ---
Chris Philp

Q Yes, and thank you for the work that you have done on this issue, together with Siobhan Baillie, my hon. Friend the Member for Stroud, which the Government adopted. Some of the areas that you have referred to could be dealt with in subsequent Ofcom codes of practice, but we are certainly happy to look at your submissions. Thank you for the work that you have done in this area.

Danny, we have had some fairly extensive discussions on the question of small but toxic platforms such as 4chan and BitChute—thank you for coming to the Department to discuss them. I heard your earlier response to the shadow Minister, but do you accept that those platforms should be subject to duties in the Bill in relation to content that is illegal and content that is already harmful to children?

Danny Stone: Yes, that is accurate. My position has always been that that is a good thing. The extent and the nature of the content that is harmful to adults on such platforms—you mentioned BitChute but there are plenty of others—require an additional level of regulatory burden and closer proximity to the regulator. Those platforms should have to account for it and say, “We are the platforms; we are happy that this harm is on our platform and”—as the Bill says—“we are promoting it.” You are right that it is captured to some degree; I think it could be captured further.

Chris Philp

Q I understand; thank you. Liron, in an earlier answer, you referred to the protections for content of democratic importance and journalistic content, which are set out in clauses 15 and 16. You suggested, with some concern, that they could act as a bar to proper enforcement against hateful, prohibited or even illegal speech. Do you accept that clauses 15 and 16 do not provide an absolute protection for content of democratic importance or journalistic content, and that they do not exempt such content from the Bill’s provisions? They simply say that in discharging duties under the Bill, operators must use

“proportionate systems and processes…to ensure that…content of democratic”—

or journalistic—

“importance is taken into account”.

That is not an absolute protection; it is simply a requirement to take into account and perform a proportionate and reasonable balancing exercise. Is that not reasonable?

Liron Velleman: I have a couple of things to say on that. First, we and others in civil society have spent a decade trying to de-platform some of the most harmful actors from mainstream social media companies. What we do not want to see after the Bill becomes an Act are massive test cases where we do not know which way they will go and where it will be up to either the courts or social media companies to make their own decisions on how much regard they place in those exemptions at the same time as all the other clauses.

Secondly, one of our main concerns is the time it takes for some of that content to be removed. If we have a situation in which there is an expedited process for complaints to be made, and for journalistic content to remain on the platform for an announced time until the platform is able to take it down, that could move far outside the realms of that journalistic or democratically important content. Again, using the earlier examples, it does not take long for content such as a livestream of a terrorist attack to be up on the Sun or the Daily Mirror websites, and for lots of people to modify that video to bypass content moderation; it can then be shared and used to recruit new terrorists, allow copycat attacks to happen, and go into the worst sewers of the internet. Any friction placed on platforms’ ability to take down some of that harm is definitely of particular concern to us.

Finally, as we heard on Tuesday, social media platforms—I am not sure I would agree with much of what they would say about the Bill, but I think this is true—do not really understand what they are meant to do with these clauses. Some of them are talking about flowcharts and whether this is a point-scoring system that says, “You get plus one for being a journalist, but minus two for being a racist.” I am not entirely sure that platforms will exercise the same level of regard. If, with some of the better-faith actors in the social media space, we have successfully taken down huge reams of the most harmful content and moved it away from where millions of people can see it to where only tens of thousands can see it, we do not want in any way the potential to open up the risk that hundreds of people could argue that they should be back on platforms when they are currently not there.

Chris Philp

Q Okay, thank you. My last question touches on those issues and is for each of the panel in turn. Some people have claimed—I think wrongly—that the provisions in the Bill in some way threaten free speech. As you will have seen in the article I wrote in The Times earlier this week, I do not think, for a number of reasons, that that is remotely true, but I would be interested in hearing the views of each of the panel members on whether there is any risk to freedom of speech in the work that the Bill does in terms of protecting people from illegal content, harm to children and content that is potentially harmful to adults.

Danny Stone: My take on this—I think people have misunderstood the Bill—is that it ultimately creates a regulated marketplace of harm. As a user, you get to determine how harmful a platform you wish to engage with—that is ultimately what it does. I do not think that it enforces content take-downs, except in relation to illegal material. It is about systems, and in some places, as you have heard today, it should be more about systems, introducing friction, risk-assessing and showing the extent to which harm is served up to people. That has its problems.

The only other thing on free speech is that we sometimes take too narrow a view of it. People are crowded out of spaces, particularly minority groups. If I, as a Jewish person, want to go on 4chan, it is highly unlikely that I will get a fair hearing there. I will be threatened or bullied out of that space. Free speech has to apply across the piece; it is not limited. We need to think about those overlapping harms when it comes to human rights—not just free speech but freedom from discrimination. We need to be thinking about free speech in its widest context.

Chris Philp

Q Thank you. You made a very important point: there is nothing in the Bill that requires censorship or prohibition of content that is legal and harmless to children. That is a really important point.

Stephen Kinsella: I agree entirely with what Danny was saying. Of course, we would say that our proposals have no implications for free speech. What we are talking about is the freedom not to be shouted at—that is really what we are introducing.

On disinformation, we did some research in the early days of our campaign that showed that a vast amount of the misinformation and disinformation around the 5G covid conspiracy was spread and amplified by anonymous or unverified accounts, so they play a disproportionate role in disseminating that. They also play a disproportionate role in disseminating abuse, and I think you may have a separate session with Kick It Out and the other football bodies. They have some very good research that shows the extent to which abusive language is from unverified or anonymous accounts. So, no, we do not have any free speech concerns at Clean up the Internet.

Chris Philp

Q Good. Thank you, Stephen. Liron?

Liron Velleman: We are satisfied that the Bill adequately protects freedom of speech. Our key view is that, if people are worried that it does not, beefing up the universal protections for freedom of speech should be the priority, instead of what we believe are potentially harmful exemptions in the Bill. We think that freedom of speech for all should be protected, and we very much agree with what Danny said—that the Bill should be about enhancing freedom of speech. There are so many communities that do not use social media platforms because of the harm that exists currently on platforms.

On children, the Bill should not be about limiting freedom of speech, but a large amount of our work covers the growth of youth radicalisation, particularly in the far right, which exists primarily online and which can then lead to offline consequences. You just have to look at the number of arrests of teenagers for far-right terrorism, and so much of that comes from the internet. Part of the Bill is about moderating online content, but it definitely serves to protect against some of the offline consequences of what exists on the platform. We would hope that if people are looking to strengthen freedom of speech, it is as a universalist principle in the Bill, applying to everyone, not to some groups and not others.

Chris Philp Portrait Chris Philp
- Hansard - -

Good. Thank you. I hope the Committee is reassured by those comments on the freedom of speech question.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q I will use the small amount of time we have left to ask one question. A number of other stakeholders and witnesses have expressed concerns regarding the removal of a digital media literacy strategy from the Bill. What role do you see a digital media literacy strategy playing in preventing the kind of abuse that you have been describing?

Danny Stone: I think that a media literacy strategy is really important. There is, for example, UCL data on the lack of knowledge of the word “antisemitism”: 68% of nearly 8,000 students were unfamiliar with the term’s meaning. Dr Tom Harrison has discussed cultivating cyber-phronesis—this was also in an article by Nicky Morgan in the “Red Box” column some time ago—which is a method of building practical knowledge over time to make the right decisions when presented with a moral challenge. We are not well geared up as a society—I am looking at my own kids—to educate young people about their interactions, about what it means when they are online in front of that box and about to type something, and about what might be received back. I have talked about some of the harms people might be directed to, even through Alexa, but some kind of wider strategy, which goes beyond what is already there from Ofcom—during the Joint Committee process, the Government said that Ofcom already has its media literacy requirements—and which, as you heard earlier, updates it to make it more fit for purpose for the modern age, would be very appropriate.

Stephen Kinsella: I echo that. We also think that that would be welcome. When we talk about media literacy, we often find ourselves with the platforms throwing all the obligation back on to the users. Frankly, that is one of the reasons why we put forward our proposal, because we think that verification is quite a strong signal. It can tell you quite a lot about how likely it is that what you are seeing or reading is going to be true if someone is willing to put their name to it. Seeing verification is just one contribution. We are really talking about trying to build or rebuild trust online, because that is what is seriously lacking. That is a system and design failure in the way that these platforms have been built and allowed to operate.

Chris Philp Portrait Chris Philp
- Hansard - -

Q The shadow Minister’s question is related to the removal of what was clause 103 in the old draft of the Bill. As she said, that related to media literacy. Does the panel draw any comfort from three facts? First, there is already a media literacy duty on Ofcom under section 11 of the Communications Act 2003—the now deleted clause 103 simply provided clarification on an existing duty. Secondly, last December, after the Joint Committee’s deliberations, but before the updated Bill was published, Ofcom published its own updated approach to online media literacy, which set out that it was going to expand its media literacy programme beyond what was in the former clause 103. Finally, the Government also have their own media literacy strategy, which is being funded and rolled out. Do those three things—including, critically, Ofcom’s own updated guidance last December—give the panel comfort and confidence that media literacy is being well addressed?

Liron Velleman: If the Bill is seeking to make the UK the safest place to be on the internet, it seems the obvious place to put in something about media literacy. I completely agree with what Danny said earlier: we would also want to specifically ensure—although I am sure this already exists in some other parts of Ofcom and Government business—that there is much greater media literacy for adults as well as children. There are lots of conversations about how children understand use of the internet, but what we have seen, especially during the pandemic, is the proliferation of things like community Facebook groups—which used to be about bins, or a fair going on this weekend—becoming home to the worst excesses of harmful content. People have seen conspiracy theories, and that is where we have seen some of the big changes in how the far right and other hateful groups operate, in terms of being able to use some of those platforms. That is because of a lack of media literacy not just among children, but among the adult population. I definitely would encourage that being in the Bill, as well as anywhere else, so that we can remove some of those harms.

Danny Stone: I think it will need further funding, beyond what has already been announced, to ensure that there is proper provision. That might put a smile on the faces of some Department for Education officials, who looked so sad during some of the consultation process. If you are going to roll this out across the country and make it fit for purpose, it is going to cost a lot of money.

None Portrait The Chair
- Hansard -

Thank you. As there are no further questions from Members, I thank the witnesses for their evidence. That concludes this morning’s sitting.

Ordered, That further consideration be now adjourned. —(Steve Double.)

Online Safety Bill (Fourth sitting)

Chris Philp Excerpts
Committee stage & Committee Debate - 4th sitting
Thursday 26th May 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 26 May 2022
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q You have no concerns about that.

Stephen Almond: No.

Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - -

Q Mr Almond, welcome to the Committee. Thank you for joining us this afternoon. Can I start with co-operation? You mentioned a moment ago in answer to Maria Miller that co-operation between regulators, particularly, in this context, between the ICO and Ofcom, was going to be very important. Would you describe the co-operative work that is happening already and that you will be undertaking in the future, and comment on the role that the Digital Regulation Cooperation Forum has in facilitating that?

Stephen Almond: Thank you very much. I will start by explaining the Digital Regulation Cooperation Forum. It is a voluntary, not statutory, forum that brings together ourselves, Ofcom, the Competition and Markets Authority and the Financial Conduct Authority—some of the regulators with the greatest interest in digital regulation—to make sure that we have a coherent approach to the regulation of digital services in the interests of the public and indeed the economy.

We are brought together through our common interest. We do not require a series of duties or statutory frameworks to make us co-operate, because the case for co-operation is very, very clear. We will deliver better outcomes by working together and by joining up where our powers align. I think that is what you are seeing in practice in some of the work we have done jointly—for example, around the implementation of the children’s code alongside Ofcom’s implementation of the video-sharing platform regime. A joined-up approach to questions about, for example, how you assure the age of children online is really important. That gives me real confidence in reassuring the Committee that the ICO, Ofcom and other digital regulators will be able to take a very joined-up approach to regulating in the context of the new online safety regime.

Chris Philp Portrait Chris Philp
- Hansard - -

Q Thank you very much. That is extremely helpful. From the perspective of privacy, how satisfied are you that the Bill as constructed gives the appropriate protections to users’ privacy?

Stephen Almond: In our view, the Bill strikes an appropriate balance between privacy and online safety. The duties in the Bill should leave service providers in no doubt that they must comply with data protection law, and that they should guard against unwarranted intrusions into privacy. In my discussions with firms, I am very clear that this is not a trade-off between online safety and privacy: it is both. We are firmly expecting that companies take that forward and work out how they are going to adopt both a “privacy by design” and a “safety by design” approach to the delivery of their services. They must deliver both.

Chris Philp Portrait Chris Philp
- Hansard - -

Q Thank you. My final question is this: do you feel the Bill has been constructed in such a way that it works consistently with the data protection provisions, such as UK GDPR and the Data Protection Act 2018?

Stephen Almond: In brief, yes. We feel that the Bill has been designed to work alongside data protection law, for which we remain the statutory regulator, but with appropriate mechanisms for co-operation with the ICO—for example, the series of consultation duties that apply where codes of practice or guidance issued by Ofcom may have an impact on privacy. We think that is the best way of assuring regulatory coherence in this area.

Chris Philp Portrait Chris Philp
- Hansard - -

That is very helpful. Thank you very much indeed.

None Portrait The Chair
- Hansard -

Mr Almond, we are trying to get a pint into a half-pint pot doing this, so we are rushing a bit. If, when you leave the room, you have an “I wish I’d said that” moment, please feel free to put it in writing to us. We are indebted to you. Thank you very much indeed.

Examination of Witnesses

Sanjay Bhandari and Lynn Perry gave evidence.

--- Later in debate ---
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Should the Bill commit to that?

Lynn Perry: As a recommendation, we think that could only strengthen the protections of children.

Chris Philp Portrait Chris Philp
- Hansard - -

Q Picking up that last point about representation for particular groups of users, including children, Ms Perry, do you agree that the ability to designate organisations that can make super-complaints might be an extremely valuable avenue, in particular for organisations that represent user groups such as children? Organisations such as yours could get designated and then speak on behalf of children in a formal context. You could raise super-complaints with the regulator on behalf of the children you speak for. Is that something to welcome? Would it address the point made by my colleague, Kim Leadbeater, a moment ago?

Lynn Perry: We would welcome provision to be able to bring particularly significant evidence of concern. That is certainly something that organisations, large charities in the sector and those responsible for representing the rights of children and young people would welcome. On some of these issues, we work in coalition to make representations on behalf of children and young people, as well as of parents and carers, who also raise some concerns. The ability to do that and to strengthen the response is something that would be welcomed.

Chris Philp Portrait Chris Philp
- Hansard - -

Q I am glad you welcome that. I have a question for both witnesses, briefly. You have commented in some detail on various aspects of the Bill, but do you feel that the Bill as a whole represents a substantial step forward in protecting children, in your case, Ms Perry, and those you speak for, Sanjay?

Sanjay Bhandari: Our beneficiaries are under-represented or minority communities in sports. I agree: I think that the Bill goes a substantial way towards protecting them and dealing with some of the issues that we saw most acutely after the Euro 2020 finals.

We have to look at the Bill in context. This is revolutionary legislation, which we are not seeing anywhere else in the world. We are going first. The basic sanctions framework and the 10% fines I have seen working in other areas—anti-trust in particular. In Europe, that has a long history. The definition of harm being in the manner of dissemination will pick up pile-ons and some forms of trolling that we see a lot of. Hate crime being designated as priority illegal content is a big one for us, because it puts the proactive duty on the platforms. That too will take away quite a lot of content, we think. The new threatening communications offence we have talked about will deal with rape and death threats. Often the focus is on, quite rightly, the experience of black professional footballers, but there are also other people who play, watch and work in the game, including our female pundits and our LGBT fan groups, who also get loads of this abuse online. The harm-based offence—communications sent to cause harm without reasonable excuse—will likely cover things such as malicious tagging and other forms of trolling. I have already talked about the identification, verification and anonymity provisions.

I think that the Bill will go a substantial way. I am still interested in what fits into that residual category of content harmful to adults, but rather than enter into an arid philosophical and theoretical debate, I will take the spirit of the Bill and try to tag it to real content.

Chris Philp Portrait Chris Philp
- Hansard - -

Q Before I turn to Ms Perry with the same question about the Bill’s general effect, Sanjay, you mentioned the terrible abuse that the three England footballers received after the penalties last summer. Do you think the social media firms’ response to that incident was adequate, or anywhere close to adequate? If not, does that underline the need for this legislation?

Sanjay Bhandari: I do not think it was adequate because we still see stuff coming through. They have the greatest power to stop it. One thing we are interested in is improving transparency reporting. I have asked them a number of times, “Someone does not become a troll overnight, in the same way that someone does not become a heroin addict overnight, or commit an extremist act of terrorism overnight. There is a pathway where people start off, and you have that data. Can I have it?” I have lost count of the number of times that I have asked for that data. Now I want Ofcom to ask them for it.

Chris Philp Portrait Chris Philp
- Hansard - -

Q Yes. There are strong powers in the Bill for Ofcom to do precisely that. Ms Perry, may I ask you the same general question? Do you feel that the Bill represents a very substantial step forward in protecting children?

Lynn Perry: We do. Barnardo’s really welcomes the Bill. We think it is a unique and once-in-a-generation opportunity to achieve some really long-term changes to protect children from a range of online harms. There are some areas in which the Bill could go further, which we have talked about today. The opportunity that we see here is to make the UK the safest place in the world for children to be online. There are some very important provisions that we welcome, not least on age verification, the ability to raise issues through super-complaints, which you have asked me about, and the accountability in various places throughout the Bill.

Chris Philp Portrait Chris Philp
- Hansard - -

Q Thank you, Ms Perry. Finally, Mr Bhandari, some people have raised concerns about free speech. I do not share those concerns—in fact, I rebutted them in a Times article earlier this week—but does the Bill cause you any concern from a free-speech perspective?

Sanjay Bhandari: As I said earlier, there are no absolute rights. There is no absolute right to freedom of speech—I cannot shout “Fire!” here—and there is no absolute right to privacy; I cannot use my anonymity as a cloak for criminality. It is a question of drawing an appropriate balance. In my opinion, the Bill draws an appropriate balance between the right to freedom of speech and the right to privacy. I believe in both, but in the same way that I believe in motherhood and apple pie: of course I believe in them. It is really about the balancing exercise, and I think this is a sensible, pragmatic balancing exercise.

None Portrait The Chair
- Hansard -

Ms Perry, I am very pleased that we were finally able to hear from you. Thank you very much indeed—you have been very patient. Thank you very much, Mr Bhandari. If either of you, as a result of what you have heard and been asked today, have any further thoughts that you wish to submit, please do so.

Examination of Witnesses

Eva Hartshorn-Sanders and Poppy Wood gave evidence.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Thank you. Minister.

Chris Philp Portrait Chris Philp
- Hansard - -

Q Thank you for joining us this afternoon and for giving us your evidence so far. At the beginning of your testimony, Ms Hartshorn-Sanders, I think you mentioned—I want to ensure I heard correctly—that you believe, or have evidence, that Instagram is still, even today, failing to take down 90% of inappropriate content that is flagged to it.

Eva Hartshorn-Sanders: Our “Hidden Hate” report was on DMs—direct messages—that were shared by the participants in the study. One in 15 of those broke the terms and conditions that Instagram had set out in relation to misogynist abuse—sexual abuse. That was in the wake of the World Cup, so after Instagram had done a big promotion about how good its policies on these issues were going to be going forward. We found that 90% of that content was not acted on when we reported it. This was not even them going out proactively to find the content and not doing anything with it; it was raised for their attention, using their systems.

Chris Philp Portrait Chris Philp
- Hansard - -

Q That clearly illustrates the problem we have. Two parts of the Bill are designed to address this: first, the ability for designated user representation groups to raise super-complaints—an issue such as the one you just mentioned, a systemic issue, could be the subject of such a super-complaint to Ofcom, in this case about Instagram—and, secondly, at clause 18, the Bill imposes duties on the platforms to have proper complaints procedures, through which they have to deal with complaints properly. Do those two provisions, the super-complaints mechanism for representative groups and clause 18 on complaints procedures, go a long way towards addressing the issue that you helpfully and rightly identified?

Eva Hartshorn-Sanders: That will depend on transparency, as Poppy mentioned. How much of that information can be shared? We are doing research at the moment on data that is shared personally, or is publicly available through the different tools that we have. So it is about strengthening access to that data.

There is this information asymmetry that happens at the moment, where big tech is able to see patterns of abuse. In some cases, as in the misogyny report, you have situations where a woman might be subject to abuse from one person over and over again. The way that is treated in the EU is that Instagram will go back and look at the last 30 historically to see the pattern of abuse that exists. They are not applying that same rigour to other jurisdictions. So it is about having access to that data in the audits that take place. Everyone should be safe online, so this should be a safety-by-design feature that the companies have.

Chris Philp Portrait Chris Philp
- Hansard - -

Q Meta claimed in evidence to the Committee on Tuesday that it gave researchers good access to its data. Do you think that is true?

Eva Hartshorn-Sanders: I think it depends on who the researchers are. I personally do not have experience of it, so I cannot speak to that. On transparency, at the moment, the platforms generally choose what they share. They do not necessarily give you the data that you need. You can hear from my accent that I am originally from New Zealand. I know that in the wake of the Christchurch mosque terrorist attack, they were not prepared to provide the independent regulator with data on how many New Zealanders had seen the footage of the livestream, which had gone viral globally. That is inexcusable, really.

None Portrait The Chair
- Hansard -

Q Ms Wood, do you want to comment on any of this before we move on?

Poppy Wood: On the point about access to data, I do not believe that the platforms go as far as they could, or even as far as they say they do. Meta have a tool called CrowdTangle, which they use to provide data to certain researchers who are privileged enough to have access. That does not even include comments on posts; it is only the posts themselves. The platforms pull the rug out from under researchers all the time when they are investigating things that the platforms do not like. We saw that with Laura Edelson at New York University, who they just cut off—that is one of the most famous cases. I think it is quite egregious of Meta to say that they give lots of access to data.

We know from the revelations of whistleblowers that Meta do their own internal research, and when they do not like the results, they just bury it. They might give certain researchers access to data under certain provisions, but independent researchers who want to investigate a certain emergent harm or a certain problem are not being given the sort of access that they really need to get insights that move the needle. I am afraid that I just do not believe that at all.

The Bill could go much further. A provision on access to data in clause 136 states that Ofcom has two years to issue a report on whether researchers should get access to data. I think we know that researchers should have access to data, so I would, as a bare minimum, shorten the time that Ofcom has to do that report from two years to six months. You could turn that into a question of how to give researchers access to data rather than of whether they should get it. The Digital Services Act—the EU equivalent of the Bill—goes a bit further on access to data than our Bill. One result of that might be that researchers go to the EU to get their data because they can get it sooner.

Improving the Bill’s access to data provisions is a no-brainer. It is a good thing for the Government because we will see more stuff coming out of academia, and it is a good thing for the safety tech sector, because the more research is out there, the more tools can be built to tackle online harms. I certainly call on the Government to think about whether clause 136 could go further.

None Portrait The Chair
- Hansard -

Thank you. Last brief question, Minister.

Chris Philp Portrait Chris Philp
- Hansard - -

Goodness! There is a lot to ask about.

None Portrait The Chair
- Hansard -

Sorry, we are running out of time.

Chris Philp Portrait Chris Philp
- Hansard - -

Q I appreciate that; thank you, Sir Roger. Ms Wood, you mentioned misinformation in your earlier remarks—I say “misinformation” rather than “state-sponsored disinformation”, which is a bit different. It is very difficult to define that in statute and to have an approach that does not lead to bias or to what might be construed as censorship. Do you have any particular thoughts on how misinformation could be concretely and tangibly addressed?

Poppy Wood: It is not an easy problem to solve, for sure. What everybody is saying is that you do it in a content-neutral way, so that you are not talking about listing specific types of misinformation but about the risks that are built into your system and that need to be mitigated. This is a safety by design question. We have heard a lot about introducing more friction into the system, checking the virality threshold, and being more transparent. If you can get better on transparency, I think you will get better on misinformation.

If there is more of an obligation on the platforms to, first, do a broader risk assessment outside of the content that will be listed as priority content and, secondly, introduce some “harm reduction by design” mechanisms, through friction and stemming virality, that are not specific to certain types of misinformation, but are much more about safety by design features—if we can do that, we are part of the way there. You are not going to solve this problem straightaway, but you should have more friction in the system, be it through a code of practice or a duty somewhere to account for risk and build safer systems. It cannot be a content play; it has to be a systems play.

None Portrait The Chair
- Hansard -

Thank you. I am sorry, but that brings us to the end of the time allotted to this session. Ladies, if either of you wishes to make a submission in writing in the light of what you have not answered or not been able to answer, please do. Ms Wood, Ms Hartshorn-Sanders, thank you very much indeed for joining us.

Examination of Witnesses

Owen Meredith and Matt Rogerson gave evidence.

--- Later in debate ---
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q My only concern is that someone who just decides to call themselves a journalist will be able to say what they want.

Owen Meredith: I do not think that would be allowable under the Bill, because of the distinction between a recognised news publisher publishing what we would all recognise as journalistic content, versus the journalistic content exemption. I think that is why they are treated differently.

Chris Philp Portrait Chris Philp
- Hansard - -

Q Can I start by clarifying a comment that Owen Meredith made at the very beginning? You were commenting on where you would like the Bill to go further in protecting media organisations, and you said that you wanted there to be a wholesale exemption for recognised news publishers. I think there already is a wholesale exemption for recognised news publishers. The area where the Government have said they are looking at going further is in relation to what some people call a temporary “must carry” provision, or a mandatory right of appeal for recognised news publishers. Can I just clarify that that is what you meant?

Owen Meredith: Yes. I think the issue is how that exemption will work in practice. I think that what the Government have said they are looking at and will bring forward does address how it will operate in practice.

Chris Philp Portrait Chris Philp
- Hansard - -

Q Thank you. Can I move on to the question that Kim Leadbeater asked a moment ago, and that a number of Members have raised? You very kindly said a moment ago that you thought that clause 50, which sets out the definition of “recognised news publisher”, works as drafted. I would like to test that a bit, because some witnesses have said that it is quite widely drawn, and suggested that it would be relatively easy for somebody to set themselves up in a manner that met the test laid out in clause 50. Given the criticism that we have heard a few times today and on Tuesday, can you just expand for the Committee why you think that is not the case?

Owen Meredith: As I alluded to earlier, it is a real challenge to set out this legal definition in a country that believes, rightly, in the freedom of the press as a fourth pillar of democracy. It is a huge challenge to start with, and therefore we have to set out criteria that cover the vast majority of news publishers but do not end up with a backdoor licensing system for the press, which I think we are all keen to avoid. I think it meets that criterion.

On the so-called bad actors seeking to abuse that, I have listened to and read some of the evidence that you have had from others—not extensively, I must say, due to other commitments this week—and I think that it would be very hard for someone to meet all those criteria as set out in order to take advantage of this. I think that, as Matt has said, there will clearly be tests and challenges to that over time. It will rightly be challenged in court or go through the usual judicial process.

Matt Rogerson: It seems to me that the whole Bill will be an iterative process. The internet will not suddenly become safe when the Bill receives Royal Assent, so there will be this process whereby guidance and case law are developed, in terms of what a newspaper is, against the criteria. There are exemptions for news publishers in a whole range of other laws that are perfectly workable. I think that Ofcom is perfectly well equipped to create guidance that enables it to be perfectly workable.

Chris Philp Portrait Chris Philp
- Hansard - -

Q Thank you. So, notwithstanding the risks that we have heard articulated, you are categorically satisfied that maleficent actors would not be able to set themselves up in such a way that they benefit from this exemption?

Matt Rogerson: Subject to the guidance developed by Ofcom, which we will be engaged in developing, I do think so. The other thing to bear in mind is that the platforms already have lists of trusted publishers. For example, Google has a list in relation to Google News—I think it has about 65,000 publishers—which it automates to push through Google News as trusted news publishers. Similarly, Facebook has a list of trusted news publishers that it uses as a signal for the Facebook newsfeed. So I do not buy the idea that you can’t automate the use of trusted news sources within those products.

Chris Philp Portrait Chris Philp
- Hansard - -

Q Thank you; that is very helpful. I have only one other question. In relation to questions concerning freedom of speech, the Government believe, and I believe, that the Bill very powerfully protects freedom of speech. Indeed, it does so explicitly through clause 19, in addition to the protections for recognised news publishers that we have discussed already and the additional protections for content of journalistic and democratic importance, notwithstanding the definitional questions that have been raised. Would you agree that this Bill respects and protects free speech, while also delivering the safety objectives that it quite rightly has?

Owen Meredith: If I can speak to the point that directly relates to my members and those I represent, which is “Does it protect press freedom?”, which is perhaps an extension of your question, I would say that it is seeking to. Given the assurances you have given about the detailed amendments that you intend to bring forward—if those are correct, and I am very happy to write to the Committee and comment once we have seen the detail, if it would be helpful to do so—and everything I have heard about what you are intending to do, I believe it will. But I do not believe that the current draft properly and adequately protects press freedom, which is why, I think, you will be bringing forward amendments.

Chris Philp Portrait Chris Philp
- Hansard - -

Q Yes, but with the amendment committed to on Second Reading, you would say that the Bill does meet those freedom of speech objectives, subject to the detail.

Owen Meredith: Subject to seeing the drafting, but I believe the intention—yes.

Chris Philp Portrait Chris Philp
- Hansard - -

Thank you. That is very helpful. Mr Rogerson?

Matt Rogerson: As we know, this is a world first: regulation of the internet, regulation of speech acts on the internet. From a news publisher perspective, I think all the principles are right in terms of what the Government are trying to do. In terms of free speech more broadly, a lot of it will come down to how the platforms implement the Bill in practice. Only time will tell in terms of the guidance that Ofcom develops and how the platforms implement that at vast scale. That is when we will see what impact the Bill actually has in practice.

Chris Philp Portrait Chris Philp
- Hansard - -

Q From a general free speech perspective—which obviously includes the press’s freedom of speech, but everybody else’s as well—what do you think about the right enshrined in clause 19(2), where, for the first time ever, the platforms’ duty to have regard to the importance of protecting users’ right to freedom of speech is put on the face of a Bill? Do you think that is helpful? It is a legal obligation they do not currently have, but they will have it after the passage of the Bill. In relation to “legal but harmful” duties, platforms will also have an obligation to be consistent in the application of their own terms and conditions, which they do not have to be at the moment. Very often, they are not consistent; very often, they are arbitrary. Do you think those two changes will help general freedom of speech?

Matt Rogerson: Yes. With the development of the online platforms to the dominant position they are in today, that will be a big step forward. The only thing I would add is that, as well as this Bill, the other Bill that will make a massive difference when it comes through is the digital markets unit Bill. We need competition to Facebook so that consumers have a choice and so that they can decide which social network they want to be on, not just the one dominant social network that is available to them in this country.

Chris Philp Portrait Chris Philp
- Hansard - -

I commend your ingenuity in levering an appeal for more digital competition into this discussion. Thank you.

None Portrait The Chair
- Hansard -

One final quick question from the Opposition Front Bench.

--- Later in debate ---
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q I thank the witnesses for coming. In terms of regulation, I was going to ask whether you believe that Ofcom is the most suitable regulator to operate in this area. You have almost alluded to the fact that you might not. On that basis, should we specify in the Bill a duty for Ofcom to co-operate with other regulators—for example, the Competition and Markets Authority, the Financial Conduct Authority, Action Fraud or whoever else?

Tim Fassam: I believe that would be helpful. I think Ofcom is the right organisation to manage the relationship with the platforms, because it is going to be much broader than the topics we are talking about in our session, but we do think the FCA, Action Fraud and potentially the CMA should be able to direct, and be very clear with Ofcom, that action needs to be taken. Ofcom should have the ability to ask for things to be reviewed to see whether they break the rules.

The other area where we think action probably needs to be taken is where firms are under investigation, because the Bill assumes it is clear cut whether something is fraud, a scam, a breach of the regulations or not. In some circumstances, that can take six months or a year to establish through investigation. We believe that if, for example, the FCA feels that something is high risk, it should be able to ask Ofcom to suspend an advert, or a firm from advertising, pending an investigation to assess whether it is a breach of the regulation.

Rocio Concha: I agree that Ofcom is the right regulator, the main regulator, but it needs to work with the other regulators—with the FCA, ASA and CMA—to enforce the Bill effectively. There is another area. Basically, we need to make sure that Ofcom and all the regulators involved have the right resources. When the initial version of the Bill was published, Ofcom got additional resources to enable it to enforce the Bill. But the Bill has increased in scope, because now it includes fraud and fraudulent advertising. We need to make sure that Ofcom has the right resources to enforce the full Bill effectively. That is something that the Government really need to consider.

Martin Lewis: I was going to make exactly that point, but it has just been made brilliantly so I will not waste your time.

Chris Philp Portrait Chris Philp
- Hansard - -

Q I thank the witnesses for joining us this afternoon, and particularly Martin Lewis for his campaigning in this area.

I will start by agreeing with the point that Martin Lewis made a minute or two ago—that we cannot trust these companies to work on their own. Mr Lewis, I am not sure whether you have had a chance to go through clause 34, which we inserted into the Bill following your evidence to the Joint Committee last year. It imposes a duty on these companies to take steps and implement systems to

“prevent individuals from encountering content consisting of fraudulent advertisements”.

There is a clear duty to stop them from doing this, rather as you were asking a minute ago in your presentation. Does that strong requirement in clause 34, to stop individuals from encountering fraudulent advertisement content, meet the objective that you were asking for last year?

Martin Lewis: Let me start by saying that I am very grateful that you have put it in there and thankful that the Government have listened to our campaign. What I am about to say is not intended as criticism.

It is very difficult to know how this will work in practice. The issue is all about thresholds. How many scam adverts can we stomach? I still have, daily—even from the platform that I sued, never mind the others—tens of reports directly to me of scam adverts with my face on. Even though there is a promise that we will try to mitigate that, the companies are not doing it. We have to have a legitimate understanding that we are not going to have zero scam adverts on these platforms; unless they were to pre-vet, which I do not think they will, the way they operate means that will not happen.

I am not a lawyer but my concern is that the Bill should make it clear, and that any interpretation of the Bill from Ofcom should be clear, about exactly what threshold of scam adverts is acceptable—we know that they are going to happen—and what threshold is not acceptable. I do not have the expertise to answer your question; I have to rely on your expertise to do that. But I ask the Committee to think properly about what the threshold level should be.

What is and is not acceptable? What counts as “doing everything they can”? They are going to get big lawyers involved if you say there must be zero scam adverts—that is not going to happen. How many scam adverts are acceptable and how many are not? I am so sorry to throw that back as a question when I am a witness, but I do not have the expertise to answer. But that is my concern: I am not 100% convinced of the threshold level that you are setting.

None Portrait The Chair
- Hansard -

Q Mr Fassam, do you have the answer?

Tim Fassam: I think we are positive about the actions that have been taken regarding social media; our concern is that the clause is not applied to search and that it excludes paid-for ads that are also user-generated content—promoted tweets or promoted posts, for example. We would ensure that that applied to all paid-for adverts and that it was consistent between social media and search.

Chris Philp Portrait Chris Philp
- Hansard - -

Q Mr Fassam, I will address those two questions, if I may. Search is covered by clause 35 and user-generated content is subject to the Bill’s general provisions on user-generated content. Included in the scope of that are the priority illegal offences defined in schedule 7. Among those, on page 185—not that I expect you to have memorised the Bill—are financial services offences, including a number of offences to do with pretending to carry out regulated financial activity when in fact you are not regulated. Also included are the fraud offences—the various offences under the Fraud Act 2006. Do come back if you think I have this wrong, but I believe that we have search covered in clause 35 and promoted user-generated content covered via schedule 7, page 185.

Tim Fassam: You absolutely do, but to a weaker standard than in clause 34.

Chris Philp Portrait Chris Philp
- Hansard - -

Q In clause 35 there is the drafting point that we are looking at. It says “minimise the risk” instead of “prevent”. You are right to point out that drafting issue. In relation to the user-generated stuff, there is a duty on the platforms to proactively stop priority illegal content, as defined in schedule 7. I do take your drafting point on clause 35.

Tim Fassam: Thank you.

Chris Philp Portrait Chris Philp
- Hansard - -

Q I want to pick up on Martin Lewis’s point about enforcement. He said that he had to sue Facebook himself, which was no doubt an onerous, painful and costly enterprise—at least costly initially, because hopefully you got your expenses back. Under the Bill, enforcement will fall to Ofcom. The penalties that social media firms could be handed by Ofcom for failing to meet the duties we have discussed include a fine amounting to 10% of global revenue as a maximum, which runs into billions of pounds. Do the witnesses feel that level of sanction—10% of global revenue and ultimately denial of service—is adequately punitive? Will it provide an adequate deterrent to the social media firms that we are considering?

None Portrait The Chair
- Hansard -

Mr Lewis, as you were named, I think you had better start.

Martin Lewis: Ten per cent. of the global revenue of a major social media or search player is a lot of money—it certainly would hit them in the pocket. I reiterate my previous point: it is all about the threshold at which that comes in and how rigidly Ofcom is enforcing it. There are very few organisations that have the resources, legally, to take on big institutions of state, regulators and Governments. If any does, it is the gigantic tech firms. Absolutely, 10% of global revenue sounds like a suitable wall to prevent them jumping over. That is the aim, because we want those companies to work for people; we don’t want them to do scam ads. We want them to work well, and we want them never to be fined because there is no reason to fine them.

The proof of the pudding will be in how robust Ofcom feels it can be, off the back of the Bill, taking those companies on. I go back to needing to understand how many scam ads you permit under the duty to prevent scam ads. It clearly is not zero—you are not going to tell me it is zero. So how many are allowed, what are the protocols that come into place and how quickly do they have to take the ads down? Ultimately, I think that is going to be a decision for Ofcom, but it is the level of stringency that you put on Ofcom in order for it to interpret how it takes that decision that is going to decide whether this works or not.

Rocio Concha: I completely agree with Martin. Ofcom needs to have the right resources in order to monitor how the platforms are doing that, and it needs to have the right powers. At the moment, Ofcom can ask for information in a number of areas, including fraud, but not advertising. We need to make sure that Ofcom can ask for that information so that it can monitor what the platforms are doing. We need to make sure that it has the right powers and the right resources to enforce the Bill effectively.

Tim Fassam: You would hope that 10% would certainly be a significant disincentive. Our focus would be on whether companies are contributing to compensating the victims of fraud and scams, and whether they have been brought into the architecture that is utilised to compensate victims of fraud and scams. That would be the right aim in terms of financial consequences for the firms.

Chris Philp Portrait Chris Philp
- Hansard - -

Q I have one final question that again relates to the question of reporting scams, which I think two or three witnesses have referred to. I will briefly outline the provisions in the Bill that address that. I would like to ask the witnesses if they think those provisions are adequate. First, in clause 18, the Bill imposes on large social media firms an obligation to have a proper complaints procedure so that complaints are not ignored, as appears to happen on a shockingly frequent basis. That is at the level of individual complaints. Of course, if social media firms do not do that, it will be for Ofcom to enforce against them.

Secondly, clauses 140 and 141 contain a procedure for so-called super-complaints, where a body that represents users—it could be Which? or an organisation like it—is able to bring something almost like a class action or group complaint to Ofcom if it thinks a particular social media firm has systemic problems. Will those two clauses address the issue of complaints not being properly handled or, in some cases, not being dealt with at all?

Martin Lewis: Everything helps. I think the super-complaint point is really important. We must remember that many victims of scams are not so good at complaining and, by the nature of the crossover of individuals, there is a huge mental health issue at stake with scams. There is both the impact on people with mental health issues and the impact on people’s mental health of being scammed, which means that they may not be as robust and up for the fight or for complaining. As long as it works and applies to all the different categories that are represented here, the super-complaint status is a good measure.

We absolutely need proper reporting lines. I urge you, Minister—I am not sure that this is in the Bill—to standardise this, so that everyone knows what to do when they report: the same imagery, the same button. With that, people will know what to do. The more we can do that, the easier and better the system will be.

Chris Philp Portrait Chris Philp
- Hansard - -

Q That is a really important point—you made it earlier—about the complaints process being hidden. Clause 18(2)(c) says that the complaints system must be

“easy to access, easy to use (including by children) and transparent.”

The previous paragraph (b) states that the system

“provides for appropriate action to be taken by the provider of the service in response to complaints of a relevant kind”.

The Bill is saying that a complaints process must do those two things, because if it does not, Ofcom will be on the company’s back.

Martin Lewis: I absolutely support all of that. I am just pushing for that tiny bit more leadership, whether it is from you or Ofcom, that comes up with a standardised system with standardised imagery and placing, so that everybody knows that on the top left of the advert you have the button that you click to fill in a form to report it. The more we have that cross-platform and cross-search and cross-social media, the easier it will be for people. I am not sure it is a provision for the Bill in itself, but Government leadership would work really well on that.

Tim Fassam: They are both welcome—the super-complaint and the new complaints process. We want to ensure that we have a system that looks not just at the volume of complaints, but at their content. In particular, you may find on the super-complaint point that, for example, the firm that a fraudster is pretending to be is the organisation that has the best grasp of the issue, so do not forget about commercial organisations as well as consumer organisations when thinking about who is appropriate to make super-complaints.

Chris Philp Portrait Chris Philp
- Hansard - -

Q Well, your organisation, as one that represents firms in this space, could in fact be designated as a super-complainant to represent your members, as much as someone like Which? could be designated to represent the man on the street like you or me.

Tim Fassam: Absolutely. We suggested to Meta when we met them about 18 months ago that we could be a clearing house to identify for them whether they need to take something seriously, because our members have analysed it and consider it to represent a real risk.

None Portrait The Chair
- Hansard -

Last word to Rocio Concha.

Rocio Concha: I completely agree about the super-complaint. We as a consumer organisation have super-complaint powers. As with other regulators, we would like to have those powers in this context as well. We have done many super-complaints representing consumers in particular areas with the regulators, so I think we need that power in this Bill as well.

On reporting, I want to clarify something. At the moment, the Bill does not have a requirement for users to be able to complain and report to platforms in relation to fraudulent advertising. It happens for priority illegal content, but our assessment of the Bill is that it is unclear whether it applies to fraudulent advertising. We probably do not have time to look at this now, but we sent you amendments addressing where we thought the Bill had weaknesses. We agree with you that users should have an easy and transparent way to report illegal or fraudulent advertising, and they should have an easy way to complain about it. At the moment, it is not clear that the Bill will require that for fraudulent advertising.

Chris Philp Portrait Chris Philp
- Hansard - -

Q Yes, that is a very good question. Please do write to us about that. Clause 140, on super-complaints, refers to “regulated services”. My very quick, off-the-cuff interpretation is that that would include everything covered and regulated by the Bill. I notice that there is a reference to user-to-user services in clause 18. Do write to us on that point. We would be happy to look at it in detail. Do not take my comment as definitive, because I have only just looked at it in the last 20 seconds.

Rocio Concha: My comment was in relation not to the super-complaints but to the requirements. We already sent you our comments with suggestions on how you can fix this in the Bill.

Chris Philp Portrait Chris Philp
- Hansard - -

I am very grateful. Thank you.

None Portrait The Chair
- Hansard -

Ms Concha and Mr Fassam, thank you very much. Do please write in if you have further comments. Mr Lewis, we are deeply grateful to you. You can now go back to your day job and tell us whether we are going to be worse or better off as a result of the statement today—please don’t answer that now.

Martin Lewis: I am interviewing the Chancellor in 15 minutes.

--- Later in debate ---
Caroline Ansell Portrait Caroline Ansell
- Hansard - - - Excerpts

Thank you. That is very helpful.

Chris Philp Portrait Chris Philp
- Hansard - -

Q Thank you for joining us and giving evidence, Frances; it is nice to see you again. We had evidence from Meta, your former employer, on Tuesday, in which its representative suggested that it engages in open and constructive co-operation with researchers. Do you think that testimony was true?

Frances Haugen: I think that shows a commendable level of chutzpah. Researchers have been trying to get really basic datasets out of Facebook for years. When I talk about a basic dataset, it is things as simple as, “Just show us the top 10,000 links that are distributed in any given week.” When you ask for information like that in a country like the United States, no one’s privacy is violated: every one of those links will have been viewed by hundreds of thousands, if not millions of people. Facebook will not give out even basic data like that, even though hundreds if not thousands of academics have begged for this data.

The idea that they have worked in close co-operation with researchers is a farce. The only way that they are going to give us even the most basic data that we need to keep ourselves safe is if it is mandated in the Bill. We need to not wait two years after the Bill passes—and remember, it does not even say that it will happen; Ofcom might say, “Oh, maybe not.” We need to take a page from the Digital Services Act and say, “On the day that the Bill passes, we get access to data,” or, at worst, “Within three months, we are going to figure out how to do it.” It needs to be not, “Should we do it?” but “How will we do it?”

Chris Philp Portrait Chris Philp
- Hansard - -

Q When I was asking questions on Tuesday, the representative of Meta made a second claim that raised my eyebrow. He claimed that, in designing its algorithms, it did not primarily seek to optimise for engagement. Do you think that was true?

Frances Haugen: First, I left the company a year ago. Because we have no transparency with these companies, they do not have to publish their algorithms or the consequences of their algorithms, so who knows? Maybe they use astrology now to rank the content. We have no idea. All I know is that Meta definitely still uses signals—did users click on it, did they dwell on it, did they re-share it, or did they put a comment on it? There is no way it is not using those. It is very unlikely that they do not still use engagement in their ranking.

The secondary question is, do they optimise for engagement? Are they trying to maximise it? It is possible that they might interpret that and say, “No, we have multiple things we optimise for,” because that is true. They look at multiple metrics every single time they try to decide whether or not to shift things. But I think it is very likely that they are still trying to optimise for engagement, either as their top metric or as one of their top metrics.

Remember, Meta is not trying to optimise for engagement to keep you there as long as possible; it is optimising for engagement to get you and your friends to produce as much content as possible, because without content production, there can be no content consumption. So that is another thing. They might say, “No, we are optimising for content production, not engagement,” but that is one step off.

Chris Philp Portrait Chris Philp
- Hansard - -

Q The Bill contains provisions that require companies to do risk assessments that cover their algorithms, and then to be transparent about those risk assessments with Ofcom. Do you think those provisions will deliver the change required in the approach that the companies take?

Frances Haugen: I have a feeling that there is going to be a period of growing pains after the first time these risk assessments happen. I can almost entirely guarantee you that Facebook will try to give you very little. It will likely be a process of back and forth with the regulator, where you are going to have to have very specific standards for the level of transparency, because Facebook is always going to try to give you the least possible.

One of the things that I am actually quite scared about is that, in things like the Digital Services Act, penalties go up to 10% of global profits. Facebook as a company has something like 35% profit margins. One of the things I fear is that these reports may be so damning—that we have such strong opinions after we see the real, hard consequences of what they are doing—that Facebook might say, “This isn’t worth the risk. We’re just going to give you 10% of our profits.” That is one of the things I worry about: that they may just say, “Okay, now we’re 25% profitable instead of 35% profitable. We’re that ashamed.”

Chris Philp Portrait Chris Philp
- Hansard - -

Q Let me offer a word of reassurance on that. In this Bill, the penalties are up to 10% of global revenue, not profit. Secondly, in relation to the provision of information to Ofcom, there is personal criminal liability for named executives, with a period of incarceration of up to two years, for the reason you mentioned.

Frances Haugen: Oh, good. That’s wonderful.

Chris Philp Portrait Chris Philp
- Hansard - -

We had a case last year where Facebook—it was actually Facebook—failed to provide some information to the CMA in a takeover case, and it paid a £50 million fine rather than provide the information, hence the provision for personal criminal liability for failing to provide information that is now in this Bill.

My final question is a simple one. From your perspective, at the moment, when online tech companies are making product design decisions, what priority do they give to safety versus profit?

Frances Haugen: What I saw when I was at Facebook was that there was a culture that encouraged people to always have the most positive interpretation of things. If things are still the same as when I left—like I said, I do not know; I left last May—what I saw was that people routinely had to weigh little changes in growth versus changes in safety metrics, and unless they were major changes in safety metrics, they would continue to pursue growth. The only problem with a strategy like that is that those little deficits add up to very large harms over time, so we must have mandated transparency. The public have to have access to data, because unless Facebook has to account for the public cost of the harm of its products, it is not going to give enough priority to those little incremental harms as they add up.

Chris Philp Portrait Chris Philp
- Hansard - -

Thank you very much.

None Portrait The Chair
- Hansard -

Ms Haugen, thank you very much indeed for joining us today, and thank you also for the candour with which you have answered our questions. We are very grateful to you indeed.

The Committee will meet again on Tuesday 7 June at 9.25 am for the start of its line-by-line consideration of the Bill. That session will be in Committee Room 14.

Ordered, That further consideration be now adjourned. —(Steve Double.)

Online Safety Bill (Fifth sitting)

Chris Philp Excerpts
Committee stage
Tuesday 7th June 2022

Public Bill Committees
None Portrait The Chair
- Hansard -

Good morning, ladies and gentlemen. If anybody wishes to take their jacket off, they are at liberty to do so when I am in the Chair—my co-Chairman is joining us, and I am sure she will adopt the same procedure. I have a couple of preliminary announcements. Please make sure that all mobile phones are switched off. Tea and coffee are not allowed in the Committee, I am afraid. I think they used to be available outside in the corridor, but I do not know whether that is still the case.

We now start line-by-line consideration of the Bill. The selection and grouping list for the sitting is available on the table in the room for anybody who does not have it. It shows how the clauses and selected amendments have been grouped for debate. Grouped amendments are generally on the same subject or a similar issue.

Now for a slight tutorial to remind me and anybody else who is interested, including anybody who perhaps has not engaged in this arcane procedure before, of the proceedings. Each group has a lead amendment, and that amendment is moved first. The other grouped amendments may be moved later, but they are not necessarily voted on at that point, because some of them relate to matters that appear later in the Bill. Do not panic; that does not mean that we have forgotten them, but that we will vote on them—if anybody wants to press them to a Division—when they are reached in order in the Bill. However, if you are in any doubt and feel that we have missed something—occasionally I do; the Clerks never do—just let us know. I am relaxed about this, so if anybody wants to ask a question about anything that they do not understand, please interrupt and ask, and we will endeavour to confuse you further.

The Member who has put their name to the lead amendment, and only the lead amendment, is usually called to speak first. At the end of the debate, the Minister will wind up, and the mover of the lead amendment—that might be the Minister if it is a Government amendment, or it might be an Opposition Member—will indicate whether they want a vote on that amendment. We deal with that first, then we deal with everything else in the order in which it arises. I hope all that is clear, but as I said, if there are any questions, please interrupt and ask.

We start consideration of the Bill with clause 1, to which there are no amendments. Usually, the Minister would wind up at the end of each debate, but as there are no amendments to clause 1, the Minister has indicated that he would like to say a few words about the clause.

Clause 1

Overview of Act

Question proposed, That the clause stand part of the Bill.

Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - -

Thank you, Sir Roger; it is a pleasure to serve under your chairmanship once again. It may be appropriate to take this opportunity to congratulate my right hon. Friend the Member for Basingstoke on her damehood in the Queen’s birthday honours, which was very well deserved indeed.

This simple clause provides a high-level overview of the different parts of the Bill and how they come together to form the legislation.

None Portrait The Chair
- Hansard -

The Minister was completely out of order in congratulating the right hon. Lady, but I concur with him. I call the shadow Minister.

--- Later in debate ---
Dan Carden Portrait Dan Carden
- Hansard - - - Excerpts

This part of the Bill deals with the definitions of services and which services would be exempt. I consider myself a millennial; most people my age or older are Facebook and Twitter users, and people a couple of years younger might use TikTok and other services. The way in which the online space is used by different generations, particularly by young people, changes rapidly. Given the definitions in the Bill, how does the Minister intend to keep pace with the changing ways in which people communicate? Most online games now allow interaction between users in different places, which was not the case a few years ago. Understanding how the Government intend the Bill to keep up with such changes is important. Will the Minister tell us about that?

Chris Philp Portrait Chris Philp
- Hansard - -

Let me briefly speak to the purpose of these clauses and then respond to some of the points made in the debate.

As the shadow Minister, the hon. Member for Pontypridd, touched on, clauses 2 and 3 define some of the key terms in the Bill, including “user-to-user services” and “search services”—key definitions that the rest of the Bill builds on. As she said, schedule 1 and clause 4 contain specific exemptions where we believe the services concerned present very low risk of harm. Schedule 2 sets out exemptions relating to the new duties that apply to commercial providers of pornography. I thank the shadow Minister and my right hon. Friend the Member for Basingstoke for noting the fact that the Government have substantially expanded the scope of the Bill to now include commercial pornography, in response to widespread feedback from Members of Parliament across the House and the various Committees that scrutinised the Bill.

The shadow Minister is quite right to say that the number of platforms to which the Bill applies is very wide. [Interruption.] Bless you—or bless my hon. Friend the Member for North West Durham, I should say, Sir Roger, although he is near sanctified already. As I was saying, we are necessarily trying to protect UK users, and with many of these platforms not located in the UK, we are seeking to apply these duties to those companies as well as ones that are domestically located. When we come to discuss the enforcement powers, I hope the Committee will see that those powers are very potent.

The shadow Minister, the hon. Member for Liverpool, Walton, and others asked about future technologies and whether the Bill will accommodate technologies that we cannot even imagine today. The metaverse is a good example: it did not exist when the Bill was first contemplated and the White Paper was produced. Actually, I think Snapchat did not exist when the White Paper that preceded the Bill was first conceived. For that reason, the Bill is tech agnostic. We do not talk about specific technologies; we talk about the duties that apply to companies and the harms they are obligated to prevent.

The whole Bill is tech agnostic because we as parliamentarians today cannot anticipate future developments. When those future developments arise, as they inevitably will, the duties under the Bill will apply to them as well. The metaverse is a good example, because even though it did not exist when the structure of the Bill was conceived, anything happening in the metaverse is none the less covered by the Bill. Anything that happens in the metaverse that is illegal or harmful to children, falls into the category of legal but harmful to adults, or indeed constitutes pornography will be covered because the Bill is tech agnostic. That is an extremely important point to make.

The hon. Member for Aberdeen North asked about gaming. Parents are concerned because lots of children, including quite young children, use games. My own son has started playing Minecraft even though he is very young. To the extent that those games have user-to-user features—for example, user-to-user messaging, particularly where those messages can be sent widely and publicly—those user-to-user components are within the scope of the Bill.

The hon. Member for Aberdeen North also asked about the App Store. I will respond quickly to her question now rather than later, to avoid leaving the Committee in a state of tingling anticipation and suspense. The App Store, or app stores generally, are not in the scope of the Bill, because they are not providing, for example, user-to-user services, and the functionality they provide to basically buy apps does not count as a search service. However, any app that is purchased in an app store, to the extent that it has either search functionality, user-to-user functionality or purveys or conveys pornography, is in scope. If an app that is sold on one of these app stores turns out to provide a service that breaks the terms of the Bill, that app will be subject to regulatory enforcement directly by Ofcom.

The hon. Members for Aberdeen North and for Liverpool, Walton touched on media literacy, noting that there has been a change to the Bill since the previous version. We will probably debate this later, so I will be brief. The Government published a media literacy strategy, backed by funding, to address this point. It was launched about a year ago. Ofcom also has existing statutory duties—arising under the Communications Act 2003, I believe. The critical change made since the previous draft of the Bill—it was made in December last year, I believe—is that Ofcom published an updated set of policy intentions around media literacy that went even further than we had previously intended. That is the landscape around media literacy.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - -

I am sure we will discuss this topic a bit more as the Bill progresses.

I will make a few points on disinformation. The first is that, non-legislatively, the Government have a counter-disinformation unit, which sits within the Department for Digital, Culture, Media and Sport. It basically scans for disinformation incidents. For the past two years it has been primarily covid-focused, but in the last three or four months it has been primarily Russia/Ukraine-focused. When it identifies disinformation being spread on social media platforms, the unit works actively with the platforms to get it taken down. In the course of the Russia-Ukraine conflict, and as a result of the work of that unit, I have personally called in some of the platforms to complain about the stuff they have left up. I did not have a chance to make this point in the evidence session, but when the person from Twitter came to see us, I said that there was some content on Russian embassy Twitter accounts that, in my view, was blatant disinformation—denial of the atrocities that have been committed in Bucha. Twitter had allowed it to stay up, which I thought was wrong. Twitter often takes down such content, but in that example, wrongly and sadly, it did not. We are doing that work operationally.

Secondly, to the extent that disinformation can cause harm to an individual, which I suspect includes a lot of covid disinformation—drinking bleach is clearly not very good for people—that would fall under the terms of the legal but harmful provisions in the Bill.

Thirdly, when it comes to state-sponsored disinformation of the kind that we know Russia engages in on an industrial scale via the St Petersburg Internet Research Agency and elsewhere, the Home Office has introduced the National Security Bill—in fact, it had its Second Reading yesterday afternoon, when some of us were slightly distracted. One of the provisions in that Bill is a foreign interference offence. It is worth reading, because it is very widely drawn and it criminalises foreign interference, which includes disinformation. I suggest the Committee has a look at the foreign interference offence in the National Security Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I am grateful for the Minister’s intervention in bringing in the platforms to discuss disinformation put out by hostile nation states. Does he accept that if Russia Today had put out some of that disinformation, the platforms would be unable to take such content down as a result of the journalistic exemption in the Bill?

Chris Philp Portrait Chris Philp
- Hansard - -

We will no doubt discuss in due course clauses 15 and 50, which are the two that I think the shadow Minister alludes to. If a platform is exempt from the duties of the Bill owing to its qualification as a recognised news publisher under clause 50, it removes the obligation to act under the Bill, but it does not prevent action. Social media platforms can still choose to act. Also, it is not a totally straightforward matter to qualify as a recognised news publisher under clause 50. We saw the effect of sanctions: when Russia Today was sanctioned, it was removed from many platforms as a result of the sanctioning process. There are measures outside the Bill, such as sanctions, that can help to address the shocking disinformation that Russia Today was pumping out.

The last point I want to pick up on was rightly raised by my right hon. Friend the Member for Basingstoke and the hon. Member for Aberdeen North. It concerns child sexual exploitation and abuse images, and particularly the ability of platforms to scan for those. Many images are detected as a result of scanning messages, and many paedophiles or potential paedophiles are arrested as a result of that scanning. We saw a terrible situation a little while ago, when—for a limited period, owing to a misconception of privacy laws—Meta, or Facebook, temporarily suspended scanning in the European Union; as a result, loads of images that would otherwise have been intercepted were not.

I agree with the hon. Member for Aberdeen North that privacy concerns, including end-to-end encryption, should not trump the ability of organisations to scan for child sexual exploitation and abuse images. Speaking as a parent—I know she is, too—there is, frankly, nothing more important than protecting children from sexual exploitation and abuse. Some provisions in clause 103 speak to this point, and I am sure we will debate those in more detail when we come to that clause. I mention clause 103 to put down a marker as the place to go for the issue being raised. I trust that I have responded to the points raised in the debate, and I commend the clause to the Committee.

Question put and agreed to.

Clause 2 accordingly ordered to stand part of the Bill.

Clause 3 ordered to stand part of the Bill.

Schedules 1 and 2 agreed to.

Clause 4 ordered to stand part of the Bill.

None Portrait The Chair
- Hansard -

Before we move on, we have raised the issue of the live feed. The audio will be online later today. There is a problem with the feed—it is reaching the broadcasters, but it is not being broadcast at the moment.

As we are not certain we can sort out the technicalities between now and this afternoon, the Committee will move to Committee Room 9 for this afternoon’s sitting to ensure that the live stream is available. Mr Double, if Mr Russell intends to be present—he may not; that is up to you—it would be helpful if you would let him know. Ms Blackman, if John Nicolson intends to be present this afternoon, would you please tell him that Committee Room 9 will be used?

It would normally be possible to leave papers and other bits and pieces in the room, because it is usually locked between the morning and afternoon sittings. Clearly, because we are moving rooms, you will all need to take your papers and laptops with you.

Clause 5

Overview of Part 3

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Dan Carden Portrait Dan Carden
- Hansard - - - Excerpts

I want to add my voice to the calls for ways to monitor the success or failure of this legislation. We are starting from a position of self-regulation where companies write the rules and regulate themselves. It is right that we are improving on that, but with it comes further concerns around the powers of the Secretary of State and the effectiveness of Ofcom. As the issues are fundamental to freedom of speech and expression, and to the protection of vulnerable and young people, will the Minister consider how we better monitor whether the legislation does what it says on the tin?

Chris Philp Portrait Chris Philp
- Hansard - -

Clause 5 simply provides an overview of part 3 of the Bill. Several good points have been raised in the course of this discussion. I will defer replying to the substance of a number of them until we come to the relevant clause, but I will address two or three of them now.

The shadow Minister said that the Bill is complex, and she is right; it is 193-odd clauses long and is a world-leading piece of legislation. The duties that we are imposing on social media firms and internet companies do not already exist; we have no precedent to build on. Most matters on which Parliament legislates have been considered and dealt with before, so we build on an existing body of legislation that has been built up over decades or, in some cases in the criminal law, over centuries. In this case, we are constructing a new legislative edifice from the ground up. Nothing precedes this piece of legislation—we are creating anew—and the task is necessarily complicated by virtue of its novelty. However, I think we have tried to frame the Bill in a way that keeps it as straightforward and as future-proof as possible.

The shadow Minister is right to point to the codes of practice as the source of practical guidance to the public and to social media firms on how the obligations operate in practice. We are working with Ofcom to ensure that those codes of practice are published as quickly as possible and, where possible, prepared in parallel with the passage of the legislation. That is one reason why we have provided £88 million of up-front funding to Ofcom in the current and next financial years: to give it the financial resources to do precisely that.

My officials have just confirmed that my recollection of the Ofcom evidence session on the morning of Tuesday 24 May was correct: Ofcom confirmed to the Committee that it will publish, before the summer, what it described as a “road map” providing details on the timing of when and how those codes of practice will be created. I am sure that Ofcom is listening to our proceedings and will hear the views of the Committee and of the Government. We would like those codes of practice to be prepared and introduced as quickly as possible, and we certainly provided Ofcom with the resources to do precisely that.

There was a question about the Scottish offences and, I suppose, about the Northern Irish offences as well—we do not want to forget any part of the United Kingdom.

Chris Philp Portrait Chris Philp
- Hansard - -

We are in agreement on that. I can confirm that the Government have tabled amendments 116 to 126—the Committee will consider them in due course—to place equivalent Scottish offences, which the hon. Member for Aberdeen North asked about, in the Bill. We have done that in close consultation with the Scottish Government to ensure that the relevant Scottish offences equivalent to the England and Wales offences are inserted into the Bill. If the Scottish Parliament creates any new Scottish offences that should be inserted into the legislation, that can be done under schedule 7 by way of statutory instrument. I hope that answers the question.

The other question to which I will briefly reply was about parliamentary scrutiny. The Bill already contains a standard mechanism that provides for the Bill to be reviewed after a two to five-year period. That provision appears at the end of the Bill, as we would expect. Of course, there are the usual parliamentary mechanisms—Backbench Business debates, Westminster Hall debates and so on—as well as the DCMS Committee.

I heard the points about a standing Joint Committee. Obviously, I am mindful of the excellent prelegislative scrutiny work done by the previous Joint Committee of the Commons and the Lords. Equally, I am mindful that standing Joint Committees, outside the regular Select Committee structure, are unusual. The only two that spring immediately to mind are the Intelligence and Security Committee, which is established by statute, and the Joint Committee on Human Rights, chaired by the right hon. and learned Member for Camberwell and Peckham (Ms Harman), which is established by Standing Orders of the House. I am afraid I am not in a position to make a definitive statement about the Government’s position on this. It is of course always open to the House to regulate its own business. There is nothing I can say today from a Government point of view, but I know that hon. Members’ points have been heard by my colleagues in Government.

We have gone somewhat beyond the scope of clause 5. You have been extremely generous, Sir Roger, in allowing me to respond to such a wide range of points. I commend clause 5 to the Committee.

Question put and agreed to.

Clause 5 accordingly ordered to stand part of the Bill.

Clause 6

Providers of user-to-user services: duties of care

None Portrait The Chair
- Hansard -

Before we proceed, perhaps this is the moment to explain what should happen and what is probably going to happen. Ordinarily, a clause is taken with amendments. This Chairman takes a fairly relaxed view of stand part debates. Sometimes it is convenient to have a very broad-ranging debate on the first group of amendments because it covers matters relating to the whole clause. The Chairman would then normally say, “Well, you’ve already had your stand part debate, so I’m not going to allow a further stand part debate.” It is up to hon. Members to decide whether to confine themselves to the amendment under discussion and then have a further stand part debate, or whether to go free range, in which case the Chairman would almost certainly say, “You can’t have a stand part debate as well. You can’t have two bites of the cherry.”

This is slightly more complex. It is a very complex Bill, and I think I am right in saying that it is the first time in my experience that we are taking other clause stand parts as part of the groups of amendments, because there is an enormous amount of crossover between the clauses. That will make it, for all of us, slightly harder to regulate. It is for that reason—the Minister was kind enough to say that I was reasonably generous in allowing a broad-ranging debate—that I think we are going to have to do that with this group.

I, and I am sure Ms Rees, will not wish to be draconian in seeking to call Members to order if you stray slightly outside the boundaries of a particular amendment. However, we have to get on with this, so please try not to be repetitive if you can possibly avoid it, although I accept that there may well be some cases where it is necessary.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

That is a huge concern for us. As was brought up in our evidence sessions with Ofcom, it is recruiting, effectively, a fundraising officer for the regulator. That throws into question the potential longevity of the regulator’s funding and whether it is resourced effectively to properly scrutinise and regulate the online platforms. If that long-term resource is not available, how can the regulator effectively scrutinise and bring enforcement to bear against companies for enabling illegal activity?

Chris Philp Portrait Chris Philp
- Hansard - -

Just to reassure the shadow Minister and her hon. Friend the Member for Liverpool, Walton, the Bill confers powers on Ofcom to levy fees and charges on the sector that it is regulating—so, on social media firms—to recoup its costs. We will debate that in due course—I think it is in clause 71, but that power is in the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I am grateful to the Minister for that clarification and I look forward to debating that further as the Bill progresses.

Returning to the senior managers and certification regime in the financial services industry, under that regime senior managers must be pre-approved by the regulator, have their responsibilities set out in a statement of responsibilities and be subject to enhanced conduct standards. Those in banks are also subject to regulatory requirements on their remuneration. Again, it baffles me that we are not asking the same for child safety from online platforms and companies.

The money laundering regulations also use the threat of criminal offences to drive culture change. Individuals can be culpable for failure of processes, as well as for intent. I therefore hope that the Minister will carefully consider the need for the same to apply to our online space to make children safe.

Amendment 70 is a technical amendment that we will be discussing later on in the Bill. However, I am happy to move it in the name of the official Opposition.

Dan Carden Portrait Dan Carden
- Hansard - - - Excerpts

I congratulate my own Front Bench on this important amendment. I would like the Minister to respond to the issue of transparency and the reason why only the regulator would have sight of these risk assessments. It is fundamental that civil society groups and academics have access to them. Her Majesty’s Revenue and Customs is an example of where that works very well. HMRC publishes a lot of its data, which is then used by academics and researchers to produce reports and documents that feed back into the policy making processes and HMRC’s work. It would be a missed opportunity if the information and data gathered by Ofcom were not widely available for public scrutiny.

I would reinforce the earlier points about accountability. There are too many examples—whether in the financial crash or the collapse of companies such as Carillion—where accountability was never there. Without this amendment and the ability to hold individuals to account for the failures of companies that are faceless to many people, the legislation risks being absolutely impotent.

Finally, I know that we will get back to the issue of funding in a later clause but I hope that the Minister can reassure the Committee that funding for the enforcement of these regulations will be properly considered.

Chris Philp Portrait Chris Philp
- Hansard - -

Let me start by speaking to clauses 6, 7, 21 and 22 stand part. I will then address the amendments moved by the shadow Minister.

None Portrait The Chair
- Hansard -

Order. I apologise for interrupting, Minister, but the stand part debates on clauses 7, 21 and 22 are part of the next grouping, not this one. I am fairly relaxed about it, but just be aware that you cannot have two debates on this.

Chris Philp Portrait Chris Philp
- Hansard - -

The grouping sheet I have here suggests that clause 7 stand part and clauses 21 and 22 stand part are in this grouping, but if I have misunderstood—

None Portrait The Chair
- Hansard -

No, there are two groups. Let me clarify this for everyone, because it is not as straightforward as it normally is. At the moment we are dealing with amendments 69 and 70. The next grouping, underneath this one on your selection paper, is the clause stand part debates—which is peculiar, as effectively we are having the stand part debate on clause 6 now. For the convenience of the Committee, and if the shadow Minister is happy, I am relaxed about taking all this together.

None Portrait The Chair
- Hansard -

The hon. Lady can be called again. The Minister is not winding up at this point.

Chris Philp Portrait Chris Philp
- Hansard - -

In the interests of simplicity, I will stick to the selection list and adapt my notes accordingly to confine my comments to amendments 69 and 70, and then we will come to the stand part debates in due course. I am happy to comply, Sir Roger.

Speaking of compliance, that brings us to the topic of amendments 69 and 70. It is worth reminding ourselves of the current enforcement provisions in the Bill, which are pretty strong. I can reassure the hon. Member for Liverpool, Walton that the enforcement powers here are far from impotent. They are very potent. As the shadow Minister acknowledged in her remarks, we are for the first time ever introducing senior management liability, which relates to non-compliance with information notices and offences of falsifying, encrypting or destroying information. It will be punishable by a prison sentence of up to two years. That is critical, because without that information, Ofcom is unable to enforce.

We have had examples of large social media firms withholding information and simply paying a large fine. There was a Competition and Markets Authority case a year or two ago where a large social media firm did not provide information repeatedly requested over an extended period and ended up paying a £50 million fine rather than providing the information. Let me put on record now that that behaviour is completely unacceptable. We condemn it unreservedly. It is because we do not want to see that happen again that there will be senior manager criminal liability in relation to providing information, with up to two years in prison.

In addition, for the other duties in the Bill there are penalties that Ofcom can apply for non-compliance. First, there are fines of up to 10% of global revenue. For the very big American social media firms, the UK market is somewhere just below 10% of their global revenue, so 10% of their global revenue is getting on for 100% of their UK revenue. That is a very significant financial penalty, running in some cases into billions of pounds.

In extreme circumstances—if those measures are not enough to ensure compliance—there are what amount to denial of service powers in the Bill, where essentially Ofcom can require internet service providers and others, such as payment providers, to disconnect the companies in the UK so that they cannot operate here. Again, that is a very substantial measure. I hope the hon. Member for Liverpool, Walton would agree that those measures, which are in the Bill already, are all extremely potent.

The question prompted by the amendment is whether we should go further. I have considered that issue as we have been thinking about updating the Bill—as hon. Members can imagine, it is a question that I have been debating internally. The question is whether we should go further and say there is personal criminal liability for breaches of the duties that go beyond information provision. There are arguments in favour, which we have heard, but there are arguments against as well. One is that if we introduce criminal liability for those other duties, that introduces a risk that the social media firms, fearing criminal prosecution, will become over-zealous and just take everything down because they are concerned about being personally liable. That could end up having a chilling effect on content available online, going beyond what we in Parliament would intend.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Will the Minister give way?

Chris Philp Portrait Chris Philp
- Hansard - -

In a moment.

For those reasons, I think we have drawn the line in the right place. There is personal criminal liability for failing to provide information, with fines of 10% of global revenue and service disruption—unplugging powers—as well. Having thought about it quite carefully, I think we have struck the balance in the right place. We do not want to deter people from offering services in the UK; if people worried that they might too readily go to prison, that might deter them from locating here. I fully recognise that there is a balance to strike, and I feel that it is being struck in the right place.

I will go on to comment on a couple of examples we heard about Carillion and the financial crisis, but before I do so, I will give way as promised.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I appreciate that the Minister says he has been swithering on this point—he has been trying to work out the correct place to draw the line. Given that we do not yet have a commitment for a standing committee—again, that is potentially being considered—we do not know how the legislation is going to work. Will the Minister, rather than accepting the amendment, give consideration to including the ability to make changes via secondary legislation so that there is individual criminal liability for different breaches? That would allow him the flexibility in the future, if the regime is not working appropriately, to add through secondary legislation individual criminal liability for breaches beyond those that are currently covered.

Chris Philp Portrait Chris Philp
- Hansard - -

I have not heard that idea suggested. I will think about it. I do not want to respond off the cuff, but I will give consideration to the proposal. Henry VIII powers, which are essentially what the hon. Lady is describing—an ability through secondary legislation effectively to change primary legislation—are obviously viewed askance by some colleagues if too wide in scope. We do use them, of course, but normally in relatively limited circumstances. Creating a brand new criminal offence via what amounts to a Henry VIII power would be quite a wide application of the power, but it is an idea that I am perfectly happy to go away and reflect on. I thank her for mentioning the idea.

A couple of examples were given about companies that have failed in the past. Carillion was not a financial services company and there was no regulatory oversight of the company at all. In relation to financial services regulation, despite the much stricter regulation that existed in the run-up to the 2008 financial crisis, that crisis occurred none the less. [Interruption.] We were not in government at the time. We should be clear-eyed about the limits of what regulation alone can deliver, but that does not deter us from taking the steps we are taking here, which I think are extremely potent, for all the reasons that I mentioned and will not repeat.

Question put, That the amendment be made.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

On clause 7, as I have previously mentioned, we were all pleased to see the Government bring in more provisions to tackle pornographic content online, much of which is easily accessible and can cause harm to those viewing it and potentially to those involved in it.

As we have previously outlined, a statutory duty of care for social platforms online has been missing for far too long, but we made it clear on Second Reading that such a duty will only be effective if we consider the systems, business models and design choices behind how platforms operate. For too long, platforms have been abuse-enabling environments, but it does not have to be this way. The amendments that we will shortly consider are largely focused on transparency, as we all know that the duties of care will only be effective if platforms are compelled to proactively supply their assessments to Ofcom.

On clause 21, the duty of care approach is one that the Opposition support and it is fundamentally right that search services are subject to duties including illegal content risk assessments, illegal content assessments more widely, content reporting, complaints procedures, duties about freedom of expression and privacy, and duties around record keeping. Labour has long held the view that search services, while not direct hosts of potentially damaging content, should have responsibilities that see them put a duty of care towards users first, as we heard in our evidence sessions from HOPE not hate and the Antisemitism Policy Trust.

It is also welcome that the Government have committed to introducing specific measures for regulated search services that are likely to be accessed by children. However, those measures can and must go further, so we will be putting forward some important amendments as we proceed.

Labour does not oppose clause 22, either, but I would like to raise some important points with the Minister. We do not want to be in a position whereby those designing, operating and using a search engine in the United Kingdom are subject to a second-rate internet experience. We also do not want to be in a position where we are forcing search services to choose what is an appropriate design for people in the UK. It would be worrying indeed if our online experience vastly differed from that of, let us say, our friends in the European Union. How exactly will clause 22 ensure parity? I would be grateful if the Minister could confirm that before we proceed.

Chris Philp Portrait Chris Philp
- Hansard - -

The shadow Minister has already touched on the effect of these clauses: clause 6 sets out duties applying to user-to-user services in a proportionate and risk-based way; clause 7 sets out the scope of the various duties of care; and clauses 21 and 22 do the same in relation to search services.

In response to the point about whether the duties on search will end up providing a second-rate service in the United Kingdom, I do not think that they will. The duties have been designed to be proportionate and reasonable. Throughout the Bill, Members will see that there are separate duties for search and for user-to-user services. That is reflected in the symmetry—which appears elsewhere, too—of clauses 6 and 7, and clauses 21 and 22. We have done that because we recognise that search is different. It indexes the internet; it does not provide a user-to-user service. We have tried to structure these duties in a way that is reasonable and proportionate, and that will not impair the experience of people in the UK.

I believe that we are ahead of the European Union in bringing forward this legislation and debating it in detail, but the European Union is working on its Digital Services Act. I am confident that there will be no disadvantage to people conducting searches in United Kingdom territory.

Question put and agreed to.

Clause 6 accordingly ordered to stand part of the Bill.

Clause 7 ordered to stand part of the Bill.

Clause 8

Illegal content risk assessment duties

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I beg to move amendment 10, in clause 8, page 6, line 33, at end insert—

“(4A) A duty to publish the illegal content risk assessment and proactively supply this to OFCOM.”

This amendment creates a duty to publish an illegal content risk assessment and supply it to Ofcom.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - -

Clause 8 sets out the risk assessment duties for illegal content that, as already discussed, apply to user-to-user services. Ofcom will issue guidance on how companies can undertake those assessments. To comply with those duties, companies will need to take proportionate measures to mitigate the risks identified in those assessments. The clause lists a number of potential risk factors that providers must assess, including how likely it is that users will encounter illegal content, as defined later in the Bill,

“by means of the service”.

That phrase is quite important, and I will come back to it when discussing some of the amendments, because it does not necessarily mean just content on the service itself; in a cross-platform sense, it can also cover other sites where users might find themselves via the service. That makes the phrase important in the context of some of the reasonable queries about cross-platform risks.

Moving on, companies will also need to consider how the design and operation of their service may reduce or increase the risks identified. Under schedule 3, which we will vote on, or at least consider, later on, companies will have three months to carry out risk assessments, which must be kept up to date so that fresh risks that may arise from time to time can be accommodated. Therefore, if changes are made to the service, the risks can be considered on an ongoing basis.

Amendment 10 relates to the broader question that the hon. Member for Liverpool, Walton posed about transparency. The Bill already contains obligations to publish summary risk assessments on legal but harmful content. That refers to some of the potentially contentious or ambiguous types of content for which public risk assessments would be helpful. The companies are also required to make those risk assessments available to Ofcom on request. That raises a couple of questions, as the hon. Member for Liverpool, Walton mentioned and as some of the amendments highlight. Should companies be required to proactively serve up their risk assessments to Ofcom, rather than wait to be asked? Also, should those risk assessments all be published—probably online?

In considering those two questions, there are a couple of things to think about. The first is Ofcom’s capacity. As we have discussed, 25,000 services are in scope. If all those services proactively delivered a copy of their risk assessment, even if they are very low risk and of no concern to Ofcom or, indeed, any of us, they would be in danger of overwhelming Ofcom. The approach contemplated in the Bill is that, where Ofcom has a concern or the platform is risk assessed as being significant—to be clear, that would apply to all the big platforms—it will proactively make a request, which the platform will be duty bound to meet. If the platform does not do that, the senior manager liability and the two years in prison that we discussed earlier will apply.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

The Minister mentioned earlier that Ofcom would be adequately resourced and funded to cope with the regulatory duty set out in the Bill. If Ofcom is not able to receive risk assessments for all the platforms potentially within scope, even if those platforms are not deemed to be high risk, does that not call into question whether Ofcom has the resource needed to actively carry out its duties in relation to the Bill?

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - -

Of course, Ofcom is able to request any of them if it wants to—if it feels that to be necessary—but receiving 25,000 risk assessments, including from tiny companies that basically pose pretty much no risk at all and that hardly anyone uses, would, I think, be an unreasonable and disproportionate requirement to impose. I do not think it is a question of the resources being inadequate; it is a question of being proportionate and reasonable.

Dan Carden Portrait Dan Carden
- Hansard - - - Excerpts

The point I was trying to get the Minister to think about was the action of companies in going through the process of these assessments and then making that information publicly available to civil society groups; it is about transparency. It is what the sector needs; it is the way we will find and root out the problems, and it is a great missed opportunity in this Bill.

Chris Philp Portrait Chris Philp
- Hansard - -

To reassure the hon. Member on the point about doing the risk assessment, all the companies have to do the risk assessment. That obligation is there. Ofcom can request any risk assessment. I would expect, and I think Parliament would expect, it to request risk assessments either where it is concerned about risk or where the platform is particularly large and has a very high reach—I am thinking of Facebook and companies like that. But hon. Members are talking here about requiring Ofcom to receive and, one therefore assumes, to consider, because what is the point of receiving an assessment unless it considers it? Receiving it and just putting it on a shelf without looking at it would be pointless, obviously. Requiring Ofcom to receive and look at potentially 25,000 risk assessments strikes me as a disproportionate burden. We should be concentrating Ofcom’s resources—and it should concentrate its activity, I submit—on those companies that pose a significant risk and those companies that have a very high reach and large numbers of users. I suggest that, if we imposed an obligation on it to receive and to consider risk assessments for tiny companies that pose no risk, that would not be the best use of its resources, and it would take away resources that could otherwise be used on those companies that do pose risk and that have larger numbers of users.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Just to be clear, we are saying that the only reason why we should not be encouraging the companies to do the risk assessment is that Ofcom might not be able to cope with dealing with all the risk assessments. But surely that is not a reason not to do it. The risk assessment is a fundamental part of this legislation. We have to be clear that there is no point in the companies having those risk assessments if they are not visible and transparent.

Chris Philp Portrait Chris Philp
- Hansard - -

All the companies have to do the risk assessment, for example for the “illegal” duties, where they are required to by the Bill. For the “illegal” duties, that is all of them; they have to do those risk assessments. The question is whether they have to send them to Ofcom—all of them—even if they are very low risk or have very low user numbers, and whether Ofcom, by implication, then has to consider them, because it would be pointless to require them to be sent if they were not then looked at. We want to ensure that Ofcom’s resources are pointed at the areas where the risks arise. Ofcom can request any of these. If Ofcom is concerned—even a bit concerned—it can request them.

Hon. Members are then making a slightly adjacent point about transparency—about whether the risk assessments should be made, essentially, publicly available. In relation to comprehensive public disclosure, there are legitimate questions about getting to the heart of what is going on in these companies, in the way that Frances Haugen’s whistleblower disclosures did. But we also need to be mindful of what we might call malign actors—people who are trying to circumvent the provisions of the Bill—in relation to some of the “illegal” provisions, for example. We do not want to give them so much information that they know how they can circumvent the rules. Again, there is a balance to strike between ensuring that the rules are properly enforced and having such a high level of disclosure that people seeking to circumvent the rules are able to work out how to do so.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

If the rules are so bad that people can circumvent them, they are not good enough anyway and they need to be updated, but I have a specific question on this. The Minister says that Ofcom will be taking in the biggest risk assessments, looking at them and ensuring that they are adequate. Will he please give consideration to asking Ofcom to publish the risk assessments from the very biggest platforms? Then they will all be in one place. They will be easy for people to find and people will not have to rake about in the bottom sections of a website. And it will apply only in the case of the very biggest, most at risk platforms, which should be regularly updating their risk assessments and changing their processes on a very regular basis in order to ensure that people are kept safe.

Chris Philp Portrait Chris Philp
- Hansard - -

I thank the hon. Lady for her intervention and for the—

None Portrait The Chair
- Hansard -

Order. I am sorry to interrupt the Minister, but I now have to adjourn the sitting until this afternoon, when the Committee will meet again, in Room 9 and with Ms Rees in the Chair.

Online Safety Bill (Sixth sitting)

Chris Philp Excerpts
Committee stage
Tuesday 7th June 2022

Public Bill Committees
None Portrait The Chair
- Hansard -

I remind the Committee that with this we are discussing the following:

Amendment 14, in clause 8, page 6, line 33, at end insert—

“(4A) A duty for the illegal content risk assessment to be approved by either—

(a) the board of the entity; or, if the organisation does not have a board structure,

(b) a named individual who the provider considers to be a senior manager of the entity, who may reasonably be expected to be in a position to ensure compliance with the illegal content risk assessment duties, and reports directly into the most senior employee of the entity.”

This amendment seeks to ensure that regulated companies’ boards or senior staff have responsibility for illegal content risk assessments.

Amendment 25, in clause 8, page 7, line 3, after the third “the” insert “production,”.

This amendment requires the risk assessment to take into account the risk of the production of illegal content, as well as the risk of its presence and dissemination.

Amendment 19, in clause 8, page 7, line 14, at end insert—

“(h) how the service may be used in conjunction with other regulated user-to-user services such that it may—

(i) enable users to encounter illegal content on other regulated user-to-user services, and

(ii) constitute part of a pathway to harm to individuals who are users of the service, in particular in relation to CSEA content.”

This amendment would incorporate into the duties a requirement to consider cross-platform risk.

Clause stand part.

Amendment 20, in clause 9, page 7, line 30, at end insert—

“, including by being directed while on the service towards priority illegal content hosted by a different service;”.

This amendment aims to include within companies’ safety duties a duty to consider cross-platform risk.

Amendment 26, in clause 9, page 7, line 30, at end insert—

“(aa) prevent the production of illegal content by means of the service;”.

This amendment incorporates a requirement to prevent the production of illegal content within the safety duties.

Amendment 18, in clause 9, page 7, line 35, at end insert—

“(d) minimise the presence of content which reasonably foreseeably facilitates or aids the discovery or dissemination of priority illegal content, including CSEA content.”

This amendment brings measures to minimise content that may facilitate or aid the discovery of priority illegal content within the scope of the duty to maintain proportionate systems and processes.

Amendment 21, in clause 9, page 7, line 35, at end insert—

“(3A) A duty to collaborate with other companies to take reasonable and proportionate measures to prevent the means by which their services can be used in conjunction with other services to facilitate the encountering or dissemination of priority illegal content, including CSEA content,”.

This amendment creates a duty to collaborate in cases where there is potential cross-platform risk in relation to priority illegal content and CSEA content.

Clause 9 stand part.

Amendment 30, in clause 23, page 23, line 24, after “facilitating” insert—

“the production of illegal content and”.

This amendment requires the illegal content risk assessment to consider the production of illegal content.

Clause 23 stand part.

Amendment 31, in clause 24, page 24, line 2, after “individuals” insert “producing or”.

This amendment expands the safety duty to include the need to minimise the risk of individuals producing certain types of search content.

Clause 24 stand part.

Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - -

It is a great pleasure to serve under your chairmanship, Ms Rees, and I am glad that this afternoon’s Committee proceedings are being broadcast to the world.

Before we adjourned this morning, I was in the process of saying that one of the challenges with publishing the full risk assessment, even for larger companies, is that the vulnerabilities in their systems, or the potential opportunities to exploit those systems for criminal purposes, would then be publicly exposed in a way that may not serve the public interest, and that is a reason for not requiring complete disclosure of everything.

However, I draw the Committee’s attention to the existing transparency provisions in clause 64. We will come on to them later, but I want to mention them now, given that they are relevant to amendment 10. The transparency duties state that, once a year, Ofcom must serve notice on the larger companies—those in categories 1, 2A and 2B—requiring them to produce a transparency report. That is not a power for Ofcom—it is a requirement. Clause 64(1) states that Ofcom

“must give every provider…a notice which requires the provider to produce…(a ‘transparency report’).”

The content of the transparency report is specified by Ofcom, as set out in subsection (3). As Members will see, Ofcom has wide powers to specify what must be included in the report. On page 186, schedule 8—I know that we will debate it later, but it is relevant to the amendment—sets out the scope of what Ofcom can require. It is an extremely long list that covers everything we would wish to see. Paragraph 1, for instance, states:

“The incidence of illegal content, content that is harmful to children and priority content that is harmful to adults on a service.”

Therefore, the transparency reporting requirement in clause 64—an obligation, not an option—addresses the transparency point that was raised earlier.

Amendment 14 would require a provider’s board members or senior manager to take responsibility for the illegal content risk assessment. We agree with the Opposition’s point. Indeed, we agree with what the Opposition are trying to achieve in a lot of their amendments.

Chris Philp Portrait Chris Philp
- Hansard - -

There is a “but” coming. We think that, in all cases apart from one, the Bill as drafted already addresses the matter. In the case of amendment 14, the risk assessment duties as drafted already explicitly require companies to consider how their governance structures may affect the risk of harm to users arising from illegal content. Ofcom will provide guidance to companies about how they can comply with those duties, which is very likely to include measures relating to senior-level engagement. In addition, Ofcom can issue confirmation decisions requiring companies to take specific steps to come into compliance. To put that simply, if Ofcom thinks that there is inadequate engagement by senior managers in relation to the risk assessment duties, it can require—it has the power to compel—a change of behaviour by the company.

I come now to clause 9—I think this group includes clause 9 stand part as well. The shadow Minister has touched on this. Clause 9 contains safety duties in relation to—

None Portrait The Chair
- Hansard -

Order. Minister, I do not think we are doing clause 9. We are on clause 8.

Chris Philp Portrait Chris Philp
- Hansard - -

I think the group includes clause 9 stand part, but I will of course be guided by you, Ms Rees.

None Portrait The Chair
- Hansard -

No, clause 9 is separate.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - -

Very well; we will debate clause 9 separately. In that case, I will move on to amendments 19 and 20, which seek to address cross-platform risk. Again, we completely agree with the Opposition that cross-platform risk is a critical issue. We heard about it in evidence. It definitely needs to be addressed and covered by the Bill. We believe that it is covered by the Bill, and our legal advice is that it is covered by the Bill, because in clause 8 as drafted—[Interruption.] Bless you—or rather, I bless the shadow Minister, following Sir Roger’s guidance earlier, lest I inadvertently bless the wrong person.

Clause 8 already includes the phrase to which I alluded previously. I am talking about the requirement that platforms risk-assess illegal content that might be encountered

“by means of the service”.

That is a critical phrase, because it means not just on that service itself; it also means, potentially, via that service if, for example, that service directs users onward to illegal content on another site. By virtue of the words,

“by means of the service”,

appearing in clause 8 as drafted, the cross-platform risk that the Opposition and witnesses have rightly referred to is covered. Of course, Ofcom will set out further steps in the code of practice as well.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - - - Excerpts

I was listening very closely to what the Minister was saying and I was hoping that he might be able to comment on some of the evidence that was given, particularly by Professor Lorna Woods, who talked about the importance of risk assessments being about systems, not content. Would the Minister pick up on that point? He was touching on it in his comments, and I was not sure whether this was the appropriate point in the Bill at which to bring it up.

Chris Philp Portrait Chris Philp
- Hansard - -

I thank my right hon. Friend for raising that. The risk assessments and, indeed, the duties arising under this Bill all apply to systems and processes—setting up systems and processes that are designed to protect people and to prevent harmful and illegal content from being encountered. We cannot specify in legislation every type of harmful content that might be encountered. This is about systems and processes. We heard the Chairman of the Joint Committee on the draft Online Safety Bill, our hon. Friend the Member for Folkestone and Hythe (Damian Collins), confirm to the House on Second Reading his belief—his accurate belief—that the Bill takes a systems-and-processes approach. We heard some witnesses saying that as well. The whole point of this Bill is that it is tech-agnostic—to future-proof it, as hon. Members mentioned this morning—and it is based on systems and processes. That is the core architecture of the legislation that we are debating.

Amendments 25 and 26 seek to ensure that user-to-user services assess and mitigate the risk of illegal content being produced via functions of the service. That is covered, as it should be—the Opposition are quite right to raise the point—by the illegal content risk assessment and safety duties in clauses 8 and 9. Specifically, clause 8(5)(d), on page 7 of the Bill—goodness, we are only on page 7 and we have been going for over half a day already—requires services to risk-assess functionalities of their service being used to facilitate the presence of illegal content. I stress the word “presence” in clause 8(5)(d). Where illegal content is produced by a functionality of the service—for example, by being livestreamed—that content will be present on the service and companies must mitigate that risk. The objective that the Opposition are seeking to achieve, and with which we completely agree, is covered in clause 8(5)(d) by the word “presence”. If the content is present, it is covered by that section.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Specifically on that, I understand the point the hon. Gentleman is making and appreciate his clarification. However, on something such as Snapchat, if somebody takes a photo, it is sent to somebody else, then disappears immediately, because that is what Snapchat does—the photo is no longer present. It has been produced and created there, but it is not present on the platform. Can the Minister consider whether the Bill adequately covers all the instances he hopes are covered?

Chris Philp Portrait Chris Philp
- Hansard - -

The hon. Lady raises an interesting point about time. However, clause 8(5)(d) uses the wording,

“the level of risk of functionalities of the service facilitating the presence or dissemination of illegal content”

and so on. That presence can happen at any time, even fleetingly, as with Snapchat. Even when the image self-deletes after a certain period—so I am told; I have not actually used Snapchat—the presence has occurred. Therefore, that would be covered by clause 8(5)(d).

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Will the Minister explain how we would be able to prove, once the image is deleted, that it was present on the platform?

Chris Philp Portrait Chris Philp
- Hansard - -

The question of proof is a separate one, and that would apply however we drafted the clause. The point is that the clause provides that any presence of a prohibited image would fall foul of the clause. There are also duties on the platforms to take reasonable steps. In the case of matters such as child sexual exploitation and abuse images, there are extra-onerous duties that we have discussed before, for obvious and quite correct reasons.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Will the Minister stress again that in this clause specifically he is talking about facilitating any presence? That is the wording that he has just used. Can he clarify exactly what he means? If the Minister were to do so, it would be an important point for the Bill as it proceeds.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Is that as clear as mud?

Chris Philp Portrait Chris Philp
- Hansard - -

I am happy to follow your direction, Ms Rees. I find that that is usually the wisest course of action.

I will speak to amendment 18, which is definitely on the agenda for this grouping and which the shadow Minister addressed earlier. It would oblige service providers to put in place systems and processes

“to minimise the presence of content which reasonably foreseeably facilitates or aids the discovery or dissemination of priority illegal content, including CSEA content.”

The Government completely support that objective, quite rightly promoted by the Opposition, but it is already delivered by the Bill as drafted. The companies in scope are obliged to take comprehensive measures to tackle CSEA content, including where one service directs users onward to a second service.

Amendment 21, in a similar spirit, talks about cross-platform collaboration. I have already mentioned the way in which the referral of a user from one platform to another is within the scope of the Bill. Again, under its provisions, service providers must put in place proportionate systems and processes to mitigate identified cross-platform harms and, where appropriate to achieve that objective, would be expected to collaborate and communicate with one another. If Ofcom finds that they are not engaging in appropriate collaborative behaviour, which means they are not discharging their duty to protect people and children, it can intervene. While we agree completely with the objective sought, the Bill already addresses it.

--- Later in debate ---
None Portrait The Chair
- Hansard -

They are in this group, so you may deal with them now.

Chris Philp Portrait Chris Philp
- Hansard - -

Obviously, I encourage the Committee to support those clauses standing part of the Bill. They impose duties on search services—we touched on search a moment ago—to assess the nature and risk to individuals of accessing illegal content via their services, and to minimise the risk of users encountering that illegal content. They are very similar duties to those we discussed for user-to-user services, but applied in the search context. I hope that that addresses all the relevant provisions in the group that we are debating.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I am grateful for the opportunity to speak to amendments to clause 9 and to clauses 23 and 24, which I did not speak on earlier. I am also very grateful that we are being broadcast live to the world and welcome that transparency for all who might be listening.

On clause 9, it is right that the user-to-user services will be required to have specific duties and to take appropriate measures to mitigate and manage the risk of harm to individuals and their likelihood of encountering priority illegal content. Again, however, the Bill does not go far enough, which is why we are seeking to make these important amendments. On amendment 18, it is important to stress that the current scope of the Bill does not capture the range of ways in which child abusers use social networks to organise abuse, including to form offender networks. They post digital breadcrumbs that signpost to illegal content on third-party messaging apps and the dark web, and they share child abuse videos that are carefully edited to fall within content moderation guidelines. This range of techniques, known as child abuse breadcrumbing, is a significant enabler of online child abuse.

Our amendment would give the regulator powers to tackle breadcrumbing and ensure a proactive upstream response. It would bring into regulatory scope the tens of millions of interactions with accounts that actively enable the discovery and sharing of child abuse material, and it would not leave that question ambiguous. The amendment would also ensure that companies must tackle child abuse at the earliest possible stage. As it stands, the Bill would reinforce companies’ current focus only on material that explicitly reaches the criminal threshold. Because companies do not focus their approach on other child abuse material, abusers can exploit this knowledge to post carefully edited child abuse images and content that enables them to connect and form networks with other abusers. Offenders understand and can anticipate that breadcrumbing material will not be proactively identified or removed by the host site, so they are able to organise and link to child abuse in plain sight.

We all know that child abuse breadcrumbing takes many forms, but techniques include tribute sites, where users create social media profiles using misappropriated identities of known child abuse survivors. These are used by offenders to connect with like-minded perpetrators to exchange contact information, form offender networks and signpost child abuse material elsewhere online. In the first quarter of 2021, there were 6 million interactions with such accounts.

Abusers may also use Facebook groups to build offender groups and signpost to child abuse hosted on third-party sites. Those groups are thinly veiled in their intentions; for example, as we heard in evidence sessions, groups are formed for those with an interest in children celebrating their 8th, 9th or 10th birthdays. Several groups with over 50,000 members remained alive despite being reported to Meta, and algorithmic recommendations quickly suggested additional groups for those members to join.

Lastly, abusers can signpost to content on third-party sites. Abusers are increasingly using novel forms of technology to signpost to online child abuse, including QR codes, immersive technologies such as the metaverse, and links to child abuse hosted on the blockchain. Given the highly agile nature of the child abuse threat and the demonstrable ability of sophisticated offenders to exploit new forms of technology, this amendment will ensure that the legislation is effectively futureproofed. Technological change makes it increasingly important that the ability of child abusers to connect and form offender networks can be disrupted at the earliest possible stage.

Turning to amendment 21, we know that child abuse is rarely siloed on a single platform or app. Well-established grooming pathways see abusers exploit the design features of social networks to contact children before moving the communication across to other platforms, including livestreaming sites, as we have already heard, and encrypted messaging services. Offenders manipulate features such as Facebook’s algorithmic friend suggestions to make initial contact with a large number of children. They can then use direct messages to groom them and coerce children into sending sexual images via WhatsApp. Similarly, as we heard earlier, abusers can groom children through playing video games and then bringing them on to another ancillary platform, such as Discord.

The National Society for the Prevention of Cruelty to Children has shared details of an individual whose name has been changed, and whose case particularly highlights the problems that children are facing in the online space. Ben was 14 when he was tricked on Facebook into thinking he was speaking to a female friend of a friend, who turned out to be a man. Using threats and blackmail, he coerced Ben into sending abuse images and performing sex acts live on Skype. Those images and videos were shared with five other men, who then bombarded Ben with further demands. His mum, Rachel, said:

“The abuse Ben suffered had a devastating impact on our family. It lasted two long years, leaving him suicidal.

It should not be so easy for an adult to meet and groom a child on one site then trick them into livestreaming their own abuse on another app, before sharing the images with like-minded criminals at the click of a button.

Social media sites should have to work together to stop this abuse happening in the first place, so other children do not have to go through what Ben did.”

The current drafting of the Bill does not place sufficiently clear obligations on platforms to co-operate on the cross-platform nature of child abuse. Amendment 21 would require companies to take reasonable and proportionate steps to share threat assessments, develop proportionate mechanisms to share offender intelligence, and create a rapid response arrangement to ensure that platforms develop a coherent, systemic approach to new and emerging threats. Although the industry has developed a systemic response to the removal of known child abuse images, arrangements for sharing information on highly agile risk profiles remain largely ad hoc. The cross-platform nature of grooming and the interplay of harms across multiple services need to be taken into account. If that is not addressed explicitly in the Bill, we are concerned that companies may be able to cite competition concerns to avoid taking action.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I completely agree with the hon. Member, and appreciate her intervention. It is fundamental for this point to be captured in the Bill because, as we are seeing, this is happening more and more. More and more victims are coming forward who have been subject to livestreaming that is not picked up by the technology available, and is then recorded and posted elsewhere on smaller platforms.

Legal advice suggests that cross-platform co-operation is likely to be significantly impeded by the negative interplay with competition law unless there is a clear statutory basis for enabling or requiring collaboration. Companies may legitimately have different risk and compliance appetites, or may simply choose to hide behind competition law to avoid taking a more robust form of action.

New and emerging technologies are likely to produce an intensification of cross-platform risks in the years ahead, and we are particularly concerned about the child abuse impacts in immersive virtual reality and alternative-reality environments, including the metaverse. A number of high-risk immersive products are already designed to be platform-agnostic, meaning that in-product communication takes place between users across multiple products and environments. There is a growing expectation that these environments will be built along such lines, with an incentive for companies to design products in this way in the hope of blunting the ability of Governments to pursue user safety objectives.

Separately, regulatory measures that are being developed in the EU, but are highly likely to impact service users in the UK, could result in significant unintended safety consequences. Although the interoperability provisions in the Digital Markets Act are strongly beneficial when viewed through a competition lens—they will allow interoperation and communication between multiple platforms—they could, without appropriate safety mitigations, provide new means for abusers to contact children across multiple platforms, significantly increase the overall profile of cross-platform risk, and actively frustrate a broad number of current online safety responses. Amendment 21 will provide corresponding safety requirements that can mitigate the otherwise significant potential for unintended consequences.

The Minister referred to clauses 23 and 24 in relation to amendments 30 and 31. We think a similar consideration should apply to search services as well as to user-to-user services. We urge that the amendments be made, in order to prevent those harms from occurring.

Chris Philp Portrait Chris Philp
- Hansard - -

I have already commented on most of those amendments, but one point that the shadow Minister made that I have not addressed was about acts that are essentially preparatory to acts of child abuse or the exchange of child sexual exploitation and abuse images. She was quite right to raise that issue as a matter of serious concern that we would expect the Bill to prevent, and I offer the Committee the reassurance that the Bill, as drafted, does so.

Schedule 6 sets out the various forms of child sexual exploitation and abuse that are designated as priority offences and that platforms have to take proactive steps to prevent. On the cross-platform point, that includes, as we have discussed, things that happen through a service as well as on a service. Critically, paragraph 9 of schedule 6 includes “inchoate offences”, which means someone not just committing the offence but engaging in acts that are preparatory to committing the offence, conspiring to commit the offence, or procuring, aiding or abetting the commission of the offence. The preparatory activities that the shadow Minister referred to are covered under schedule 6, particularly paragraph 9.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I thank the Minister for giving way. I notice that schedule 6 includes provision on the possession of indecent photographs of children. Can he confirm that that provision encapsulates the livestreaming of sexual exploitation?

Chris Philp Portrait Chris Philp
- Hansard - -

Yes, I can.

Question put, That the amendment be made.

--- Later in debate ---
Barbara Keeley Portrait Barbara Keeley (Worsley and Eccles South) (Lab)
- Hansard - - - Excerpts

As this is the first time I have spoken in the Committee, may I say that it is a pleasure to serve with you in the Chair, Ms Rees? I agree with my hon. Friend the Member for Pontypridd that we are committed to improving the Bill, despite the fact that we have some reservations, which we share with many organisations, about some of the structure of the Bill and some of its provisions. As my hon. Friend has detailed, there are particular improvements to be made to strengthen the protection of children online, and I think the Committee’s debate on this section is proving fruitful.

Amendment 28 is a good example of where we must go further if we are to achieve the goal of the Bill and protect children from harm online. The amendment seeks to require regulated services to assess their level of risk based, in part, on the frequency with which they are blocking, detecting and removing child sexual exploitation and abuse content from their platforms. By doing so, we will be able to ascertain the reality of their overall risk and the effectiveness of their existing response.

The addition of livestreamed child sexual exploitation and abuse content not only acknowledges first-generation CSEA content, but recognises that livestreamed CSEA content happens on both public and private channels, and that they require different methods of detection.

Furthermore, amendment 28 details the practical information needed to assess whether the action being taken by a regulated service is adequate in countering the production and dissemination of CSEA content, in particular first-generation CSEA content. Separating the rates of terminated livestreams of CSEA in public and private channels is important, because those rates may vary widely depending on how CSEA content is generated. By specifying tools, strategies and interventions, the amendment would ensure that the systems in place to detect and report CSEA are adequate, and that is why we would like it to be part of the Bill.

Chris Philp Portrait Chris Philp
- Hansard - -

The Government support the spirit of amendments 17 and 28, which seek to achieve critical objectives, but the Bill as drafted delivers those objectives. In relation to amendment 17 and cross-platform risk, clause 8 already sets out harms and risks—including CSEA risks—that arise by means of the service. That means through the service to other services, as well as on the service itself, so that is covered.

Amendment 28 calls for the risk assessments expressly to cover illegal child sexual exploitation content, but clause 8 already requires that to happen. Clause 8(5) states that the risk assessment must cover the

“risk of individuals who are users of the service encountering…each kind of priority illegal content”.

If we follow through the definition of priority illegal content, we find all those CSEA offences listed in schedule 6. The objective of amendment 28 is categorically delivered by clause 8(5)(b), referencing onwards to schedule 6.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The amendment specifically mentions the level and rates of those images. I did not quite manage to follow through all the things that the Minister just spoke about, but does the clause specifically talk about the level of those things, rather than individual incidents, the possibility of incidents or some sort of threshold for incidents, as in some parts of the Bill?

Chris Philp Portrait Chris Philp
- Hansard - -

The risk assessments that clause 8 requires have to be suitable and sufficient; they cannot be perfunctory and inadequate in nature. I would say that suitable and sufficient means they must go into the kind of detail that the hon. Lady requests. More details, most of which relate to timing, are set out in schedule 3. Ofcom will be making sure that these risk assessments are not perfunctory.

Importantly, in relation to CSEA reporting, clause 59, which we will come to, places a mandatory requirement on in-scope companies to report to the National Crime Agency all CSEA content that they detect on their platforms, if it has not already been reported. Not only is that covered by the risk assessments, but there is a criminal reporting requirement here. Although the objectives of amendments 17 and 28 are very important, I submit to the Committee that the Bill delivers the intention behind them already, so I ask the shadow Minister to withdraw them.

Question put, That the amendment be made.

--- Later in debate ---
Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I will speak to other amendments in this group as well as amendment 15. The success of the Bill’s regulatory framework relies on regulated companies carefully risk-assessing their platforms. Once risks have been identified, the platform can concentrate on developing and implementing appropriate mitigations. However, up to now, boards and top executives have not taken the risk to children seriously. Services have either not considered producing risk assessments or, if they have done so, they have been of limited efficacy and failed to identify and respond to harms to children.

In evidence to the Joint Committee, Frances Haugen explained that many of the corporate structures involved are flat, and accountability for decision making can be obscure. At Meta, that means teams will focus only on delivering against key commercial metrics, not on safety. Children’s charities have also noted that corporate structures in the large technology platforms reward employees who move fast and break things. Those companies place incentives on increasing return on investment rather than child safety. An effective risk assessment and risk mitigation plan can impact on profit, which is why we have seen so little movement from companies to take the measures themselves without the duty being placed on them by legislation.

It is welcome that clause 10 introduces a duty to risk-assess user-to-user services that are likely to be accessed by children. But, as my hon. Friend the Member for Pontypridd said this morning, it will become an empty, tick-box exercise if the Bill does not also introduce the requirement for boards to review and approve the risk assessments.

The Joint Committee scrutinising the draft Bill recommended that the risk assessment be approved at board level. The Government rejected that recommendation on the grounds that Ofcom could include that in its guidance on producing risk assessments. As with much of the Bill, it is difficult to blindly accept promised safeguards when we have not seen the various codes of practice and guidance materials. The amendments would make sure that decisions about and awareness of child safety went right to the top of regulated companies. The requirement to have the board or a senior manager approve the risk assessment will hardwire the safety duties into decision making and create accountability and responsibility at the most senior level of the organisation. That should trickle down the organisation and help embed a culture of compliance across it. Unless there is a commitment to child safety at the highest level of the organisation, we will not see the shift in attitude that is urgently needed to keep children safe, and which I believe every member of the Committee subscribes to.

On amendments 11 and 13, it is welcome that we have risk assessments for children included in the Bill, but the effectiveness of that duty will be undermined unless the risk assessments can be available for scrutiny by the public and charities. In the current version of the Bill, risk assessments will only be made available to the regulator, which we debated on an earlier clause. Companies will be incentivised to play down the likelihood of currently emerging risks because of the implications of having to mitigate against them, which may run counter to their business interests. Unless the risk assessments are published, there will be no way to hold regulated companies to account, nor will there be any way for companies to learn from one another’s best practice, which is a very desirable aim.

The current situation shows that companies are unwilling to share risk assessments even when requested. In October 2021, following the whistleblower disclosures made by Frances Haugen, the National Society for the Prevention of Cruelty to Children led a global coalition of 60 child protection organisations that urged Meta to publish its risk assessments, including its data privacy impact assessments, which are a legal requirement under data protection law. Meta refused to share any of its risk assessments, even in relation to child sexual abuse and grooming. The company argued that risk assessments were live documents and it would not be appropriate for it to share them with any organisation other than the Information Commissioner’s Office, to whom it has a legal duty to disclose. As a result, civil society organisations and the charities that I talked about continue to be in the dark about whether and how Meta has appropriately identified online risk to children.

Making risk assessments public would support the smooth running of the regime and ensure its broader effectiveness. Civil society and other interested groups would be able to assess and identify any areas where a company might not be meeting its safety duties and make full, effective use of the proposed super-complaints mechanism. It will also help civil society organisations to hold the regulated companies and the regulator, Ofcom, to account.

As we have seen from evidence sessions, civil society organisations are often at the forefront of understanding and monitoring the harms that are occurring to users. They have an in-depth understanding of what mitigations may be appropriate and they may be able to support the regulator to identify any obvious omissions. The success of the systemic risk assessment process will be significantly underpinned by and reliant upon the regulator’s being able to rapidly and effectively identify new and emerging harms, and it is highly likely that the regulator will want to draw on civil society expertise to ensure that it has highly effective early warning functions in place.

However, civil society organisations will be hampered in that role if they remain unable to determine what, if anything, companies are doing to respond to online threats. If Ofcom is unable to rapidly identify new and emerging harms, the resulting delays could mean entire regulatory cycles where harms were not captured in risk profiles or company risk assessments, and an inevitable lag between harms being identified and companies being required to act upon them. It is therefore clear that there is a significant public value to publishing risk assessments.

Amendments 27 and 32 are almost identical to the suggested amendments to clause 8 that we discussed earlier. As my hon. Friend the Member for Pontypridd said in our discussion about amendments 25, 26 and 30, the duty to carry out a suitable and sufficient risk assessment could be significantly strengthened by preventing the creation of illegal content, not only preventing individuals from encountering it. I know the Minister responded to that point, but the Opposition did not think that response was fully satisfactory. This is just as important for children’s risk assessments as it is for illegal content risk assessments.

Online platforms are not just where abusive material is published. Sex offenders use mainstream web platforms and services as tools to commit child sexual abuse. This can be seen particularly in the livestreaming of child sexual exploitation. Sex offenders pay to direct and watch child sexual abuse in real time. The Philippines is a known hotspot for such abuse and the UK has been identified by police leads as the third-largest consumer of livestreamed abuse in the world. What a very sad statistic that our society is the third-largest consumer of livestreamed abuse in the world.

Ruby is a survivor of online sexual exploitation in the Philippines, although Ruby is not her real name; she recently addressed a group of MPs about her experiences. She told Members how she was trafficked into sexual exploitation aged 16 after being tricked and lied to about the employment opportunities she thought she would be getting. She was forced to perform for paying customers online. Her story is harrowing. She said:

“I blamed myself for being trapped. I felt disgusted by every action I was forced to do, just to satisfy customers online. I lost my self-esteem and I felt very weak. I became so desperate to escape that I would shout whenever I heard a police siren go by, hoping somebody would hear me. One time after I did this, a woman in the house threatened me with a knife.”

Eventually, Ruby was found by the Philippine authorities and, after a four-year trial, the people who imprisoned her and five other girls were convicted. She said it took many years to heal from the experience, and at one point she nearly took her own life.

It should be obvious that if we are to truly improve child protection online we need to address the production of new child abuse material. In the Bill, we have a chance to address not only what illegal content is seen online, but how online platforms are used to perpetrate abuse. It should not be a case of waiting until the harm is done before taking action.

Chris Philp Portrait Chris Philp
- Hansard - -

As the hon. Lady said, we discussed in the groupings for clauses 8 and 9 quite a few of the broad principles relating to children, but I will none the less touch on some of those points again because they are important.

On amendment 27, under clause 8 there is already an obligation on platforms to put in place systems and processes to reduce the risk that their services will be used to facilitate the presence of illegal content. As that includes the risk of illegal content being present, including that produced via the service’s functionality, the terrible example that the hon. Lady gave is already covered by the Bill. She is quite right to raise that example, because it is terrible when such content involving children is produced, but such cases are expressly covered in the Bill as drafted, particularly in clause 8.

Amendment 31 covers a similar point in relation to search. As I said for the previous grouping, search does not facilitate the production of content; it helps people to find it. Clearly, there is already an obligation on search firms to stop people using search engines to find illegal content, so the relevant functionality in search is already covered by the Bill.

Amendments 15 and 16 would expressly require board member sign-off for risk assessments. I have two points to make on that. First, the duties set out in clause 10(6)(h) in relation to children’s risk assessments already require the governance structures to be properly considered, so governance is directly addressed. Secondly, subsection (2) states that the risk assessment has to be “suitable and sufficient”, so it cannot be done in a perfunctory or slipshod way. Again, Ofcom must be satisfied that those governance arrangements are appropriate. We could invent all the governance arrangements in the world, but the outcome needs to be delivered and, in this case, to protect children.

Beyond governance, the most important things are the sanctions and enforcement powers that Ofcom can use if those companies do not protect children. As the hon. Lady said in her speech, we know that those companies are not doing enough to protect children and are allowing all kinds of terrible things to happen. If those companies continue to allow those things to happen, the enforcement powers will be engaged, and they will be fined up to 10% of their global revenue. If they do not sort it out, they will find that their services are disconnected. Those are the real teeth that will ensure that those companies comply.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I know that the Minister listened to Frances Haugen and to the members of charities. The charities and civil society organisations that are so concerned about this point do not accept that the Bill addresses it. I cannot see how his point addresses what I said about board-level acceptance of that role in children’s risk assessments. We need to change the culture of those organisations so that they become different from how they were described to us. He, like us, was sat there when we heard from the big platform providers, and they are not doing enough. He has had meetings with Frances Haugen; he knows what they are doing. It is good and welcome that the regulator will have the powers that he mentions, but that is just not enough.

Chris Philp Portrait Chris Philp
- Hansard - -

I agree with the hon. Lady that, as I said a second ago, those platforms are not doing enough to protect children. There is no question about that at all, and I think there is unanimity across the House that they are not doing enough to protect children.

I do not think the governance point is a panacea. Frankly, I think the boards of these companies are aware of what is going on. When these big questions arise, they go all the way up to Mark Zuckerberg. It is not as if Mark Zuckerberg and the directors of companies such as Meta are unaware of these risks; they are extremely aware of them, as Frances Haugen’s testimony made clear.

We do address the governance point. As I say, the risk assessments do need to explain how governance matters are deployed to consider these things—that is in clause 10(6)(h). But for me, it is the sanctions—the powers that Ofcom will have to fine these companies billions of pounds and ultimately to disconnect their service if they do not protect our children—that will deliver the result that we need.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

The Minister is talking about companies of such scale that even fines of billions will not hurt them. I refer him to the following wording in the amendments:

“a named individual who the provider considers to be a senior manager of the entity, who may reasonably be expected to be in a position to ensure compliance with the children’s risk assessment duties”.

That is the minimum we should be asking. We should be asking these platforms, which are doing so much damage and have had to be dragged to the table to do anything at all, to be prepared to appoint somebody who is responsible. The Minister tries to gloss over things by saying, “Oh well, they must be aware of it.” The named individual would have to be aware of it. I hope he understands the importance of his role and the Committee’s role in making this happen. We could make this happen.

Chris Philp Portrait Chris Philp
- Hansard - -

As I say, clause 10 already references the governance arrangements, but my strong view is that the only thing that will make these companies sit up and take notice—the only thing that will make them actually protect children in a way they are currently not doing—is the threat of billions of pounds of fines and, if they do not comply even after being fined at that level, the threat of their service being disconnected. Ultimately, that is the sanction that will make these companies protect our children.

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - - - Excerpts

As my hon. Friend the Member for Worsley and Eccles South has said, the point here is about cultural change, and the way to do that is through leadership. It is not about shutting the gate after the horse has bolted. Fining the companies might achieve something, but it does not tackle the root of the problem. It is about cultural change and leadership at these organisations. We all agree across the House that they are not doing enough, so how do we change that culture? It has to come from leadership.

Chris Philp Portrait Chris Philp
- Hansard - -

Yes, and that is why governance is addressed in the clause as drafted. But the one thing that will really change the way the leadership of these companies thinks about this issue is the one thing they ultimately care about—money. The reason they allow unsafe content to circulate and do not rein in or temper their algorithms, and the reason we are in this situation, which has arisen over the last 10 years or so, is that these companies have consistently prioritised profit over protection. Ultimately, that is the only language they understand—it is that and legal compulsion.

While the Bill rightly addresses governance in clause 10 and in other clauses, as I have said a few times, what has to happen to make this change occur is the compulsion that is inherent in the powers to fine and to deny service—to pull the plug—that the Bill also contains. The thing that will give reassurance to our constituents, and to me as a parent, is knowing that for the first time ever these companies can properly be held to account. They can be fined. They can have their connection pulled out of the wall. Those are the measures that will protect our children.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

The Minister is being very generous with his time, but I do not think he appreciates the nature of the issue. Mark Zuckerberg’s net worth is $71.5 billion. Elon Musk, who is reported to be purchasing Twitter, is worth $218 billion. Bill Gates is worth $125 billion. Money does not matter to these people.

The Minister discusses huge fines for the companies and the potential sanction of bringing down their platforms. They will just set up another one. That is what we are seeing with the smaller platforms: they are closing down and setting up new platforms. These measures do not matter. What matters and will actually make a difference to the safety of children and adults online is personal liability—holding people personally responsible for the direct harm they are causing to people here in the United Kingdom. That is what these amendments seek to do, and that is why we are pushing them so heavily. I urge the Minister to respond to that.

Chris Philp Portrait Chris Philp
- Hansard - -

We discussed personal liability extensively this morning. As we discussed, there is personal liability in relation to providing information, with a criminal penalty of up to two years’ imprisonment, to avoid situations like the one we saw a year or two ago, where one of these companies failed to provide the Competition and Markets Authority with the information that it required.

The shadow Minister pointed out the very high levels of global turnover—$71.5 billion—that these companies have. That means that ultimately they can be fined up to $7 billion for each set of breaches. That is a vast amount of money, particularly if those breaches happen repeatedly. She said that such companies will just set up again if we deny their service. Clearly, small companies can close down and set up again the next day, but gigantic companies, such as Meta—Facebook—cannot do that. That is why I think the sanctions I have pointed to are where the teeth really lie.

I accept the point about governance being important as well; I am not dismissing that. That is why we have personal criminal liability for information provision, with up to two years in prison, and it is why governance is referenced in clause 10. I accept the spirit of the points that have been made, but I think the Bill delivers these objectives as drafted.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

Will my hon. Friend give way?

Chris Philp Portrait Chris Philp
- Hansard - -

One last time, because I am conscious that we need to make some progress this afternoon.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

I have huge sympathy with the point that the Minister is making on this issue, but the hon. Member for Pontypridd is right to drive the point home. The Minister says there will be huge fines, but I think there will also be huge court bills. There will be an awful lot of litigation about how things are interpreted, because so much money will come into play. I just reiterate the importance of the guidance and the codes of practice, because if we do not get those right then the whole framework will be incredibly fragile. We will need ongoing scrutiny of how the Bill works or there will be a very difficult situation.

Chris Philp Portrait Chris Philp
- Hansard - -

My right hon. Friend, as always, makes a very good point. The codes of practice will be important, particularly to enable Ofcom to levy fines where appropriate and then successfully defend them. This is an area that may get litigated. I hope that, should lawyers litigating these cases look at our transcripts in the future, they will see how strongly those on both sides of the House feel about this point. I know that Ofcom will ensure that the codes of practice are properly drafted. We touched this morning on the point about timing; we will follow up with Ofcom to make sure that the promise it made us during the evidence session about the road map is followed through and that those get published in good time.

On the point about the Joint Committee, I commend my right hon. Friend for her persistence—[Interruption.] Her tenacity—that is the right word. I commend her for her tenacity in raising that point. I mentioned it to the Secretary of State when I saw her at lunchtime, so the point that my right hon. Friend made this morning has been conveyed to the highest levels in the Department.

I must move on to the final two amendments, 11 and 13, which relate to transparency. Again, we had a debate about transparency earlier, when I made the point about the duties in clause 64, which I think cover the issue. Obviously, we are not debating clause 64 now but it is relevant because it requires Ofcom—it is not an option but an obligation; Ofcom must do so—to require providers to produce a transparency report every year. Ofcom can say what is supposed to be in the report, but the relevant schedule lists all the things that can be in it, and covers absolutely everything that the shadow Minister and the hon. Member for Worsley and Eccles South want to see in there.

That requirement to publish transparently and publicly is in the Bill, but it is to be found in clause 64. While I agree with the Opposition’s objectives on this point, I respectfully say that those objectives are delivered by the Bill as drafted, so I politely and gently request that the amendments be withdrawn.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I have a couple of comments, particularly about amendments 15 and 16, which the Minister has just spoken about at some length. I do not agree with the Government’s assessment that the governance subsection is adequate. It states that the risk assessment must take into account

“how the design and operation of the service (including the business model, governance, use of proactive technology…) may reduce or increase the risks identified.”

It is actually an assessment of whether the governance structure has an impact on the risk assessment. It has no impact whatever on the level at which the risk assessment is approved or not approved; it is about the risks that the governance structure poses to children or adults, depending on which section of the Bill we are looking at.

The Minister should consider what is being asked in the amendment, which is about the decision-making level at which the risk assessments are approved. I know the Minister has spoken already, but some clarification would be welcome. Does he expect a junior tech support member of staff, or a junior member of the legal team, to write the risk assessment and then put it in a cupboard? Or perhaps they approve it themselves and then nothing happens with it until Ofcom asks for it. Does he think that Ofcom would look unfavourably on behaviour like that? If he was very clear with us about that, it might put our minds at rest. Does he think that someone in a managerial position or a board member, or the board itself, should take decisions, rather than a very junior member of staff? There is a big spread of people who could be taking decisions. If he could give us an indication of what Ofcom might look favourably on, it would be incredibly helpful for our deliberations.

Chris Philp Portrait Chris Philp
- Hansard - -

I am anxious about time, but I will respond to that point because it is an important one. The hon. Lady is right to say that clause 10(6)(h) looks to identify the risks associated with governance. That is correct—it is a risk assessment. However, in clause 11(2)(a) there is a duty to mitigate those risks, having identified what the risks are. If, as she hypothesised, a very junior person was looking at these matters from a governance point of view, that would be identified as a risk. If it was not, Ofcom would find that that was not sufficient or suitable. That would breach clause 10(2), and the service would then be required to mitigate. If it did not mitigate the risks by having a more senior person taking the decision, Ofcom would take enforcement action for its failure under clause 11(2)(a).

For the record, should Ofcom or lawyers consult the transcript to ascertain Parliament’s intention in the course of future litigation, it is absolutely the Government’s view, as I think it is the hon. Lady’s, that a suitable level of decision making for a children’s risk assessment would be a very senior level. The official Opposition clearly think that, because they have put it in their amendment. I am happy to confirm that, as a Minister, I think that. Obviously the hon. Lady, who speaks for the SNP, does too. If the transcripts of the Committee’s proceedings are examined in the future to ascertain Parliament’s intention, Parliament’s intention will be very clear.

None Portrait The Chair
- Hansard -

Barbara Keeley, do you have anything to add?

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

All I have to add is the obvious point—I am sure that we are going to keep running into this—that people should not have to look to a transcript to see what the Minister’s and Parliament’s intention was. It is clear what the Opposition’s intention is—to protect children. I cannot see why the Minister will not specify who in an organisation should be responsible. It should not be a question of ploughing through transcripts of what we have talked about here in Committee; it should be obvious. We have the chance here to do something different and better. The regulator could specify a senior level.

Chris Philp Portrait Chris Philp
- Hansard - -

Clearly, we are legislating here to cover, as I think we said this morning, 25,000 different companies. They all have different organisational structures, different personnel and so on. To anticipate the appropriate level of decision making in each of those companies and put it in the Bill in black and white, in a very prescriptive manner, might not adequately reflect the range of people involved.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I will first speak to our amendment 85, which, like the Labour amendment, seeks to ensure that the Bill is crystal clear in addressing intersectionality. We need only consider the abuse faced by groups of MPs to understand why that is necessary. Female MPs are attacked online much more regularly than male MPs, and the situation is compounded if they have another minority characteristic. For instance, if they are gay or black, they are even more likely to be attacked. In fact, the MP who is most likely to be attacked is black and female. There are very few black female MPs, so it is not because of sheer numbers that they are at such increased risk of attack. Those with a minority characteristic are at higher risk of online harm, but the risk facing those with more than one minority characteristic is substantially higher, and that is what the amendment seeks to address.

I have spoken specifically about people being attacked on Twitter, Facebook and other social media platforms, but people in certain groups face an additional significant risk. If a young gay woman does not have a community around her, or if a young trans person does not know anybody else who is trans, they are much more likely to use the internet to reach out, to try to find people who are like them, to try to understand. If they are not accepted by their family, school or workplace, they are much more likely to go online to find a community and support—to find what is out there in terms of assistance—but using the internet as a vulnerable, at-risk person puts them at much more significant risk. This goes back to my earlier arguments about people requiring anonymity to protect themselves when using the internet to find their way through a difficult situation in which they have no role models.

It should not be difficult for the Government to accept this amendment. They should consider it carefully and understand that all of us on the Opposition Benches are making a really reasonable proposal. This is not about saying that someone with only one protected characteristic is not at risk; it is about recognising the intersectionality of risk and the fact that the risk faced by those who fit into more than one minority group is much higher than that faced by those who fit into just one. This is not about taking anything away from the Bill; it is about strengthening it and ensuring that organisations listen.

We have heard that a number of companies are not providing the protection that Members across the House would like them to provide against child sexual abuse. The governing structures, risk assessments, rules and moderation at those sites are better at ensuring that the providers make money than they are at providing protection. When regulated providers assess risk, it is not too much to ask them to consider not just people with one protected characteristic but those with multiple protected characteristics.

As MPs, we work on that basis every day. Across Scotland and the UK, we support our constituents as individuals and as groups. When protected characteristics intersect, we find ourselves standing in Parliament, shouting strongly on behalf of those affected and giving them our strongest backing, because we know that that intersection of harms is the point at which people are most vulnerable, in both the real and the online world. Will the Minister consider widening the provision so that it takes intersectionality into account and not only covers people with one protected characteristic but includes an over and above duty? I genuinely do not think it is too much for us to ask providers, particularly the biggest ones, to make this change.

Chris Philp Portrait Chris Philp
- Hansard - -

Once again, the Government recognise the intent behind these amendments and support the concept that people with multiple intersecting characteristics, or those who are members of multiple groups, may experience—or probably do experience—elevated levels of harm and abuse online compared with others. We completely understand and accept that point, as clearly laid out by the hon. Member for Aberdeen North.

There is a technical legal reason why the singular forms “characteristic” and “group” are used here. Section 6(c) of the Interpretation Act 1978 sets out how words in Bills and Acts are interpreted, namely that words in the singular also cover the plural. That means that references in the singular, such as

“individuals with a certain characteristic”

in clause 10(6)(d), also cover characteristics in the plural. A reference to the singular implies a reference to the plural.

Will those compounded risks, where they exist, be taken into account? The answer is yes, because the assessments must assess the risk in front of them. Where there is evidence that multiple protected characteristics or the membership of multiple groups produce compounded risks, as the hon. Lady set out, the risk assessment has to reflect that. That includes the general sectoral risk assessment carried out by Ofcom, which is detailed in clause 83, and Ofcom will then produce guidance under clause 84.

The critical point is that, because there is evidence of high levels of compounded risk when people have more than one characteristic, that must be reflected in the risk assessment, otherwise it is inadequate. I accept the point behind the amendments, but I hope that that explains, with particular reference to the 1978 Act, why the Bill as drafted covers that valid point.

None Portrait The Chair
- Hansard -

Barbara Keeley?

--- Later in debate ---
Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

My apologies. I will rise later.

Chris Philp Portrait Chris Philp
- Hansard - -

The Government obviously support the objective of these amendments, which is to prevent children from suffering the appalling sexual and physical abuse that the hon. Member for Worsley and Eccles South outlined in her powerful speech. It is shocking that these incidents have risen in the way that she described.

To be clear, that sort of appalling sexual abuse is covered by clause 9, which we have debated already and which covers illegal content. As Members would expect, child sexual abuse is defined as one of the items of priority illegal content, which are listed in more detail in schedule 6, where the offences that relate to sexual abuse are enumerated. As child sexual exploitation is a priority offence, services are already obliged through clause 9 to be “proactive” in preventing it from happening. As such, as Members would expect, the requirements contained in these amendments are already delivered through clause 9.

The hon. Member for Worsley and Eccles South also asked when we are going to hear what the primary priority harms to children might be. To be clear, those will not include the sexual exploitation offences, because as Members would also expect, those are already in the Bill as primary illegal offences. The primary priority harms might include material promoting eating disorders and that kind of thing, which is not covered by the criminal matters—the illegal matters. I have heard the hon. Lady’s point that if that list were to be published, or at least a draft list, that would assist Parliament in scrutinising the Bill. I will take that point away and see whether there is anything we can do in that area. I am not making a commitment; I am just registering that I have heard the point and will take it away.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Question proposed, That the clause stand part of the Bill.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

I rise to speak to clause 11, because this is an important part of the Bill that deals with the safety duties protecting children. Many of us here today are spurred on by our horror at the way in which internet providers, platform providers and search engines have acted over recent years, developing their products with no regard for the safety of children, so I applaud the Government for bringing forward this groundbreaking legislation. They are literally writing the book on this, but in doing so, we have to be very careful about the language we use and the way in which we frame our requirements of these organisations. The Minister has rightly characterised these organisations as being entirely driven by finance, not the welfare of their consumers, which must make them quite unique in the world. I can only hope that that will change: presumably, over time, people will not want to use products that have no regard for the safety of those who use them.

In this particular part of the Bill, the thorny issue of age assurance comes up. I would value the Minister’s views on some of the evidence that we received during our evidence sessions about how we ensure that age assurance is effective. Some of us who have been in this place for a while would be forgiven for thinking that we had already passed a law on age assurance. Unfortunately, that law did not seem to come to anything, so let us hope that second time is lucky. The key question is: who is going to make sure that the age assurance that is in place is good enough? Clause 11(3) sets out

“a duty to operate a service using proportionate systems and processes”

designed to protect children, but what is a proportionate system? Who is going to judge that? Presumably it will be Ofcom in the short term, and in the long term, I am sure the courts will get involved.

In our evidence, we heard some people advocating very strongly for these sorts of systems to be provided by third parties. I have to say, in a context where we are hearing how irresponsible the providers of these services are, I can understand why people would think that a third party would be a more responsible way forward. Can the Minister help the Committee understand how Ofcom will ensure that the systems used, particularly the age assurance systems, are proportionate—I do not particularly like that word; I would like those systems to be brilliant, not proportionate—and are actually doing what we need them to do, which is to safeguard children? For the record, and for the edification of judges who are looking at this matter in future—and, indeed, Ofcom—will he set out how important this measure is within the Bill?

Chris Philp Portrait Chris Philp
- Hansard - -

I thank my right hon. Friend for her remarks, in which she powerfully and eloquently set out how important the clause is to protecting children. She is right to point out that this is a critical area in the Bill, and it has wide support across the House. I am happy to emphasise, for the benefit of those who may study our proceedings in future, that protecting children is probably the single most important thing that the Bill does, which is why it is vital that age-gating, where necessary, is effective.

My right hon. Friend asked how Ofcom will judge whether the systems under clause 11(3) are proportionate to

“prevent children of any age from encountering”

harmful content and so on. Ultimately, the proof of the pudding is in the eating; it has to be effective. When Ofcom decides whether a particular company or service is meeting the duty set out in the clause, the simple test will be one of effectiveness: is it effective and does it work? That is the approach that I would expect Ofcom to take; that is the approach that I would expect a court to take. We have specified that age verification, which is the most hard-edged type of age assurance—people have to provide a passport or something of that nature—is one example of how the duty can be met. If another, less-intrusive means is used, it will still have to be assessed as effective by Ofcom and, if challenged, by the courts.

I think my right hon. Friend was asking the Committee to confirm to people looking at our proceedings our clear intent for the measures to be effective. That is the standard to which we expect Ofcom and the courts to hold those platforms in deciding whether they have met the duties set out in the clause.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

For clarification, does the Minister anticipate that Ofcom might be able to insist that a third-party provider be involved if there is significant evidence that the measures put in place by a platform are ineffective?

Chris Philp Portrait Chris Philp
- Hansard - -

We have deliberately avoided being too prescriptive about precisely how the duty is met. We have pointed to age verification as an example of how the duty can be met without saying that that is the only way. We would not want to bind Ofcom’s hands, or indeed the hands of platforms. Clearly, using a third party is another way of delivering the outcome. If a platform were unable to demonstrate to Ofcom that it could deliver the required outcome using its own methods, Ofcom may well tell it to use a third party instead. The critical point is that the outcome must be delivered. That is the message that the social media firms, Ofcom and the courts need to hear when they look at our proceedings. That is set out clearly in the clause. Parliament is imposing a duty, and we expect all those to whom the legislation applies to comply with it.

Question put and agreed to.

Clause 11 accordingly ordered to stand part of the Bill.

Clause 12

Adults’ risk assessment duties

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I beg to move amendment 12, in clause 12, page 12, line 10, at end insert—

“(4A) A duty to publish the adults’ risk assessment and proactively supply this to OFCOM.”

This amendment creates a duty to publish the adults’ risk assessment and supply it to Ofcom.

--- Later in debate ---
We have seen what years of no accountability have done to the online space. My hon. Friend referred to Frances Haugen’s experiences at Meta, which we all heard about recently in evidence sessions—none of it filled me with confidence. We know that those category 1 companies have the information, but they will not feel compelled to publish it until there is a statutory duty to do so. The Minister knows that would be an extremely welcome move; he would be commended by academics, stakeholders, parliamentarians and the public alike. Why exactly does that glaring omission remain? If the Minister cannot answer me fully, and instead refers to platforms looking to Hansard in the future, then I am keen to press this amendment to a Division. I cannot see the benefits of withholding those risk assessments from the public and academics.
Chris Philp Portrait Chris Philp
- Hansard - -

Once again, I agree with the point about transparency and the need to have those matters brought into the light of day. We heard from Frances Haugen how Facebook—now Meta—actively resisted doing so. However, I point to two provisions already in the Bill that deliver precisely that objective. I know we are debating clause 12, but there is a duty in clause 13(2) for platforms to publish in their terms of service—a public document—the findings of the most recent adults’ risk assessment. That duty is in clause 13—the next clause we are going to debate—in addition to the obligations I have referred to twice already in clause 64, under which Ofcom compels those firms to publish their transparency reports. I agree with the points that the shadow Minister made, but suggest that through clause 13(2) and clause 64, those objectives are met in the Bill as drafted.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I thank the Minister for his comments, but sadly we do not feel that is appropriate or robust enough, which is why we will be pressing the amendment to a Division.

Question put, That the amendment be made.

The Committee divided.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

While I am at risk of parroting my hon. Friend the Member for Worsley and Eccles South on clause 11, it is important that adults and the specific risks they face online are considered in the clause. The Minister knows we have wider concerns about the specific challenges of the current categorisation system. I will come on to that at great length later, but I thought it would be helpful to remind him at this relatively early stage that the commitments to safety and risk assessments for category 1 services will only work if category 1 encapsulates the most harmful platforms out there. That being said, Labour broadly supports this clause and has not sought to amend it.

Chris Philp Portrait Chris Philp
- Hansard - -

I am eagerly awaiting the lengthy representations that the shadow Minister just referred to, as are, I am sure, the whole Committee and indeed the millions watching our proceedings on the live broadcast. As the shadow Minister said, clause 13 sets out the safety duties in relation to adults. This is content that is legal but potentially harmful to adults, and for those topics specified in secondary legislation, it will require category 1 services to set out clearly what actions they might be taking—from the actions specified in subsection (4)—in relation to that content.

It is important to specify that the action they may choose to take is a choice for the platform. I know some people have raised issues concerning free speech and these duties, but I want to reiterate and be clear that this is a choice for the platform. They have to be publicly clear about what choices they are making, and they must apply those choices consistently. That is a significant improvement on where we are now, where some of these policies are applied arbitrarily.

Question put and agreed to.

Clause 13 accordingly ordered to stand part of the Bill.

Clause 14

User empowerment duties

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I beg to move amendment 46, in clause 14, page 14, line 12, after “non-verified users” insert

“and to enable them to see whether another user is verified or non-verified.”

This amendment would make it clear that, as part of the User Empowerment Duty, users should be able to see which other users are verified and which are non-verified.

--- Later in debate ---
None Portrait The Chair
- Hansard -

If no other Member would like to speak to amendment 46, I call the Minister.

Chris Philp Portrait Chris Philp
- Hansard - -

I would be delighted to speak to the amendment, which would change the existing user empowerment duty in clause 14 to require category 1 services to enable adult users to see whether other users are verified. In effect, however, that objective already follows as a natural consequence of the duty in clause 14(6). When a user decides to filter out non-verified users, by definition such users will be able to see content only from verified users, so they could see from that who was verified and who was not. The effect intended by the amendment, therefore, is already achieved through clause 14(6).

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I am sorry to disagree with the Minister so vigorously, but that is a rubbish argument. It does not make any sense. There is a difference between wanting to filter out everybody who is not verified and wanting to actually see if someone who is threatening someone else online is a verified or a non-verified user. Those are two very different things. I can understand why a politician, for example, might not want to filter out unverified users but would want to check whether a person was verified before going to the police to report a threat.

Chris Philp Portrait Chris Philp
- Hansard - -

When it comes to police investigations, if something is illegal and merits a report to the police, users should report it, regardless of whether someone is verified or not—whatever the circumstances. I would encourage any internet user to do that. That effectively applies on Twitter already; some people have blue ticks and some people do not, and people should report others to the police if they do something illegal, whether or not they happen to have a blue tick.

Amendment 47 seeks to create a definition of identity verification in clause 189. In addition, it would compel the person’s real name to be displayed. I understand the spirit of the amendment, but there are two reasons why I would not want to accept it and would ask hon. Members not to press it. First, the words “identity verification” are ordinary English words with a clear meaning, and we do not normally define in legislation ordinary English words with a clear meaning. Secondly, the amendment would add the new requirement that, if somebody is verified, their real name has to be displayed; that is not the effect of the Bill as currently drafted. Somebody may be verified, and the company knows who they are—if the police go to the company, they will have the verified information—but there is no obligation, as the Bill is drafted, for that information to be displayed publicly. The effect of that part of the amendment would be to force users to choose between disclosing their identity to everyone and having no control over who they interact with. That may not have been the intention, but I am not sure that the amendment would necessarily make sense.

New clause 8 would place requirements on Ofcom about how to produce guidance on user identity verification and what that guidance must contain. We already have provisions on that in clause 58, which we will no doubt come to, although probably not today—maybe on Thursday. Clause 58 allows Ofcom to include in its regulatory guidance the principles and standards referenced in the new clause, which can then assist service providers in complying with their duties. Of course, if they choose to ignore the guidelines and do not comply with their duties, they will be subject to enforcement action, but we want to ensure that there is flexibility for Ofcom, in writing those guidelines, and for companies, in following those guidelines or taking alternative steps to meet their duty.

This morning, a couple of Members talked about the importance of remaining flexible and being open to future changes in technology and a wide range of user needs. We want to make sure that flexibility is retained. As drafted, new clause 8 potentially undermines that flexibility. We think that the powers set out in clause 58 give Ofcom the ability to set the relevant regulatory guidance.

Clause 14 implements the proposals made by my hon. Friend the Member for Stroud in her ten-minute rule Bill and the proposals made, as the shadow Minister has said, by a number of third-party stakeholders. We should all welcome the fact that these new user empowerment duties have now been included in the Bill in response to such widespread parliamentary lobbying.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I am grateful to the Minister for giving way. I want to recount my own experience on this issue. He mentioned that anybody in receipt of anonymous abuse on social media should report it to the police, especially if it is illegal. On Thursday, I dared to tweet my opinions on the controversial Depp-Heard case in America. As a result of putting my head above the parapet, my Twitter mentions were an absolute sewer of rape threats and death threats, mainly from anonymous accounts. My Twitter profile was mocked up—I had devil horns and a Star of David on my forehead. It was vile. I blocked, deleted and moved on, but I also reported those accounts to Twitter, especially those that sent me rape threats and death threats.

That was on Thursday, and to date no action has been taken and I have not received any response from Twitter about any of the accounts I reported. The Minister said they should be reported to the police. If I reported all those accounts to the police, I would still be there now reporting them. How does he anticipate that this will be resourced so that social media companies can tackle the issue? That was the interaction resulting from just one tweet that I sent on Thursday, and anonymous accounts sent me a barrage of hate and illegal activity.

Chris Philp Portrait Chris Philp
- Hansard - -

The shadow Minister raises a very good point. Of course, what she experienced on Twitter was despicable, and I am sure that all members of the Committee would unreservedly condemn the perpetrators who put that content on there. Once the Bill is passed, there will be legal duties on Twitter to remove illegal content. At the moment, they do not exist, and there is no legal obligation for Twitter to remove that content, even though much of it, from the sound of it, would cross one of various legal thresholds. Perhaps some messages qualify as malicious communication, and others might cross other criminal thresholds. That legal duty does not exist at the moment, but when this Bill passes, for the first time there will be that duty to protect not just the shadow Minister but users across the whole country.

Question put, That the amendment be made.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Sometimes we overlook the fact that although MPs face abuse, we have a level of protection as currently elected Members. Even in the run-up to an election, we have a level of security protection and access that is much higher than for anybody else challenging a candidate or standing in a council or a Scottish Parliament election. As sitting MPs, we already have an additional level of protection because of the security services we have in place. We need to remember—and I assume this is why the amendment is drawn in a pretty broad way—that everybody standing for any sort of elected office faces significant risk of harm, whether or not that harm meets the threshold for illegality.

There are specific things that have been mentioned. As has been said, epilepsy is specifically mentioned as an area where specific harm occurs. Given the importance of democracy, which is absolutely vital, we need to have a democratic system where people are able to stand in elections and make their case. That is why we have election addresses and a system where the election address gets delivered through every single person’s door. There is an understanding and acceptance by people involved in designing democratic processes that the message of all candidates needs to get out there. If the message of all candidates cannot get out there because some people are facing significant levels of abuse online, then democracy is not acting in the way that it should be. These amendments are fair and make a huge amount of sense. They protect the most important tenets of democracy and democratic engagement.

I want to say something about my own specific experiences. We have reported people to the police and have had people in court over the messages they have sent, largely by email, which would not be included in the Bill, but there have also been some pretty creepy ones on social media that have not necessarily met the threshold. As has been said, it is my staff who have had to go to court and stand in the witness box to explain the shock and terror they have felt on seeing the email or the communication that has come in, so I think any provision should include that.

Finally, we have seen situations where people working in elections—this is not an airy-fairy notion, but something that genuinely happened—have been photographed and those pictures have been shared on social media, and they have then been abused as a result. They are just doing their job, handing out ballot papers or standing up and announcing the results on the stage, and they have to abide by the processes that are in place now. In order for us to have free and fair elections that are run properly and that people want to work at and support, we need to have that additional level of protection. The hon. Member for Batley and Spen made a very reasonable argument and I hope the Minister listened to it carefully.

Chris Philp Portrait Chris Philp
- Hansard - -

I have listened very carefully to both the hon. Member for Batley and Spen and the hon. Member for Aberdeen North. I agree with both of them that abuse and illegal activity directed at anyone, including people running for elected office, is unacceptable. I endorse and echo the comments they made in their very powerful and moving speeches.

In relation to the technicality of these amendments, what they are asking for is in the Bill already, but in different places. This clause is about protecting content of “democratic importance” and stopping social media firms from deleting such content through over-zealous takedown. What the hon. Members are talking about is different: the abuse and illegal activities, such as rape threats, that people—particularly female MPs, as they both pointed out—receive on social media. I can point to two other places in the Bill where what they are asking for is delivered.

First, there are the duties around illegal content that we debated this morning. If there is content online that is illegal—some of the stuff that the shadow Minister referred to earlier sounds as if it would meet that threshold—then in the Bill there is a duty on social media firms to remove that content and to proactively prevent it if it is on the priority list. The route to prosecution will exist in future, as it does now, and the user-verification measures, if a user is verified, make it more likely for the police to identify the person responsible. In the context of identifying people carrying out abuse, I know the Home Office is looking at the Investigatory Powers Act 2016 as a separate piece of work that speaks to that issue.

So illegal content is dealt with in the illegal content provisions in the Bill, but later we will come to clause 150, which updates the Malicious Communications Act 1988 and creates a new harmful communications offence. Some of the communications that have been described may not count as a criminal offence under other parts of criminal law, but if they meet the test of harmful communication in clause 150, they will be criminalised and will therefore have to be taken down, and prosecution will be possible. In meeting the very reasonable requests that the hon. Members for Batley and Spen and for Aberdeen North have made, I would point to those two parts of the Bill.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

But clause 150(5) says that if a message

“is, or is intended to be, a contribution to a matter of public interest”,

people are allowed to send it, which basically gives everybody a get-out clause in relation to anything to do with elections.

Chris Philp Portrait Chris Philp
- Hansard - -

No, it does not.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I know we are not discussing that part of the Bill, and if the Minister wants to come back to this when we get to clause 150, I have no problem with that.

Chris Philp Portrait Chris Philp
- Hansard - -

I will answer the point now, as it has been raised. Clause 150 categorically does not give a get-out-of-jail-free card or provide an automatic excuse. Clearly, there is no way that abusing a candidate for elected office with rape threats and so on could possibly be considered a matter of public interest. In fact, even if the abuse somehow could be considered as possibly contributing to public debate, clause 150(5) says explicitly in line 32 on page 127:

“but that does not determine the point”.

Even where there is some potentially tenuous argument about a contribution to a matter of public interest, which most definitely would not be the case for the rape threats that have been described, that is not determinative. It is a balancing exercise that gets performed, and I hope that puts the hon. Lady’s mind at rest.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

The Minister makes a really valid point and is right about the impact on the individual. The point I am trying to make with the amendments is that this is about the impact on the democratic process, which is why I think it fits in with clause 15. It is not about how individuals feel; it is about the impact that that has on behaviours, and about putting the emphasis and onus on platforms to decide what is of democratic importance. In the evidence we had two weeks ago, the witnesses certainly did not feel comfortable with putting the onus on platforms. If we were to have a code of practice, we would at least give them something to work with on the issue of what is of democratic importance. It is about the impact on democracy, not just the harm to the individual involved.

Chris Philp Portrait Chris Philp
- Hansard - -

Clearly, if a communication is sufficiently offensive that it meets the criminal threshold, it is covered, and that would obviously harm the democratic process as well. If a communication was sufficiently offensive that it breached the harmful communication offence in clause 150, it would also, by definition, harm the democratic process, so communications that are damaging to democracy would axiomatically be caught by one thing or the other. I find it difficult to imagine a communication that might be considered damaging to democracy yet met neither of those two criteria—that is, one that was not illegal and did not meet the definition of a harmful communication.

My main point is that the existing provisions in the Bill address the kinds of behaviours that were described in those two speeches—the illegal content provisions, and the new harmful communication offence in clause 150. On that basis, I hope the hon. Member for Batley and Spen will withdraw the amendment, safe in the knowledge that the Bill addresses the issue that she rightly and reasonably raises.

Question put, That the amendment be made.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I will speak to clauses 15 and 16 and to new clause 7. The duties outlined in the clause, alongside clause 16, require platforms to have special terms and processes for handling journalistic and democratically important content. In respect of journalistic content, platforms are also required to provide an expedited appeals process for removed posts, and terms specifying how they will define journalistic content. There are, however, widespread concerns about both those duties.

As the Bill stands, we feel that there is too much discretion for platforms. They are required to define “journalistic” content, a role for which they are completely unsuited and which, from what I can gather, they do not want. In addition, the current drafting leaves the online space open to abuse. Individuals intent on causing harm are likely to try to take advantage of either of those duties, masquerading as journalists or claiming democratic importance for whatever harm they are causing—and that could apply to almost anything. In the evidence sessions, we also heard the concerns, expressed brilliantly by Kyle Taylor from Fair Vote and Ellen Judson from Demos, that the definitions as they stand in the Bill are broad and vague. However, we will come on to those matters later.

Ultimately, treating “journalistic” and “democratically important” content differently is unworkable, leaving platforms to make impossible judgments over, for example, when and for how long an issue becomes a matter of reasonable public debate, or in what settings a person is acting as a journalist. As the Minister knows, the duties outlined in the clause could enable a far-right activist who was standing in an election, or potentially even just supporting candidates in elections, to use all social media platforms. That might allow far-right figures to be re-platformed on to social media sites where they would be free to continue spreading hate.

The Bill indicates that content will be protected if created by a political party ahead of a vote in Parliament, an election or a referendum, or when campaigning on a live political issue—basically, anything. Can the Minister confirm whether the clause means that far-right figures who have already been de-platformed for hate speech must be reinstated if they stand in an election? Does that include far-right or even neo-Nazi political parties? Content and accounts that have been de-platformed from mainstream platforms for breaking terms of service should not be allowed to return to those platforms via this potentially dangerous loophole.

As I have said, however, I know that these matters are complex and, quite rightly, exemptions must be in place to allow for free discussion around matters of the day. What cannot be allowed to perpetuate is hate sparked by bad actors using simple loopholes to avoid any consequences.

On clause 16, the Minister knows about the important work that Hope not Hate is doing in monitoring key far-right figures, and I pay tribute to it for its excellent work. Many of those figures self-define as journalists and could seek to exploit this loophole in the Bill to propagate hate online. Some of the most high-profile and dangerous far-right figures in the UK, including Stephen Yaxley-Lennon, also known as Tommy Robinson, now class themselves as journalists. There are also far-right and conspiracy-theory so-called “news companies” such as Rebel Media and Urban Scoop. Both replicate mainstream news publishers but are used to spread misinformation and discriminatory content. Many of those individuals and organisations have already been de-platformed for consistently breaking the terms of service of major social media platforms, and the exemption could see them demand their return and be allowed back.

New clause 7 would require the Secretary of State to publish a report reviewing the effectiveness of clauses 15 and 16. It is a simple new clause to require parliamentary scrutiny of how the Government’s chosen means of protecting content of democratic importance and content of journalistic importance are working.

Hacked Off provided me with a list of people it found who have claimed to be journalists and who would seek to exploit the journalistic content duty, despite being banned from social media because they are racists or bad actors. First is Charles C. Johnson, a far-right activist who describes himself as an “investigative journalist”. Already banned from Twitter for saying he would “take out” a civil rights activist, he is also alleged to be a Holocaust denier.

Secondly, we have Robert Stacy McCain, who has been banned from Twitter for participating in targeted abuse. He was a journalist for The Washington Post, but is alleged to have also been a member of the League of the South, a far-right group known to include racists. Then there is Richard B. Spencer, a far-right journalist and former editor, only temporarily banned for using overlapping accounts. He was pictured making the Nazi salute and has repeated Nazi propaganda. When Trump became President, he encouraged people to “party like it’s 1933”. Sadly, the list goes on and on.

Transparency is at the very heart of the Bill. The Minister knows we have concerns about clauses 15 and 16, as do many of his own Back Benchers. We have heard from my hon. Friend the Member for Batley and Spen how extremist groups and individuals and foreign state actors are having a very real impact on the online space. If the Minister is unwilling to move on tightening up those concepts, the very least he could commit to is a review that Parliament will be able to formally consider.

Chris Philp Portrait Chris Philp
- Hansard - -

I thank the shadow Minister for her comments and questions. I would like to pick up on a few points on the clauses. First, there was a question about what content of democratic importance and content of journalistic importance mean in practice. As with many concepts in the Bill, we will look to Ofcom to issue codes of practice specifying precisely how we might expect platforms to implement the various provisions in the Bill. That is set out in clause 37(10)(e) and (f), which appear at the top of page 37, for ease. Clauses 15 and 16 on content of democratic and journalistic importance are expressly referenced as areas where codes of practice will have to be published by Ofcom, which will do further work on and consult on that. It will not just publish it, but will go through a proper process.

The shadow Minister expressed some understandable concerns a moment ago about various extremely unpleasant people, such as members of the far right who might somehow seek to use the provisions in clauses 15 and 16 as a shield behind which to hide, to enable them to continue propagating hateful, vile content. I want to make it clear that the protections in the Bill are not absolute—it is not that if someone can demonstrate that what they are saying is of democratic importance, they can say whatever they like. That is not how the clauses are drafted.

I draw attention to subsection (2) of both clauses 15 and 16. At the end of the first block of text, just above paragraph (a), it says “taken into account”: the duty is to ensure that matters concerning the importance of freedom of expression relating to content of democratic importance are taken into account when making decisions. It is not an absolute prohibition on takedown or an absolute protection, but simply something that has to be taken into account.

If someone from the far right, as the shadow Minister described, was spewing out vile hatred, racism or antisemitism, and tried to use those clauses, the fact that they might be standing in an election might well be taken into account. However, in performing that balancing exercise, the social media platforms and Ofcom acting as enforcers—and the court if it ever got judicially reviewed—would weigh those things up and find that taking into account content of democratic importance would not be sufficient to outweigh considerations around vile racism, antisemitism or misogyny.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

The Minister mentions that it would be taken into account. How long does he anticipate it would be taken into account for, especially given the nature of an election? A short campaign could be a number of weeks, or something could be posted a day before an election, be deemed democratically important and have very serious and dangerous ramifications.

Chris Philp Portrait Chris Philp
- Hansard - -

As I say, if content was racist, antisemitic or flagrantly misogynistic, the balancing exercise is performed and the democratic context may be taken into account. I do not think the scales would tip in favour of leaving the content up. Even during an election period, I think common sense dictates that.

To be clear on the timing point that the hon. Lady asked about, the definition of democratic importance is not set out in hard-edged terms. It does not say, “Well, if you are in a short election period, any candidate’s content counts as of democratic importance.” It is not set out in a manner that is as black and white as that. If, for example, somebody was a candidate but it was just racist abuse, I am not sure how even that would count as democratic importance, even during an election period, because it would just be abuse; it would not be contributing to any democratic debate. Equally, somebody might not be a candidate, or might have been a candidate historically, but might be contributing to a legitimate debate after an election. That might be seen as being of democratic importance, even though they were not actually a candidate. As I said, the concept is not quite as black and white as that. The main point is that it is only to be taken into account; it is not determinative.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I appreciate the Minister’s allowing me to come back on this. During the Committee’s evidence sessions, we heard of examples where bad-faith state actors were interfering in the Scottish referendum, hosting Facebook groups and perpetuating disinformation around the royal family to persuade voters to vote “Yes” to leave the United Kingdom. That disinformation by illegal bad-faith actors could currently come under both the democratic importance and journalistic exemptions, so would be allowed to remain for the duration of that campaign. Given the exemptions in the Bill, it could not be taken down but could have huge, serious ramifications for democracy and the security of the United Kingdom.

Chris Philp Portrait Chris Philp
- Hansard - -

I understand the points that the hon. Lady is raising. However, I do not think that it would happen in that way.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

You don’t think?

Chris Philp Portrait Chris Philp
- Hansard - -

No, I don’t. First of all, as I say, it is taken into account; it is not determinative. Secondly, on the point about state-sponsored disinformation, as I think I mentioned yesterday in response to the hon. Member for Liverpool, Walton, a new criminal offence of foreign interference is, as we speak, being created in the National Security Bill. That will criminalise the kind of foreign interference in elections that she referred to. Because that would create a new category of illegal content, it would flow through into this Bill and would not be overridden by the duty to protect content of democratic importance set out here. I think that the combination of the fact that this is a balancing exercise, not a determinative one, and the new foreign interference offence being created in the National Security Bill will address the issue that the hon. Lady is raising—reasonably, because it has happened in this country, as she has said.

I will briefly turn to new clause 7, which calls for a review. I understand why the shadow Minister is proposing a review, but there is already a review mechanism in the Bill; it is to be found in clause 149, and will, of course, include a review of the way that clauses 15 and 16 operate. They are important clauses; we all accept that journalistic content and content of democratic importance are critical to the functioning of our society. Case law relating to article 10 of the European convention on human rights, for example, recognises content of journalistic importance as especially critical. These two clauses seek to ensure that social media firms, in making their decisions, and Ofcom, in enforcing against the firms, take account of that. However, it is no more than that: it is “take account”; it is not determinative.

Question put and agreed to.

Clause 15 accordingly ordered to stand part of the Bill.

Clause 16 ordered to stand part of the Bill.

Ordered, That further consideration be now adjourned. —(Steve Double.)

Online Safety Bill (Seventh sitting)

Chris Philp Excerpts
Committee stage
Thursday 9th June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 9 June 2022 - (9 Jun 2022)
None Portrait The Chair
- Hansard -

With this it will be convenient to discuss clause 27 stand part.

Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - -

Good morning, Ms Rees. It is a pleasure to serve once again under your chairmanship. I wondered whether the shadow Minister, the hon. Member for Pontypridd, wanted to speak first—I am always happy to follow her, if she would prefer that.

Chris Philp Portrait Chris Philp
- Hansard - -

I do my best.

Clauses 17 and 27 have similar effects, the former applying to user-to-user services and the latter to search services. They set out an obligation on the companies to put in place effective and accessible content reporting mechanisms, so that users can report issues. The clauses will ensure that service providers are made aware of illegal and harmful content on their sites. In relation to priority illegal content, the companies must proactively prevent it in the first place, but in the other areas, they may respond reactively as well.

The clause will ensure that anyone who wants to report illegal or harmful content can do so in a quick and reasonable way. We are ensuring that everyone who needs to do that will be able to do so, so the facility will be open to those who are affected by the content but who are not themselves users of the site. For example, that might be non-users who are the subject of the content, such as a victim of revenge pornography, or non-users who are members of a specific group with certain characteristics targeted by the content, such as a member of the Jewish community reporting antisemitic content. There is also a facility for parents and other adults with caring responsibility for children, and adults caring for another adult, to report content. Clause 27 sets out similar duties in relation to search. I commend the clauses to the Committee.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I want to raise an additional point about content reporting and complaints procedures. I met with representatives of Mencap yesterday, who raised the issue of the accessibility of the procedures that are in place. I appreciate that the Bill talks about procedures being accessible, but will the Minister give us some comfort about Ofcom looking at the reporting procedures that are in place, to ensure that adults with learning disabilities in particular can access those content reporting and complaints procedures, understand them and easily find them on sites?

That is a specific concern that Mencap raised on behalf of its members. A number of its members will be users of sites such as Facebook, but may find it more difficult than others to access and understand the procedures that are in place. I appreciate that, through the Bill, the Minister is making an attempt to ensure that those procedures are accessible, but I want to make sure they are accessible not just for the general public but for children, who may need jargon-free access to content reporting and complaints procedures, and for people with learning disabilities, who may similarly need jargon-free, easy-to-understand and easy-to-find access to those procedures.

Chris Philp Portrait Chris Philp
- Hansard - -

Let me try to address some of the questions that have been raised in this short debate, starting with the question that the hon. Member for Aberdeen North quite rightly asked at the beginning. She posed the question, “What if somebody who is not an affected person encountered some content and wanted to report it?” For example, she might encounter some racist content on Twitter or elsewhere and would want to be able to report it, even though she is not herself the target of it or necessarily a member of the group affected. I can also offer the reassurance that my hon. Friend the Member for Wolverhampton North East asked for.

The answer is to be found in clause 17(2), which refers to

“A duty to operate a service using systems and processes that allow users and”—

I stress “and”—“affected persons”. As such, the duty to offer content reporting is owed to users and affected persons, so if the hon. Member for Aberdeen North were a user of Twitter but not herself an affected person, she would still be able to report content in her capacity as a user. I hope that provides clarification.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I appreciate that. That is key, and I am glad that this is wider than just users of the site. However, taking Reddit as an example, I am not signed up to that site, but I could easily stumble across content on it that was racist in nature. This clause would mean that I could not report that content unless I signed up to Reddit, because I would not be an affected person or a user of that site.

Chris Philp Portrait Chris Philp
- Hansard - -

I thank the hon. Lady for her clarificatory question. I can confirm that in order to be a user of a service, she would not necessarily have to sign up to it. The simple act of browsing that service, of looking at Reddit—not, I confess, an activity that I participate in regularly—regardless of whether or not the hon. Lady has an account with it, makes her a user of that service, and in that capacity she would be able to make a content report under clause 17(2) even if she were not an affected person. I hope that clears up the question in a definitive manner.

The hon. Lady asked in her second speech about the accessibility of the complaints procedure for children. That is strictly a matter for clause 18, which is the next clause, but I will quickly answer her question. Clause 18 contains provisions that explicitly require the complaints process to be accessible. Subsection (2)(c) states that the complaints procedure has to be

“easy to access, easy to use (including by children) and transparent”,

so the statutory obligation that she requested is there in clause 18.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Can the Minister explain the logic in having that phrasing for the complaints procedure but not for the content-reporting procedure? Surely it would also make sense for the content reporting procedure to use the phrasing

“easy to access, easy to use (including by children) and transparent.”

Chris Philp Portrait Chris Philp
- Hansard - -

There is in clause 17(2)

“a duty to operate a service that allows users and affected persons to easily report content which they consider to be content of a…kind specified below”,

which, of course, includes services likely to be accessed by children, under subsection (4). The words “easily report” are present in clause 17(2).

I will move on to the question of children reporting more generally, which the shadow Minister raised as well. Clearly, a parent or anyone with responsibility for a child has the ability to make a report, but it is also worth mentioning the power in clauses 140 to 142 to make super-complaints, which the NSPCC strongly welcomed in its evidence. An organisation that represents a particular group—an obvious example is the NSPCC representing children, but it would apply to loads of other groups—has the ability to make super-complaints to Ofcom on behalf of those users, if it feels they are not being well treated by a platform. A combination of the parent or carer being able to make individual complaints, and the super-complaint facility, means that the points raised by Members are catered for. I commend the clause to the Committee.

Question put and agreed to.

Clause 17 accordingly ordered to stand part of the Bill.

Clause 18

Duties about complaints procedures

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Amendment 78, in clause 28, page 28, line 28, leave out “affected” and replace with “any other”

This amendment allows those who do not fit the definition of “affected person” to make a complaint about search content which they consider to be illegal.

Amendment 79, in clause 28, page 28, line 30, leave out “affected” and replace with “any other”

This amendment allows those who do not fit the definition of “affected person” to make a complaint about search content which they consider not to comply with sections 24, 27 or 29.

Clause 28 stand part.

New clause 1—Report on redress for individual complaints

“(1) The Secretary of State must publish a report assessing options for dealing with appeals about complaints made under—

(a) section 18; and

(b) section 28.

(2) The report must—

(a) provide a general update on the fulfilment of duties about complaints procedures which apply in relation to all regulated user-to-user services and regulated search services;

(b) assess which body should be responsible for a system to deal with appeals in cases where a complainant considers that a complaint has not been satisfactorily dealt with; and

(c) provide options for how the system should be funded, including consideration of whether an annual surcharge could be imposed on user-to-user services and search services.

(3) The report must be laid before Parliament within six months of the commencement of this Act.”

--- Later in debate ---
Shaun Bailey Portrait Shaun Bailey (West Bromwich West) (Con)
- Hansard - - - Excerpts

It is a pleasure to see you in the Chair, Ms Rees, and to make my first contribution in Committee—it will be a brief one. It is great to follow the hon. Member for Aberdeen North, and I listened intently to my right hon. Friend the Member for Basingstoke, from whom I have learned so much having sat with her in numerous Committees over the past two years.

I will speak to clause 18 stand part, in particular on the requirements of the technical specifications that the companies will need to use to ensure that they fulfil the duties under the clause. The point, which has been articulated well by numerous Members, is that we can place such a duty on service providers, but we must also ensure that the technical specifications in their systems allow them to follow through and deliver on it.

I sat in horror during the previous sitting as I listened to the hon. Member for Pontypridd talking about the horrendous abuse that she has to experience on Twitter. What that goes to show is that, if the intention of this clause and the Bill is to be fulfilled, we must ensure that the companies have the technical specifications in their systems on the ground to deliver the requirements of the Bill. That might mean that the secondary legislation is slightly more prescriptive about what those systems look like.

It is all well and good us passing primary legislation in this place to try to control matters, but my fear is that if those companies do not have systems such that they can follow through, there is a real risk that what we want will not materialise. As we proceed through the Bill, there will be mechanisms to ensure that that risk is mitigated, but the point that I am trying to make to my hon. Friend the Minister is that we should ensure that we are on top of this, and that companies have the technical specifications in their complaints procedures to meet the requirements under clause 18.

We must ensure that we do not allow the excuse, “Oh, well, we’re a bit behind the times on this.” I know that later clauses seek to deal with that, but it is important that we do not simply fall back on excuses. We must embed a culture that allows the provisions of the clause to be realised. I appeal to the Minister to ensure that we deal with that, and that we embed a culture of getting ahead of complaints procedures, so that these companies have the technical capabilities on the ground to deal with these things swiftly and in the right way. Ultimately, as my right hon. Friend the Member for Basingstoke said, it is all well and good us making these laws, but it is vital that we ensure that they can be applied.

Chris Philp Portrait Chris Philp
- Hansard - -

Let me address some of the issues raised in the debate. First, everyone in the House recognises the enormous problem at the moment with large social media firms receiving reports about harmful and even illegal content that they just flagrantly ignore. The purpose of the clause, and indeed of the whole Bill and its enforcement architecture, is to ensure that those large social media firms no longer ignore illegal and harmful content when they are notified about it. We agree unanimously on the importance of doing that.

The requirement for those firms to take the proper steps is set out in clause 18(2)(b), at the very top of page 18—it is rather depressing that we are on only the 18th of a couple of hundred pages. That paragraph creates a statutory duty for a social media platform to take “appropriate action”—those are the key words. If the platform is notified of a piece of illegal content, content that is harmful to children, or content that it should take down under its own terms and conditions because it is harmful to adults, then it must take that action. If it fails to do so, Ofcom will have the enforcement powers available to it to compel compliance—ultimately escalating to a fine of up to 10% of global revenue or even service disconnection.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Will the Minister give way?

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - -

Let me develop the point before I give way. Our first line of defence is Ofcom enforcing the clause, but we have a couple of layers of additional defence. One of those is the super-complaints mechanism, which I have mentioned before. If a particular group of people, represented by a body such as the NSPCC, feel that their legitimate complaints are being systemically ignored by the social media platform, and that Ofcom is failing to take the appropriate action, they can raise that as a super-complaint to ensure that the matter is dealt with.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Will the Minister give way?

Chris Philp Portrait Chris Philp
- Hansard - -

I should give way to the hon. Member for Aberdeen North first, and then I will come to the shadow Minister.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I wanted to ask specifically about the resourcing of Ofcom, given the abilities that it will have under this clause. Will Ofcom have enough resource to be able to be that secondary line of defence?

Chris Philp Portrait Chris Philp
- Hansard - -

A later clause gives Ofcom the ability to levy the fees and charges it sees as necessary and appropriate to ensure that it can deliver the duties. Ofcom will have the power to set those fees at a level to enable it to do its job properly, as Parliament would wish it to do.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

This is the point about individual redress again: by talking about super-complaints, the Minister seems to be agreeing that it is not there. As I said earlier, for super-complaints to be made to Ofcom, the issue has to be of particular importance or to impact a particularly large number of users, but that does not help the individual. We know how much individuals are damaged; there must be a system of external redress. The point about internal complaints systems is that we know that they are not very good, and we require a big culture change to change them, but unless there is some mechanism thereafter, I cannot see how we are giving the individual any redress—it is certainly not through the super-complaints procedure.

Chris Philp Portrait Chris Philp
- Hansard - -

As I said explicitly a few moments ago, the hon. Lady is right to point out the fact that the super-complaints process is to address systemic issues. She is right to say that, and I think I made it clear a moment or two ago.

Whether there should be an external ombudsman to enforce individual complaints, rather than just Ofcom enforcing against systemic complaints, is a question worth addressing. In some parts of our economy, we have ombudsmen who deal with individual complaints, financial services being an obvious example. The Committee has asked the question, why no ombudsman here? The answer, in essence, is a matter of scale and of how we can best fix the issue. The volume of individual complaints generated about social media platforms is just vast. Facebook in the UK alone has tens of millions of users—I might get this number wrong, but I think it is 30 million or 40 million users.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

Will the Minister give way?

--- Later in debate ---
Chris Philp

I will in a moment. The volume of complaints that gets generated is vast. The way that we will fix this is not by having an external policeman to enforce on individual complaints, but by ensuring that the systems and processes are set up correctly to deal with problems at this large scale. [Interruption.] The shadow Minister, the hon. Member for Pontypridd, laughs, but it is a question of practicality. The way we will make the internet safe is to make sure that the systems and processes are in place and effective. Ofcom will ensure that that happens. That will protect everyone, not just those who raise individual complaints with an ombudsman.

Several hon. Members rose—

Chris Philp

I can see that there is substantial demand to comment, so I shall start by giving way to my right hon. Friend the Member for Basingstoke.

Dame Maria Miller

The Minister is doing an excellent job explaining the complex nature of the Bill. Ultimately, however, as he and I know, it is not a good argument to say that this is such an enormous problem that we cannot have a process in place to deal with it. If my hon. Friend looks back at his comments, he will see that that is exactly the point he was making. Although it is possibly not necessary with this clause, I think he needs to give some assurances that later in the Bill he will look at hypothecating some of the money to be generated from fines to address the issues of individual constituents, who on a daily basis are suffering at the hands of the social media companies. I apologise for the length of my intervention.

Chris Philp

It is categorically not the Government’s position that this problem is too big to fix. In fact, the whole purpose of this piece of groundbreaking and world-leading legislation is to fix a problem of such magnitude. The point my right hon. Friend was making about the hypothecation of fines to support user advocacy is a somewhat different one, which we will come to in due course, but there is nothing in the Bill to prevent individual groups from assisting individuals with making specific complaints to individual companies, as they are now entitled to do in law under clauses 17 and 18.

The point about an ombudsman is a slightly different one—if an individual complaint is made to a company and the individual complainant is dissatisfied with the outcome of their individual, particular and personal complaint, what should happen? In the case of financial services, if, for example, someone has been mis-sold a mortgage and they have suffered a huge loss, they can go to an ombudsman who will bindingly adjudicate that individual, single, personal case. The point that I am making is that having hundreds of thousands or potentially millions of cases being bindingly adjudicated on a case-by-case basis is not the right way to tackle a problem of this scale. The right way to tackle the problem is to force the social media companies, by law, to systemically deal with all of the problem, not just individual problems that may end up on an ombudsman’s desk.

That is the power in the Bill. It deals at a systems and processes level, it deals on an industry-wide level, and it gives Ofcom incredibly strong enforcement powers to make sure this actually happens. The hon. Member for Pontypridd has repeatedly called for a systems and processes approach. This is the embodiment of such an approach and the only way to fix a problem of such magnitude.

Kim Leadbeater

I associate myself with the comments of the right hon. Member for Basingstoke. Surely, if we are saying that this is such a huge problem, that is an argument for greater stringency and having an ombudsman. We cannot say that this is just about systems. Of course it is about systems, but online harms—we have heard some powerful examples of this—are about individuals, and we have to provide redress and support for the damage that online harms do to them. We have to look at systemic issues, as the Minister is rightly doing, but we also have to look at individual cases. The idea of an ombudsman and greater support for charities and those who can support victims of online crime, as mentioned by the hon. Member for Aberdeen North, is really important.

Chris Philp

I thank the hon. Lady for her thoughtful intervention. There are two separate questions here. One is about user advocacy groups helping individuals to make complaints to the companies. That is a fair point, and no doubt we will debate it later. The ombudsman question is different; it is about whether to have a right of appeal against decisions by social media companies. Our answer is that, rather than having a third-party body—an ombudsman—effectively acting as a court of appeal against individual decisions by the social media firms, because of the scale of the matter, the solution is to compel the firms, using the force of law, to get this right on a systemic and comprehensive basis.

I give way first to the hon. Member for Aberdeen North—I think she was first on her feet—and then I will come to the hon. Member for Pontypridd.

Kirsty Blackman

Does the Minister not think this is going to work? He is creating this systems and processes approach, which he suggests will reduce the thousands of complaints—complaints will be made and complaints procedures will be followed. Surely, if it is going to work, in 10 years’ time we are going to need an ombudsman to adjudicate on the individual complaints that go wrong. If this works in the way he suggests, we will not have tens of millions of complaints, as we do now, but an ombudsman would provide individual redress. I get what he is arguing, but I do not know why he is not arguing for both things, because having both would provide the very best level of support.

Chris Philp

I will address the review clause now, since it is relevant. If, in due course, as I hope and expect, the Bill has the desired effect, perhaps that would be the moment to consider the case for an ombudsman. The critical step is to take a systemic approach, which the Bill is doing. That engages the question of new clause 1, which would create a mechanism, probably for the reason the hon. Lady just set out, to review how things are going and to see if, in due course, there is a case for an ombudsman, once we see how the Bill unfolds in practice.

Jane Stevenson

Will the Minister give way?

Chris Philp

Let me finish the point. It is not a bad idea to review it and see how it is working in practice. Clause 149 already requires a review to take place between two and four years after Royal Assent. For the reasons that have been set out, it is pretty clear from this debate that we would expect the review to include precisely that question. If we had an ombudsman on day one, before the systems and processes had had a chance to have their effect, I fear that the ombudsman would be overwhelmed with millions of individual issues. The solution lies in fixing the problem systemically.

Several hon. Members rose—

Chris Philp

I think the shadow Minister wanted to intervene, unless I have answered her point already.

Alex Davies-Jones

I wanted to reiterate the point that the hon. Member for Aberdeen North made, which the Minister has not answered. If he has such faith that the systems and processes will be changed and controlled by Ofcom as a result of the Bill, why is he so reluctant to put in an ombudsman? It will not be overwhelmed with complaints if the systems and processes work, and therefore protect victims. We have already waited far too long for the Bill, and now he says that we need to wait two to four years for a review, and even longer to implement an ombudsman to protect victims. Why will he not just put this in the Bill now to keep them safe?

Chris Philp

Because we need to give the new systems and processes time to take effect. If the hon. Lady felt so strongly that an ombudsman was required, she was entirely at liberty to table an amendment to introduce one, but she has not done so.

Jane Stevenson

I wonder whether Members would be reassured if companies were required to have a mechanism by which users could register their dissatisfaction, to enable an ombudsman, or perhaps Ofcom, to gauge the volume of dissatisfaction and bring some kind of group claim against the company. Is that a possibility?

Chris Philp

Yes. My hon. Friend hits the nail on the head. If there is a systemic problem and a platform fails to act appropriately not just in one case, but in a number of them, we have, as she has just described, the super-complaints process in clauses 140 to 142. Even under the Bill as drafted, without any changes, if a platform turns out to be systemically ignoring reasonable complaints made by the public and particular groups of users, the super-complainants will be able to do exactly as she describes. There is a mechanism to catch this—it operates not at individual level, but at the level of groups of users, via the super-complaint mechanism—so I honestly feel that the issue has been addressed.

When the numbers are so large, I think that the super-complaint mechanism is the right way to push Ofcom if it does not notice. Obviously, the first line of defence is that companies comply with the Bill. The second line of defence is that if they fail to do so, Ofcom will jump on them. The third line of defence is that if Ofcom somehow does not notice, a super-complaint group—such as the NSPCC, acting for children—will make a super-complaint to Ofcom. We have three lines of defence, and I submit to the Committee that they are entirely appropriate.

Barbara Keeley

Will the Minister give way?

Chris Philp

I was about to sit down, but of course I will give way.

Barbara Keeley

The Minister said that the Opposition had not tabled an amendment to bring in an ombudsman.

Chris Philp

On this clause.

Barbara Keeley

On this clause. What we have done, however—we are debating it now—is to table a new clause to require a report on redress for individual complaints. The Minister talks about clause 149 and a process that will kick in between two and five years away, but we have a horrendous problem at the moment. I and various others have described the situation as the wild west, and very many people—thousands, if not millions, of individuals—are being failed very badly. I do not see why he is resisting our proposal for a report within six months of the commencement of the Act, which would enable us to start to see at that stage, not two to five years down the road, how these systems—he is putting a lot of faith in them—were turning out. I think that is a very sound idea, and it would help us to move forward.

Chris Philp

The third line of defence—the super-complaint process—is available immediately, as I set out a moment ago. In relation to new clause 1, which the hon. Lady mentioned a moment ago, I think six months is very soon for a Bill of this magnitude. The two-to-five-year timetable under the existing review mechanism in clause 149 is appropriate.

Although we are not debating clause 149, I hope, Ms Rees, that you will forgive me for speaking about it for a moment. If Members turn to pages 125 and 126 and look at the matters covered by the review, they will see that they are extraordinarily comprehensive. In effect, the review covers the implementation of all aspects of the Bill, including the need to minimise the harms to individuals and the enforcement and information-gathering powers. It covers everything that Committee members would want to be reviewed. No doubt as we go through the Bill we will have, as we often do in Bill Committee proceedings, a number of occasions on which somebody tables an amendment to require a review of x, y or z. This is the second such occasion so far, I think, and there may be others. It is much better to have a comprehensive review, as the Bill does via the provisions in clause 149.

Question put and agreed to.

Clause 18 accordingly ordered to stand part of the Bill.

Clause 19

Duties about freedom of expression and privacy

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to discuss clause 29 stand part.

Chris Philp

Clause 19, on user-to-user services, and its associated clause 29, which relates to search services, specify a number of duties in relation to freedom of expression and privacy. In carrying out their safety duties, in-scope companies will be required by clause 19(2) to have regard to the importance of protecting users’ freedom of expression and privacy.

Let me pause for a moment on this issue. There has been some external commentary about the Bill’s impact on freedom of expression. We have already seen, via our discussion of a previous clause, that there is nothing in the Bill that compels the censorship of speech that is legal and not harmful to children. I put on the record again the fact that nothing in the Bill requires the censorship of legal speech that poses no harm to children.

We are going even further than that. As far as I am aware, for the first time ever there will be a duty on social media companies, via clause 19(2), to have regard to freedom of speech; no such legal duty on platforms exists today. It is critical that not only Committee members but others more widely who consider the Bill bear that carefully in mind. Besides that, the clause speaks to the right to privacy. Existing laws already speak to that, but the clause puts it in this Bill as well. Both duties are extremely important.

In addition, category 1 service providers—the really big ones—will need proactively to assess the impact of their policies on freedom of expression and privacy. I hope all Committee members will strongly welcome the important provisions I have outlined.

Barbara Keeley

As the Minister says, clauses 19 and 29 are designed to provide a set of balancing provisions that will require companies to have regard to freedom of expression and privacy when they implement their safety duties. However, it is important that companies cannot use privacy and free expression as a basis to argue that they can comply with regulation in less substantive ways. That is a fear here.

Category 1 providers will need to undertake an impact assessment to determine the impact of their product and safety decisions on freedom of expression, but it is unclear whether that applies only in respect of content that is harmful to adults. Unlike with the risk assessments for the illegal content and child safety duties set out in part 3, chapter 2, these clauses do not set the expectation that the assessments must be of a suitable and sufficient quality. It is also not clear what powers Ofcom has at its disposal to challenge any assessments that it considers insufficient, or that strike an inappropriate or unreasonable balance between fundamental rights. I would appreciate it if the Minister could touch on that when he responds.

The assumption underlying these clauses is that privacy and free expression may need to act as a constraint on safety measures, but I believe that that is seen quite broadly as simplistic and potentially problematic. To give one example, a company could argue that end-to-end encryption is so important for free expression and privacy that it justifies any adverse impact on users’ safety. The subjects of child abuse images, which could more easily be shared because of such a decision, would see their safety and privacy rights weakened. Such an argument fails to take account of the broader nuance of the issues at stake. Impacts on privacy and freedom of expression should therefore be considered across a range of groups rather than assuming an overarching right that applies equally to all users.

Similarly, it will be important that Ofcom understands and delivers its functions in relation to these clauses in a way that reflects the complexity and nuance of the interplay of fundamental rights. It is important to recognise that positive and negative implications for privacy and freedom of expression may be associated with any compliance decision. I think the Minister implied that freedom of speech was a constant positive, but it can also have negative connotations.

Kirsty Blackman

I am pleased that the clause is in the Bill, and I think it is a good one to include. Can the Minister reaffirm what he said on Tuesday about child sexual abuse, and the fact that the right to privacy does not trump the ability—particularly with artificial intelligence—to search for child sexual abuse images?

Chris Philp

I confirm what the hon. Lady has just said. In response to the hon. Member for Worsley and Eccles South, it is important to say that the duty in clause 19 is “to have regard”, which simply means that a balancing exercise must be performed. It is not determinative; it is not as if the rights in the clause trump everything else. They simply have to be taken into account when making decisions.

To repeat what we discussed on Tuesday, I can explicitly and absolutely confirm to the hon. Member for Aberdeen North that in my view and the Government’s, concerns about freedom of expression or privacy should not trump platforms’ ability to scan for child sexual exploitation and abuse images or protect children. It is our view that there is nothing more important than protecting children from exploitation and sexual abuse.

We may discuss this further when we come to clause 103, which develops the theme a little. It is also worth saying that Ofcom will be able to look at the risk assessments and, if it feels that they are not of an adequate standard, take that up with the companies concerned. We should recognise that the duty to have regard to freedom of expression is not something that currently exists. It is a significant step forward, in my view, and I commend clauses 19 and 29 to the Committee.

The Chair

With your indulgence, Minister, Nick Fletcher would like to speak.

--- Later in debate ---
Chris Philp

As I have said, at the moment there is nothing at all. Platforms such as Facebook can and do arbitrarily censor content with little if any regard for freedom of speech. Some platforms have effectively cancelled Donald Trump while allowing the Russian state to propagate shocking disinformation about the Russian invasion of Ukraine, so there is real inconsistency and a lack of respect for freedom of speech. This at least establishes something where currently there is nothing. We can debate whether “have regard to” is strong enough. We have heard the other point of view from the other side of the House, which expressed concern that it might be used to allow otherwise harmful content, so there are clearly arguments on both sides of the debate. The obligation to have regard does have some weight, because the issue cannot be completely ignored. I do not think it would be adequate to simply pay lip service to it and not give it any real regard, so I would not dismiss the legislation as drafted.

I would point to the clauses that we have recently discussed, such as clause 15, under which content of democratic importance—which includes debating current issues and not just stuff said by an MP or candidate—gets additional protection. Some of the content that my hon. Friend the Member for Don Valley referred to a second ago would probably also get protection under clause 14, under which content of democratic importance has to be taken into account when making decisions about taking down or removing particular accounts. I hope that provides some reassurance that this is a significant step forward compared with where the internet is today.

Alex Davies-Jones

I share the Minister’s sentiments about the Bill protecting free speech; we all want to protect that. He mentions some of the clauses we debated on Tuesday regarding democratic importance. Some would say that debating this Bill is of democratic importance. Since we started debating the Bill on Tuesday, and since I have mentioned some of the concerns raised by stakeholders and others about the journalistic exemption and, for example, Tommy Robinson, my Twitter mentions have been a complete sewer—as everyone can imagine. One tweet I received in the last two minutes states:

“I saw your vicious comments on Tommy Robinson…The only reason you want to suppress him is to bury the Pakistani Muslim rape epidemic”

in this country. Does the Minister agree that that is content of democratic importance, given we are debating this Bill, and that it should remain on Twitter?

Chris Philp

That sounds like a very offensive tweet. Could the hon. Lady read it again? I did not quite catch it.

Alex Davies-Jones

Yes:

“I saw your vicious comments on Tommy Robinson…The only reason you want to suppress him is to bury the Pakistani Muslim rape epidemic”

in this country. It goes on:

“this is a toxic combination of bloc vote grubbing and woke”

culture, and there is a lovely GIF to go with it.

Chris Philp

I do not want to give an off-the-cuff assessment of an individual piece of content—not least because I am not a lawyer. It does not sound like it meets the threshold of illegality. It most certainly is offensive, and that sort of matter is one that Ofcom will set out in its codes of practice, but there is obviously a balance between freedom of speech and content that is harmful, which the codes of practice will delve into. I would be interested if the hon. Lady could report that to Twitter and then report back to the Committee on what action it takes.

Alex Davies-Jones

Yes, I will do that right now and see what happens.

Chris Philp

At the moment, there is no legal obligation to do anything about it, which is precisely why this Bill is needed, but let us put it to the test.

Question put and agreed to.

Clause 19 accordingly ordered to stand part of the Bill.

Clause 20

Record-keeping and review duties

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to discuss clause 30 stand part.

--- Later in debate ---
Chris Philp

The shadow Minister has eloquently introduced the purpose and effect of the clause, so I shall not repeat what she has said. On her point about publication, I repeat the point that I made on Tuesday, which is that the transparency requirements—they are requirements, not options—set out in clause 64 oblige Ofcom to ensure the publication of appropriate information publicly in exactly the way she requests.

Question put and agreed to.

Clause 20 accordingly ordered to stand part of the Bill.

Clauses 21 to 24 ordered to stand part of the Bill.

Clause 25

Children’s risk assessment duties

Amendment proposed: 16, in clause 25, page 25, line 10, at end insert—

“(3A) A duty for the children’s risk assessment to be approved by either—

(a) the board of the entity; or, if the organisation does not have a board structure,

(b) a named individual who the provider considers to be a senior manager of the entity, who may reasonably be expected to be in a position to ensure compliance with the children’s risk assessment duties, and reports directly into the most senior employee of the entity.” —(Alex Davies-Jones.)

This amendment seeks to ensure that regulated companies’ boards or senior staff have responsibility for children’s risk assessments.

--- Later in debate ---
Kirsty Blackman

I absolutely agree. In fact, I have tabled an amendment to widen category 1 to include sites with the highest risk of harm. The Minister has not said that he agrees with my amendment specifically, but he seems fairly amenable to increasing and widening some duties to include the sites of highest risk. I have also tabled another new clause on similar issues.

I am glad that these clauses are in the Bill—a specific duty in relation to children is important and should happen—but as the shadow Minister said, clause 31(3) is causing difficulty. It is causing difficulty for me and for organisations such as the NSPCC, which is unsure how the provisions will operate and whether they will do so in the way that the Government would like.

I hope the Minister will answer some of our questions when he responds. If he is not willing to accept the amendment, will he give consideration to how the subsection could be amended in the future—we have more stages, including Report and scrutiny in the other place—to ensure that there is clarity and that the intention behind the provision is followed through, rather than being an intention that is not actually translated into law?

Chris Philp

Colleagues have spoken eloquently to the purpose and effect of the various clauses and schedule 3—the stand part component of this group. On schedule 3, the shadow Minister, the hon. Member for Worsley and Eccles South, asked about timing. The Government share her desire to get this done as quickly as possible. In its evidence a couple of weeks ago, Ofcom said it would be publishing its road map before the summer, which would set out the timetable for moving all this forward. We agree that that is extremely important.

I turn to one or two questions that arose on amendment 22. As always, the hon. Member for Aberdeen North asked a number of very good questions. The first was whether the concept of a “significant number” applies to a number in absolute terms or to a percentage of the people using a particular service when assessing what is significant. The answer is that it can be either: a large number in absolute terms, by reference to the population of the whole United Kingdom, or a percentage of those using the service. That is expressed in clause 31(4)(a), and Members will note the “or” there. I hope that answers the hon. Member’s very good question.

Kirsty Blackman

My concern is about services that meet neither of those criteria—they do not meet the “significant number” criterion in percentage terms because, say, only 0.05% of their users are children, and they do not meet it in population terms because they are a pretty small platform with only, say, 1,000 child users—but whose child users are at very high risk because of the nature of the platform or the service provided. My concern is for those at highest risk, where neither criterion is met and the service does not have to bother conducting any sort of age verification or access requirements.

Chris Philp

I am concerned to ensure that children are appropriately protected, as the hon. Lady sets out. Let me make a couple of points in that area before I address that point.

The hon. Lady asked another question earlier, about video content. She gave the example of TikTok videos being viewed or accessed not directly on TikTok but via some third-party means, such as a WhatsApp message. First, it is worth emphasising again that in order to count as a user, a person does not have to be registered and can simply be viewing the content. Secondly, if someone is viewing something through another service, such as WhatsApp—the hon. Lady used the example of browsing the internet on another site—the duty will bite at the level of WhatsApp, and it will have to consider the content that it is providing access to. As I said, someone does not have to be registered with a service in order to count as a user of that service.

On amendment 22, there is a drafting deficiency, if I may put it politely—this is a point of drafting rather than of principle. The amendment would simply delete subsection (3), but there would still be references to the “child user condition”—for example, the one that appears on the same page of the Bill at line 11. If the amendment were adopted as drafted, it would end up leaving references to “child user condition” in the Bill without defining what it meant, because we would have deleted the definition.

Barbara Keeley

Is the Minister coming on to say that he is accepting what we are saying here?

Chris Philp

No, is the short answer. I was just mentioning in passing that there is that drafting issue.

On the principle, it is worth being very clear that, when it comes to content or matters that are illegal, that applies to all platforms, regardless of size, where children are at all at risk. In schedule 6, we set out a number of matters—child sexual exploitation and abuse, for example—as priority offences that all platforms have to protect children from proactively, regardless of scale.

--- Later in debate ---
Barbara Keeley

The Minister has not addressed the points I raised. I specifically raised—he has not touched on this—harmful pro-anorexia blogs, which we know are dangerous but are not in scope, and games that children access that increase gambling addiction. He says that there is separate legislation for gambling addiction, but families have lost thousands of pounds through children playing games linked to gambling addiction. There are a number of other services that do not affect an appreciable number of children, and the drafting causes them to be out of scope.

Chris Philp

rose—[Interruption.]

The Chair

There is no hard and fast rule about moving the Adjournment motion. It is up to the Government Whip.

Chris Philp

I have a few more things to say, but I am happy to finish here if it is convenient.

Ordered, That the debate be now adjourned.—(Steve Double.)

Online Safety Bill (Eighth sitting)

Chris Philp Excerpts
Committee stage
Thursday 9th June 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 9 June 2022
The Chair

I remind the Committee that with this we are discussing the following:

Clause stand part.

Clause 32 stand part.

That schedule 3 be the Third schedule to the Bill.

Clause 33 stand part.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

When the sitting was suspended for lunch, I was concluding my remarks and saying that where children are the victims of illegal activity or illegal content, all of that is covered in other aspects of the Bill. For areas such as gambling, we have separate legislation that protects children. In relation to potentially harmful content, the reason there is a “significant number” test for the child user condition that we are debating is that, without it, platforms that either would not have any children accessing them or had nothing of any concern on them—such as a website about corporation tax—would have an unduly burdensome and disproportionate obligation placed on them. That is why there is the test—just to ensure that there is a degree of proportionality in these duties. We find similar qualifications in other legislation, including the way the age-appropriate design code works. Therefore, I respectfully resist the amendment.

Question put, That the amendment be made.

--- Later in debate ---
Kirsty Blackman

I thank my hon. Friend for his public service announcement. His constituent is incredibly lucky that my hon. Friend managed to act in that way and get the money back to her, because there are so many stories of people not managing to get their money back and losing their entire life savings as a result of scams. It is the case that not all those scams take place online—people can find scams in many other places—but we have the opportunity with the Bill to take action on scams that are found on the internet.

The other group I want to mention, and for whom highlighting advertising could make a positive difference, is people with learning disabilities. People with learning disabilities who use the internet may not understand the difference between adverts and search results, as the hon. Member for Worsley and Eccles South mentioned. They are a group who I would suggest are particularly susceptible to fraudulent advertising.

We are speaking a lot about search engines, but a lot of fraudulent advertising takes place on Facebook and so on. Compared with the majority of internet users, there is generally an older population on such sites, and the ability to tackle fraudulent advertising there is incredibly useful. We know that the sites can do it, because there are rules in place now around political advertising on Facebook, for example. We know that it is possible for them to take action; it is just that they have not yet taken proper action.

I am happy to support the amendments, but I am also glad that the Minister has put these measures in the Bill, because they will make a difference to so many of our constituents.

Chris Philp

I thank the hon. Member for Aberdeen North for her latter remarks. We made an important addition to the Bill after listening to parliamentarians across the House and to the Joint Committee, which many people served on with distinction. I am delighted that we have been able to make that significant move. We have heard a lot about how fraudulent advertising can affect people terribly, particularly more vulnerable people, so that is an important addition.

Amendments 23 and 24 seek to make it clear that where the target is in the UK, people are covered. I am happy to assure the Committee that that is already the case, because the definitions at the beginning of the Bill—going back to clause 3(5)(b), on page 3—make it clear that companies are in scope, both user-to-user and search, if there is a significant number of UK users or where UK users form one of the target markets, or the only target market. Given the reference to “target markets” in the definitions, I hope that the shadow Minister will withdraw the amendment, because the matter is already covered in the Bill.

New clause 5 raises important points about the regulation of online advertising, but that is outside the purview of what the Bill is trying to achieve. The Government will work through the online advertising programme to tackle these sorts of issues. The shadow Minister is right to raise them, but they will be tackled holistically by that programme, and of course there are already codes of practice that apply and are overseen by the Advertising Standards Authority. Although these matters are very important and I agree with the points that she makes, there are other places where they are best addressed.

New clause 6 is about the verification process. Given that the Bill is primary legislation, we want to have the core duty to prevent fraudulent advertising in the Bill. How that is implemented in this area, as in many others, is best left to Ofcom and its codes of practice. When Ofcom publishes the codes of practice, it might consider such a duty, but we would rather leave Ofcom, as the expert regulator, with the flexibility to implement that via the codes of practice and leave the hard-edged duty in the Bill as drafted.

Barbara Keeley

We are going to press amendments 23 and 24 to a vote because they are very important. I cited the example of earlier legislation that considered it important, in relation to selling tickets, to include the wording “anywhere in the world”. We know that ticket abuses happen with organisations in different parts of the world.

Chris Philp

The hon. Lady is perfectly entitled to press to a vote whatever amendments she sees fit, but in relation to amendments 24 and 25, the words she asks for,

“where the UK is a target market”,

are already in the Bill, in clause 3(5)(b), on page 3, which sets out the definitions at the start. I will allow the hon. Lady a moment to look at where it states:

“United Kingdom users form one of the target markets for the service”.

That applies to user-to-user and to search, so it is covered already.

Barbara Keeley

The problem is that we are getting into the wording of the Bill. As with the child abuse clause that we discussed before lunch, there are limitations. Clause 3 states that a service has links with the United Kingdom if

“the service has a significant number of United Kingdom users”.

It does not matter if a person is one of 50, 100 or 1,000 people who get scammed by some organisation operating in another part of the world. The 2006 Act dealing with the sale of Olympic tickets recognised that this was important, and we also believe it is important. We have to find a way of dealing with ticket touting and ticket abuse.

Turning to fraudulent advertising, I have given examples and been supported very well by the hon. Member for Aberdeen North. It is not right that vulnerable people are repeatedly taken in by search results, which is the case right now. The reason we have tabled all these amendments is that we are trying to protect vulnerable people, as with every other part of the Bill.

Chris Philp

That is of course our objective as well, but let me just return to the question of the definitions. The hon. Lady is right that clause 3(5)(a) says

“a significant number of United Kingdom users”,

but paragraph (b) just says,

“United Kingdom users form one of the target markets”.

There is no significant number qualification in paragraph (b), and to put it beyond doubt, clause 166(1) makes it clear that service providers based outside the United Kingdom are within the scope of the Bill. To reiterate the point: where the UK is a target market, there is no size qualification, so the service provider is in scope even if it has only one user.

Barbara Keeley

Does the Minister want to say anything about the other points I made about advertisements?

Chris Philp

Not beyond the points I made previously, no.

Question put, That the amendment be made.

--- Later in debate ---
The Bill offers a chance to establish an important principle. People should be able to have confidence that the links they click on are for reputable regulated advice services. People should not have to be constantly on their guard against scams and other misleading promotions found on social media websites and in top-of-the-page search results. Without this amendment and the others to this chapter, we cannot be sure that those outcomes will be achieved.
Chris Philp

As we have heard already, these clauses are very important because they protect people from online fraudulent advertisements for the first time—something that the whole House quite rightly called for. As the shadow Minister said, the Government heard Parliament’s views on Second Reading, including the recognition that the duties in clause 35 were not as strongly worded as those in clause 34, and tabled Government amendments 91 to 94, which make the duties on search firms in clause 35 as strong as those on user-to-user firms in clause 34. Opposition amendment 45 would do exactly the same thing, so I hope we can adopt Government amendments 91 to 94 without needing to move amendment 45. We are in happy agreement on that point.

I listened carefully to what the shadow Minister said on amendment 44. The example she gave at the end of her speech—the poor lady who was induced into sending money, which she thought was being sent to pay off creditors but was, in fact, stolen—would, of course, be covered by the Bill as drafted, because it would count as an act of fraud.

The hon. Lady also talked about some other areas that were not fraud, such as unfair practices, misleading statements or statements that were confusing, which are clearly different from fraud. The purpose of clause 35 is to tackle fraud. Those other matters are, as she says, covered by the Consumer Protection from Unfair Trading Regulations 2008, which are overseen and administered by the Competition and Markets Authority. While matters to do with unfair, misleading or confusing content are serious—I do not seek to minimise their importance—they are overseen by a different regulator and, therefore, better handled by the CMA under its existing regulations.

If we introduced this extra offence to the list in clause 36, we would end up with a degree of regulatory overlap and confusion, because there would be two regulators involved. For that reason, and because those other matters—unfair, misleading and confusing advertisements—are different to fraud, I ask that the Opposition withdraw amendment 44 and, perhaps, take it up on another occasion when the CMA’s activities are in the scope of the debate.

Barbara Keeley

No, we want to press this amendment to a vote. I have had further comment from the organisations that I quoted. They believe that we do need the amendment because it is important to stop harmful ads going up in the first place. They believe that strengthened provisions are needed for that. Guidance just puts the onus for protecting consumers on the other regulatory regimes that the Minister talked about. The view of organisations such as StepChange is that those regimes—the Advertising Standards Authority regime—are not particularly strong.

The regulatory framework for financial promotions is fragmented. FCA-regulated firms are clearly under much stronger obligations than those that fall outside FCA regulation. I believe that it would be better to accept the amendment, which would oblige search engines and social media giants to prevent harmful and deceptive ads from appearing in the first place. The Minister really needs to take on board the fact that in this patchwork, this fragmented world of different regulatory systems, some of the existing systems are clearly failing badly, and the strong view of expert organisations is that the amendment is necessary.

Question put and agreed to.

Clause 34 accordingly ordered to stand part of the Bill.

Clause 35

Duties about fraudulent advertising: Category 2A services

Amendments made: 91, in clause 35, page 34, line 3, leave out from “to” to end of line 5 and insert—

“(a) prevent individuals from encountering content consisting of fraudulent advertisements in or via search results of the service;

(b) if any such content may be encountered in or via search results of the service, minimise the length of time that that is the case;

(c) where the provider is alerted by a person to the fact that such content may be so encountered, or becomes aware of that fact in any other way, swiftly ensure that individuals are no longer able to encounter such content in or via search results of the service.”

This amendment alters the duty imposed on providers of Category 2A services relating to content consisting of fraudulent advertisements so that it is in line with the corresponding duty imposed on providers of Category 1 services by clause 34(1).

Amendment 92, in clause 35, page 34, line 16, leave out “reference” and insert “references”.

This amendment is consequential on Amendment 91.

Amendment 93, in clause 35, page 34, line 18, leave out “is a reference” and insert “are references”.

This amendment is consequential on Amendment 91.

Amendment 94, in clause 35, page 34, line 22, leave out

“does not include a reference”

and insert “do not include references”.—(Chris Philp.)

This amendment is consequential on Amendment 91.

Clause 35, as amended, ordered to stand part of the Bill.

Clause 36

Fraud etc offences

Amendment proposed: 44, in clause 36, page 35, line 10, at end insert—

“(4A) An offence under Part 3 of the Consumer Protection from Unfair Trading Regulations 2008.”—(Barbara Keeley.)

This amendment adds further offences to those which apply for the purposes of the Bill’s fraudulent advertising provisions.

Question put, That the amendment be made.

--- Later in debate ---
Alex Davies-Jones

I thank the hon. Lady for her contribution. Like me, she is a passionate campaigner for animal welfare. It was a pleasure to serve on the Committee that considered her Glue Traps (Offences) Act 2022, which I know the whole House was pleased to pass. She raises a very important point, and one that the Bill explores later with regard to other types of content, such as antisemitic and racist content used in the context of education, history and fact. The Bill deals with that specifically later, and this content would be dealt with in the same way. We are talking about content used as an educational and awareness-raising tool, as distinct from mere images and videos of direct abuse.

To give hon. Members a real sense of the extent of the issue, I would like to share some findings from a recent survey of the RSPCA’s frontline officers. These are pretty shocking statistics, as I am sure Members will all agree. Eighty-one per cent. of RSPCA frontline officers think that more abuse is being caught on camera. Nearly half think that more cases are appearing on social media. One in five officers said that one of the main causes of cruelty to animals is people hurting animals just to make themselves more popular on social media. Some of the recent cruelty videos posted on social media include a video of a magpie being thrown across the road on Instagram in June 2021; a woman captured kicking her dog on TikTok in March 2021; a teenager being filmed kicking a dog, which was shared on WhatsApp in May 2021; and videos posted on Instagram of cockerels being forced to fight in March 2021.

I am sure that colleagues will be aware of the most recent high-profile case, which was when disturbing footage was posted online of footballer Kurt Zouma attacking his cat. There was, quite rightly, an outpouring of public anger and demands for justice. Footage uploaded to Snapchat on 6 February showed Zouma kicking his Bengal cat across a kitchen floor in front of his seven-year-old son. Zouma also threw a pair of shoes at his pet cat and slapped its head. In another video, he was heard saying:

“I swear I’ll kill it.”

In sentencing him following his guilty plea to two offences under the Animal Welfare Act 2006, district judge Susan Holdham described the incident as “disgraceful and reprehensible”. She added:

“You must be aware that others look up to you and many young people aspire to emulate you.”

What makes that case even more sad is the way in which the video was filmed and shared, making light of such cruelty. I am pleased that the case has now resulted in tougher penalties for filming animal abuse and posting it on social media, thanks to new guidelines from the Sentencing Council. The prosecutor in the Zouma case, Hazel Stevens, told the court:

“Since this footage was put in the public domain there has been a spate of people hitting cats and posting it on various social media sites.”

There have been many other such instances. Just a few months ago, the most abhorrent trend was occurring on TikTok: people were abusing cats, dogs and other animals to music and encouraging others to do the same. Police officers discovered a shocking 182 videos with graphic animal cruelty on mobile phones seized during an investigation. This sickening phenomenon is on the rise on social media platforms, provoking a glamorisation of the behaviour. The videos uncovered during the investigation showed dogs prompted to attack other animals such as cats, or used to hunt badgers, deer, rabbits and birds. Lancashire police began the investigation after someone witnessed two teenagers encouraging a dog to attack a cat on an estate in Burnley in March of last year. The cat, a pet named Gatsby, was rushed to the vet by its owners once they discovered what was going on, but unfortunately it was too late and Gatsby’s injuries were fatal. The photos and videos found on the boys’ phones led the police to discover more teenagers in the area who were involved in such cruel activities. The views and interactions that the graphic footage was attracting made it even more visible, as the platform was increasing traffic and boosting content when it received attention.

It should not have taken such a high-profile case of a professional footballer with a viral video to get this action taken. There are countless similar instances occurring day in, day out, and yet the platforms and authorities are not taking the necessary action to protect animals and people from harm, or to protect the young people who seek to emulate this behaviour.

I pay tribute to the hard work of campaigning groups such as the RSPCA, Action for Primates, Asia for Animals Coalition and many more, because they are the ones who have fought to keep animal rights at the forefront. The amendment seeks to ensure that such groups are given a voice at the table when Ofcom consults on its all-important codes of practice. That would be a small step towards reducing animal abuse content online, and I hope the Minister can see the merits in joining the cause.

I turn to amendment 60, which would bring offences to which animals are subject within the definition of illegal content, a point raised by the hon. Member for Ochil and South Perthshire. The Minister will recall the Animal Welfare (Sentencing) Act 2021, which received Royal Assent last year. Labour was pleased to see the Government finally taking action against those who commit animal cruelty offences offline. The maximum prison sentence for animal cruelty was increased from six months to five years, and the Government billed that move as taking a firmer approach to cases such as dog fighting, abuse of puppies and kittens, illegally cropping a dog’s ears and gross neglect of farm animals. Why, then, have the Government failed to include offences against animals within the scope of illegal content online? We want parity between the online and offline space, and that seems a stark omission from the Bill.

Placing obligations on service providers to remove animal cruelty content should fall within both the spirit and the scope of the Bill. We all know that the scope of the Bill is to place duties on service providers to remove illegal and harmful content, placing particular emphasis on the exposure of children. Animal cruelty content is a depiction of illegality and also causes significant harm to children and adults.

If my inbox is anything to go by, all of us here today know what so many of our constituents up and down the country feel about animal abuse. It is one of the topics that constituents contact me about most often. Today, the Minister has a choice to make about his Government’s commitment to preventing animal cruelty and keeping us all safe online. I hope he will see the merit in acknowledging the seriousness of animal abuse online.

Amendment 66 would ensure that groups were able to make complaints about animal abuse videos. Labour welcomes clause 140, as the ability to make super-complaints is a vital part of our democracy. However, as my hon. Friend the Member for Worsley and Eccles South and other Members have mentioned, the current definition of an “eligible entity” is far too loose. I have set out the reasons as to why the Government must go further to limit and prevent animal abuse content online. Amendment 66 would ensure that dangerous animal abuse content is a reasonable cause for a super-complaint to be pursued.

Chris Philp

The shadow Minister raises important issues to do with animal cruelty. The whole House and our constituents feel extremely strongly about this issue, as we know. She set out some very powerful examples of how this terrible form of abuse takes place.

To some extent, the offences are in the Bill’s scope already. It covers, for example, extreme pornography. Given that the content described by the hon. Lady would inflict psychological harm on children, it is, to that extent, in scope.

The hon. Lady mentioned the Government’s wider activities to prevent animal cruelty. That work goes back a long time and includes the last Labour Government’s Animal Welfare Act 2006. She mentioned the more recent update to the criminal sentencing laws that increased by a factor of 10 the maximum sentence for cruelty to animals. It used to be six months and has now been increased to up to five years in prison.

In addition, just last year the Department for Environment, Food and Rural Affairs announced an action plan for animal welfare, which outlines a whole suite of activities that the Government are taking to protect animals in a number of different areas—sentience, international trade, farming, pets and wild animals. That action plan will be delivered through a broad programme of legislative and non-legislative work.

--- Later in debate ---
Kirsty Blackman

I agree 100%. The case that the shadow Minister, the hon. Member for Pontypridd, made and the stories she highlighted about the shame that is felt show that we are not just talking about a one-off impact on people’s lives, but potentially years of going through those awful situations and then many years to recover, if they ever do, from the situations they have been through.

I do not think there is too much that we could do, too many codes of practice we could require or too many compliance measures we could have in place. I also agree that girls are the most vulnerable group when considering this issue, and we need to ensure that this Bill is as fit for purpose as it can be and meets the Government’s aim of trying to make the internet a safe place for children and young people. Because of the additional risks that there are for girls in particular, we need additional protections in place for girls. That is why a number of us in this room are making that case.

Chris Philp

This has been an important debate. I think there is unanimity on the objectives we are seeking to achieve, particularly protecting children from the risk of child sexual exploitation and abuse. As we have discussed two or three times already, we cannot allow end-to-end encryption to frustrate or prevent the protection of children.

I will talk about two or three of the issues that have arisen in the course of the debate. The first is new clause 20, a proposal requiring Ofcom to put together a report. I do not think that is strictly necessary, because the Bill already imposes a requirement to identify, assess and mitigate CSEA. There is no optionality here and no need to think about it; there is already a duty to prevent CSEA content, and Ofcom has to produce codes of practice explaining how it will do that. I think what is requested in new clause 20 is required already.

The hon. Member for Pontypridd mentioned the concern that Ofcom would first of all have to prove that the CSEA risk existed. I think that might be a hangover from the previous draft of the Bill, where there was a requirement for the evidence to be “persistent and prevalent”—I think that was the phrase—which implied that Ofcom had to prove the risk existed before it could take action against it. For exactly the reason she mentioned—that it imposed a requirement to prove CSEA was present—we have changed the wording in the new version. Clause 103(1), at the top of page 87, now states “necessary and proportionate” instead of “persistent and prevalent”. If Ofcom simply considers an action necessary, without needing to prove that the content is persistent and prevalent, it can take the steps set out in that clause. The change she seeks has therefore already been made.

--- Later in debate ---
Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

I think my hon. Friend’s list goes on to page 37, which means there would be a number of different relevant duties that would presumably then be subject to the ability to issue codes of practice. However, the point I was making in my earlier contribution is that this list does not include the issue of violence against women and girls. In looking at this exhaustive list that my hon. Friend has included in the Bill, I must ask whether he might inadvertently be excluding the opportunity for Ofcom to produce a code of practice on the issue of violence against women and girls. Having heard his earlier comments, I felt that he was slightly sympathetic to that idea.

Chris Philp Portrait Chris Philp
- Hansard - -

Clearly, and as Members have pointed out, women and girls suffer disproportionately from abuse online; unfortunately, tragically and disgracefully, they are disproportionately its victims. The duties in the Bill apply to everybody—men and women—but women will disproportionately benefit, precisely because they are disproportionately the victims.

Where there are things that are particular to women—kinds of abuse that women suffer that men do not, or that girls suffer that boys do not—we would expect the codes of practice to address them, because the Bill states that the codes must keep children safe, in clause 37(10)(b), and adults safe, in clause 37(10)(c). Women are adults, and we would expect the particular issues that my right hon. Friend mentioned to be picked up by those measures.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

My hon. Friend is giving me a chink of light there, in that subsection (10)(c) could actively mean that a code of practice that specifically dealt with violence against women and girls would be admissible as a result of that particular point. I had not really thought of it in that way—am I thinking about it correctly?

Chris Philp Portrait Chris Philp
- Hansard - -

My right hon. Friend makes an interesting point. To avoid answering a complicated question off the cuff, perhaps I should write to her. However, I certainly see no prohibition in these words in the clause that would prevent Ofcom from writing a particular code of practice. I would interpret these words in that way, but I should probably come back to her in writing, just in case I am making a mistake.

As I say, I interpret those words as giving Ofcom the latitude, if it chose to do so, to have codes of practice that were specific. I would not see this clause as prescriptive, in the sense that if Ofcom wanted to produce a number of codes of practice under the heading of “adults”, it could do so. In fact, if we track back to clause 37(3), that says:

“OFCOM must prepare and issue one or more codes of practice”.

That would appear to admit the possibility that multiple codes of practice could be produced under each of the sub-headings, including in this case for adults and in the previous case for children. [Interruption.] I have also received some indication from officials that I was right in my assessment, so hopefully that is the confirmation that my right hon. Friend was looking for.

Question put and agreed to.

Clause 37 accordingly ordered to stand part of the Bill.

Clause 38 ordered to stand part of the Bill.

Schedule 4

Codes of practice under section 37: principles, objectives, content

Amendment proposed: 63, in schedule 4, page 176, line 29, at end insert “and

(x) there are adequate safeguards to monitor cruelty towards humans and animals;”.—(Alex Davies-Jones.)

This amendment would make ensuring adequate safeguards to monitor cruelty towards humans and animals one of the online safety objectives for user-to-user services.

Question put, That the amendment be made.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

This is a mammoth part of the Bill, and I rise to speak to clause 39. Under the clause, Ofcom will submit a draft code of practice to the Secretary of State and, provided that the Secretary of State does not intend to issue a direction to Ofcom under clause 40, the Secretary of State would lay the draft code before Parliament. Labour’s main concern about the procedure for issuing codes of practice is that, without a deadline, they may not come into force for quite some time, and the online space needs addressing now. We have already waited far too long for the Government to bring forward the Bill. Parliamentary oversight is also fundamentally important, and the codes will have huge implications for the steps that service providers take, so it is vital that they are given due consideration at the earliest opportunity.

Amendment 48 would require Ofcom to prepare draft codes of practice within six months of the passing of the Act. This simple amendment would require Ofcom to bring forward these important codes of practice within an established time period—six months—after the Bill receives Royal Assent. Labour recognises the challenges ahead for Ofcom in both capacity and funding.

On this note, I must raise with the Minister something that I have raised previously. I find it most curious that his Department recently sought to hire an online safety regulator funding policy adviser. The job advert listed some of the key responsibilities:

“The post holder will support ministers during passage of the Online Safety Bill; secure the necessary funding for Ofcom and DCMS in order to set up the Online Safety regulator; and help implement and deliver a funding regime which is first of its kind in the UK.”

That raises worrying questions about how prepared Ofcom is for the huge task ahead. That being said, the Government have drafted the Bill in a way that brings codes of practice to its heart, so they cannot and should not be susceptible to delay.

Chris Philp Portrait Chris Philp
- Hansard - -

The hon. Lady is very kind in giving way—I was twitching to stand up. On the preparedness of Ofcom and its resources, Ofcom was given about £88 million in last year’s spending review to cover this and the next financial year—2022-23 and 2023-24—so that it could get ready. Thereafter, Ofcom will fund itself by raising fees, and I believe that the policy adviser will most likely support the work on those future fees. That does not imply that there will be any delay, because the funding for this year and next year has already been provided by the Government.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I appreciate that intervention, but the Minister must be aware that if Ofcom has to fundraise itself, that raises questions about its future capability as a regulator and its funding and resource requirements. What will happen if it does not raise those funds?

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - -

The hon. Lady’s use of the word “fundraise” implies that Ofcom will be going around with a collection tin on a voluntary basis.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

It is your word.

Chris Philp Portrait Chris Philp
- Hansard - -

I will find the relevant clause in a moment. The Bill gives Ofcom the legal power to make the regulated companies pay fees to finance Ofcom’s regulatory work. It is not voluntary; it is compulsory.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I am grateful to the Minister for that clarification. Perhaps he should make that more obvious in the job requirements and responsibilities.

Chris Philp Portrait Chris Philp
- Hansard - -

The fees requirements are in clauses 70 to 76, in particular clause 71, “Duty to pay fees”. The regulated companies have to pay the fees to Ofcom. It is not optional.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I am grateful to the Minister for that clarification.

The Government have drafted the Bill in a way that puts codes of practice at its heart, so they cannot and should not be susceptible to delay. We have heard from platforms and services that stress that the ambiguity of the requirements is causing concern. At least with a deadline for draft codes of practice, those that want to do the right thing will be able to get on with it in a timely manner.

The Age Verification Providers Association provided us with evidence in support of amendment 48 in advance of today’s sitting. The association agrees that early publication of the codes will set the pace for implementation, encouraging both the Secretary of State and Parliament to approve the codes swiftly. A case study it shared highlights delays in the system, which we fear will be replicated within the online space, too. Let me indulge Members with details of exactly how slow Ofcom’s recent record has been on delivering similar guidance required under the audio-visual media services directive.

The directive became UK law on 30 September 2020 and came into force on 1 November 2020. By 24 June 2021, Ofcom had issued a note as to which video sharing platforms were in scope. It took almost a year until, on 6 October 2021, Ofcom issued formal guidance on the measures.

In December 2021, Ofcom wrote to the video-sharing platforms and

“signalled the beginning of a new phase of supervisory engagement”.

However, in March 2022 it announced that

“the information we collect will inform our Autumn 2022 VSP report, which intends to increase the public’s awareness of the measures platforms have in place to protect users from harm.”

There is still no indication that Ofcom intends to take enforcement action against the many VSPs that remain non-compliant with the directive. It is simply not good enough. I urge the Minister to carefully consider the aims of amendment 48 and to support it.

Labour supports the principles of clause 42. Ofcom must not drag out the process of publishing or amending the codes of practice. Labour also supports a level of transparency around the withdrawal of codes of practice, should that arise.

Labour also supports clause 43 and the principles of ensuring that Ofcom has a requirement to review its codes of practice. We do, however, have concerns over the Secretary of State’s powers in subsection (6). It is absolutely right that the Secretary of State of the day has the ability to make representations to Ofcom in order to prevent the disclosure of certain matters in the interests of national security, public safety or relations with the Government of a country outside the UK. However, I am keen to hear the Minister’s assurances about how well the Bill is drafted to prevent those powers from being used, shall we say, inappropriately. I hope he can address those concerns.

On clause 44, Ofcom should of course be able to propose minor amendments to its codes of practice. Labour does, however, have concerns about the assessment that Ofcom will have to make to ensure that the minor nature of changes will not require amendments to be laid before Parliament, as in subsection (1). As I have said previously, scrutiny must be at the heart of the Bill, so I am interested to hear from the Minister how exactly he will ensure that Ofcom is making appropriate decisions about what sorts of changes are allowed to circumvent parliamentary scrutiny. We cannot and must not get to a place where the Secretary of State, in agreeing to proposed amendments, actively prevents scrutiny from taking place. I am keen to hear assurances on that point from the Minister.

On clause 45, as I mentioned previously on amendment 65 to clause 37, as it stands, service providers would be treated as complying with their duties if they had followed the recommended measures set out in the relevant codes of practice, as set out in subsection (1). However, providers could take alternative measures to comply, as outlined in subsection (5). Labour supports the clause in principle, but we are concerned that the definition of alternative measures is too broad. I would be grateful if the Minister could elaborate on his assessment of the instances in which a service provider may seek to comply via alternative measures. Surely the codes of practice should be, for want of a better phrase, best practice. None of us want to get into a position where service providers are circumventing their duties by taking the alternative measures route.

Again, Labour supports clause 46 in principle, but we feel that the provisions in subsection (1) could go further. We know that, historically, service providers have not always been transparent and forthcoming when compelled to be so by the courts. While we understand the reasoning behind subsection (3), we have broader concerns that service providers could, in theory, point to their adherence to the codes of practice as evidence of best practice. I would be grateful if the Minister could address our concerns.

We support clause 47, which establishes that the duties in respect of which Ofcom must issue a code of practice under clause 37 will apply only once the first code of practice for that duty has come into force. However, we are concerned that this could mean that different duties will apply at different times, depending on when the relevant code for a particular duty comes into force. Will the Minister explain his assessment of how that will work in practice? We have concerns that drip-feeding this information to service providers will cause further delay and confusion. In addition, will the Minister confirm how Ofcom will prioritise its codes of practice?

Lastly, we know that violence against women and girls does not receive a single mention in the Bill, which is an alarming and stark omission. Women and girls are disproportionately likely to be affected by online abuse and harassment. The Minister knows this—we all know this—and a number of us have spoken up on the issue on quite a few occasions. He also knows that online violence against women and girls is defined as including, but not limited to, intimate image abuse, online harassment, the sending of unsolicited explicit images, coercive sexting and the creation and sharing of deepfake pornography.

The Minister will also know that Carnegie UK is working with the End Violence Against Women coalition to draw up what a code of practice to tackle violence against women and girls could look like. Why has that been left out of the redraft of the Bill? What consideration has the Minister given to including a code of this nature in the Bill? If the Minister is truly committed to tackling violence against women and girls, why will he not put that on the face of the Bill?

--- Later in debate ---
None Portrait The Chair
- Hansard -

We are not debating clause 40, Dame Maria, but we will come to it eventually.

Chris Philp Portrait Chris Philp
- Hansard - -

I will do my best to make sure that we come to it very quickly indeed, by being concise in my replies on this group of amendments.

On amendment 48, which seeks to get Ofcom to produce its codes of practice within six months, obviously we are unanimous in wanting that to be done as quickly as possible. However, Ofcom has to go through a number of steps in order to produce those codes of practice. For example, first we have to designate in secondary legislation the priority categories of content that is harmful to children and content that is harmful to adults, and then Ofcom has to go through a consultation exercise before it publishes the codes. It has in the past indicated that it expects that to be a 12-month, rather than a six-month, process. I am concerned that a hard, six-month deadline may be either impossible to meet or make Ofcom rush and do it in a bad way. I accept the need to get this done quickly, for all the obvious reasons, but we also want to make sure that it is done right. For those reasons, a hard, six-month deadline would not help us very much.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Why does the Minister believe that six months is unachievable? Does he think that Ofcom is not adequately resourced to meet that deadline and make this happen as soon as possible?

Chris Philp Portrait Chris Philp
- Hansard - -

There are a number of steps to go through. Regardless of how well resourced Ofcom is and how fast it works, first, we have to designate the priority categories by secondary legislation, and there is a lead time for that. Secondly, Ofcom has to consult. Best practice suggests that consultations need to last for a certain period, because the consultation needs to be written, then it needs to open, and then the responses need to be analysed. Then, Ofcom obviously has to write the codes of practice. It might be counterproductive to set a deadline that tight.

There are quite a few different codes of practice to publish, and the hon. Lady asked about that. The ones listed in clause 47 will not all come out at the same time; they will be staggered and prioritised. Obviously, the ones that are most germane to safety, such as those on illegal content and children’s safety, will be done first. We would expect them to be done as a matter of extreme urgency.

I hope I have partly answered some of the questions that the hon. Member for Aberdeen North asked. The document to be published before the summer, which she asked about, is a road map. I understand it to be a sort of timetable that will set out the plan for doing everything we have just been debating—when the consultations will happen and when the codes of practice will be published. I guess we will get the road map in the next few weeks, if “before the summer” means before the summer recess. We will have all that set out for us, and then the formal process follows Royal Assent. I hope that answers the hon. Lady’s question.

There were one or two other questions from the hon. Member for Pontypridd. She asked whether a Secretary of State might misuse the power in clause 43(2)—a shocking suggestion, obviously. The power is only to request a review; it is nothing more sinister or onerous than that.

On clause 44, the hon. Lady asked what would happen if Ofcom and the Secretary of State between them conspired to allow through a change, claiming it was minor when in fact it was not. It requires Ofcom to propose the change and the Secretary of State to agree it, so I hope the fact that the Secretary of State cannot act alone gives her some assurance. She asked what the redress is if both the Secretary of State and Ofcom misbehave, as it were. The redress is the same as with any mis-exercise of a public power—namely, judicial review, which, as a former Home Office Minister, I have experienced extremely frequently—so there is legal redress.

The hon. Lady then asked about the alternative measures. What if a service provider, rather than meeting its duties via the codes of practice, does one of the alternative measures instead? Is it somehow wriggling out of what it is supposed to do? The thing that is legally binding, which it must do and about which there is no choice because there is a legal duty, is the duties that we have been debating over the past few days. Those are the binding requirements that cannot be circumvented. The codes of practice propose a way of meeting those duties. If the service provider can meet the duties in a different way and can satisfy Ofcom that it has met them as effectively as it would under the codes of practice, it is open to the provider to do that. We do not want to be unduly prescriptive. The test is: have the duties been delivered? That is non-negotiable and legally binding.

I hope I have answered all the questions, while gently resisting amendment 48 and encouraging the Committee to agree that the various other clauses stand part of the Bill.

Question put, That the amendment be made.

The Committee divided.

Online Safety Bill (Ninth sitting)

Chris Philp Excerpts
Committee stage
Tuesday 14th June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 14 June 2022 - (14 Jun 2022)
Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - - - Excerpts

Good morning, Ms Rees; it is, as always, a pleasure to serve under your chairship.

Amendment 84 would remove the Secretary of State’s ability to modify Ofcom codes of practice

“for reasons of public policy”.

Labour agrees with the Carnegie UK Trust assessment of this: the codes are the fulcrum of the regulatory regime, and the power to direct amounts to a significant interference in Ofcom’s independence. Ofcom itself has noted that the “reasons of public policy” power to direct might weaken the regime. If Ofcom has undertaken a logical process, rooted in evidence, to arrive at a draft code, it is hard to see how a direction based on “reasons of public policy” is not irrational. That creates a vulnerability to legal challenge.

On clause 40 more widely, the Secretary of State should not be able to give Ofcom specific direction on non-strategic matters. Ofcom’s independence in day-to-day decision making is paramount to preserving freedom of expression. Independence of media regulators is the norm in developed democracies. The UK has signed up to many international statements in that vein, including as recently as April 2022 at the Council of Europe. That statement says that

“media and communication governance should be independent and impartial to avoid undue influence on policy making, discriminatory treatment and preferential treatment of powerful groups, including those with significant political or economic power.”

The Bill introduces powers for the Secretary of State to direct Ofcom on internet safety codes. These provisions should immediately be removed. After all, in broadcasting regulation, Ofcom is trusted to make powerful programme codes with no interference from the Secretary of State. Labour further notes that although the draft Bill permitted this

“to ensure that the code of practice reflects government policy”,

clause 40 now specifies that any code may be required to be modified

“for reasons of public policy”.

Although that is more normal language, it is not clear what in practice the difference in meaning is between the two sets of wording. I would be grateful if the Minister could confirm what that is.

The same clause gives the Secretary of State powers to direct Ofcom, on national security or public safety grounds, in the case of terrorism or CSEA—child sexual exploitation and abuse—codes of practice. The Secretary of State might have some special knowledge of those, but the Government have not demonstrated why they need a power to direct. In the broadcasting regime, there are no equivalent powers, and the Secretary of State was able to resolve the case of Russia Today, on national security grounds, with public correspondence between the Secretary of State and Ofcom.

Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - -

Good morning, Ms Rees; it is a pleasure to serve under your chairmanship again. The SNP spokesman and the shadow Minister have already explained what these provisions do, which is to provide a power for the Secretary of State to direct Ofcom to modify a code of practice. I think it is important to make clear that the measures raised by the two Opposition parties are, as they said, envisaged to be used only in exceptional circumstances. Of course the Government accept that Ofcom, in common with other regulators, is rightly independent and that there should be no interference in its day-to-day regulatory decisions. This clause does not seek to violate that principle.

However, we also recognise that although Ofcom has great expertise as a regulator, there may be situations in which a topic outside its area of expertise needs to be reflected in a code of practice, and in those situations it may be appropriate for a direction to be given to modify the code. A recent and very real example would be a direction to reflect the latest medical advice during a public health emergency. We saw in the last couple of years, during covid, some quite dangerous medical disinformation being spread—concerning, for example, the safety of vaccines or the “prudence” of ingesting bleach as a remedy for covid. There was also the purported and entirely false connection between 5G phone masts and covid. Those were issues on public policy grounds—in this case, medical grounds—on which it might have been appropriate to ensure that a code of practice was modified.

Dean Russell Portrait Dean Russell (Watford) (Con)
- Hansard - - - Excerpts

It was mentioned earlier that some of us were on previous Committees that made recommendations more broadly that would perhaps be in line with the amendment. Since that time, there has been lots of discussion of this topic, and I have raised it with the Minister and colleagues. I feel reassured that there is a great need to keep the clause as it is, because exceptional circumstances do arise. However, I would like reassurance that directions would be made only in exceptional circumstances and would not override Ofcom’s policy or remit, as has just been discussed.

Chris Philp Portrait Chris Philp
- Hansard - -

I can provide my hon. Friend with that reassurance on the exceptional circumstances point. The Joint Committee report was delivered in December, approximately six months ago. It was a very long report—I think it had more than 100 recommendations. Of course, members of the Committee are perfectly entitled, in relation to one or two of those recommendations, to have further discussions, listen further and adjust their views if they individually see fit.

Chris Philp Portrait Chris Philp
- Hansard - -

Let me just finish this point and then I will give way. The SNP spokesman, the hon. Member for Ochil and South Perthshire, asked about the Government listening and responding. We accepted 66 of the Joint Committee’s recommendations—a Committee that he served on. We made very important changes as a result, on commercial pornography, for example, and on fraudulent advertising. It is therefore fair to say we have listened a lot during the passage of this Bill. As for the amendments moved in Committee, we have often agreed with their intent, but the Bill already dealt with the matter. I wanted to respond to those two points before giving way.

John Nicolson Portrait John Nicolson
- Hansard - - - Excerpts

I am intrigued, as I am sure viewers will be. What is the new information that has come forward since December that has resulted in the Minister believing that he must stick with this? He has cited new information and new evidence, and I am dying to know what it is.

Chris Philp Portrait Chris Philp
- Hansard - -

I am afraid it was not me that cited new information. It was my hon. Friend the Member for Watford who said he had had further discussions with Ministers. I am delighted to hear that he found those discussions enlightening, as I am sure they—I want to say they always are, but let us say they often are.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - - - Excerpts

Before my hon. Friend moves on, can I ask a point of clarification? The hon. Member for Ochil and South Perthshire is right that this is an important point, so we need to understand it thoroughly. I think he makes a compelling argument about the exceptional circumstances. If Ofcom did not agree that a change that was being requested was in line with what my hon. Friend the Minister has said, how would it be able to discuss or, indeed, challenge that?

Chris Philp Portrait Chris Philp
- Hansard - -

My right hon. Friend raises a good question. In fact, I was about to come on to the safeguards that exist to address some of the concerns that have been raised this morning. Let me jump to the fourth of the safeguards, which in many ways is the most powerful and directly addresses my right hon. Friend’s question.

In fact, a change has been made. The hon. Member for Ochil and South Perthshire asked what changes had been made, and one important change—perhaps the change that my hon. Friend the Member for Watford found convincing—was the insertion of a requirement for the codes, following a direction, to go before Parliament and be voted on under the affirmative procedure. The previous draft of the Bill, which the Joint Committee commented on, did not contain that requirement; we inserted it to provide extra protections.

I hope my right hon. Friend the Member for Basingstoke will agree that if Ofcom had a concern and made it publicly known, Parliament would be aware of that concern before voting on the revised code under the affirmative procedure. The change to the affirmative procedure gives Parliament extra control: it gives parliamentarians the opportunity to respond if they have concerns, if third parties raise concerns, or if Ofcom itself raises concerns.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - - - Excerpts

Before the Minister moves off the point about exceptional circumstances, it was the case previously that an amendment of the law resolution was always considered with Finance Bills. In recent years, that has stopped on the basis of it being exceptional circumstances because a general election was coming up. Then the Government changed that, and now they never table an amendment of the law resolution because they have decided that that is a minor change. Something has gone from being exceptional to being minor, in the view of this Government.

The Minister said that he envisions that this measure will be used only in exceptional circumstances. Can he commit himself to it being used only in exceptional circumstances? Can he give the commitment that he expects that it will be used only in exceptional circumstances, rather than simply envisioning that it will be used in such circumstances?

Chris Philp Portrait Chris Philp
- Hansard - -

I have made clear how we expect the clause to be used. I am slightly hesitant to be more categorical simply because I do not want to make comments that might unduly bind a future Secretary of State—or, indeed, a future Parliament, because the measure is subject to the affirmative procedure—even were that Secretary of State, heaven forbid, to come from a party other than mine. Circumstances might arise, such as the pandemic, in which a power such as this needs to be exercised for good public policy reasons—in that example, public health. I would not want to be too categorical, which the hon. Lady is inviting me to be, lest I inadvertently circumscribe the ability of a future Parliament or a future Secretary of State to act.

The power is also limited in the sense that, for matters not to do with national security, terrorism or CSEA, the power to direct can be exercised only at the point at which the code is submitted to be laid before Parliament. It cannot be exercised at a time of the Secretary of State’s choosing; there is one moment, and one moment only, when that power can be exercised.

I also want to make it clear that the power will not allow the Secretary of State to direct Ofcom to require a particular regulated service to take a particular measure. The power relates to the codes of practice; it does not give the power to intrude any further, beyond the code of practice, in the arena of regulated activity.

I understand the points that have been made. We have listened to the Joint Committee, and we have made an important change, which is that to the affirmative procedure. I hope my explanation leaves the Committee feeling that, following that change, this is a reasonable place for clauses 40 and 41 to rest. I respectfully resist amendment 84 and new clause 12, and urge the Committee to allow clauses 40 and 41 to stand part of the Bill.

Question put, That the amendment be made.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - -

Given that the clause is clearly uncontentious, I will be extremely brief.

I can see that that is the most popular thing I have said during the entire session—when you say, “And finally,” in a speech and the crowd cheers, you know you are in trouble.

Regulated user-to-user and search services will have duties to keep records of their risk assessments and the measures they take to comply with their safety duties, whether or not those are the ones recommended in the codes of practice. They must also undertake a children’s access assessment to determine whether children are likely to access their service.

Clause 48 places a duty on Ofcom to produce guidance to assist service providers in complying with those duties. It will help to ensure a consistent approach from service providers, which is essential in maintaining a level playing field. Ofcom will have a duty to consult the Information Commissioner prior to preparing this guidance, as set out in clause 48(2), in order to draw on the expertise of the Information Commissioner’s Office and ensure that the guidance is aligned with wider data protection and privacy regulation.

Question put and agreed to.

Clause 48 accordingly ordered to stand part of the Bill.

Clause 49

“Regulated user-generated content”, “user-generated content”, “news publisher content”

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I beg to move amendment 89, in clause 49, page 45, line 16, leave out subsection (e).

This amendment would remove the exemption for comments below news articles posted online.

--- Later in debate ---
We know—or I know, having some of my own—that children and young people cannot really be bothered to type things and much prefer to leave a voice message or something. I appreciate that voice messages do not count as live, but some conversations that happen on platforms such as Discord are live, and those are the most harmful places, where children can be encouraged to create child sexual abuse images, for example. I do not necessarily expect the Minister to have all the answers today, and I know there will be other opportunities to amend the Bill, but I would really appreciate it if he took a good look at the Bill and considered whether strengthening provisions can be put in place. If he desires to exempt one-to-one aural communications, he may still do that, while ensuring that child sexual abuse and grooming behaviour are considered illegal and within the scope of the Bill in whatever form they take place, whether in aural communications or in any other way.

Chris Philp Portrait Chris Philp
- Hansard - -

Let me start by addressing the substance of the two amendments and then I will answer one or two of the questions that arose in the course of the debate.

As Opposition Members have suggested, the amendments would bring the comments that appear below the line on news websites such as The Guardian, MailOnline or the BBC into the scope of the Bill’s safety duties. They are right to point out that there are occasions when the comments posted on those sites are extremely offensive.

There are two reasons why comments below BBC, Guardian or Mail articles are excluded from the scope of the Bill. First, the news media publishers—newspapers, broadcasters and their representative industry bodies—have made the case to the Government, by which we are persuaded, that the comments section below news articles is an integral part of the process of publishing news and of what it means to have a free press. The news publishers—both newspapers and broadcasters that have websites—have made that case, and the Government have accepted that regulating that space through legislation would represent an intrusion into the operation of the free press.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I am sorry, but I am having real trouble buying that argument. If the Minister is saying that newspaper comments sections are exempt in order to protect the free press because they are an integral part of it, why do we need the Bill in the first place? Social media platforms could argue in the same way that they are protecting free speech. They could ask, “Why should we regulate any comments on our social media platform if we are protecting free speech?” I am sorry; that argument does not wash.

Chris Philp Portrait Chris Philp
- Hansard - -

There is a difference between random individuals posting stuff on Facebook and content generated by what we have defined as a “recognised news publisher”. We will debate that in a moment. We recognise that difference in the Bill. Although the Opposition are looking to make amendments to clause 50, they appear to accept that the press deserve special protection. Article 10 case law deriving from the European convention on human rights also recognises that the press have a special status. In our political discourse we often refer generally to the importance of the freedom of the press. We recognise that the press are different, and the press have made the case—both newspapers and broadcasters, all of which now have websites—that their reader engagement is an integral part of that free speech. There is a difference between that and individuals chucking stuff on Facebook outside the context of a news article.

There is then a question about whether, despite that, those comments are still sufficiently dangerous that they merit regulation by the Bill—a point that the shadow Minister, the hon. Member for Pontypridd, raised. There is a functional difference between comments made on platforms such as Facebook, Twitter, TikTok, Snapchat or Instagram, and comments made below the line on a news website, whether it is The Guardian, the Daily Mail, the BBC—even The National. The difference is that on social media platforms, which are the principal topic of the Bill, there is an in-built concept of virality—things going viral by sharing and propagating content widely. The whole thing can spiral rapidly out of control.

Virality is an inherent design feature of social media sites; it is not an inherent design feature of the comments below articles on the BBC, Guardian or Daily Mail websites, where there is no way of generating virality as there is on Facebook and Twitter. The reach of comments, and their ability to grow exponentially, is orders of magnitude lower in a news website comment section than on Facebook. That is an important difference from a risk point of view.

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - - - Excerpts

This issue comes down to a fundamental point—are we looking at volume or risk? There is no difference between an individual—a young person in this instance—seeing something about suicide or self-harm on a Facebook post or in the comments section of a newspaper article. The volume—whether it goes viral or not—does not matter if that individual has seen that content and it has directed them to somewhere that will create serious harm and lead them towards dangerous behaviour. The volume is not the point.

Chris Philp Portrait Chris Philp
- Hansard - -

The hon. Lady raises an important philosophical question that underpins much of the Bill’s architecture. All the measures are intended to strike a balance. Where there are things that risk leading to illegal activity, or that are harmful to children, we are clamping down hard, but in other areas we are being more proportionate. For example, the legal-but-harmful-to-adults duties apply only to category 1 companies, and we are looking at whether that can be extended to other high-risk companies, as we debated earlier. In the provisions that we debated earlier about having regard to free speech, there is a balancing exercise between the safety duties and free speech. A lot of the provisions in the Bill involve balance and proportionality. In some areas, such as child sexual exploitation and abuse, there is no balance: we just want to stop that—end of story. In other areas, such as matters that are legal but harmful and touch on free speech, there is more of a balancing exercise.

In this area of news publisher content, we are again striking a balance. We are saying that the inherent harmfulness of those sites, owing to their functionality—they do not go viral in the same way—is much lower. There is also an interaction with freedom of the press, as I said earlier. Thus, we draw the balance in a slightly different way. To take the example of suicide promotion or self-harm content, there is a big difference between stumbling across something in comment No. 74 below a BBC article, versus the tragic case of Molly Russell—the 14-year-old girl whose Instagram account was actively flooded, many times a day, with awful content promoting suicide. That led her to take her own life.

I think the hon. Member for Batley and Spen would probably accept that there is a functional difference between a comment that someone has to scroll down a long way to find and probably sees only once, and being actively flooded with awful content. In having regard to those different arguments—the risk and the freedom of the press—we try to strike a balance. I accept that they are not easy balances to strike, and that there is a legitimate debate to be had on them. However, that is the reason that we have adopted this approach.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I have a question on anonymity. On social media there will be a requirement to verify users’ identities, so if somebody posts on Twitter that they want to lynch me, it is possible to find out who that is, provided they do not have an anonymous account. There is no such provision for newspaper comment sections, so I assume it would be much more difficult for the police to find whoever posted an anonymous comment below the line of a newspaper article threatening my safety—a comment that is just as harmful as one threatening my safety on social media—and more difficult for me to avoid seeing it. Can the Minister convince me otherwise?

Chris Philp Portrait Chris Philp
- Hansard - -

The hon. Lady is correct in her analysis, I can confirm. Rather similar to the previous point, because of the interaction with freedom of the press—the argument that the newspapers and broadcasters have advanced—and because this is an inherently less viral environment, we have drawn the balance where we have. She is right to highlight a reasonable risk, but we have struck the balance in the way we have for that reason.

The shadow Minister, the hon. Member for Pontypridd, asked whether very harmful or illegal interactions in the metaverse would be covered or whether they have a metaphorical “get out of jail free” card owing to the exemption in clause 49(2)(d) for “one-to-one live aural communications”. In essence, she is asking whether, in the metaverse, if two users went off somewhere and interacted only with each other, that exemption would apply and they would therefore be outwith the scope of the Bill. I am pleased to tell her they would not, because the definition of live one-to-one aural communications goes from clause 49(2)(d) to clause 49(5), which defines “live aural communications”. Clause 49(5)(c) states that the exemption applies only if it

“is not accompanied by user-generated content of any other description”.

The actions of an avatar in the metaverse constitute user-generated content of another description. Owing to that fact, the exemption in clause 49(2)(d) would not apply to the metaverse.

I am happy to provide clarification on that. It is a good question and I hope I have provided an example of how, even though the metaverse was not conceived when the Bill was conceived, it does have an effect.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

On that point, when it comes to the definition of content, we have tabled an amendment about “any other content”. I am not convinced that the definition of content adequately covers what the Minister stated, because it is limited, does not include every possible scenario in which content is user-generated, and is not sufficiently future-proofed. When we get to that point, I would appreciate it if the Minister would look at the amendment and ensure that what he intends is what happens.

Chris Philp Portrait Chris Philp
- Hansard - -

I am grateful to the hon. Lady for thinking about that so carefully. I look forward to her amendment. For my information, which clause does her amendment seek to amend?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I will let the Minister know in a moment.

Chris Philp Portrait Chris Philp
- Hansard - -

I am grateful. It is an important point.

Dean Russell Portrait Dean Russell
- Hansard - - - Excerpts

During the Joint Committee we were concerned about future-proofing. Although I appreciate that it is a House matter and so not specifically included in the Bill, I urge the setting up of a separate Online Safety Act committee that runs over time, so that the legislation can continue to be improved upon and expanded, which would add value. We do not know what the next metaverse will be in 10 years’ time. However, I feel confident that the metaverse is within scope, and I am glad that the Minister has confirmed that.

Chris Philp Portrait Chris Philp
- Hansard - -

I thank my hon. Friend for his service on the Joint Committee. I heard the representations of my right hon. Friend the Member for Basingstoke about a Joint Committee, and I have conveyed them to the higher authorities.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The amendment that the Minister is asking about is to clause 189, which states:

“‘content’ means anything communicated by means of an internet service, whether publicly or privately, including written material or messages, oral communications, photographs, videos, visual images, music and data of any description”.

It is amendment 76 that, after “including”, would insert “but not limited to”, in order that the Bill is as future-proofed as it can be.

Chris Philp Portrait Chris Philp
- Hansard - -

I thank the hon. Lady for her rapid description of that amendment. We will come to clause 189 in due course. The definition of “content” in that clause is,

“anything communicated by means of an internet service”,

which sounds like it is quite widely drafted. However, we will obviously debate this issue properly when we consider clause 189.

The remaining question—

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

I intervene rather than making a subsequent substantive contribution because I am making a very simple point. My hon. Friend the Minister is making a really compelling case about the need for freedom of speech and the need to protect it within the context of newspapers online. However, could he help those who might be listening to this debate today to understand who is responsible if illegal comments are made on newspaper websites? I know that my constituents would be concerned about that, not particularly if illegal comments were made about a Member of Parliament or somebody else in the public eye, but about another individual not in the public eye.

What redress would that individual have? Would it be to ask the newspaper to take down that comment, or would it be that they could find out the identity of the individual who made the comment, or would it be that they could take legal action? If he could provide some clarity on that, it might help Committee members to understand even further why he is taking the position that he is taking.

Chris Philp Portrait Chris Philp
- Hansard - -

I thank my right hon. Friend for that intervention. First, clearly, if something illegal is said online about someone, they would have the normal redress of going to the police, and the police could seek to exercise their powers to investigate the offence, including requesting the company that hosts the comments—in this case, a newspaper’s or broadcaster’s website—to provide any relevant information that might help to identify the person involved; they might have an account, and if they do not, they might have a log-on or IP address. So the normal criminal investigatory procedures would apply.

Secondly, if the content was defamatory—I realise that, in practice, only people like Arron Banks can afford to sue for libel—there is civil recourse. I think there are powers in the civil procedure rules that allow court orders to be made requiring organisations, such as news media websites, to disclose information that would help to identify somebody who is a respondent in a civil case.

Thirdly, there are the voluntary steps that the news publisher might take to remove content. News publishers say that they do take such steps; as we know, their implementation is patchy. Nevertheless, that voluntary route exists.

Regarding any legal obligation that may fall on the shoulders of the news publisher itself, I am not sure that I have sufficient legal expertise to comment on that. However, I hope that those first three areas of redress that I have set out give my right hon. Friend some assurance on this point.

Finally, I turn to a question asked by the hon. Member for Aberdeen North. She asked whether the exemption for “one-to-one live aural communications”, as set out in clause 49(2)(d), could inadvertently allow grooming or child sexual exploitation to occur via voice messages that accompany games, for example. The exemption is designed to cover what are essentially phone calls, such as Skype conversations—one-to-one conversations that are low-risk.

We believe that the Bill contains other duties to ensure that services are designed to reduce the risk of grooming and to address risks to children, if those risks exist, such as on gaming sites. I would be happy to come back to the hon. Lady with a better analysis and explanation of where those duties sit in the Bill, but there are very strong duties elsewhere in the Bill that impose those obligations to conduct risk assessments and to keep children safe in general. Indeed, the very strongest provisions in the Bill are around stopping child sexual exploitation and abuse, as set out in schedule 6.

There is also a power in clause 174(1) that allows us, as parliamentarians and the Government, to repeal this exemption using secondary legislation. If we found in the future that this exemption caused a problem, we could remove it by passing secondary legislation.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

That is helpful for understanding the rationale, but in the light of how people communicate online these days, although exempting telephone conversations makes sense, exempting what I am talking about does not. I would appreciate it if the Minister came back to me on that, and he does not have to give me an answer now. It would also help if he explained the difference between “aural” and “oral”, which are mentioned at different points in the Bill.

Chris Philp Portrait Chris Philp
- Hansard - -

I will certainly come back with a more complete analysis of the point about protecting children—as parents, that clearly concerns us both. The literal definitions are that “aural” means “heard” and “oral” means “spoken”. They occur in different places in the Bill.

This is a difficult issue and legitimate questions have been raised, but as I said in response to the hon. Member for Batley and Spen, in this area as in others, there are balances to strike and different considerations at play—freedom of the press on the one hand, and the level of risk on the other. I think that the clause strikes that balance in an appropriate way.

Question put, That the amendment be made.

--- Later in debate ---
There is no simple, agreed definition of what constitutes a recognised news publisher, and even those who have given evidence on behalf of the press have conceded that, but we must find a way to navigate this challenge. As drafted, the Bill does not do that. I am open to working with colleagues from all parties to tweak and improve this amendment, and to find an acceptable and agreed way to secure the balance we all wish to see. However, so far I have not seen or heard a better way to tighten the definitions in the Bill so as to achieve this balance, and I believe this amendment is an important step in the right direction.

Chris Philp Portrait Chris Philp
- Hansard - -

I thank the hon. Member for Batley and Spen for her speech. There is agreement across the House, in this Committee and in the Joint Committee that the commitment to having a free press in this country is extremely important. That is why recognised news publishers are exempted from the provisions of the Bill, as the hon. Lady said.

The clause, as drafted, has been looked at in some detail over a number of years and debated with news publishers and others. It is the best attempt that we have collectively been able to come up with so far to define a news publisher in a way that does not infringe press freedom. The Government are concerned that if the amendment were adopted, it would effectively require news publishers to register with a regulator in order to benefit from the exemption. That would constitute the imposition of a mandatory press regulator by the back door. I put on record that this Government do not support any form of mandatory or statutory press regulation, for reasons of freedom of the press. Despite what has been said in previous debates, we think that doing so would unreasonably restrict the freedom of the press in this country.

While I understand its intention, the amendment would drive news media organisations, both print and broadcast, into the arms of a regulator, because they would have to join one in order to get the exemption. We do not think it is right to create that obligation. We have reached the philosophical position that statutory or mandatory regulation of the press is incompatible with press freedom. We have been clear about that general principle and cannot accept the amendment, which would violate that principle.

In relation to hostile states, such as Russia, I do not think anyone in the UK press would have the slightest objection to us finding ways to tighten up on such matters. As I have flagged previously, thought is being given to that issue, but in terms of the freedom of the domestic press, we feel very strongly that pushing people towards a regulator is inappropriate in the context of a free press.

The characterisation of these provisions is a little unfair, because some of the requirements are not trivial. The requirement in clause 50(2)(f) is that there must be a person—I think it includes a legal person as well as a natural person—who has legal responsibility for the material published, which means that, unlike with pretty much everything else that appears on the internet, there is an identified person with legal responsibility. That is a very important requirement. Some of the other requirements, such as having a registered address and a standards code, are relatively easy to meet, but the point about legal responsibility is very important. For that reason, I respectfully resist the amendment.

Kim Leadbeater

I will not push the amendment to a vote, but it is important to continue this conversation, and I encourage the Minister to consider the matter as the Bill proceeds. I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

--- Later in debate ---
John Nicolson

In its current form, the Online Safety Bill states that platforms do not have any duties relating to content from recognised media outlets and news publishers, and the outlets’ websites are also exempt from the scope of the Bill. However, the way the Bill is drafted means that hundreds of independently regulated specialist publishers’ titles will be excluded from the protections afforded to recognised media outlets and news publishers. This will have a long-lasting and damaging effect on an indispensable element of the UK’s media ecosystem.

Specialist publishers provide unparalleled insights into areas that broader news media organisations will likely not analyse, and it would surely be foolish to dismiss and damage specialist publications in a world where disinformation is becoming ever more prevalent. The former Secretary of State, the right hon. Member for Maldon (Mr Whittingdale), also raised this issue on Second Reading, where he stated that specialist publishers

“deserve the same level of protection.”—[Official Report, 19 April 2022; Vol. 712, c. 109.]

Part of the rationale for having the news publishers exemption in the Bill is that it means that the press will not be double-regulated. Special interest material is already regulated, so it should benefit from the same exemptions.

Chris Philp

For the sake of clarity, and for the benefit of the Committee and those who are watching, could the hon. Gentleman say a bit more about what he means by specialist publications and perhaps give one or two examples to better illustrate his point?

John Nicolson

I would be delighted to do so. I am talking about specific and occasionally niche publications. Let us take an example. Gardeners’ World is not exactly a hotbed of online harm, and nor is it a purveyor of disinformation. It explains freely which weeds to pull up and which not to, without seeking to confuse people in any way. Under the Bill, however, such publications will be needlessly subjected to rules, creating a regulatory headache for the sector. This is a minor amendment that will help many businesses, and I would be interested to hear from the Minister why the Government will not listen to the industry on this issue.

Chris Philp

I thank the hon. Member for Ochil and South Perthshire for his amendment and his speech. I have a couple of points to make in reply. The first is that the exemption is about freedom of the press and freedom of speech. Clearly, that is most pertinent and relevant in the context of news, information and current affairs, which is the principal topic of the exemption. Were we to expand it to cover specialist magazines—he mentioned Gardeners’ World—I do not think that free speech would have the same currency when it comes to gardening as it would when people are discussing news, current affairs or public figures. The free speech argument that applies to newspapers, and to other people commenting on current affairs or public figures, does not apply in the same way to gardening and the like.

That brings me on to a second point. Only a few minutes ago, the hon. Member for Batley and Spen drew the Committee’s attention to the risks inherent in the clause that a bad actor could seek to exploit. It was reasonable of her to do so. Clearly, however, the more widely we draft the clause—if we include specialist publications such as Gardeners’ World, whose circulation will no doubt soar on the back of this debate—the greater the risk of bad actors exploiting the exemption.

My third point is about undue burdens being placed on publications. To the extent that such entities count as social media platforms—in-scope services—the most onerous duties under the Bill apply only to category 1 companies, or the very biggest firms such as Facebook and so on. The “legal but harmful” duties and many of the risk assessment duties would not apply to many organisations. In fact, I think I am right to say that if the only functionality on their websites is user comments, they would in any case be outside the scope of the Bill. I have to confess that I am not intimately familiar with the functionality of the Gardeners’ World website, but there is a good chance that if all it does is to provide the opportunity to post comments and similar things, it would be outside the scope of the Bill anyway, because it does not have the requisite functionality.

Although I understand the point made by the hon. Member for Ochil and South Perthshire, we will, respectfully, resist the amendment for the many reasons I have given.

The Chair

John, do you wish to press the amendment to a vote?

--- Later in debate ---
My hon. Friend the Member for Ochil and South Perthshire mentioned Gardeners’ World. There are also websites and specialist online publications such as the British Medical Journal that are subject to specific regulation that is separate from the Bill; if they have any user-to-user functionality—I do not know whether the BMJ does—they will also be subject to the requirements described in the Bill. Such publications are inoffensive and provide a huge amount of important information to people; that is not necessarily to say that they should not be regulated, but it does not seem that there is a level playing field. Particularly during the pandemic, peer-reviewed scientific journals were incredibly important in spreading public service information; nevertheless, the Bill includes them in its scope, but not news publications. I am not sure why the Minister is drawing the line where he is on this issue, so a little more clarity would be appreciated.
Chris Philp

I made general comments about clause 50 during the debate on amendment 107; I will not try the Committee’s patience by repeating them, but I believe that in them, I addressed some of the issues that the shadow Minister, the hon. Member for Pontypridd, has raised.

On the hon. Member for Aberdeen North’s question about where the Bill states that sites with limited functionality—for example, functionality limited to comments alone—are out of scope, paragraph 4(1) of schedule 1 states that

“A user-to-user service is exempt if the functionalities of the service are limited, such that users are able to communicate by means of the service only in the following ways—

(a) posting comments or reviews relating to provider content;

(b) sharing such comments or reviews on a different internet service”.

Clearly, services where a user can share freely are in scope, but if they cannot share directly—if they can only share via another service, such as Facebook—that service is out of scope. This speaks to the point that I made to the hon. Member for Batley and Spen in a previous debate about the level of virality, because the ability of content to spread, proliferate, and be forced down people’s throats is one of the main risks that we are seeking to address through the Bill. I hope that paragraph 4(1) of schedule 1 is of assistance, but I am happy to discuss the matter further if that would be helpful.

Question put and agreed to.

Clause 50 accordingly ordered to stand part of the Bill.

Clause 51

“Search content”, “search results” etc

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones

Labour does not oppose the intention of the clause. It is important to define “search content” in order to understand the responsibilities that fall within search services’ remits.

However, we have issues with the way that the Bill treats user-to-user services and search services differently when it comes to risk-assessing and addressing legal harm—an issue that we will come on to when we debate schedule 10. Although search services rightly highlight that the content returned by a search is not created or published by them, the algorithmic indexing, promotion and search prompts provided in search bars are fundamentally their responsibility. We do, however, accept that over the past 20 years, Google, for example, has developed mechanisms to provide a safer search experience for users while not curtailing access to lawful information. We also agree that search engines are critical to the proper functioning of the world wide web; they play a uniquely important role in facilitating access to the internet, and enable people to access, impart, and disseminate information.

Question put and agreed to.

Clause 51 accordingly ordered to stand part of the Bill.

Clause 52

“Illegal content” etc

--- Later in debate ---
I am concerned that we are missing an opportunity to tackle an issue that is an overwhelming problem for many women in this country, and I hope that when the Minister responds to this part of the debate, he can clearly set out the Government’s intention to tackle the issue. We all know that parliamentary time is in short supply: the Government have many Bills that they have to get through in this Session, before the next general election. I am concerned that this particular issue, which the Law Commission itself sees as so important, may not get the rapid legislation that we, as elected representatives, need to see happen. The foundation of the Bill is a duty of care, but that duty of care is only as good as the criminal law. If the criminal law is wanting when it comes to the publication online of intimate images, that is, the taking, making and sharing of intimate images without consent, this legislation will not help the many people we want it to help. Will the Minister, in responding to the debate, outline in some detail, if possible, how he will handle the issue and when he hopes to make public the Law Commission recommendations, for which many people have been waiting for many years?
Chris Philp

I thank right hon. and hon. Members who have participated in the debate on this extremely important clause. It is important because the Bill’s strongest provisions relate to illegal content, and the definition of illegal content set out in the clause is the starting point for those duties.

A number of important questions have been asked, and I would like to reply to them in turn. First, I want to speak directly about amendment 61, which was moved by the shadow Minister and which very reasonably and quite rightly asked the question about physically where in the world a criminal offence takes place. She rightly said that in the case of violence against some children, for example, that may happen somewhere else in the world but be transmitted on the internet here in the United Kingdom. On that, I can point to an existing provision in the Bill that does exactly what she wants. Clause 52(9), which appears about two thirds of the way down page 49 of the Bill, states:

“For the purposes of determining whether content amounts to an offence, no account is to be taken of whether or not anything done in relation to the content takes place in any part of the United Kingdom.”

What that is saying is that, in determining whether something amounts to an offence, it does not matter whether the act of concern takes place physically in the United Kingdom or somewhere else in the world. If it is criminal under UK law but happens on the other side of the world, it is still in scope. Clause 52(9) makes that very clear, so I think that provision is already doing what the shadow Minister’s amendment 61 seeks to do.

The shadow Minister asked a second question about the definition of illegal content, whether it involves a specific act and how it interacts with the “systems and processes” approach that the Bill takes. She is right to say that the definition of illegal content applies item by item. However, the legally binding duties in the Bill, which we have already debated in relation to previous clauses, apply to categories of content and to putting in place “proportionate systems and processes”—I think that that is the phrase used. Therefore, although the definition is particular, the duty is more general, and has to be met by putting in place systems and processes. I hope that my explanation provides clarification on that point.

The shadow Minister asked another question about precisely how the platforms are supposed to decide whether content meets the definition set out. She asked, in particular, questions about how to determine intent—the mens rea element of the offence. She mentioned that Ofcom had made some comments in that regard. Of course, the Government are discussing all this closely with Ofcom, as people would expect. I will say to the Committee that we are listening very carefully to the points that are being made. I hope that gives the shadow Minister some assurance that the Government’s ears are open on this point.

The next and final point that I would like to come to was raised by all speakers in the debate, but particularly by my right hon. Friend the Member for Basingstoke, and is about violence against women and girls—an important point that we have quite rightly debated previously and come to again now. The first general point to make is that clause 52(4)(d) makes it clear that relevant offences include offences where the intended victim is an individual, so any violence towards and abuse of women and girls is obviously included in that.

As my right hon. Friend the Member for Basingstoke and others have pointed out, women suffer disproportionate abuse and are disproportionately the victims of criminal offences online. The hon. Member for Aberdeen North pointed out how a combination of protected characteristics can make the abuse particularly impactful—for example, if someone is a woman and a member of a minority. Those are important and valid points. I can reconfirm, as I did in our previous debate, that when Ofcom drafts the codes of practice on how platforms can meet their duties, it is at liberty to include such considerations. I echo the words spoken a few minutes ago by my right hon. Friend the Member for Basingstoke: the strong expectation across the House—among all parties here—is that those issues will be addressed in the codes of practice to ensure that those particular vulnerabilities and those compounded vulnerabilities are properly looked at by social media firms in discharging those duties.

My right hon. Friend also made points about intimate image abuse when the intimate images are made without the consent of the subject—the victim, I should say. I would make two points about that. The first relates to the Bill and the second looks to the future and the work of the Law Commission. On the Bill, we will come in due course to clause 150, which relates to the new harmful communications offence, and which will criminalise a communication—the sending of a message—when there is a real and substantial risk of it causing harm to the likely audience and there is intention to cause harm. The definition of “harm” in this case is psychological harm amounting to at least serious distress.

Clearly, if somebody is sending an intimate image without the consent of the subject, it is likely that that will cause harm to the likely audience. Obviously, if someone sends a naked image of somebody without their consent, that is very likely to cause serious distress, and I can think of few reasons why somebody would do that unless it was their intention, meaning that the offence would be made out under clause 150.

My right hon. Friend has strong feelings, which I entirely understand, that to make the measure even stronger the test should not involve intent at all, but should simply be a question of consent. Was there consent or not? If there was no consent, an offence would have been committed, without needing to go on to establish intention as clause 150 provides. As my right hon. Friend has said, Law Commission proposals are being developed. My understanding is that the Ministry of Justice, which is the Department responsible for this offence, expects to receive a final report over the summer. It would then clearly be open to Parliament to legislate to put the offence into law, I hope as quickly as possible.

Once that happens, through whichever legislative vehicle, it will have two implications. First, the offence will automatically and immediately be picked up by clause 52(4)(d) and brought within the scope of the Bill because it is an offence where the intended victim is an individual. Secondly, there will be a power for the Secretary of State and for Parliament, through clause 176, I think—I am speaking from memory; yes, it is clause 176, not that I have memorised every clause in the Bill—via statutory instrument not only to bring the offence into the regular illegal safety duties, but to add it to schedule 7, which contains the priority offences.

Once that intimate image abuse offence is in law, via whichever legislative vehicle, that will have that immediate effect with respect to the Bill, and by statutory instrument it could be made a priority offence. I hope that gives my right hon. Friend a clear sense of the process by which this is moving forward.

Dame Maria Miller

I thank the Minister for such a clear explanation of his plan. Can he confirm that the Bill is a suitable legislative vehicle? I cannot see why it would not be. I welcome his agreement about the need for additional legislation over and above the communications offence. In the light of the way that nudification software and deepfakes are advancing, and the challenges that our law enforcement agencies have in interpreting those quite complex notions, a straightforward law making it clear that publishing such images is a criminal offence would not only help law enforcement agencies, but would help the perpetrators to understand that what they are doing is a crime and they should stop.

Chris Philp

As always, the right hon. Lady makes an incredibly powerful point. She asked specifically about whether the Bill is a suitable legislative vehicle in which to implement any Law Commission recommendations—we do not yet have the final version of that report—and I believe that that would be in scope. A decision about legislative vehicles depends on the final form of the Law Commission report and the Ministry of Justice response to it, and on cross-Government agreement about which vehicle to use.

I hope that addresses all the questions that have been raised by the Committee. Although the shadow Minister is right to raise the question, I respectfully ask her to withdraw amendment 61 on the basis that those matters are clearly covered in clause 52(9). I commend the clause to the Committee.

Alex Davies-Jones

I am grateful to the Minister for his comments. The Labour party has concerns that clause 52(9) does not adequately get rid of the ambiguity around potential illegal online content. We feel that amendment 61 sets that out very clearly, which is why we will press it to a vote.

Chris Philp

Just to help the Committee, what is it in clause 52(9) that is unclear or ambiguous?

Alex Davies-Jones

We just feel that amendment 61 outlines matters much more explicitly and leaves no ambiguity by clearly defining any

“offences committed overseas within the scope of relevant offences for the purposes of defining illegal content.”

Chris Philp

I think they say the same thing, but we obviously disagree.

Question put, That the amendment be made.

Online Safety Bill (Tenth sitting)

Chris Philp Excerpts
Committee stage
Tuesday 14th June 2022

Public Bill Committees
Alex Davies-Jones (Pontypridd) (Lab)

Good afternoon, Sir Roger; it is a pleasure, as ever, to serve under your chairship. I rise to speak to new clause 36, which has been grouped with amendment 142 and is tabled in the names of the hon. Members for Ochil and South Perthshire and for Aberdeen North.

I, too, pay tribute to Samaritans for all the work it has done in supporting the Bill and these amendments to it. As colleagues will be aware, new clause 36 follows a recommendation from the Law Commission dating back to July 2021. The commission recommended the creation of a new, narrow offence of the “encouragement or assistance” of serious self-harm with “malicious intent”. It identified that there is

“currently no offence that adequately addresses the encouragement of serious self-harm.”

The recommendation followed acknowledgement that

“self-harm content online is a worrying phenomenon”

and should have a

“robust fault element that targets deliberate encouragement of serious self-harm”.

Currently, there are no provisions in the Bill to create a new offence of assisting or encouraging self-harm, despite the fact that other recommendations from the Law Commission report have been brought into the Bill, such as creating a new offence of cyber-flashing and prioritising tackling illegal suicide content.

We all know that harmful suicide and self-harm content is material that has the potential to cause or exacerbate self-harm and suicidal behaviours. Content relating to suicide and self-harm falls into both categories in the Bill—illegal content and legal but harmful content. Encouraging or assisting suicide is also currently a criminal offence in England and Wales under the Suicide Act 1961, as amended by the Coroners and Justice Act 2009.

Content encouraging or assisting someone to take their own life is illegal and has been included as priority illegal content in the Bill, meaning that platforms will be required to proactively and reactively prevent individuals from encountering it, and search engines will need to structure their services to minimise the risk to individuals encountering the content. Other content, including content that positions suicide as a suitable way of overcoming adversity or describes suicidal methods, is legal but harmful.

The Labour party’s Front-Bench team recognises that not all content falls neatly into the legal but harmful category. What can be helpful for one user can be extremely distressing to others. Someone may find it extremely helpful to share their personal experience of suicide, for example, and that may also be helpful to other users. However, the same material could heighten suicidal feelings and levels of distress in someone else. We recognise the complexities of the Bill and the difficulties in finding a way to address this, but we should delineate harmful and helpful content relating to suicide and self-harm, and that should not detract from tackling legal but clearly harmful content.

In its current form, the Bill will continue to allow legal but clearly harmful suicide and self-harm content to be accessed by over-18s. Category 1 platforms, which have the highest reach and functionality, will be required to carry out risk assessments of, and set out in their terms and conditions their approach to, legal but harmful content in relation to over-18s. As the hon. Member for Ochil and South Perthshire outlined, however, the Bill’s impact assessment states that “less than 0.001%” of in-scope platforms

“are estimated to meet the Category 1 and 2A thresholds”,

and estimates that only 20 platforms will be required to fulfil category 1 obligations. There is no requirement on the smaller platforms, including those that actively encourage suicide, to do anything at all to protect over-18s. That simply is not good enough. That is why the Labour party supports new clause 36, and we urge the Minister to do the right thing by joining us.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

It is, as always, a great pleasure to serve under your chairmanship, Sir Roger. The hon. Member for Ochil and South Perthshire made an observation in passing about the Government’s willingness to listen and respond to parliamentarians about the Bill. We listened carefully to the extensive prelegislative scrutiny that the Bill received, including from the Joint Committee on which he served. As a result, we have adopted 66 of the changes that that Committee recommended, including on significant things such as commercial pornography and fraudulent advertising.

If Members have been listening to me carefully, they will know that the Government are doing further work or are carefully listening in a few areas. We may have more to say on those topics as the Bill progresses; it is always important to get the drafting of the provisions exactly right. I hope that that has indicated to the hon. Gentleman our willingness to listen, which I think we have already demonstrated well.

On new clause 36, it is important to mention that there is already a criminal offence of inciting suicide. It is a schedule 7 priority offence, so the Bill already requires companies to tackle content that amounts to the existing offence of inciting suicide. That is important. We would expect the promotion of material that encourages children to self-harm to be listed as a primary priority harm relating to children, where, again, there is a proactive duty to protect them. We have not yet published that primary priority harm list, but it would be reasonable to expect that material encouraging children to self-harm would be on it. Again, although we have not yet published the list of content that will be on the adult priority harm list—obviously, I cannot pre-empt the publication of that list—one might certainly wish for content that encourages adults to self-harm to appear on it too.

The hon. Gentleman made the point that duties relating to adults would apply only to category 1 companies. Of course, the ones that apply to children would apply to all companies where there was significant risk, but he is right that were that priority harm added to the adult legal but harmful list, it would apply only to category 1 companies.

Kim Leadbeater

Will the Minister give way?

Chris Philp

In a second, but I may be about to answer the hon. Lady’s question.

Those category 1 companies are likely to be small in number, as I think the shadow Minister said, but I would imagine—I do not have the exact number—that they cover well over 90% of all traffic. However, as I hinted on the Floor of the House on Second Reading—we may well discuss this later—we are thinking about including platforms that may not meet the category 1 size threshold but none the less pose high-level risks of harm. If that is done—I stress “if”—it will address the point raised by the hon. Member for Ochil and South Perthshire. That may answer the point that the hon. Member for Batley and Spen was going to raise, but if not, I happily give way.

Kim Leadbeater

It kind of does, but the Minister has raised some interesting points about children and adults and the risk of harm. To go back to the work of Samaritans, it is really important to talk about the fact that suicide is the biggest killer of young people aged 16 to 24, so it transcends the barrier between children and adults. With the right hon. Member for Basingstoke, the hon. Member for Aberdeen North, and the shadow Minister, my hon. Friend the Member for Pontypridd, we have rightly talked a lot about women, but it is really important to talk about the fact that men account for three quarters of all suicides. Men aged between 45 and 49 are most at risk of suicide—the rate among that group has been persistently high for years. It is important that we bring men into the discussion about suicide.

Chris Philp

I am grateful for the element of gender balance that the hon. Member has introduced, and she is right to highlight the suicide risk. Inciting suicide is already a criminal offence under section 2 of the Suicide Act 1961 and we have named it a priority offence. Indeed, it is the first priority offence listed under schedule 7—it appears a third of the way down page 183—for exactly the reason she cited, and a proactive duty is imposed on companies by paragraph 1 of schedule 7.

On amendment 142 and the attendant new clause 36, the Government agree with the sentiment behind them—namely, the creation of a new offence of encouraging or assisting serious self-harm. We agree with the substance of the proposal from the hon. Member for Ochil and South Perthshire. As he acknowledged, the matter is under final consideration by the Law Commission and our colleagues in the Ministry of Justice. The offence initially proposed by the Law Commission was wider in scope than that proposed under new clause 36. The commission’s proposed offence covered the offline world, as well as the online one. For example, the new clause as drafted would not cover assisting a person to self-harm by providing them with a bladed article because that is not an online communication. The offence that the Law Commission is looking at is broader in scope.

The Government have agreed in principle to create an offence based on the Law Commission recommendation in separate legislation, and once that is done the scope of the new offence will be wider than that proposed in the new clause. Rather than adding the new clause and the proposed limited new offence to this Bill, I ask that we implement the offence recommended by the Law Commission, the wider scope of which covers the offline world as well as the online world, in separate legislation. I would be happy to make representations to my colleagues in Government, particularly in the MOJ, to seek clarification about the relevant timing, because it is reasonable to expect it to be implemented sooner rather than later. Rather than rushing to introduce that offence with limited scope under the Bill, I ask that we do it properly as per the Law Commission recommendation.

Once the Law Commission recommendation is enacted in separate legislation, to which the Government have already agreed in principle, it will automatically be incorporated into clause 52(4)(d), which relates to illegal content, and under clause 176, the Secretary of State may, subject to parliamentary approval, designate the new offence as a priority offence under schedule 7 via a statutory instrument. The purpose of amendment 142 can therefore be achieved through an SI.

The Government entirely agree with the intention behind the proposed new clause 36, but I think the way to do this is to implement the full Law Commission offence as soon as we can and then, if appropriate, add it to schedule 7 by SI. The Government agree with the spirit of the hon. Gentleman’s proposal, but we already have a plan to do a more complete job in creating the new offence.

John Nicolson

I have nothing to add and, having consulted my hon. Friend the Member for Aberdeen North, on the basis of the Minister’s assurances, I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Chris Philp

I beg to move amendment 116, in schedule 7, page 183, line 11, at end insert—

“1A An offence under section 13 of the Criminal Justice Act (Northern Ireland) 1966 (c. 20 (N.I.)) (assisting suicide etc).”

This amendment adds the specified offence to Schedule 7, with the effect that content amounting to that offence counts as priority illegal content.

The Chair

With this it will be convenient to discuss Government amendments 117 to 126.

Chris Philp

These amendments pick up a question asked by the hon. Member for Aberdeen North much earlier in our proceedings. In schedule 7 we set out the priority offences that exist in English and Welsh law. We have consulted the devolved Administrations in Scotland and Northern Ireland extensively, and I believe we have agreed with them a number of offences in Scottish and Northern Irish law that are broadly equivalent to the English and Welsh offences already in schedule 7. Basically, Government amendments 116 to 126 add those devolved offences to the schedule.

In future, if new Scottish or Northern Irish offences are created, the Secretary of State will be able to consult Scottish or Northern Irish Ministers and, by regulations, amend schedule 7 to add the new offences that may be appropriate if conceived by the devolved Parliament or Assembly in due course. That, I think, answers the question asked by the hon. Lady earlier in our proceedings. As I say, we consulted the devolved Administrations extensively and I hope that the Committee will assent readily to the amendments.

Alex Davies-Jones

The amendments aim to capture all the criminal offences in other parts of the UK to be covered by the provisions of the Bill, as the Minister outlined. An offence in one part of the UK will be considered an offence elsewhere, for the purposes of the Bill.

With reference to some of the later paragraphs, I am keen for the Minister to explain briefly how this will work in the case of Scotland. We believe that the revenge porn offence in Scotland is more broadly drawn than the English version, so the level of protection for women in England and Wales will be increased. Can the Minister confirm that?

The Bill will not apply the Scottish offence to English offenders, but it means that content that falls foul of the law in Scotland, but not in England or Wales, will still be relevant regulated content for service providers, irrespective of the part of the UK in which the service users are located. That makes sense from the perspective of service providers, but I will be grateful for clarity from the Minister on this point.

--- Later in debate ---
Kirsty Blackman (Aberdeen North) (SNP)

I thank the Minister for tabling the amendments. In the evidence sessions, we heard about omissions from schedule 7, which did not include Northern Irish and Scottish offences. Such offences were included in schedule 6 but, at that point, not in schedule 7.

I appreciate that the Minister has worked with the devolved Administrations to table the amendments. I also appreciate the way in which amendment 126 is written, such that the Secretary of State “must consult” Scottish Ministers and the Department of Justice in Northern Ireland before making regulations that relate to legislation in either of the devolved countries. I am glad that the amendments have been drafted in this way and that the concern that we heard about in evidence no longer seems to exist, and I am pleased with the Minister’s decision about the way in which to make any future changes to legislation.

I agree with the position put forward by the hon. Member for Pontypridd. My understanding, from what we heard in evidence a few weeks ago, is that, legally, all will have to agree with the higher bar of the offences, and therefore anyone anywhere across the UK will be provided with the additional level of protection. She is right that the offence might not apply to everyone, but the service providers will be subject to the requirements elsewhere. Similarly, that is my view. Once again, I thank the Minister.

Chris Philp

Briefly, I hope that the amendments provide further evidence to the Committee of the Government’s willingness to listen and to respond. I can provide the confirmation that the hon. Members for Aberdeen North and for Pontypridd requested: the effect of the clauses is a levelling up—if I may put it that way. Any of the offences listed effectively get applied to the UK internet, so if there is a stronger offence in any one part of the United Kingdom, that will become applicable more generally via the Bill. As such, the answer to the question is in the affirmative.

Amendment 116 agreed to.

The Chair

My custom with amendments to be moved formally is to call them by number. If Members wish to vote on them, they should shout; otherwise, I will rattle through them. It is quicker that way.

Amendments made: 117, in schedule 7, page 183, line 29, at end insert—

“4A An offence under section 50A of the Criminal Law (Consolidation) (Scotland) Act 1995 (racially-aggravated harassment).”

This amendment adds the specified offence to Schedule 7, with the effect that content amounting to that offence counts as priority illegal content.

Amendment 118, in schedule 7, page 183, line 36, at end insert—

“5A An offence under any of the following provisions of the Protection from Harassment (Northern Ireland) Order 1997 (S.I. 1997/1180 (N.I. 9))—

(a) Article 4 (harassment);

(b) Article 6 (putting people in fear of violence).”

This amendment adds the specified offences to Schedule 7, with the effect that content amounting to those offences counts as priority illegal content.

Amendment 119, in schedule 7, page 184, line 2, at end insert—

“6A An offence under any of the following provisions of the Criminal Justice and Licensing (Scotland) Act 2010 (asp 13)—

(a) section 38 (threatening or abusive behaviour);

(b) section 39 (stalking).”

This amendment adds the specified offences to Schedule 7, with the effect that content amounting to those offences counts as priority illegal content.

Amendment 120, in schedule 7, page 184, line 38, at end insert—

“12A An offence under any of the following provisions of the Criminal Justice (Northern Ireland) Order 1996 (S.I. 1996/3160 (N.I. 24))—

(a) Article 53 (sale etc of knives);

(b) Article 54 (sale etc of knives etc to minors).”

This amendment adds the specified offences to Schedule 7, with the effect that content amounting to those offences counts as priority illegal content.

Amendment 121, in schedule 7, page 184, line 42, at end insert—

“13A An offence under any of the following provisions of the Firearms (Northern Ireland) Order 2004 (S.I. 2004/702 (N.I. 3))—

(a) Article 24 (sale etc of firearms or ammunition without certificate);

(b) Article 37(1) (sale etc of firearms or ammunition to person without certificate etc);

(c) Article 45(1) and (2) (purchase, sale etc of prohibited weapons);

(d) Article 63(8) (sale etc of firearms or ammunition to people who have been in prison etc);

(e) Article 66A (supplying imitation firearms to minors).”

This amendment adds the specified offences to Schedule 7, with the effect that content amounting to those offences counts as priority illegal content.

Amendment 122, in schedule 7, page 184, line 44, at end insert—

“14A An offence under any of the following provisions of the Air Weapons and Licensing (Scotland) Act 2015 (asp 10)—

(a) section 2 (requirement for air weapon certificate);

(b) section 24 (restrictions on sale etc of air weapons).”

This amendment adds the specified offences to Schedule 7, with the effect that content amounting to those offences counts as priority illegal content.

Amendment 123, in schedule 7, page 185, line 8, at end insert—

“16A An offence under any of the following provisions of the Sexual Offences (Northern Ireland) Order 2008 (S.I. 2008/1769 (N.I. 2))—

(a) Article 62 (causing or inciting prostitution for gain);

(b) Article 63 (controlling prostitution for gain).”—(Chris Philp.)

This amendment adds the specified offences to Schedule 7, with the effect that content amounting to those offences counts as priority illegal content.

--- Later in debate ---
Alex Davies-Jones

Schedule 7 sets out the list of criminal content that in-scope firms will be required to remove as a priority. Labour was pleased to see new additions to the most recent iteration, including criminal content relating to online drug and weapons dealing, people smuggling, revenge porn, fraud, promoting suicide and inciting or controlling prostitution for gain. The Government’s consultation response suggests that the systems and processes that services may use to minimise illegal or harmful content could include user tools, content moderation and recommendation procedures.

More widely, although we appreciate that the establishment of priority offences online is the route the Government have chosen to go down with the Bill, we believe the Bill remains weak in addressing harms to adults and wider societal harms, and it has seemingly missed a number of known harms to both adults and children; we feel that is a serious omission. Three years on from the White Paper, the Government know where the gaps are, yet they have failed to address them. That is why we are pleased to support the amendment tabled by the hon. Members for Ochil and South Perthshire and for Aberdeen North.

Human trafficking offences are a serious omission from schedule 7 that must urgently be rectified. As we all know from whistleblower Frances Haugen’s revelations, Facebook stands accused, among a vast array of other social problems, of profiting from the trade and sale of human beings—often for domestic servitude—by human traffickers. We also know that, according to internal documents, the company has been aware of the problems since at least 2018. As the hon. Member for Ochil and South Perthshire said, we know that a year later, on the heels of a BBC report that documented the practice, the problem was said to be so severe that Apple itself threatened to pull Facebook and Instagram from its app store. It was only then that Facebook rushed to remove content related to human trafficking and made emergency internal policy changes to avoid commercial consequences described as “potentially severe” by the company. However, an internal company report detailed that the company did not take action prior to public disclosure and threats from Apple—profit over people.

In a complaint to the US Securities and Exchange Commission first reported by The Wall Street Journal, whistleblower Haugen wrote:

“Investors would have been very interested to learn the truth about Facebook almost losing access to the Apple App Store because of its failure to stop human trafficking on its products.”

I cannot believe that the Government have failed to commit to doing more to tackle such abhorrent practices, which are happening every day. I therefore urge the Minister to do the right thing and support amendment 90.

Chris Philp

The first thing to make clear to the Committee and anyone listening is that, of course, offences under the Modern Slavery Act 2015 are brought into the scope of the illegal content duties of this Bill through clause 52(4)(d), because such offences involve an individual victim.

Turning to the priority offences set out in schedule 7—I saw this when I was a Home Office Minister—modern slavery is generally associated with various other offences that are more directly visible and identifiable. Modern slavery itself can be quite hard to identify. That is why our approach is, first, to incorporate modern slavery as a regular offence via clause 52(4)(d) and, secondly, to specify as priority offences those things that are often identifiable symptoms of it and that are feasibly identified. Those include many of the offences listed in schedule 7, such as causing, inciting or controlling prostitution for gain, as in paragraph 16 on sexual exploitation, which is often the manifestation of modern slavery; money laundering, which is often involved where modern slavery takes place; and assisting illegal immigration, because modern slavery often involves moving somebody across a border, which is covered in paragraph 15 on assisting illegal immigration, as per section 25 of the Immigration Act 1971.

Modern slavery comes into scope directly via clause 52(4)(d), and because the practicably identifiable consequences of modern slavery are listed as priority offences, I think we do have this important area covered.

Kirsty Blackman

I appreciate that the Minister thinks that there are other measures that cover this offence, but will he keep it under consideration going forward? I do not think that that is too much to ask. Part of the logic behind that is that some of the other offences, where the reasons behind them must be proved, are much more difficult to define or prove than the modern slavery offences that we are asking to be added here. Whether he accepts the amendment or not, will he commit to considering the matter and not just saying, “Absolutely not”? That would be helpful for us and the many organisations that are keen for such things to be included.

Chris Philp

I am happy to give that further consideration, but please do not interpret that as a firm commitment. I repeat that the Modern Slavery Act is brought into the scope of this Bill via clause 52(4)(d).

John Nicolson

I have nothing further to add. I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Schedule 7, as amended, agreed to.

Clause 53

“Content that is harmful to children” etc

--- Later in debate ---
Kirsty Blackman

I have a couple of questions for the Minister. The first is about the interaction of subsection (4)(c) and subsection (5). I am slightly confused about how those subsections interact, because subsection (4)(c) states that anything that is not within the terms of primary priority content or priority content but is harmful to

“an appreciable number of children”

is included as

“content that is harmful to children”.

That is completely reasonable. However, subsection (5) excludes illegal content and content with a “potential financial impact”. I appreciate that these provisions are drafted in quite a complicated way, but it would be useful to have an understanding of what that exclusion means. If it means that content cannot count as harmful because it is financial in nature, that is a problem, because it would explicitly exclude gambling-type sites, loot boxes and anything of that sort, which by their nature are intentionally addictive and try to get children or adults to part with significant amounts of cash.

How will clause 53 be future-proofed? I am not suggesting that there is no future proofing, but it would be helpful to me and fellow Committee members if the Minister explained how the clause will deal with new emerging harms and things that may not necessarily fall within the definitions that we set initially. How will those definitions evolve and change as the internet evolves and changes, and as the harms with which children are presented evolve and change?

And finally—I know that the Minister mentioned earlier that saying, “And finally”, in a speech is always a concern, but I am saying it—I am slightly concerned about the wording in subsection (4)(c), which refers to

“material risk of significant harm to an appreciable number of children”,

because I am not clear what an “appreciable number” is. If there is significant harm to one child from content, and content that is incredibly harmful to children is stumbled upon by a child, is it okay for that provider to have such content? It is not likely to be accessed by an “appreciable number of children” and might be accessed by only a small number, but if the Minister could give us an understanding of what the word “appreciable” means in that instance, that would be greatly appreciated.

Chris Philp

There are one or two points to pick up on. A question was raised about algorithms, and it is worth saying that the risk assessments that platforms must undertake will include consideration of the operation of algorithms. It is important to make it absolutely clear that that is the case.

The shadow Minister asked about the definition of harm, and whether all the harms that might concern Parliament, and many of us as parents, will be covered. It may be helpful to refer to the definition of harm provided in clause 187, at the top of page 153. Committee members will note that the definition is very wide and that subsection (2) defines it as “physical or psychological harm”, so I hope that partly answers the shadow Minister’s question.

Dean Russell (Watford) (Con)

I am jumping ahead a bit, but I know that we will discuss clause 150, Zach’s law and epilepsy in particular at some point. Given the definition that my hon. Friend has just cited, am I correct to assume that the physical harm posed to those with epilepsy who might be targeted online will be covered, and that it is not just about psychological harm?

Chris Philp

I admire my hon. Friend’s attention to the debate. The definition of harm for the harmful communications offence in clause 150 is set out in clause 150(4). In that context, harm is defined slightly differently, as

“psychological harm amounting to at least serious distress”.

The definition of harm in clause 187 that I read out is the definition used elsewhere in the Bill. However, as I said before in the House and in the evidence session, the Government’s belief and intention is that epilepsy trolling would fall within the scope of clause 150, because giving someone an epileptic fit clearly has a physical implication, as my hon. Friend said, and also causes psychological harm.

Despite the fact that the definition of harm in clause 187 does not apply in clause 150, which has its own definition of harm, I am absolutely categoric that epilepsy trolling is caught by clause 150 because of the psychological harm it causes. I commend my hon. Friend the Member for Watford for being so attentive on the question of epilepsy, and also in this debate.

Returning to the definition of harm in clause 187, besides the wide definition covering physical and psychological harm, clause 187(4) makes it clear that harm may also arise not just directly but if the content prompts individuals to

“act in a way that results in harm to themselves or that increases the likelihood of harm to themselves”.

Clause 187(4)(b) covers content where the

“individuals do or say something to another individual that results in”

that individual suffering harm. I hope the shadow Minister is reassured that the definition of harm that applies here is extremely wide in scope.

There was a question about media literacy, which I think the hon. Member for Batley and Spen raised in an intervention. Media literacy duties on Ofcom already exist in the Communications Act 2003. The Government published a comprehensive and effective media literacy strategy about a year ago. In December—after the first version of the Bill was produced, but before the second and updated version—Ofcom updated its policy in a way that went beyond the duties contained in the previous version of the Bill. From memory, that related to the old clause 103, in the version of the Bill published in May last year, which is of course not the same clause in this version of the Bill, as it has been updated.

The hon. Member for Aberdeen North raised, as ever, some important points of detail. She asked about future proofing. The concept of harm expressed in the clause is a general concept of harm. The definition of harm is whatever is harmful to children, which includes things that we do not know about at the moment and that may arise in the future. Secondly, primary priority content and priority content that is harmful can be updated from time to time by a statutory instrument. If some new thing happens that we think deserves to be primary priority content or priority content that is harmful to children, we can update that using a statutory instrument.

The hon. Lady also asked about the exclusions in clause 53(5). The first exclusion, in subsection (5)(a), is illegal content, which is excluded because it is already covered in clause 52. The second limb, subsection (5)(b), covers some financial offences, which are excluded because financial services are separately regulated. The hon. Lady used the example of gambling. Gambling is separately regulated by the Gambling Act 2005, a review of which is imminent. There are already very strong provisions in that Act, enforced by the regulator, the Gambling Commission, including a hard-edged prohibition on gambling by people under 18.

Kirsty Blackman

However, I do not think that loot boxes even existed in 2005 when that Act was published. Loot boxes are gambling. They may not be covered by that legislation, but they are gambling. Will the Minister consider whether those harms are unintentionally excluded by clause 53?

Chris Philp

We are getting into some detail here. In the unlikely event that any member of the Committee does not know what a loot box is, it is where someone playing a game can buy extra lives or enhance the game’s functionality somehow by paying some money. There have been some cases where children have stolen their parent’s credit card and bought these things in large numbers.

Kirsty Blackman

Having played lots of games, I can clarify that people do not know what they are getting with a loot box, so they are putting money forward but do not know whether they will get a really good piece of armour or a really crap piece of armour. It is literally gambling, because children do not know what will come out of the box, as opposed to just buying a really good piece of armour with £2.99 from their parent’s credit card.

Chris Philp

However, the reward is non-monetary in nature. For that reason, the Government’s view—if I can test your patience momentarily, Sir Roger, as we are straying somewhat outside this particular debate—is that loot boxes will not be covered by the gambling review, because we do not see them as gambling. However, we do see them as an issue that needs to be addressed, and that will happen via the online advertising programme, which will be overseen by the Minister for Media, Data and Digital Infrastructure, my hon. Friend the Member for Hornchurch and Upminster (Julia Lopez). That will happen shortly and advertising legislation will follow, so loot boxes will be addressed in the online advertising programme and the subsequent legislation.

The other question raised by the hon. Member for Aberdeen North was about the definition of “an appreciable number”. I have a couple of points to make. By definition, anything that is illegal is covered already in schedule 7 or through clause 52(4)(d), which we have mentioned a few times. Content that is

“primary priority content that is harmful to children”

or

“priority content that is harmful to children”

is covered in clause 53(4)(a) and (b), so we are now left with the residue: anything left over that is neither illegal nor in the priority categories but might still be harmful. By definition, we have excluded all the serious harms already, because they would be either illegal or in the priority categories. The reason for the qualifier “appreciable” is to make sure that we are dealing only with the residual non-priority harmful matters, and that the duty is reasonable. What constitutes “appreciable” will ultimately be set out in Ofcom guidance, but if content affected only a tiny handful of users and was not a priority harm, and was therefore not considered by Parliament to be of the utmost priority, the duty would be unlikely to apply. Because this is just the residual category, that is a proportionate and reasonable approach to take.

Kirsty Blackman

Given the Government’s ability to designate priority content and primary priority content through secondary legislation, is the Minister telling me that if they decided loot boxes were not adequately covered by the forthcoming legislation, and discovered that something like this was a big issue, they could add it to one of the two priority content designations?

Chris Philp

The hon. Member is asking me a somewhat technical question, and I hesitate to answer without taking full advice, but I think the answer is yes. The reason that loot boxes are not, in our view, considered gambling is that the reward has no monetary value, so the exclusion in clause 53(5)(b)(i) does not apply. On a quick off-the-cuff reading, it does not strike me that the exclusions in (5)(b)(ii) or (iii) would apply to loot boxes either, so I believe—and officials who know more about this than I do are nodding—that the hon. Lady is right: it would be possible for loot boxes to become primary priority content or priority content by way of a statutory instrument.

Question put and agreed to.

Clause 53 accordingly ordered to stand part of the Bill.

Clause 54

“Content that is harmful to children” etc

John Nicolson

I beg to move amendment 83, in clause 54, page 50, line 39, at end insert—

“(2A) Priority content designated under subsection (2) must include content that contains health-related misinformation and disinformation, where such content is harmful to adults.”

This amendment would amend Clause 54 so that the Secretary of State’s designation of “priority content that is harmful to adults” must include a description of harmful health related misinformation or disinformation (as well as other priority content that might be designated in regulations by the Secretary of State).

The Bill requires category 1 service providers to set out how they will tackle harmful content on their platforms. In order for this to work, certain legal but harmful content must be designated in secondary legislation as

“priority content that is harmful to adults.”

As yet, however, it is not known what will be designated as priority content or when. There have been indications from Government that health-related misinformation and disinformation will likely be included, but there is no certainty. The amendment would ensure that harmful health-related misinformation and disinformation would be designated as priority content that is harmful to adults.

--- Later in debate ---
Nick Fletcher

Thank you, Sir Roger. I think that the Minister would agree that this is probably one of the most contentious parts of the Bill. It concerns legal but harmful content, which is causing an awful lot of concern out there. The clause says that the Secretary of State may in regulations define as

“priority content that is harmful to adults”

content that he or she considers to present

“a material risk of significant harm to an appreciable number of adults”.

We have discussed this issue in other places before, but I am deeply concerned about freedom of speech and people being able to say what they think. What is harmful to me may not be harmful to any other colleagues in this place. We would be leaving it to the Secretary of State to make that decision. I would like to hear the Minister’s thoughts on that.

Chris Philp

I am very happy to reply to the various queries that have been made. I will start with the points on vaccine disinformation raised by the hon. Members for Ochil and South Perthshire and for Pontypridd. The Government strongly agree with the points they made about the damaging effects of vaccine misinformation and the fact that many of our fellow citizens have probably died as a result of being misled into refusing the vaccine when it is, of course, perfectly safe. We strongly share the concerns they have articulated.

Over the past two years, the Department for Digital, Culture, Media and Sport has worked together with other Departments to develop a strong operational response to this issue. We have established a counter-disinformation unit within DCMS whose remit is to identify misinformation and work with social media firms to get it taken down. The principal focus of that unit during the pandemic was, of course, covid. In the past three months, it has focused more on the Russia-Ukraine conflict, for obvious reasons.

In some cases, Ministers have engaged directly with social media firms to encourage them to remove content that is clearly inappropriate. For example, in the Russia-Ukraine context, I have had conversations with social media companies that have left up clearly flagrant Russian disinformation. This is, therefore, an area that the Government are concerned about and have been acting on operationally already.

Obviously, we agree with the intention behind the amendment. However, the way to handle it is not to randomly drop an item into the Bill and leave the rest to a statutory instrument. Important and worthy though it may be to deal with disinformation, and specifically harmful health-related disinformation, there are plenty of other important things that one might add that are legal but harmful to adults, so we will not accept the amendment. Instead, we will proceed as planned by designating the list via a statutory instrument. I know that a number of Members of Parliament, probably including members of this Committee, would find it helpful to see a draft list of what those items might be, not least to get assurance that health-related misinformation and disinformation is on that list. That is something that we are considering very carefully, and more news might be forthcoming as the Bill proceeds through Parliament.

Dame Maria Miller (Basingstoke) (Con)

My hon. Friend has talked about the Department’s counter-disinformation unit. Do the Government anticipate that that function will continue, or will they expect Ofcom to take it on?

Chris Philp

The work of the counter-disinformation unit is valuable. We look at these things on a spending-review-by-spending-review basis, and as far as I am aware we intend to continue the unit over the current spending review period. Clearly, I cannot commit future Ministers in perpetuity, but my personal view—if I am allowed to express it—is that the unit performs a useful function and could valuably be continued into the future. I think it is useful for the Government, as well as Ofcom, to have eyes directly on this issue, but I cannot speak for future Ministers; I can only give my right hon. Friend my own view.

I hope that I have set out my approach. We have heard the calls to publish the list so that parliamentarians can scrutinise it, and we also heard them on Second Reading.

I will now turn to the question raised by my hon. Friend the Member for Don Valley regarding freedom of expression. Those on one side of the debate are asking us to go further and to be clearer, while those on the other side have concerns about freedom of expression. As I have said, I honestly do not think that these legal but harmful provisions infringe on freedom of speech, for three reasons. First, even when the Secretary of State decides to designate content and Parliament approves of that decision through the affirmative procedure—Parliament gets to approve, so the Secretary of State is not acting alone—that content is not being banned. The Bill does not say that content designated as legal but harmful should immediately be struck from every corner of the internet. It simply says that category 1 companies—the big ones—have to do a proper risk assessment of that content and think about it properly.

Secondly, those companies have to have a policy to deal with that content, but that policy is up to them. They could have a policy that says, “It is absolutely fine.” Let us say that health disinformation is on the list, as one would expect it to be. A particular social media firm could have a policy that says, “We have considered this. We know it is risky, but we are going to let it happen anyway.” Some people might say that that is a weakness in the Bill, while others might say that it protects freedom of expression. It depends on one’s point of view, but that is how it works. It is for the company to choose and set out its policy, and the Bill requires the company to enforce that policy consistently. I do not think that the requirements I have laid out amount to censorship or an unreasonable repression of free speech, because the platforms can still set their own terms and conditions.

There is also the general duty to have regard to free speech, which is introduced in clause 19(2). At the moment, no such duty exists. One might argue that the duty could be stronger, as my hon. Friend suggested previously, but it is unarguable that, for the first time ever, there is a duty on the platforms to have regard to free speech.

--- Later in debate ---
Nick Fletcher

The argument has been made that the social media companies are doing this anyway, but two wrongs don’t make a right. We need to stop them doing it. I understand what we are trying to do here. We can see straight away that the Opposition want to be tighter on this. At a later date, if the Bill goes through as it is, freedom of speech will be gradually suppressed, and I am really concerned about that. My hon. Friend said that it would come back to Parliament, which I am pleased about. Are the priorities going to be written into the Bill? Will we be able to vote on them? If the scope is extended at any point in time, will we be able to vote on that, or will the Secretary of State just say, “We can’t have that so we’re just going to ban it”?

Chris Philp

I will answer the questions in reverse order. The list of harms will not be in the Bill; the amendment seeks to put one of the harms in the Bill but not the others, so no. The harms—either the initial list or any addition to or subtraction from it—will be listed in an affirmative statutory instrument, which means that the House will be able to look at it and, if it wants, to vote on it. So Parliament will get a chance to look at the initial list when it is published in an SI, and if anything is to be added in one, two or three years’ time, the same will apply.

Nick Fletcher

So will we be able to vote on any extension of the scope of the Bill at any time? Will that go out to public consultation as well?

Chris Philp

Yes. There is an obligation on the Secretary of State to consult—[Interruption.] Did I hear someone laugh?—before proposing a statutory instrument to add things. There is a consultation first and then, if extra things are to be added—in my hon. Friend’s language, if the scope is increased—that would be votable by Parliament, because it is an affirmative SI. So the answer is yes to both questions. Yes, there will be consultation in advance, and yes, if this Government or a future Government wanted to add anything, Parliament could vote on it if it wanted to, because it will be an affirmative SI. That is a really important point.

Alex Davies-Jones

Will the Minister give way?

Chris Philp

In a moment; I want to answer the other point made by my hon. Friend the Member for Don Valley first. He said that two wrongs don’t make a right. I am not defending the fact that social media firms act in a manner that is arbitrary and censorious at the moment, and I am not saying that it is okay for them to carry on. The point I was making was a different one: they act censoriously and arbitrarily at times at the moment, and the Bill will diminish their ability to do that in a couple of ways. First, for the legal but harmful content that he is worried about, they will have a duty to act consistently; if they do not, Ofcom will be able to enforce against them. So their liberty to behave arbitrarily, for this category of content at least, will be circumscribed: they will now have to be consistent. For other content that is outside the scope of this clause—which I guess therefore does not worry my hon. Friend—they can still be arbitrary, but for this they have got to be consistent.

There is also the duty to have regard to freedom of expression, and there are protections for content of democratic and journalistic importance in clauses 15 and 16. Although those clauses are not perfect and some people say they should be stronger, they are at least better than what we have now. When I say that this is good for freedom of speech, I mean that nothing here infringes on freedom of speech; to the extent that the Bill moves things one way or the other, it moves us somewhat in the direction of protecting free speech more than is the case at the moment, for the reasons I have set out. Rather than trying the patience of colleagues, I will be happy to debate the issue in more detail either in this Committee or outside, if that is helpful.

Alex Davies-Jones

I thank the Minister for giving way; I think that is what he was doing as he sat down.

Chris Philp

indicated assent.

Alex Davies-Jones

Just for clarity, the hon. Member for Don Valley and the Minister have said that Labour Members are seeking to curtail or tighten freedom of expression and freedom of speech, but that is not the case. We fundamentally support free speech, as we always have. The Bill addresses systems and processes, and that is what it should do—the Minister, the Labour party and I are in full alignment on that. We do not think that the Bill should restrict freedom of speech. I would just like to put that on the record.

We also share the concerns expressed by the hon. Member for Don Valley about the Secretary of State’s potential powers, the limited scope and the extra scrutiny that Parliament might have to undertake on priority harms, so I hope he will support some of our later amendments.

Chris Philp

I am grateful to the shadow Minister for confirming her support for free speech. Perhaps I could take this opportunity to apologise to you, Sir Roger, and to Hansard for turning round. I will try to behave better in future.

John Nicolson

I find myself not entirely reassured, so I think we should press the amendment to a vote.

Question put, That the amendment be made.

--- Later in debate ---
Chris Philp

I have heard my right hon. Friend’s points about a standing Joint Committee for post-legislative implementation scrutiny. On the comments about the time, I agree that the Ofcom review needs to be far enough into the future that it can be meaningful, hence the three-year time period.

On the substance of amendment 62, tabled by the shadow Minister, I can confirm that the Government are already undertaking research and working with stakeholders to identify what the priority harms will be. That consideration includes evidence from various civil society organisations, victims’ organisations and many others who represent the interests of users online. The wider consultation beyond Ofcom that the amendment would require is therefore already happening as a matter of practicality.

We are concerned, however, that making this a formal consultation in the legal sense, as the amendment would, would introduce delay, because a whole sequence of things has to happen after Royal Assent: first, we have to designate the priority harms by statutory instrument, and then Ofcom has to publish its risk assessments and codes of practice. Inserting a formal legal consultation step into that would add at least four and perhaps six months to the process of implementing the Act. I know that that was not the hon. Lady’s intention and that she is concerned about getting the Act implemented quickly. For that reason, the Government do not want to insert a formal legal consultation step into the process, but I am happy to confirm that we are already engaging in consultation on an informal basis and will continue to do so. I ask respectfully that amendment 62 be withdrawn.

The purpose of clauses 55 and 56 has been touched on already, and I have nothing in particular to add.

Alex Davies-Jones

I am grateful for the Minister’s comments on the time that these things would take, but I cannot see why they could not happen concurrently with the current consultation, or why they would take an additional four to six months. Could he clarify that?

Chris Philp

A formal statutory consultation could happen only after the passage of the Bill, whereas we can do, and are doing, the informal, non-statutory consultation now.

Question put, That the amendment be made.

--- Later in debate ---
Kirsty Blackman

I appreciate that clarification. I just wanted to make it absolutely clear that I strongly believe that anonymity is not just a cover for people who intend to do harm on the internet; it is a very good protection, in particular for people who are seeking out community. I think that that is important.

If you will allow me to say a couple of things about the next clause, Sir Roger: Mencap raised the issue of vulnerable users, specifically vulnerable adult users, in relation to forms of identity verification. If the Minister or Ofcom could give consideration to including travel passes or adult passes, it might make the internet a much easier place to navigate for people who do not have control of their own documentation—they may not have access to their passport, birth certificate or anything of that sort—but who would be able to provide a travel pass, because that is in their possession.

Chris Philp

We have heard quite a lot about the merits of clause 57, and I am grateful to colleagues on both sides for pointing those out. The hon. Member for Pontypridd asked about the effectiveness of the user identity verification processes and how those might occur—whether verification would be done individually by each company for its own users, or whether a whole industry would develop even further, with third parties providing verification that could then be used across a whole number of companies.

Some of those processes exist already in relation to age verification, and I think that some companies are already active in this area. I do not think that it would be appropriate for us, in Parliament, to specify those sorts of details. It is ultimately for Ofcom to issue that guidance under clause 58, and it is, in a sense, up to the market and to users to develop their own preferences. If individual users prefer to verify their identity once and then have that used across multiple platforms, that will itself drive the market. I think that there is every possibility that that will happen. [Interruption.]

The Chair

Order. There is a Division on the Floor of the House. The Committee will sit again in 15 minutes. As far as I am aware, there will only be one vote on this; if there are two, we will return 15 minutes later than that.

--- Later in debate ---
On resuming
Chris Philp

I was just concluding my remarks on clause stand part, Sir Roger. User choice and Ofcom guidance will ultimately determine the shape of this market.

The shadow Minister, the hon. Member for Pontypridd, expressed concerns about privacy. That is of course why the list of people Ofcom must consult—at clause 58(3)(a)—specifies the Information Commissioner, to ensure that Ofcom’s guidance properly protects the privacy of users, for the reasons that the shadow Minister referred to in her speech.

Finally, on competition, if anyone attempts to develop an inappropriate monopoly position in this area, the Competition and Markets Authority’s usual powers will apply. On that basis, I commend the clause to the Committee.

Question put and agreed to.

Clause 57 accordingly ordered to stand part of the Bill.

Clause 58

OFCOM’s guidance about user identity verification

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones

As we have said previously, it is absolutely right that Ofcom produces guidance for providers of category 1 services to assist with their compliance with the duty. We very much welcome the recognition in subsection (2) of forms of identity verification for vulnerable adult users; once again, however, we feel that that should go further, as outlined in new clause 8.

Chris Philp

Clause 58, which was touched on in our last debate, simply sets out Ofcom’s duty to publish guidance for category 1 services to assist them in complying with the user identification duty set out in clause 57. We have probably covered the main points, so I will say nothing further.

Question put and agreed to.

Clause 58 accordingly ordered to stand part of the Bill.

Clause 59

Requirement to report CSEA content to the NCA

Question proposed, That the clause stand part of the Bill.

Barbara Keeley (Worsley and Eccles South) (Lab)

You are really moving us at pace, Sir Roger. It is a pleasure to serve in Committee with you in the Chair.

It is welcome that regulated services will have to report all child sexual exploitation and abuse material that they detect on their platform. The Government’s decision to move away from the approach of a regulatory code of practice to a mandatory reporting requirement is an important improvement to the draft Bill.

For companies to report child sexual exploitation and abuse material correctly to the mandatory reporting body, they will need access to accurate datasets that determine whether something they intend to report is child sexual exploitation and abuse content. What guidance will be made available to companies so that they can proactively detect CSEA, and what plans are in place to assist companies to identify potential CSEA that has not previously been identified? The impact assessment mentions that, for example, BT is planning to use the Internet Watch Foundation’s hash list, which is compliant with UK law enforcement standards, to identify CSEA proactively. Hashing is a technology used to prevent access to known CSEA: a hash is a unique string of letters and numbers that is applied to an image and can then be matched every time a user attempts to upload a known illegal image to a platform. It relies, however, on the CSEA already having been detected.
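By way of illustration, the matching step described here can be sketched in a few lines of Python. This is a minimal, purely illustrative sketch: the hash value below is a made-up placeholder, and deployed systems such as the Internet Watch Foundation’s list use perceptual hashing (for example, PhotoDNA), which also matches resized or re-encoded copies, rather than a plain cryptographic digest of the file bytes.

    import hashlib

    # Known-image hash list (placeholder value only; real lists, such as the
    # Internet Watch Foundation's, are distributed to platforms under strict
    # controls and are not public).
    KNOWN_ILLEGAL_HASHES = {
        "3a7bd3e2360a3d29eea436fcfb7e44c7aa1f9d2e4b6c8d0f1a2b3c4d5e6f7a8b",
    }

    def image_hash(image_bytes: bytes) -> str:
        # A cryptographic digest matches only byte-identical files; deployed
        # systems use perceptual hashes that survive resizing and re-encoding.
        return hashlib.sha256(image_bytes).hexdigest()

    def must_block_upload(image_bytes: bytes) -> bool:
        # True if the upload matches a known illegal image and should be
        # blocked and reported rather than published.
        return image_hash(image_bytes) in KNOWN_ILLEGAL_HASHES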

Finally, it is important that the introduction of mandatory reporting does not impact on existing international reporting structures. Many of the largest platforms in scope of the Bill are US-based and are required under US law to report CSEA material detected on their platforms to the National Center for Missing and Exploited Children, which ensures that information relevant to UK law enforcement is referred to it for investigation.

Chris Philp

To answer the shadow Minister’s question about the duty to detect CSEA proactively—because, as she says, we have to detect it before we can report it—I confirm that there are already duties in the Bill to prevent and detect CSEA proactively: CSEA is a priority offence in the schedule 6 list of child sexual exploitation and abuse offences, and there is a duty on companies to prevent those offences proactively. In preventing them proactively, they will by definition identify them, so that part of her question is well covered.

The hon. Lady also asked about the technologies available to those companies, including hash matching—comparing images against a known database of child sexual exploitation images. A lot of technology is being developed that can proactively spot child sexual exploitation in new images that are not on the hash matching database. For example, some technology combines age identification with nude image identification; by putting them together, we can identify sexual exploitation of children in images that are new and are not yet in the database.

To ensure that such new technology can be used, we have the duties under clause 103, which gives Ofcom the power to mandate—to require—the use of certain accredited technologies in fighting not just CSEA, but terrorism. I am sure that we will discuss that more when we come to that clause. Combined, the requirement to proactively prevent CSEA and the ability to specify technology under clause 103 will mean that companies will know about the content that they now, under clause 59, have to report to the National Crime Agency. Interestingly, the hon. Member for Worsley and Eccles South mentioned that that duty already exists in the USA, so it is good that we are matching that requirement in our law via clause 59, which I hope that the Committee will agree should stand part of the Bill.

Question put and agreed to.

Clause 59 accordingly ordered to stand part of the Bill.

Clause 60

Regulations about reports to the NCA

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to consider clause 61 stand part.

--- Later in debate ---
Chris Philp

The hon. Member for Worsley and Eccles South asks about the prioritisation of reports made to the NCA under the new statutory provisions. The prioritisation of investigations is an operational matter for the NCA, acting as a law enforcement body, and I do not think it would be right either for me as a Minister or for Parliament as a legislative body to specify how the NCA should conduct its operational activities. I imagine that it would pursue the most serious cases as a matter of priority, and that if there were evidence of any systemic abuse it would prioritise that too, but it really is a matter for the NCA, as an operationally independent law enforcement agency, to decide for itself. I think it is fairly clear that the scope of the matters to be contained in these regulations is fairly comprehensive, as one would expect.

On the questions raised by the hon. Member for Aberdeen North, the Secretary of State might consult Scottish Ministers under clause 63(6)(c), particularly those with responsibility for law enforcement in Scotland, and the same would apply to the other devolved jurisdictions. On whether an amendment is required to cover the procedures in Scotland equivalent to the matter covered in clause 61, we do not believe that any equivalent change is required to devolved Administration law. However, to be absolutely sure, we will give the hon. Lady written confirmation on that point.

Barbara Keeley

I am not sure that the Minister has answered my question on clause 60. I think we all agree that law enforcement agencies can decide their own priorities, quite rightly, but clause 60(2)(d) sets out that the Secretary of State’s regulations must include

“provision about cases of particular urgency”.

I asked the Minister what that would look like.

Also, we think it is pretty important that the National Crime Agency, the Internet Watch Foundation and Ofcom work together on mandatory reporting. I asked him how he envisaged them working together to share information, because the better they do that, the more children are protected.

Chris Philp

I apologise for missing those two points. On working together, the hon. Lady is right that agencies such as the Internet Watch Foundation and others should co-operate closely. There is already very good joint working between the Internet Watch Foundation, law enforcement and others—they seem to be well networked and to co-operate closely. It is appropriate to put on the record that Parliament, through this Committee, believes that co-operation should continue. That communication, and the sharing of information on particular images, is obviously critical.

As the clause states, the regulations can set out expedited timeframes in cases of particular urgency. I understand that to mean cases where there might be an immediate risk to a child’s safety, or where somebody might be at risk in real time, as opposed to something historic—for example, an image that might have been made some time ago. In cases where it is believed abuse is happening at the present time, there is an expectation that the matter will be dealt with immediately or very close to immediately. I hope that answers the shadow Minister’s questions.

Question put and agreed to.

Clause 60 accordingly ordered to stand part of the Bill.

Clause 61 ordered to stand part of the Bill.

Clause 62

Offence in relation to CSEA reporting

Chris Philp

I beg to move amendment 1, in clause 62, page 55, line 14, leave out “maximum summary term for either-way offences” and insert “general limit in a magistrates’ court”.

Amendments 1 to 5 relate to the maximum term of imprisonment on summary conviction for an either-way offence in England and Wales. Amendments 1 to 4 insert a reference to the general limit in a magistrates’ court, meaning the limit in section 224(1) of the Sentencing Code, which is currently 12 months.

The Chair

With this it will be convenient to consider Government amendments 4, 2, 3 and 5.

Chris Philp

These amendments make some technical drafting changes to the Bill in relation to sentencing penalties for either-way offences in the courts of England and Wales. They bring the Bill into line with recent changes implemented following the passage of the Judicial Review and Courts Act 2022. The change uses the new term

“general limit in a magistrates’ court”

to account for any future changes to the sentencing limit in the magistrates’ court. The 2022 Act includes a secondary power to switch, by regulations, between a 12-month and a six-month maximum sentence in the magistrates’ court, so we need to use the more general language in this Bill to ensure that changes back and forth can be accommodated. If we simply fixed a number, it would fall out of sync whenever a switch was made under the 2022 Act.

Amendment 1 agreed to.

Question proposed, That the clause, as amended, stand part of the Bill.

The Chair

With this it will be convenient to consider clause 63 stand part.

--- Later in debate ---
Chris Philp

Clause 62 creates an offence, as we discussed earlier, of knowingly or recklessly providing inaccurate information to the NCA in relation to CSEA reporting, the penalty for which is imprisonment, a fine or both. Where a company seeks to evade its responsibility, or disregards the importance of the requirement to report CSEA by providing inaccurate information, it will be liable for prosecution. We are backing the requirement to report CSEA with significant criminal powers.

Clause 63 provides definitions for the terms used in chapter 2 of part 4, in relation to the requirement to report CSEA. In summary, a UK provider of a regulated service is defined as a provider that is

“incorporated or formed under the law of any part of the United Kingdom”

or where it is

“individuals who are habitually resident in the United Kingdom”.

The shadow Minister asked about the test and what counts, and I hope that provides the answer. We are defining CSEA content as content containing CSEA of which a company becomes aware. A company can become aware of such content by any means, including through automated systems and processes, human moderation or user reporting.

With regard to the definition of UK-linked CSEA, which the shadow Minister also asked about, that refers to content that may have been published and shared in the UK, or where the nationality or location of a suspected offender or victim is in the UK. The definition of what counts as a UK link is quite wide, because it includes not only the location of the offender or victim but where the content is shared. That is a wide definition.

Kirsty Blackman

I have a specific question—the Minister answered a similar question from me earlier. The Bill says that the location of the child “is” in the UK. Would it be reasonable to expect that if a company suspected the child “was” in the UK, although not currently, that would be in scope as something required to be reported? I know that is technical, but if the “was” is included in the “is” then that is much wider and more helpful than just including the current location.

Chris Philp

If the child had been in the UK when the offence was committed, that would ordinarily be subject to UK criminal law, because the crime would have been committed in the UK. The test is: where was the child or victim at the time the offence was committed? As I said a moment ago, however, the definition of “UK-linked” is particularly wide and includes

“the place where the content was published, generated, uploaded or shared.”

The word “generated”—I am reading from clause 63(6)(a), at the top of page 56—is clearly in the past tense and would include the circumstance that the hon. Lady described.

Barbara Keeley

What the Minister has said is helpful, but the question I asked was about what guidance and support will be made available to regulated services. We all want this to work, because it is one of the most important aspects of the Bill—many aspects are important. He made it clear to us that the definition is quite wide, for both the general definitions and the “UK-linked” content. The point of the question was, given the possible difficulties in some circumstances, what guidance and support will be made available?

Chris Philp

I anticipate that the National Crime Agency will issue best practice guidance. A fair amount of information about the requirements will also be set out in the regulations that the Secretary of State will issue under clause 60, which we have already debated. So it is a combination of those regulations and National Crime Agency best practice guidance. I hope that answers the question.

Finally, on companies being taken over, if a company ceases to be UK-linked, we would expect it to continue to discharge its reporting duties, which might include reporting not just in the UK but to its domestic reporting agency—we have already heard the US agency described and referenced.

I hope that my answers demonstrate that the clause is intended to be comprehensive and effective. It should ensure that the National Crime Agency gets all the information it needs to investigate and prosecute CSEA in order to keep our children safe.

Question put and agreed to.

Clause 62, as amended, accordingly ordered to stand part of the Bill.

Clause 63 ordered to stand part of the Bill.

Clause 64

Transparency reports about certain Part 3 services

Barbara Keeley

I beg to move amendment 54, in clause 64, page 56, line 29, leave out “Once” and insert “Twice”.

This amendment would change the requirement for transparency report notices from once a year to twice a year.

--- Later in debate ---
The third additional transparency disclosure is to show how companies make decisions about service design. Preventing harm to the public would be impossible unless both the regulator and civil society know what is happening inside these large tech companies. We know that if something cannot be detected, it clearly cannot be reported. Knowing how companies make decisions will allow for greater scrutiny of the information they disclose. Without it, there is a risk that Ofcom receives skewed figures and an incomplete picture. Amendment 55 would be a step in the right direction towards making the online environment more transparent, fair and safe for those working to tackle harms, and I hope the Minister will consider its merits.
Chris Philp

To start with, it is worth saying that clause 64 is extremely important. In debating earlier clauses, Opposition Members rightly and repeatedly emphasised how important it is that social media platforms are compelled to publish information; the testimony that Frances Haugen gave to the Joint Committee and to this Committee a few weeks ago demonstrates why. Social media platforms are secretive and not open: they seek to disguise what is going on, even though what they do has a global effect. The transparency power in clause 64 is therefore a critical part of the Bill, and it will dramatically open up what is happening inside these companies to parliamentarians, the wider public, civil society campaigners and academics.

Amendment 54 seeks to increase the frequency of transparency reporting from once a year to twice a year. To be honest, we do not want reporting to be unreasonably frequent, and our sense is that once a year, rather than twice a year, is the right frequency, so we do not support the amendment. However, Members will notice that clause 64(12) gives the Secretary of State the power, by regulation, to

“amend subsection (1) so as to change the frequency of the transparency reporting process.”

If it turns out in due course that once a year is not enough, that power allows the regulations to be used so that reporting occurs more frequently—for example, twice a year. The frequency is not set in stone.

I turn to amendment 55, which sets out a number of topics that would be included in reporting. It is important to say that, as a quick glance at schedule 8 shows, the remit of the reports is already extremely wide. Hon. Members will see that paragraph 5 specifies that reports can cover

“systems and processes for users to report content which they consider to be illegal”

or “harmful”, and so on. Paragraph 6 mentions:

“The systems and processes that a provider operates to deal with illegal content, content that is harmful to children”,

and so on. The topics that amendment 55 speaks to are therefore already covered by the schedule, and I would expect them to be reported on. We have given Ofcom the explicit powers to do that and, rather than prescribing such details in the Bill, we should let Ofcom do its job. It certainly has the powers—that is clearly set out in the schedule—and I would expect, as obviously the Opposition would, that it will use them. On that basis, I gently resist amendments 54 and 55.

Barbara Keeley

On amendment 55, I want to come back to the Minister on two points made by the hon. Member for Aberdeen North about languages. I think most people would be shocked to discover how few languages safety systems operate in, which means that people speaking a language other than English will not be protected. I also think that people will be shocked about, as I outlined, the employment of moderators and how badly they are paid and trained. There are factories full of people doing that important task.

I recommend that the Minister thinks again about requiring Ofcom to provide details on the human moderators who are employed or engaged, and on how they are trained and supported. It is a bit like when we find out about factories producing various items under appalling conditions in other parts of the world—we need transparency on these issues to make people act, because the platforms will not act on their own. Under questioning from my hon. Friend the Member for Pontypridd, Richard Earley admitted that he had no idea how many human moderators were working for Facebook. That is appalling, and we must do something about it.

Chris Philp

I obviously have sympathy with the objectives, but the topics covered in schedule 8, which include the systems and processes for responding to illegal and harmful content and so on, give Ofcom the power to do what the hon. Member requires. On the language point, the risk assessments that companies are required to carry out are hard-edged duties in the Bill, and they will have to include an assessment of the languages used in the UK, which is a large number of languages—although obviously not languages spoken only outside the UK. So the duty to risk-assess languages already exists. I hope that gives the hon. Member reassurance. She is making a reasonable point, and I would expect Ofcom, in setting out transparency requirements, to address it. I am sure that it will look at our proceedings to hear Parliament’s expectations, and we are giving it those powers, which are clearly set out in schedule 8.

Barbara Keeley

I will just make a final point. The Bill gives Ofcom powers when it already has so much to do, and we keep returning to how much will ride on Ofcom’s decisions. Our amendments would make clear the requirement for transparency reporting on the language issue, and on the employment of human moderators and how they are trained and supported. Ofcom has enough other things to be doing; unless we point these matters out, they risk not being drawn out specifically. As with so many of our amendments, we are simply asking for things to be drawn out so that they happen.

Question put, That the amendment be made.

--- Later in debate ---
Kirsty Blackman

I associate myself with the comments made by the hon. Member for Pontypridd and apologise on behalf of my hon. Friend the Member for Ochil and South Perthshire, who is currently in the Chamber dealing with the Channel 4 privatisation. I am sure that, given his position on the Joint Committee, he would have liked to comment on the clause and would have welcomed its inclusion in the Bill, but, unfortunately, he cannot currently do so.

Chris Philp

It is a great shame that the hon. Member for Ochil and South Perthshire is occupied in the main Chamber, because I could have pointed to this change as one of the examples of the Government listening to the Joint Committee, on which he and many others served. However, I hope that the hon. Member for Aberdeen North will communicate my observation to him, which I am sure he will appreciate.

In seriousness, this is an example of the Government moving the Bill on in response to widespread parliamentary and public commentary. It is right that we extend the duties to cover commercial pornographic content as well as the user-to-user pornography covered previously. I thank the Opposition parties for their support for the inclusion of those measures.

Dean Russell

As a member of the Joint Committee, on which I worked with the hon. Member for Ochil and South Perthshire, I thank the Minister for including this clause, which addresses a point that was debated at length by the Joint Committee. Its inclusion is crucial to organisations in my constituency such as Dignify, a charity that works to raise awareness and campaign on this important issue, to protect children and wider society. As this is one of the 66 recommendations that the Minister took forward in the Bill, I would like to thank him; it is very welcome, and I think that it will make a huge difference to children and to society.

Chris Philp

I thank my hon. Friend for his intervention and for his work on the Joint Committee, which has had a huge impact, as we have seen. I hope that colleagues will join me in thanking the members of the Joint Committee for their work.

My final point on this important clause is in response to a question that the shadow Minister raised about clause 66(3), which makes reference to

“a person acting on behalf of the provider”.

That is simply to ensure that the clause is comprehensively drafted, without any loopholes. If the provider used an agent or engaged some third party to disseminate content on its behalf, rather than doing so directly, that would be covered too. We wanted to ensure that there was no chink of light in the way the clause was drafted; that is why the reference is there.

I am delighted that these clauses seem to command such widespread support. It therefore gives me great pleasure to commend them to the Committee.

Question put and agreed to.

Clause 66 accordingly ordered to stand part of the Bill.

Clause 67 ordered to stand part of the Bill.

Schedule 9 agreed to.

Clause 68

Duties about regulated provider pornographic content

Alex Davies-Jones

I beg to move amendment 114, in clause 68, page 60, line 13, at end insert—

“(2A) A duty to verify that every individual featured in regulated provider pornographic content is an adult before the content is published on the service.

(2B) A duty to verify that every individual featured in regulated provider pornographic content that is already published on the service when this Act is passed is an adult and, where that is not the case, remove such content from the service.

(2C) A duty to verify that each individual appearing in regulated provider pornographic content has given their permission for the content in which they appear to be published or made available by the internet service.

(2D) A duty to remove regulated provider pornographic content featuring an individual if that individual withdraws their consent, at any time, to the pornographic content in which they feature remaining on the service.”

This amendment creates a duty to verify that each individual featured in pornographic content is an adult and has agreed to the content being uploaded before it is published. It would also impose a duty to remove content if the individual withdraws consent at any time.

--- Later in debate ---
Alex Davies-Jones

I am grateful to the right hon. Member for her intervention. She knows that I have the utmost respect for all that she has tried to achieve in this area in the House along with my right hon. Friend the Member for Kingston upon Hull North.

We feel that these amendments would address the specific issue of imagery or video content for which consent has not been obtained. Many of these people do not even know that the content has been taken in the first place, and it is then uploaded to these websites. It would be the website’s duty to verify that consent had been obtained and that the people in the video were adults. That is why we urge hon. Members to back the amendments.

Chris Philp

The shadow Minister has laid out compellingly how awful the displaying of images of children on pornography websites and the displaying of images where the consent of the person has not been obtained are. Let me take each of those in turn, because my answers will be a bit different in the two cases.

First, all material that contains the sexual abuse of children or features children at all—any pornographic content featuring children is, by definition, sexual abuse—is already criminalised through the criminal law. Measures such as the Protection of Children Act 1978, the Criminal Justice Act 1988 and the Coroners and Justice Act 2009 provide a range of criminal offences that include the taking, making, circulating, possessing with a view to distributing, or otherwise possessing indecent photos or prohibited images of children. As we would expect, everything that the hon. Lady described is already criminalised under existing law.

This part of the Bill—part 5—covers publishers and not the user-to-user stuff we talked about previously. Because they are producing and publishing the material themselves, publishers of such material are covered by the existing criminal law. What they are doing is already illegal. If they are engaged in that activity, they should—and, I hope, will—be prosecuted for doing it.

The new clause and the amendments essentially seek to duplicate what is already set out very clearly in criminal law. While their intentions are completely correct, I do not think it is helpful to have duplicative provisions that try to do the same thing in different legislation, when we have well-established and effective criminal laws in these areas.

In relation to the separate question of people whose images are displayed without their consent, which is a topic that my right hon. Friend the Member for Basingstoke has raised a few times, there are existing criminal offences that are designed to tackle that, including the recent revenge pornography offences in particular, as well as the criminalisation of voyeurism, harassment, blackmail and coercive or controlling behaviour. There is then the additional question of intimate image abuse, where intimate images are produced or obtained without the consent of the subject, and are then disseminated.

--- Later in debate ---
Chris Philp

I think it would cover some of them. If, for example, someone in a relationship had a video taken that was then made available on a commercial pornography site, that would clearly be in scope. I am not saying that the revenge pornography legislation covers all examples, but it covers some of them. We have discussed already that clause 150 will criminalise a great deal of the content referred to here if the intention of that content or the communication concerned is to cause harm—meaning

“psychological harm amounting to at least serious distress”—

to the subject. That will capture a lot of this as well.

My right hon. Friend the Member for Basingstoke has made a point about needing to remove the intent requirement. Any sharing of an intimate image without consent should be criminalised. As we have discussed previously, that is being moved forward under the auspices of the Ministry of Justice in connection with the Law Commission’s proposed offence. That work is in flight, and I would anticipate it delivering legislative results. I think that is the remaining piece of the puzzle. With the addition of that piece of legislation, I think we will cover the totality of possible harms in relation to images of people whose consent has not been given.

In relation to material featuring children, the legislative picture is complete already; such content is already criminal. We do not need to add any further criminal offences; it is already illegal, as it should be. In relation to non-consensual images, the picture is largely complete, and with the addition of the intimate image abuse offence that my right hon. Friend the Member for Basingstoke has been rightly campaigning for, it will be complete. Given that that is already in process via the Law Commission, while I again agree with what the Opposition are trying to do here, we have a process in hand that will sort this out. I hope that that makes the Government’s position on the amendments and the new clause clear.

Clause 68 is extremely important. It imposes a legally binding duty to make sure that children are not normally able to encounter pornographic content in a commercial context, and it makes it clear that one of the ways that can be achieved is by using age verification. If Ofcom, in its codes of practice, directs companies to use age verification, or if there is no other effective means of preventing children from seeing pornographic content, the clause makes it clear that age verification is expressly authorised by Parliament in primary legislation. There will be no basis upon which a porn provider could try to legally challenge Ofcom, because it is there in black and white in the Bill. It is clearly Parliament’s intention that hard-edged age verification will be legal. By putting that measure in the Bill as an example of the way that the duty can be met, we immunise the measure from legal challenge should Ofcom decide it is the only way of delivering the duty. I make that point explicitly for the avoidance of doubt, so that if this point is ever litigated, Parliament’s intention is clear.

Online Safety Bill (Eleventh sitting)

Committee stage
Thursday 16th June 2022

Public Bill Committees
Alex Davies-Jones (Pontypridd) (Lab)

I beg to move amendment 127, in clause 69, page 60, line 26, after “must” insert—

“within six months of this Act being passed”.

As ever, it is a pleasure to serve under your chairship, Sir Roger. The thoughts and prayers of us all are with my hon. Friend the Member for Batley and Spen and all her friends and family.

Labour welcomes the clause, which sets out Ofcom’s duties to provide guidance to providers of internet services. It is apparent, however, that we cannot afford to kick the can down the road and delay implementation of the Bill any further than necessary. With that in mind, I urge the Minister to support the amendment, which would give Ofcom an appropriate amount of time to produce this important guidance.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

It is a pleasure, once again, to serve under your august chairmanship, Sir Roger. I associate the Government with the remarks that you and the shadow Minister made, marking the anniversary of Jo Cox’s appalling murder, which shook the entire House when it happened. She will never be forgotten.

The Government are sympathetic to the intent of the amendment, which seeks to ensure that guidance for providers on protecting children from online pornography is put in place as quickly as possible. We of course sympathise with that objective, but we feel that the Secretary of State must retain the power to determine when to bring in the provisions of part 5, including the requirement under the clause for Ofcom to produce guidance, to ensure that implementation of the framework comprehensively and effectively regulates all forms of pornography online. That is the intention of the whole House and of this Committee.

Ofcom needs appropriate time and flexibility to get the guidance exactly right. We do not want to rush it and consequently see loopholes that pornography providers or others might seek to exploit. As discussed, we will take a phased approach to bringing the duties under the Bill into effect: we expect the most serious harms to be prioritised as quickly as possible, with the duties on illegal content focused on most urgently. We have already accelerated the timescales for the most serious harms by putting priority illegal content in the various schedules to the Bill.

Ofcom is working hard to prepare for implementation. We are all looking forward to the implementation road map, which it has committed to produce before the summer. For those reasons, I respectfully resist the amendment.

Question put, That the amendment be made.

--- Later in debate ---
Dame Maria Miller (Basingstoke) (Con)

I have just a short comment on these clauses. I very much applaud the Government’s approach to the funding of Ofcom through this mechanism. Clause 75 sets out clearly that the fees payable to Ofcom under section 71 should only be

“sufficient to meet, but…not exceed the annual cost to OFCOM”.

That is important when we start to think about victim support. While clearly Ofcom will have a duty to monitor the efficacy of the mechanisms in place on social media platforms, it is not entirely clear to me from the evidence or conversations with Ofcom whether it will see it as part of its duty to ensure that other areas of victim support are financed through those fees.

It may well be that the Minister thinks it more applicable to look at this issue when we consider the clauses on fines, and I plan to come to it at that point, but it would be helpful to understand whether he sees any role for Ofcom in ensuring that there is third-party specialist support for victims of all sorts of crime, including fraud or sexual abuse.

Chris Philp

Let me start by associating myself with the remarks by the hon. Member for Worsley and Eccles South. We are in complete concurrence with the concept that the polluter should pay. Where there are regulatory costs caused by the behaviour of the social media firms that necessitates the Bill, it is absolutely right that those costs should fall on them and not on the general taxpayer. I absolutely agree with the principles that she outlined.

The hon. Lady raised a question about clause 70(6) and the potential exemption from the obligation to pay fees. That is a broadly drawn power, and the phrasing used is where

“OFCOM consider that an exemption…is appropriate”

and where the Secretary of State agrees. The Bill is not being prescriptive; it is intentionally providing flexibility in case there are circumstances where levying the fees might be inappropriate or, indeed, unjust. It is possible to conceive of an organisation that somehow exceeds the size threshold, but so manifestly does not need regulation that it would be unfair or unjust to levy the fees. For example, if a charity were, by some accident of chance, to fall into scope, it might qualify. But we expect social media firms to pay these bills, and I would not by any means expect the exemption to be applied routinely or regularly.

On the £88 million and the £110 million that have been referenced, the latter amount covers the three-year spending review period: the current financial year, 2022-23, together with 2023-24 and 2024-25. Of that £110 million, £88 million is allocated to Ofcom in the first two financial years; the remainder—some £22 million—is allocated to DCMS for its work over the three-year period of the spending review. The £88 million for Ofcom runs out at the end of 2023-24.

The hon. Lady then asked whether the statutory fees in these clauses will kick in when the £88 million runs out—whether they will be available in time. The answer is yes. We expect and intend that the fees we are debating will become effective in 2024-25, so they will pick up where the £88 million finishes.

Ofcom will set the fees at a level that recoups its costs, so if the Bill becomes larger in scope, for example through amendments in the Commons or the Lords—not that I wish to encourage amendments—and the duties on Ofcom expand, we would expect the fees to be increased commensurately to cover any increased cost that our legislation imposes.

Barbara Keeley

Before the Minister gets past this point—I think he has reached the point of my question—the fees do not kick in for two years. The figure is £88 million, but the point I was making is that the scope of the Bill has already increased. I asked about this during the evidence session with Ofcom. Fraudulent advertising was not included before, so there are already additional powers for Ofcom that need to be funded. I was questioning whether the original estimate will be enough for those two years.

Chris Philp

I assume that the hon. Lady is asking about the £88 million.

Barbara Keeley

indicated assent.

Chris Philp

That covers the preparatory work rather than the actual enforcement work that will follow. For the time being, we believe that it is enough, but of course we always maintain an active dialogue with Ofcom.

Finally, there was a question from my right hon. Friend the Member for Basingstoke, who asked how victims will be supported and compensated. As she said, Ofcom will always pay attention to victims in its work, but we should make it clear that the fees we are debating in these clauses are designed to cover only Ofcom’s costs and not those of third parties. I think the costs of victim support and measures to support victims are funded separately via the Ministry of Justice, which leads in this area. I believe that a victims Bill is being prepared that will significantly enhance the protections and rights that victims have—something that I am sure all of us will support.

Question put and agreed to.

Clause 70 accordingly ordered to stand part of the Bill.

Clauses 71 to 76 ordered to stand part of the Bill.

Clause 77

General duties of OFCOM under section 3 of the Communications Act

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to discuss clauses 78 and 79 stand part.

Alex Davies-Jones

We welcome clause 77, which is an important clause that seeks to amend Ofcom’s existing general duties in the Communications Act 2003. Given the prevalence of illegal harms online, as we discussed earlier in proceedings, it is essential that the Communications Act is amended to reflect the important role that Ofcom will have as a new regulator.

As the Minister knows, and as we will discuss shortly when we reach amendments to clause 80, we have significant concerns about the Government’s approach to size versus harm when categorising service providers. Clause 77(4) amends section 3 of the Communications Act by inserting new subsection (4A). New paragraph (4A)(d) outlines measures that are proportionate to

“the size or capacity of the provider”,

and to

“the level of risk of harm presented by the service in question, and the severity of the potential harm”.

We know that harm, and the potential to access harmful content, is what is most important in the Bill—it says it in the name—so I am keen for my thoughts on the entire categorisation process to be known early on, although I will continue to press this issue with the Minister when we debate the appropriate clause.

Labour also supports clause 78. It is vital that Ofcom will have a duty to publish its proposals on strategic priorities within a set time period, and ensuring that that statement is published is a positive step towards transparency, which has been so crucially missing for far too long.

Similarly, Labour supports clause 79, which contains a duty to carry out impact assessments. That is vital, and it must be conveyed in the all-important Communications Act.

Chris Philp

As the shadow Minister has set out, these clauses ensure that Ofcom’s duties under the Communications Act 2003 are updated to reflect the new duties that we are asking it to undertake—I think that is fairly clear from the clauses. On the shadow Minister’s comment about size and risk, I note her views and look forward to debating that more fully in a moment.

Question put and agreed to.

Clause 77 accordingly ordered to stand part of the Bill.

Clauses 78 and 79 ordered to stand part of the Bill.

Clause 80

Meaning of threshold conditions etc

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to discuss the following:

Amendment 80, in schedule 10, page 192, line, at end insert—

“(c) the assessed risk of harm arising from that part of the service.”

This amendment, together with Amendments 81 and 82, widens Category 1 to include those services which pose a very high risk of harm, regardless of the number of users.

Amendment 81, in schedule 10, page 192, line 39, after “functionality” insert—

“and at least one specified condition about the assessed risk of harm”

This amendment is linked to Amendment 80.

Amendment 82, in schedule 10, page 192, line 41, at end insert—

“(4A) At least one specified condition about the assessed risk of harm must provide for a service assessed as posing a very high risk of harm to its users to meet the Category 1 threshold.”

This amendment is linked to Amendment 80; it widens Category 1 to include those services which pose a very high risk of harm, regardless of the number of users.

That schedule 10 be the Tenth schedule to the Bill.

Clause 81 stand part.

Clause 82 stand part.

--- Later in debate ---
Alex Davies-Jones

I completely agree with my hon. Friend. The evidence we heard from Danny Stone from the Antisemitism Policy Trust clearly outlined the real-world harm that legal but harmful content causes. Such content may be legal, but it causes mass casualties and harm in the real world.

There are ways that we can rectify that in the Bill. Danny Stone set them out in his evidence and the SNP amendments, which the Labour Front Bench supports wholeheartedly, outline them too. I know the Minister wants to go further; he has said as much himself to this Committee and on the Floor of the House. I urge him to support some of the amendments, because it is clear that such changes can save lives.

Schedule 10 outlines the regulations specifying threshold conditions for categories of part 3 services. Put simply, as the Minister knows, Labour has concerns about the Government’s plans to allow thresholds for each category to be set out in secondary legislation. As we have said before, the Bill has already faced significant delays at the hands of the Government and we have real concerns that a reliance on secondary legislation further kicks the can down the road.

We also have concerns that the current system of categorisation is inflexible in so far as we have no understanding of how it will work if a service is required to shift from one category to another, and how long that would take. How exactly will that work in practice? Moreover, how long would Ofcom have to preside over such decisions?

We all know that the online space moves at speed, with new technologies and ways of functioning popping up all the time. Will the Minister clarify how he expects the re-categorisation process to occur in practice? The Minister must accept that his Department has been tone-deaf on this point. Rather than an arbitrary size cut-off, the regulator must use risk levels to determine which category a platform should fall into, so that harmful and dangerous content does not slip through the net.

Labour welcomes clause 81, which sets out Ofcom’s duties in establishing a register of categories of certain part 3 services. As I have repeated throughout the passage of the Bill, having a level of accountability and transparency is central to its success. However, we have slight concerns that the wording in subsection (1), which stipulates that the register be established

“as soon as reasonably practicable”,

could be ambiguous and does not give us the certainty we require. Given the huge amount of responsibility the Bill places on Ofcom, will the Minister confirm exactly what he believes the stipulation means in practice?

Finally, we welcome clause 82. It clarifies that Ofcom has a duty to maintain the all-important register. However, we share the same concerns I previously outlined about the timeframe in which Ofcom will be compelled to make such changes. We urge the Minister to move as quickly as he can, to urge Ofcom to do all it can, and to make these vital changes.

Chris Philp

As we have heard, the clauses set out how different platforms will be categorised with the purpose of ensuring duties are applied in a reasonable and proportionate way that avoids over-burdening smaller businesses. However, it is worth being clear that the Online Safety Bill, as drafted, requires all in-scope services, regardless of their user size, to take action against content that is illegal and where it is necessary to protect children. It is important to re-emphasise the fact that there is no size qualification for the illegal content duties and the duties on the protection of children.

It is also important to stress that under schedule 10 as drafted there is flexibility, as the shadow Minister said, for the Secretary of State to change the various thresholds, including the size threshold, so there is an ability, if it is considered appropriate, to lower the size thresholds in such a way that more companies come into scope, if that is considered necessary.

It is worth saying in passing that we want these processes to happen quickly. Clearly, it is a matter for Ofcom to work through the operations of that, but our intention is that this will work quickly. In that spirit, in order to limit any delays to the process, Ofcom can rely on existing research, if that research is fit for purpose under schedule 10 requirements, rather than having to do new research. That will greatly assist moving quickly, because the existing research is available off the shelf immediately, whereas commissioning new research may take some time. For the benefit of Hansard and people who look at this debate for the application of the Bill, it is important to understand that that is Parliament’s intention.

I will turn to the points raised by the hon. Member for Aberdeen North and the shadow Minister about platforms that may be small and fall below the category 1 size threshold but that are none the less extremely toxic, owing to the way that they are set up, their rules and their user base. The shadow Minister mentioned several such platforms. I have had meetings with the stakeholders that she mentioned, and we heard their evidence. Other Members raised this point on Second Reading, including the right hon. Member for Barking (Dame Margaret Hodge) and my hon. Friend the Member for Brigg and Goole (Andrew Percy). As the hon. Member for Aberdeen North said, I signalled on Second Reading that the Government are listening carefully, and our further work in that area continues at pace.

I am not sure that amendment 80 as drafted would necessarily have the intended effect. Proposed new sub-paragraph (c) to schedule 10(1) would add a risk condition, but the conditions in paragraph (1) are applied with “and”, so they must all be met. My concern is that the size threshold would still apply, and that this specific drafting of the amendment would not have the intended effect.

We will not accept the amendments as drafted, but as I said on Second Reading, we have heard the representations—the shadow Minister and the hon. Member for Aberdeen North have made theirs powerfully and eloquently—and we are looking carefully at those matters. I hope that provides some indication of the Government’s thinking. I thank the stakeholders who engaged and provided extremely valuable insight on those issues. I commend the clause to the Committee.

Kirsty Blackman

I thank the Minister for his comments. I still think that such platforms are too dangerous not to be subject to more stringent legislation than similar-sized platforms. For the Chair’s information, I would like to press amendment 80 to a vote. If it falls, I will move straight to pressing amendment 82 to a vote, missing out amendment 81. Does that make sense, Chair, and is it possible?

--- Later in debate ---
The Chair

I did not indicate at the start of the debate that I would take the clause stand part and clause 84 stand part together, but I am perfectly relaxed about it and very happy to do so, as the hon. Lady has spoken to them. If any other colleague wishes to speak to them, that is fine by me.

Chris Philp

Perhaps I might start with amendment 34, which the shadow Minister just spoke to. We agree that it is very important to consider the risks posed to victims who are outside of the territory of the United Kingdom. However, for the reasons I will elaborate on, we believe that the Bill as drafted achieves that objective already.

First, just to remind the Committee, the Bill already requires companies to put in place proportionate systems and processes to prevent UK users from encountering illegal content. Critically, that includes where a UK user creates illegal content via an in-scope platform, but where the victim is overseas. Let me go further and remind the Committee that clause 9 requires platforms to prevent UK users from encountering illegal content no matter where that content is produced or published. The word “encounter” is very broadly defined in clause 189 as meaning

“read, view, hear or otherwise experience content”.

As such, it will cover a user’s contact with any content that they themselves generate or upload to a service.

Critically, there is another clause, which we have discussed previously, that is very important in the context of overseas victims, which the shadow Minister quite rightly raises. The Committee will recall that subsection (9) of clause 52, which is the important clause that defines illegal content, makes it clear that that content does not have to be generated, uploaded or accessed in the UK, or indeed to have anything to do with the UK, in order to count as illegal content towards which the company has duties, including risk assessment duties. Even if the illegal act—for example, sexually abusing a child—happens in some other country, not the UK, it still counts as illegal content under the definitions in the Bill because of clause 52(9). It is very important that those duties will apply to that circumstance. To be completely clear, if an offender in the UK uses an in-scope platform to produce content where the victim is overseas, or to share abuse produced overseas with other UK users, the platform must tackle that, both through its risk assessment duties and its other duties.

As such, the entirely proper intent behind amendment 34 is already covered by the Bill as drafted. The shadow Minister, the hon. Member for Pontypridd, has already referred to the underlying purpose of clauses 83 and 84. As we discussed before, the risk assessments are central to the duties in the Bill. It is essential that Ofcom has a proper picture of the risks that will inform its various regulatory activities, which is why these clauses are so important. Clause 84 requires Ofcom to produce guidance to services to make sure they are carrying out those risk assessments properly, because it is no good having a token risk assessment or one that does not properly deal with the risks. The guidance published under clause 84 will ensure that happens. As such, I will respectfully resist amendment 34, on the grounds that its contents are already covered by the Bill.

Alex Davies-Jones

I am grateful for the Minister’s clarification. Given his assurances that its contents are already covered by the Bill, I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Clause 83 ordered to stand part of the Bill.

Clause 84 ordered to stand part of the Bill.

Clause 85

Power to require information

--- Later in debate ---
Kirsty Blackman

I want to make a brief comment echoing the shadow Minister’s welcome for the inclusion of senior managers and named people in the Bill. I agree that that level of personal liability and responsibility is the only way that we will be able to hold some of these incredibly large, unwieldy organisations to account. If they could wriggle out of this by saying, “It’s somebody else’s responsibility,” and if everyone then disagreed about whose responsibility it was, we would be in a much worse place, so I also support the inclusion of these clauses and schedule 11.

Chris Philp

I am delighted by the strong support that these clauses have received from across the aisle. I hope that proves to be a habit-forming development.

On the shadow Minister’s point about publishing the risk assessments, to repeat the point I made a few days ago, under clause 64, which we have already debated, Ofcom has the power—indeed, the obligation—to compel publication of transparency reports that will make sure that the relevant information sees the light of day. I accept that publication is important, but we believe that objective is achieved via the transparency measures in clause 64.

On the point about senior management liability, which again we debated near the beginning of the Bill, we believe—I think we all agree—that this is particularly important for information disclosure. We had the example, as I mentioned at the time, of one of the very large companies refusing to disclose information to the Competition and Markets Authority in relation to a competition matter and simply paying a £50 million fine rather than complying with the duties. That is why criminal liability is so important here in relation to information disclosure.

To reassure the shadow Minister on the point about when that liability kicks in: it was in the old version of the Bill, but potentially did not commence for two years. In this new version, updated following our extensive and very responsive listening exercise—I am going to get that in every time—the commencement of this particular liability is automatic and takes place very shortly after Royal Assent. The delay and review have been removed, for the reason the hon. Lady mentioned, so I am pleased to confirm that to the Committee.

The shadow Minister described many of the provisions. Clause 85 gives Ofcom powers to require information, clause 86 gives the power to issue notices and clause 87 the important power to require an entity to name that relevant senior manager, so they cannot wriggle out of their duty by not providing the name. Clause 88 gives the power to require companies to undergo a report from a so-called skilled person. Clause 89 requires full co-operation with Ofcom when it opens an investigation, where co-operation has been sadly lacking in many cases to date. Clause 90 requires people to attend an interview, and the introduction to schedule 11 allows Ofcom to enter premises to inspect or audit the provider. These are very powerful clauses and will mean that social media companies can no longer hide in the shadows from the scrutiny they so richly deserve.

Question put and agreed to.

Clause 85 accordingly ordered to stand part of the Bill.

Clauses 86 to 91 ordered to stand part of the Bill.

Schedule 11

OFCOM’s powers of entry, inspection and audit

Amendment made: 4, in schedule 11, page 202, line 17, leave out

“maximum summary term for either-way offences”

and insert

“general limit in a magistrates’ court”.—(Chris Philp.)

Schedule 11, as amended, agreed to.

Clause 92

Offences in connection with information notices

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Alex Davies-Jones

The Minister will be pleased to hear that we, again, support these clauses. We absolutely support the Bill’s aims to ensure that information offences and penalties are strong enough to dissuade non-compliance. However, as we said repeatedly, we feel that the current provisions are lacking.

As it stands, senior managers can be held criminally liable only for technical offences, such as failing to supply information to the regulator. I am grateful that the Minister has confirmed that the measures will come into force with immediate effect following Royal Assent, rather than waiting two years. That is welcome news. The Government should require that top bosses at social media companies be criminally liable for systemic and repeated failures on online safety, and I am grateful for the Minister’s confirmation on that point.

As these harms are allowed to perpetuate, tech companies cannot continue to get away without penalty. Will the Minister confirm why the Bill does not include further penalties, in the form of criminal offences, should a case of systemic and repeated failures arise? Labour has concerns that, without stronger powers, Ofcom may not feel compelled or equipped to sanction those companies who are treading the fine line of doing just enough to satisfy the requirements outlined in the Bill as it stands.

Labour also welcomes clause 93, which sets out the criminal offences that can be committed by named senior managers in relation to their entity’s information obligations. It establishes that senior managers who are named in a response to an information notice can be held criminally liable for failing to prevent the relevant service provider from committing an information offence. Senior managers can only be prosecuted under the clause where the regulated provider has already been found liable for failing to comply with Ofcom’s information request. As I have already stated, we feel that this power needs to go further if we are truly to tackle online harm. For far too long, those at the very top have known about the harm that exists on their platforms, but they have failed to take action.

Labour supports clause 94 and we have not sought to amend it at this stage. It is vital that provisions are laid in the Bill, such as those in subsection (3), which specify actions that a person may take to commit an offence of this nature. We all want to see the Bill keep people safe online, and at the heart of doing so is demanding a more transparent approach from those in Silicon Valley. My hon. Friend the Member for Worsley and Eccles South made an excellent case for the importance of transparency earlier in the debate but, as the Minister knows, and as I have said time and again, the offences must go further than just applying to simple failures to provide information. We must consider a systemic approach to harm more widely, and that goes far beyond simple information offences.

There is no need to repeat myself. Labour supports the need for clause 95 as it stands and we support clause 96, which is in line with penalties for other information offences that already exist.

Chris Philp

I am delighted to discover that agreement with the Government’s clauses continues to provoke a tsunami of unanimity across the Committee. I sense a gathering momentum behind these clauses.

As the shadow Minister mentioned, the criminal offences here are limited to information provision and disclosure. We have debated the point before. The Government’s feeling is that extending criminal liability beyond the information provision to the other duties would potentially go a little too far and have a chilling effect on the companies concerned.

Also, the fines that can be levied—10% of global revenue—run into billions of pounds, and there are the denial of service provisions, under which a company can essentially be disconnected from the internet in extreme cases; these provide more than adequate enforcement powers for the other duties in the Bill. The information duties are so fundamental that personal criminal liability is needed: without the information, we cannot really make any further assessment of whether the duties are being met.

The shadow Minister has set out what the other clauses do: clause 92 creates offences; clause 93 introduces senior managers’ liability; clause 94 sets out the offences that can be committed in relation to audit notices issued by Ofcom; clause 95 creates offences for intentionally obstructing or delaying a person exercising Ofcom’s power; and clause 96 sets out the penalties for the information offences set out in the Bill, which of course include a term of imprisonment of up to two years. Those are significant criminal offences, which I hope will make sure that executives working for social media firms properly discharge those important duties.

Question put and agreed to.

Clause 92 accordingly ordered to stand part of the Bill.

Clauses 93 to 95 ordered to stand part of the Bill.

Clause 96

Penalties for information offences

Amendment made: 2, in clause 96, page 83, line 15, leave out

“maximum summary term for either-way offences”

and insert

“general limit in a magistrates’ court”.—(Chris Philp.)

Clause 96, as amended, ordered to stand part of the Bill.

Clause 97

Co-operation and disclosure of information: overseas regulators

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to consider clauses 98 to 102 stand part.

--- Later in debate ---
Chris Philp

I am delighted that support for the Government’s position on the clauses continues and that cross-party unanimity is taking an ever stronger hold. I am sure the Whips Office will find that particularly reassuring.

The shadow Minister asked a question about clause 100. Clause 100 amends section 24B of the Communications Act 2003, which allows Ofcom to provide information to the Secretary of State to assist with the formulation of policy. She asked me to clarify what that means, which I am happy to do. In most circumstances, Ofcom will be required to obtain the consent of providers in order to share information relating to their business. This clause sets out two exceptions to that principle. If the information required by the Secretary of State was obtained by Ofcom to determine the proposed fees threshold, or in response to potential threats to national security or to the health or safety of the public, the consent of the business is not required. In those instances, it would obviously not be appropriate to require the provider’s consent.

It is important that users of regulated services are kept informed of developments around online safety and the operation of the regulatory framework.

Kirsty Blackman

This specifically relates to the Secretary of State, but would the Minister expect both Ofcom and his Department to be working with the Scottish Government and the Northern Ireland Executive? I am not necessarily talking about sharing all the information, but where there are concerns that it is very important for those jurisdictions to be aware of, will he try to ensure that he has a productive relationship with both devolved Administrations?

Chris Philp

I thank the hon. Member for her question. Where the matter being raised or disclosed touches on matters of devolved competence—devolved authority—then yes, I would expect that consultation to take place. Matters concerning the health and safety of the public are entirely devolved, I think, so I can confirm that in those circumstances it would be appropriate for the Secretary of State to share information with devolved Administration colleagues.

The shadow Minister has eloquently, as always, touched on the purpose of the various other clauses in this group. I do not wish to try the patience of the Committee, particularly as lunchtime approaches, by repeating what she has ably said already, so I will rest here and simply urge that these clauses stand part of the Bill.

Question put and agreed to.

Clause 97 accordingly ordered to stand part of the Bill.

Clauses 98 to 102 ordered to stand part of the Bill.

Ordered, That further consideration be now adjourned. —(Steve Double.)

Online Safety Bill (Twelfth sitting)

Chris Philp Excerpts
Committee stage
Thursday 16th June 2022

Public Bill Committees
Kirsty Blackman (Aberdeen North) (SNP)

I have a few questions, concerns and suggestions relating to these clauses. I think it was the hon. Member for Don Valley who asked me last week about the reports to the National Crime Agency and how that would work—about how, if a human was not checking those things, there would be an assurance that proper reports were being made, and that scanning was not happening and reports were not being made when images were totally legal and there was no problem with them. [Interruption.] I thought it was the hon. Member for Don Valley, although it may not have been. Apologies—it was a Conservative Member. I am sorry for misnaming the hon. Member.

The hon. Member for Pontypridd made a point about the high level of accuracy of the technologies. That should give everybody a level of reassurance that the reports that are and should be made to the National Crime Agency on child sexual abuse images will be made on a highly accurate basis, rather than a potentially inaccurate one. Actually, some computer technology—particularly for scanning for images, rather than text—is more accurate than human beings. I am pleased to hear those particular statistics.

Queries have been raised on this matter by external organisations—I am particularly thinking about the NSPCC, which we spoke about earlier. The Minister has thankfully given a number of significant reassurances about the ability to proactively scan. External organisations such as the NSPCC are still concerned that there is not enough on the face of the Bill about proactive scanning and ensuring that the current level of proactive scanning is able—or required—to be replicated when the Bill comes into force.

During an exchange in an earlier Committee sitting, the Minister gave a commitment—I am afraid I do not have the quote—to being open to looking at amending clause 103. I am slightly disappointed that there are no Government amendments, but I understand that there has been only a fairly short period; I am far less disappointed than I was previously, when the Minister had much more time to consider the actions he might have been willing to take.

The suggestion I received from the NSPCC is about the gap in the Bill regarding the ability of Ofcom to take action. These clauses allow Ofcom to take action against individual providers about which it has concerns; those providers will have to undertake duties set out by Ofcom. The NSPCC suggests that there could be a risk register, or that a notice could be served on a number of companies at one time, rather than Ofcom simply having to pick one company, or to repeatedly pick single companies and serve notices on them. Clause 83 outlines a register of risk profiles that must be created by Ofcom. It could therefore serve notice on all the companies that fall within a certain risk profile or all the providers that have common functionalities.

If there were a new, emerging concern, that would make sense. Rather than Ofcom having to go through the individual process with all the individual providers when it knows that there is common functionality—because of the risk assessments that have been done and Ofcom’s oversight of the different providers—it could serve notice on all of them in one go. It could not then accidentally miss one out and allow people to move to a different platform that had not been mentioned. I appreciate the conversation we had around this issue earlier, and the opportunity to provide context in relation to the NSPCC’s suggestions, but it would be great if the Minister would be willing to consider them.

I have another question, to which I think the Minister will be able to reply in the affirmative, which is on the uses of the technology as it evolves. We spoke about that in an earlier meeting. The technology that we have may not be what we use in the future to scan for terrorist-related activity or child sexual abuse material. It is important that the Bill adequately covers future conditions. I think that it does, but will the Minister confirm that, as technology advances and changes, these clauses will adequately capture the scanning technologies that are required, and any updates in the way in which platforms work and we interact with each other on the internet?

I have fewer concerns about future-proofing with regard to these provisions, because I genuinely think they cover future conditions, but it would be incredibly helpful and provide me with a bit of reassurance if the Minister could confirm that. I very much look forward to hearing his comments on clause 103.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

Let me start by addressing some questions raised by hon. Members, beginning with the last point made by the hon. Member for Aberdeen North. She sought reconfirmation that the Bill will keep up with future developments in accredited technology that are not currently contemplated. The answer to her question can be found in clause 105(9), in which the definition of accredited technology is clearly set out, as technology that is

“accredited (by OFCOM or another person appointed by OFCOM) as meeting minimum standards of accuracy”.

That is not a one-off determination; it is a determination, or an accreditation, that can happen from time to time, periodically or at any point in the future. As and when new technologies emerge that meet the minimum standards of accuracy, they can be accredited, and the power in clause 103 can be used to compel platforms to use those technologies. I hope that provides the reassurance that the hon. Member was quite rightly asking for.
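
By way of illustration only, since the Bill does not prescribe what the minimum standards of accuracy will be, an accreditation check against such standards might take the shape sketched below. The metrics, threshold figures and names are all invented for the purpose of the example; the actual standards are to be approved by the Secretary of State following advice from Ofcom.

    from dataclasses import dataclass

    @dataclass
    class EvaluationResult:
        true_positives: int
        false_positives: int
        false_negatives: int

        @property
        def precision(self) -> float:
            # Share of flagged items that really were illegal content.
            return self.true_positives / (self.true_positives + self.false_positives)

        @property
        def recall(self) -> float:
            # Share of illegal content in the test set that was flagged.
            return self.true_positives / (self.true_positives + self.false_negatives)

    # Hypothetical thresholds, invented here purely for illustration.
    MIN_PRECISION = 0.99
    MIN_RECALL = 0.90

    def meets_minimum_standards(result: EvaluationResult) -> bool:
        """Accredit a scanning technology only if both standards are met."""
        return result.precision >= MIN_PRECISION and result.recall >= MIN_RECALL

    print(meets_minimum_standards(
        EvaluationResult(true_positives=980, false_positives=5,
                         false_negatives=20)))  # True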

The shadow Minister, the hon. Member for Pontypridd, asked a related question about the process for publishing those minimum standards. The process is set out in clause 105(10), which says that Ofcom will give advice to the Secretary of State on the appropriate minimum standards, and the minimum standards will then be

“approved…by the Secretary of State, following advice from OFCOM.”

We are currently working with Ofcom to finalise the process for setting those standards, which of course will need to take a wide range of factors into account.

Let me turn to the substantive clauses. Clause 103 is extremely important, because as we heard in the evidence sessions and as Members of the Committee have said, scanning messages using technology such as hash matching, to which the shadow Minister referred, is an extremely powerful way of detecting CSEA content and providing information for law enforcement agencies to arrest suspected paedophiles. I think it was in the European Union that Meta—particularly Facebook and Facebook Messenger—stopped using this scanning for a short period of time due to misplaced concerns about privacy laws, and the number of referrals of CSEA images and the number of potential paedophiles who were referred to law enforcement dropped dramatically.
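
For readers unfamiliar with the technique, hash matching compares a fingerprint of each uploaded image against a database of fingerprints of known abuse images. The sketch below assumes a simple exact-digest lookup; the function names and sample data are invented, and deployed, accredited systems (PhotoDNA, for example) use perceptual hashes that survive resizing and re-encoding rather than the cryptographic digest shown here.

    import hashlib

    def sha256_hex(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    # Placeholder database: real known-hash lists are maintained by bodies
    # such as the Internet Watch Foundation and are not public.
    KNOWN_HASHES = {sha256_hex(b"example known image bytes")}

    def matches_known_content(image_bytes: bytes) -> bool:
        """Return True if the image's digest is in the known-hash set.

        An exact digest, as here, is defeated by changing a single byte,
        which is why deployed systems use perceptual hashing instead. A
        platform would route any match into its reporting pipeline.
        """
        return sha256_hex(image_bytes) in KNOWN_HASHES

    print(matches_known_content(b"example known image bytes"))   # True
    print(matches_known_content(b"a different, unknown image"))  # False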

A point that the hon. Member for Aberdeen North and I have discussed previously is that it would be completely unacceptable if a situation arose whereby these messages—I am thinking particularly about Facebook Messenger—did not get scanned for CSEA content in a way that they do get scanned today. When it comes to preventing child sexual exploitation and abuse, in my view there is no scope for compromise or ambiguity. That scanning is happening at the moment; it is protecting children on a very large scale and detecting paedophiles on quite a large scale. In my view, under no circumstances should that scanning be allowed to stop. That is the motivation behind clause 103, which provides Ofcom with the power to make directions to require the use of accredited technology.

As the hon. Member for Aberdeen North signalled in her remarks, given the importance of this issue the Government are of course open to thinking about ways in which the Bill can be strengthened if necessary, because we do not want to leave any loopholes. I urge any social media firms watching our proceedings never to take any steps that degrade or reduce the ability to scan for CSEA content. I thank the hon. Member for sending through the note from the NSPCC, which I have received and will look at internally.

--- Later in debate ---
Kirsty Blackman

I echo the sentiments that have been expressed by the shadow Minister, and thank her and her colleagues for tabling this amendment and giving voice to the numerous organisations that have been in touch with us about this matter. The Scottish National party is more than happy to support the amendment, which would make the Bill stronger and better, and would better enable Ofcom to take action when necessary.

Chris Philp

I understand the spirit behind these amendments, focusing on the word “presence” rather than “prevalence” in various places. It is worth keeping in mind that throughout the Bill we are requiring companies to implement proportionate systems and processes to protect their users from harm. Even in the case of the most harmful illegal content, we are not placing a duty on companies to remove every single piece of illegal content that has ever appeared online, because that would be asking the impossible. We are asking them to take reasonable and proportionate steps to create systems and processes to do so. It is important to frame the legally binding duties in a way that makes them realistically achievable.

As the shadow Minister said, amendments 35, 36, 39 and 40 would replace the word “prevalence” with “presence”. That would change Ofcom’s duty to enforce not just against content that was present in significant numbers—prevalent—but against a single instance, which would be enough to engage the clause.

We mutually understand the intention behind these amendments, but we think the significant powers to compel companies to adopt certain technology contained in clause 103 should be engaged only where there is a reasonable level of risk. For example, if a single piece of content were present on a platform, it may not be reasonable or proportionate to force the company to adopt certain new technologies, where indeed they do not do so at the moment. The use of “prevalence” ensures that the powers are used where necessary.

It is clear—there is no debate—that in the circumstances where scanning technology is currently used, which includes on Facebook Messenger, there is enormous prevalence of material. To elaborate on a point I made in a previous discussion, anything that stops that detection happening would be unacceptable and, in the Government’s view, it would not be reasonable to lose the ability to detect huge numbers of images in the service of implementing encryption, because there is nothing more important than scanning for child sexual exploitation images.

However, we think adopting the amendment and replacing the word “prevalence” with “presence” would create an extremely sensitive trigger that would be engaged on almost every site, even tiny ones or those posing no significant risk, because a single example would be enough to engage the power as the amendment is drafted. Although I understand the spirit of the amendment, it moves away from the concepts of proportionality and reasonableness in the systems and processes that the Bill seeks to deliver.
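
The practical difference between the two trigger words can be made concrete in a few lines. The 0.1% rate below is an invented placeholder, since the Bill leaves the judgment of what counts as prevalent to Ofcom.

    def presence_trigger(detected_items: int) -> bool:
        # Under the amendments: a single detected instance engages the power.
        return detected_items >= 1

    def prevalence_trigger(detected_items: int, items_sampled: int,
                           rate_threshold: float = 0.001) -> bool:
        # Under the Bill as drafted: the power is engaged only where content
        # is present in significant numbers relative to the service.
        return detected_items / items_sampled >= rate_threshold

    print(presence_trigger(1))                # True: one item is enough
    print(prevalence_trigger(1, 10_000_000))  # False: a single item is not prevalent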

Amendment 37 seeks to widen the criteria that Ofcom must consider when deciding to use clause 103 powers. It is important to ensure that Ofcom considers a wide range of factors, taking into account the harm occurring, but clause 104(2)(f) already requires Ofcom to consider

“the level of risk of harm to individuals in the United Kingdom presented by relevant content, and the severity of that harm”.

Therefore, the Bill already contains provision requiring Ofcom to take those matters into account, as it should, but the shadow Minister is right to draw attention to the issue.

Finally, amendment 38 seeks to amend clause 116 to require Ofcom to consider the risk of harm posed by individuals in the United Kingdom, in relation to adults and children in the UK or elsewhere, through the production, publication and dissemination of illegal content. In deciding whether to make a confirmation decision requiring the use of technology, it is important that Ofcom considers a wide range of factors. However, clause 116(6)(e) already proposes to require Ofcom to consider, in particular, the risk and severity of harm to individuals in the UK. That is clearly already in the Bill.

I hope that this analysis provides a basis for the shadow Minister to accept that the Bill, in this area, functions as required. I gently request that she withdraw her amendment.

Alex Davies-Jones

I welcome the Minister’s comments, but if we truly want the Bill to be world-leading, as the Government and the Minister insist it will be, and if it is truly to keep children safe, surely one image of child sexual exploitation and abuse on a platform is one too many. We do not need to consider prevalence over presence. I do not buy that argument. I believe we need to do all we can to make this Bill as strong as possible. I believe the amendments would do that.

Question put, That the amendment be made.

--- Later in debate ---

Division 34

Ayes: 3 (Labour: 2, Scottish National Party: 1)

Noes: 5 (Conservative: 5)

Chris Philp

I beg to move amendment 6, in clause 104, page 89, line 14, after “(2)(f)” insert “, (g)”

This amendment ensures that subsection (3) of this clause (which clarifies what “relevant content” in particular paragraphs of subsection (2) refers to in relation to different kinds of services) applies to the reference to “relevant content” in subsection (2)(g) of this clause.

This technical amendment will ensure that the same definition of “relevant content” used in subsection (2) is used in subsection (3).

Amendment 6 agreed to.

Clause 104, as amended, ordered to stand part of the Bill.

Clauses 105 and 106 ordered to stand part of the Bill.

Clause 107

OFCOM’s guidance about functions under this Chapter

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Kirsty Blackman

I have a quick question for the Minister about the timelines in relation to the guidance and the commitment that Ofcom gave to producing a road map before this coming summer. When is that guidance likely to be produced? Does that road map relate to the guidance in this clause, as well as the guidance in other clauses? If the Minister does not know the answer, I have no problem with receiving an answer at a later time. Does the road map include this guidance as well as other guidance that Ofcom may or may not be publishing at some point in the future?

Chris Philp

I welcome the cross-party support for the provisions set out in these important clauses. Clause 107 sets out the requirement for Ofcom to publish guidance, which is extremely important. Clause 108 makes sure that it publishes an annual report. Clause 109 covers interpretation.

The hon. Member for Aberdeen North asked the only question, about the contents of the Ofcom road map, which in evidence it committed to publishing before the summer. I cannot entirely speak for Ofcom, which is of course an independent body. In order to avoid me giving the Committee misleading information, the best thing is for officials at the Department for Digital, Culture, Media and Sport to liaise with Ofcom and ascertain what the exact contents of the road map will be, and we can report that back to the Committee by letter.

It will be fair to say that the Committee’s feeling—I invite hon. Members to intervene if I have got this wrong—is that the road map should be as comprehensive as possible. Ideally, it would lay out the intended plan to cover all the activities that Ofcom would have to undertake in order to make the Bill operational, and the more detail there is, and the more comprehensive the road map can be, the happier the Committee will be.

Officials will take that away, discuss it with Ofcom and we can revert with fuller information. Given that the timetable was to publish the road map prior to the summer, I hope that we are not going to have to wait very long before we see it. If Ofcom is not preparing it now, it will hopefully hear this discussion and, if necessary, expand the scope of the road map a little bit accordingly.

Question put and agreed to.

Clause 107 accordingly ordered to stand part of the Bill.

Clauses 108 and 109 ordered to stand part of the Bill.

Clause 110

Provisional notice of contravention

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones

I will be brief. Labour welcomes clause 110, which addresses the process of starting enforcement. We support the process, particularly the point that ensures that Ofcom must first issue a “provisional notice of contravention” to an entity before it reaches its final decision.

The clause ultimately ensures that the process for Ofcom issuing a provisional notice of contravention can take place only after a full explanation and deadline have been provided for those involved. Thankfully, this process means that Ofcom can reach a decision only after allowing the recipient a fair opportunity to make relevant representations too. The process must be fair for all involved, and that is why we welcome the provisions outlined in the clause.

Dame Maria Miller (Basingstoke) (Con)

I hope that I am speaking at the right stage of the Bill, and I promise not to intervene at any further stages where this argument could be put forward.

Much of the meat of the Bill is within chapter 6. It establishes what many have called the “polluter pays” principle, where an organisation that contravenes can then be fined—a very important part of the Bill. We are talking about how Ofcom is going to be able to make the provisions that we have set out work in practice. A regulated organisation that fails to stop harm contravenes and will be fined, and fined heavily.

I speak at this point in the debate with slight trepidation, because these issues are also covered in clause 117 and schedule 12, but it is just as relevant to debate the point at this stage. It is difficult to understand where in the Bill the Government set out how the penalties that they can levy as a result of the powers under this clause will be used. Yes, they will be a huge deterrent, and that is good in its own right and important, but surely the real opportunity is to make the person who does the harm pay for righting the wrong that they have created.

That is not a new concept. Indeed, it is one of the objectives that the Government set out in the intentions behind their approach to the draft victims Bill. It is a concept used in the Investigatory Powers Act 2016. It is the concept behind the victim surcharge. So how does this Bill make those who cause harm take greater responsibility for the cost of supporting victims to recover from what they have suffered? That is exactly what the Justice Ministers set out as being so important in their approach to victims. In the Bill, that is not clear to me.

At clause 70, the Minister helpfully set out that there was absolutely no intention for Ofcom to have a role in supporting victims individually. In reply to the point that I made at that stage, he said that the victims Bill would address some of the issues—I am sure that he did not say all the issues, but some of them at least. I do not believe that it will. The victims Bill establishes a code and a duty to provide victim support, but it makes absolutely no reference to how financial penalties on those who cause harm—as set out so clearly in this Bill—will be used to support victims. How will they support victims’ organisations, which do so much to help in particular those who do not end up in court, before a judge, because what they have suffered does not warrant that sort of intervention?

I believe that there is a gap. We heard that in our evidence session, including from Ofcom itself, which identified the need for law enforcement, victim-support organisations and platforms themselves to find what the witnesses described as an effective way for the new “ecosystem” to work. Victim-support organisations went further and argued strongly for the need for victims’ voices to be heard independently. The NSPCC in particular made a very powerful argument for children’s voices needing to be heard and for having independent advocacy. There would be a significant issue with trust levels if we were to rely solely on the platforms themselves to provide such victim support.

There are a couple of other reasons why we need the Government to tease the issue out. We are talking about the most significant culture change imaginable for the online platforms to go through. There will be a lot of good will, I am sure, to achieve that culture change, but there will also be problems along the way. Again referring back to our evidence sessions, the charity Refuge said that reporting systems are “not up to scratch” currently. There is a lot of room for change. We know that Revenge Porn Helpline has seen a continual increase in demand for its services in support of victims, in particular following the pandemic. It also finds revenue and funding a little hand to mouth.

Victim support organisations will have a crucial role in assisting Ofcom with the elements outlined in chapter 6, of which clause 110 is the start, in terms of monitoring the reality for users of how the platforms are performing. The “polluter pays” principle is not working quite as the Government might want it to in the Bill. My solution is for the Minister to consider talking to his colleagues in the Treasury about whether this circle could be squared—whether we could complete the circle—by having some sort of hypothecation of the financial penalties, so that some of the huge amount that will be levied in penalties can be put into a fund that can be used directly to support victims’ organisations. I know that that requires the Department for Digital, Culture, Media and Sport and the Ministry of Justice to work together, but my hon. Friend is incredibly good at collaborative working, and I am sure he will be able to achieve that.

This is not an easy thing. I know that the Treasury would not welcome Committees such as this deciding how financial penalties are to be used, but this is not typical legislation. We are talking about enormous amounts of money and enormous numbers of victims, as the Minister himself has set out when we have tried to debate some of these issues. He could perhaps undertake to raise this issue directly with the Treasury, and perhaps get it to look at how much money is currently going to organisations to support victims of online abuse and online fraud—the list goes on—and to see whether we will have to take a different approach to ensure that the victims we are now recognising get the support he and his ministerial colleagues want to see.

Chris Philp

First, on the substance of the clause, as the shadow Minister said, the process of providing a provisional notice of contravention gives the subject company a fair chance to respond and put its case, before the full enforcement powers are brought down on its head, and that is of course only reasonable, given how strong and severe these powers are. I am glad there is once again agreement between the two parties.

I would like to turn now to the points raised by my right hon. Friend the Member for Basingstoke, who, as ever, has made a very thoughtful contribution to our proceedings. Let me start by answering her question as to what the Bill says about where fines that are levied will go. We can discover the answer to that question in paragraph 8 of schedule 12, which appears at the bottom of page 206 and the top of page 207—in the unlikely event that Members had not memorised that. If they look at that provision, they will see that the Bill as drafted provides that fines that are levied under the powers provided in it and that are paid to Ofcom get paid over to the Consolidated Fund, which is essentially general Treasury resources. That is where the money goes under the Bill as drafted.

My right hon. Friend asks whether some of the funds could be, essentially, hypothecated and diverted directly to pay victims. At the moment, the Government support victims, and pay for services supporting victims, not just via legislation—the victims Bill—but via expenditure that, I think, is managed by the Ministry of Justice and supports victims and organisations working with victims in a number of ways. I believe that the amount earmarked for this financial year is in excess of £300 million, funded via the general spending review. That is the situation as it is today.

I am happy to ask colleagues in Government the question that my right hon. Friend raises. It is really a matter for the Treasury, so I am happy to pass her idea on to it. But I anticipate a couple of responses coming from the Treasury in return. I would anticipate it first saying that allocating money to a particular purpose, including victims, is something that it likes to do via spending reviews, where it can balance all the demands on Government revenue, viewed in the round.

Secondly, it might say that the fine income is very uncertain; we do not know what it will be. One year it could be nothing; the next year it could be billions and billions of pounds. It depends on the behaviour of these social media firms. In fact, if the Bill does its job and they comply with the duties as we want and expect them to, the fines could be zero, because the firms do what they are supposed to. Conversely, if they misbehave, as they have been doing until now, the fines could be enormous. If we rely on hypothecation of these fines as a source for funding victim services, it might be that, in a particular year, we discover that there is no income, because no fines have been levied.

--- Later in debate ---
Alex Davies-Jones

I agree 100%. The testimony of Frances Haugen, the Facebook whistleblower, highlighted the fact that expert researchers and academics will need to examine the data and look at what is happening behind social media platforms if we are to ensure that the Bill is truly fit for purpose and world leading. That process should be carried out as quickly as possible, and Ofcom must also be encouraged to publish guidance on how access to data will work.

Ultimately, the amendments make a simple point: civil society and researchers should be able to access data, so why will the Minister not let them? The Bill should empower independently verified researchers and civil society to request tech companies’ data. Ofcom should be required to publish guidance as soon as possible—within months, not years—on how data may be accessed. That safety check would hold companies to account and make the internet a safer and less divisive space for everyone.

The process would not be hard or commercially ruinous, as the platforms claim. The EU has already implemented it through its Digital Services Act, which opens up the secrets of tech companies’ data to Governments, academia and civil society in order to protect internet users. If we do not have that data, researchers based in the EU will be ahead of those in the UK. Without more insight to enable policymaking, quality research and harm analysis, regulatory intervention in the UK will stagnate. What is more, without such data, we will not know Instagram’s true impact on teen mental health, the reality of violence against women and girls online or the risks to our national security.

We propose amending the Bill to accelerate data sharing provisions while mandating Ofcom to produce guidance on how civil society and researchers can access data, not just on whether they should. As I said, that should happen within months, not years. The provisions should be followed by a code of practice, as outlined in the amendment, to ensure that platforms do not duck and dive in their adherence to transparency requirements. A code of practice would help to standardise data sharing in a way that serves platforms and researchers.

The changes would mean that tech companies could no longer hide in the shadows. As Frances Haugen said of the platforms in her evidence a few weeks ago:

“The idea that they have worked in close co-operation with researchers is a farce. The only way that they are going to give us even the most basic data that we need to keep ourselves safe is if it is mandated in the Bill. We need to not wait two years after the Bill passes”.––[Official Report, Online Safety Public Bill Committee, 26 May 2022; c. 188, Q320.]

Chris Philp

I understand the shadow Minister’s point. We all heard from Frances Haugen about the social media firms’ well-documented reluctance—to put it politely—to open themselves up to external scrutiny. Making that happen is a shared objective. We have already discussed several times the transparency obligations enshrined in clause 64. Those will have a huge impact in ensuring that the social media firms open up a lot more and become more transparent. That will not be an option; they will be compelled to do that. Ofcom is obliged under clause 64 to publish the guidance around those transparency reports. That is all set in train already, and it will be extremely welcome.

Researchers’ access to information is covered in clause 136, which the amendments seek to amend. As the shadow Minister said, our approach is first to get Ofcom to prepare a report into how that can best be done. There are some non-trivial considerations to do with personal privacy and protecting people’s personal information, and there are questions about who counts as a valid researcher. In casual conversation it might appear obvious who is or is not a valid researcher, but we will need to come up with a proper definition of “valid researcher” and of the confidentiality obligations that may apply to them.

Barbara Keeley (Worsley and Eccles South) (Lab)

This is all sorted in the health environment because of the personal data involved—there is no data more personal than health data—and a trusted and safe environment has been created for researchers to access personal data.

Chris Philp

This data is a little different—the two domains do not directly correspond. In the health area, there has been litigation—an artificial intelligence company is currently engaged in litigation with an NHS hospital trust about a purported breach of patient data rules—so even in that long-established area, there is uncertainty and recent, or perhaps even current, litigation.

We are asking for the report to be done to ensure that those important issues are properly thought through. Once they are, Ofcom has the power under clause 136 to lay down guidance on providing access for independent researchers to do their work.

Kirsty Blackman

The Minister has committed to Ofcom being fully resourced to do what it needs to do under the Bill, but he has spoken about time constraints. If Ofcom were to receive 25,000 risk assessments, for example, there simply would not be enough people to go through them. Does he agree that, in cases in which Ofcom is struggling to manage the volume of data and to do the level of assessment required, it may be helpful to augment that work with the use of independent researchers? I am not asking him to commit to that, but to consider the benefits.

Chris Philp

Yes, I would agree that bona fide independent academic researchers do have something to offer and to add in this area. The more highly intelligent, experienced and creative people we have looking at a particular problem or issue, the more likely we are to get a good and well-informed result. They may have perspectives that Ofcom does not. I agree that, in principle, independent researchers can add a great deal, but we need to ensure that we get that set up in a thoughtful and proper way. I understand the desire to get it done quickly, but it is important to take the time to do it not just quickly, but right. It is an area that does not exist already—at the moment, there is no concept of independent researchers getting access to the innards of social media companies’ data vaults—so we need to make sure that it is done in the right way, which is why it is structured as it is. I ask the Committee to stick with the drafting, whereby there will be a report and then Ofcom will have the power. The process may be slightly slower, but we should end up in the same place—indeed, a better place, for the consideration and thought that will have been given.

Alex Davies-Jones

I appreciate where the Minister is coming from. It seems that he wants to back the amendment, so I am struggling to see why he will not, especially given that the DSA—the EU’s new legislation—is already doing this. We know that the current wording in the Bill is far too woolly. If providers can get away with it, they will, which is why we need to compel them, so that we are able to access this data. We need to put that on the face of the Bill. I wish that we did not have to do so, but we all wish that we did not have to have this legislation in the first place. Unless we put it in the Bill, however, the social media platforms will carry on regardless, and the internet will not be a safe place for children and adults in the UK. That is why I will push amendment 53 to a vote.

Question put, That the amendment be made.

--- Later in debate ---
Barbara Keeley

As my hon. Friend the Member for Pontypridd has pointed out, there is little or no transparency about one of the most critical ways in which platforms tackle harms. Human moderators are on the frontline of protecting children and adults from harmful content. They must be well resourced, trained and supported in order to fulfil that function, or the success of the Bill’s aims will be severely undermined.

I find it shocking that platforms offer so little data on human moderation, either because they refuse to publish it or because they do not know it. For example, in evidence to the Home Affairs Committee, William McCants from YouTube could not give precise statistics for its moderator team after being given six days’ notice to find the figure, because many moderators were employed or operated under third-party auspices. For YouTube’s global counter-terrorism lead to be unaware of the detail of how the platform is protecting its users from illegal content is shocking, but it is not uncommon.

In evidence to this Committee, Meta’s Richard Earley was asked how many of Meta’s 40,000 human moderators were outsourced to remove illegal content and disinformation from the platform. My hon. Friend the Member for Pontypridd said:

“You do not have the figures, so you cannot tell me.”

Richard Earley replied:

“I haven’t, no, but I will be happy to let you know afterwards in our written submission.”

Today, Meta submitted its written evidence to the Committee. It included no reference to human content moderators, despite its promise.

The account that my hon. Friend gave just now shows why new clause 11 is so necessary. Meta’s representative told this Committee in evidence:

“Everyone who is involved in reviewing content at Meta goes through an extremely lengthy training process that lasts multiple weeks, covering not just our community standards in total but also the specific area they are focusing on, such as violence and incitement.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 45, Q76.]

But now we know from whistleblowers such as Daniel, whose case my hon. Friend described, that that is untrue. What is happening to Daniel and the other human moderators is deeply concerning. There are powerful examples of the devastating emotional impact that can occur because human moderators are not monitored, trained and supported.

There are risks of platforms shirking responsibility when they outsource moderation to third parties. Stakeholders have raised concerns that a regulated company could argue that an element of its service is not in the scope of the regulator because it is part of a supply chain. We will return to that issue when we debate new clause 13, which seeks to ensure enforcement of liability for supply chain failures that amount to a breach of one of the specified duties.

Platforms, in particular those supporting user-to-user generated content, employ those services from third parties. Yesterday, I met Danny Stone, the chief executive of the Antisemitism Policy Trust, who described the problem of antisemitic GIFs. Twitter would say, “We don’t supply GIFs. The responsibility is with GIPHY.” GIPHY, as part of the supply chain, would say, “We are not a user-to-user platform.” If someone searched Google for antisemitic GIFs, the results would contain multiple entries saying, “Antisemitic GIFs—get the best GIFs on GIPHY. Explore and share the best antisemitic GIFs.”

One can well imagine a scenario in which a company captured by the regulatory regime established by the Bill argues that an element of its service is not within the ambit of the regulator because it is part of a supply chain presented by, but not necessarily the responsibility of, the regulated service. The contracted element, which I have just described by reference to Twitter and GIPHY, supported by an entirely separate company, would argue that it was providing a business-to-business service that is not user-generated content but content designed and delivered at arm’s length and provided to the user-to-user service to deploy for its users.

I suggest that dealing with this issue would involve a lengthy, costly and unhelpful legal process, during which systems would not be effectively regulated and complex contract law would be invoked—the same may apply in relation to moderators and what my hon. Friend the Member for Pontypridd described; there are a number of lawsuits involved in Daniel’s case.

We recognise in UK legislation that there are concerns and issues surrounding supply chains. Under the Bribery Act 2010, for example, a company is liable if anyone performing services for it or on its behalf is found culpable of specific actions. These issues of supply chain liability must be resolved if the Bill is to fulfil its aim of protecting adults and children from harm.

Chris Philp

May I first say a brief word about clause stand part, Sir Roger?

The Chair

Yes.

--- Later in debate ---
Chris Philp

Thank you. Clause 111 sets out and defines the “enforceable requirements” in this chapter—the duties that Ofcom is able to enforce against. Those are set out clearly in the table at subsection (2) and the requirements listed in subsection (3).

The amendment speaks to a different topic. It seeks to impose or police standards for people employed as subcontractors of the various companies that are in scope of the Bill, for example people that Facebook contracts; the shadow Minister, the hon. Member for Pontypridd, gave the example of the gentleman from Kenya she met yesterday. I understand the point she makes and I accept that there are people in those supply chains who are not well treated, who suffer PTSD and who have to do extraordinarily difficult tasks. I do not dispute at all the problems she has referenced. However, the Government do not feel that the Bill is the right place to address those issues, for a couple of reasons.

First, in relation to people who are employed in the UK, we have existing UK employment and health and safety laws. We do not want to duplicate or cut across those. I realise that they relate only to people employed in the UK, but if we passed the amendment as drafted, it would apply to people in the UK as much as it would apply to people in Kenya.

Secondly, the amendment would effectively require Ofcom to start paying regard to employment conditions in Kenya, among other places—indeed, potentially any country in the world—and it is fair to say that that sits substantially outside Ofcom’s area of expertise as a telecoms and communications regulator. That is the second reason why the amendment is problematic.

The third reason is more one of principle. The purpose of the Bill is to keep users safe online. While I understand the reasonable premise for the amendment, it seeks essentially to regulate working conditions in potentially any country in the world. I am just not sure that it is appropriate for an online safety Bill to seek to regulate global working conditions. Facebook, a US company, was referenced, but only 10% of its activity—very roughly speaking—is in the UK. The shadow Minister gave the example of Kenyan subcontractors. Compelling though her case was, I am not sure it is appropriate that UK legislation on online safety should seek to regulate the Kenyan subcontractor of a United States company.

The Government of Kenya can set their own employment regulations and President Biden’s Government can impose obligations on American companies. For us, via a UK online safety Bill, to seek to regulate working conditions in Kenya goes a long way beyond the bounds of what we are trying to do, particularly when we take into account that Ofcom is a telecommunications and communications regulator. To expect it to regulate working conditions anywhere in the world is asking quite a lot.

I accept that a real issue is being raised. There is definitely a problem, and the shadow Minister and the hon. Member for Aberdeen North are right to raise it, but for the three principal reasons that I set out, I suggest that the Bill is not the place to address these important issues.

Alex Davies-Jones

The Minister mentions workers in the UK. I am a proud member of the Labour party and a proud trade unionist; we have strong protections for workers in the UK. There is a reason why Facebook and some of these other platforms, which are incredibly exploitative, will not have human moderators in the UK looking at this content: they know they would be compelled to treat them a hell of a lot better than they treat the workers they exploit around the world, in Kenya, Dublin and the US.

To me, the amendment speaks to the heart of the Bill. This is an online safety Bill that aims to keep the most vulnerable users safe online. People around the world are looking at content that is created here in the UK and having to moderate it; we are effectively shipping our trash to other countries and other people to deal with it. That is not acceptable. We have the opportunity here to keep everybody safe from looking at this incredibly harmful content. We have a duty to protect those who are looking at content created in the UK in order to keep us safe. We cannot let those people down. The amendment and new clause 11 give us the opportunity to do that. We want to make the Bill world leading. We want the UK to stand up for those people. I urge the Minister to do the right thing and back the amendment.

Barbara Keeley

The Minister has not commented on the problem I raised of the contracted firm in the supply chain not being covered by the regulations under the Bill—the problem of Twitter and the GIFs, whereby the GIFs exist and are used on Twitter, but Twitter says, “We’re not responsible for them; it’s that firm over there.” That is the same thing, and new clause 11 would cover both.

Chris Philp

I am answering slightly off the cuff, but I think the point the hon. Lady is raising—about potentially offensive or illegal content that is produced on one service and then propagated or made available by another—is one we debated a few days ago. I think the hon. Member for Aberdeen North raised that question, last week or possibly the week before. I cannot immediately turn to the relevant clause—it will be in Hansard, in our early discussions of the beginning of the Bill—but I think the Bill makes it clear that where content is accessed through another platform, which is the example that the hon. Member for Worsley and Eccles South just gave, the platform through which the content is made available is within the scope of the Bill.

Question put, That the amendment be made.

--- Later in debate ---
Alex Davies-Jones

We support clause 112, which gives Ofcom the power to issue a confirmation decision if, having followed the required process—for example, in clause 110—its final decision is that a regulated service has breached an enforceable requirement. As we know, this will set out Ofcom’s final decision and explain whether Ofcom requires the recipient of the notice to take any specific steps and/or pay a financial penalty. Labour believes that this level of scrutiny and accountability is vital to an Online Safety Bill that is truly fit for purpose, and we support clause 112 in its entirety.

We also support the principles of clause 113, which outlines the steps that a person may be required to take either to come into compliance or to remedy the breach that has been committed. Subsection (5) in particular is vital, as it outlines how Ofcom can require immediate action when the breach has involved an information duty. We hope this will be a positive step forward in ensuring true accountability of big tech companies, so we are happy to support the clause unamended.

It is right and proper that Ofcom has powers when a regulated provider has failed to carry out an illegal content or children’s risk assessment properly or at all, and when it has identified a risk of serious harm that the regulated provider is not effectively mitigating or managing. As we have repeatedly heard, risk assessments are the very backbone of the Bill, so it is right and proper that Ofcom is able to force a company to take measures to comply in the event of previously failing to act.

Children’s access assessments, which are covered by clause 115, are a crucial component of the Bill. Where Ofcom finds that a regulated provider has failed to properly carry out an assessment, it is vital that it has the power and legislative standing to force the company to do more. We also appreciate the inclusion of a three-month timeframe, which would ensure that, in the event of a provider re-doing the assessment, it would at least be completed within a specific—and small—timeframe.

While we recognise that the use of proactive technologies may raise some issues, Labour ultimately feels that clause 116 is balanced and fair, as it establishes that Ofcom may require the use of proactive technology only on content that is communicated publicly. It is fair that content in the public domain is subject to those important safety checks. It is also right that under subsection (7), Ofcom may set a requirement forcing services to review the kind of technology being used. That is a welcome step that will ensure that platforms face a level of scrutiny that has certainly been missing so far.

Labour welcomes and is pleased to support clause 117, which allows Ofcom to impose financial penalties in its confirmation decision. That is something that Labour has long called for, as we believe that financial penalties of this nature will go some way towards improving best practice in the online space and deterring bad actors more widely.

Chris Philp

The shadow Minister has set out the provisions in the clauses, and I am grateful for her support. In essence, clauses 112 to 117 set out the processes around confirmation decisions and make provisions to ensure that those are effective and can be operated in a reasonable and fair way. The clauses speak largely for themselves, so I am not sure that I have anything substantive to add.

Question put and agreed to.

Clause 112 accordingly ordered to stand part of the Bill.

Clauses 113 to 117 ordered to stand part of the Bill.

Ordered, That further consideration be now adjourned. —(Dean Russell.)

Online Safety Bill (Thirteenth sitting)

Chris Philp Excerpts
Committee stage & Committee Debate - 13th sitting
Tuesday 21st June 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 21 June 2022
Alex Davies-Jones (Pontypridd) (Lab)

Bore da, Ms Rees. It is, as ever, a pleasure to serve under your chairship. I rise to speak to clauses 118 to 121 and Government amendments 154 to 157.

As we all know, clause 118 is important and allows Ofcom to impose a financial penalty on a person who fails to complete steps that have been required by Ofcom in a confirmation decision. This is absolutely vital if we are to guarantee that regulated platforms take seriously their responsibilities in keeping us all safe online. We support the use of fines. They are key to overall behavioural change, particularly in the context of personal liability. We welcome clause 118, which outlines the steps Ofcom can take in what we hope will become a powerful deterrent.

Labour also welcomes clause 119. It is vital that Ofcom has these important powers to impose a financial penalty on a person who fails to comply with a notice that requires technology to be implemented to identify and deal with content relating to terrorism and child sexual exploitation and abuse on their service. These are priority harms and the more that can be done to protect us on these two points the better.

Government amendments 155 and 157 ensure that Ofcom has the power to impose a monetary penalty on a provider of a service who fails to pay a fee that it is required to pay under new schedule 2. We see these amendments as crucial in giving Ofcom the important powers it needs to be an effective regulator, which is something we all require. We have some specific observations around new schedule 2, but I will save those until we consider that schedule. For now, we support these amendments and I look forward to outlining our thoughts shortly.

We support clause 120, which allows Ofcom to give a penalty notice to a provider of a regulated service who does not pay the fee due to Ofcom in full. This is a vital provision that also ensures that Ofcom’s process to impose a penalty can progress only when it has given due notice to the provider and the provider has had a fair opportunity to make representations to Ofcom. This is a fair approach and is central to the Bill, which is why we have not sought to amend the clause.

Finally, we support clause 121, which ensures that Ofcom must state the reasons why it is imposing a penalty, the amount of the penalty, any aggravating or mitigating factors and when the penalty must be paid. It is imperative that Ofcom publishes that information when issuing a notice. We support this important clause and have not sought to amend it.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

It is a pleasure to serve under your chairmanship once again, Ms Rees, and I congratulate Committee members on evading this morning’s strike action.

I am delighted that the shadow Minister supports the intent behind these clauses, and I will not speak at great length given the unanimity on this topic. As she said, clause 118 allows Ofcom to impose a financial penalty for failure to take specified steps by a deadline set by Ofcom. The maximum penalty that can be imposed is the greater of £18 million or 10% of qualifying worldwide revenue. In the case of large companies, it is likely to be a much larger amount than £18 million.

Clause 119 enables Ofcom to impose financial penalties if the recipient of a section 103 notice does not comply by the deadline. It is very important to ensure that section 103 has proper teeth. Government amendments 154 to 157 make changes that allow Ofcom to recover not only the cost of running the service once the Bill comes into force, and into the future, but also the preparatory cost of setting up for the Bill to come into force.

As previously discussed, £88 million of funding is being provided to Ofcom in this financial year and next. We believe that something like £20 million of costs that predate these financial years have been funded as well. That adds up to around £108 million. However, the amount that Ofcom recovers will be the actual cost incurred. The figure I provided is simply an indicative estimate. The actual figure would be based on the real costs, which Ofcom would be able to recoup under these measures. That means that the taxpayer—our constituents—will not bear any of the costs, including the set-up and preparatory cost. This is an equitable and fair change to the Bill.

Clause 120 sets out that some regulated providers will be required to pay a regulatory fee to Ofcom, as set out in clause 71. Clause 120 allows Ofcom to impose a financial penalty if a regulated provider does not pay its fee by the deadline it sets. Finally, clause 121 sets out the information that needs to be included in these penalty notices issued by Ofcom.

Kirsty Blackman (Aberdeen North) (SNP)

I have questions about the management of the fees and the recovery of the preparatory cost. Does the Minister expect that the initial fees will be higher as a result of having to recoup the preparatory cost and will then reduce? How quickly will the preparatory cost be recovered? Will Ofcom recover it quickly or over a longer period of time?

Chris Philp

The Bill provides a power for Ofcom to recover those costs. It does not specify over what time period. I do not think they will be recouped over a period of years. Ofcom can simply recoup the costs in a single hit. I would imagine that Ofcom would seek to recover these costs pretty quickly after receiving these powers. The £108 million is an estimate. The actual figure may be different once the reconciliation and accounting is done. It sounds like a lot of money, but it is spread among a number of very large social media firms. It is not a large amount of money for them in the context of their income, so I would expect that recouping to be done on an expeditious basis—not spread over a number of years. That is my expectation.

Question put and agreed to.

Clause 118 accordingly ordered to stand part of the Bill.

Clause 119 ordered to stand part of the Bill.

Clause 120

Non-payment of fee

Amendments made: 154, in clause 120, page 102, line 20, after “71” insert:

“or Schedule (Recovery of OFCOM’s initial costs)”.

This amendment, and Amendments 155 to 157, ensure that Ofcom have the power to impose a monetary penalty on a provider of a service who fails to pay a fee that they are required to pay under NS2.

Amendment 155, in clause 120, page 102, line 21, leave out “that section” and insert “Part 6”.

Amendment 156, in clause 120, page 102, line 26, after “71” insert—

“or Schedule (Recovery of OFCOM’s initial costs)”

Amendment 157, in clause 120, page 103, line 12, at end insert—

“or Schedule (Recovery of OFCOM’s initial costs)”.—(Chris Philp.)

Clause 120, as amended, ordered to stand part of the Bill.

Clause 121 ordered to stand part of the Bill.

Clause 122

Amount of penalties etc

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to discuss:

Government amendment 158.

That schedule 12 be the Twelfth schedule to the Bill.

Alex Davies-Jones

Labour supports clause 122 and schedule 12, which set out in detail the financial penalties that Ofcom may impose, including the maximum penalty that can be imposed. Labour has long supported financial penalties for those failing to comply with the duties in the Bill. We firmly believe that tough action is needed on online safety, but we feel the sanctions should go further and that there should be criminal liability for offences beyond just information-related failures. We welcome clause 122 and schedule 12. It is vital that Ofcom is also required to produce guidelines around how it will determine penalty amounts. Consistency across the board is vital, so we feel this is a positive step forward and have not sought to amend the clause.

Paragraph 8 of schedule 12 requires monetary penalties to be paid into the Consolidated Fund. There is no change to that requirement, but it now appears in new clause 43, together with the requirement to pay fees charged under new schedule 2 into the Consolidated Fund. We therefore support the amendments.

Chris Philp

I have nothing further to add on these amendments. The shadow Minister has covered them, so I will not detain the Committee further.

Question put and agreed to.

Clause 122 accordingly ordered to stand part of the Bill.

Schedule 12

Penalties imposed by OFCOM under Chapter 6 of Part 7

Amendment made: 158, in schedule 12, page 206, line 43, leave out paragraph 8.—(Chris Philp.)

Paragraph 8 of Schedule 12 requires monetary penalties to be paid into the Consolidated Fund. There is no change to that requirement, but it now appears in NC43 together with the requirement to pay fees charged under NS2 into the Consolidated Fund.

Schedule 12, as amended, agreed to.

Clause 123

Service restriction orders

Alex Davies-Jones

I beg to move amendment 50, in clause 123, page 106, line 36, at end insert—

“(9A) OFCOM may apply to the court for service restriction orders against multiple regulated services with one application, through the use of a schedule of relevant services which includes all the information required by subsection (5).”

This amendment would give Ofcom the ability to take action against a schedule of non-compliant sites, while still preserving the right of those sites to oppose the application for, and/or appeal through the courts against any, orders to block access or support services.

--- Later in debate ---
The Chair

If no other Members wish to speak to amendments 50 and 51 and clauses 123 to 127, I will call the Minister to respond.

Chris Philp

Let me start with amendments 50 and 51, which were introduced by the shadow Minister and supported by the SNP spokesperson. The Government recognise the valid intent behind the amendments, namely to make sure that applications can be streamlined and done quickly, and that Ofcom can make bulk applications if large numbers of service providers violate the new duties to the extent that interim service restriction orders or access restriction orders become necessary.

We want a streamlined process, and we want Ofcom to deal efficiently with it, including, if necessary, by making bulk applications to the court. Thankfully, however, procedures under the existing civil procedure rules already allow so-called multi-party claims to be made. Those claims permit any number of claimants, any number of defendants or respondents and any number of claims to be covered in a single form. The overriding objective of the CPR is that cases are dealt with justly and proportionately. Under the existing civil procedure rules, Ofcom can already make bulk applications to deal with very large numbers of non-compliant websites and service providers in one go. We completely agree with the intent behind the amendments, but their content is already covered by the CPR.

It is worth saying that the business disruption measures—the access restriction orders and the service restriction orders—are intended to be a last resort. They effectively amount to unplugging the websites from the internet so that people in the United Kingdom cannot access them and so that supporting services, such as payment services, do not support them. The measures are quite drastic, although necessary and important, because we do not want companies and social media firms ignoring our legislation. It is important that we have strong measures, but they are last resorts. We would expect Ofcom to use them only when it has taken reasonable steps to enforce compliance using other means.

If a provider outside the UK ignores letters and fines, these measures are the only option available. As the shadow Minister, the hon. Member for Pontypridd, mentioned, some pornography providers probably have no intention of even attempting to comply with our regulations; they are probably not based in the UK, they are never going to pay the fine and they are probably incorporated in some obscure, offshore jurisdiction. Ofcom will need to use these powers in such circumstances, possibly on a bulk scale—I am interested in her comment that that is what the German authorities had to do—but the powers already exist in the CPR.

It is also worth saying that in its application to the courts, Ofcom must set out the information required in clauses 123(5) and 125(3), so evidence that backs up the claim can be submitted, but that does not stop Ofcom doing this on a bulk basis and hitting multiple different companies in one go. Because the matter is already covered in the CPR, I ask the shadow Minister to withdraw the amendment.

Alex Davies-Jones

I am interested to know whether the Minister has anything to add about the other clauses. I am happy to give way to him.

Chris Philp

I thank the shadow Minister for giving way. I do not have too much to say on the other clauses, because she has introduced them, but in my enthusiasm for explaining the civil procedure rules I neglected to respond to her question about the interim orders in clauses 124 and 126.

The hon. Lady asked what criteria have to be met for these interim orders to be made. The conditions for clause 124 are set out in subsections (3) and (4) of that clause, which states, first, that it has to be

“likely that the…service is failing to comply with an enforceable requirement”—

so it is likely that there has been a breach—and, secondly, that

“the level of risk of harm to individuals in the United Kingdom…and the nature and severity of that harm, are such that it would not be appropriate to wait to establish the failure before applying for the order.”

Similar language in clause 124(4) applies to breaches of section 103.

Essentially, if it is likely that there has been a breach, and if the resulting harm is urgent and severe—for example, if children are at risk—we would expect these interim orders to be used as emergency measures to prevent very severe harm. I hope that answers the shadow Minister’s question. She is very kind, as is the Chair, to allow such a long intervention.

The Chair

In a Bill Committee, a Member can speak more than once. However, your intervention resolved the situation amicably, Minister.

--- Later in debate ---
Alex Davies-Jones

The Minister and his Back Benchers will, I am sure, be tired of our calls for more transparency, but I will be kind to him and confirm that Labour welcomes the provisions in clause 128.

We believe that it is vital that, once Ofcom has followed the process outlined in clause 110 and issued a confirmation decision setting out its final decision, that decision is made public. We particularly welcome provisions to ensure that when a confirmation decision is issued, Ofcom will be obliged to publish the identity of the person to whom the decision was sent, details of the failure to which the decision relates, and details relating to Ofcom’s response.

Indeed, the transparency goes further, as Ofcom will be obliged to publish details of when a penalty notice has been issued in many more areas: when a person fails to comply with a confirmation decision; when a person fails to comply with a notice to deal with terrorism content or child sexual exploitation and abuse content, or both; and when there has been a failure to pay a fee in full. That is welcome indeed. Labour just wishes that the Minister had committed to the same level of transparency on the duties in the Bill to keep us safe in the first place. That said, transparency on enforcement is a positive step forward, so we have not sought to amend the clause at this stage.

Chris Philp

I am grateful for the shadow Minister’s support. I have nothing substantive to add, other than to point to the transparency reporting obligation in clause 64, which we have debated.

Question put and agreed to.

Clause 128 accordingly ordered to stand part of the Bill.

Clause 129

OFCOM’s guidance about enforcement action

Chris Philp

I beg to move amendment 7, in clause 129, page 114, line 3, at end insert—

“(aa) the Information Commissioner, and”.

This amendment ensures that before Ofcom produce guidance about their exercise of their enforcement powers, they must consult the Information Commissioner.

If I may, in the interest of speed and convenience, I will speak to clause stand part as well.

The clause requires Ofcom to issue guidance setting out how it will use its enforcement powers in the round. That guidance will ensure that the enforcement process is transparent, it will cover the general principles and processes of the enforcement regime, and it is intended to help regulated providers and other stakeholders to understand how Ofcom will exercise its powers.

--- Later in debate ---
Dan Carden (Liverpool, Walton) (Lab)

Clause 129(4) states that the Secretary of State will be consulted in the process. What would be the Secretary of State’s powers in relation to that? Would she be able to overrule Ofcom in the writing of its guidance?

Chris Philp

The hon. Member asks for my assistance in interpreting legislative language. Generally speaking, “consult” means what it suggests. Ofcom will consult the Secretary of State, as it will consult the ICO, to ascertain the Secretary of State’s opinion, but Ofcom is not bound by that opinion. Unlike the power in a previous clause—I believe it was clause 40—where the Secretary of State could issue a direct instruction to Ofcom on certain matters, here we are talking simply about consulting. When the Secretary of State expresses an opinion in response to the consultation, it is just that—an opinion. I would not expect it to be binding on Ofcom, but I would expect Ofcom to pay proper attention to the views of important stakeholders, which in this case include both the Secretary of State and the ICO. I hope that gives the hon. Member the clarification he was seeking.

Alex Davies-Jones

As we know, clause 129 requires Ofcom to publish guidance about how it will use its enforcement powers. It is right that regulated providers and other stakeholders have a full understanding of how, and in what circumstances, Ofcom will have the legislative power to exercise this suite of enforcement powers. We also welcome Government amendment 7, which will ensure that the Information Commissioner—a key and, importantly, independent authority—is included in the consultation before guidance is produced.

As we have just heard, however, the clause sets out that Secretary of State must be consulted before Ofcom produces guidance, including revised or replacement guidance, about how it will use its enforcement powers. We feel that that involves the Secretary of State far too closely in the enforcement of the regime. The Government should be several steps away from being involved, and the clause seriously undermines Ofcom’s independence—the importance of which we have been keen to stress as the Bill progresses, and on which Conservative Back Benchers have shared our view—so we cannot support the clause.

Chris Philp

I repeat the point I made to the hon. Member for Liverpool, Walton a moment ago. This is simply an obligation to consult. The clause gives the Secretary of State an opportunity to offer an opinion, but it is just that—an opinion. It is not binding on Ofcom, which may take that opinion into account or not at its discretion. This provision sits alongside the requirement to consult the Information Commissioner’s Office. I respectfully disagree with the suggestion that it represents unwarranted and inappropriate interference in the operation of a regulator. Consultation between organs of state is appropriate and sensible, but in this case it does not fetter Ofcom’s ability to act at its own discretion. I respectfully do not agree with the shadow Minister’s analysis.

Kirsty Blackman

Apologies, Ms Rees, for coming in a bit late on this, but I was not aware of the intention to vote against the clause. I want to make clear what the Scottish National party intends to do, and the logic behind it. The inclusion of Government amendment 7 is sensible, and I am glad that the Minister has tabled it. Clause 129 is incredibly important, and the requirement to publish guidance will ensure that there is a level of transparency, which we and the Labour Front Benchers have been asking for.

The Minister has been clear about the requirement for Ofcom to consult the Secretary of State, rather than to be directed by them. As a whole, this Bill gives the Secretary of State far too much power, and far too much ability to intervene in the workings of Ofcom. In this case, however, I do not have an issue with the Secretary of State being consulted, so I intend to support the inclusion of this clause, as amended by Government amendment 7.



Question put, That the amendment be made.

--- Later in debate ---
Dan Carden

When I spoke at the very beginning of the Committee’s proceedings, I said that the legislation was necessary, that it was a starting point and that it would no doubt change and develop over time. However, considering all the rhetoric we have heard from the Secretary of State and other Ministers, I have been surprised at how little the Bill actually deals with the general societal harm that comes from the internet. This is perhaps the only place in the Bill where it is covered.

I am thinking of the echo chambers that are created around disinformation and the algorithms that companies use. I really want to hear from the Minister where he sees this developing and why this part of the Bill is so weak and wishy-washy. While I welcome the fact that much of the Bill seeks to deal with the criminality of individuals and the harm and abuse that can be carried out over the internet, overall it misses a great opportunity to deal with the harmful impact the internet can have on society.

Chris Philp

Let me start by speaking on the issue of disinformation more widely, which clearly is the target of the two amendments and the topic of clause 130. First, it is worth reminding the Committee that non-legislatively—operationally—the Government are taking action on the disinformation problem via the counter-disinformation unit of the Department for Digital, Culture, Media and Sport, which we have discussed previously.

The unit has been established to monitor social media firms and sites for disinformation and then to take action and work with social media firms to take it down. For the first couple of years of its operation, it understandably focused on disinformation connected to covid. In the last two or three months, it has focused on disinformation relating to the Russia-Ukraine conflict—in particular, propaganda being spread by the Russian Government, which, disgracefully, has included denying responsibility for various atrocities, including those committed at Bucha. In fact, in cases in which the counter-disinformation unit has not received an appropriate response from social media firms, those issues have been escalated to me, and I have raised them directly with those firms, including Twitter, which has tolerated all kinds of disinformation from overt Russian state outlets and channels, including Russian embassy Twitter accounts, which are of particular concern to me.

--- Later in debate ---
Kim Leadbeater (Batley and Spen) (Lab)

It is fantastic to hear that those other things are happening—that is all well and good—but surely we should explicitly call out disinformation and misinformation in the Online Safety Bill. The package of other measures that the Minister mentions is fantastic, but I think they have to be in the Bill.

Chris Philp

The hon. Lady says that those measures should be in the Bill—more than they already are—but as I have pointed out, the way in which the legal architecture of the Bill works means that the mechanisms to do that would be adding a criminal offence to schedule 7 as a priority offence, for example, or using a statutory instrument to designate the relevant kind of harm as a priority harm, which we plan to do in due course for a number of harms. The Bill can cover disinformation with the use of those mechanisms.

We have not put the “harmful to adults” content in the Bill; it will be set out in statutory instruments. The National Security Bill is still progressing through Parliament, and we cannot have in schedule 7 of this Bill an offence that has not yet been passed by Parliament. I hope that that explains the legal architecture and the mechanisms that could be used under the Bill to give force to those matters.

On amendment 57, the Government feel that six months is a very short time within which to reach clear conclusions, and that 18 months is a more appropriate timeframe in which to understand how the Bill is bedding in and operating. Amendment 58 would require Ofcom to produce a code of practice on system-level disinformation. To be clear, the Bill already requires Ofcom to produce codes of practice that set out the steps that providers will take to tackle illegal content—I mentioned the new National Security Bill, which is going through Parliament—and harmful content, which may, in some circumstances, include disinformation.

Disinformation that is illegal or harmful to individuals is in scope of the duties set out in the Bill. Ofcom’s codes of practice will, as part of those duties, have to set out the steps that providers should take to reduce harm to users that arises from such disinformation. Those steps could include content-neutral design choices or interventions of other kinds. We would like Ofcom to have a certain amount of flexibility in how it develops those codes of practice, including by being able to combine or disaggregate those codes in ways that are most helpful to the general public and the services that have to pay regard to them. That is why we have constructed them in the way we have. I hope that provides clarity about the way that disinformation can be brought into the scope of the Bill and how that measure then flows through to the codes of practice. I gently resist amendments 57 and 58 while supporting the clause standing part of the Bill.

Question put, That the amendment be made.

--- Later in debate ---
Question proposed, That the clause stand part of the Bill.
Chris Philp

The clause allows Ofcom to confer functions on the content board in relation to content-related functions under the Bill, but does not require it to do so. We take the view that how Ofcom manages its responsibilities internally is a matter for Ofcom. That may change over time. The clause simply provides that Ofcom may, if Ofcom wishes, ask its content board to consider online safety matters alongside its existing responsibilities. I trust that the Committee considers that a reasonable measure.

Alex Davies-Jones

Labour welcomes the clause, which, as the Minister has said, sets out some important clarifications with respect to the Communications Act 2003. We welcome the clarification that the content board will have delegated and advisory responsibilities, and look forward to the Minister’s confirmation of exactly what those are and how this will work in practice. It is important that the content board and the advisory committee on disinformation and misinformation are compelled to communicate, too, so we look forward to an update from the Minister on what provisions in the Bill will ensure that that happens.

Chris Philp

The shadow Minister has asked how this will work in practice, but as I said, the internal operation of Ofcom obviously is a matter for Ofcom. As Members have said in the recent past—indeed, in the last hour—they do not welcome undue Government interference in the operation of Ofcom, so it is right that we leave this as a matter for Ofcom. We are providing Ofcom with the power, but we are not compelling it to use that power. We are respecting Ofcom’s operational independence—a point that shadow Ministers and Opposition Members have made very recently.

Question put and agreed to.

Clause 131 accordingly ordered to stand part of the Bill.

Clause 132

Research about users’ experiences of regulated services

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss clause 133 stand part.

--- Later in debate ---
Dan Carden

I agree with the right hon. Member for Basingstoke that these are important clauses. I want to put them into the context of what we heard from Frances Haugen, who, when she spoke to Congress, said that Facebook consistently chose to maximise its growth rather than implement safeguards on its platforms. She said:

“During my time at Facebook, I came to realise a devastating truth: Almost no one outside of Facebook knows what happens inside Facebook. The company intentionally hides vital information from the public, from the U.S. government, and from governments around the world.”

When we consider users’ experiences, I do not think it is good enough just to look at how the user engages with information. We need far more transparency about how the companies themselves are run. I would like to hear the Minister’s views on how this clause, which looks at users’ experiences, can go further in dealing with the harms at source—with the companies—and in making sure that a light is shone on their practices.

Chris Philp

I welcome the support of the hon. Member for Pontypridd for these clauses. I will turn to the questions raised by my right hon. Friend the Member for Basingstoke. First, she asked whether Ofcom has to publish these reports so that the public, media and Parliament can see what they say. I am pleased to confirm that Ofcom does have to publish the reports; section 15 of the Communications Act 2003 imposes a duty on Ofcom to publish reports of this kind.

Secondly, my right hon. Friend asked about educating the public on issues pertinent to these reports, which is what we would call a media literacy duty. Again, I confirm that, under the Communications Act, Ofcom has a statutory duty to promote media literacy, which would include matters that flow from these reports. In fact, Ofcom published an expanded and updated set of policies in that area at the end of last year, which is why the old clause 103 in the original version of this Bill was removed—Ofcom had already gone further than that clause required.

Thirdly, my right hon. Friend asked about the changes that might happen in response to the findings of these reports. Of course, it is open to Ofcom—indeed, I think this Committee would expect it—to update its codes of practice, which it can do from time to time, in response to the findings of these reports. That is a good example of why it is important for those codes of practice to be written by Ofcom, rather than being set out in primary legislation. It means that when some new fact or circumstance arises or some new bit of research, such as the information required in this clause, comes out, those codes of practice can be changed. I hope that addresses the questions my right hon. Friend asked.

The hon. Member for Liverpool, Walton asked about transparency, referring to Frances Haugen’s testimony to the US Senate and her disclosures to The Wall Street Journal, as well as the evidence she gave this House, both to the Joint Committee and to this Committee just before the Whitsun recess. I have also met her bilaterally to discuss these issues. The hon. Gentleman is quite right to point out that these social media firms—he used Facebook as an example, but there are others—are extremely secretive about what they say in public, to the media and even to representative bodies such as the United States Congress. That is why, as he says, it is extremely important that they are compelled to be a lot more transparent.

The Bill contains a large number of provisions compelling or requiring social media firms to make disclosures to Ofcom as the regulator. However, it is important to have public disclosure as well. It is possible that the hon. Member for Liverpool, Walton was not in his place when we came to the clause in question, but if he turns to clause 64 on page 56, he will see that it includes a requirement for Ofcom to give every provider of a relevant service a notice compelling them to publish a transparency report. I hope he will see that the transparency obligation that he quite rightly refers to—it is necessary—is set out in clause 64(1). I hope that answers the points that Committee members have raised.

Question put and agreed to.

Clause 132 accordingly ordered to stand part of the Bill.

Clause 133 ordered to stand part of the Bill.

Clause 134

OFCOM’s statement about freedom of expression and privacy

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones

As we all know, the clause requires Ofcom to publish annual reports on the steps it has taken, when carrying out online safety functions, to uphold users’ rights under articles 8 and 10 of the convention, as required by section 6 of the Human Rights Act 1998. It will come as no surprise to the Minister that Labour entirely supports this clause.

Upholding users’ rights is a central part of this Bill, and it is a topic we have debated repeatedly in our proceedings. I know that the Minister faces challenges of his own, as the Opposition do, regarding the complicated balance between freedom of speech and safety online. It is only right and proper, therefore, for Ofcom to have a specific duty to publish reports about what steps it is taking to ensure that the online space is fair and equal for all.

That being said, we know that we can and should go further. My hon. Friend the Member for Batley and Spen will shortly address an important new clause tabled in her name—I believe it is new clause 25—so I will do my best not to repeat her comments, but it is important to say that Ofcom must be compelled to publish reports on how its overall regulatory operating function is working. Although Labour welcomes clause 134 and especially its commitment to upholding users’ rights, we believe that when many feel excluded in the existing online space, Ofcom can do more in its annual reporting. For now, however, we support clause 134.

Chris Philp

I welcome the shadow Minister’s continuing support for these clauses. Clause 134 sets out the requirement on Ofcom to publish reports setting out how it has complied with articles 8 and 10 of the European convention on human rights.

I will pause for a second, because my hon. Friend the Member for Don Valley and others have raised concerns about the implications of the Bill for freedom of speech. In response to a question he asked last week, I set out in some detail the reasons why I think the Bill improves the position for free speech online compared with the very unsatisfactory status quo. This clause further strengthens that case, because it requires this report and reminds us that Ofcom must discharge its duties in a manner compatible with articles 8 and 10 of the ECHR.

From memory, article 8 enshrines the right to respect for private and family life, and article 10 enshrines the right to free speech, backed up by quite an extensive body of case law. The clause reminds us that the powers that the Bill confers on Ofcom must be exercised—indeed, can only be exercised—in conformity with the article 10 duties on free speech. I hope that gives my hon. Friend additional assurance about the strength of the free speech protection inherent in the Bill. I apologise for speaking at a little length on a short clause, but I think that was an important point to make.

Question put and agreed to.

Clause 134 accordingly ordered to stand part of the Bill.

Clause 135

OFCOM’s transparency reports

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones

Again, Labour welcomes clause 135, which places a duty on Ofcom to produce its own reports based on information from the transparency reports that providers are required to publish. However, the Minister will know that Labour feels the Bill has much more work to do on transparency more widely, as we have repeatedly outlined through our debates. The Minister rejected our calls for increased transparency when we were addressing, I believe, clause 61. We are not alone in feeling that transparency reports should go further. The sector and his own Back Benchers are calling for it, yet so far his Department has failed to act.

It is a welcome step that Ofcom must produce its own reports based on information from the providers’ transparency reports, but if those reports are to provide a truly accurate depiction of the situation online, they must ultimately be made public. I know the Minister has concerns around security, but of course no one wants to see users put at risk unnecessarily. That is not what we are asking for here. I will refrain from repeating debates we have already had at length, but I wish to again put on the record our concerns around the transparency reporting process as it stands.

That being said, we support clause 135. It is right that Ofcom is compelled to produce its own reports; we just wish they were made public. With the transparency reports coming from the providers, we only wish they would go further.

Chris Philp

I have spoken to these points previously, so I do not want to tax the Committee’s patience by repeating what I have said.

Question put and agreed to.

Clause 135 accordingly ordered to stand part of the Bill.

Clause 136

OFCOM’s report about researchers’ access to information

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones

Again, Labour welcomes clause 136, which is a positive step towards a transparent approach to online safety, given that it requires Ofcom to publish a report about the access that independent researchers have, or could have, to matters relating to the online safety of regulated services. As my hon. Friend the Member for Worsley and Eccles South rightly outlined in an earlier sitting, Labour strongly believes that the transparency measures in the Bill do not go far enough.

Independent researchers already play a vital role in regulating online safety. Indeed, there are far too many to list, but many have supported me, and I am sure the Minister, in our research on the Bill. That is why we have tabled a number of amendments on this point, as we sincerely feel there is more work to be done. I know the Minister says he understands and is taking on board our comments, but thus far we have seen little movement on transparency.

Chris Philp

In this clause we are specifically talking about access to information for researchers. Obviously, the transparency matters were covered in clauses 64 and 135. There is consensus across both parties that access to information for bona fide academic researchers is important. The clause lays out a path to take us in the direction of providing that access by requiring Ofcom to produce a report. We debated the matter earlier. The hon. Member for Worsley and Eccles South—I hope I got the pronunciation right this time—

Chris Philp

The hon. Lady made some points about the matter in an earlier sitting, as the shadow Minister just said. It is an area we are giving some careful thought to, because it is important that it is properly academically researched. Although Ofcom is being well resourced, as we have discussed, with lots of money and the ability to levy fees, we understand that it does not have a monopoly on wisdom—as good a regulator as it is. It may well be that a number of academics could add a great deal to the debate by looking at some of the material held inside social media firms. The Government recognise the importance of the matter, and some thought is being given to these questions, but at least we can agree that clause 136 as drafted sets out a path that leads us in this important direction.

Question put and agreed to.

Clause 136 accordingly ordered to stand part of the Bill.

Clause 137

OFCOM’s reports

Alex Davies-Jones

Briefly, before I hand over to my hon. Friend the Member for Worsley and Eccles South, I should say that Labour welcomes clause 137, which gives Ofcom a discretionary power to publish reports about certain online safety measures and matters. Clearly, it is important to give Ofcom the power to redact or exclude confidential matters where need be, and I hope that a certain level of common sense and public awareness will apply should information of this nature be excluded. As I have previously mentioned—I sound a bit like a broken record—Labour echoes the calls for more transparency, which my hon. Friend the Member for Batley and Spen will come on to in her new clause. However, broadly, we support this important clause.

I would like to press the Minister briefly on how exactly the exclusion of material from Ofcom reports will work in practice. Can he outline any specific contexts or examples, beyond commercial sensitivity and perhaps matters of national security, where he can envision this power being used?

Chris Philp

I welcome the shadow Minister’s support for the clause, once again. The clause provides Ofcom with the power to publish relevant reports about online safety matters to keep users, the public and Parliament well informed. Again, clearly, it is up to Ofcom to decide how it publishes those reports; we will not compel it.

On the question about confidential material that might be withheld, the relevant language in clause 137 looks to me to echo precisely the language we saw previously in clause—where was it? Anyway, we have come across this in a previous clause. When it comes to material that can be excluded from publication, the language is just the same.

I would like to make it clear that, while this decision is obviously a matter for Ofcom, I would expect the exclusion to be used on a pretty rare basis. One would obviously expect acutely commercially sensitive matters to be excluded or redacted. If there were very sensitive intellectual property, where it would prejudice a company’s commercial interests to have all of that intellectual property exposed, I would expect Ofcom to exercise the exclusion or at least redact what it publishes.

However, because transparency is so important—a point that the Committee has made repeatedly—I would expect these exclusions to be used sparingly, only where absolutely necessary to protect matters such as commercial confidentiality or intellectual property, and then only to the minimum extent necessary. I think this Committee, and Parliament, take the view that exclusions from these reports, and from the reports about breaches mentioned in clause 128(4)(b) and (5)(b)—the provision I was reaching for a moment ago—should be used only very carefully and very rarely. The Committee should be clear on that: the bias—the assumption—should be on the side of disclosure rather than withholding information.

Question put and agreed to.

Clause 137 accordingly ordered to stand part of the Bill.

Clause 138

Appeals against OFCOM decisions relating to the register under section 81

Question proposed, That the clause stand part of the Bill.

The Chair

With this, it will be convenient to consider clause 139 stand part.

Barbara Keeley

Good morning, Ms Rees. It is a pleasure to serve on the Committee with you in the Chair. Clause 138 allows companies to make appeals against Ofcom’s decisions regarding the categorisation of services within categories 1, 2A or 2B.

We have argued, many times, that we believe the Government’s size-based approach to categorisation is flawed. Our preference for an approach based on risk is backed up by the views of multiple stakeholders and the Joint Committee. It was encouraging to hear last week of the Minister’s intention to look again at the issues of categorisation, and I hope we will see movement on that on Report.

Clause 138 sets out that where a regulated provider has filed an appeal, they are exempt from carrying out the duties in the Bill that normally apply to services designated as category 1, 2A or 2B. That is concerning, given that there is no timeframe in which the appeals process must be concluded.

While the right to appeal is important, it is feasible that many platforms will raise appeals about their categorisation to delay the start of their duties under the Bill. I understand that the platforms will still have to comply with the duties that apply to all regulated services, but for a service that has been classified by Ofcom as high risk, it is potentially dangerous that none of the risk assessments or measures to address harm will be completed while the appeal is taking place. Does the Minister agree that the appeals process must be concluded as quickly as possible to minimise the risk? Will he consider putting a timeframe on that?

Clause 139 allows for appeals against decisions by Ofcom to issue notices about dealing with terrorism and child sexual abuse material, as well as a confirmation decision or a penalty notice. As I have said, in general the right to appeal is important. However, would an appeals system work if, for example, a company were appealing against a notice under clause 103? In what circumstances does the Minister imagine that a platform would appeal a notice by Ofcom requiring the platform to use accredited technology to identify child sexual abuse content and swiftly take down that content? It is vital that appeals processes are concluded as rapidly as possible, so that we do not risk people being exposed to harmful or dangerous content.

Chris Philp

The shadow Minister has set out the purpose of the clauses, which provide, in clause 138, for appeal rights against decisions relating to registration under clause 81 and, in clause 139, for appeals against Ofcom notices.

I agree that it is important that judicial decisions in this area get made quickly. I note that the appeals are directly to the relevant upper tribunal, which is a higher tier of the tribunal system and tends to be a little less congested than the first-tier tribunal, which often gets used for some first-instance matters. I hope that appeals going to the upper tribunal, directly to that more senior level, provides some comfort.

On putting in a time limit, the general principle is that matters concerning listing are reserved to the judiciary. I recall from my time as a Minister in the Ministry of Justice that the judiciary guards its independence fiercely. Whether it is the Senior President of Tribunals or the Lord Chief Justice, they consider listing matters to be the preserve of the judiciary, not the Executive or the legislature. Compelling the judiciary to hear a case in a certain time might well be considered to infringe on such principles.

We can agree, however—and I hope the people making those listing decisions hear that we believe, and that Parliament believes, this—that it is important to do this quickly, in particular where there is a risk of harm to individuals. Where there is risk to individuals, especially children, but more widely as well, those cases should be heard very expeditiously indeed.

The hon. Member for Worsley and Eccles South also asked about the basis on which appeals might be made and decided. I think that is made fairly clear. For example, clause 139(3) makes it clear that, in deciding an appeal, the upper tribunal will use the same principles as would be applied by the High Court to an application for judicial review—so, standard JR terms—which in the context of notices served or decisions made under clause 103 might include whether the power had been exercised in conformity with statute. If the power were exercised, or purported to be exercised, in a manner not authorised by statute, that would be one ground for appeal; if a decision were considered so grossly unreasonable that no reasonable decision maker could have made it, that might be a ground for appeal as well.

I caution the Committee, however: I am not a lawyer and my interpretation of judicial review principles should not be taken as definitive. Lawyers will advise their clients when they come to apply the clause in practice and they will not take my words in Committee as definitive when it comes to determining “standard judicial review principles”—those are well established in law, regardless of my words just now.

Barbara Keeley

There is a concern that platforms might raise appeals about their categorisation in order to delay the start of their duties under the Bill. How would the Minister act if that happened—if a large number of appeals were pending and the duties under the Bill therefore did not commence?

Chris Philp

Clearly, resourcing of the upper tribunal is a matter decided by the Lord Chancellor and Secretary of State for Justice, in consultation with the Lord Chief Justice and, in this case, the Senior President of Tribunals. Parliament would expect the resourcing of that part of the upper tribunal to be such that cases could be heard in an expedited manner. Particularly where cases concern the safety of the public—and particularly of children—we expect that to be done as quickly as it can be.

Question put and agreed to.

Clause 138 accordingly ordered to stand part of the Bill.

Clause 139 ordered to stand part of the Bill.

Clause 140

Power to make super-complaints

Kirsty Blackman

I beg to move amendment 143, in clause 140, page 121, line 1, after “services” insert “, consumers”.

--- Later in debate ---
Barbara Keeley

The Bill currently specifies that super-complaints can be made to Ofcom by bodies representing users or members of the public. The addition of consumer representatives through the amendments is important. Consumer representatives are a key source of information about harms to users of online services, which are widespread and would be regulated by this legislation. We support the amendments, which would add bodies representing consumers to those eligible to make super-complaints.

Chris Philp

Clearly, we want the super-complaint function to be as effective as possible and for groups of relevant people, users or members of the public to be able to be represented by an eligible entity to raise super-complaints. I believe we are all on the same page in wanting to do that. If I am honest, I am a little confused as to what the addition of the term “consumers” will add. The term “users” is defined quite widely, via clause 140(6), which then refers to clause 181, where, as debated previously, a “user” is defined widely to include anyone using a service, whether registered or not. So if somebody stumbles across a website, they count as a user, but the definition being used in clause 140 about bringing super-complaints also includes “members of the public”—that is, regular citizens. Even if they are not a user of that particular service, they could still be represented in bringing a complaint.

Given that, by definition, “users” and “members of the public” already cover everybody in the United Kingdom, I am not quite sure what the addition of the term “consumers” adds. By definition, consumers are a subset of the group “users” or “members of the public”. It follows that in seeking to become an eligible entity, no eligible entity will purport to act for everybody in the United Kingdom; they will always be seeking to define some kind of subset of people. That might be children, people with a particular vulnerability or, indeed, consumers, who are one such subset of “members of the public” or “users”. I do not honestly understand what the addition of the word “consumers” adds here when everything is covered already.

Kirsty Blackman

Will the Minister explicitly say that he thinks that an eligible entity, acting on behalf of consumers, could, if it fulfils the other criteria, bring a super-complaint?

--- Later in debate ---
Chris Philp

Yes, definitely. That is the idea of an eligible entity, which could seek to represent a particular demographic, such as children or people from a particular marginalised group, or it could represent people who have a particular interest, which would potentially include consumers. So I can confirm that that is the intention behind the drafting of the Bill. Having offered that clarification and made clear that the definition is already as wide as it conceivably can be—we cannot get wider than “members of the public”—I ask the hon. Member for Aberdeen North to consider withdrawing the amendments, particularly as there are so many. It will take a long time to vote on them.

Kirsty Blackman

I thank the Minister for the clarification. Given that he has explicitly said that he expects that groups acting on behalf of consumers could, if they fulfil the other criteria, be considered as eligible entities for making super-complaints, I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Amendment proposed: 66, in clause 140, page 121, line 8, at end insert—

“(d) causing harm to any human or animal.”

This amendment ensures groups are able to make complaints regarding animal abuse videos.—(Alex Davies-Jones.)

--- Later in debate ---
There must be no loopholes in the complaints procedures, including as regards holding individual services and providers to account. Amendment 77 both strengthens and simplifies the super-complaint provisions, and we support it.
Chris Philp

I think the Committee, and the House, are pretty unanimous in agreeing that the power to make super-complaints is important. As we have discussed, there are all kinds of groups—children, under-represented groups, consumers—that would benefit from being represented where systemic issues are not being addressed, perhaps because Ofcom has somehow overlooked or missed them in the discharge of its enforcement powers.

I would observe in passing that one of the bases on which super-complaints can be made—this may be of interest to my hon. Friend the Member for Don Valley—is where there is a material risk under clause 140(1)(b) of

“significantly adversely affecting the right to freedom of expression within the law of users of the services or members of the public”.

That clause is another place in the Bill where freedom of expression is expressly picked out and supported. If freedom of expression is ever threatened in a way that we have not anticipated and that the Bill does not provide for, there is a power here for a free speech group, such as the Free Speech Union, to make a super-complaint. I hope that my hon. Friend finds it reassuring that freedom of expression is expressly laid out there.

Let me now speak to the substance of amendment 77, tabled by the hon. Member for Aberdeen North. It is important to first keep in mind the purpose of super-complaints, which, as I said a moment ago, is to provide a basis for raising issues of widespread and systemic importance. That is the reason for some of the criteria in subsection (1)(a), (b) and (c), and why we have subsection (2)—because we want to ensure that super-complaints are raised only if they are of a very large scale or have a profound impact on freedom of speech or some other matter of particular importance. That is why the tests, hurdles and thresholds set out in clause 140(2) have to be met.

If we were to remove subsection (2), as amendment 77 seeks to, that would significantly lower the threshold. We would end up having super-complaints that were almost individual in nature. We set out previously why we think an ombudsman-type system or having super-complaints used for near-individual matters would not be appropriate. That is why the clause is there, and I think it is reasonable that it is.

The hon. Lady asked a couple of questions about how this arrangement might operate in practice. She asked whether a company such as Facebook would be caught if it alone were doing something inappropriate. The answer is categorically yes, because the condition in clause 140(2)(b)—

“impacts on a particularly large number of users”,

which would be a large percentage of Facebook’s users,

“or members of the public”—

would be met. Facebook and—I would argue—any category 1 company would, by definition, be affecting large numbers of people. The very definition of category 1 includes the concept of reach—the number of people being affected. That means that, axiomatically, clause 140(2)(b) would be met by any category 1 company.

The hon. Lady also raised the question of Facebook, for a period of time in Europe, unilaterally ceasing to scan for child sexual exploitation and abuse images, which, as mentioned, led to huge numbers of child sexual abuse images—and, consequently, huge numbers of paedophiles—going undetected. She asked how these things would be handled under the clause if somebody wanted to raise a super-complaint about that. Hopefully, Ofcom would stop them happening in the first place, but if it did not, the super-complaint redress mechanism would be the right one. These things would categorically be caught by clause 140(2)(a), because they are clearly of particular importance.

In any reasonable interpretation of the words, the test of “particular importance” is manifestly met when it comes to stopping child sexual exploitation and abuse and the detection of those images. That example would categorically qualify under the clause, and a super-complaint could, if necessary, be brought. I hope it would never be necessary, because that is the kind of thing I would expect Ofcom to catch.

Having talked through the examples from the hon. Lady, I hope I have illustrated how the clause will ensure that either large-scale issues affecting large numbers of people or issues that are particularly serious will still qualify for super-complaint status with subsection (2) left in the Bill. Given those assurances, I urge the hon. Member to consider withdrawing her amendment.

Kirsty Blackman

I welcome the Minister’s fairly explicit explanation that he believes that every category 1 company would be in scope, even if there was a complaint against one single provider. I would like to push the amendment to a vote on the basis of the comments I made earlier and the fact that each of these platforms is different. We have heard concerns about, for example, Facebook groups being interested in celebrating eight-year-olds’ birthdays. We have heard about the amount of porn on Twitter, which Facebook does not have in the same way. We have heard about the kind of algorithmic content that takes people down a certain path on TikTok. We have heard all these concerns, but they are all specific to one provider. They are not generic complaints that could be brought against a group of providers.

Chris Philp

Would the hon. Lady not agree that in all those examples—including TikTok and leading people down dark paths—the conditions in subsection (2) would be met? The examples she has just referred to are, I would say, certainly matters of particular importance. Because the platforms she mentions are big in scale, they would also meet the test of scale in paragraph (b). In fact, only one of the tests has to be met—it is one or the other. In all the examples she has just given, not just one test—paragraph (a) or (b)—would be met, but both. So all the issues she has just raised would make a super-complaint eligible to be made.

Kirsty Blackman

I am glad the Minister confirms that he expects that that would be the case. I am clearer now that he has explained it, but on my reading of the clause, the definitions of “particular importance” or

“a particularly large number of users…or members of the public”

are not clear. I wanted to ensure that this was put on the record. While I do welcome the Minister’s clarification, I would like to push amendment 77 to a vote.

Question put, That the amendment be made.

Online Safety Bill (Fourteenth sitting)

Chris Philp Excerpts
Committee stage
Tuesday 21st June 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 21 June 2022
The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

As we have heard, the super-complaint process is extremely important for enabling eligible entities representing the interests of users or members of the public to make representations where there are systemic problems that need to be addressed. I think we all agree that is an important approach.

Clauses 140 to 142 set out the power to make super-complaints, the procedure for making them and the guidance that Ofcom will publish in relation to them. The shadow Minister raised a few questions first, some of which we have touched on previously. In relation to transparency, which we have debated before, as I said previously, there are transparency provisions in clause 64 that I think will achieve the objectives that she set out.

The shadow Minister also touched on some of the questions about individual rather than systemic complaints. Again, we debated those right at the beginning, I think, when we discussed the fact that the approach taken in the Bill is to deal with systems and processes, because the scale involved here is so large. If we tried to create an architecture whereby Ofcom, or some other public body, adjudicated individual complaints, as an ombudsman would, it would simply be overwhelmed. A much better approach is to ensure that the systems and processes are fixed, and that is what the Bill does.

The hon. Member for Aberdeen North had some questions too. She touched in passing on the Secretary of State’s powers to specify by regulation who counts as an eligible entity—this is under clause 140(3). Of course, the nature of those regulations is circumscribed by the very next subsection, subsection (4), in which one of the criteria is that the entity

“must be a body representing the interests of users of regulated services, or members of the public”.

That speaks to the important point about consumers that we touched on this morning. As the hon. Lady said, this will be done by the affirmative procedure, so there is enhanced parliamentary scrutiny. I hope that makes it clear that it would be done in a reasonable way.

Kirsty Blackman

I am sorry to try the Minister’s patience. I think that we are in quite a lot of agreement about what an eligible entity looks like. I appreciate that this is being done by the affirmative procedure, but we seem to be in much less agreement about the next clause, which is being done by the negative procedure. I would like him to explain that contrast.

Chris Philp

Let me move on to clause 141 and amendment 153, which the hon. Lady spoke to a moment ago. Let us first talk about the question of time limits. As she said, the regulations that can be made under the clause include regulations on the time for various steps in the process. Rather than setting those out in the Bill, our intention is that when those regulations are moved they will include those time limits, but we want to consult Ofcom and other appropriate bodies to ensure that the deadlines set are realistic and reasonable. I cannot confirm now what those will be, because we have not yet done the consultation, but I will make a couple of points.

First, the steps set out in clause 141(2)(d)(i), (ii) and (iii), at the top of page 122, are essentially procedural steps about whether a particular complaint is in scope, whether it is admissible and whether the entity is eligible. Those should be relatively straightforward to determine. I do not want to pre-empt the consultation and the regulations, but my expectation is that those are done in a relatively short time. The regulations in clause 141(2)

“may…include provisions about the following matters”—

it then lists all the different things—and the total amount of time the complaint must take to resolve in its totality is not one of them. However, because the word “include” is used, it could include a total time limit. If the regulations were to set a total time limit, one would have to be a little careful, because clearly some matters are more complicated than others. The hon. Member for Aberdeen North acknowledged that we would not want to sacrifice quality and thoroughness for speed. If an overall time limit were set, it would have to accommodate cases that were so complicated or difficult, or that required so much additional information, that they could not be done in a period of, say, 90 days. I put on record that that is something that the consultation should carefully consider. We are proceeding in this way—with a consultation followed by regulations—rather than putting a time limit in the Bill because it is important to get this right.

The question was asked: why regulations rather than Ofcom? This is quite an important area, as the hon. Member for Aberdeen North and the shadow Minister—the hon. Member for Worsley and Eccles South—have said. This element of governmental and parliamentary oversight is important, hence our having regulations, rather than letting Ofcom write its own rules at will. We are talking about an important mechanism, and we want to make sure that it is appropriately responsive.

The question was asked: why will the regulations be subject to the negative, rather than the affirmative, procedure? Clearly that is a point of detail, albeit important detail. Our instinct was that the issue was perhaps of slightly less parliamentary interest than the eligible entity list, which will be keenly watched by many external parties. The negative procedure is obviously a little more streamlined. There is no hard-and-fast rule as to why we are using negative rather than affirmative, but that was broadly the thinking. There will be a consultation, in which Ofcom will certainly be consulted. Clause 141(3) makes it clear that others can be consulted too. That consultation will be crucial in ensuring that we get this right and that the process is as quick as it can be—that is important—but also delivers the right result. I gently resist amendment 153 and commend clauses 140 to 142.

Kirsty Blackman

Some Acts that this Parliament has passed have provided for a time limit within which something must be considered, but the time limit can be extended if the organisation concerned says to the Secretary of State, “Look, this is too complicated. We don’t believe that we can do this.” I think that was the case for the Subsidy Control Act 2022, but I have been on quite a few Bill Committees, so I may be wrong about that. That situation would be the exception, obviously, rather than the rule, and would apply only in the most complicated cases.

Chris Philp

The hon. Lady is suggesting a practical solution: a default limit that can be extended if the case is very complicated. That sort of structure can certainly be consulted on and potentially implemented in regulations. She referred to asking the Secretary of State’s permission. Opposition Members have been making points about the Secretary of State having too much power. Given that we are talking here about the regulator exercising their investigatory power, that kind of extension probably would not be something that we would want the Secretary of State’s permission for; we would find some other way of doing it. Perhaps the chief executive of Ofcom would have to sign it off, or some other body that is independent of Government.

Kirsty Blackman

Sorry, I phrased that quite badly. My point was more about having to justify things—having to say, “Look, we are sorry; we haven’t managed to do this in the time in which we were expected to. This is our justification”—rather than having to get permission. Apologies for phrasing that wrongly. I am glad that the Minister is considering including that point as something that could be suggested in the consultation.

I appreciate what the Minister says, but I still think we should have a time limit in the Bill, so I am keen to push amendment 153 to a vote.

Question put and agreed to.

Clause 140 accordingly ordered to stand part of the Bill.

Clause 141

Procedure for super-complaints

Amendment proposed: 153, in clause 141, page 121, line 32, after “140” insert

“, which must include the requirement that OFCOM must respond to such complaints within 90 days”—(Kirsty Blackman.)

Question put, That the amendment be made.

--- Later in debate ---
Alex Davies-Jones (Pontypridd) (Lab)

As we know, clause 143 introduces a power for the Secretary of State to set out a statement of the Government’s strategic priorities in relation to online safety matters. Given that the power is similar to those that already exist in the Communications Act 2003, we do not formally oppose the clause. We welcome the fact that the Secretary of State must follow a consultation and parliamentary procedure before proceeding. It is vital that transparency surrounds any targets or priorities that the Secretary of State may outline. However, we want to put on record our slight concerns about the frequency limitations on amendments that are outlined in subsections (7) and (8). This is a direct interference regime, and we would appreciate the Minister’s reassurance about how it will work in practice.

We also welcome clause 144, which sets out the consultation and parliamentary procedure requirements that must be satisfied before the Secretary of State can designate a statement of strategic priorities under clause 143. We firmly believe that parliamentary oversight must be at the heart of the Bill, and the Minister’s Back Benchers agree. We have heard compelling statements from the right hon. Member for Basingstoke and other colleagues about just how important parliamentary oversight of the Bill will be, even when it has received Royal Assent. That is why clause 144 is so important: it ensures that the Secretary of State must consult Ofcom when considering the statement of strategic priorities.

Following that, the draft statement must be laid before Parliament for proper scrutiny. As we have said before, this is central to the Bill’s chances of success. Labour firmly believes that it would be unreasonable to expect the Secretary of State to be an expert in every policy area; it is simply not possible. That is why parliamentary scrutiny and transparency are so important. It is not about the politics; it is about all of us working together to get this right. Labour will support clause 144 because, fundamentally, it is for the Secretary of State to set out strategic priorities, but we must ensure that Parliament is not blocked from its all-important role in providing scrutiny.

Chris Philp

I thank the shadow Minister for her broad support for these two clauses. Clause 143 provides a power, but not an obligation, for the Secretary of State to set out a strategic statement on her priorities for online safety matters. As the shadow Minister said, it is similar to powers that already exist in other areas. The clause links back to clause 78, whereby Ofcom must have regard to the strategic priorities and set out how it responds to them when they are updated. On clause 144, I am glad that the shadow Minister accepts that the consultation has to happen, and that the 40-day period for Parliament to consider the draft statement and, if it wishes, to object to it is a welcome opportunity for parliamentary scrutiny.

The Government have heard the wider points about parliamentary scrutiny and the functioning of the Joint Committee, which my right hon. Friend the Member for Basingstoke mentioned previously. I have conveyed them to higher authorities than me, so that transmission has occurred. I recognise the valuable work that the Joint Committee of the Commons and Lords did in scrutinising the Bill prior to its introduction, so I am glad that these clauses are broadly welcome.

Question put and agreed to.

Clause 143 accordingly ordered to stand part of the Bill.

Clause 144 ordered to stand part of the Bill.

Clause 145

Directions about advisory committees

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones

Labour supports the clause, which enables the Secretary of State to give Ofcom a direction to establish an expert committee to advise it on a specific online safety matter. As we have said repeatedly, it is vital that expert stakeholders are included as we begin the challenging process of regulating the internet. With that in mind, we need to ensure that the committee truly is expert and that it remains independent.

The Minister knows that I have concerns about Ofcom’s ability to remain truly independent, particularly given the recent decision to appoint a Tory peer to chair the organisation. I do not want to use our time today to make pointed criticisms about that decision—much as I would like to—but it is important that the Minister addresses these concerns. Ofcom must be independent—it really is quite important for the future success of the Bill. The expert committee’s chair, and its other members, must be empowered to report freely and without influence. How can the Minister ensure that that will genuinely be the case?

Subsection (4) places a duty on an advisory committee established under such a direction to publish a report within 18 months of its being established. I want to push the Minister on the decision to choose 18 months. I have mentioned my concerns about that timeframe; it seems an awfully long time for the industry, stakeholders, civil society and, indeed, Parliament to wait. I cannot be clearer about how important a role I think that this committee will have, so I would be grateful if the Minister could clarify why he thinks it will take 18 months for such a committee to be established.

That said, we broadly support the principles of what the clause aims to do, so we have not sought to amend it at this stage.

Chris Philp

I thank the shadow Minister for her comments and questions. She raised two substantive points on the clause; I will address those, rather than any wider issues that may be contentious.

The first question was about whether the advisory committee would be independent, and how we can be certain that it will not be unduly interfered in by the Government. The answer lies clearly in subsection (3). Paragraphs (a) and (b) make it very clear that although the Secretary of State may direct Ofcom to establish the committee, the identity of the people on the committee is for Ofcom to determine. Subsection (3)(a) states very clearly that the chairman is “appointed by OFCOM”, and subsection (3)(b) states that members of the committee are

“appointed by OFCOM as OFCOM consider appropriate.”

It is Ofcom, not the Secretary of State, that appoints the chair and the members. I trust that that deals with the question about the independence of the members.

On the second question, about time, the 18 months is not 18 months for the committee to be established—I am looking at clause 145(4)—but 18 months for the report to be published. Subsection (4) says “within” a period of 18 months, so it does not have to be 18 months for delivery of the report; it could be less, and I am sure that in many cases it will be. I hope that answers the shadow Minister’s questions on the clause, and I agree that it should stand part of the Bill.

Question put and agreed to.

Clause 145 accordingly ordered to stand part of the Bill.

Clause 146

Directions in special circumstances

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to discuss new clause 10—Special circumstances—

“(1) This section applies where OFCOM has reasonable grounds for believing that circumstances exist that present a threat—

(a) to the health or safety of the public, or

(b) to national security.

(2) OFCOM may, in exercising their media literacy functions, give priority for a specified period to specified objectives designed to address the threat presented by the circumstances mentioned in subsection (1).

(3) OFCOM may give a public statement notice to—

(a) a specified provider of a regulated service, or

(b) providers of regulated services generally.

(4) A ‘public statement notice’ is a notice requiring a provider of a regulated service to make a publicly available statement, by a date specified in the notice, about steps the provider is taking in response to the threat presented in the circumstances mentioned in subsection (1).

(5) OFCOM may, by a public statement notice or a subsequent notice, require a provider of a regulated service to provide OFCOM with such information as they may require for the purpose of responding to that threat.

(6) If OFCOM takes any of the steps set out in this Chapter, they must publish their reasons for doing so.

(7) In subsection (2) ‘media literacy functions’ means OFCOM’s functions under section 11 of the Communications Act (duty to promote media literacy), so far as functions under that section relate to regulated services.”

This new clause gives Ofcom the power to take particular steps where it considers that there is a threat to the health and safety of the public or to national security, without the need for a direction from the Secretary of State.

Alex Davies-Jones

As we all know, the clause as it stands enables the Secretary of State to give Ofcom directions in circumstances where the Secretary of State considers that there is a threat to the health or safety of the public or to national security. That includes directing Ofcom to prioritise action to respond to a specific threat when exercising its media literacy functions, and to require specified service providers, or providers of regulated services more generally, to publicly report on the steps they are taking to respond to that threat.

However, Labour shares the concerns of the Carnegie UK Trust, among others, that there is no meaningful constraint on the Secretary of State’s powers to intervene as outlined in the clause. Currently, the Secretary of State has the power to direct Ofcom where they have “reasonable grounds for believing” that there is a threat to the public’s health or safety or to national security. The UK did not need these powers before—during the cold war, for example—so we have to ask: why now?

Chris Philp

So far as I am aware, the phenomenon of social media companies, to which media literacy relates, did not exist during the cold war.

Alex Davies-Jones

It did not, but there were examples of disinformation, misinformation and the spreading of falsehoods, and none of these powers existed at the time. It seems weird—if I can use that term—that these exist now. Surely, the more appropriate method would be for the Secretary of State to write a letter to Ofcom to which it had to have regard. As it stands, this dangerous clause ensures the Secretary of State has the power to interfere with day-to-day enforcement. Ultimately, it significantly undermines Ofcom’s overall independence, which we truly believe should be at the heart of the Bill.

With that in mind, I will now speak to our crucial new clause 10, which instead would give Ofcom the power to take particular steps, where it considers that there is a threat to the health and safety of the public or national security, without the need for direction from the Secretary of State. Currently, there is no parliamentary scrutiny of the powers outlined in clause 146; it says only that the Secretary of State must publish their reasoning unless national security is involved. There is no urgency threshold or requirement in the clause. The Secretary of State is not required to take advice from an expert body, such as Public Health England or the National Crime Agency, in assessing reasonable grounds for action. The power is also not bounded by the Bill’s definition of harm.

These directions do two things. First, they direct Ofcom to use its quite weak media literacy duties to respond to the circumstances. Secondly, a direction switches on a power for Ofcom to ask a platform to produce a public statement about what the platform is doing to counter the circumstances or threats in the direction order—that is similar in some ways to the treatment of harm to adults. It is an attempt to shame a company into doing something without actually making it do it. The power also allows the Secretary of State to target a given company directly. There is potential for the misuse of such an ability.

The explanatory notes say:

“the Secretary of State could issue a direction during a pandemic to require OFCOM to: give priority to ensuring that health misinformation and disinformation is effectively tackled when exercising its media literacy function; and to require service providers to report on the action they are taking to address this issue.”

Recent experience of the covid pandemic and the Russian invasion of Ukraine suggests that the Government can easily legislate when required in an emergency and can recall Parliament. The power in the Bill is a strong power, cutting through regulatory independence and targeting individual companies to achieve quite a weak effect. It is not being justified as an emergency power where the need to move swiftly is paramount. Surely, if heavier-duty action is required in a crisis, the Government can legislate for it and explain to Parliament why the power is needed in the context of that crisis.

--- Later in debate ---
Kirsty Blackman

I want to talk about a specific example. Perhaps the Minister will be able to explain why the legislation is written this way around when I would have written it the opposite way around, much more in line with proposed new clause 10.

Snapchat brought in the Snap Map feature, which involved using geolocation on every individual’s phone: whenever anyone took a photo to put on Snapchat, their location was included. The feature was automatically turned on for all Snapchat users when it first came in, I think in 2017. No matter what age they were, when they posted their story on Snapchat, which is available to anyone on their friends list and sometimes wider, anyone could see where they were. If a child had taken a photo at their school and put it on Snapchat, anyone could see what school they went to. It was a major security concern for parents.

That very concerning situation genuinely could have put at risk children and other vulnerable people, who may not even have known that the feature had been turned on by default and would not have known how to turn on ghost mode in Snapchat so as not to post their location. The situation could have been helped if media literacy duties had kicked in that meant the regulator had to say, “This is a thing on Snapchat: geolocation is switched on. Please be aware of this if your children or people you are responsible for are using Snapchat.”

Chris Philp

Is the hon. Member aware of a similar situation that arose more recently with Strava? People’s running routes were publicly displayed in the same way, which led to incidents of stalking.

Kirsty Blackman

I was aware that Strava did that mapping, which is why my friends list on Strava numbers about two people, but I was not aware that the routes had been publicly displayed. There are similar issues with routes being public on things such as Garmin, so it is important to keep a note of that. I did not know that that information was public on Strava. If Ofcom had had a duty to ensure that people were aware of that, it would have been much easier for parents and vulnerable adults to take those decisions or have them taken on their behalf.

My reading of the clause is that if Ofcom comes across a problem, it will have to go and explain to the Secretary of State that it is a problem and get the Secretary of State to instruct it to take action. I do not think that makes sense. We have talked already about the fact that the Secretary of State cannot be an expert in everything. The Secretary of State cannot necessarily know the inner workings of Snapchat, Strava, TikTok and whatever other new platforms emerge. It seems like an unnecessary hurdle to stop Ofcom taking that action on its own, when it is the expert. The Minister is likely to say that the Secretary of State will say, “Yes, this is definitely a problem and I will easily instruct you to do this”—

Chris Philp

rose—

Kirsty Blackman

The Minister will get the chance to make a proper speech in which he can respond.

It could be that the process is different from the one I see from reading the Bill. The Minister’s clarifications will be helpful to allow everyone to understand how the process is supposed to work, what powers Ofcom is supposed to have and whether it will have to wait for an instruction from the Secretary of State, which is what it looks like. That is why proposed new clause 10 is so important, because it would allow action to be taken to alert people to safety concerns. I am focusing mostly on that.

I appreciate that national security is also very important, but I thought I would take the opportunity to highlight specific concerns with individual platforms and to say to the Minister that we need Ofcom to be able to act and to educate the public as well as it possibly can, and to do so without having to wait for an instruction.

Chris Philp

Let me start by addressing the point that was raised by the hon. Member for Aberdeen North on Ofcom’s power to issue media literacy advice of its own volition, which is the subject of new clause 10. Under section 11 of the Communications Act 2003, Ofcom already has the power to issue media literacy guidance on issues such as Snapchat geolocation, the Strava map location functionality that I mentioned, and the other example that came up. Ofcom does not need the Secretary of State’s permission to do that, as it already has the power to do so. The power that new clause 10 would confer on Ofcom already exists.

Alex Davies-Jones

The Minister says that Ofcom can already use that existing power, so why does it not do so?

Chris Philp

That is obviously an operational matter for Ofcom. We would encourage it to do as much as possible. We encouraged it through our media literacy strategy, and it published an updated policy on media literacy in December last year. If Members feel that there are areas of media literacy in which Ofcom could do more, they will have a good opportunity to raise those questions when senior Ofcom officials next appear before the Digital, Culture, Media and Sport Committee or any other parliamentary Committee.

The key point is that the measures in new clause 10 are already in legislation, so the new clause is not necessary. The Secretary of State’s powers under clause 146 do not introduce a requirement for permission—they are two separate things. In addition to Ofcom’s existing powers to act of its own volition, the clause gives the Secretary of State powers to issue directions in certain very limited circumstances. A direction may be issued where there is a present threat—I stress the word “threat”—to the health or safety of the public or to national security, and only in relation to media literacy. We are talking about extremely narrowly defined powers.

Kirsty Blackman

The Minister said “a present threat”, but the clause says “present a threat”. The two mean different things. To clarify, could he confirm that he means “present a threat”?

Chris Philp

The hon. Lady is quite right to correct me. I do mean “present a threat”, as it is written in the Bill—I apologise for inadvertently transposing the words.

Is it reasonable that the Secretary of State has those very limited and specific powers? Why should they exist at all? Does this represent an unwarranted infringement of Ofcom’s freedom? I suppose those are the questions that the Opposition and others might ask. The Government say that, yes, it is reasonable and important, because in those particular areas—health and safety, and national security—there is information to which only the Government have access. In relation to national security, for example, information gathered by the UK intelligence community—GCHQ, the Secret Intelligence Service and MI5—is made available to the Government but not more widely. It is certainly not information that Ofcom would have access to. That is why the Secretary of State has the power to direct in those very limited circumstances.

I hope that, following that explanation, the Committee will see that new clause 10 is not necessary because it replicates an existing power, and that clause 146 is a reasonable provision.

Alex Davies-Jones

I welcome the Minister’s comments, but I am not convinced by his arguments on the powers given to the Secretary of State on issues of national security or public health and safety. Parliament can be recalled and consulted, and Members of Parliament can have their say in the Chamber on such issues. It should not be up to the Secretary of State alone to direct Ofcom and challenge its independence.

Chris Philp

I understand the shadow Minister’s point, but recalling Parliament during a recess is extremely unusual. I am trying to remember how many times it has happened in the seven years that I have been here, and I can immediately recall only one occasion. Does she think that it would be reasonable and proportionate to recall 650 MPs in recess for the purpose of issuing a media literacy directive to Ofcom?

Alex Davies-Jones

I think the Minister has just made my point for me. If he does not see this happening only in extreme circumstances where a threat is presented or there is an immediate risk to public health and safety, how many times does he envisage the power being used? How many times will the Secretary of State have the power to overrule Ofcom if the power is not to be used only in those unique situations where it would be deemed appropriate for Parliament to be recalled?

Chris Philp

It is not overruling Ofcom; it is offering a direction to Ofcom.

Alex Davies-Jones

Yes—having direct influence on a regulator, overruling its independence and taking the stance directly themselves. The Minister has made my point for me: if he does not envisage the power being used only in unique circumstances where Parliament would need to be recalled to have a say, it will be used a lot more often than he suggests.

With that in mind, we will withhold our support for clause 146 in order to progress with new clause 10. I place on record the Labour party’s distinct concerns with the clause, which we will seek to amend on Report.

The Chair

Minister, do you wish to respond?

Chris Philp

I have nothing further to add.

Question put and agreed to.

Clause 146 accordingly ordered to stand part of the Bill.

Clause 147

Secretary of State’s guidance

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones

It seems that our support for the clauses has run out. Clause 147 enables the Secretary of State to give guidance to Ofcom relating to its exercise of its statutory powers and functions under the Bill. It also allows the Secretary of State to give guidance to Ofcom around its functions and general powers under certain provisions of the Communications Act 2003. While we appreciate that the Secretary of State must consult Ofcom before issuing, revising or replacing guidance, we feel that this level of interference is unnecessary.

The Minister must recognise that the clause allows for an incredibly granular level of interference by the Secretary of State in the day-to-day functioning of a supposedly independent regulator. It profoundly interferes with enforcement and once again broadly undermines Ofcom’s independence. Civil society and stakeholders alike share our concerns. I must press the Minister on why this level of interference is included in the Bill—what is the precedent? We have genuine concerns that the fundamental aims of the Bill—to keep us all safe online—could easily be shifted according to the priorities of the Secretary of State of the day. We also need to ensure there is consistency in our overall approach to the Bill. Labour feels that this level of interference will cause the Bill to lack focus.

Ultimately, Ofcom, as the independent regulator, should be trusted to do what is right. The Minister must recognise how unpopular the Bill’s current approach of giving overarching powers to the Secretary of State is. I hope he will go some way to addressing our concerns, which, as I have already said, we are not alone in approaching him with. For those reasons, we cannot support clause 147 as it stands.

Chris Philp

We are introducing a new, groundbreaking regime, and we are trying to strike a balance between the need for regulatory independence for Ofcom and appropriate roles for Parliament and Government. There is a balance to strike there, particularly in an area such as this, which has not been regulated previously. It is a brand-new area, so we do not have decades of accumulated custom and practice. We are creating this from the ground up—from a blank sheet of paper.

That is why, in establishing this regime, we want to provide a facility for high-level strategic guidance to be given to Ofcom. Of course, that does not infringe on Ofcom’s day-to-day operations; it will continue to do those things itself, in taking decisions on individual enforcement matters and on the details around codes of practice. All those things, of course, remain for Ofcom.

We are very clear that guidance issued under clause 147 is strategic in nature and will not stray into the operational or organisational matters that should properly fall into the exclusive ambit of the independent regulator. There are a number of safeguards in the clause to ensure that the power is exercised in the way that I have just described and does not go too far.

First, I point to the fact that clause 147(8) simply says that

“OFCOM must have regard to the guidance”.

That is obviously different from a hard-edged statutory obligation for it to follow the guidance in full. Of course, it does mean that Ofcom cannot ignore it completely—I should be clear about that—but it is different from a hard-edged statutory obligation.

There is also the requirement for Ofcom to be consulted, so that its opinions can be known. Of course, being consulted does not mean that the opinions will be followed, but it means that they will be sought and listened to. There are also some constraints on how frequently this strategic guidance can be revised, to ensure that it does not create regulatory uncertainty by being chopped and changed on an unduly frequent basis, which would cause confusion.

Kirsty Blackman

I have a question about subsection (4)(b), which says that the guidance can be replaced more frequently than once every three years. I understand subsection (4)(a)—that is fine—but subsection (4)(b) says that the guidance can be changed if

“the revision or replacement is by agreement between the Secretary of State and OFCOM.”

How will those of us who are not the Secretary of State or Ofcom know that there has been an agreement that the guidance can be changed and that the Secretary of State is not just acting on their own? If the guidance is changed because of an agreement, will there be a line in the guidance that says, “The Secretary of State has agreed with Ofcom to publish this only 1.5 years after the last guidance was put out, because of these reasons”? In the interests of transparency, it would be helpful for something like that to be included in the guidance, if it was being changed outside the normal three-year structure.

Chris Philp

It is better than being in the guidance, which is non-statutory, because it is in the Bill—it is right here in front of us in the measure that the hon. Lady just referred to, clause 147(4)(b). If the Secretary of State decided to issue updated guidance in less than three years without Ofcom’s consent, that would be unlawful; that would be in breach of this statute, and it would be a very straightforward matter to get that struck down. It would be completely illegal to do that.

My expectation would be that if updated guidance was issued in less than three years, it would be accompanied by written confirmation that Ofcom had agreed. I imagine that if a future Secretary of State—I cannot imagine the current Secretary of State doing it—published guidance in less than three years without Ofcom’s consent, Ofcom would not be shy in pointing that out, but to do that would be illegal. It would be unlawful; it would be a breach of this measure in the Bill.

I hope that the points I have just made about the safeguards in clause 147, and the assurance and clarity I have given the Committee that guidance will be at the strategic rather than the operational level, give Members the assurance they need to support the clause.

Question put, That the clause stand part of the Bill.

--- Later in debate ---
Alex Davies-Jones

I will be brief. The clause is incredibly important. It requires the Secretary of State to prepare and lay before Parliament annual reports about their performance in relation to online safety. We fully support such transparency; indeed, we want it to go further. That is what we have been trying to say in Committee all day. We agree in principle and therefore have not sought to amend the clause.

Chris Philp

I could not possibly add to that exceptionally eloquent description.

Question put and agreed to.

Clause 148 accordingly ordered to stand part of the Bill.

Clause 149

Review

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones

As we know, the clause compels the Secretary of State to undertake a review to assess the effectiveness of the regulatory framework. The review will have to be published and laid before Parliament, which we welcome. However, we note the broad time limits on this duty. We have heard repeatedly about the challenges that delays to the Bill’s full implementation will cause, so I urge the Minister to consider that point closely. By and large, though, we absolutely support the clause, especially as the Secretary of State will be compelled to consult Ofcom and other appropriate persons when carrying out the review—something that we have called for throughout scrutiny of the Bill. We only wish that that level of collaboration had been accepted by the Minister on the other clauses. I will not waste time repeating points that I have already made. We support the clause.

Chris Philp

I welcome the shadow Minister’s support for this review clause, which is important. I will not add to her comments.

Question put and agreed to.

Clause 149 accordingly ordered to stand part of the Bill.

Clause 150

Harmful communications offence

Kim Leadbeater

I beg to move amendment 112, in clause 150, page 127, line 28, at end insert “and;

(b) physical harm that has been acquired as a consequence of receiving the content of a message sent online.”

This amendment would expand the definition of harm for the purposes of the harmful communications offence to incorporate physical harm resulting from messages received online.

--- Later in debate ---
Caroline Ansell (Eastbourne) (Con)

At the risk of following my earlier voting pattern, I am also very much with the hon. Member for Batley and Spen in spirit. I could not do the subject any more justice than she has, describing this appalling online behaviour and just how damaging it is. I am a member of the all-party parliamentary group on epilepsy and have lived experience myself.

I want to highlight the comments of the Epilepsy Society, which I am sure is following our work this afternoon. It welcomes many of the provisions introduced in the Bill, but highlights something of a legislative no man’s land. Clause 187 mentions physical harm, but does not apply to clause 150, and clause 150 covers only psychological harm when, as we have heard described, many seizures result in physical harm, some of it very serious. I know the Minister is equally committed to seeing this measure come about and recognises the points we have demonstrated. The hon. Lady is right that we are united. I suspect the only point on which there might be some difference is around timing. I will be looking to support the introduction and the honouring in full of Zach’s law before the Bill is passed. There are many other stages.

My understanding is that many others wish to contribute, not least the Ministry of Justice. My hope, and my request to the Minister, is that, should supporting the amendment presented today not be the best and strongest way forward, those expert stakeholder voices will be part of the drafting. I want to see recognition in law.

Chris Philp

Amendment 112 is clearly very important. As my hon. Friend the Member for Watford pointed out, I have already said that I believe that clause 150 goes a long way to address the various issues that have been raised. Since my hon. Friends the Members for Eastbourne and for Watford, and the hon. Member for Batley and Spen, began raising this issue—my hon. Friends have been lobbying me persistently and frequently, behind closed doors as well as publicly, and the hon. Member for Batley and Spen has been campaigning publicly with great tenacity and verve—the Government and the MOJ have been further considering the Law Commission’s recommendations, which I referenced on Second Reading. Subsequent to Second Reading and the lobbying by the three Members who have just spoken, I can now announce to the Committee that the Government have decided to enact the Law Commission’s recommendations, so there will be a new and separate standalone offence that is specific to epilepsy for the very first time. I can firmly commit to that and announce it today.

--- Later in debate ---
Kim Leadbeater

I genuinely appreciate the Minister’s comments, but why would we spend more time doing other pieces of legislation when we can do it right here and right now? The amendment will solve the problem without causing any more pain or suffering over a long period of time.

Chris Philp

One of the pieces of legislation that could be used is this Bill, because it is in scope. If the hon. Lady can bear with me until Report, I will say more about the specific legislative vehicle that we propose to use.

On the precise wording to be used, I will make a couple of points about the amendments that have been tabled—I think amendment 113 is not being moved, but I will speak to it anyway. Amendment 112, which was tabled by the hon. Member for Batley and Spen, talks about bringing physical harm in general into the scope of clause 150. Of course, that goes far beyond epilepsy trolling, because it would also bring into scope the existing offence of assisting or encouraging suicide, so there would be duplicative law: there would be the existing offence of assisting or encouraging suicide and the new offence, because a communication that encouraged physical harm would do the same thing.

If we included all physical harm, it would duplicate the proposed offence of assisting or encouraging self-harm that is being worked on by the Ministry of Justice and the Law Commission. It would also duplicate offences under the Offences Against the Person Act 1861, because if a communication caused one person to injure another, there would be duplication between the offence that will be created by clause 150 and the existing offence. Clearly, we cannot have two offences that criminalise the same behaviour. To the point made by the hon. Member for Aberdeen North, it would not be right to create two epilepsy trolling offences. We just need one, but it needs to be right.

Kirsty Blackman

Will the Minister give way?

Chris Philp

In a second.

The physical harm extension goes way beyond the epilepsy point, which is why I do not think that would be the right way to do it. The Government have accepted that we will do it and need to do it, but by a different mechanism.

I was about to speak to amendment 113, the drafting of which specifically mentions epilepsy and which was tabled by my hon. Friend the Member for Blackpool North and Cleveleys (Paul Maynard), but was the hon. Lady’s question about the previous point?

Kirsty Blackman

My question was about the announcement that the Minister is hoping to make on Report. I appreciate that he has committed to introduce the new offence, which is great. If the Bill is to be the legislative vehicle, does he expect to amend it on Report, or does he expect that that will have to wait until the amendment goes through the Lords?

Chris Philp

That is a good question, and it ties into my next point. Clearly, amendment 113 is designed to create a two-sentence epilepsy trolling offence. When trying to create a brand-new offence—in this case, epilepsy trolling—it is unlikely that two sentences’ worth of drafting will do the trick, because a number of questions need to be addressed. For example, the drafting will need to consider what level of harm should be covered and exactly what penalty would be appropriate. If it were in clause 150, the penalty would be two years, but it might be higher or lower; that needs to be addressed. The various terms also need to be carefully defined, including “epilepsy” and “epileptic seizures” in amendment 113, which was tabled by my hon. Friend the Member for Blackpool North and Cleveleys. We need to get proper drafting.

My hon. Friend the Member for Eastbourne mentioned that the Epilepsy Society had some thoughts on the drafting. I know that my colleagues in the Ministry of Justice and, I am sure, the office of the parliamentary counsel, would be keen to work with experts from the Epilepsy Society to ensure that the drafting is correct. Report will likely be before summer recess—it is not confirmed, but I am hoping it will be—and getting the drafting nailed down that quickly would be challenging.

I hope that, in a slightly indirect way, that answers the question. We do not have collective agreement about the precise legislative vehicle to use; however, I hope it addresses the questions about how the timing and the choreography could work.

Kim Leadbeater

We have talked a lot about the Epilepsy Society this afternoon, and quite rightly too, as it is the expert in this field. My understanding is that it is perfectly happy with the language in this amendment—

Chris Philp

Which one?

Kim Leadbeater

Amendment 112. I think that the Epilepsy Society feels that this would be covered. I am also confused, because the Minister said previously that it was his belief and intention that this clause would cover epilepsy trolling, but he is now acknowledging that it does not. Why would we not, therefore, just accept the amendment that covers it and save everybody a lot of time?

Chris Philp

Representations have been made by the three Members here that epilepsy deserves its own stand-alone offence, and the Government have just agreed to do that, so take that as a win. On why we would not just accept amendment 112, it may well cover epilepsy, and may well cover it to the satisfaction of the Epilepsy Society, but it also, probably inadvertently, does a lot more than that. It creates a duplication with the offence of assisting or encouraging suicide.

Kim Leadbeater

Surely that is almost a bonus?

Chris Philp

No, it is not a bonus, because we cannot have two different laws that criminalise the same thing. We want to have laws that are, essentially, mutually exclusive. If a person commits a particular act, it should be clear which Act the offence is being committed under. Imagine that there were two different offences for the same act with different sentences—one is two years and one is 10 years. Which sentence does the judge then apply? We do not want to have law that overlaps, where the same act is basically a clear offence under two different laws. Just by using the term “physical harm”, amendment 112 creates that. I accept that it would cover epilepsy, but it would also cover a whole load of other things, which would then create duplication.

That is why the right way to do this is essentially through a better drafted version of amendment 113, which specifically targets epilepsy. However, it should be done with drafting that has been done properly—with respect to my hon. Friend the Member for Blackpool North and Cleveleys, who drafted the amendment—with definitions that are done properly, and so on. That is what we want to do.

Dean Russell

Having been involved on this Bill for quite a while now and having met Zach, I know the concerns that the Epilepsy Society have had. For me, we just need the Minister to tell us, which I think he has, that this will become law, whatever the vehicle for that is. If we know that this will be an offence by the end of this year—hopefully by summer, if not sooner—so that people cannot send flashing images to people with epilepsy, like Zach, then I will feel comfortable in not backing the amendment, on the premise that the Government will do something, moving forward. Am I correct in that understanding?

Chris Philp

Yes. Just to be clear, in no world will a new law pass by the summer recess. However, I can say that the Government are committed, unequivocally, to there being a new offence in law that will criminalise epilepsy trolling specifically. That commitment is categoric. The only matter on which I need to come back to the House, which I will try to do on Report, is to confirm specifically which Bill that offence will go in. The commitment to legislate is made unequivocally today.

Caroline Ansell

I welcome the Minister’s announcement and that commitment. I particularly welcome that the new offence will have epilepsy in the title. People who seek out those who may be triggered into seizures use all sorts of tags for organisations and individuals to deliberately and specifically target those who suffer from epilepsy. It is therefore wholly right that this new offence, whether in this Bill or another, cites epilepsy, because those who would seek to do harm know it and call it that.

I have not had the privilege of meeting Zach; however, thanks to this online world, which we have heard described throughout this legislation as the wild west, I was able to see the most beautiful tribute interview he did with his mum. He said that if the change were made and the offence were recognised, “we win.” He is so right that we all win.

Chris Philp

My hon. Friend makes an extremely powerful point that is incapable of being improved upon.

Chris Philp

Or perhaps it is.

Kim Leadbeater

It is wonderful that we have such consensus on this issue. I am grateful to colleagues for that. I am very concerned about the pressures on parliamentary time, and the fact that we are kicking this issue down the road again. We could take action today to get the process moving. That is what Zach and his family want and what other people who have been subjected to this hideous bullying want. Without a firm timeframe for another way of getting this done, I am struggling to understand why we cannot do this today.

Chris Philp

The progress that the campaign has made, with the clear commitment from the Government that we are going to legislate for a specific epilepsy trolling offence, is a huge step forward. I entirely understand the hon. Lady’s impatience. I have tried to be as forthcoming as I can be about likely times, in answer to the question from the hon. Member for Aberdeen North, within the constraints of what is currently collectively agreed, beyond which I cannot step.

Amendment 112 would sort out the epilepsy issue, but unfortunately it would create duplicative criminal law. We cannot let our understandable sense of urgency end up creating a slightly dysfunctional criminal statute book. There is a path that is as clear as it reasonably can be. Members of the Committee will probably have inferred the plan from what I said earlier. This is a huge step forward. I suggest that we bank the win and get on with implementing it.

Dean Russell

I appreciate that there will be differences of opinion, but I feel that Zach should be smiling today whatever the outcome—if there is a vote, or if this is a probing amendment. When I have chatted about this previously over many months, it has been a real challenge. The Minister quite rightly said that the Bill already covered epilepsy. I felt that to be true. This is a firming up of the agreement we had. This is the first time I have heard this officially in any form. My message to Zach and the Epilepsy Society, who may well be watching the Committee, is that I hope they will see this as a win. With my head and my heart together, I feel that it is a win, but I forewarn the Minister that I will continue to be like a dog with a bone and make sure that those promises are delivered upon.

Chris Philp

I think that is probably a good place to leave my comments. I can offer public testimony of my hon. Friend’s tenacity in pursuing this issue.

I ask the hon. Member for Batley and Spen to withdraw the amendment. I have given the reasons why: because it would create duplicative criminal law. I have been clear about the path forward, so I hope that on that basis we can work together to get this legislated for as a new offence, which is what she, her constituent and my hon. Friends the Members for Watford and for Eastbourne and others have been calling for.

Kim Leadbeater

I appreciate the Minister’s comments and the support from across the House. I would like to push the amendment to a vote.

Question put, That the amendment be made.

--- Later in debate ---
Barbara Keeley

Part 10 of the Bill sets out three new offences involving harmful, false or threatening communications. Clause 156 includes a new offence on cyber-flashing, to which my hon. Friend the Member for Pontypridd will speak shortly.

For many years, charities have been calling for an update to the offences included in the Malicious Communications Act 1988 and the Communications Act 2003. Back in 2018, the Law Commission pointed out that using the criminal law to deal with harmful online conduct was hindered by several factors, including limited law enforcement capacity to pursue the scale of abusive communications, what the commission called a “persistent cultural tolerance” of online abuse, and difficulties in striking a balance between protecting people from harm and maintaining rights of freedom of expression—a debate that we keep coming back to in Committee and one that is still raging today. Reform of the legislation governing harmful online communications is welcome—that is the first thing to say—but the points laid out by the Law Commission in 2018 still require attention if the new offences are to result in the reduction of harm.

My hon. Friend the Member for Batley and Spen spoke about the limited definition of harm, which relates to psychological harm but does not protect against all harms resulting from messages received online, including those that are physical. We also heard from the hon. Member for Ochil and South Perthshire about the importance of including an offence of encouraging or assisting self-harm, which we debated last week with schedule 7. I hope that the Minister will continue to consider the merits of new clause 36 when the time comes to vote on it.

Those are important improvements about what should constitute an offence, but we share the concerns of the sector about the extent to which the new offences will result in prosecution. The threshold for committing one of the offences in clause 150 is high. When someone sends the message, there must be

“a real and substantial risk that it would cause harm to a likely audience”,

and they must have

“no reasonable excuse for sending the message.”

The first problem is that the threshold of having to prove the intention to cause distress is an evidential threshold. Finding evidence to prove intent is notoriously difficult. Professor Clare McGlynn’s oral evidence to the Committee was clear:

“We know from the offence of non-consensual sending of sexual images that it is that threshold that limits prosecutions, but we are repeating that mistake here with this offence.”

Professor McGlynn highlighted the story of Gaia Pope. With your permission, Ms Rees, I will make brief reference to it, in citing the evidence given to the Committee. In the past few weeks, it has emerged that shortly before Gaia Pope went missing, she was sent indecent images through Facebook, which triggered post-traumatic stress disorder from a previous rape. Professor McGlynn said:

“We do not know why that man sent her those images, and I guess my question would be: does it actually matter why he sent them? Unfortunately, the Bill says that why he sent them does matter, despite the harm it caused, because it would only be a criminal offence if it could be proved that he sent them with the intention of causing distress or for sexual gratification and being reckless about causing distress.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 58, Q105.]

The communications offences should be grounded upon consent rather than the motivation of the perpetrator. That is a clear omission in the Bill, which my hon. Friend the Member for Pontypridd will speak more about in relation to our amendments 41 and 42 to clause 156. The Government must act or risk missing a critical opportunity to tackle the harms resulting from communications offences.

We then come to the problem of the “reasonable excuse” defence and the “public interest” defence. Clause 150(5) sets out that the court must consider

“whether the message is, or is intended to be, a contribution to a matter of public interest”.

The wording in the clause states that this should not “determine the point”. If that is the case, why does the provision exist? Does the Minister recognise that there is a risk of the provision being abused? In response to a question from the hon. Member for Aberdeen North, the Minister previously said:

“Clause 150…does not give a get-out-of-jail-free card”.––[Official Report, Online Safety Public Bill Committee, 7 June 2022; c. 275.]

Could he lay out what the purpose of this “matter of public interest” defence is? Combined with the reasonable excuse defence in subsection (1), the provisions risk sending the wrong message when it comes to balancing harms, particularly those experienced by women, of which we have already heard some awful examples.

There is a difference in the threshold of harm between clause 150, on harmful communications offences, and clause 151, on false communications offences. To constitute a false communications offence, the message sender must have

“intended the message, or the information in it, to cause non-trivial psychological or physical harm to a likely audience”.

To constitute a harmful communications offence, the message sender must have

“intended to cause harm to a likely audience”

and there must have been

“a real and substantial risk that it would cause harm to a likely audience”.

Will the Minister set out the Government’s reasoning for that distinction? We need to get these clauses right because people have been let down by inadequate legislation and enforcement on harmful online communications offences for far too long.

Chris Philp

Let me start by saying that many of these clauses have been developed in careful consultation with the Law Commission, which has taken a great deal of time to research and develop policy in this area. It is obviously quite a delicate area, and it is important to make sure that we get it right.

The Law Commission is the expert in this kind of thing, and it is right that the Government commissioned it, some years ago, to work on these provisions, and it is right that, by and large, we follow its expert advice in framing these offences, unless there is a very good reason not to. That is what we have done—we have followed the Law Commission’s advice, as we would be expected to do. The clauses replace previous offences—for example, those in the Malicious Communications Act 1988—and update and improve those provisions in the form we see them in the Bill.

The shadow Minister, the hon. Member for Worsley and Eccles South, asked a number of questions about the drafting of the clauses and the thresholds that have to be met for an offence to be committed. We are trying to strike a balance between criminalising communications that deserve to be criminalised and not criminalising communications that people would consider should fall below the criminal threshold. We do not want to infringe free speech by going too far and making legitimate criticism and debate subject to criminal sanctions. There is a balance to strike here between, on the one hand, public protection and where the criminal law sits and, on the other, free speech and people expressing themselves. That is why clause 150 is constructed as it is, on the advice of the Law Commission.

As the hon. Member set out, the offence is committed only where there is a “real and substantial risk” that the likely audience would suffer harm. Harm is defined as

“psychological harm amounting to at least serious distress.”

Serious distress is quite a high threshold—it is a significant thing, not something trivial. It is important to make that clear.

The second limb is that there is an intention to cause harm. Intention can in some circumstances be difficult to prove, but there are also acts that are so obviously malicious, the communication so obviously malfeasant, that there can be no conceivable motivation or intention other than to cause harm. In those cases, establishing intent is not too difficult.

In a number of specific areas, such as intimate image abuse, my right hon. Friend the Member for Basingstoke and others have powerfully suggested that establishing intent is an unreasonably high threshold, and that the bar should be set simply at consent. For the intimate image abuse offence, the bar is set at the consent level, not at intent. That is being worked through by the Law Commission and the Ministry of Justice, and I hope that it will be brought forward as soon as possible, in the same way as the epilepsy trolling offence that we discussed a short while ago. That work on intimate image abuse is under way, and consent, not intent, is the test.

For the generality of communications—the clause covers any communications; it is incredibly broad in scope—it is reasonable to have the intent test to avoid criminalising what people would consider to be an exercise of free speech. That is a balance that we have tried to strike. The intention behind the appalling communications that we have heard about in evidence and elsewhere is clear: it is inconceivable that there was any motivation or intention other than to cause harm.

There are some defences—well, not defences, but conditions to be met—in clause 150(1)(c). The person must have “no reasonable excuse”. Subsection (5) makes it clear that

“In deciding whether a person has a reasonable excuse…one of the factors that a court must consider (if it is relevant in a particular case) is whether the message is, or is intended to be, a contribution to a matter of public interest (but that does not determine the point)”

of whether there is a reasonable excuse—it simply has to be taken into account by the court and balanced against the other considerations. That qualification has been put in for reasons of free speech.

There is a delicate balance to strike between criminalising what should be criminal and, at the same time, allowing reasonable free speech. There is a line to draw, and that is not easy, but I hope that, through my comments and the drafting of the clause, the Committee will see that that line has been drawn and a balance struck in a carefully calibrated way. I acknowledge that the matter is not straightforward, but we have addressed it with advice from the Law Commission, which is expert in this area. I commend clause 150 to the Committee.

The other clauses in this group are a little less contentious. Clause 151 sets out a new false communication offence, and I think it is pretty self-explanatory as drafted. The threatening communications offence in clause 152 is also fairly self-explanatory—the terms are pretty clear. Clause 153 contains interpretative provisions. Clause 154 sets out the extra-territorial application, and clause 155 sets out the liability of corporate officers. Clause 157 repeals some of the old offences that the new provisions replace.

Those clauses—apart from clause 150—are all relatively straightforward. I hope that, in following the Law Commission’s advice, we have struck a carefully calibrated balance in the right place.

Barbara Keeley

I would like to take the Minister back to the question I asked about the public interest defence. There is a great deal of concern that a lot of the overlapping elements create loopholes. He did not answer specifically the question of the public interest defence, which, combined with the reasonable excuse defence, sends the wrong message.

Chris Philp

The two work together. On the reasonable excuse condition, for the offence to have been committed, it has to be established that there was no reasonable excuse. The matter of public interest condition—I think the hon. Lady is referring to subsection (5)—simply illustrates one of the ways in which a reasonable excuse can be established, but, as I said in my remarks, it is not determinative. It does not mean that someone can say, “There is public interest in what I am saying,” and they automatically have a reasonable excuse—it does not work automatically like that. That is why in brackets at the end of subsection (5) it says

“but that does not determine the point”.

That means that if a public interest argument was mounted, a magistrate or a jury, in deciding whether the condition in subsection (1)(c)—the “no reasonable excuse” condition—had been met, would balance the public interest argument, but it would not be determinative. A balancing exercise would be performed. I hope that provides some clarity about the way that will operate in practice.

--- Later in debate ---
This seismic change will particularly affect young people: millennials and those who are younger, whatever they are called—generation Z. As parliamentarians, we are interested not just in the law, but in how we make sure it bites. It would be helpful if the Minister explained how we can make the Bill as preventive as possible, so that we do not simply punish young people but actually start to train them to understand that they will be committing a serious offence if they send indecent images of male genitalia to others—predominantly women—without their consent, as they are clearly doing on a large scale. Will the Minister indicate whether he will have conversations with those of his colleagues who are responsible for relationship and sex education, to ensure that young people are aware of this new sex offence and that they do not inadvertently fall foul of the law?

Chris Philp

I thank the Members who have contributed to the debate. Rather like with the provisions in clause 150, which we discussed a few minutes ago, a difficult and delicate balance needs to be struck. We want to criminalise that which should be criminal, but not inadvertently criminalise that which should not be. The legal experts at the Law Commission have been studying the matter and consulting other legal experts for quite some time. As my right hon. Friend the Member for Basingstoke said in her excellent speech, their recommendations have been our starting point.

It is probably worth making one or two points about how the clause works. There are two elements of intention, set out in subsection (1). First, the act of sending has to be intentional; it cannot be done accidentally. I think that is reasonable. Secondly, as set out in subsection (1)(a), there must be an intention to cause the person who sees the image alarm, distress or intimidation.

I understand the point that establishing intent could, in some circumstances, present a higher hurdle. As we discussed in relation to clause 150, we are, separately from this, working on the intimate image abuse offence, which does not require intention to be established; it simply requires lack of consent. I was not aware, until my right hon. Friend mentioned it a few moments ago—she was ahead of me there—that the Law Commission has given a timeframe for coming back. I am not sure whether that implies it will be concomitant with Ministry of Justice agreement or whether that will have to follow, but I am very pleased to hear that there is a timeframe. Clearly, it is an adjacent area to this and it will represent substantial progress.

I understand that it can sometimes be hard to establish intention, but there will be circumstances in which the context of such an incident will often make it clear that there was an intention to cause alarm, distress or humiliation.

Alex Davies-Jones

Has the Minister ever received a dick pic?

Chris Philp

Is that a rhetorical question?

Alex Davies-Jones

No, it is a genuine question.

Alex Davies-Jones

So he cannot possibly know how it feels to receive one. I appreciate the comments that he is trying to make, and that this is a fine balance, but I do see this specific issue of sending a photograph or film of genitals as black and white: they are sent either with or without consent. It is as simple as that. What other circumstances could there be? Can he give me an example of when one could be sent without the intention to cause distress, harm or intimidation?

Chris Philp

It is a fair question. There might be circumstances in which somebody simply misjudges a situation—has not interpreted it correctly—and ends up committing a criminal offence; stumbling into it almost by accident. Most criminal offences require some kind of mens rea—some kind of intention to commit a criminal offence. If a person does something by accident, without intention, that does not normally constitute a criminal offence. Most criminal offences on the statute book require the person committing the offence to intend to do something bad. If we replace the word “intent” with “without consent”, the risk is that someone who does something essentially by accident will have committed a criminal offence.

I understand that the circumstances in which that might happen are probably quite limited, and the context of the incidents that the hon. Member for Pontypridd and my right hon. Friend the Member for Basingstoke have described would generally support the fact that there is a bad intention, but we have to be a little careful not accidentally to draw the line too widely. If a couple are exchanging images, do they have to consent prior to the exchange of every single image? We have to think carefully about such circumstances before amending the clause.

Dame Maria Miller

I have to say, just as an aside, that the Minister has huge levels of empathy, so I am sure that he can put himself into the shoes of someone who receives such an image. I am not a lawyer, but I know that there is a concept in law of acting recklessly, so if someone acts recklessly, as my hon. Friend has set out in his Bill, they can be committing a criminal offence. That is why I thought he might want to consider not having the conditional link between the two elements of subsection (1)(b), but instead having them as an either/or.

If he goes back to the Law Commission’s actual recommendations, rather than the interpretation he was given by the MOJ, he will see that they set out that one of the conditions should be that defendants who are posting in this way are likely to cause harm. If somebody is acting in a way that is likely to cause harm, they would be transgressing. The Bill acknowledges that somebody can act recklessly, and it is a well-known concept in law that people can commit an offence by acting recklessly—reckless driving, for example. I wonder whether the Minister might think about that, knowing how difficult it would be to undertake what the hon. Member for Pontypridd is talking about, as it directly contravenes the Law Commission’s recommendations. I do not think what I am suggesting would contravene those recommendations.

Chris Philp

I will commit to consider the clause further, as my right hon. Friend has requested. It is important to do so in the context of the Law Commission’s recommendations, but she has pointed to wording in the Law Commission’s original report that could be used to improve the drafting here. I do not want to make a firm commitment to change, but I will commit to considering whether the clause can be improved upon. My right hon. Friend referred to the “likely to cause harm” test, and asked whether recklessness as to whether someone suffers alarm, distress or humiliation could be looked at as a separate element. We need to be careful; if we sever that from sexual gratification, we need to have some other qualification on sexual gratification. We might have sexual gratification with consent, which would be fine. If we severed them, we would have to add another qualification.

It is clear that there is scope for further examination of clause 156. That does not necessarily mean it will be possible to change it, but it is worth examining further in the light of the comments made by my right hon. Friend. The testimony we heard from witnesses, the testimony of my right hon. Friend and what we heard earlier from the hon. Member for Pontypridd demonstrate that this is a widespread problem that is hugely distressing and intrusive and that represents a severe violation. It does need to be dealt with properly.

We need to be cognisant of the fact that in some communities there is a culture of these kinds of pictures being freely exchanged between people who have not met or communicated before—on some dating websites, for example. We need to draft the clause in such a way that it does not inadvertently criminalise those communities—I have been approached by members of those communities who are concerned.

Alex Davies-Jones

They have consent to do that.

Chris Philp

The hon. Member for Pontypridd says from a sedentary position that they have given consent. The consent is not built into the website’s terms and conditions; it is an assumed social norm for people on those websites. We need to tread carefully and be thoughtful, to ensure that by doing more to protect one group we do not inadvertently criminalise another.

There is a case for looking at the issue again. My right hon. Friend has made the point thoughtfully and powerfully, and in a way that suggests we can stay within the confines of the Law Commission’s advice, while being more thoughtful. I will certainly undertake to go away and do that, in consultation with my right hon. Friend and others.

Nick Fletcher (Don Valley) (Con)

I am pleased the Minister will go away and look at this. I am sure there are laws already in place that cover these things, but I know that this issue is very specific. An awful lot of the time, we put laws in place, but we could help an awful lot of people through education, although the last thing we want to do is victim blame. The Government could work with companies that provide devices to have them issued with AirDrop in contacts-only mode, as opposed to being open to everybody. That would stop an awful lot of people getting messages that they should not be receiving in the first place.

Chris Philp

My hon. Friend makes a very powerful and important point. Hopefully, people listening to our proceedings will hear that, as well as those working on media literacy—principally, Ofcom and the Government, through their media literacy strategy. We have had a couple of specific tips come out of today’s debate. My right hon. Friend the Member for Basingstoke and my hon. Friend the Member for Don Valley mentioned disabling a device’s AirDrop, or making it contacts-only. A point was also made about inadvertently sharing geolocations, whether through Snapchat or Strava. Those are two different but important points that the general public should be more aware of than they are.

--- Later in debate ---
Barbara Keeley

We have argued that changes to the legislation are long overdue to protect people from the harms caused by online communications offences. The clause and schedule 13 include necessary amendments to the legislation, so we do not oppose them standing part of the Bill.

Chris Philp

The clause cross-references schedule 13 and sets out amendments to existing legislation consequential on the communications offences in part 10. Schedule 13 has a number of consequential amendments, divided broadly into two parts. It makes various changes to the Sexual Offences Act 2003, amends the Regulatory Enforcement and Sanctions Act 2008 in relation to the Malicious Communications Act 1988, and makes various other changes, all of which are consequential on the clauses we have just debated. I therefore commend clause 158 and its associated schedule 13 to the Committee.

Question put and agreed to.

Clause 158 accordingly ordered to stand part of the Bill.

Schedule 13 agreed to.

Clause 159

Providers that are not legal persons

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to consider:

Government amendment 159.

Clauses 160 and 161 stand part.

That schedule 14 be the Fourteenth schedule to the Bill.

--- Later in debate ---
Alex Davies-Jones

Labour supports clause 159, because it is vital that the Bill includes provisions for Ofcom to issue a penalty notice or confirmation decision when the provider may not be a legal person in the traditional sense. We have repeatedly maintained that it is central to the success of the Bill that, once implemented, it properly and sufficiently gives Ofcom the relevant powers, autonomy and independence to properly pursue providers of regulated services and their wrongdoings.

We recognise the complexity of the service providers’ business models and therefore agree that the Bill must be broad enough to ensure that penalty notices and confirmation decisions can be given, even when the provider may constitute an association, or an organisation between a group of people. Ultimately, as we have made clear, Labour will continue to support giving the regulator the tools required to keep us all safe online.

We have already raised concerns over Ofcom’s independence and the interference of, and over-reliance on, the Secretary of State’s powers in the Bill as it stands. However, we are in agreement on clause 159 and feel that it provides a vital tool for Ofcom to have at its disposal should the need for a penalty notice or confirmation decision arise. That is why we support the clause and have not sought to amend it.

Government amendment 159, as we know, ensures that if the provider of a service consists of two or more individuals, those individuals are jointly liable to pay a fee demanded under new schedule 2. As I will come on to in my comments on clauses 160 and 161, we welcome the provisions and clarifications around liability for fees when the provider of a service consists of two or more individuals.

As with clause 159, we welcome the clarity of provisions in the Bill that confirm actions to be taken where a group of two or more individuals act together. It is absolutely right that where two or more individuals together are the providers of a regulated service, they should be jointly and severally liable for any duty, requirement or liability to pay a fee.

We also welcome the clarification that that liability and joint responsibility will also apply in the event of a penalty notice or confirmation decision. We believe that these provisions are vital to capturing the true extent of where responsibility should lie, and we hope they will go some way to remedying the hands-off approach that service providers have managed to get away with for too long when it comes to regulation of the internet. We do, however, feel that the Government could have gone further, as we outlined in amendment 50, which we spoke to when we addressed clause 123.

Labour firmly believes that Ofcom’s ability to take action against non-compliance en masse is critical. That is why we welcome clause 160 and will not be seeking to amend it at this stage. We also fundamentally support clause 161, which contains provisions on how joint liability will operate.

We will speak to our concerns about supply chains when we debate a later clause—I believe it is new clause 13—because it is vital that this Bill captures the challenges around supply chain failures and where responsibility lies. With that in mind, we will support clause 161, with a view to the Minister understanding our broader concerns, which we will address when we debate new clause 13.

Finally, schedule 14 establishes that decisions or notices can be given jointly to both a regulated provider and its parent company. We particularly support the confirmation that all relevant entities must be given the opportunity to make representations when Ofcom seeks to establish joint liability, including on the matters contained in the decision or notice and whether joint liability would be appropriate.

As we have made clear, we see the provisions outlined in this schedule as fundamental to Ofcom’s ability to issue truly meaningful decisions, penalties and notices to multiple parties. The fact that, in this instance, service providers will be jointly liable to comply is key to capturing the extent to which it has been possible to perpetuate harm online for so long. That is why we support the intention behind schedule 14 and have not sought to amend it.

Chris Philp Portrait Chris Philp
- Hansard - -

The shadow Minister has set out clearly the purpose of and intent behind these clauses, and how they work, so I do not think I will add anything. I look forward to our future debate on the new clause.

There is one point of correction that I wish to make, and it relates to a question that the hon. Member for Aberdeen North asked this morning and that is germane to amendment 159. That amendment touches on the arrangements for recouping the set-up costs that Ofcom incurs prior to the Bill receiving Royal Assent. The hon. Member for Aberdeen North asked me over what time period those costs would be collected, and I answered slightly off the cuff. Now that I have had a chance to dig through the papers, I will take this opportunity to confirm exactly how that works.

To answer the question a little bit better than I did this morning, the place to go is today’s amendment paper. The relevant provisions are on page 43 of the amendment paper, in paragraph 7(5) of Government new schedule 2, which we will debate later. If we follow the drafting through—this is quite a convoluted trail to follow—it states that the cost can be recouped over a period that is not less than three years and not more than five years. I hope that gives the hon. Member for Aberdeen North a proper answer to her question from this morning, and I hope it provides clarity and points to where in the new schedule the information can be found. I wanted to take the first opportunity to clarify that point.

Beyond that, the hon. Member for Pontypridd has summarised the provisions in this group very well, and I have nothing to add to her comments.

Question put and agreed to.

Clause 159 accordingly ordered to stand part of the Bill.

Clause 160

Individuals providing regulated services: liability

Amendment made: 159, in clause 160, page 133, line 6, after “71” insert

“or Schedule (Recovery of OFCOM’s initial costs)”.—(Chris Philp.)

This amendment ensures that, if the provider of a service consists of two or more individuals, those individuals are jointly liable to pay a fee demanded under NS2.

Clause 160, as amended, ordered to stand part of the Bill.

Clause 161 ordered to stand part of the Bill.

Schedule 14 agreed to.

Clause 162

Information offences: supplementary

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss clauses 163 to 165 stand part.

--- Later in debate ---
or can be attributed to the neglect of an officer, the officer also commits the offence. It is a welcome step indeed that the Bill captures both officer liability and, to a certain degree, group liability, in the form of partnership and unincorporated association. We are happy to support clause 165 and have not sought to make changes at this stage.
Chris Philp Portrait Chris Philp
- Hansard - -

Once again, the shadow Minister has described the various clauses in this group. They speak, as she said, to the important and very strong measures around information offences. It is so important that where someone fails to provide the information that Ofcom requires, not only is there a liability on the company to pay very large fines or have its service cut off, as we discussed earlier, but individuals have criminal liability as well.

Clause 162 gives further information about how information-related criminal offences operate and how criminal proceedings can be brought against a person who fails to comply with an information notice or a requirement imposed when Ofcom exercises its powers of entry and inspection. Clause 163 goes further to explain how defences to accusations of criminal offences can operate, and it is helpful to have that clearly set out.

Clause 164 allows for corporate officers of regulated providers to be found liable for offences committed by the provider under the Act. For example, corporate officers can also be found liable for information offences committed by their company. That is extremely important, because it means that senior personnel can be held liable even where they are not named by their company in an information response. That means the most senior executives will have their minds focused on making sure the information requirements are properly met.

Clause 165 provides further information about how information-related criminal offences will operate under the Bill when the regulated provider is not a legal person—when it is, for example, a partnership or an unincorporated association. I hope the clauses give the specificity and clarification required to operate the personal criminal liability, which gives the enforcement powers in the Bill such strong teeth.

Question put and agreed to.

Clause 162 accordingly ordered to stand part of the Bill.

Clauses 163 to 165 ordered to stand part of the Bill.

Clause 166

Extra-territorial application

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Labour welcomes clause 166, which specifies that references to regulated services and Ofcom’s information-gathering powers apply to services provided from outside the United Kingdom as well as to services provided from within the United Kingdom. While we recognise the challenges around internet regulation in the UK, we live in a global world, and we are pleased that the legislation has been drawn up in a way that will capture services based overseas.

We feel the Bill is lacking in its ability to regulate against content that may have originated from outside the UK. While it is welcome that regulated services based abroad will be within scope, we have concerns that that will do little to capture specific content that may not originate within the UK. We have raised these points at length in previous debates, so I will not dwell on them now, but the Minister knows that the Bill will continue to fall short when it does not capture, for example, child sexual exploitation and abuse content that was filmed and originated abroad. That is a huge loophole, which will allow harmful content to be present and to be perpetuated online well into the future. Although we support clause 166 for now, I urge the Minister to reconsider his view on how all-encompassing the current approach to content can be as he considers his Department’s strategy before Report.

Clause 167 outlines that the information offences in the Bill apply to acts done in the United Kingdom and outside the United Kingdom. We welcome its provisions, but we feel that the Government could go further. We welcome the clarification that it will be possible to prosecute information offences in any part of the UK as if they occurred there. Given the devastating pressures that our legal system already faces thanks to this Government’s cuts and shambolic approach to justice, such flexibility is crucial and a welcome step forward.

Chris Philp Portrait Chris Philp
- Hansard - -

Last week or the week before, we debated extensively the points about extraterritorial application as it relates to protecting children, and I made it clear that the Bill protects people as we would wish it to.

Clause 166 relates to extraterritorial enforceability. It is important to make sure that the duties, enforceable elements and sanctions apply worldwide, reflecting the realities of the internet, and clause 166 specifies that references to regulated services in the Bill include services provided from outside the United Kingdom. That means that services based overseas, as well as those in the UK, must comply if they reach UK users.

The clause ensures that Ofcom has effective information-gathering powers and can seek information from in-scope companies overseas for the purposes of regulating and enforcing the regime. Obviously, companies such as Facebook are firmly in scope, as hon. Members would expect. The clause makes it clear that Ofcom can request information held outside the UK and interview individuals outside the UK, if that is necessary for its investigations.

Clause 167 explains that the information-related personal criminal offences in the Bill—for example, failing to comply with Ofcom’s information notices—apply to acts done inside and outside the UK. That means that those offences can be criminally prosecuted whether the perpetrator is based in the UK or outside the UK. That will send a clear message to the large global social media firms that no matter where they may be based in the world or where their services may be provided from, we expect them to comply and the enforcement provisions in the Bill will apply to them.

Question put and agreed to.

Clause 166 accordingly ordered to stand part of the Bill.

Clause 167 ordered to stand part of the Bill.

Ordered, That further consideration be now adjourned.—(Steve Double.)

Online Safety Bill (Fifteenth sitting)

Chris Philp Excerpts
Committee stage
Thursday 23rd June 2022

(1 year, 10 months ago)

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 23 June 2022 - (23 Jun 2022)
None Portrait The Chair
- Hansard -

Good morning, ladies and gentlemen. Please ensure your phones are switched to silent.

Clause 168

Publication by OFCOM

Question proposed, That the clause stand part of the Bill.

Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - -

It is a pleasure to serve under your chairmanship, Sir Roger. Clause 168 is a very short and straightforward clause. Ofcom will be required to publish a variety of documents under the Online Safety Bill. The clause simply requires that this be done in a way that is appropriate and likely to bring the documents to the attention of any audience affected by them. Ofcom is already familiar with this type of statutory obligation through existing legislation, such as the Digital Economy Act 2017, which places similar obligations on Ofcom. Ofcom is well versed in publishing documents in a way that is publicly accessible. Clause 168 puts the obligation on a clear statutory footing.

Barbara Keeley Portrait Barbara Keeley (Worsley and Eccles South) (Lab)
- Hansard - - - Excerpts

As the Minister said, clause 168 rightly sets out that the material the Bill requires Ofcom to publish must be presented in a way that will bring it to the attention of any audience likely to be affected by it. It will be important that all the guidance is published in a way that is easily available and accessible, including for people who are not neurotypical or who experience digital exclusion. I think we would all agree, after the work we have done on the Bill, that the subjects are complex and the landscape is difficult to understand. I hope Ofcom will make its documents as accessible as possible.

Question put and agreed to.

Clause 168 accordingly ordered to stand part of the Bill.

Clause 169

Service of notices

Question proposed, That the clause stand part of the Bill.

Chris Philp Portrait Chris Philp
- Hansard - -

Clause 169 sets out the process for the service of any notice under the Bill, including notices to deal with child sexual exploitation and abuse or terrorism content, information notices, enforcement notices, penalty notices and public statement notices to providers of regulated services both within and outside the United Kingdom. The clause sets out that Ofcom may give a notice to a person by handing it to them, leaving it at the person’s last known address, sending it by post to that address or sending it by email to the person’s email address. It provides clarity regarding who Ofcom must give notice to in respect of different structures. For example, notice may be given to an officer of a body corporate.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

As the Minister said, clause 169 sets out the process of issuing notices or decisions by Ofcom. It mostly includes provisions about how Ofcom is to contact the company, which seem reasonable. The Opposition do not oppose clause 169.

Question put and agreed to.

Clause 169 accordingly ordered to stand part of the Bill.

Clause 170

Repeal of Part 4B of the Communications Act

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to consider clauses 171 and 172.

Chris Philp Portrait Chris Philp
- Hansard - -

Clause 170 repeals the video-sharing platform regime. While the VSP and online safety regimes have similar objectives, the new framework in the Bill will apply to a wider range of online platforms. It is for that reason that we will repeal the VSP regime and transition the entities regulated as VSPs across to the online safety regime, which is broader and more effective in its provisions. The clause simply sets out the intention to repeal the VSP regime.

Clause 171 repeals part 3 of the Digital Economy Act 2017. As we have discussed previously, the Online Safety Bill now captures all online sites that display pornography, including commercial pornography sites, social media sites, video-sharing platforms, forums and search engines. It will provide much greater protection to children than the Digital Economy Act, which was criticised for not covering social media platforms; this Bill does cover them. By repealing part 3, we are laying the path to regulating properly and more comprehensively.

Finally, in this group, clause 172 amends section 1B of the Protection of Children Act 1978 and creates a defence to the offence of making an indecent photograph of a child for Ofcom, its staff and those assisting Ofcom in exercising its online safety duties. Clearly, we do not want to criminalise Ofcom staff while they are discharging their duties under the Bill that we are imposing on them, so it is reasonable to set out that such a defence exists. I hope that provides clarity to the Committee on the three clauses.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

The provisions in clauses 170 to 172, as the Minister has said, repeal or amend existing laws for the purposes of the Bill. As Labour supports the need to legislate on the issue of online safety, we will not oppose the clauses. However, I want to note that the entire process, up until the final abandonment of part 3 of the Digital Economy Act under clause 171, appears shambolic. It has been five years now since that part of the Act could have been implemented, which means five years during which children could have been better protected from the harms of pornographic content.

When the Government eventually admitted that part 3 was being ditched, the Minister at the time, the hon. Member for Boston and Skegness (Matt Warman), said that the Government would seek to take action on pornography more quickly than on other parts of the online harms regime. Stakeholders and charities have expressed concerns that we could now see a delay to the implementation of the duties on pornographic content providers, which is similar to the postponement and eventual abandonment of part 3 of the Digital Economy Act. I appreciate that the Minister gave some reassurance of his

“desire to get this done as quickly as possible”—[Official Report, Online Safety Bill Committee, 9 June 2022; c. 308.]

in our debate on clauses 31 to 33, but would it not be better to set out timeframes in the Bill?

Under clause 193, it appears that the only clauses in part 5 to be enacted once the Bill receives Royal Assent will be the definitions—clause 66 and clause 67(4)—and not the duties. That is because Ofcom is expected to issue a call for evidence, after which draft proposals for consultation are published, which then need to be agreed by the Secretary of State and laid before Parliament. There are opportunities there for delays and objections at any stage and, typically, enforcement will be implemented only in a staged fashion, from monitoring to supervision. The consultations and safeguarding processes are necessary to make the guidance robust; we understand that. However, children cannot wait another three years for protections, having been promised protection under part 3 of the Digital Economy Act five years ago, which, as I have said, was never implemented.

The provisions on pornography in part 5 of the Bill require no secondary legislation so they should be implemented as quickly as possible to minimise the amount of time children continue to be exposed to harmful content. It would be irresponsible to wait any longer than absolutely necessary, given the harms already caused by this drawn-out process.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - - - Excerpts

Thank you, Sir Roger, for chairing this meeting this morning. I want to agree with the Opposition’s points about the timing issue. If one Act is to repeal another, we need to make sure that there is no gap in the middle: if the repeal takes effect on one day, the Bill’s corresponding provisions should be in force and working on that same day, rather than leaving a potential set-up time gap.

On clause 170 and repealing the part of the Communications Act 2003 on video-sharing platform services, some concerns have been raised that the requirements in the Online Safety Bill do not exactly mirror the provisions in the video-sharing platform rules. I am not saying necessarily or categorically that the Online Safety Bill is less strong than the video-sharing platform rules currently in place. However, the concern is that if the legislation on video-sharing platform services is repealed, the Online Safety Act, as it will be, will become the main way of regulating video-sharing platforms, and there could be a degradation in the protections provided on those platforms and an increase in some of the issues and concerns we have seen raised. Will the Minister keep that under review and consider how that could be improved? We do not want to see this getting worse simply because one regime has been switched for another that, as the Minister said, is broader and has stronger protections. Will he keep under review whether that turns out to be the case once the Act has bedded in and Ofcom has the ability to take action and properly regulate—particularly, in this case, video-sharing platforms?

Chris Philp Portrait Chris Philp
- Hansard - -

I agree with the hon. Member for Worsley and Eccles South that we want to see these provisions brought into force as quickly as possible, for the reasons that she set out. We are actively thinking about ways of ensuring that these provisions are brought into force as fast as possible. It is something that we have been actively discussing with Ofcom, and that, I hope, will be reflected in the road map that it intends to publish before the summer. That will of course remain an area of close working between the Department for Digital, Culture, Media and Sport and Ofcom, ensuring that these provisions come into force as quickly as possible. Of course, the illegal duties will be brought into force more quickly. That includes the CSEA offences set out in schedule 6.

The hon. Member for Aberdeen North raised questions in relation to the repeal of part 3 of the Digital Economy Act. Although that is on the statute book, it was never commenced. When it is repealed, we will not be removing from force something that currently applies, because the statutory instrument to commence it was never laid. The point she raised about whether the Bill would come into force the day after the Digital Economy Act is repealed therefore does not apply; but the point she raised about bringing this legislation into force quickly is reasonable and right, and we will work on that.

The hon. Lady asked about the differences in scope between the video-sharing platform and the online safety regime. As I said, the online safety regime does have an increased scope compared with the VSP regime, but I think it is reasonable to keep an eye on that as she suggested, and keep it under review. There is of course a formal review mechanism in clause 149, but I think that more informally, it is reasonable that as the transition is made we keep an eye on it, as a Government and as parliamentarians, to ensure that nothing gets missed out.

I would add that, separately from the Bill, the online advertising programme is taking a holistic look at online advertising in general, and it will also consider matters that touch on the VSPs and what they regulate.

Question put and agreed to.

Clause 170 accordingly ordered to stand part of the Bill.

Clauses 171 and 172 ordered to stand part of the Bill.

Clause 173

Powers to amend section 36

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to take clauses 174 to 176 stand part.

Chris Philp Portrait Chris Philp
- Hansard - -

The clause gives the Secretary of State the power to amend the list of fraudulent offences in section 36 in connection with the duties on fraudulent advertising. These are the new duties that were introduced following feedback from Parliament, the Joint Committee, Martin Lewis and many other people. That is to ensure that we can keep the list of fraudulent offences up to date. The power to make those changes is subject to some constraints, as we would expect. The clause lists the criteria that any new offence must meet before the Secretary of State can include it in the section 36 list: the prevalence on category 1 services of paid-for advertisements that amount to the new offence, and the risk and severity of harm that such content poses to individuals in the UK.

The clause further limits the Secretary of State’s power to include new fraud offences, listing types of offence that may not be added. Offences from the Consumer Protection from Unfair Trading Regulations would be one instance. As I mentioned, the power to update section 36 is necessary to ensure that the legislation is future-proofed against new legislation and changes in criminal behaviour. Hon. Members have often said that it is important to ensure that the Bill is future-proof, and here is an example of exactly that future-proofing.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I have a couple of questions, particularly on clause 176 and the powers to amend schedules 6 and 7. I understand the logic for schedule 5 being different—in that terrorism offences are a wholly reserved matter—and therefore why only the Secretary of State would be making any changes.

My question is on the difference in the ways to amend schedules 6 and 7—I am assuming that Government amendment 126, which asks the Secretary of State to consult Scottish Ministers and the Department of Justice in Northern Ireland, and which we have already discussed, will be voted on and approved before we come to clause 176. I do not understand the logic for having different procedures to amend the child sexual exploitation and abuse offences and the priority offences. Why have the Government chosen two different procedures for amending the two schedules?

I understand why that might not be a terribly easy question to answer today, and I would be happy for the Minister to get in touch afterwards with the rationale. It seems to me that both areas are very important, and I do not quite understand why the difference is there.

Chris Philp Portrait Chris Philp
- Hansard - -

Let me start by addressing the questions the shadow Minister raised about these powers. She used the phrase “free rein” in her speech, but I would not exactly describe it as free rein. If we turn to clause 179, which we will come to in a moment or two, and subsection (1)(d), (e), (f) and (g), we see that all the regulations made under clauses 173 to 176, which we are debating, require an SI under the affirmative procedure. Parliament will therefore get a chance to have its say, to object and indeed to vote down a provision if it wishes to. It is not that the Secretary of State can act alone; changes are subject to the affirmative SI procedure.

It is reasonable to have a mechanism to change the lists of priority offences and so on by affirmative SI, because the landscape will change and new offences will emerge, and it is important that we keep up to date. The only alternative is primary legislation, and a slot for a new Act of Parliament does not come along all that often—perhaps once every few years for any given topic. I think that would lead to long delays—potentially years—before the various exemptions, lists of priority offences and so on could be updated. I doubt that it is Parliament’s intention, and it would not be good for the public if we had to wait for primary legislation to change the lists. The proposed mechanism is the only sensible and proportionate way to do it, and it is subject to a parliamentary vote.

A comment was made about Ofcom’s independence. The way the offences are defined has no impact on Ofcom’s operational independence. That is about how Ofcom applies the rules; this is about what the rules themselves are. It is right that we are able to update them relatively nimbly by affirmative SI.

The hon. Member for Aberdeen North asked about the differences in the way schedules 6 and 7 can be updated. I will happily drop her a line with further thoughts if she wants me to, but in essence we are happy to get the Scottish child sexual exploitation and abuse offences, set out in part 2 of schedule 6, adopted as soon as Scottish Ministers want. We do not want to delay any measures on child exploitation and abuse, and that is why it is done automatically. Schedule 7, which sets out the other priority offences, could cover any topic at all—any criminal offence could fall under that schedule—whereas schedule 6 is only about child sexual exploitation and abuse. Given that the scope of schedule 7 takes in any criminal offence, it is important to consult Scottish Ministers if it is a Scottish offence but then use the statutory instrument procedure, which applies it to the entire UK internet. Does the hon. Lady want me to write to her, or does that answer her question?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

That is actually incredibly helpful. I do not need a further letter, thanks.

Chris Philp Portrait Chris Philp
- Hansard - -

I am grateful to the hon. Lady for saving DCMS officials a little ink, and electricity for an email.

I hope I have addressed the points raised in the debate, and I commend the clause to the Committee.

Question put and agreed to.

Clause 173 accordingly ordered to stand part of the Bill.

Clauses 174 and 175 ordered to stand part of the Bill.

Clause 176

Powers to amend Schedules 5, 6 and 7

Amendment made: 126, in clause 176, page 145, line 4, at end insert—

“(5A) The Secretary of State must consult the Scottish Ministers before making regulations under subsection (3) which—

(a) add an offence that extends only to Scotland, or

(b) amend or remove an entry specifying an offence that extends only to Scotland.

(5B) The Secretary of State must consult the Department of Justice in Northern Ireland before making regulations under subsection (3) which—

(a) add an offence that extends only to Northern Ireland, or

(b) amend or remove an entry specifying an offence that extends only to Northern Ireland.”—(Chris Philp.)

This amendment ensures that the Secretary of State must consult the Scottish Ministers or the Department of Justice in Northern Ireland before making regulations which amend Schedule 7 in connection with an offence which extends to Scotland or Northern Ireland only.

Clause 176, as amended, ordered to stand part of the Bill.

Clause 177

Power to make consequential provision

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Clause 178 stand part.

Government amendment 160.

Clause 179 stand part.

Chris Philp Portrait Chris Philp
- Hansard - -

As new services and functions emerge and evolve, and platforms and users develop new ways to interact online, the regime will need to adapt. Harms online will also continue to change, and the framework will not function effectively if it cannot respond to these changes. These clauses provide the basis for the exercise of the Secretary of State’s powers under the Bill to make secondary legislation. The Committee has already debated the clauses that confer the relevant powers.

Clause 177 gives the Secretary of State the power to make consequential changes to this legislation or regulations made under it. It further provides that the regulations may amend or repeal relevant provisions made under the Communications Act 2003 or by secondary legislation made under that Act. The power is necessary to give effect to the various regulation-making powers in the Bill, which we have mostly already debated, and to ensure that the provisions of the 2003 Act and regulations that relate to online safety can continue to be updated as appropriate. That is consistent with the principle that the Bill must be flexible and future-proof. The circumstances in which these regulation-making powers may be exercised are specified and constrained by the clauses we have previously debated. Clause 178 ensures that the regulation-making powers in the Bill may make different provision for different purposes, in particular ensuring that regulations make appropriate provision for different types of service.

Amendment 160 forms part of a group of amendments that will allow Ofcom to recover costs from the regulated services for work that Ofcom carries out before part 6 of the Bill is commenced. As I said previously, the costs may be recouped over a period of three to five years. Currently, the costs of preparations for the exercise of safety functions include only costs incurred after commencement. The amendment makes sure that initial costs incurred before commencement can be recouped as well.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I rise briefly to support amendment 76, in the name of the hon. Member for Aberdeen North. Labour supports broadening the definition of “content” in this way. I refer the Minister to our earlier contributions about the importance of including newspaper comments, for example, in the scope of the Bill. This is a clear example of a key loophole in the Bill. We believe that a broadened definition of “content” would be a positive step to future-proof the Bill and prevent unnecessary harm from future content.

Chris Philp Portrait Chris Philp
- Hansard - -

The shadow Minister, in her first contribution to the debate, introduced the broad purpose of the various clauses in this group, so I do not propose to repeat those points.

I would like to touch on one or two issues that came up. One is that clause 187 defines the meaning of “harm” throughout the Bill, although clause 150, as we have discussed, has its own internal definition of harm that is different. The more general definition of harm is made very clear in clause 187(2), which states:

“‘Harm’ means physical or psychological harm.”

That means that harm has a very broad construction in the Bill, as it should, to make sure that people are being protected as they ought to be.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

In one of our earlier debates, I asked the Minister about the difference between “oral” and “aural”, and I did not get a very satisfactory answer. I know the difference in their dictionary definition—I understand that they are different, although the words sound the same. I am confused that clause 189 uses “oral” as part of the definition of content, but clause 49 refers to

“one-to-one live aural communications”

in defining things that are excluded.

I do not understand why the Government have chosen to use those two different words in different places in the Bill. It strikes me that, potentially, we mean one or the other. If they do mean two different things, why has one thing been chosen for clause 49 and another thing for clause 189? Why has the choice been made that clause 49 relates to communications that are heard, but clause 189 relates to communications that are said? I do not quite get the Government’s logic in using those two different words.

I know this is a picky point, but in order to have good legislation, we want it to make sense, for there to be a good rationale for everything that is in it and for people to be able to understand it. At the moment, I do not properly understand why the choice has been made to use two different words.

More generally, the definitions in clause 189 seem pretty sensible, notwithstanding what I said in the previous debate in respect of amendment 76, which, with your permission, Sir Roger, I intend to move when we reach the appropriate point.

Chris Philp Portrait Chris Philp
- Hansard - -

As the hon. Member for Pontypridd said, clause 189 sets out various points of definition and interpretation necessary for the Bill to be understood and applied.

I turn to the question raised by the hon. Member for Aberdeen North. First, I strongly commend and congratulate her on having noticed the use of the two words. Anyone who thinks that legislation does not get properly scrutinised by Parliament has only to look to the fact that she spotted this difference, 110 pages apart, in two different clauses—clauses 49 and 189. That shows that these things do get properly looked at. I strongly congratulate her on that.

I think the best way of addressing her question is probably to follow up with her after the sitting. Clause 49 relates to regulated user-to-user content. We are in clause 49(2)—is that right?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Subsection (5).

Chris Philp Portrait Chris Philp
- Hansard - -

It is cross-referenced in subsection (5). The use of the term “aural” in that subsection refers to sound only—what might typically be considered telephony services. “Oral” is taken to cover livestreaming, which includes pictures and voice. That is the intention behind the use of the two different words. If that is not sufficient to explain the point—it may not be—I would be happy to expand in writing.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

That would be helpful, in the light of the concerns I raised and what the hon. Member for Pontypridd mentioned about gaming, and how those communications work on a one-to-one basis. Having clarity in writing on whether clause 49 relates specifically to telephony-type services would be helpful, because that is not exactly how I read it.

Chris Philp Portrait Chris Philp
- Hansard - -

Given that the hon. Lady has raised the point, it is reasonable that she requires more detail. I will follow up in writing on that point.

Amendment proposed: 76, in clause 189, page 154, line 34, after “including” insert “but not limited to”.—(Kirsty Blackman.)

This amendment clarifies the definition of “content” in the Bill in order that anything communicated by means of an internet service is considered content, not only those examples listed.

Question put, That the amendment be made.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Labour has not tabled any amendments to clause 190, which lists the provisions that define or explain terms used in the Bill. However, it will come as no surprise that we dispute the Bill’s definition of harm, and I am grateful to my hon. Friend the Member for Batley and Spen for raising those important points in our lively debate about amendment 112 to clause 150. We maintain that the Minister has missed the point, in that the Bill’s definition of harm fails to truly capture physical harm caused as a consequence of being online. I know that the Minister has promised to closely consider that as we head to Report stage, but I urge him to bear in mind the points raised by Labour, as well as his own Back Benchers.

The Minister knows, because we have repeatedly raised them, that we have concerns about the scope of the Bill’s provisions relating to priority content. I will not repeat myself, but he will be unsurprised to learn that this is an area in which we will continue to prod as the Bill progresses through Parliament.

Chris Philp Portrait Chris Philp
- Hansard - -

I have made points on those issues previously. I do not propose to repeat now what I have said before.

Question put and agreed to.

Clause 190 accordingly ordered to stand part of the Bill.

Clause 191 ordered to stand part of the Bill.

Clause 192

Extent

Chris Philp Portrait Chris Philp
- Hansard - -

I beg to move amendment 141, in clause 192, page 160, line 9, at end insert—

“(aa) section (Offence under the Obscene Publications Act 1959: OFCOM defence);”.

This amendment provides for NC35 to extend only to England and Wales.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss Government new clause 35—Offence under the Obscene Publications Act 1959: OFCOM defence

“(1) Section 2 of the Obscene Publications Act 1959 (prohibition of publication of obscene matter) is amended in accordance with subsections (2) and (3).

(2) After subsection (5) insert—

“(5A) A person shall not be convicted of an offence against this section of the publication of an obscene article if the person proves that—

(a) at the time of the offence charged, the person was a member of OFCOM, employed or engaged by OFCOM, or assisting OFCOM in the exercise of any of their online safety functions (within the meaning of section 188 of the Online Safety Act 2022), and

(b) the person published the article for the purposes of OFCOM’s exercise of any of those functions.”

(3) In subsection (7)—

(a) the words after “In this section” become paragraph (a), and

(b) at the end of that paragraph, insert “;

(b) “OFCOM” means the Office of Communications.””

This new clause (to be inserted after clause 171) amends section 2 of the Obscene Publications Act 1959 to create a defence for OFCOM and their employees etc to the offence of the publication of an obscene article.

Chris Philp Portrait Chris Philp
- Hansard - -

New clause 35 amends section 2 of the Obscene Publications Act 1959 to create a defence for Ofcom to the offence of publishing an obscene article where Ofcom is exercising its online safety duties. Ofcom has a range of functions that may result in its staff handling such content, so we want to ensure that that is covered properly. We have debated that already.

Clause 192 covers territorial extent. The regulation of the internet, as a reserved matter, covers all of the United Kingdom, but particular parts of the Bill extend to particular parts of the UK. Because the Obscene Publications Act 1959 has its own, narrower territorial applicability, amendment 141 ensures that the new defence extends only to the relevant parts of the United Kingdom. The clause and our amendments are important in ensuring that that is done in the right way.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I welcome the hon. Member’s intervention, and I am grateful for her and her party’s support for this important amendment.

It is also worth drawing colleagues’ attention to the history of these issues, which have been raised in this place before. We know there was reluctance on the part of Ministers, when the Digital Economy Act 2017 was on the parliamentary agenda, to commence the all-important part 3, which covered many of the provisions now in part 5. Ultimately, the empty promises made by the Minister’s former colleagues have led to huge, record failures, even though the industry is ready, having had years to prepare to implement the policy. I want to place on record my thanks to campaigning groups such as the Age Verification Providers Association and others, which have shown fierce commitment in getting us this far.

It might help if I cast colleagues’ minds back to the Digital Economy Act 2017, which received Royal Assent in April of that year. Following that, in November 2018, the then Minister of State for Digital and Creative Industries told the Science and Technology Committee that part 3 of the DEA would be in force “by Easter next year”. Then, in December 2018, both Houses of Parliament approved the necessary secondary legislation, the Online Pornography (Commercial Basis) Regulations 2018, and the required statutory guidance.

But shortly after, in April 2019, the first delay arose when the Government published an online press release stating that part 3 of the DEA would not come into force until 15 July 2019. However, June 2019 came around and still there was nothing. On 20 June, five days after it should have come into force, the then Under-Secretary of State told the House of Lords that the Government had failed to notify the European Commission of the statutory guidance, which would need to be done, and that that would result in a delay to the commencement of part 3

“in the region of six months”.—[Official Report, House of Lords, 20 June 2019; Vol. 798, c. 883.]

However, on 16 October 2019, the then Secretary of State announced via a written statement to Parliament that the Government

“will not be commencing part 3 of the Digital Economy Act 2017 concerning age verification for online pornography.”—[Official Report, 16 October 2019; Vol. 666, c. 17WS.]

A mere 13 days later, the Government called a snap general election. I am sure those are pretty staggering realities for the Minister to hear—and defend—but I am willing to listen to his defence. It really is not good enough. The industry is ready, the technology has been there for quite some time, and, given this Government’s fondness for a U-turn, there are concerns that part 5 of the Bill, which we have spent weeks deliberating, could be abandoned in a similar way as part 3 of the DEA was.

The Minister has failed to concede on any of the issues we have raised in Committee. It seems we are dealing with a Government who are ignoring the wide-ranging gaps and issues in the Bill. He has a relatively last-ditch opportunity to at least bring about some positive change, and to signify that he is willing to admit that the legislation as it stands is far from perfect. The provisions in part 5 are critical—they are probably the most important in the entire Bill—so I urge him to work with Labour to make sure they are put to good use in a more than reasonable timeframe.

Chris Philp Portrait Chris Philp
- Hansard - -

On the implementation of part 3 of the Digital Economy Act 2017, all the events that the shadow Minister outlined predated my time in the Department. In fact, apart from the last few weeks of the period she talked about, the events predated my time as a Minister in different Departments, and I cannot speak for the actions and words of Ministers prior to my arrival in DCMS. What I can say, and I have said in Committee, is that we are determined to get the Bill through Parliament and implemented as quickly as we can, particularly the bits to do with child safety and the priority illegal content duties.

The shadow Minister commented at the end of her speech that she thought the Government had been ignoring parliamentary opinion. I take slight issue with that, given that we published a draft Bill in May 2021 and went through a huge process of scrutiny, including by the Joint Committee of the Commons and the Lords. We accepted 66 of the Joint Committee’s recommendations and made other very important changes to the Bill, such as addressing fraudulent advertising, which was previously omitted, and bringing commercial pornography into scope, which is critical to protecting children.

The Government have made a huge number of changes to the Bill since it was first drafted. Indeed, we have made further changes while the Bill has been before the Committee, including amending clause 35 to strengthen the fraudulent advertising duties on large search companies. Members of Parliament, such as the right hon. Member for East Ham (Sir Stephen Timms), raised that issue on Second Reading. We listened to what was said at that stage and we made the changes.

There have also been quite a few occasions during these Committee proceedings when I have signalled—sometimes subtly, sometimes less so—that there are areas where further changes might be forthcoming as the Bill proceeds through both Houses of Parliament. I do not think the hon. Member for Pontypridd, or any member of the Committee, should be in any doubt that the Government are very open to making changes to the Bill where we are able to and where they are right. We have done so already and we might do so again in the future.

On the specifics of the amendment, we share the intention to protect children from accessing pornography online as quickly as possible. The amendment seeks to set a three-month timeframe within which part 5 must come into force. However, an important consideration for the commencement of part 5 will be the need to ensure that all kinds of providers of online pornography are treated the same, including those hosting user-generated content, which are subject to the duties of part 3. If we take a piecemeal approach, bringing into force part 5, on commercial pornography, before part 3, on user-to-user pornography, that may enable some of the services, which are quite devious, to simply reconfigure their services to circumvent regulation or cease to be categorised as part 5 services and try to be categorised as part 3 services. We want to do this in a comprehensive way to ensure that no one will be able to wriggle out of the provisions in the Bill.

Parliament has also placed a requirement on Ofcom to produce, consult on and publish guidance for in-scope providers on meeting the duties in part 5. The three-month timescale set out in the amendment would be too quick to enable Ofcom to properly consult on that guidance. It is important that the guidance is right; if it is not, it may be legally challenged or turn out to be ineffective.

I understand the need to get this legislation implemented quickly. I understand the scepticism that flows from the long delays and eventual cancellation of part 3 of the Digital Economy Act 2017. I acknowledge that, and I understand where the sentiment comes from. However, I think we are in a different place today. The provisions in the Bill have been crafted to address some of the concerns that Members had about the previous DEA measures—not least the fact that they are more comprehensive, as they cover user-to-user services, which the DEA did not. There is therefore a clear commitment to getting this done, and getting it done fast. However, we also have to get it done right, and I think the process we have set out does that.

The Ofcom road map is expected before the summer. I hope that will give further reassurance to the Committee and to Parliament about the speed with which these things can get implemented. I share Members’ sentiments about needing to get this done quickly, but I do not think it is practical or right to do it in the way set out in amendment 49.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I am grateful for the Minister’s comments. However, I respectfully disagree, given the delays already since 2017. The industry is ready for this. The providers of the age verification services are ready for this. We believe that three months is an adequate timeframe, and it is vital that we get this done as quickly as possible. With that in mind, I will be pushing amendment 49 to a vote.

Question put, That the amendment be made.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - -

This very important and concise clause sets out that the Bill, when passed, will be cited as the Online Safety Act 2022, which I hope is prophetic when it comes to the lightning speed of its passage through the House of Lords.

Question put and agreed to.

Clause 194 accordingly ordered to stand part of the Bill.

New Clause 35

Offence under the Obscene Publications Act 1959: OFCOM defence

“(1) Section 2 of the Obscene Publications Act 1959 (prohibition of publication of obscene matter) is amended in accordance with subsections (2) and (3).

(2) After subsection (5) insert—

‘(5A) A person shall not be convicted of an offence against this section of the publication of an obscene article if the person proves that—

(a) at the time of the offence charged, the person was a member of OFCOM, employed or engaged by OFCOM, or assisting OFCOM in the exercise of any of their online safety functions (within the meaning of section 188 of the Online Safety Act 2022), and

(b) the person published the article for the purposes of OFCOM’s exercise of any of those functions.’

(3) In subsection (7)—

(a) the words after ‘In this section’ become paragraph (a), and

(b) at the end of that paragraph, insert ‘;

(b) “OFCOM” means the Office of Communications.’”—(Chris Philp.)

This new clause (to be inserted after clause 171) amends section 2 of the Obscene Publications Act 1959 to create a defence for OFCOM and their employees etc to the offence of the publication of an obscene article.

Brought up, read the First and Second time, and added to the Bill.

New Clause 42

Recovery of OFCOM’s initial costs

“Schedule (Recovery of OFCOM’s initial costs) makes provision about fees chargeable to providers of regulated services in connection with OFCOM’s recovery of costs incurred on preparations for the exercise of their online safety functions.”—(Chris Philp.)

This new clause introduces NS2.

Brought up, and read the First time.

Chris Philp Portrait Chris Philp
- Hansard - -

I beg to move, That the clause be read a Second time.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss Government new clause 43 and Government new schedule 2.

Chris Philp Portrait Chris Philp
- Hansard - -

New clause 42 introduces new schedule 2. New clause 43 provides that the additional fees charged to providers under new schedule 2 must be paid into the Consolidated Fund. We discussed that a few days ago. That is where the fees are currently destined, and I owe my right hon. Friend the Member for Basingstoke some commentary on this topic in due course. The Bill already provided that monetary penalties must be paid into the Consolidated Fund; those provisions are now placed in this clause.

New schedule 2, which is quite detailed, makes provisions in connection with Ofcom’s ability to recover its initial costs, which we have previously debated. As discussed, it is important not only that the taxpayer is protected from the ongoing costs but that the set-up costs are recovered. The taxpayer should not have to pay for the regulatory framework; the people who are being regulated should pay, whether the costs are incurred before or after commencement, in line with the “polluter pays” principle. Deep in new schedule 2 is the answer to the question that the hon. Member for Aberdeen North asked a day or two ago about the period over which set-up costs can be recovered, with that period specified as between three and five years. I hope that provides an introduction to the new clauses and the new schedule.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

We welcome this grouping, which includes two new clauses and a new schedule. Labour has raised concerns about the future funding of Ofcom more widely, specifically when we discussed groupings on clause 42. The Minister’s response did little to alleviate our concerns about the future of Ofcom’s ability to raise funds to maintain its position as the regulator. Despite that, we welcome the grouping, particularly the provisions in the new schedule, which will require Ofcom to seek to recover the costs it has incurred when preparing to take on functions as the regulator of services under the Bill by charging fees to providers of services. This is an important step, which we see as being broadly in line with the kind of mechanisms already in place for other, similar regulatory regimes.

Ultimately, it is right that fees charged to providers under new schedule 2 must be paid into the Consolidated Fund, and it is important that Ofcom can recover its costs before a full fee structure and governance process is established. However, I have some questions for the Minister. How many people has Ofcom hired into roles, and can any of those costs count towards the calculation of fees? We want to ensure that other areas of regulation do not lose out as a consequence. Broadly speaking, though, we are happy to support the grouping and have not sought to table amendments at this stage.

Chris Philp Portrait Chris Philp
- Hansard - -

So far as I am aware, all the costs incurred by Ofcom in relation to the duties in the Bill can be recouped by way of fees. If that is not correct, I will write to the hon. Lady saying so, but my understanding is that any relevant Ofcom cost will be in the scope of the fees.

Question put and agreed to.

New clause 42 accordingly read a Second time, and added to the Bill.

New Clause 43

Payment of sums into the Consolidated Fund

“(1) Section 400 of the Communications Act (destination of penalties etc) is amended as follows.

(2) In subsection (1), after paragraph (i) insert—

‘(j) an amount paid to OFCOM in respect of a penalty imposed by them under Chapter 6 of Part 7 of the Online Safety Act 2022;

(k) an amount paid to OFCOM in respect of an additional fee charged under Schedule (Recovery of OFCOM’s initial costs) to the Online Safety Act 2022.’

(3) In subsection (2), after ‘applies’ insert ‘(except an amount mentioned in subsection (1)(j) or (k))’.

(4) After subsection (3) insert—

‘(3A) Where OFCOM receive an amount mentioned in subsection (1)(j) or (k), it must be paid into the Consolidated Fund of the United Kingdom.’

(5) In the heading, omit ‘licence’.”—(Chris Philp.)

This new clause provides that additional fees charged to providers under NS2 must be paid into the Consolidated Fund. The Bill already provided that monetary penalties must be paid into the Consolidated Fund, and those provisions are now placed in this clause.

Brought up, read the First and Second time, and added to the Bill.

New Clause 3

Establishment of Advocacy Body

“(1) There is to be a body corporate (‘the Advocacy Body’) to represent interests of child users of regulated services.

(2) A ‘child user’—

(a) means any person aged 17 years or under who uses or is likely to use regulated internet services; and

(b) includes both any existing child user and any future child user.

(3) The work of the Advocacy Body may include—

(a) representing the interests of child users;

(b) the protection and promotion of these interests;

(c) any other matter connected with those interests.

(4) The ‘interests of child users’ means the interest of children in relation to the discharge by any regulated company of its duties under this Act, including—

(a) safety duties about illegal content, in particular CSEA content;

(b) safety duties protecting children;

(c) ‘enforceable requirements’ relating to children.

(5) The Advocacy Body must have particular regard to the interests of child users that display one or more protected characteristics within the meaning of the Equality Act 2010.

(6) The Advocacy Body will be defined as a statutory consultee for OFCOM’s regulatory decisions which impact upon the interests of children.

(7) The Secretary of State may appoint an organisation known to represent children to be designated the functions under this Act, or may create an organisation to carry out the designated functions.”—(Barbara Keeley.)

This new clause creates a new advocacy body for child users of regulated internet services.

Brought up, and read the First time.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

New clause 3 would make provision for a statutory user advocacy body representing the interests of children. It would also allow the Secretary of State to appoint a new or existing body as the statutory user advocate. A strong, authoritative and well-resourced voice that can speak for children in regulatory debates would ensure that complex safeguarding issues are well understood, and would also actively inform the regulator’s decisions.

Charities have highlighted that the complaints and reporting mechanisms in the Bill may not always be appropriate for children. Ofcom’s own evidence shows that only 14% of 12 to 15-year-old children have ever reported content. Children who are most at risk of online harms may find it incredibly challenging to complete a multi-stage reporting and complaints process. Dame Rachel de Souza told the Committee:

“I worry that the Bill does not do enough to respond to individual cases of abuse and that it needs to do more to understand issues and concerns directly from children. Children should not have to exhaust the platforms’ ineffective complaints routes, which can take days, weeks or even months. I have just conducted a survey of 2,000 children and asked them about their experiences in the past month. Of those 2,000 children, 50% had seen harmful content and 40% had tried to get content about themselves removed and had not succeeded. For me, there is something really important about listening to children and taking their complaints into account.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 16, Q22.]

A children’s advocacy body would be able to support children with redress mechanisms that are fundamentally targeted at adults. Given how many children now use the internet, that is an essential element that is missing from the Bill. That is why the super-complaints mechanism needs to be strengthened with specific arrangements for children, as advocated by the National Society for the Prevention of Cruelty to Children and other children’s organisations. A statutory user advocacy body could support the regulator, as well as supporting child users. It would actively promote the interests of children in regulatory decision making and offer support by ensuring that an understanding of children’s behaviour and safeguarding is front and centre in its approach.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - -

Let me start by stating the fact that this Bill, as drafted, rightly has incredibly strong protections for children. The children’s safety duties that we have already debated are extremely strong. They apply to any platform with significant numbers of children using it and they impose a duty on such companies to protect children from harm. The priority illegal safety duties are listed in schedule 6, on child sexual exploitation and abuse offences—they have their very own schedule because we attach such importance to them. Committee members should be in no doubt that protecting children is at the very heart of the Bill. I hope that has been obvious from the debates we have had.

On children’s ability to raise complaints and seek redress under the Bill, it is worth reminding ourselves of a couple of clauses that we have debated previously, through which we are trying to make sure it is as easy as possible for children to report problematic content or to raise complaints. Members will recall that we debated clause 17. Clause 17(6)(c) allows for

“a parent of, or other adult with responsibility for, a child”

to raise content-reporting claims with providers, so that children are not left on their own. We have also been clear under the complaints procedures set out in clause 18(2)(c) that those procedures must be

“easy to access, easy to use (including by children)”.

That is an explicit reference to accessibility for children.

The hon. Member for Aberdeen North has also already referred to the fact that in both the children’s risk assessment duties and the adults’ risk assessment duties people’s characteristics, including whether they are a member of a particular group, have to be taken into account. The children’s risk assessment duties are set out in clause 10(6)(d). Children with particular characteristics—orientation, race and so on—have to be particularly considered. The fact that a clause on the children’s risk assessment duties even exists in the first place shows that specific and special consideration has to be given to children and the risks they face. That is hardwired right into the architecture of the Bill.

All the provisions that I have just mentioned—starting with clause 10 on children’s risk assessment duties, right through to the end of the Bill and the priority offences in schedule 6, on child sexual exploitation and abuse offences—show that, right throughout the whole Bill, the protection of children is integral to what we are trying to do with the Bill.

On the consultation that happened in forming and framing the Bill, really extensive engagement and consultation took place throughout the preparation of this piece of legislation, including direct consultation with children themselves, their parents and the many advocacy groups for children. There should be no doubt at all that children have been thoroughly consulted as the Bill has been prepared.

On the specifics of new clause 3, which relate to advocacy for children, as the hon. Member for Aberdeen North referred to in passing a moment ago, there is a mechanism in clause 140 for organisations that represent particular groups, such as children, to raise super-complaints with Ofcom when there is a problem. In fact, when we debated that clause, I used children as an example when I spoke about the “eligible entities” that can raise super-complaints—I used the NSPCC speaking for children as a specific example of the organisations I would expect the term “eligible entity” to include. Clause 140 explicitly empowers organisations such as the NSPCC and others to speak for children.

--- Later in debate ---
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

I agree wholeheartedly about the importance of the role of the Children’s Commissioner and she does a fantastic job, but is it not testament to the fact that there is a need for this advocacy body that she is advocating for it and thinks it is a really good idea? The Children Act 2004 is a fantastic Act, but that was nearly 20 years ago and the world has changed significantly since then. The Bill shows that. The fact that she is advocating for it may suggest that she sees the need for a separate entity.

Chris Philp Portrait Chris Philp
- Hansard - -

There is a danger if we over-create statutory bodies with overlapping responsibilities. I just read out the current statutory functions of the Children’s Commissioner under the 2004 Act. If we were to agree to the new clause, we would basically be creating a second statutory advocate or body with duties that are the same as some of those that the Children’s Commissioner already exercises. I read from section 2 of the Act, where those duties are set out. I do not think that having two people with conflicting or competing duties would be particularly helpful.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I am grateful to the Minister for his support for Labour legislation. Does he acknowledge that we have different Children’s Commissioners across the nations of the UK? Each would have the same rights to advocate for children, so we would have four, rather than one focusing on one specific issue, which is what the Children’s Commissioners across the UK are advocating for.

Chris Philp Portrait Chris Philp
- Hansard - -

I do not have in front of me the relevant devolved legislation—I have only the Children Act 2004 directly in front of me—but I assume it is broadly similar. The hon. Member for Aberdeen North can correct me if I am wrong, but I assume it is probably broadly similar in the way—[Interruption.] She is not sure, so I do not feel too bad about not being sure either. I imagine it is similar. I am not sure that having similar statutory bodies with the same function—we would create another with the new clause—is necessarily helpful.

The Bill sets out formal processes that allow other organisations, such as the NSPCC, to raise complaints that have to be dealt with. That ensures that the voices of groups—including children, but not just children—will be heard. I suspect that if we have a children’s advocacy body, other groups will want them and might feel that they have been overlooked by omission.

The good thing about the way the super-complaint structure in clause 140 works is that it does not prescribe what the groups are. Although I am sure that children will be top of the list, there will be other groups that want to advocate and to be able to bring super-complaints. I imagine that women’s groups will be on that list, along with groups advocating for minorities and people with various sexual orientations. Clause 140 is not exclusive; it allows all these groups to have a voice that must be heard. That is why it is so effective.

My right hon. Friend the Member for Basingstoke and the hon. Member for Batley and Spen asked whether the groups have enough resources to advocate on issues under the super-complaint process. That is a fair question. The allocation of funding to different groups tends to be done via the spending review process. Colleagues in other Departments—the Department for Education or, in the case of victims, the Ministry of Justice—allocate quite a lot of money to third-sector groups. The victims budget was approximately £200 million a year or two ago, and I am told it has risen to £300 million for the current financial year. That is the sort of funding that can find its way into the hands of the organisations that advocate for particular groups of victims. My right hon. Friend asked whether the proceeds of fines could be applied to fund such work, and I have undertaken to raise that with the Treasury.

We already have a statutory advocate for children: the four Children’s Commissioners for the four parts of the United Kingdom. We have the super-complaints process, which covers more than children’s groups, crucial though they are. We have given Ofcom statutory duties to consult when developing its codes of practice, and we have money flowing, via the Ministry of Justice, the DFE and others, into advocacy groups. Although we agree with the intention behind new clause 3, we believe its objectives are very well covered via the mechanisms that I have just set out at some length.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

There have not been all that many times during the debate on the Bill when the Minister has so spectacularly missed the point as he has on this section. I understand everything he said about provisions already being in place to protect children and the provisions regarding the super-complaints, but the new clause is not intended to be a replacement for the super-complaints procedure, which we all support—in fact, we have tried to strengthen that procedure. The new clause is intended to be an addition—another, very important layer.

Unfortunately, I do not have at the front of my mind the legislation that set up the Children’s Commissioner for Scotland, or the one for England. The Minister talked through some of the provisions and phrasing in the Children Act 2004. He said that the role of the Children’s Commissioner for England is to encourage bodies to act positively on behalf of children—to encourage. There is no requirement for the body to act in the way the Children’s Commissioner says it should act. Changes have been made in Wales establishing the Future Generations Commissioner, who has far more power.

Chris Philp Portrait Chris Philp
- Hansard - -

As far as I can tell, the user advocacy body proposed in new clause 3 would not have the ability to compel Ofcom either.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

But it would be a statutory consultee that is specifically mentioned in this provision. I cannot find in the Bill a provision giving Ofcom a statutory duty to consult the four Children’s Commissioners. The new clause would make the children’s advocacy body a statutory consultee in decisions that affect children.

Chris Philp Portrait Chris Philp
- Hansard - -

The Bill will require Ofcom to consult people who represent the interests of children. Although not named, it would be astonishing if the first people on that list were not the four Children’s Commissioners when developing the relevant codes of practice. The statutory obligation to consult those groups when developing codes of practice and, indeed, guidance is set out in clauses 37(6)(d) and 69(3)(d).

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

That is very helpful, but there are still shortcomings in what the Minister says. The Bill, as drafted, requires Ofcom to require things of other organisations. Some of the detail is in the Bill, some of the detail will come in secondary legislation and some of the detail will come in the codes of practice published by Ofcom. We broadly agree that the Bill will ensure people are safer on the internet than they currently are, but we do not have all the detail on the Government’s intent. We would like more detail on some things, but we are not saying, “We need every little bit of detail.” If we did, the Bill would not be future-proof. We would not be able to change and update the Bill if we required everything to be in the Bill.

The Bill is not a one-off; it will continually change and grow. Having a user advocacy body would mean that emerging threats could quickly be brought to Ofcom’s attention. Unlike the Children’s Commissioners, who have a hundred other things to do, this body’s entire purpose would be to advocate on behalf of children online. The Children’s Commissioners do an amazing job, but this is not their No. 1 priority. If the Minister wants this to be a world-leading Bill, its No. 1 priority should be to protect the human rights of children.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - -

I think the hon. Lady is being a little unfair to the Children’s Commissioners. Dame Rachel de Souza is doing a fantastic job of advocating specifically in the digital sphere. She really is doing a fantastic job, and I say that as a Minister. I would not say she is leaving any gaps.

These digital children’s safety issues link to wider children’s safety issues that exist offline, such as sexual exploitation, grooming and so on, so it is useful that the same person advocates for children in both the offline and online worlds.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The new clause asks for an additional body. It is not saying the Children’s Commissioners should be done away with. The Children’s Commissioners do an amazing job, as we have recognised, but the No. 1 priority, certainly for the Children’s Commissioner in Scotland, is to protect the human rights of children; it is not to protect children online, which is what the user advocacy body would do. The body would specifically give the benefit of its experience and specifically use its resources, time and energy to advocate between Ofcom, children and children’s organisations and groups.

The Minister is right that the Bill takes massive steps forward in protecting children online, and he is right that the Children’s Commissioners do a very good job. The work done by the Children’s Commissioners in giving us evidence on behalf of children and children’s organisations has been incredibly powerful and incredibly helpful, but there is still a layer missing. If this Bill is to be future-proof, if it is to work and if it is not to put an undue burden on charitable organisations, we need a user advocacy body. The Minister needs to consider that.

I appreciate that the Government provide money to victim support organisations, which is great, but I am also making a case about potential victims. If the money only goes to those who support people who have already been harmed, it will not allow them to advocate to ensure that more people are not harmed. It will allow them to advocate on behalf of those who have been harmed—absolutely—but it will not effectively tackle potential and emerging harms. It is a key place where the Bill misses out. I am quite disappointed that the Minister has not recognised that something may be lacking and is so keen to defend his position, because it seems to me that the position of the Opposition is so obviously the right one.

Online Safety Bill (Sixteenth sitting)

Chris Philp Excerpts
Committee stage
Tuesday 28th June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text
Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - - - Excerpts

Thank you for chairing this meeting, Sir Roger. I have a quick question for the Minister that relates to the new clause, which is a reasonable request for a duty on providers to disclose information to Ofcom. We would hope that the regulator had access to that information, and if companies are making significant changes, it is completely reasonable that they should have to tell Ofcom.

I do not have any queries or problems with the new clause; it is good. My question for the Minister is—I am not trying to catch anyone out; I genuinely do not know the answer—if a company makes significant changes to something that might impact on its safety duties, does it have to do a new risk assessment at that point, or does it not have to do so until the next round of risk assessments? I do not know the answer, but it would be good if the direction of travel was that any company making drastic changes that massively affected security—for example, Snapchat turning on the geolocation feature when it did an update—would have to do a new risk assessment at that point, given that significant changes would potentially negatively impact on users’ safety and increase the risk of harm on the platform.

Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - -

It is a pleasure, as always, to serve under your chairmanship, Sir Roger. As the hon. Member for Worsley and Eccles South said, the new clause is designed to introduce a duty on providers to notify Ofcom of anything that Ofcom could reasonably be expected to be notified of.

The Bill already has extremely strong information disclosure provisions. I particularly draw the Committee’s attention to clause 85, which sets out Ofcom’s power to require information by provision of an information notice. If Ofcom provides an information notice—the particulars of which are set out in clause 86—the company has to abide by that request. As the Committee will recall, the strongest sanctions are reserved for the information duties, extending not only to fines of up to 10% of global revenue or service discontinuation—unplugging the website, as it were—but also to personal criminal liability for named executives, with prison sentences of up to two years. We take those information duties extremely seriously, which is why the sanctions are as strong as they are.

The hon. Member for Aberdeen North asked what updates would occur if there were a significant design change. I draw the Committee’s attention to clause 10, which deals with children’s risk assessment duties, but there are similar duties in relation to illegal content and the safety of adults. The duty set out in clause 10(2), which cross-refers to schedule 3, makes it clear. The relevant words are “suitable and sufficient”. Clearly if there were a massive design change that would, in this case, adversely affect children, the risk assessment would not be suitable and sufficient if it were not updated to reflect that design change. I hope that answers the hon. Lady’s question.

Turning to the particulars of the new clause, if we incentivise companies to disclose information they have not been asked for by Ofcom, there is a danger that they might, through an excessive desire to comply, over-disclose and provide a torrent of information that would not be very helpful. There might also be a risk that some companies that are not well intentioned would deliberately dump enormous quantities of data in order to hide things within it. The shadow Minister, the hon. Member for Worsley and Eccles South, mentioned an example from the world of financial services, but the number of companies potentially within the scope of the Bill is so much larger than even the financial services sector. Some 25,000 companies may be in scope, a number that is much larger—probably by one order of magnitude, and possibly by two—than the financial services sector regulated by the FCA. That disparity in scale makes a significant difference.

Given that there are already strong information provision requirements in the Bill, particularly clause 85, and because of the reasons of scale that I have mentioned, I will respectfully resist the new clause.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

We believe that the platforms need to get into disclosure proactively, and that this is a reasonable clause, so we will push it to a vote.

Question put, That the clause be read a Second time.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I have a couple of comments on the point about parental empowerment. I have been asked by my children for numerous apps. I have a look at them and think, “I don’t know anything about this app. I have never seen or heard of it before, and I have no idea of the level of user-to-user functionality in this app.” Nowhere is there a requirement for this information to be set out. There is nowhere that parents can easily find this information.

With iPhones, if a kid wants an app, they have to request it from their parent, and their parent needs to approve whether or not they get it. I find myself baffled by some of them because they are not ones that I have ever heard of or come across. To find out whether they have that level of functionality, I have to download and use the app myself in the way that, hopefully, my children would use it, so that I can judge whether it is safe for them.

A requirement for category 1 providers to be up front and explain the risks and how they manage them, and even how people interact with their services, would increase the ability of parents to be media literate. We can be as media literate as we like, but if the information is not there and we cannot find it anywhere, we end up having to make incredibly restrictive decisions in relation to our children’s ability to use the internet, which we do not necessarily want to make. We want them to be able to have fun, and the information being there would be very helpful, so I completely agree on that point.

My other point is about proportionality. The Opposition moved new clause 4, relating to risk assessments, and I did not feel able to support it on the basis of the arguments that the Minister made about proportionality. He made the case that Ofcom would receive 25,000 risk assessments and would be swamped by the number that it might receive. This new clause balances that, and has the transparency that is needed.

It is completely reasonable for us to put the higher burden of transparency on category 1 providers and not on other providers because they attract the largest market share. A huge percentage of the risk that might happen online happens with category 1 providers, so I am completely happy to support this new clause, which strikes the right balance. It answers the Minister’s concerns about Ofcom being swamped, because only category 1 providers are affected. Asking those providers to put the risk assessment on their site is the right thing to do. It will mean that there is far more transparency and that people are better able to make informed decisions.

Chris Philp Portrait Chris Philp
- Hansard - -

I understand the intention behind the new clause, but I want to draw the Committee’s attention to existing measures in the Bill that address this matter. I will start with the point raised by the hon. Member for Aberdeen North, who said that as a parent she would like to be able to see a helpful summary of what the risks are prior to her children using a new app. I am happy to say to her that that is already facilitated via clause 13(2), which appears at the top of page 13. There is a duty there

“to summarise in the terms of service the findings of the most recent adults’ risk assessment of a service”,

including the levels of risk, and the nature and severity of those risks. That relates specifically to adults, but there is an equivalent provision relating to children as well.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I just gently say that if there is a requirement for people to sign up or begin to go through the sign-up process in order to see the terms of service, that is not as open and transparent. That is much more obstructive than it could be. A requirement for providers to make their terms of service accessible to any user, whether or not they were registered, would assist in the transparency.

Chris Philp Portrait Chris Philp
- Hansard - -

I think the terms of service are generally available to be viewed by anyone. I do not think people have to be registered users to view the terms of service.

In addition to the duty to summarise the findings of the most recent risk assessment in relation to adults in clause 13(2), clause 11 contains obligations to specify in the terms of service, in relation to children, where children might be exposed to risks using that service. I suggest that a summary in the terms of service, which is an easy place to look, is the best way for parents or anybody else to understand what the risks are, rather than having to wade through a full risk assessment. Obviously, the documents have not been written yet, because the Bill has not been passed, but I imagine they would be quite long and possibly difficult to digest for a layperson, whereas a summary is more readily digestible. Therefore, I think the hon. Lady’s request as a parent is met by the duties set out in clause 11, and the duties for adults are set out in clause 13.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

Good morning, Sir Roger. As my hon. Friend the Member for Worsley and Eccles South mentioned when speaking to new clause 11, Labour has genuine concerns about supply chain risk assessment duties. That is why we have tabled new clause 13, which seeks to ensure enforcement of liability for supply chain failures that amount to a breach of one of the specified duties, drawing on existing legislation.

As we know, platforms, particularly those supporting user-to-user generated content, often employ services from third parties. At our evidence sessions we heard from Danny Stone of the Antisemitism Policy Trust that this has included Twitter explaining that racist GIFs were not its own but were provided by another service. The hands-off approach that platforms have managed to get away with for far too long is exactly what the Bill is trying to fix, yet without this important new clause we fear there will be very little change.

We have already raised issues with the reliance on third party providers more widely, particularly content moderators, but the same problems also apply to some types of content. Labour fears a scenario in which a company captured by the regulatory regime established by the Bill will argue that an element of its service is not within the ambit of the regulator simply because it is part of a supply chain, represented by, but not necessarily the responsibility of, the regulated services.

The contracted element, supported by an entirely separate company, would argue that it is providing business-to-business services. That is not user-to-user generated content per se but content designed and delivered at arm’s length, provided to the user-to-user service to deploy to its users. The result would likely be a lengthy, costly and unhelpful legal process during which systems could not be effectively regulated. The same may apply in relation to moderators, where complex contract law would need to be invoked.

We recognise that supply chain issues are already addressed elsewhere in UK legislation. The Bribery Act 2010, for example, says that a company is liable if anyone performing services for or on the company’s behalf is found culpable of specific actions. We therefore strongly urge the Minister to consider this new clause. We hope he will see the extremely compelling reasons why liability should be introduced for platforms failing to ensure that associated parties, considered to be a part of a regulated service, help to fulfil and abide by relevant duties.

Chris Philp Portrait Chris Philp
- Hansard - -

The new clause seeks to impose liability on a provider where a company providing regulated services on its behalf does not comply with the duties in the Bill. The provider would be liable regardless of whether it has any control over the service in question. We take the view that this would impose an unreasonable burden on businesses and cause confusion over which companies are required to comply with the duties in the Bill.

As drafted, the Bill ensures legal certainty and clarity over which companies are subject to duties. Clause 180 makes it clear that the Bill’s duties fall on companies with control over the regulated service. The point about who is in control is very important, because the liability should follow the control. These companies are responsible for ensuring that any third parties, such as contractors or individuals involved in running the service, are complying with the Bill’s safety duties, so that they cannot evade their duties in that way.

Companies with control over the regulated service are best placed to keep users safe online, assess risk, and put in place systems and processes to minimise harm, and therefore bear the liability if there is a transgression under the Bill as drafted. Further, the Bill already contains robust provisions in clause 161 and schedule 14 that allow Ofcom to hold parent and subsidiary companies jointly liable for the actions of other companies in a group structure. These existing mechanisms promote strong compliance within groups of companies and ensure that the entities responsible for breaches are the ones held responsible. That is why we feel the Bill as drafted achieves the relevant objectives.

Question put, That the clause be read a Second time.

--- Later in debate ---
Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - - - Excerpts

I rise to speak in favour of new clauses 14 to 16, on media literacy. As we have discussed in Committee, media literacy is absolutely vital to ensure that internet users are aware of the tools available to protect themselves. Knowledge and understanding of the risks online, and how to protect against them, are the first line of defence for us all.

We all know that the Bill will not eliminate all risk online, and it will not entirely clean up the internet. Therefore, ensuring that platforms have robust tools in place, and that users are aware of them, is one of the strongest tools in the Bill to protect internet users. As my hon. Friend the Member for Pontypridd said, including the new clauses in the Bill would help to ensure that we all make decisions based on sound evidence, rather than on poorly informed opinions that can harm not just individuals but democracy itself. The new clauses, which would place a duty on Ofcom to promote media literacy and publish a strategy, are therefore crucial.

I am sure we all agree about the benefits of public health information that informs us of the role of a healthy diet and exercise, and of ways that we can adopt a healthier lifestyle. I do not want to bring up the sensitive subject of the age of members of the Committee, as it got me into trouble with some of my younger colleagues last week, but I am sure many of us will remember the Green Cross Code campaign, the stop smoking campaigns, the anti-drink driving ads, and the powerful campaign to promote the wearing of seatbelts—“Clunk click every trip”. These were publicly funded and produced information campaigns that have stuck in our minds and, I am sure, protected thousands of lives across the country. They laid out the risks and clearly stated the actions we all need to take to protect ourselves.

When it comes to online safety, we need a similar mindset to inform the public of the risks and how we can mitigate them. Earlier in Committee, the right hon. Member for Basingstoke, a former Secretary of State for Digital, Culture, Media and Sport, shared her experience of cyber-flashing and the importance of knowing how to turn off AirDrop to prevent such incidents from occurring in the first place. I had no idea about this simple change that people can make to protect themselves from such an unpleasant experience. That is the type of situation that could be avoided with an effective media literacy campaign, which new clauses 14 to 16 would legislate for.

I completely agree that platforms have a significant duty to design and implement tools for users to protect themselves while using platforms’ services. However, I strongly believe that only a publicly funded organisation such as Ofcom can effectively promote their use, explain the dangers of not using them and target such information at the most vulnerable internet users. That is why I wholeheartedly support these vital new clauses.

Chris Philp Portrait Chris Philp
- Hansard - -

The Government obviously recognise and support the intent behind the new clause, which is to make sure that work is undertaken by Ofcom specifically, and the Government more widely, on media literacy. That is important for the reasons laid out by the hon. Members for Aberdeen North and for Batley and Spen.

Ofcom already has a statutory duty to promote media literacy in relation to electronic media, which includes everything in scope of the Bill and more beyond. That is set out in the Communications Act 2003, so the statutory duty exists already. The duty proposed in new clause 14 is actually narrower in scope than the existing statutory duty on Ofcom, and I do not think it would be a very good idea to give Ofcom an online literacy duty with a narrower scope than the one it has already. For that reason, I will resist the amendment, because it narrows the duties rather than widens them.

I would also point out that a number of pieces of work are being done non-legislatively. The campaigns that the hon. Member for Batley and Spen mentioned—dating often, I think, back to the 1980s—were of course done on a non-legislative basis and were just as effective for it. In that spirit, Ofcom published “Ofcom’s approach to online media literacy” at the end of last year, which sets out how Ofcom plans to expand, and is expanding, its media literacy programmes, which cover many of the objectives specified in the new clause. Therefore, Ofcom itself has acted already—just recently—via that document.

Finally, I have two points about what the Government are doing. First, about a year ago the Government published their own online media literacy strategy, which has been backed with funding and is being rolled out as we speak. When it comes to disinformation more widely, which we have debated previously, we also have the counter-disinformation unit working actively on that area.

Therefore, through the Communications Act 2003, the statutory basis exists already, and on a wider basis than in these new clauses; and, through the online media literacy strategy and Ofcom’s own approach, as recently set out, this important area is well covered already.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

We feel that we cannot have an online safety Bill without a core digital media literacy strategy. We are disappointed that clause 103 was removed from the draft Bill. We do not feel that the current regime, under the Communications Act 2003, is robust enough. Clearly, the Government do not think it is robust enough either, which is why they tried to replace it in the first place. We are sad to see that provision now dropped altogether. We fully support these new clauses.

Question put, That the clause be read a Second time.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - -

It is important to make clear how the Bill operates, and I draw the Committee’s attention in particular to clauses 23 to 26, which deal with the risk assessment and safety duties for search services. I point in particular to clause 23(5)(a), which deals with the risk assessment duties for illegal content. The provision makes it clear that those risk assessments have to be carried out

“taking into account (in particular) risks presented by algorithms used by the service”.

Clause 25 relates to children’s risk assessment duties, and subsection (5)(a) states that children’s risk assessment duties have to be carried out

“taking into account (in particular) risks presented by algorithms”.

The risks presented by algorithms are expressly accounted for in clauses 23 and 25 in relation to illegal content and to children. Those risk assessment duties flow into safety duties, as we know.

By coincidence, yesterday I met with Google’s head of search, who talked about the work Google is doing to ensure that its search work is safe. Google has the SafeSearch work programme, which is designed to make the prompts better constructed.

In my view, the purpose of the new clause is covered by existing provisions. If we were to implement the proposal—I completely understand and respect the intention behind it, by the way—there could be an unintended consequence in the sense that it would ban any reference in the prompts to protected characteristics, although people looking for help, support or something like that might find such prompts helpful.

Through a combination of the existing duties and the list of harms, which we will publish in due course, as well as legislating via statutory instrument, we can ensure that people with protected characteristics, and indeed other people, are protected from harmful prompts while not, as it were, throwing the baby out with the bathwater and banning the use of certain terms in search. That might cause an unintended negative consequence for some people, particularly those from marginalised groups who were looking for help. I understand the spirit of the new clause, but we shall gently resist it.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The Minister has highlighted clauses 23 and 25. Clause 25 is much stronger than clause 23, because clause 23 includes only illegal content and priority illegal content, whereas clause 25 goes into non-designated content that is harmful to children. Some of the things that we are talking about, which might not be on the verge of illegal, but which are wrong and discriminatory, might not fall into the categories of illegal or priority illegal content unless the search service, which presumably an organisation such as Google is, has a children’s risk assessment duty. Such organisations are getting a much easier ride in that regard.

I want to make the Minister aware of this. If he turns on Google SafeSearch, which excludes explicit content, and googles the word “oral” and looks at the images that come up, he will see that those images are much more extreme than he might imagine. My point is that, no matter the work that the search services are trying to do, they need to have the barriers in place before that issue happens—before people are exposed to that harmful or illegal content. The existing situation does not require search services to have enough in place to prevent such things from happening. The Minister was talking about moderation and things that happen after the fact in some ways, which is great, but that does not protect people from the harm that might occur. I very much wish to press the new clause to a vote.

Question put, That the clause be read a Second time.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

As we know, the new clause would give Ofcom a proactive role in identifying and responding to misinformation incidents that can occur in a moment of crisis. As we have discussed, there are huge gaps in the Bill’s ability to sufficiently arm Ofcom with the tools it will likely need to tackle information incidents in real time. It is all very well that the Bill will ensure that things such as risk assessments are completed, but, ultimately, if Ofcom is not able to proactively identify and respond to incidents in a crisis, I have genuine concerns about how effective this regulatory regime will be in the wider sense. Labour is therefore pleased to support the new clause, which is fundamental to ensuring that Ofcom can be the proactive regulator that the online space clearly needs.

The Government’s methods of tackling disinformation are opaque, unaccountable and may not even work. New clause 45, which would require reporting to Parliament, may begin to address this issue. When Ministers are asked how they tackle misinformation or disinformation harms, they refer to some unaccountable civil service team involved in state-based interference in online media.

I thank those at Carnegie UK Trust for their support when researching the following list, and for supporting my team and me to make sense of the Bill. First, we have the counter-disinformation unit, which is based in the Department for Digital, Culture, Media and Sport and intends to address mainly covid issues that breach companies’ terms of service and, recently, the Russia-Ukraine conflict. In addition, the Government information cell, which is based in the Foreign, Commonwealth and Development Office, focuses on war and national security issues, including mainly Russia and Ukraine. Thirdly, there is the so-called rapid response unit, which is based in the Cabinet Office, and mainly tackles proactive counter-messaging.

Those teams appear to nudge service providers in different ways where there are threats to national security or the democratic process, or risks to public health, yet we have zero record of their effectiveness. The groups do not publish logs of action to any external authority for oversight of what they raise with companies using the privileged authority of Her Majesty’s Government, nor do they publish the effectiveness of their actions. As far as we know, they are not rooted in expert independent external advisers. That direct state interference in the media is very worrying.

In our recent debate on amendment 83, which calls on the Government to include health misinformation and disinformation in the Bill, the Minister clearly set out why he thinks the situation is problematic. He said,

“We have established a counter-disinformation unit within DCMS whose remit is to identify misinformation and work with social media firms to get it taken down. The principal focus of that unit during the pandemic was, of course, covid. In the past three months, it has focused more on the Russia-Ukraine conflict, for obvious reasons.

In some cases, Ministers have engaged directly with social media firms to encourage them to remove content that is clearly inappropriate. For example, in the Russia-Ukraine context, I have had conversations with social media companies that have left up clearly flagrant Russian disinformation. This is, therefore, an area that the Government are concerned about and have been acting on operationally already.”––[Official Report, Online Safety Public Bill Committee, 14 June 2022; c. 408.]

Until we know more about those units, the boundary between their actions and that of a press office remains unclear. In the new regulatory regime, Ofcom needs to be kept up to date on the issues they are raising. The Government should reform the system and bring those units out into the open. We support Carnegie’s longer term strategic goal to set up a new external oversight body and move the current Government functions under Ofcom’s independent supervision. The forthcoming National Security Bill may tackle that, but I will leave that for the Minister to consider.

There must be a reporting system that requires the Government to set out their operational involvement with social media companies to address misinformation and disinformation, which is why we have tabled new clause 45. I hope the Minister will see that the current efforts in these units are hugely lacking in the transparency that we all want and have learned is fundamental to keeping us all safe online.

Chris Philp Portrait Chris Philp
- Hansard - -

We agree that it is important that the Bill contains measures to tackle disinformation and misinformation that may emerge during serious information incidents, but the Bill already contains measures to address those, including the powers vested in the Secretary of State under clause 146, which, when debated, provoked some controversy. Under that clause, the Secretary of State will have the power to direct Ofcom when exercising its media literacy functions in the context of an issue of public health or safety or national security.

Moreover, Ofcom will be able to require platforms to issue a public statement about the steps they are taking to respond to a threat to public health or safety or to national security. As we discussed, it is appropriate that the Secretary of State will make those directions, given that the Government have the access to intelligence around national security and the relevant health information. Ofcom, as a telecoms regulator, obviously does not have access to that information, hence the need for the Secretary of State’s involvement.

--- Later in debate ---
Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

The transparency requirements set out in the Bill are welcome but limited. Numerous amendments have been tabled by the Opposition and by our colleagues in the SNP to increase transparency, so that we can all be better informed about the harms around us, and so that the regulator can determine what protections are needed for existing and emerging harms. This new clause is another important provision in that chain and I speak in support of it.

We know that there is research being undertaken all the time by companies that is never published—neither publicly nor to the regulator. As the hon. Member for Aberdeen North said, publishing research undertaken by companies is an issue championed by Frances Haugen, whose testimony last month the Committee will remember. A few years ago, Frances Haugen brought to the public’s attention the extent to which research is held by companies such as Facebook—as it was called then—and never reaches the public realm.

Billions of members of the public are unaware that they are being tracked and monitored by social media companies as subjects in their research studies. The results of those studies are only published when revealed by brave whistleblowers. However, their findings could help charities, regulators and legislators to recognise harms and help to make the internet a safer place. For example, Frances Haugen leaked one Facebook study that found that a third of teenage girls said Instagram made them feel worse about their bodies. Facebook’s head of safety, Antigone Davis, fielded questions on this issue from United States Senators last September. She claimed that the research on the impact of Instagram and Facebook on children’s health was “not a bombshell”. Senator Richard Blumenthal responded:

“I beg to differ with you, Ms Davis, this research is a bombshell. It is powerful, gripping, riveting evidence that Facebook knows of the harmful effects of its site on children and that it has concealed those facts and findings.”

It is this kind of cover-up that new clause 19 seeks to prevent.

I remind the Committee of one more example that Frances Haugen illustrated to us in her evidence last month. Meta conducts frequent analyses of the estimated age of its users, which is often different from the ages they submit when registering, both among adults and children. Frances told us that Meta does this so that adverts can be targeted more effectively. However, if Ofcom could request this data, as the new clause would require, it would give an important insight into how many under-13s were in fact creating accounts on Facebook. Ofcom should be able to access such information, so I hope hon. Members and the Minister will support the new clause as a measure to increase transparency and support greater protections for children.

Chris Philp Portrait Chris Philp
- Hansard - -

Let me start by saying that I completely agree with the premise of the new clause. First, I agree that these large social media companies are acting principally for motives of their own profit and not the public good. Secondly, I agree with the proposition that they are extremely secretive, and do not transparently and openly disclose information to the public, the Government or researchers, and that is a problem we need to solve. I therefore wholeheartedly agree with the premise of the hon. Member for Aberdeen North’s new clause and her position.

However, I am honestly a bit perplexed by the two speeches we have just heard, because the Bill sets out everything the hon. Members for Aberdeen North and for Worsley and Eccles South asked for in unambiguous, black and white terms on the face of the Bill—or black and green terms, because the Bill is published on green paper.

Clause 85 on page 74 outlines the power Ofcom has to request information from the companies. Clause 85(1) says very clearly that Ofcom may require a person

“to provide them with any information”—

I stress the word “any”—

“that they require for the purpose of exercising, or deciding whether to exercise, any of their online safety functions.”

Ofcom can already request anything of these companies.

For the avoidance of doubt, clause 85(5) lists the various purposes for which Ofcom can request information, and clause 85(5)(l)—on page 75, line 25—includes

“the purpose of carrying out research, or preparing a report, in relation to online safety matters”.

Ofcom can request anything, expressly including requesting information to carry out research, which is exactly what the hon. Member for Aberdeen North quite rightly asks for.

The hon. Lady then said, “What if they withhold information or, basically, lie?” Clause 92 on page 80 sets out the situation when people commit an offence. The Committee will see that clause 92(3)(a) states that a person “commits an offence” if

“the person provides information that is false in a material respect”.

Again, clause 92(5)(a) states that a person “commits an offence” if

“the person suppresses, destroys or alters, or causes or permits the suppression, destruction or alteration of, any information required to be provided.”

In short, if the person or company who receives the information request lies, or falsifies or destroys information, they are committing an offence that will trigger not only civil sanctions—under which the company can pay a fine of up to 10% of global revenue or be disconnected—but also a personal offence that is punishable by up to two years in prison.

I hope I have demonstrated that clauses 85 and 92 already clearly contain the powers for Ofcom to request any information, and that if people lie, destroy information or suppress information, as they do at the moment, as the hon. Member for Aberdeen North rightly says, that will be a criminal offence with full sanctions available. I hope that demonstrates to the Committee’s satisfaction that the Bill does this already, and that it is important that it does so for the reasons that the hon. Lady set out.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I have a question for the Minister that hopefully, given the Committee’s work, he might be able to answer. New clause 19(2)(b) would give Ofcom the power to require services to submit to it

“all research the service holds on a topic specified by OFCOM.”

Ofcom could say, “We would like all the research you have on the actual age of users.”

My concern is that clause 85(1) allows Ofcom to require companies to provide it

“with any information that they require for the purpose of exercising, or deciding whether to exercise, any of their online safety functions.”

Ofcom might not know what information the company holds. I am concerned that Ofcom is able to say, as it is empowered to do by clause 85(1), “Could you please provide us with the research piece you did on under-age users or on the age of users?”, instead of having a more general power to say, “Could you provide us with all the research you have done?” I am worried that the power in clause 85(1) is more specific.

Chris Philp Portrait Chris Philp
- Hansard - -

rose

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

If the Minister holds on for two seconds, he will get to make an actual speech. I am worried that the power is not general enough. I would very much like to hear the Minister confirm what he thinks.

Chris Philp Portrait Chris Philp
- Hansard - -

I am not going to make a full speech. I have conferred with colleagues. The power conferred by clause 85(1) is one to require any information in a particular domain. Ofcom does not have to point to a particular research report and say, “Please give me report X.” It can ask for any information that is relevant to a particular topic. Even if it does not know what specific reports there may be—it probably would not know what reports there are buried in these companies—it can request any information that is at all relevant to a topic and the company will be obliged to provide any information relevant to that request. If the company fails to do so, it will be committing an offence as defined by clause 92, because it would be “suppressing”, to use the language of that clause, the information that exists.

I can categorically say to the hon. Lady that the general ability of Ofcom is to ask for any relevant information—the word “any” does appear—and even if the information notice does not specify precisely what report it is, Ofcom does have that power and I expect it to exercise it and the company to comply. If the company does not, I would expect it to be prosecuted.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Given that clarification, I will not press the new clause. The Minister has made the case strongly enough and has clarified clause 85(1) to my satisfaction. I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Clause 23

Priority illegal content: violence against women and girls

“(1) For the purposes of this Act, any provision applied to priority illegal content should also be applied to any content which—

(a) constitutes,

(b) encourages, or

(c) promotes

violence against women or girls.

(2) ‘Violence against women and girls’ is defined by Article 3 of the Council of Europe Convention on Preventing Violence Against Women and Domestic Violence (‘the Istanbul Convention’).” —(Alex Davies-Jones.)

This new clause applies provisions to priority illegal content to content which constitutes, encourages or promotes violence against women and girls.

Brought up, and read the First time.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

This new clause would apply provisions applied to priority illegal content also to content that constitutes, encourages or promotes violence against women and girls. As it stands, the Bill is failing women and girls. In an attempt to tackle that alarming gap, the new clause uses the Istanbul convention definition of VAWG, given that the Home Secretary has so recently agreed to ratify the convention—just a decade after it was signed.

The Minister might also be aware that GREVIO—the Group of Experts on Action against Violence against Women and Domestic Violence—which monitors the implementation of the Istanbul convention, published a report in October 2021 on the digital dimension of violence against women and girls. It stated that domestic laws are failing to place the abuse of women and girls online

“in the context of a continuum of violence against women that women and girls are exposed to in all spheres of life, including in the digital sphere.”

The purpose of naming VAWG in the Bill is to require tech companies to be responsible for preventing and addressing VAWG as a whole, rather than limiting their obligations only to specific criminal offences listed in schedule 7 and other illegal content. It is also important to note that the schedule 7 priority list was decided on without any consultation with the VAWG sector. Naming violence against women and girls will also ensure that tech companies are held to account for addressing emerging forms of online hate, which legislation is often unable to keep up with.

We only need to consider accounts from survivors of online violence against women and girls, as outlined in “VAWG Principles for the Online Safety Bill”, published in September last year, to really see the profound impact that the issue is having on people’s lives. Ellesha, a survivor of image-based sexual abuse, was a victim of voyeurism at the hands of her ex-partner. She was filmed without her consent and was later notified by someone else that he had uploaded videos of her to Pornhub. She recently spoke at an event that I contributed to—I believe the right hon. Member for Basingstoke and others also did—on the launch of the “Violence Against Women and Girls Code of Practice”. I am sure we will come to that code of practice more specifically on Report. Her account was genuinely difficult to listen to.

This is an issue that Ellesha, with the support of EVAW, Glitch, and a huge range of other organisations, has campaigned on for some time. She says:

“Going through all of this has had a profound impact on my life. I will never have the ability to trust people in the same way and will always second guess their intentions towards me. My self confidence is at an all time low and although I have put a brave face on throughout this, it has had a detrimental effect on my mental health.”

Ellesha was informed by the police that they could not access the websites where her ex-partner had uploaded the videos, so she was forced to spend an immense amount of time trawling through all of the uploaded videos simply to identify herself. I can only imagine how distressing that must have been for her.

Pornhub’s response to the police inquiries was very vague in the first instance, and it later ignored all subsequent correspondence. Eventually the videos were taken down, likely by the ex-partner himself when he was released from the police station. Ellesha was told that Pornhub had only six moderators at the time—just six for the entire website—and both the platform and her ex-partner ultimately got away with allowing the damaging content to remain, even though the account was under his name and easily traced back to his IP address. That just is not good enough, and the Minister must surely recognise that the Bill fails women in its current form.

If the Minister needs any further impetus to genuinely consider the amendment, I point him to a BBC report from last week that highlighted how much obscene material of women and girls is shared online without their consent. The BBC’s Angus Crawford investigated Facebook accounts and groups that were seen to be posting pictures and videos of upskirting. Naturally, Meta—Facebook’s owner—said that it had a grip on the problem and that those accounts and groups had all been removed, yet the BBC was able to find thousands of users sharing material. Indeed, one man who posted videos of himself stalking schoolgirls in New York is now being investigated by the police. This is the reality of the internet; it can be a powerful, creative tool for good, but far too often it is used to do the complete opposite.

I hate to make this a gendered argument, but there is a genuine difference between the experiences of men and women online. Last week the Minister came close to admitting that when I queried whether he had ever received an unsolicited indecent picture. I am struggling to understand why he has failed to consider these issues in a Bill proposed by his Department.

The steps that the Government are taking to tackle violence against women and girls offline are broadly to be commended, and I welcome a lot of the initiatives. The Minister must see sense and do the right thing by also addressing the harms faced online. We have a genuine opportunity in the Bill to prevent violence against women and girls online, or at least to diminish some of the harms they face. Will he please do the right thing?

Chris Philp Portrait Chris Philp
- Hansard - -

The shadow Minister is right to raise the issue of women and girls being disproportionately—one might say overwhelmingly—the victims of certain kinds of abuse online. We heard my right hon. Friend the Member for Basingstoke, the shadow Minister and others set that out in a previous debate.

Tackling violence against women and girls has been a long-standing priority of the Government. Indeed, a number of important new offences have already been and are being created, with protecting women principally in mind—the offence of controlling or coercive behaviour, set out in the Serious Crime Act 2015 and amended in the Domestic Abuse Act 2021; the creation of a new stalking offence in 2012; a revenge porn offence in 2015; and an upskirting offence in 2019. All of those offences are clearly designed principally to protect women and girls, who are overwhelmingly the victims of those offences. Indeed, the cyber-flashing offence created by clause 156—the first time we have ever had such an offence in this jurisdiction—will, again, overwhelmingly benefit women and girls, who are the victims of that offence.

All of the criminal offences I have mentioned—even if they are not mentioned in schedule 7, which I will come to in a moment—will automatically flow into the Bill via the provisions of clause 52(4)(d). Criminal offences where the victim is an individual, which these clearly all are, automatically flow into the provisions of the Bill, including the offences I just listed, which have been created particularly with women in mind.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - - - Excerpts

I hope that my hon. Friend will discuss the Law Commission’s recommendations on intimate image abuse. When I raised this issue in an earlier sitting, he was slightly unsighted by the fact that the recommendations were about to come out—I can confirm again that they will come out on 7 July, after some three years of deliberation. It is unfortunate that will be a week after the end of the Committee’s deliberations, and I hope that the timing will not preclude the Minister from mopping it up in his legislation.

Chris Philp Portrait Chris Philp
- Hansard - -

I thank my right hon. Friend for her question and for her tireless work in this area. As she says, the intimate image abuse offence being worked on is an extremely important piece in the jigsaw puzzle to protect women, particularly as it has as its threshold—at least in the previous draft—consent, without any test of intent, which addresses some points made by the Committee previously. As we have discussed before, it is a Ministry of Justice lead, and I am sure that my right hon. Friend will make representations to MOJ colleagues to elicit a rapid confirmation of its position on the recommendations, so that we can move to implement them as quickly as possible.

I remind the Committee of the Domestic Abuse Act 2021, which was also designed to protect women. Increased penalties for stalking and harassment have been introduced, and we have ended the automatic early release of violent and sex offenders from prison—something I took through Parliament as a Justice Minister a year or two ago. Previously, violent and sex offenders serving standard determinate sentences were often released automatically at the halfway point of their sentence, but we have now ended that practice. Rightly, a lot has been done outside the Bill to protect women and girls.

Let me turn to what the Bill does to further protect women and girls. Schedule 7 sets out the priority offences—page 183 of the Bill. In addition to all the offences I have mentioned previously, which automatically flow into the illegal safety duties, we have set out priority offences whereby companies must not just react after the event, but proactively prevent the offence from occurring in the first place. I can tell the Committee that many of them have been selected because we know that women and girls are overwhelmingly the victims of such offences. Line 21 lists the offence of causing

“intentional harassment, alarm or distress”.

Line 36 mentions the offence of harassment, and line 37 the offence of stalking. Those are obviously offences where women and girls are overwhelmingly the victims, which is why we have picked them out and put them in schedule 7—to make sure they have the priority they deserve.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The Minister is making a good speech about the important things that the Bill will do to protect women and girls. We do not dispute that it will do so, but I do not understand why he is so resistant to putting this on the face of the Bill. It would cost him nothing to do so, and it would raise the profile. It would mean that everybody would concentrate on ensuring that there are enhanced levels of protection for women and girls, which we clearly need. I ask him to reconsider putting this explicitly on the face of the Bill, as he has been asked to do by us and so many external organisations.

Chris Philp Portrait Chris Philp
- Hansard - -

I completely understand and accept the point that there are groups of people in society who suffer disproportionate harms, as we have debated previously, and that obviously includes women and girls. There are of course other groups as well, such as ethnic minorities or people whose sexual orientation makes them the target of completely unacceptable abuse in a way that other groups do not suffer.

I accept the point about having this “on the face of the Bill”. We have debated this. That is why clauses 10 and 12 use the word “characteristic”—we debated this word previously. The risk assessment duties, which are the starting point for the Bill’s provisions, must specifically and expressly—it is on the face of the Bill—take into account characteristics, first and foremost gender, but also racial identity, sexual orientation and so on. Those characteristics must be expressly addressed by the risk assessments for adults and for children, in order to make sure that the particular vulnerabilities of, and the extra levels of abuse suffered by, people with those characteristics are recognised and addressed. That is why those provisions are in the Bill, in clauses 10 and 12.

A point was raised about platforms not responding to complaints raised about abusive content that has been put online—the victim complains to the platform and nothing happens. The hon. Members for Pontypridd and for Aberdeen North are completely right that this is a huge problem that needs to be addressed. Clause 18(2) places a duty—they have to do it; it is not optional—on these platforms to operate a complaints procedure that is, in paragraph (c),

“easy to access, easy to use (including by children)”

and that, in paragraph (b),

“provides for appropriate action to be taken”.

They must respond. They must take appropriate action. That is a duty under clause 18. If they do not comply with that duty on a systemic basis, they will be enforced against. The shadow Minister and the hon. Member for Aberdeen North are quite right. The days of the big platforms simply ignoring valid complaints from victims have to end, and the Bill will end them.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

I am extremely impressed by the Minister’s knowledge of the Bill, as I have been throughout the Committee’s sittings. It is admirable to see him flicking from page to page, finding where the information about violence against women and girls is included, but I have to concur with the hon. Member for Aberdeen North and my Front-Bench colleagues. There is surely nothing to be lost by specifically including violence against women and girls on the face of the Bill.

Chris Philp Portrait Chris Philp
- Hansard - -

I hope I have made very clear in everything I have said, which I do not propose to repeat, that the way the Bill operates in several different areas, and the way the criminal law has been constructed over the past 10 years, building on the work of previous Governments, are designed to make sure that the crimes committed overwhelmingly against women and girls are prioritised. I think the Bill does achieve the objective of providing that protection, which every member of this Committee wishes to see delivered. I have gone through it in some detail. It is woven throughout the fabric of the Bill, in multiple places. The objective of new clause 23 is more than delivered.

In conclusion, we will be publishing a list of harms, including priority harms for children and adults, which will then be legislated for in secondary legislation. The list will be constructed with the vulnerability of women and girls particularly in mind. When Committee members see that list, they will find it reassuring on this topic. I respectfully resist the new clause, because the Bill is already incredibly strong in this important area as it has been constructed.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

The Bill is strong, but it could be stronger. It could be, and should be, a world-leading piece of legislation. We want it to be world-leading and we feel that new clause 23 would go some way to achieving that aim. We have cross-party support for tackling violence against women and girls online. Placing it on the face of the Bill would put it at the core of the Bill—at its heart—which is what we all want to achieve. With that in mind, I wish to press the new clause to a vote.

Question put, That the clause be read a Second time.

--- Later in debate ---
Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

New clause 24 would enable users to bring civil proceedings against providers when they fail to meet their duties under part 3 of the Bill. As has been said many times, power is currently skewed significantly against individuals and in favour of big corporations, leading people to feel that they have no real ability to report content or complain to companies because, whenever they do, there is no response and no action. We have discussed how the reporting, complaints and super-complaints mechanisms in the Bill could be strengthened, as well as the potential merits of an ombudsman, which we argued should be considered when we debated new clause 1.

In tabling this new clause, we are trying to give users the right to appeal through another route—in this case, the courts. As the Minister will be aware, that was a recommendation of the Joint Committee, whose report stated:

“While we recognise the resource challenges both for individuals in accessing the courts and the courts themselves, we think the importance of issues in this Bill requires that users have a right of redress in the courts. We recommend the Government develop a bespoke route of appeal in the courts to allow users to sue providers for failure to meet their obligations under the Act.”

The Government’s response to that recommendation was that the Bill would not change the current situation, which allows individuals to

“seek redress through the courts in the event that a company has been negligent or is in breach of its contract with the individual.”

It went on to note:

“Over time, as regulatory precedent grows, it will become easier for individuals to take user-to-user services to court when necessary.”

That seems as close as we are likely to get to an admission that the current situation for individuals is far from easy. We should not have to wait for the conclusion of the first few long and drawn-out cases before it becomes easier for people to fight companies in the courts.

Some organisations have rightly pointed out that a system of redress based on civil proceedings in the courts risks benefiting those with the resources to sue—as we know, that is often the case. However, including that additional redress system on the face of the Bill should increase pressure on companies to fulfil their duties under part 3, which will hopefully decrease people’s need to turn to the redress mechanism.

If we want the overall system of redress to be as strong as possible, individuals must have the opportunity to appeal failures of a company’s duty of care as set out in the Bill. The Joint Committee argued that the importance of the issues dealt with by the Bill requires that users have a right of redress in the courts. The Government did not respond to that criticism in their formal response, but it is a critical argument. A balancing act between proportionate restrictions and duties versus protections against harms is at the heart of this legislation, and has been at the heart of all our debates. Our position is in line with that of the Joint Committee: these issues are too important to deny individuals the right to appeal failures of duty by big companies through the courts.

Chris Philp Portrait Chris Philp
- Hansard - -

I agree with the shadow Minister’s point that it is important to make sure social media firms are held to account, which is the entire purpose of the Bill. I will make two points in response to the proposed new clause, beginning with the observation that the first part of its effect is essentially to restate an existing right. Obviously, individuals are already at liberty to seek redress through the courts where a company has caused that individual to suffer loss through negligence or some other behaviour giving rise to grounds for civil liability. That would, I believe, include a breach of that company’s terms of service, so simply restating in legislation a right that already exists as a matter of law and common law is not necessary. We do not do declaratory legislation that just repeats an existing right.

Secondly, the new clause creates a new right of action that does not currently exist, which is a right of individual action if the company is in breach of one of the duties set out in part 3 of the Bill. Individuals being able to sue for a breach of a statutory duty that we are creating is not the way in which we are trying to construct enforcement under the Bill. We will get social media firms to comply through Ofcom acting as the regulator, rather than via individuals litigating these duties on a case-by-case basis. A far more effective way of dealing with the problems, as we discussed previously when we debated the ombudsman, is to get Ofcom to deal with this on behalf of the whole public on a systemic basis, funded not by individual litigants’ money, which is what would happen, at least in the first instance, if they had to proceed individually. Ofcom should act on behalf of us all collectively—this should appeal to socialists—using charges levied from the industry itself.

That is why we want to enforce against these companies using Ofcom, funded by the industry and acting on behalf of all of us. We want to fix these issues not just on an individual basis but systemically. Although I understand the Opposition’s intent, the first part simply declares what is already the law, and the second bit takes a different route from the one that the Bill takes. The Bill’s route is more comprehensive and will ultimately be more effective. Perhaps most importantly of all, the approach that the Bill takes is funded by the fees charged on the polluters—the social media firms—rather than requiring individual citizens, at least in the first instance, to put their hand in their own pocket, so I think the Bill as drafted is the best route to delivering these objectives.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I will say a couple of things in response to the Minister. It is individuals who are damaged by providers breaching their duties under part 3 of the Bill. I understand the point about—

Chris Philp Portrait Chris Philp
- Hansard - -

Systems.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Yes, but it is not systems that are damaged; it is people. As I said in my speech, the Government’s response that, as regulatory precedent grows, it will become easier over time for individuals to take user-to-user services to court where necessary clearly shows that the Government think it will happen. What we are saying is: why should it wait? The Minister says it is declaratory, but I think it is important, so we will put the new clause to a vote.

Question put, That the clause be read a Second time.

--- Later in debate ---
Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I speak in support of new clause 25. As my hon. Friend has argued, transparency is critical to the Bill. It is too risky to leave information and data about online harms unpublished. That is why we have tabled several amendments to the Bill to increase reporting, both to the regulator and publicly.

New clause 25 is an important addition that would offer an overview of the effectiveness of the Bill and act as a warning bell for any unaddressed historical or emerging harms. Not only would such a report benefit legislators, but the indicators included in the report would be helpful for both Ofcom and user advocacy groups. We cannot continue to attempt to regulate the internet blind. We must have the necessary data and analysis to be sure that the provisions in the Bill are as effective as they can be. I hope the Minister can support this new clause.

Chris Philp Portrait Chris Philp
- Hansard - -

The idea that a report on Ofcom’s activities be delivered to Parliament so that it can be considered is an excellent one. In fact, it is such an excellent idea that it has been set out in statute since 2002: the Office of Communications Act 2002 already requires Ofcom to provide a report to the Secretary of State on the carrying out of all of its functions, which will include the new duties we are giving Ofcom under the Bill. The Secretary of State must then lay that report before each House of Parliament. That is a well-established procedure for Ofcom and for other regulatory bodies. It ensures the accountability of Ofcom to the Department and to Parliament.

I was being slightly facetious there, because the hon. Member for Batley and Spen is quite right to raise the issue. However, the duty she is seeking to create via new clause 25 is already covered by the duties in the Office of Communications Act. The reports that Ofcom publishes under that duty will cover its new duties under the Bill. Having made that clear, I trust that new clause 25 can be withdrawn.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

I would like to press new clause 25 to a Division. It is important that it is included in the Bill.

Question put, That the clause be read a Second time.

Online Safety Bill (Seventeenth sitting)

Chris Philp Excerpts
Committee stage
Tuesday 28th June 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 28 June 2022
Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - - - Excerpts

Before we adjourned, I was discussing the Government’s national artificial intelligence strategy and the two separate consultations launched by the Government to look at the intellectual property system in relation to AI. In those consultations, the Intellectual Property Office recognised that AI

“is playing an increasing role in...artistic creativity.”

However, specific questions about reviewing or enhancing performers’ rights were notably absent from both Government consultations. If the UK Government really want to make Britain a global AI and creative superpower, strengthening the rights of performers and other creatives must be at the heart of the national AI strategy.

Another key challenge is that our intellectual property framework is desperately out of date. Currently, performers have two sets of rights under the Copyright, Designs and Patents Act 1988: the right to consent to the making of a recording of a performance; and the right to control the subsequent use of such recordings, such as the right to make copies. However, as highlighted by Dr Mathilde Pavis, senior lecturer in law at the University of Exeter, AI-made performance synthetisation challenges our intellectual property framework because it reproduces performances without generating a recording or a copy, and therefore falls outside the scope of the Act. An unintended consequence is that people are left vulnerable to abuse and exploitation. Without effective checks and balances put in place by the Government, that will continue. That is why 93% of Equity members responding to a recent survey stated that the Government should introduce a new legal protection for performers, so that a performance cannot be reproduced by AI technology without the performer’s consent.

Advances in AI, including deepfake technology, have reinforced the urgent need to introduce image rights—also known as personality rights or publicity rights. That refers to

“the expression of a personality in the public domain”,

such as an individual’s name, likeness or other personal indicators. Provision of image rights in law enables performers to safeguard meaningful income streams, and to defend their artistic integrity, career choices, brand and reputation. More broadly, for society, it is an important tool for protecting privacy and allowing an individual to object to the use of their image without consent.

In the UK, there is no codified law of image rights or privacy. Instead, we have a patchwork of statutory and common-law causes of action, which an individual can use to protect various aspects of their image and personality. However, none of that is fit for purpose. Legal provision for image rights can be found around the world, so the Government here can and should do more. For example, some American states recognise the right through statute, and others through common law. California has both statutory and common-law strains of authority, which protect slightly different forms of the right.

The Celebrities Rights Act of 1985 was passed in California and extended the personality rights for a celebrity to 70 years after their death. In 2020, New York State passed a Bill that recognised rights of publicity for “deceased performers” and “deceased personalities”. Guernsey has created a statutory regime under which image rights can be registered. The legislation centres on the legal concept of a “personnage”—the person or character behind a personality that is registered. The image right becomes a property right capable of protection under the legislation through registration, which enables the image right to be protected, licensed and assigned.

The Minister will know that Equity is doing incredible work to highlight the genuine impact that this type of technology is having on our creative industry and our performers. He must therefore see the sense in our new clause, which would require the Government at least to consider the matter of synthetic media content, which thus far they have utterly failed to do.

Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - -

It is a pleasure to serve under your chairmanship again, Ms Rees. I thank the shadow Minister, the hon. Member for Pontypridd, for raising the issues that she has done about synthetic and digitally manipulated content, which we are very conscious of. We are conscious of the risk of harm to those who work in the entertainment industry and of course, in particular, to victims of deepfake pornography.

We take intellectual property infringement extremely seriously. The Government have recently published a counter-infringement strategy, setting out a range of steps that we intend to take to strengthen the whole-system approach to tackling infringement of intellectual property rights. It is widely acknowledged that the United Kingdom has an intellectual property framework that is genuinely world leading and considered among the best in the world. That includes strong protections for performers’ rights. We intend that to continue. However, we are not complacent, and the law is kept under review, not least via the counter-infringement strategy I mentioned a moment ago.

Harmful synthetic media content, including the deepfakes that the hon. Member for Pontypridd mentioned, is robustly addressed by the safety duties set out in the Bill in relation to illegal content—much deepfake content, if it involves creating an image of someone, would be illegal—as well as content that could be harmful to children and content that will be on the “legal but harmful” adult list. Those duties will tackle the most serious and illegal forms of deepfake and will rightly cover certain threats that undermine our democracy. For example, a manipulated media image that contained incitement to violence, such as a deepfake of a politician telling people to attack poll workers because they are rigging an election, would obviously already fall foul of the Bill under the illegal duties.

In terms of reporting and codes of practice, the Bill already requires Ofcom to produce codes of practice setting out the ways in which providers can take steps to reduce the harm arising from illegal and harmful content, which could include synthetic media content such as deepfakes where those contain illegal content.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

The Minister uses the example of a deepfake of a politician inciting people to attack poll workers during an election. Given that some of the technology is so advanced that it is really difficult to spot when deepfakes actually occur, could it be argued that Ofcom as regulator, or even the platforms themselves, would be averse to removing or reporting the content, as it could fall foul of the democratic content exemption in the Bill?

Chris Philp Portrait Chris Philp
- Hansard - -

The democratic content protection that the shadow Minister refers to, in clause 15, is not an exemption; it is a duty to take into account content of democratic importance. That is on line 34 of page 14. When a platform makes a decision, it has to take that content into account—the duty is not determinative; it is not as if a politician or somebody involved in an election gets a free pass to say whatever they like, even if it is illegal, and escapes the provisions of the Bill entirely. The platform simply has to take it into account. If it was a deepfake image that was saying such a thing, the balancing consideration in clause 15 would not even apply, because the protection applies to content of democratic importance, not to content being produced by a fake image of a politician.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

It is important that we get this right. One of our concerns on clause 15, which we have previously discussed, relates to this discussion of deepfakes, particularly of politicians, and timeframes. I understand the Minister’s point on illegal content. If there is a deepfake of a politician—on the eve of poll, for example—widely spreading disinformation or misinformation on a platform, how can the Minister confidently say that it would be taken seriously, in a timely manner? That could have direct implications for a poll or an election. Would the social media companies have the confidence to take that content down, given clause 15?

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - -

The protections in clause 15—they are not exemptions—would only apply to content that is of bona fide, genuine democratic importance. Obviously, a deepfake of a politician would not count as genuine democratic content, because it is fake. If it was a real politician, such as the hon. Lady, it would benefit from that consideration. If it was a fake, it would not, because it would not be genuine content of democratic importance.

It is also worth saying that if—well, I hope when—our work with the Law Commission to review the criminal law related to the non-consensual taking and sharing of intimate images is taken forward, that will then flow into the duties in the Bill. Deepfakes of intimate images are rightly a concern of many people. That work would fall into the ambit of the Bill, either via clause 52, which points to illegal acts where there is an individual victim, or via schedule 7, if a new intimate image abuse offence were added to schedule 7 as a priority offence. There are a number of ways in which deepfakes could fall into the ambit of the Bill, including if they relate to extreme pornography.

The new clause would require the production of a report, not a change to the substantive duties in the Bill. It is worth saying that the Bill already provides Ofcom with powers to produce and publish reports regarding online safety matters. Those powers are set out in clause 137. The Bill will ensure that Ofcom has access to the information required to prepare those reports, including information from providers about the harm caused by deepfakes and how companies tackle the issue. We debated that extensively this morning when we talked about the strong powers that already exist under clause 85.

The hon. Lady has raised important points about intellectual property, and I have pointed to our counter-infringement strategy. She raised important points about deepfakes both in a political context and in the context of especially intimate images being generated by AI. I hope I have set out how the Bill addresses concerns in those areas. The Bill as drafted addresses those important issues in a way that is certainly adequate.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I welcome the Minister’s comments and I am grateful for his reassurance on some of the concerns that were raised. At this stage we will not press the matter to a vote. I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Clause 27

OFCOM: power to impose duties on regulated services

“OFCOM: power to impose duties on regulated services

(1) OFCOM may carry out an assessment of the risk of harm posed by any regulated service.

(2) Where OFCOM assess a service to pose a very high risk of harm, OFCOM may, notwithstanding the categorisation of the service or the number or profile of its users, impose upon the service duties equivalent to—

(a) the children’s risk assessment duties set out in sections 10 and 25 of this Act; and

(b) the safety duties protecting children set out in sections 11 and 26 of this Act.”—(Kirsty Blackman.)

This new clause enables Ofcom to impose on any regulated service duties equivalent to the children’s risk assessment duties and the safety duties protecting children.

Brought up, and read the First time.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - -

I thank the hon. Member for Aberdeen North for raising those considerations, because protecting children is clearly one of the most important things that the Bill will do. The first point that it is worth drawing to the Committee’s attention again is the fact that all companies, regardless of the number of child users they may have, including zero child users, have duties to address illegal content where it affects children. That includes child sexual exploitation and abuse content, and illegal suicide content. Those protections for the things that would concern us the most—those illegal things—apply to companies regardless of their size. It is important to keep that in mind as we consider those questions.

It is also worth keeping in mind that we have designed the provisions in clause 31 to be a bit flexible. The child user condition, which is in clause 31(3) on page 31 of the Bill, sets out that one of two tests must be met for the child user condition to be met. The condition is met if

“there is a significant number of children who are users of the service…or…the service…is of a kind likely to attract a significant number of users who are children.”

When we debated the issue previously, we clarified that the word “user” did not mean that they had to be a registered user; they could be somebody who just stumbles across it by accident or who goes to it intentionally, but without actually registering. We have built in a certain amount of flexibility through the word “likely”. That helps a little bit. We expect that where a service poses a very high risk of harm to children, it is likely to meet the test, as children could be attracted to it—it might meet the “likely to attract” test.

New clause 27 would introduce the possibility that even when there were no children on the service and no children were ever likely to use it, the duties would be engaged—these duties are obviously in relation to content that is not illegal; the illegal stuff is covered already elsewhere. There is a question about proportionality that we should bear in mind as we think about this. I will be resisting the new clause on that basis.

However, as the hon. Member for Aberdeen North said, I have hinted or more than hinted to the Committee previously that we have heard the point that has been made—it was made in the context of adults, but applies equally to children here—that there is a category of sites that might have small numbers of users but none the less pose a high risk of harm, not harm that is illegal, because the “illegal” provision applies to everybody already, but harm that falls below the threshold of illegality. On that area, we heard hon. Members’ comments on Second Reading. We have heard what members of the Committee have had to say on that topic as well. I hope that if I say that that is something that we are reflecting on very carefully, the hon. Member for Aberdeen North will understand that those comments have been loudly heard by the Government. I hope that I have explained why I do not think new clause 27 quite works, but the point is understood.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I appreciate the Minister’s comments, but in the drafting of the new clause, we have said that Ofcom “may” impose these duties. I would trust the regulator enough not to impose the child safety duties on a site that literally has no children on it and that children have no ability to access. I would give the regulator greater credit than the Minister did, perhaps accidentally, in his comments. If it were up to Ofcom to make that decision and it had the power to do so where it deemed that appropriate, it would be most appropriate for the regulator to have the duty to make the decision.

I wish to press the new clause to a Division.

Question put, That the clause be read a Second time.

--- Later in debate ---
Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Labour argued in favour of greater empowerment provisions for children during the debate on new clause 3, which would have brought in a user advocacy body for children. YoungMinds has pointed out that many young people are unaware of the Bill, and there has been little engagement with children regarding its design. I am sure members of the Committee would agree that the complexity of the Bill is evidence enough of that.

New clause 28 would make the online world more accessible for children and increase their control over the content they see. We know that many children use category 1 services, so they should be entitled to the same control over harmful content as adults. As such, Labour supports the new clause.

Chris Philp Portrait Chris Philp
- Hansard - -

I thank the hon. Member for Aberdeen North for her, as ever, thoughtful comments on the new clause. She has already referred to the user empowerment duties for adults set out in clause 57, and is right to say that those apply only to adults, as is made clear in the very first line of subsection (1) near the bottom of page 52.

As always, the hon. Lady’s analysis of the Bill is correct: the aim of those empowerment duties is to give adults more control over the content they see and the people with whom they interact online. One of the reasons why those empowerment duties have been crafted specifically for adults is that, as we discussed in a freedom of expression context, the Bill does not ultimately censor free speech regarding content that is legal but potentially harmful. Platforms can continue to display that information if their policies allow, so we felt it was right to give adults more choice over whose content they see, given that it could include content that is harmful but falls on the right side of the legal threshold.

As Members would expect, the provisions of the Bill in relation to children are very different from the provisions for adults. There are already specific provisions in the Bill that relate to children, requiring all social media companies whose platforms are likely to be accessed by children—not just the big ones—to undertake comprehensive risk assessments and protect children from any kind of harmful activity. If we refer to the children’s risk assessment duties in clause 10, and specifically clause 10(6)(e), we see that those risk assessments include an assessment looking at the content that children will encounter and—critically—who they might encounter online, including adults.

To cut to the chase and explain why user empowerment has been applied to adults but not children, the view was taken that children are already protected a lot more than adults through the child risk assessment duties and child safety duties. Therefore, they do not need the user empowerment provisions, because all of them—regardless of whether they choose to be verified or not—are already being protected from harmful content by the much stronger provisions in the Bill relating to children. That is why it was crafted as it is.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

It does make sense, and I do understand what the Minister is talking about in relation to clause 10 and the subsections that he mentioned. However, that only sets out what the platforms must take into account in their child risk assessments.

If we are talking about 15-year-olds, they are empowered in their lives to make many decisions on their own behalf, as well as decisions guided by parents or parental decisions taken for them. We are again doing our children a disservice by failing to allow young people the ability to opt out—the ability to choose not to receive certain content. Having a requirement to include whether or not these functionalities exist in a risk assessment is very different from giving children and young people the option to choose, and to decide what they do—and especially do not—want to see on whichever platform they are interacting on.

I have previously mentioned the fact that if a young person is on Roblox, or some of those other platforms, it is difficult for them to interact only with people who are on their friends list. It is difficult for that young person to exclude adult users from contacting them. A lot of young people want to exclude content, comments or voice messages from people they do not know. They want to go on the internet and have fun and enjoy themselves without the risk of being sent an inappropriate message or photo and having to deal with those things. If they could choose those empowerment functions, that just eliminates the risk and they can make that choice.

Chris Philp Portrait Chris Philp
- Hansard - -

Could I develop the point I was making earlier on how the Bill currently protects children? Clause 11, which is on page 10, is on safety duties for children—what the companies have to do to protect children. One thing that they may be required by Ofcom to do, as mentioned in subsection (4)(f), is create

“functionalities allowing for control over content that is encountered, especially by children”.

Therefore, there is a facility to require the platforms to create the kind of functionalities that relate, as that subsection is drafted, not just to identity but to the kind of content being displayed. Does that go some way towards addressing the hon. Lady’s concern?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

That is very helpful. I am glad that the Minister is making clear that he thinks that Ofcom will not just be ignoring this issue because the Bill is written to allow user empowerment functions only for adults.

I hope the fact that the Minister kindly raised clause 11(4) will mean that people can see its importance, and that Ofcom will understand that it should give consideration to it, because that list of things could have just been lost in the morass of the many, many lists of things in the Bill. I am hoping that the Minister’s comments will go some way on that. Notwithstanding that, I will press the new clause to a vote.

Question put, That the clause be read a Second time.

--- Later in debate ---
Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I rise to speak in support of the new clause. Too often people with learning disabilities are left out of discussions about provisions relevant to them. People with learning disabilities are disproportionately affected by online harms and can receive awful abuse online.

At the same time, Mencap has argued that social media platforms enable people with learning disabilities to develop positive friendships and relationships. It is therefore even more important that people with learning disabilities do not lose out on the features described in clause 14, which allow them to control the content to which they are exposed. It is welcome that clauses 17, 18, 27 and 28 specify that reporting and complaints procedures must be easy to access and use.

The Bill, however, should go further to ensure that the duties on complaints and reporting explicitly cater to adults with learning disabilities. In the case of clause 14 on user empowerment functions, it must be made much clearer that those functions are easy to access and use. The new clause would be an important step towards ensuring that the Bill benefits everyone who experiences harms online, including people with learning disabilities. Labour supports the new clause.

Chris Philp Portrait Chris Philp
- Hansard - -

I thank the hon. Member for Aberdeen North once again for the thoughtfulness with which she has moved her new clause. To speak first to the existing references to accessibility in the Bill, let me start with user empowerment in clause 14.

Clause 14(4) makes it clear that the features included in “a service in compliance” with the duty in this clause must be made available to all adult users. I stress “all” because, by definition, that includes people with learning disabilities or others with characteristics that mean they may require assistance. When it comes to content reporting duties, clause 17(2)—line 6 of page 17—states that it has to be easy for any “affected persons” to report the content. They may be people who are disabled or have a learning difficulty or anything else. Clause 17(6)(d) further makes it clear that adults who are “providing assistance” to another adult are able to raise content reporting issues.

There are references in the Bill to being easy to report and to one adult assisting another. Furthermore, clause 18(2)(c), on page 18, states that the complaints system has to be

“easy to use (including by children)”.

It also makes it clear through the definition of “affected person”, which we have spoken about, that an adult assisting another adult is allowed to make a complaint on behalf of the second adult. Those things have been built into the structure of the Bill.

Furthermore, to answer the question from the hon. Member for Aberdeen North, I am happy to put on record that Ofcom, as a public body, is subject to the public sector equality duty, so by law it must take into account the ways in which people with certain characteristics, such as learning disabilities, may be impacted when performing its duties, including writing the codes of practice for user empowerment, redress and complaints duties. I can confirm, as the hon. Member requested, that Ofcom, when drafting its codes of practice, will have to take accessibility into account. It is not just a question of my confirming that to the Committee; it is a statutory duty under the Equality Act 2010 and the public sector equality duty that flows from it.

I hope that the words of the Bill, combined with that statutory public sector equality duty, make it clear that the objectives of new clause 29 are met.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

The Minister mentioned learning difficulties. That is not what we are talking about. Learning difficulties are things such as dyslexia and attention deficit hyperactivity disorder. Learning disabilities are lifelong intellectual impairments and very different things—that is what we are talking about.

Chris Philp Portrait Chris Philp
- Hansard - -

I am very happy to accept the shadow Minister’s clarification. The way that clauses 14, 17 and 18 are drafted, and the public sector equality duty, include the groups of people she referred to, but I am happy to acknowledge and accept her clarification.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

That is fine, but I have a further point to make. The new clause would be very important to all those people who support people with learning disabilities. So many of the services that people use do not take account of people’s learning disabilities. I have done a huge amount of work to try to support people with learning disabilities over the years. This is a very important issue to me.

There are all kinds of good examples, such as easy-read versions of documents, but the Minister said when batting back this important new clause that the expression “all adult users” includes people with learning disabilities. That is not the case. He may not have worked with a lot of people with learning disabilities, but they are excluded from an awful lot. That is why I support making that clear in the Bill.

We on the Opposition Benches say repeatedly that some things are not captured by an all-encompassing grouping. That is certainly the case here. Some things need to be said for themselves, such as violence against women and girls. That is why this is an excellent new clause that we support.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

It is an honour to support the new clause moved by the hon. Member for Aberdeen North. This was a recommendation from the Joint Committee report, and we believe it is important, given the sheer complexity of the Bill. The Minister will not be alarmed to hear that I am all in favour of increasing the scrutiny and transparency of this legislation.

Having proudly served on the DCMS Committee, I know it does some excellent work on a very broad range of policy areas, as has been highlighted. It is important to acknowledge that there will of course be cross-over, but ultimately we support the new clause. Given my very fond memories of serving on the Select Committee, I want to put on the record my support for that Committee; my support for this new clause is not meant as any disrespect to it. The Committee is genuinely extremely effective in scrutinising the Government and holding them to account, and I know it will continue to do that in relation to both this Bill and other aspects of DCMS. The need for transparency, openness and scrutiny of this Bill is fundamental if it is truly to be world-leading, which is why we support the new clause.

Chris Philp Portrait Chris Philp
- Hansard - -

I am grateful for the opportunity to discuss this issue once again. I want to put on the record my thanks to the Joint Committee, which the hon. Member for Ochil and South Perthshire sat on, for doing such fantastic work in scrutinising the draft legislation. As a result of its work, no fewer than 66 changes were made to the Bill, so it was very effective.

I want to make one or two observations about scrutinising the legislation following the passage of the Bill. First, there is the standard review mechanism in clause 149, on pages 125 and 126, which provides for a statutory review no sooner than two years, and no later than five years, after the Bill receives Royal Assent.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

On that review function, it would help if the Minister could explain a bit more why it was decided to do that as a one-off, and not on a rolling two-year basis, for example.

Chris Philp Portrait Chris Philp
- Hansard - -

That is a fairly standard clause in legislation. Clearly, for most legislation and most areas of Government activity, the relevant departmental Select Committee would be expected to provide the ongoing scrutiny, so ordinarily the DCMS Committee would do that. I hear the shadow Minister’s comments: she said that this proposal is not designed in any way to impugn or disrespect that Committee, but I listened to the comments of the Chair of that Committee on Second Reading, and I am not sure he entirely shares that view—he expressed himself in quite forthright terms.

On the proposal, we understand that the Joint Committee did valuable work. This is an unusual piece of legislation, in that it is completely groundbreaking. It is unlike any other, so the case for having a particular Committee look at it may have some merit. I am not in a position to give a definitive Government response to that because the matter is still under consideration, but if we were to establish a special Committee to look at a single piece of legislation, there are two ways to do it. It could either be done in statute, as the new clause seeks, or it could be done by Standing Orders.

Generally speaking, it is the practice of the House to establish Committees by Standing Orders of the House rather than by statute. In fact, I think the only current Committee of the House established by statute—Ms Rees, you will correct me if I am wrong, as you are more of an expert on these matters than me—is the Intelligence and Security Committee, which was established by the Intelligence Services Act 1994. That is obviously very unusual, because it has special powers. It looks into material that would ordinarily be classified as secret, and it has access to the intelligence services. It is a rather unusual Committee that has to be granted special powers because it looks into intelligence and security matters. Clearly, those considerations do not apply here. Were a particular Committee to be established, the right way of doing that would not be in statute, as the new clause proposes, but via the Standing Orders of the House, if that is something that Parliament wants to do.

Dean Russell Portrait Dean Russell (Watford) (Con)
- Hansard - - - Excerpts

As another member of the Joint Committee, I totally understand the reasoning. I want to put on record my support for setting up a Committee through the approach the Minister mentioned, namely Standing Orders. I will not support the new clause, but I strongly support the Joint Committee continuing in some form to enable scrutiny. When we look ahead to the metaverse, virtual reality and all the things that are coming, it is important that that scrutiny continues. No offence to Opposition colleagues, but I do not think the new clause is the right way to do that. However, the subject is worth further exploration, and I would be very supportive of that happening.

Chris Philp Portrait Chris Philp
- Hansard - -

First, let me also put on record my thanks to my hon. Friend for his service on the Joint Committee. He did a fantastic job and, as I said, the Committee’s recommendations have been powerfully heard. I thank him for his acknowledgment that if one were to do this, the right way to do it would be through Standing Orders. I have heard the point he made in support of some sort of ongoing special committee. As I say, the Government have not reached a view on this, but if one were to do that, I agree with my hon. Friend that Standing Orders would be the right mechanism.

One of the reasons for that can be found in the way the new clause has been drafted. Subsections (5) and (6) say:

“The membership and Chair of the Committee shall be appointed by regulations made by the Secretary of State…the tenure of office of members of, the procedure of and other matters…shall be set out in regulations made by the Secretary of State.”

I know those regulations are then subject to approval by a resolution of the House, but given the reservations expressed by Opposition Members about powers for the Secretary of State over the last eight sitting days, it is surprising to see the new clause handing the Secretary of State—in the form of a regulation-making power—the power to form the Committee.

That underlines why doing this through Standing Orders, so that the matter is in the hands of the whole House, is the right way to proceed, if that is something we collectively wish to do. For that reason, we will not support the new clause. Obviously, we will get back to the House in due course once thinking has been done about potential Committees, but that can be done as a separate process to the legislation. In any case, post-legislative scrutiny will not be needed until the regime is up and running, which will be after Royal Assent, so that does not have enormous time pressure on it.

A comment was made about future-proofing the Bill and making sure it stays up to date. There is a lot in that, and we need to make sure we keep up to date with changing technologies, but the Bill is designed to be tech agnostic, so if there is change in technology, that is accommodated by the Bill because the duties are not specific to any given technology. A good example is the metaverse. That was not conceived or invented prior to the Bill being drafted; none the less, it is captured by the Bill. The architecture of the Bill, relying on codes of practice produced by Ofcom, is designed to ensure flexibility so that the codes of practice can be kept up to date. I just wanted to make those two points in passing, as the issue was raised by the hon. Member for Aberdeen North.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The reason the new clause is drafted in that way is that I wanted to recognise the work of the Joint Committee and to take on board its recommendations. If it had been entirely my drafting, the House of Lords would certainly not have been involved, given that I am not the biggest fan of the House of Lords, as its Members are not elected. However, the decision was made to submit the new clause as drafted.

The Minister has said that the Government have not come to a settled view yet, which I am taking as the Minister not saying no. He is not standing up and saying, “No, we will definitely not have a Standing Committee.” I am not suggesting he is saying yes, but given that he is not saying no, I am happy to withdraw the new clause. If the Minister is keen to come forward at a future stage with suggestions for changes to Standing Orders, which I understand have to be introduced by the Leader of the House or the Cabinet Office, then they would be gladly heard on this side of the House. I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Clause 38

Adults’ risk assessment duties

“(1) This section sets out duties which apply in relation to internet services within section 67(2).

(2) A duty to take appropriate steps to keep an adults’ risk assessment up to date, including when OFCOM makes any significant change to a risk profile that relates to services of the kind in question.

(3) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient adults’ risk assessment relating to the impacts of that proposed change.

(4) A duty to make and keep a written record, in an easily understandable form, of every risk assessment under subsections (2) and (3).

(5) An “adults’ risk assessment” of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—

(a) the user base;

(b) the level of risk of adults who are users of the service encountering, by means of the service, each kind of priority content that is harmful to adults (with each kind separately assessed), taking into account (in particular) algorithms used by the service, and how easily, quickly and widely content may be disseminated by means of the service;

(c) the level of risk of harm to adults presented by different kinds of priority content that is harmful to adults;

(d) the level of risk of harm to adults presented by priority content that is harmful to adults which particularly affects individuals with a certain characteristic or members of a certain group;

(e) the level of risk of functionalities of the service facilitating the presence or dissemination of priority content that is harmful to adults, identifying and assessing those functionalities that present higher levels of risk;

(f) the different ways in which the service is used, and the impact of such use on the level of risk of harm that might be suffered by adults;

(g) the nature, and severity, of the harm that might be suffered by adults from the matters identified in accordance with paragraphs (b) to (f);

(h) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.

(6) In this section references to risk profiles are to the risk profiles for the time being published under section 83 which relate to the risk of harm to adults presented by priority content that is harmful to adults.

(7) The provisions of Schedule 3 apply to any assessment carried out under this section in the same way they apply to any assessment relating to a Part 3 service.”—(John Nicolson.)

This new clause applies adults’ risk assessment duties to pornographic sites.

Brought up, and read the First time.

--- Later in debate ---
Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

As we heard from the hon. Member for Ochil and South Perthshire, new clauses 38 to 40 would align the duties on pornographic content so that both user-to-user sites and published pornography sites are subject to robust duties that are relevant to the service. Charities have expressed concerns that many pornography sites might slip through the net because their content does not fall under the definition of “pornographic content” in clause 66. The new clauses aim to address that. They are based on the duties placed on category 1 services, but they recognise the unique harms that can be caused by pornographic content providers, some of which the hon. Member graphically described with the titles that he gave. The new clauses also contain some important new duties that are not currently in the Bill, including the transparency arrangements in new clause 39 and important safeguards in new clause 40.

The Opposition have argued time and again for publishing duties when it comes to risk assessments. New clause 39 would introduce a duty to summarise in the terms of service the findings of the most recent adults’ risk assessments of a service. That is an important step towards making risk assessments publicly accessible, although Labour’s preference would be for them to be published publicly and in full, as I argued in the debate on new clause 9, which addressed category 1 service risk assessments.

New clause 40 would introduce measures to prevent the upload of illegal content, such as by allowing content uploads only from verified content providers, and by requiring all uploaded content to be reviewed. If the latter duty were accepted, there would need to be proper training and support for any human content moderators. We have heard during previous debates about the awful circumstances those moderators face: they are put under immense pressure in low-paid work, and we do not want to encourage that.

New clause 40 would also provide protections for those featured in such content, including the need for written consent and identity and age verification. Those are important safeguards that the Labour party supports. I hope the Minister will consider them.

Chris Philp Portrait Chris Philp
- Hansard - -

I thank the hon. Member for Ochil and South Perthshire for raising these issues with the Committee. It is important first to make it clear that websites providing user-to-user services are covered in part 3 of the Bill, under which they are obliged to protect children and prevent illegal content, including some forms of extreme pornography, from circulating. Such websites are also obliged to prevent children from accessing those services. For user-to-user sites, those matters are all comprehensively covered in part 3.

New clauses 38, 39 and 40 seek to widen the scope of part 5 of the Bill, which applies specifically to commercial pornography sites. Those are a different part of the market. Part 5 is designed to close a loophole in the original draft of the Bill that was identified by the Joint Committee, on which the hon. Member for Ochil and South Perthshire and my hon. Friend the Member for Watford served. Protecting children from pornographic content on commercial porn sites had been wrongly omitted from the original draft of the Bill. Part 5 of the Bill as currently drafted is designed to remedy that oversight. That is why the duties in part 5 are narrowly targeted at protecting children in the commercial part of the market.

A much wider range of duties is placed by part 3 on the user-to-user part of the pornography market. The user-to-user services covered by part 3 are likely to include the largest sites with the least control; as the content is user-generated, there is no organising mind—whatever gets put up, gets put up. It is worth drawing the distinction between the services covered in part 3 and those covered in part 5 of the Bill.

In relation to part 5 services publishing their own material, Parliament can, if it chooses, legislate to make some of that content illegal, as it has done in some areas—some forms of extreme pornography are illegal. If Parliament thinks that the line is drawn in the wrong place and needs to be moved, it can legislate to move that line as part of the general legislation in this area.

I emphasise most strongly that user-to-user sites, which are probably what the hon. Member for Ochil and South Perthshire was mostly referring to, are comprehensively covered by the duties in part 3. The purpose of part 5, which was a response to the Joint Committee’s report, is simply to stop children viewing such content. That is why the Bill has been constructed as it has.

Question put, That the clause be read a Second time.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Seeing as we are not doing spurious points of order, I will also take the opportunity to express our thanks. The first one is to the Chairs: thank you very much, Ms Rees and Sir Roger, for the excellent work you have done in the Chair. This has been a very long Bill, and the fact that you have put up with us for so long has been very much appreciated.

I thank all the MPs on the Committee, particularly the Labour Front-Bench team and those who have been speaking for the Labour party. They have been very passionate and have tabled really helpful amendments—it has been very good to work with the Labour team on the amendments that we have put together, particularly the ones we have managed to agree on, which is the vast majority. We thank Matt Miller, who works for my hon. Friend the Member for Ochil and South Perthshire. He has been absolutely wonderful. He has done an outstanding amount of work on the Bill, and the amazing support that he has given us has been greatly appreciated. I also thank the Public Bill Office, especially for putting up with the many, many amendments we submitted, and for giving us a huge amount of advice on them.

Lastly, I thank the hundreds of organisations that got in touch with us, and the many people who took the time to scrutinise the Bill, raise their concerns, and bring those concerns to us. Of those hundreds of people and organisations, I particularly highlight the work of the National Society for the Prevention of Cruelty to Children. Its staff have been really helpful to work with, and I have very much appreciated their advice and support in drafting our amendments.

Chris Philp Portrait Chris Philp
- Hansard - -

I feel slightly out of place, but I will add some concluding remarks in a moment; I should probably first respond to the substance of the new clause. The power to co-operate with other regulators and share information is, of course, important, but I am pleased to confirm that it is already in the Bill—it is not the first time that I have said that, is it?

Clause 98 amends section 393(2)(a) of the Communications Act 2003. That allows Ofcom to disclose information and co-operate with other regulators. Our amendment will widen the scope of the provision to include carrying out the functions set out in the Bill.

The list of organisations with which Ofcom can share information includes a number of UK regulators—the Competition and Markets Authority, the Information Commissioner, the Financial Conduct Authority and the Payment Systems Regulator—but that list can be amended, via secondary legislation, if it becomes necessary to add further organisations. In the extremely unlikely event that anybody wants to look it up, that power is set out in subsections (3)(i) and (4)(c) of section 393 of the Communications Act 2003. As the power is already created by clause 98, I hope that we will not need to vote on new clause 41.

I echo the comments of the shadow Minister about the Digital Regulation Cooperation Forum. It is a non-statutory body, but it is extremely important that regulators in the digital arena co-operate with one another and co-ordinate their activities. I am sure that we all strongly encourage the relevant regulators to work with the DRCF and to co-operate in this and adjacent fields.

I will bring my remarks to a close with one or two words of thanks. Let me start by thanking Committee members for their patience and dedication over the nine days we have been sitting—50-odd hours in total. It is fair to say that we have given the Bill thorough consideration; of course there is more to come on Report, and that is before we even get to the House of Lords. This is the sixth Bill that I have taken through Committee as Minister, and it is by far the most complicated and comprehensive, running to 194 clauses and 15 schedules, across 213 pages. It has certainly been a labour. Given its complexity, the level of scrutiny it has received has been impressive—sometimes onerous, from my point of view.

The prize for the most perceptive observation during our proceedings definitely goes to the hon. Member for Aberdeen North, who noticed an inconsistency between the use of the word “aural” in clause 49 and “oral” in clause 189, about 120 pages later.

I certainly thank our fantastic Chairs, Sir Roger Gale and Ms Rees, who have chaired our proceedings magnificently and kept us in order, and even allowed us to finish a little early, so huge thanks to them. I also thank the Committee Clerks for running everything so smoothly and efficiently, the Hansard reporters for deciphering our sometimes near-indecipherable utterances, and the Officers of the House for keeping our sittings running smoothly and safely.

I also thank all those stakeholders who have offered us their opinions; I suspect that they will continue to do so during the rest of the passage of the Bill. Their engagement has been important and very welcome, bringing external views into Parliament.

I conclude by thanking the people who have been working on the Bill the longest and hardest: the civil servants in the Department for Digital, Culture, Media and Sport. Some members of the team have been working on the Bill in its various forms, including White Papers and so on, for as long as five years. The Bill has had a long gestation. Over the last few months, as we have been updating the Bill, rushing to introduce it, and perhaps even preparing some amendments for Report, they have been working incredibly hard, so I give a huge thanks to Sarah Connolly and the whole team at DCMS for all their incredible work.

Finally, as we look forward to Report, which is coming up shortly, we are listening, and no doubt flexibility will be exhibited in response to some of the points that have been raised. I look forward to working with members of the Committee and Members of the House more widely as we seek to make the Bill as good as it can be. On that note, I will sit down for the last time.

None Portrait The Chair
- Hansard -

Before I ask Alex Davies-Jones whether she wishes to press the new clause to a vote, I thank you all for the very respectful way in which you have conducted proceedings. It is much appreciated. Let me say on behalf of Sir Roger and myself that it has been an absolute privilege to co-chair this Bill Committee.

Online Safety Bill

Chris Philp Excerpts
Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

The Joint Committee’s recommendation was for a restructuring of the Bill, so that rather than having general duty of care responsibilities that were not defined, we defined those responsibilities based on existing areas of law. The core principle behind the Bill is to take things that are illegal offline and to regulate them online based on the same legal threshold. That is what the Bill does.

In schedule 7, which did not exist in the draft phase, we have written into the Bill a long list of offences in law. I expect that, as this regime develops, the House will add further offences to schedule 7 as priority offences. Even if an offence in law is not listed in the priority illegal harms schedule, it can still be a non-priority harm, meaning that even if a company does not have to look proactively for evidence of that offence, it still has to act if it is made aware of it. I think the law gives us a very wide range of offences, clearly defined against existing law, with well-understood legal thresholds.

The question is: what is to be done about other content that may be harmful but sits below the threshold? The Government have made it clear that we intend to bring forward amendments that set out clear priorities for companies on the reporting of such harmful content, where we expect the companies to set out what their policies are. That will include setting out clearly their policies on things such as online abuse and harassment, the circulation of real or manufactured intimate images, content promoting self-harm, content promoting eating disorders or legal suicide content—this is content relating to adults—so the companies will have to be transparent on that point.

Chris Philp Portrait Chris Philp (Croydon South) (Con)
- View Speech - Hansard - -

I congratulate the Minister on his appointment, and I look forward to supporting him in his role as he previously supported me in mine. I think he made an important point a minute ago about content that is legal but considered to be harmful. It has been widely misreported in the press that this Bill censors or prohibits such content. As the Minister said a moment ago, it does no such thing. There is no requirement on platforms to censor or remove content that is legal, and amendment 71 to clause 13 makes that expressly clear. Does he agree that reports suggesting that the Bill mandates censorship of legal content are completely inaccurate?

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I am grateful to my hon. Friend, and as I said earlier, he is absolutely right. There is no requirement for platforms to take down legal speech, and they cannot be directed to do so. What we have is a transparency requirement to set out their policies, with particular regard to some of the offences I mentioned earlier, and a wide schedule of things that are offences in law that are enforced through the Bill itself. This is a very important distinction to make. I said to him on Second Reading that I thought the general term “legal but harmful” had added a lot of confusion to the way the Bill was perceived, because it created the impression that the removal of legal speech could be required by order of the regulator, and that is not the case.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I welcome the Minister’s commitment, which is something that the previous Minister, the hon. Member for Croydon South (Chris Philp), also committed to in Committee. However, it should have been in the Bill to begin with, or been tabled as an amendment today so that we could discuss it on the Floor of the House. We should not have to wait until the Bill goes to the other place to discuss this fundamental, important point that I know colleagues on the Minister’s own Back Benches have been calling for. Here we are, weeks down the line, with nothing having been done to fix that problem, which we know will persist unless action is taken. It is beyond frustrating that no indication was given in Committee of these changes, because they have wide-ranging consequences for the effects of the Bill. Clearly, the Government are distracted with other matters, but I remind the Minister that Labour has long called for a safer internet, and we are keen to get the Bill right.

Let us start with new clause 14, which provides clarification about how online services should determine whether content should be considered illegal, and therefore how the illegal safety duty should apply. The new clause is deeply problematic, and is likely to reduce significantly the amount of illegal content and fraudulent advertising that is correctly identified and acted on. First, companies will be expected to determine whether content is illegal or fraudulent based on information that is

“reasonably available to a provider”,

with reasonableness determined in part by the size and capacity of the provider. That entrenches the problems I have outlined with smaller, high-risk companies being subject to fewer duties despite the acute risks they pose. A less onerous application of the illegal safety duties will encourage malign actors to migrate illegal activity on to smaller sites that have less pronounced regulatory expectations placed on them. That has particularly concerning ramifications for children’s protections, which I will come on to shortly. On the other end of the scale, larger sites could use new clause 14 to argue that their size and capacity, and the corresponding volumes of material they are moderating, make it impractical for them reliably and consistently to identify illegal content.

The second problem arises from the fact that the platforms will need to have

“reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied”.

That significantly raises the threshold at which companies are likely to determine that content is illegal. In practice, companies have routinely failed to remove content where there is clear evidence of illegal intent. That has been the case in instances of child abuse breadcrumbing, where platforms use their own definitions of what constitutes a child abuse image for moderation purposes. Charities believe it is inevitable that companies will look to use this clause to minimise their regulatory obligations to act.

Finally, new clause 14 and its resulting amendments do not appear to be adequately future-proofed. The new clause sets out that judgments should be made

“on the basis of all relevant information that is reasonably available to a provider.”

However, on Meta’s first metaverse device, the Oculus Quest product, that company records only two minutes of footage on a rolling basis. That makes it virtually impossible to detect evidence of grooming, and companies can therefore argue that they cannot detect illegal content because the information is not reasonably available to them. The new clause undermines and weakens the safety mechanisms that the Minister, his team, the previous Minister, and all members of the Joint Committee and the Public Bill Committee have worked so hard to get right. I urge the Minister to reconsider these amendments and withdraw them.

I will now move on to improving the children’s protection measures in the Bill. In Committee, it was clear that one thing we all agreed on, cross-party and across the House, was trying to get the Bill to work for children. With colleagues in the Scottish National party, Labour Members tabled many amendments and new clauses in an attempt to achieve that goal. However, despite their having the backing of numerous children’s charities, including the National Society for the Prevention of Cruelty to Children, 5Rights, Save the Children, Barnardo’s, The Children’s Society and many more, the Government sadly did not accept them. We are grateful to those organisations for their insights and support throughout the Bill’s passage.

We know that children face significant risks online, from bullying and sexist trolling to the most extreme grooming and child abuse. Our amendments focus in particular on preventing grooming and child abuse, but before I speak to them, I associate myself with the amendments tabled by our colleagues in the Scottish National party, the hon. Members for Aberdeen North (Kirsty Blackman) and for Ochil and South Perthshire (John Nicolson). In particular, I associate myself with the sensible changes they have suggested to the Bill at this stage, including a change to children’s access assessments through amendment 162 and a strengthening of duties to prevent harm to children caused by habit-forming features through amendment 190.

Since the Bill was first promised in 2017, the number of online grooming crimes reported to the police has increased by more than 80%. Last year, around 120 sexual communication with children offences were committed every single week, and those are only the reported cases. The NSPCC has warned that that amounts to a

“tsunami of online child abuse”.

We now have the first ever opportunity to legislate for a safer world online for our children.

However, as currently drafted, the Bill falls short by failing to grasp the dynamics of online child abuse and grooming, which rarely occur on one single platform or app, as mentioned by my hon. Friend the Member for Oldham East and Saddleworth (Debbie Abrahams). In well-established grooming pathways, abusers exploit the design features of open social networks to contact children, then move their communication across to other, more encrypted platforms, including livestreaming sites and encrypted messaging services. For instance, perpetrators manipulate features such as Facebook’s algorithmic friend suggestions to make initial contact with large numbers of children, whom they then groom through direct messages before moving to encrypted services such as WhatsApp, where they coerce children into sending sexual images. That range of techniques is often referred to as child abuse breadcrumbing, and is a significant enabler of online child abuse.

I will give a sense of how easy it is for abusers to exploit children by recounting the words and experiences of a survivor, a 15-year-old girl who was groomed on multiple sites:

“I’ve been chatting with this guy online who’s…twice my age. This all started on Instagram but lately all our chats have been on WhatsApp. He seemed really nice to begin with, but then he started making me do these things to ‘prove my trust’ to him, like doing video chats with my chest exposed. Every time I did these things for him, he would ask for more and I felt like it was too late to back out. This whole thing has been slowly destroying me and I’ve been having thoughts of hurting myself.”

I appreciate that it is difficult listening, but that experience is being shared by thousands of other children every year, and we need to be clear about the urgency that is needed to change that.

It will come as a relief to parents and children that, through amendments 58 to 61, the Government have finally agreed to close the loophole that allowed for breadcrumbing to continue. However, I still wish to speak to our amendments 15, 16, and 17 to 19, which were tabled before the Government changed their mind. Together with the Government’s amendments, these changes will bring into scope tens of millions of interactions with accounts that actively enable the discovery and sharing of child abuse material.

Amendment 15 would ensure that platforms have to include in their illegal content risk assessment content that

“reasonably foreseeably facilitates or aids the discovery or dissemination of CSEA content.”

Amendment 16 would ensure that platforms have to maintain proportionate systems and processes to minimise the presence of such content on their sites. The wording of our amendments is tighter and includes aiding the discovery or dissemination of content, whereas the Government’s amendments cover only “commission or facilitation”. Can the Minister tell me why the Government chose that specific wording and opposed the amendments that we tabled in Committee, which would have done the exact same thing? I hope that in the spirit of collaboration that we have fostered throughout the passage of the Bill with the new Minister and his predecessor, the Minister will consider the merit of our amendments 15 and 16.

Labour is extremely concerned about the significant powers that the Bill in its current form gives to the Secretary of State. We see that approach to the Bill as nothing short of a shameless attempt at power-grabbing from a Government whose so-called world-leading Bill is already failing in its most basic duty of keeping people safe online. Two interlinked issues arise from the myriad powers granted to the Secretary of State throughout the Bill: the first is the unjustified intrusion of the Secretary of State into decisions that are about the regulation of speech, and the second is the unnecessary level of interference with, and threat to, the independence of Ofcom that arises from the powers of direction over its day-to-day matters and operations. That is not good governance, and it is why Labour has tabled a range of important amendments that the Minister must carefully consider. None of us wants the Bill to place undue powers in the hands of only one individual. That is not a normal approach to regulation, so I fail to see why the Government have chosen to go down that route in this case.

Chris Philp Portrait Chris Philp
- View Speech - Hansard - -

I thank the shadow Minister for giving way—I will miss our exchanges across the Dispatch Box. She is making a point about the Secretary of State powers in, I think, clause 40. Is she at all reassured by the undertakings given in the written ministerial statement tabled by the Secretary of State last Thursday, in which the Government committed to amending the Bill in the Lords to limit the use of those powers to exceptional circumstances only, and precisely defined those circumstances as only being in connection with issues such as public health and public safety?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I thank the former Minister for his intervention, and I am grateful for that clarification. We debated at length in Committee the importance of the regulator’s independence and the prevention of overarching Secretary of State powers, and of Parliament having a say and being reconvened if required. I welcome the fact that that limitation on the power will be tabled in the other place, but it should have been tabled as an amendment here so that we could have discussed it today. We should not have to wait for the Bill to go to the other place for us to have our say. Who knows what will happen to the Bill tomorrow, next week or further down the line with the Government in utter chaos? We need this to be done now. The Minister must recognise that this is an unparalleled level of power, and one with which the sector and Back Benchers in his own party disagree. Let us work together and make sure the Bill really is fit for purpose, and that Ofcom is truly independent and without interference and has the tools available to it to really create meaningful change and keep us all safe online once and for all.

--- Later in debate ---
John Nicolson Portrait John Nicolson (Ochil and South Perthshire) (SNP)
- View Speech - Hansard - - - Excerpts

I rise to speak to the amendments in my name and those of other right hon. and hon. Members. I welcome the Minister to his place after his much-deserved promotion; as other hon. Members have said, it is great to have somebody who is both passionate and informed as a Minister. I also pay tribute to the hon. Member for Croydon South (Chris Philp), who is sitting on the Back Benches: he worked incredibly hard on the Bill, displayed a mastery of detail throughout the process and was extremely courteous in his dealings with us. I hope that he will be speedily reshuffled back to the Front Bench, which would be much deserved—but obviously not that he should replace the Minister, who I hope will remain in his current position or indeed be elevated from it.

But enough of all this souking, as we say north of the border. As one can see from the number of amendments tabled, the Bill is not only an enormous piece of legislation but a very complex one. Its aims are admirable—there is no reason why this country should not be the safest place in the world to be online—but a glance through the amendments shows how many holes hon. Members think it still has.

The Government have taken some suggestions on board. I welcome the fact that they have finally legislated outright to stop the wicked people who attempt to trigger epileptic seizures by sending flashing gifs; I did not believe that such cruelty was possible until I was briefed about it in preparation for debates on the Bill. I pay particular tribute to wee Zach, whose name is often attached to what has been called Zach’s law.

The amendments to the Bill show that there has been a great deal of cross-party consensus on some issues, on which it has been a pleasure to work with friends in the Labour party. The first issue is addressed, in various ways, by amendments 44 to 46, 13, 14, 21 and 22, which all try to reduce the Secretary of State’s powers under the Bill. In all the correspondence that I have had about the Bill, and I have had a lot, that is the area that has most aggrieved the experts. A coalition of groups with a broad range of interests, including child safety, human rights, women and girls, sport and democracy, all agree that the Secretary of State is granted too many powers under the Bill, which threatens the independence of the regulator. Businesses are also wary of the powers, in part because they cause uncertainty.

The reduction of ministerial powers under the Bill was advised by the Joint Committee on the Draft Online Safety Bill and by the Select Committee on Digital, Culture, Media and Sport, on both of which I served. In Committee, I asked the then Minister whether any stakeholder had come forward in favour of these powers. None had.

Even DCMS Ministers do not agree with the powers. The new Minister was Chair of the Joint Committee, and his Committee’s report said:

“The powers for the Secretary of State to a) modify Codes of Practice to reflect Government policy and b) give guidance to Ofcom give too much power to interfere in Ofcom’s independence and should be removed.”

The Government have made certain concessions with respect to the powers, but they do not go far enough. As the Minister said, the powers should be removed.

We should be clear about exactly what the powers do. Under clause 40, the Secretary of State can

“modify a draft of a code of practice”.

That allows the Government a huge amount of power over the so-called independent communications regulator. I am glad that the Government have listened to the suggestions that my colleagues and I made on Second Reading and in Committee, and have committed to using the power only in “exceptional circumstances” and to further defining “public policy” motives. But “exceptional circumstances” is still too opaque and nebulous a phrase. What exactly does it mean? We do not know. It is not defined—probably intentionally.

The regulator must not be politicised in this way. Several similar pieces of legislation are going through their respective Parliaments or are already in force. In Germany, Australia, Canada, Ireland and the EU, with the Digital Services Act, different Governments have grappled with the issue of making digital regulation future-proof and flexible. None of them has added political powers. The Bill is sadly unique in making such provision.

When a Government have too much influence over what people can say online, the implications for freedom of speech are particularly troubling, especially when the content that they are regulating is not illegal. There are ways to future-proof and enhance the transparency of Ofcom in the Bill that do not require the overreach that these powers give. When we allow the Executive powers over the communications regulator, the protections must be absolute and iron-clad, but as the Bill stands, it gives leeway for abuse of those powers. No matter how slim the Minister feels the chance of that may be, as parliamentarians we must not allow it.

Amendment 187 on human trafficking is an example of a relatively minor change to the Bill that could make a huge difference to people online. Our amendment seeks to deal explicitly with what Meta and other companies refer to as domestic servitude, which is very newsworthy, today of all days, and which we know better as human trafficking. Sadly, this abhorrent practice has been part of our society for hundreds if not thousands of years. Today, human traffickers are aided by various apps and platforms. The same platforms that connect us with old friends and family across the globe have been hijacked by the very worst people in our world, who are using them to create networks of criminal enterprise, none more cruel than human trafficking.

Investigations by the BBC and The Wall Street Journal have uncovered how traffickers use Instagram, Facebook and WhatsApp to advertise, sell and co-ordinate the trafficking of young women. One would have thought that the issue would be of the utmost importance to Meta—Facebook, as it was at the time—yet, as the BBC reported, The Wall Street Journal found that

“the social media giant only took ‘limited action’ until ‘Apple Inc. threatened to remove Facebook’s products from the App Store, unless it cracked down on the practice’.”

I and my friends across the aisle who sat on the DCMS Committee and the Joint Committee on the draft Bill know exactly what it is like to have Facebook’s high heid yins before us. They will do absolutely nothing to respond to legitimate pressure. They understand only one thing: the force of law and of financial penalty. Only when its profits were in danger did Meta take the issue seriously.

The omission of human trafficking from schedule 7 is especially worrying, because if human trafficking is not directly addressed as priority illegal content, we can be certain that it will not be prioritised by the platforms. We know from their previous behaviour that the platforms never do anything that will cost them money unless they are forced to do so. We understand that it is difficult to regulate in respect of human trafficking on platforms: it requires work across borders and platforms, with moderators speaking different languages. It is not cheap or easy, but it is utterly essential. The social media companies make enormous amounts of money, so let us shed no tears for them and for the costs that will be entailed. If human trafficking is not designated as a priority harm, I fear that it will fall by the wayside.

In Committee, the then Minister said that the relevant legislation was covered by other parts of the Bill and that it was not necessary to incorporate offences under the Modern Slavery Act 2015 into priority illegal content. He referred to the complexity of offences such as modern slavery, and suggested that the priority offences covering illegal immigration and prostitution might already address the problem. That is simply not good enough. Human traffickers use platforms as part of their arsenal at every stage of the process, from luring in victims to co-ordinating their movements and threatening their families. The largest platforms have ample capacity to tackle these problems and must be forced to be proactive. The consequences of inaction will be grave.

Chris Philp Portrait Chris Philp
- Hansard - -

It is a pleasure to follow the hon. Member for Ochil and South Perthshire (John Nicolson).

Let me begin by repeating my earlier congratulations to my hon. Friend the Member for Folkestone and Hythe (Damian Collins) on assuming his place on the Front Bench. Let me also take this opportunity to extend my thanks to those who served on the Bill Committee with me for some 50 sitting hours—it was, generally speaking, a great pleasure—and, having stepped down from the Front Bench, to thank the civil servants who have worked so hard on the Bill, in some cases over many years.

--- Later in debate ---
Joanna Cherry Portrait Joanna Cherry
- Hansard - - - Excerpts

I hear what the hon. Gentleman is saying, but he will have heard the speech made by his colleague, the right hon. Member for Haltemprice and Howden (Mr Davis). Does he not accept that it is correct to say that there is a risk of an increase in content moderation, and does he therefore see the force of my amendment, which we have previously discussed privately and which is intended to ensure that Twitter and other online service providers are subject to anti-discrimination law in the United Kingdom under the Equality Act 2010?

Chris Philp Portrait Chris Philp
- Hansard - -

I did of course hear what was said by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis). To be honest, I think that increased scrutiny of content which might constitute abuse or harassment, whether of women or of ethnic minorities, is to be warmly welcomed. The Bill provides that the risk assessors must pay attention to the characteristics of the user. There is no cross-reference to the Equality Act—I know the hon. and learned Lady has submitted a request on that, to which my successor Minister will now be responding—but there are references to characteristics in the provisions on safety duties, and those characteristics do of course include gender and race.

In relation to the risk that these duties are over-interpreted or over-applied, for the first time ever there is a duty for social media firms to have regard to freedom of speech. At present these firms are under no obligation to have regard to it, but clause 19(2) imposes such a duty, and anyone who is concerned about free speech should welcome that. Clauses 15 and 16 go further: clause 15 creates special protections for “content of democratic importance”, while clause 16 does the same for content of journalistic importance. So while I hugely respect and admire my right hon. Friend the Member for Haltemprice and Howden, I do not agree with his analysis in this instance.

I would now like to ask a question of my successor. He may wish to refer to it later or write to me, but if he feels like intervening, I will of course give way to him. I note that four Government amendments have been tabled; I suppose I may have authorised them at some point. Amendments 72, 73, 78 and 82 delete some words in various clauses, for example clauses 13 and 15. They remove the words that refer to treating content “consistently”. The explanatory note attached to amendment 72 acknowledges that, and includes a reference to new clause 14, which defines how providers should go about assessing illegal content, what constitutes illegal content, and how content is to be determined as being in one of the various categories.

As far as I can see, new clause 14 makes no reference to treating, for example, legal but harmful content “consistently”. According to my quick reading—without the benefit of highly capable advice—amendments 72, 73, 78 and 82 remove the obligation to treat content “consistently”, and it is not reintroduced in new clause 14. I may have misread that, or misunderstood it, but I should be grateful if, by way of an intervention, a later speech or a letter, my hon. Friend the Minister could give me some clarification.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I think that the codes of practice establish what we expect the response of companies to be when dealing with priority illegal harm. We would expect the regulator to apply those methods consistently. If my hon. Friend fears that that is no longer the case, I shall be happy to meet him to discuss the matter.

Chris Philp Portrait Chris Philp
- Hansard - -

Clause 13(6)(b), for instance, states that the terms of service must be

“applied consistently in relation to content”,

and so forth. As far as I can see, amendment 72 removes the word “consistently”, and the explanatory note accompanying the amendment refers to new clause 14, saying that it does the work of the previous wording, but I cannot see any requirement to act consistently in new clause 14. Perhaps we could pick that up in correspondence later.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

If there is any area of doubt, I shall be happy to follow it up, but, as I said earlier, I think we would expect that if the regulator establishes through the codes of practice how a company will respond proactively to identify illegal priority content on its platform, it is inherent that that will be done consistently. We would accept the same approach as part of that process. As I have said, I shall be happy to meet my hon. Friend and discuss any gaps in the process that he thinks may exist, but that is what we expect the outcome to be.

Chris Philp Portrait Chris Philp
- Hansard - -

I am grateful to my hon. Friend for his comments. I merely observe that the “consistency” requirements were written into the Bill, and, as far as I can see, are not there now. Perhaps we could discuss it further in correspondence.

Let me turn briefly to clause 40 and the various amendments to it—amendments 44, 45, 13, 46 and others—and the remarks made by the shadow Minister, the hon. Member for Pontypridd (Alex Davies-Jones), about the Secretary of State’s powers. I intervened on the hon. Lady earlier on this subject. It also arose in Committee, when she and many others made important points on whether the powers in clause 40 went too far and whether they impinged unreasonably on the independence of the regulator, in this case Ofcom. I welcome the commitments made in the written ministerial statement laid last Thursday—coincidentally, shortly after my departure—that there will be amendments in the Lords to limit the exercise of those powers to exceptional circumstances. I heard the point made by the hon. Member for Ochil and South Perthshire that it was unclear what “exceptional” meant. The term has a relatively well-defined meaning in law, but the commitment in the WMS goes further and says that the bases upon which the power can be exercised will be specified and limited to certain matters, such as public health or matters concerning international relations. That will severely limit the circumstances in which those powers can be used, and I think it would be unreasonable to expect Ofcom, as a telecommunications regulator, to have expertise in those other areas. The narrowing is reasonable, for the reasons that I have set out.

Julian Knight Portrait Julian Knight (Solihull) (Con)
- Hansard - - - Excerpts

Those areas are still incredibly broad and open to interpretation. Would it not be easier just to remove the Secretary of State from the process and allow this place to take directly from Ofcom the code of standards that we are talking about so that it can be debated fully in the House?

Chris Philp Portrait Chris Philp
- Hansard - -

I understand my hon. Friend’s point. Through his work as the Chairman of the Select Committee he has done fantastic work in scrutinising the Bill. There might be circumstances where one needed to move quickly, which would make the parliamentary intervention he describes a little more difficult, but he makes his point well.

Julian Knight Portrait Julian Knight
- Hansard - - - Excerpts

So why not quicken up the process by taking the Secretary of State out of it? We will still have to go through the parliamentary process regardless.

Chris Philp Portrait Chris Philp
- Hansard - -

The Government are often in possession of information—for example, security information relating to the UK intelligence community—that Ofcom, as the proposer of a code or a revised code, may not be in possession of. So the ability of the Secretary of State to propose amendments in those narrow fields, based on information that only the Government have access to, is not wholly unreasonable. My hon. Friend will obviously comment further on this in his speech, and no doubt the other place will give anxious scrutiny to the question as well.

I welcome the architecture in new clause 14 in so far as it relates to the definition of illegal content; that is a helpful clarification. I would also like to draw the House’s attention to amendment 16 to clause 9, which makes it clear that acts that are concerned with the commission of a criminal offence or the facilitation of a criminal offence will also trigger the definitions. That is a very welcome widening.

I do not want to try the House’s patience by making too long a speech, given how much the House has heard from me already on this topic, but there are two areas on which, as far as I can see, no amendments are down, but which others who scrutinise the Bill later, particularly in the other place, might want to consider. These are areas that I was minded to look at a bit more over the summer. No doubt it will be a relief to some people that I will not be around to do so. The first of the two areas that might bear more thought is clause 137, which provides for giving academic researchers access to social media platforms. I was struck by Frances Haugen’s evidence on this. The current approach in the Bill is for Ofcom to do a report that will take two years, and I wonder whether there could be a way of speeding that up slightly.

The second area concerns the operation of algorithms promoting harmful content. There is of course a duty to consider how that operates, but when it comes to algorithms promoting harmful content, I wonder whether we could be a bit firmer in the way we treat it. I do not think that would restrain free speech, because the right of free speech is the right to say something; it is not the right to have an algorithm automatically promote it. Again, Frances Haugen had some interesting comments on that.

Jeremy Wright Portrait Sir Jeremy Wright
- Hansard - - - Excerpts

I agree that there is scope for more to be done to enable those in academia and in broader civil society to understand more clearly what the harm landscape looks like. Does my hon. Friend agree that if they had access to the sort of information he is describing, we would be able to use their help to understand more fully and more clearly what we can do about those harms?

Chris Philp Portrait Chris Philp
- Hansard - -

My right hon. and learned Friend is right, as always. We can only expect Ofcom to do so much, and I think inviting expert academic researchers to look at this material would be welcome. There is already a mechanism in clause 137 to produce a report, but on reflection it might be possible to speed that up. Others who scrutinise the Bill may also reach that conclusion. It is important to think particularly about the operation of algorithmic promotion of harmful content, perhaps in a more prescriptive way than we do already. As I have said, Frances Haugen’s evidence to our Committee in this area was particularly compelling.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I agree with my hon. Friend on both points. I discussed the point about researcher access with him last week, when our roles were reversed, so I am sympathetic to that. There is a difference between that and the researcher access that the Digital Services Act in Europe envisages, which will not have the legal powers that Ofcom will have to compel and demand access to information. It will be complementary but it will not replace the primary powers that Ofcom will have, which will really set our regime above those elsewhere. It is certainly my belief that the algorithmic amplification of harmful content must be addressed in the transparency reports and that, where it relates to illegal activities, it must absolutely be within the scope of the regulator to state that actively promoting illegal content to other people is an offence under this legislation.

Chris Philp Portrait Chris Philp
- Hansard - -

On my hon. Friend’s first point, he is right to remind the House that the obligations to disclose information to Ofcom are absolute; they are hard-edged and they carry criminal penalties. Researcher access in no way replaces that; it simply acts as a potential complement to it. On his second point about algorithmic promotion, of course any kind of content that is illegal is prohibited, whether algorithmically promoted or otherwise. The more interesting area relates to content that is legal but perceived as potentially harmful. We have accepted that the judgments on whether that content stays up or not are for the platforms to make. If they wish, they can choose to allow that content simply to stay up. However, it is slightly different when it comes to algorithmically promoting it, because the platform is taking a proactive decision to promote it. That may be an area that is worth thinking about a bit more.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

On that point, if a platform has a policy not to accept a certain sort of content, I think the regulators should expect it to say in its transparency report what it is doing to ensure that it is not actively promoting that content through a newsfeed, on Facebook or “next up” on YouTube. I expect that to be absolutely within the scope of the powers we have in place.

Chris Philp

In terms of content that is legal but potentially harmful, as the Bill is drafted, the platforms will have to set out their policies, but those policies can say whatever they like, as we discussed earlier. A policy could include actively promoting harmful content through algorithms for commercial purposes; at the moment, the Bill as constructed gives platforms that freedom, and I wonder whether it is an area we could make slightly more prescriptive. Giving them the option to leave the content up relates to the free speech point, and I accept that, but choosing to promote it algorithmically is slightly different. As things stand, they are free to algorithmically promote content that is toxic but falls just on the right side of legality, and I question whether that freedom should exist. It is a difficult and complicated topic and we will not make progress on it today, but it might be worth giving it a little more thought.

I think I have probably spoken for long enough on this Bill, not just today but over the last few months. I broadly welcome these amendments, but I am sure that as the Bill completes its stages, in the other place as well, there will be opportunities to fine-tune it, and all of us can contribute to that.

Dame Margaret Hodge

First, congratulations to the Under-Secretary of State for Digital, Culture, Media and Sport, the hon. Member for Folkestone and Hythe (Damian Collins). I think his is one of the very few appointments in these latest shenanigans that is based on expertise and ability. I really welcome him, and the work he has done on the Bill this week has been terrific. I also thank the hon. Member for Croydon South (Chris Philp). When he held the position, he was open to discussion and he accepted a lot of ideas from many of us across the House. As a result, I think we have a better Bill before us today than we would have had. My gratitude goes to him as well.

I support much of the Bill, and its aim of making the UK the safest place to be online is one that we all share. I support the systems-based approach and the role of Ofcom. I support holding the platforms to account and the importance of protecting children. I also welcome the cross-party work that we have done as Back Benchers, and the roles played by both Ministers and by the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright). I thank him for his openness and his willingness to talk to us. Important amendments have been agreed on fraudulent advertising, on bringing forward direct liability so that there is not a two-year wait, and on epilepsy trolling; my hon. Friend the Member for Batley and Spen (Kim Leadbeater) promoted that last amendment.

I also welcome the commitment to bring forward amendments in the Lords relating to the amendments tabled by the hon. Member for Brigg and Goole (Andrew Percy) and the right hon. and learned Member for Kenilworth and Southam; I think those amendments are on the amendment paper, but it is difficult to tell. It is important that whether platforms are subject to regulation should be based not on their size and functionality but on the risk of harm they present. I look forward to seeing those amendments when they come back from the other place. We all know that the smallest platforms can present the greatest risk. The killing of 51 people in the mosques in Christchurch, New Zealand, is probably the most egregious example, as the individual concerned had been on 8chan before committing that crime.

I am speaking to amendments 156 and 157 in my name and in the names of other hon. and right hon. Members. These amendments would address the issue of anonymous abuse. I think we all accept that anonymity is hugely important, particularly to vulnerable groups such as victims of domestic violence, victims of child abuse and whistleblowers. We want to retain anonymity for a whole range of groups and, in framing these amendments, I was very conscious of our total commitment to doing so.

Equally, freedom of speech is very important, as the right hon. Member for Haltemprice and Howden (Mr Davis) said, but freedom of speech has never meant freedom to harm; that is not a right this House should promote. Harm is difficult to define, and it is difficult to get the parameters right, but we should not treat freedom of speech as an absolute right without constraints.