Debates between Chris Philp and Maria Miller during the 2019 Parliament


Independent Cultural Review of London Fire Brigade

Debate between Chris Philp and Maria Miller
Monday 28th November 2022


Commons Chamber

Urgent Questions are proposed each morning by backbench MPs, and up to two may be selected each day by the Speaker. Chosen Urgent Questions are announced 30 minutes before Parliament sits each day.

Each Urgent Question requires a Government Minister to give a response on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record

Chris Philp

I completely agree with the hon. Lady that the behaviour and incidents that she has just enumerated, which were uncovered by the report, are completely unacceptable. They have no place in any modern public service, whether that is the fire service or anywhere else. I am sure the whole House will join her and me in condemning that sort of behaviour unreservedly.

I spoke to London fire commissioner Andy Roe on Friday to set out my strong feelings that this behaviour is totally unacceptable and needs to completely end. As the hon. Lady said, he has committed to implementing all 23 of the report’s recommendations, including, importantly, outsourcing the complaints service, so that complaints are dealt with externally to the London Fire Brigade, and going back and looking again at all the complaints made over the last five years, to make sure they have been properly investigated—clearly, in many cases they have not been. He committed to ensuring that anyone found guilty of the sort of behaviour that she outlined from the report will be removed from their position. As I say, the behaviour that has been uncovered is totally unacceptable, and I am sure the whole House will join in condemning it.

Dame Maria Miller (Basingstoke) (Con)

I welcome my right hon. Friend’s statement and agree that there is absolutely no place for racism or bullying in our society or any of our public services. Will he outline what he might also be doing to ensure that disciplinary measures are dealt with in a timely manner? There was a disciplinary issue in my local police force, as opposed to fire service; that was dealt with well, but it took three years. Will my right hon. Friend try to ensure that such cases will be dealt with in a more timely manner in future, whether in the fire service or police force?

Chris Philp

My right hon. Friend is right about timeliness; that is one of the reasons why the London Fire Brigade Commissioner has said that he will be outsourcing the handling of complaints: to make sure that they are dealt with faster. Things work a bit differently in the police force, but there is an issue with timeliness there too. A number of police officers, including both the Commissioner and Deputy Commissioner of the Metropolitan Police, have raised the issue with me as well. We are looking at a number of ways of speeding up the process, including potentially through legislation. I completely recognise what my right hon. Friend has said and we are actively working on that at the moment.

Online Safety Bill (Sixteenth sitting)

Debate between Chris Philp and Maria Miller
Committee stage
Tuesday 28th June 2022


Public Bill Committees
Chris Philp

The shadow Minister is right to raise the issue of women and girls being disproportionately—one might say overwhelmingly—the victims of certain kinds of abuse online. We heard my right hon. Friend the Member for Basingstoke, the shadow Minister and others set that out in a previous debate. The shadow Minister is right to raise the issue.

Tackling violence against women and girls has been a long-standing priority of the Government. Indeed, a number of important new offences have already been and are being created, with protecting women principally in mind—the offence of controlling or coercive behaviour, set out in the Serious Crime Act 2015 and amended in the Domestic Abuse Act 2021; the creation of a new stalking offence in 2012; a revenge porn offence in 2015; and an upskirting offence in 2019. All of those offences are clearly designed principally to protect women and girls, who are overwhelmingly the victims of those offences. Indeed, the cyber-flashing offence created by clause 156—the first time we have ever had such an offence in this jurisdiction—will, again, overwhelmingly benefit women and girls, who are the victims of that offence.

All of the criminal offences I have mentioned—even if they are not mentioned in schedule 7, which I will come to in a moment—will automatically flow into the Bill via the provisions of clause 52(4)(d). Criminal offences where the victim is an individual, which these clearly all are, automatically flow into the provisions of the Bill, including the offences I just listed, which have been created particularly with women in mind.

Dame Maria Miller (Basingstoke) (Con)

I hope that my hon. Friend will discuss the Law Commission’s recommendations on intimate image abuse. When I raised this issue in an earlier sitting, he was slightly unsighted by the fact that the recommendations were about to come out—I can confirm again that they will come out on 7 July, after some three years of deliberation. It is unfortunate that that will be a week after the end of the Committee’s deliberations, and I hope that the timing will not preclude the Minister from mopping it up in his legislation.

Chris Philp

I thank my right hon. Friend for her question and for her tireless work in this area. As she says, the intimate image abuse offence being worked on is an extremely important piece in the jigsaw puzzle to protect women, particularly as it has as its threshold—at least in the previous draft—consent, without any test of intent, which addresses some points made by the Committee previously. As we have discussed before, it is a Ministry of Justice lead, and I am sure that my right hon. Friend will make representations to MOJ colleagues to elicit a rapid confirmation of its position on the recommendations, so that we can move to implement them as quickly as possible.

I remind the Committee of the Domestic Abuse Act 2021, which was also designed to protect women. Increased penalties for stalking and harassment have been introduced, and we have ended the automatic early release of violent and sex offenders from prison—something I took through Parliament as a Justice Minister a year or two ago. Previously, violent and sex offenders serving standard determinate sentences were often released automatically at the halfway point of their sentence, but we have now ended that practice. Rightly, a lot has been done outside the Bill to protect women and girls.

Let me turn to what the Bill does to further protect women and girls. Schedule 7 sets out the priority offences—page 183 of the Bill. In addition to all the offences I have mentioned previously, which automatically flow into the illegal safety duties, we have set out priority offences whereby companies must not just react after the event, but proactively prevent the offence from occurring in the first place. I can tell the Committee that many of them have been selected because we know that women and girls are overwhelmingly the victims of such offences. Line 21 lists the offence of causing

“intentional harassment, alarm or distress”.

Line 36 mentions the offence of harassment, and line 37 the offence of stalking. Those are obviously offences where women and girls are overwhelmingly the victims, which is why we have picked them out and put them in schedule 7—to make sure they have the priority they deserve.

Online Safety Bill (Fourteenth sitting)

Debate between Chris Philp and Maria Miller
Committee stage
Tuesday 21st June 2022


Public Bill Committees
Chris Philp

It is a fair question. There might be circumstances in which somebody simply misjudges a situation—has not interpreted it correctly—and ends up committing a criminal offence; stumbling into it almost by accident. Most criminal offences require some kind of mens rea—some kind of intention to commit a criminal offence. If a person does something by accident, without intention, that does not normally constitute a criminal offence. Most criminal offences on the statute book require the person committing the offence to intend to do something bad. If we replace the word “intent” with “without consent”, the risk is that someone who does something essentially by accident will have committed a criminal offence.

I understand that the circumstances in which that might happen are probably quite limited, and the context of the incidents that the hon. Member for Pontypridd and my right hon. Friend the Member for Basingstoke have described would generally support the fact that there is a bad intention, but we have to be a little careful not accidentally to draw the line too widely. If a couple are exchanging images, do they have to consent prior to the exchange of every single image? We have to think carefully about such circumstances before amending the clause.

Dame Maria Miller

I have to say, just as an aside, that the Minister has huge levels of empathy, so I am sure that he can put himself into the shoes of someone who receives such an image. I am not a lawyer, but I know that there is a concept in law of acting recklessly, so if someone acts recklessly, as my hon. Friend has set out in his Bill, they can be committing a criminal offence. That is why I thought he might want to consider not having the conditional link between the two elements of subsection (1)(b), but instead having them as an either/or. If he goes back to the Law Commission’s actual recommendations, rather than the interpretation he was given by the MOJ, he will see that they set out that one of the conditions should be that defendants who are posting in this way are likely to cause harm. If somebody is acting in a way that is likely to cause harm, they would be transgressing. The Bill acknowledges that somebody can act recklessly. It is a well-known concept in law that people can be committing an offence if they act recklessly—reckless driving, for example. I wonder whether the Minister might think about that, knowing how difficult it would be to undertake what the hon. Member for Pontypridd is talking about, as it directly contravenes the Law Commission’s recommendations. I do not think what I am suggesting would contravene the Law Commission’s recommendations.

Chris Philp

I will commit to consider the clause further, as my right hon. Friend has requested. It is important to do so in the context of the Law Commission’s recommendations, but she has pointed to wording in the Law Commission’s original report that could be used to improve the drafting here. I do not want to make a firm commitment to change, but I will commit to considering whether the clause can be improved upon. My right hon. Friend referred to the “likely to cause harm” test, and asked whether recklessness as to whether someone suffers alarm, distress or humiliation could be looked at as a separate element. We need to be careful; if we sever that from sexual gratification, we need to have some other qualification on sexual gratification. We might have sexual gratification with consent, which would be fine. If we severed them, we would have to add another qualification.

It is clear that there is scope for further examination of clause 156. That does not necessarily mean it will be possible to change it, but it is worth examining it further in the light of the comments made by my right hon. Friend. The testimony we heard from witnesses, the testimony of my right hon. Friend and what we heard from the hon. Member for Pontypridd earlier do demonstrate that this is a widespread problem that is hugely distressing and intrusive and that it represents a severe violation. It does need to be dealt with properly.

We need to be cognisant of the fact that in some communities there is a culture of these kinds of pictures being freely exchanged between people who have not met or communicated before—on some dating websites, for example. We need to draft the clause in such a way that it does not inadvertently criminalise those communities—I have been approached by members of those communities who are concerned.

Online Safety Bill (Eleventh sitting)

Debate between Chris Philp and Maria Miller
Dame Maria Miller (Basingstoke) (Con)

I have just a short comment on these clauses. I very much applaud the Government’s approach to the funding of Ofcom through this mechanism. Clause 75 sets out clearly that the fees payable to Ofcom under section 71 should only be

“sufficient to meet, but…not exceed the annual cost to OFCOM”.

That is important when we start to think about victim support. While clearly Ofcom will have a duty to monitor the efficacy of the mechanisms in place on social media platforms, it is not entirely clear to me from the evidence or conversations with Ofcom whether it will see it as part of its duty to ensure that other areas of victim support are financed through those fees.

It may well be that the Minister thinks it more applicable to look at this issue when we consider the clauses on fines, and I plan to come to it at that point, but it would be helpful to understand whether he sees any role for Ofcom in ensuring that there is third-party specialist support for victims of all sorts of crime, including fraud or sexual abuse.

Chris Philp

Let me start by associating myself with the remarks by the hon. Member for Worsley and Eccles South. We are in complete concurrence with the concept that the polluter should pay. Where there are regulatory costs caused by the behaviour of the social media firms that necessitates the Bill, it is absolutely right that those costs should fall on them and not on the general taxpayer. I absolutely agree with the principles that she outlined.

The hon. Lady raised a question about clause 70(6) and the potential exemption from the obligation to pay fees. That is a broadly drawn power, and the phrasing used is where

“OFCOM consider that an exemption…is appropriate”

and where the Secretary of State agrees. The Bill is not being prescriptive; it is intentionally providing flexibility in case there are circumstances where levying the fees might be inappropriate or, indeed, unjust. It is possible to conceive of an organisation that somehow exceeds the size threshold, but so manifestly does not need regulation that it would be unfair or unjust to levy the fees. For example, if a charity were, by some accident of chance, to fall into scope, it might qualify. But we expect social media firms to pay these bills, and I would not by any means expect the exemption to be applied routinely or regularly.

On the £88 million and the £110 million that have been referenced, the latter amount covers the three-year spending review period: the current financial year, 2022-23, plus 2023-24 and 2024-25. Of that £110 million, £88 million is allocated to Ofcom in the first two financial years; the remainder is allocated to DCMS for its work over the three-year period of the spending review. The £88 million for Ofcom runs out at the end of 2023-24.

The hon. Lady then asked whether the statutory fees in these clauses will kick in when the £88 million runs out—whether they will be available in time. The answer is yes. We expect and intend that the fees we are debating will become effective in 2024-25, so they will pick up where the £88 million finishes.

Ofcom will set the fees at a level that recoups its costs, so if the Bill becomes larger in scope, for example through amendments in the Commons or the Lords—not that I wish to encourage amendments—and the duties on Ofcom expand, we would expect the fees to be increased commensurately to cover any increased cost that our legislation imposes.

Online Safety Bill (Twelfth sitting)

Debate between Chris Philp and Maria Miller
Alex Davies-Jones

I will be brief. Labour welcomes clause 110, which addresses the process of starting enforcement. We support the process, particularly the point that ensures that Ofcom must first issue a “provisional notice of contravention” to an entity before it reaches its final decision.

The clause ultimately ensures that the process for Ofcom issuing a provisional notice of contravention can take place only after a full explanation and deadline has been provided for those involved. Thankfully, this process means that Ofcom can reach a decision only after allowing the recipient a fair opportunity to make relevant representations too. The process must be fair for all involved and that is why we welcome the provisions outlined in the clause.

Dame Maria Miller (Basingstoke) (Con)

I hope that I am speaking at the right stage of the Bill, and I promise not to intervene at any further stages where this argument could be put forward.

Much of the meat of the Bill is within chapter 6. It establishes what many have called the “polluter pays” principle, where an organisation that contravenes can then be fined—a very important part of the Bill. We are talking about how Ofcom is going to be able to make the provisions that we have set out work in practice. A regulated organisation that fails to stop harm contravenes and will be fined, and fined heavily.

I speak at this point in the debate with slight trepidation, because these issues are also covered in clause 117 and schedule 12, but it is just as relevant to debate the point at this stage. It is difficult to understand where in the Bill the Government set out how the penalties that they can levy as a result of the powers under this clause will be used. Yes, they will be a huge deterrent, and that is good in its own right and important, but surely the real opportunity is to make the person who does the harm pay for righting the wrong that they have created.

That is not a new concept. Indeed, it is one of the objectives that the Government set out in the intentions behind their approach to the draft victims Bill. It is a concept used in the Investigatory Powers Act 2016. It is the concept behind the victims surcharge. So how does this Bill make those who cause harm take greater responsibility for the cost of supporting victims to recover from what they have suffered? That is exactly what the Justice Ministers set out as being so important in their approach to victims. In the Bill, that is not clear to me.

At clause 70, the Minister helpfully set out that there was absolutely no intention for Ofcom to have a role in supporting victims individually. In reply to the point that I made at that stage, he said that the victims Bill would address some of the issues—I am sure that he did not say all the issues, but some of them at least. I do not believe that it will. The victims Bill establishes a code and a duty to provide victim support, but it makes absolutely no reference to how financial penalties on those who cause harm—as set out so clearly in this Bill—will be used to support victims. How will they support victims’ organisations, which do so much to help in particular those who do not end up in court, before a judge, because what they have suffered does not warrant that sort of intervention?

I believe that there is a gap. We heard that in our evidence session, including from Ofcom itself, which identified the need for law enforcement, victim-support organisations and platforms themselves to find what the witnesses described as an effective way for the new “ecosystem” to work. Victim-support organisations went further and argued strongly for the need for victims’ voices to be heard independently. The NSPCC in particular made a very powerful argument for children’s voices needing to be heard and for having independent advocacy. There would be a significant issue with trust levels if we were to rely solely on the platforms themselves to provide such victim support.

There are a couple of other reasons why we need the Government to tease the issue out. We are talking about the most significant culture change imaginable for the online platforms to go through. There will be a lot of good will, I am sure, to achieve that culture change, but there will also be problems along the way. Again referring back to our evidence sessions, the charity Refuge said that reporting systems are “not up to scratch” currently. There is a lot of room for change. We know that Revenge Porn Helpline has seen a continual increase in demand for its services in support of victims, in particular following the pandemic. Its revenue and funding are also somewhat hand to mouth.

Victim support organisations will have a crucial role in assisting Ofcom with the elements outlined in chapter 6, of which clause 110 is the start, in terms of monitoring the reality for users of how the platforms are performing. The “polluter pays” principle is not working quite as the Government might want it to in the Bill. My solution is for the Minister to consider talking to his colleagues in the Treasury about whether this circle could be squared—whether we could complete the circle—by having some sort of hypothecation of the financial penalties, so that some of the huge amount that will be levied in penalties can be put into a fund that can be used directly to support victims’ organisations. I know that that requires the Department for Digital, Culture, Media and Sport and the Ministry of Justice to work together, but my hon. Friend is incredibly good at collaborative working, and I am sure he will be able to achieve that.

This is not an easy thing. I know that the Treasury would not welcome Committees such as this deciding how financial penalties are to be used, but this is not typical legislation. We are talking about enormous amounts of money and enormous numbers of victims, as the Minister himself has set out when we have tried to debate some of these issues. He could perhaps undertake to raise this issue directly with the Treasury, and perhaps get it to look at how much money is currently going to organisations to support victims of online abuse and online fraud—the list goes on—and to see whether we will have to take a different approach to ensure that the victims we are now recognising get the support he and his ministerial colleagues want to see.

Chris Philp

First, on the substance of the clause, as the shadow Minister said, the process of providing a provisional notice of contravention gives the subject company a fair chance to respond and put its case, before the full enforcement powers are brought down on its head, and that is of course only reasonable, given how strong and severe these powers are. I am glad there is once again agreement between the two parties.

I would like to turn now to the points raised by my right hon. Friend the Member for Basingstoke, who, as ever, has made a very thoughtful contribution to our proceedings. Let me start by answering her question as to what the Bill says about where fines that are levied will go. We can discover the answer to that question in paragraph 8 of schedule 12, which appears at the bottom of page 206 and the top of page 207—in the unlikely event that Members had not memorised that. If they look at that provision, they will see that the Bill as drafted provides that fines that are levied under the powers provided in it and that are paid to Ofcom get paid over to the Consolidated Fund, which is essentially general Treasury resources. That is where the money goes under the Bill as drafted.

My right hon. Friend asks whether some of the funds could be, essentially, hypothecated and diverted directly to pay victims. At the moment, the Government are dealing with victims, or pay for services supporting victims, not just via legislation—the victims Bill—but via expenditure that, I think, is managed by the Ministry of Justice to support victims and organisations working with victims in a number of ways. I believe that the amount earmarked for this financial year is in excess of £300 million, which is funded just via the general spending review. That is the situation as it is today.

I am happy to ask colleagues in Government the question that my right hon. Friend raises. It is really a matter for the Treasury, so I am happy to pass her idea on to it. But I anticipate a couple of responses coming from the Treasury in return. I would anticipate it first saying that allocating money to a particular purpose, including victims, is something that it likes to do via spending reviews, where it can balance all the demands on Government revenue, viewed in the round.

Secondly, it might say that the fine income is very uncertain; we do not know what it will be. One year it could be nothing; the next year it could be billions and billions of pounds. It depends on the behaviour of these social media firms. In fact, if the Bill does its job and they comply with the duties as we want and expect them to, the fines could be zero, because the firms do what they are supposed to. Conversely, if they misbehave, as they have been doing until now, the fines could be enormous. If we rely on hypothecation of these fines as a source for funding victim services, it might be that, in a particular year, we discover that there is no income, because no fines have been levied.

Online Safety Bill (Ninth sitting)

Debate between Chris Philp and Maria Miller
Chris Philp

I am afraid it was not me that cited new information. It was my hon. Friend the Member for Watford who said he had had further discussions with Ministers. I am delighted to hear that he found those discussions enlightening, as I am sure they—I want to say they always are, but let us say they often are.

Dame Maria Miller (Basingstoke) (Con)

Before my hon. Friend moves on, can I ask a point of clarification? The hon. Member for Ochil and South Perthshire is right that this is an important point, so we need to understand it thoroughly. I think he makes a compelling argument about the exceptional circumstances. If Ofcom did not agree that a change that was being requested was in line with what my hon. Friend the Minister has said, how would it be able to discuss or, indeed, challenge that?

Chris Philp

My right hon. Friend raises a good question. In fact, I was about to come on to the safeguards that exist to address some of the concerns that have been raised this morning. Let me jump to the fourth of the safeguards, which in many ways is the most powerful and directly addresses my right hon. Friend’s question.

In fact, a change has been made. The hon. Member for Ochil and South Perthshire asked what changes had been made, and one important change—perhaps the change that my hon. Friend the Member for Watford found convincing—was the insertion of a requirement for the codes, following a direction, to go before Parliament and be voted on using the affirmative procedure. That is a change. The Bill previously did not have that in it. We inserted the use of the affirmative procedure to vote on a modified code in order to introduce extra protections that did not exist in the draft of the Bill that the Joint Committee commented on.

I hope my right hon. Friend the Member for Basingstoke will agree that if Ofcom had a concern and made it publicly known, Parliament would be aware of that concern before voting on the revised code using the affirmative procedure. The change to the affirmative procedures gives Parliament extra control. It gives parliamentarians the opportunity to respond if they have concerns, if third parties raise concerns, or if Ofcom itself raises concerns.

--- Later in debate ---
Chris Philp

I thank the hon. Lady for her rapid description of that amendment. We will come to clause 189 in due course. The definition of “content” in that clause is,

“anything communicated by means of an internet service”,

which sounds like it is quite widely drafted. However, we will obviously debate this issue properly when we consider clause 189.

The remaining question—

Dame Maria Miller

I intervene rather than making a subsequent substantive contribution because I am making a very simple point. My hon. Friend the Minister is making a really compelling case about the need for freedom of speech and the need to protect it within the context of newspapers online. However, could he help those who might be listening to this debate today to understand who is responsible if illegal comments are made on newspaper websites? I know that my constituents would be concerned about that, not particularly if illegal comments were made about a Member of Parliament or somebody else in the public eye, but about another individual not in the public eye.

What redress would that individual have? Would it be to ask the newspaper to take down that comment, or would it be that they could find out the identity of the individual who made the comment, or would it be that they could take legal action? If he could provide some clarity on that, it might help Committee members to understand even further why he is taking the position that he is taking.

Chris Philp

I thank my right hon. Friend for that intervention. First, clearly if something illegal is said online about someone, they would have the normal redress to go to the police and the police could seek to exercise their powers to investigate the offence, including requesting the company that hosts the comments—in this case, it would be a newspaper’s or broadcaster’s website—to provide any relevant information that might help to identify the person involved; they might have an account, and if they do not they might have a log-on or IP address. So, the normal criminal investigatory procedures would obviously apply.

Secondly, if the content was defamatory—I realise that only people like Arron Banks can sue for libel—there is obviously civil recourse for libel. I think there are powers in the civil procedure rules that allow for court orders to be made that require organisations, such as news media websites, to disclose information that would help to identify somebody who is a respondent in a civil case.

Thirdly, there are obviously the voluntary steps that the news publisher might take to remove content. News publishers say that they do that; obviously, their implementation, as we know, is patchy. Nevertheless, there is that voluntary route.

Regarding any legal obligation that may fall on the shoulders of the news publisher itself, I am not sure that I have sufficient legal expertise to comment on that. However, I hope that those first three areas of redress that I have set out give my right hon. Friend some assurance on this point.

Finally, I turn to a question asked by the hon. Member for Aberdeen North. She asked whether the exemption for “one-to-one live aural communications”, as set out in clause 49(2)(d), could inadvertently allow grooming or child sexual exploitation to occur via voice messages that accompany games, for example. The exemption is designed to cover what are essentially phone calls, such as Skype conversations—one-to-one conversations that are low-risk.

We believe that the Bill contains other duties to ensure that services are designed to reduce the risk of grooming and to address risks to children, if those risks exist, such as on gaming sites. I would be happy to come back to the hon. Lady with a better analysis and explanation of where those duties sit in the Bill, but there are very strong duties elsewhere in the Bill that impose those obligations to conduct risk assessments and to keep children safe in general. Indeed, the very strongest provisions in the Bill are around stopping child sexual exploitation and abuse, as set out in schedule 6.

Finally, there is a power in clause 174(1) that allows us, as parliamentarians and the Government, to repeal this exemption using secondary legislation. So, if we found in the future that this exemption caused a problem, we could remove it by passing secondary legislation.

--- Later in debate ---
Chris Philp:

I thank right hon. and hon. Members who have participated in the debate on this extremely important clause. It is extremely important because the Bill’s strongest provisions relate to illegal content, and the definition of illegal content set out in the clause is the starting point for those duties.

A number of important questions have been asked, and I would like to reply to them in turn. First, I want to speak directly about amendment 61, which was moved by the shadow Minister and which very reasonably and quite rightly asked the question about physically where in the world a criminal offence takes place. She rightly said that in the case of violence against some children, for example, that may happen somewhere else in the world but be transmitted on the internet here in the United Kingdom. On that, I can point to an existing provision in the Bill that does exactly what she wants. Clause 52(9), which appears about two thirds of the way down page 49 of the Bill, states:

“For the purposes of determining whether content amounts to an offence, no account is to be taken of whether or not anything done in relation to the content takes place in any part of the United Kingdom.”

What that is saying is that it does not matter whether the act of concern takes place physically in the United Kingdom or somewhere else, on the other side of the world. That does not matter in looking at whether something amounts to an offence. If it is criminal under UK law but it happens on the other side of the world, it is still in scope. Clause 52(9) makes that very clear, so I think that that provision is already doing what the shadow Minister’s amendment 61 seeks to do.

The shadow Minister asked a second question about the definition of illegal content, whether it involves a specific act and how it interacts with the “systems and processes” approach that the Bill takes. She is right to say that the definition of illegal content applies item by item. However, the legally binding duties in the Bill, which we have already debated in relation to previous clauses, apply to categories of content and to putting in place “proportionate systems and processes”—I think that that is the phrase used. Therefore, although the definition is particular, the duty is more general, and has to be met by putting in place systems and processes. I hope that my explanation provides clarification on that point.

The shadow Minister asked another question about the precise definitions of how the platforms are supposed to decide whether content meets the definition set out. She asked, in particular, questions about how to determine intent—the mens rea element of the offence. She mentioned that Ofcom had had some comments in that regard. Of course, the Government are discussing all this closely with Ofcom, as people would expect. I will say to the Committee that we are listening very carefully to the points that are being made. I hope that that gives the shadow Minister some assurance that the Government’s ears are open on this point.

The next and final point that I would like to come to was raised by all speakers in the debate, but particularly by my right hon. Friend the Member for Basingstoke, and is about violence against women and girls—an important point that we have quite rightly debated previously and come to again now. The first general point to make is that clause 52(4)(d) makes it clear that relevant offences include offences where the intended victim is an individual, so any violence towards and abuse of women and girls is obviously included in that.

As my right hon. Friend the Member for Basingstoke and others have pointed out, women suffer disproportionate abuse and are disproportionately the victims of criminal offences online. The hon. Member for Aberdeen North pointed out how a combination of protected characteristics can make the abuse particularly impactful—for example, if someone is a woman and a member of a minority. Those are important and valid points. I can reconfirm, as I did in our previous debate, that when Ofcom drafts the codes of practice on how platforms can meet their duties, it is at liberty to include such considerations. I echo the words spoken a few minutes ago by my right hon. Friend the Member for Basingstoke: the strong expectation across the House—among all parties here—is that those issues will be addressed in the codes of practice to ensure that those particular vulnerabilities and those compounded vulnerabilities are properly looked at by social media firms in discharging those duties.

My right hon. Friend also made points about intimate image abuse when the intimate images are made without the consent of the subject—the victim, I should say. I would make two points about that. The first relates to the Bill and the second looks to the future and the work of the Law Commission. On the Bill, we will come in due course to clause 150, which relates to the new harmful communications offence, and which will criminalise a communication—the sending of a message—when there is a real and substantial risk of it causing harm to the likely audience and there is intention to cause harm. The definition of “harm” in this case is psychological harm amounting to at least serious distress.

Clearly, if somebody is sending an intimate image without the consent of the subject, it is likely that that will cause harm to the likely audience. Obviously, if someone sends a naked image of somebody without their consent, that is very likely to cause serious distress, and I can think of few reasons why somebody would do that unless it was their intention, meaning that the offence would be made out under clause 150.

My right hon. Friend has strong feelings, which I entirely understand, that to make the measure even stronger the test should not involve intent at all, but should simply be a question of consent. Was there consent or not? If there was no consent, an offence would have been committed, without needing to go on to establish intention as clause 150 provides. As my right hon. Friend has said, Law Commission proposals are being developed. My understanding is that the Ministry of Justice, which is the Department responsible for this offence, is expecting to receive a final report, I am told, over the summer. It would then clearly be open to Parliament to legislate to put the offence into law, I hope as quickly as possible.

Once that happens, through whichever legislative vehicle, it will have two implications. First, the offence will automatically and immediately be picked up by clause 52(4)(d) and brought within the scope of the Bill because it is an offence where the intended victim is an individual. Secondly, there will be a power for the Secretary of State and for Parliament, through clause 176, I think—I am speaking from memory; yes, it is clause 176, not that I have memorised every clause in the Bill—via statutory instrument not only to bring the offence into the regular illegal safety duties, but to add it to schedule 7, which contains the priority offences.

Once that intimate image abuse offence is in law, via whichever legislative vehicle, that will have that immediate effect with respect to the Bill, and by statutory instrument it could be made a priority offence. I hope that gives my right hon. Friend a clear sense of the process by which this is moving forward.

Dame Maria Miller:

I thank the Minister for such a clear explanation of his plan. Can he confirm that the Bill is a suitable legislative vehicle? I cannot see why it would not be. I welcome his agreement about the need for additional legislation over and above the communications offence. In the light of the way that nudification software and deepfakes are advancing, and the challenges that our law enforcement agencies face in interpreting those quite complex notions, a straightforward law making it clear that publishing such images is a criminal offence would not only help law enforcement agencies, but help the perpetrators to understand that what they are doing is a crime and that they should stop.

Online Safety Bill (Tenth sitting)

Debate between Chris Philp and Maria Miller
Committee stage
Tuesday 14th June 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 14 June 2022
Chris Philp:

I am very happy to reply to the various queries that have been made. I will start with the points on vaccine disinformation raised by the hon. Members for Ochil and South Perthshire and for Pontypridd. The Government strongly agree with the points they made about the damaging effects of vaccine misinformation and the fact that many of our fellow citizens have probably died as a result of being misled into refusing the vaccine when it is, of course, perfectly safe. We strongly share the concerns they have articulated.

Over the past two years, the Department for Digital, Culture, Media and Sport has worked together with other Departments to develop a strong operational response to this issue. We have established a counter-disinformation unit within DCMS whose remit is to identify misinformation and work with social media firms to get it taken down. The principal focus of that unit during the pandemic was, of course, covid. In the past three months, it has focused more on the Russia-Ukraine conflict, for obvious reasons.

In some cases, Ministers have engaged directly with social media firms to encourage them to remove content that is clearly inappropriate. For example, in the Russia-Ukraine context, I have had conversations with social media companies that have left up clearly flagrant Russian disinformation. This is, therefore, an area that the Government are concerned about and have been acting on operationally already.

Obviously, we agree with the intention behind the amendment. However, the way to handle it is not to randomly drop an item into the Bill and leave the rest to a statutory instrument. Important and worthy though it may be to deal with disinformation, and specifically harmful health-related disinformation, there are plenty of other important things that one might add that are legal but harmful to adults, so we will not accept the amendment. Instead, we will proceed as planned by designating the list via a statutory instrument. I know that a number of Members of Parliament, probably including members of this Committee, would find it helpful to see a draft list of what those items might be, not least to get assurance that health-related misinformation and disinformation is on that list. That is something that we are considering very carefully, and more news might be forthcoming as the Bill proceeds through Parliament.

Dame Maria Miller (Basingstoke) (Con):

My hon. Friend has talked about the Department’s counter-disinformation unit. Do the Government anticipate that that function will continue, or will they expect Ofcom to take it on?

Chris Philp:

The work of the counter-disinformation unit is valuable. We look at these things on a spending review by spending review basis, and as far as I am aware we intend to continue with the counter-disinformation unit over the current spending review period. Clearly, I cannot commit future Ministers in perpetuity, but my personal view—if I am allowed to express it—is that that unit performs a useful function and could valuably be continued into the future. I think it is useful for the Government, as well as Ofcom, to directly have eyes on this issue, but I cannot speak for future Ministers. I can only give my right hon. Friend my own view.

I hope that I have set out my approach. We have heard the calls to publish the list so that parliamentarians can scrutinise it, and we also heard them on Second Reading.

I will now turn to the question raised by my hon. Friend the Member for Don Valley regarding freedom of expression. Those on one side of the debate are asking us to go further and to be clearer, while those on the other side have concerns about freedom of expression. As I have said, I honestly do not think that these legal but harmful provisions infringe on freedom of speech, for three reasons. First, even when the Secretary of State decides to designate content and Parliament approves of that decision through the affirmative procedure—Parliament gets to approve, so the Secretary of State is not acting alone—that content is not being banned. The Bill does not say that content designated as legal but harmful should immediately be struck from every corner of the internet. It simply says that category 1 companies—the big ones—have to do a proper risk assessment of that content and think about it properly.

Secondly, those companies have to have a policy to deal with that content, but that policy is up to them. They could have a policy that says, “It is absolutely fine.” Let us say that health disinformation is on the list, as one would expect it to be. A particular social media firm could have a policy that says, “We have considered this. We know it is risky, but we are going to let it happen anyway.” Some people might say that that is a weakness in the Bill, while others might say that it protects freedom of expression. It depends on one’s point of view, but that is how it works. It is for the company to choose and set out its policy, and the Bill requires it to enforce it consistently. I do not think that the requirements I have laid out amount to censorship or an unreasonable repression of free speech, because the platforms can still set their own terms and conditions.

There is also the general duty to have regard to free speech, which is introduced in clause 19(2). At the moment, no such duty exists. One might argue that the duty could be stronger, as my hon. Friend suggested previously, but it is unarguable that, for the first time ever, there is a duty on the platforms to have regard to free speech.

Online Safety Bill (Seventh sitting)

Debate between Chris Philp and Maria Miller
Chris Philp:

As I said explicitly a few moments ago, the hon. Lady is right to point out the fact that the super-complaints process is to address systemic issues. She is right to say that, and I think I made it clear a moment or two ago.

Whether there should be an external ombudsman to enforce individual complaints, rather than just Ofcom enforcing against systemic complaints, is a question worth addressing. In some parts of our economy, we have ombudsmen who deal with individual complaints, financial services being an obvious example. The Committee has asked the question, why no ombudsman here? The answer, in essence, is a matter of scale and of how we can best fix the issue. The volume of individual complaints generated about social media platforms is just vast. Facebook in the UK alone has tens of millions of users—I might get this number wrong, but I think it is 30 million or 40 million users.

Dame Maria Miller:

Will the Minister give way?

--- Later in debate ---
Chris Philp:

I can see that there is substantial demand to comment, so I shall start by giving way to my right hon. Friend the Member for Basingstoke.

Dame Maria Miller:

The Minister is doing an excellent job explaining the complex nature of the Bill. Ultimately, however, as he and I know, it is not a good argument to say that this is such an enormous problem that we cannot have a process in place to deal with it. If my hon. Friend looks back at his comments, he will see that that is exactly the point he was making. Although it is possibly not necessary with this clause, I think he needs to give some assurances that later in the Bill he will look at hypothecating some of the money to be generated from fines to address the issues of individual constituents, who on a daily basis are suffering at the hands of the social media companies. I apologise for the length of my intervention.

Chris Philp:

It is categorically not the Government’s position that this problem is too big to fix. In fact, the whole purpose of this piece of groundbreaking and world-leading legislation is to fix a problem of such magnitude. The point my right hon. Friend was making about the hypothecation of fines to support user advocacy is a somewhat different one, which we will come to in due course, but there is nothing in the Bill to prevent individual groups from assisting individuals with making specific complaints to individual companies, as they are now entitled to do in law under clauses 17 and 18.

The point about an ombudsman is a slightly different one—if an individual complaint is made to a company and the individual complainant is dissatisfied with the outcome of their individual, particular and personal complaint, what should happen? In the case of financial services, if, for example, someone has been mis-sold a mortgage and they have suffered a huge loss, they can go to an ombudsman who will bindingly adjudicate that individual, single, personal case. The point that I am making is that having hundreds of thousands or potentially millions of cases being bindingly adjudicated on a case-by-case basis is not the right way to tackle a problem of this scale. The right way to tackle the problem is to force the social media companies, by law, to systemically deal with all of the problem, not just individual problems that may end up on an ombudsman’s desk.

That is the power in the Bill. It deals at a systems and processes level, it deals on an industry-wide level, and it gives Ofcom incredibly strong enforcement powers to make sure this actually happens. The hon. Member for Pontypridd has repeatedly called for a systems and processes approach. This is the embodiment of such an approach and the only way to fix a problem of such magnitude.

Online Safety Bill (Eighth sitting)

Debate between Chris Philp and Maria Miller
Committee stage
Thursday 9th June 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 9 June 2022
Dame Maria Miller:

I think my hon. Friend’s list goes on to page 37, which means there would be a number of different relevant duties that would presumably then be subject to the ability to issue codes of practice. However, the point I was making in my earlier contribution is that this list does not include the issue of violence against women and girls. In looking at this exhaustive list that my hon. Friend has included in the Bill, I must ask whether he might inadvertently be excluding the opportunity for Ofcom to produce a code of practice on the issue of violence against women and girls. Having heard his earlier comments, I felt that he was slightly sympathetic to that idea.

Chris Philp:

Clearly, and as Members have pointed out, women and girls suffer disproportionately from abuse online; unfortunately, tragically and disgracefully, they are disproportionately victims of such abuse. The duties in the Bill obviously apply to everybody—men and women—but women will obviously disproportionately benefit, because they are disproportionately victims.

Obviously, where there are things that are particular to women, such as particular kinds of abuse that women suffer that men do not, or particular kinds of abuse that girls suffer that boys do not, then we would expect the codes of practice to address those kinds of abuse, because the Bill states that they must keep children safe, in clause 37(10)(b), and adults safe, in clause 37(10)(c). Obviously, women are adults and we would expect those particular issues that my right hon. Friend mentioned to get picked up by those measures.

Dame Maria Miller:

My hon. Friend is giving me a chink of light there, in that subsection (10)(c) could actively mean that a code of practice that specifically dealt with violence against women and girls would be admissible as a result of that particular point. I had not really thought of it in that way—am I thinking about it correctly?

Chris Philp:

My right hon. Friend makes an interesting point. To avoid answering a complicated question off the cuff, perhaps I should write to her. However, I certainly see no prohibition in these words in the clause that would prevent Ofcom from writing a particular code of practice. I would interpret these words in that way, but I should probably come back to her in writing, just in case I am making a mistake.

As I say, I interpret those words as giving Ofcom the latitude, if it chose to do so, to have codes of practice that were specific. I would not see this clause as prescriptive, in the sense that if Ofcom wanted to produce a number of codes of practice under the heading of “adults”, it could do so. In fact, if we track back to clause 37(3), that says:

“OFCOM must prepare and issue one or more codes of practice”.

That would appear to admit the possibility that multiple codes of practice could be produced under each of the sub-headings, including in this case for adults and in the previous case for children. [Interruption.] I have also received some indication from officials that I was right in my assessment, so hopefully that is the confirmation that my right hon. Friend was looking for.

Question put and agreed to.

Clause 37 accordingly ordered to stand part of the Bill.

Clause 38 ordered to stand part of the Bill.

Schedule 4

Codes of practice under section 37: principles, objectives, content

Amendment proposed: 63, in schedule 4, page 176, line 29, at end insert “and

(x) there are adequate safeguards to monitor cruelty towards humans and animals;”.—(Alex Davies-Jones.)

This amendment would ensure that ensuring adequate safeguards to monitor cruelty towards humans and animals is one of the online safety objectives for user-to-user services.

Question put, That the amendment be made.

Online Safety Bill (Sixth sitting)

Debate between Chris Philp and Maria Miller
Chris Philp:

Very well; we will debate clause 9 separately. In that case, I will move on to amendments 19 and 20, which seek to address cross-platform risk. Again, we completely agree with the Opposition that cross-platform risk is a critical issue. We heard about it in evidence. It definitely needs to be addressed and covered by the Bill. We believe that it is covered by the Bill, and our legal advice is that it is covered by the Bill, because in clause 8 as drafted—[Interruption.] Bless you—or rather, I bless the shadow Minister, following Sir Roger’s guidance earlier, lest I inadvertently bless the wrong person.

Clause 8 already includes the phrase to which I alluded previously. I am talking about the requirement that platforms risk-assess illegal content that might be encountered

“by means of the service”.

That is a critical phrase, because it means not just on that service itself; it also means, potentially, via that service if, for example, that service directs users onward to illegal content on another site. By virtue of the words,

“by means of the service”,

appearing in clause 8 as drafted, the cross-platform risk that the Opposition and witnesses have rightly referred to is covered. Of course, Ofcom will set out further steps in the code of practice as well.

Dame Maria Miller (Basingstoke) (Con):

I was listening very closely to what the Minister was saying and I was hoping that he might be able to comment on some of the evidence that was given, particularly by Professor Lorna Woods, who talked about the importance of risk assessments being about systems, not content. Would the Minister pick up on that point? He was touching on it in his comments, and I was not sure whether this was the appropriate point in the Bill at which to bring it up.

Chris Philp:

I thank my right hon. Friend for raising that. The risk assessments and, indeed, the duties arising under this Bill all apply to systems and processes—setting up systems and processes that are designed to protect people and to prevent harmful and illegal content from being encountered. We cannot specify in legislation every type of harmful content that might be encountered. This is about systems and processes. We heard the Chairman of the Joint Committee on the draft Online Safety Bill, our hon. Friend the Member for Folkestone and Hythe (Damian Collins), confirm to the House on Second Reading his belief—his accurate belief—that the Bill takes a systems-and-processes approach. We heard some witnesses saying that as well. The whole point of this Bill is that it is tech-agnostic—to future-proof it, as hon. Members mentioned this morning—and it is based on systems and processes. That is the core architecture of the legislation that we are debating.

Amendments 25 and 26 seek to ensure that user-to-user services assess and mitigate the risk of illegal content being produced via functions of the service. That is covered, as it should be—the Opposition are quite right to raise the point—by the illegal content risk assessment and safety duties in clauses 8 and 9. Specifically, clause 8(5)(d), on page 7 of the Bill—goodness, we are only on page 7 and we have been going for over half a day already—requires services to risk-assess functionalities of their service being used to facilitate the presence of illegal content. I stress the word “presence” in clause 8(5)(d). Where illegal content is produced by a functionality of the service—for example, by being livestreamed—that content will be present on the service and companies must mitigate that risk. The objective that the Opposition are seeking to achieve, and with which we completely agree, is covered in clause 8(5)(d) by the word “presence”. If the content is present, it is covered by that provision.

--- Later in debate ---
Chris Philp:

We discussed personal liability extensively this morning. As we discussed, there is personal liability in relation to providing information, with a criminal penalty of up to two years’ imprisonment, to avoid situations like the one we saw a year or two ago, where one of these companies failed to provide the Competition and Markets Authority with the information that it required.

The shadow Minister pointed out the very high levels of global turnover—$71.5 billion—that these companies have. That means that ultimately they can be fined up to $7 billion for each set of breaches. That is a vast amount of money, particularly if those breaches happen repeatedly. She said that such companies will just set up again if we deny their service. Clearly, small companies can close down and set up again the next day, but gigantic companies, such as Meta—Facebook—cannot do that. That is why I think the sanctions I have pointed to are where the teeth really lie.

I accept the point about governance being important as well; I am not dismissing that. That is why we have personal criminal liability for information provision, with up to two years in prison, and it is why governance is referenced in clause 10. I accept the spirit of the points that have been made, but I think the Bill delivers these objectives as drafted.

Dame Maria Miller:

Will my hon. Friend give way?

Chris Philp:

One last time, because I am conscious that we need to make some progress this afternoon.

Dame Maria Miller:

I have huge sympathy with the point that the Minister is making on this issue, but the hon. Member for Pontypridd is right to drive the point home. The Minister says there will be huge fines, but I think there will also be huge court bills. There will be an awful lot of litigation about how things are interpreted, because so much money will come into play. I just reiterate the importance of the guidance and the codes of practice, because if we do not get those right then the whole framework will be incredibly fragile. We will need ongoing scrutiny of how the Bill works or there will be a very difficult situation.

Chris Philp:

My right hon. Friend, as always, makes a very good point. The codes of practice will be important, particularly to enable Ofcom to levy fines where appropriate and then successfully defend them. This is an area that may get litigated. I hope that, should lawyers litigating these cases look at our transcripts in the future, they will see how strongly those on both sides of the House feel about this point. I know that Ofcom will ensure that the codes of practice are properly drafted. We touched this morning on the point about timing; we will follow up with Ofcom to make sure that the promise it made us during the evidence session about the road map is followed through and that those get published in good time.

On the point about the Joint Committee, I commend my right hon. Friend for her persistence—[Interruption.] Her tenacity—that is the right word. I commend her for her tenacity in raising that point. I mentioned it to the Secretary of State when I saw her at lunchtime, so the point that my right hon. Friend made this morning has been conveyed to the highest levels in the Department.

I must move on to the final two amendments, 11 and 13, which relate to transparency. Again, we had a debate about transparency earlier, when I made the point about the duties in clause 64, which I think cover the issue. Obviously, we are not debating clause 64 now but it is relevant because it requires Ofcom—it is not an option but an obligation; Ofcom must do so—to require providers to produce a transparency report every year. Ofcom can say what is supposed to be in the report, but the relevant schedule lists all the things that can be in it, and covers absolutely everything that the shadow Minister and the hon. Member for Worsley and Eccles South want to see in there.

That requirement to publish transparently and publicly is in the Bill, but it is to be found in clause 64. While I agree with the Opposition’s objectives on this point, I respectfully say that those objectives are delivered by the Bill as drafted, so I politely and gently request that the amendments be withdrawn.

--- Later in debate ---
Dame Maria Miller

My apologies. I will rise later.

Chris Philp

The Government obviously support the objective of these amendments, which is to prevent children from suffering the appalling sexual and physical abuse that the hon. Member for Worsley and Eccles South outlined in her powerful speech. It is shocking that these incidents have risen in the way that she described.

To be clear, that sort of appalling sexual abuse is covered by clause 9, which we have debated already and which covers illegal content. As Members would expect, child sexual abuse is defined as one of the items of priority illegal content, which are listed in more detail in schedule 6, where the offences that relate to sexual abuse are enumerated. As child sexual exploitation is a priority offence, services are already obliged through clause 9 to be “proactive” in preventing it from happening. As such, as Members would expect, the requirements contained in these amendments are already delivered through clause 9.

The hon. Member for Worsley and Eccles South also asked when we are going to hear what the primary priority harms to children might be. To be clear, those will not include the sexual exploitation offences, because as Members would also expect, those are already in the Bill as priority illegal offences. The primary priority harms might include material promoting eating disorders and that kind of thing, which is not covered by the criminal matters—the illegal matters. I have heard the hon. Lady’s point that if that list, or at least a draft list, were to be published, that would assist Parliament in scrutinising the Bill. I will take that point away and see whether there is anything we can do in that area. I am not making a commitment; I am just registering that I have heard the point and will take it away.

--- Later in debate ---
Dame Maria Miller

I rise to speak to clause 11, because this is an important part of the Bill that deals with the safety duties protecting children. Many of us here today are spurred on by our horror at the way in which internet providers, platform providers and search engines have acted over recent years, developing their products with no regard for the safety of children, so I applaud the Government for bringing forward this groundbreaking legislation. They are literally writing the book on this, but in doing so, we have to be very careful about the language we use and the way in which we frame our requirements of these organisations. The Minister has rightly characterised these organisations as being entirely driven by finance, not the welfare of their consumers, which must make them quite unique in the world. I can only hope that that will change: presumably, over time, people will not want to use products that have no regard for the safety of those who use them.

In this particular part of the Bill, the thorny issue of age assurance comes up. I would value the Minister’s views on some of the evidence that we received during our evidence sessions about how we ensure that age assurance is effective. Some of us who have been in this place for a while would be forgiven for thinking that we had already passed a law on age assurance. Unfortunately, that law did not seem to come to anything, so let us hope that second time is lucky. The key question is: who is going to make sure that the age assurance that is in place is good enough? Clause 11(3) sets out

“a duty to operate a service using proportionate systems and processes”

that is designed to protect children, but what is a proportionate system? Who is going to judge that? Presumably it will be Ofcom in the short term, and in the long term, I am sure the courts will get involved.

In our evidence, we heard some people advocating very strongly for these sorts of systems to be provided by third parties. I have to say, in a context where we are hearing how irresponsible the providers of these services are, I can understand why people would think that a third party would be a more responsible way forward. Can the Minister help the Committee understand how Ofcom will ensure that the systems used, particularly the age assurance systems, are proportionate—I do not particularly like that word; I would like those systems to be brilliant, not proportionate—and are actually doing what we need them to do, which is safeguard children? For the record, and for the edification of judges who are looking at this matter in future—and, indeed, Ofcom—will he set out how important this measure is within the Bill?

Chris Philp

I thank my right hon. Friend for her remarks, in which she powerfully and eloquently set out how important the clause is to protecting children. She is right to point out that this is a critical area in the Bill, and it has wide support across the House. I am happy to emphasise, for the benefit of those who may study our proceedings in future, that protecting children is probably the single most important thing that the Bill does, which is why it is vital that age-gating, where necessary, is effective.

My right hon. Friend asked how Ofcom will judge whether the systems under clause 11(3) are proportionate to

“prevent children of any age from encountering”

harmful content and so on. Ultimately, the proof of the pudding is in the eating; it has to be effective. When Ofcom decides whether a particular company or service is meeting the duty set out in the clause, the simple test will be one of effectiveness: is it effective and does it work? That is the approach that I would expect Ofcom to take; that is the approach that I would expect a court to take. We have specified that age verification, which is the most hard-edged type of age assurance—people have to provide a passport or something of that nature—is one example of how the duty can be met. If another, less-intrusive means is used, it will still have to be assessed as effective by Ofcom and, if challenged, by the courts.

I think my right hon. Friend was asking the Committee to confirm to people looking at our proceedings our clear intent for the measures to be effective. That is the standard to which we expect Ofcom and the courts to hold those platforms in deciding whether they have met the duties set out in the clause.

--- Later in debate ---
Dame Maria Miller

For clarification, does the Minister anticipate that Ofcom might be able to insist that a third-party provider be involved if there is significant evidence that the measures put in place by a platform are ineffective?

Chris Philp

We have deliberately avoided being too prescriptive about precisely how the duty is met. We have pointed to age verification as an example of how the duty can be met without saying that that is the only way. We would not want to bind Ofcom’s hands, or indeed the hands of platforms. Clearly, using a third party is another way of delivering the outcome. If a platform were unable to demonstrate to Ofcom that it could deliver the required outcome using its own methods, Ofcom may well tell it to use a third party instead. The critical point is that the outcome must be delivered. That is the message that the social media firms, Ofcom and the courts need to hear when they look at our proceedings. That is set out clearly in the clause. Parliament is imposing a duty, and we expect all those to whom the legislation applies to comply with it.

Question put and agreed to.

Clause 11 accordingly ordered to stand part of the Bill.

Clause 12

Adults’ risk assessment duties