Chris Philp

I am afraid it was not me that cited new information. It was my hon. Friend the Member for Watford who said he had had further discussions with Ministers. I am delighted to hear that he found those discussions enlightening, as I am sure they—I want to say they always are, but let us say they often are.

Dame Maria Miller (Basingstoke) (Con)

Before my hon. Friend moves on, can I ask a point of clarification? The hon. Member for Ochil and South Perthshire is right that this is an important point, so we need to understand it thoroughly. I think he makes a compelling argument about the exceptional circumstances. If Ofcom did not agree that a change that was being requested was in line with what my hon. Friend the Minister has said, how would it be able to discuss or, indeed, challenge that?

Chris Philp

My right hon. Friend raises a good question. In fact, I was about to come on to the safeguards that exist to address some of the concerns that have been raised this morning. Let me jump to the fourth of the safeguards, which in many ways is the most powerful and directly addresses my right hon. Friend’s question.

In fact, a change has been made. The hon. Member for Ochil and South Perthshire asked what changes had been made, and one important change—perhaps the change that my hon. Friend the Member for Watford found convincing—was the insertion of a requirement for the codes, following a direction, to go before Parliament and be voted on using the affirmative procedure. That is a change. The Bill previously did not have that in it. We inserted the use of the affirmative procedure to vote on a modified code in order to introduce extra protections that did not exist in the draft of the Bill that the Joint Committee commented on.

I hope my right hon. Friend the Member for Basingstoke will agree that if Ofcom had a concern and made it publicly known, Parliament would be aware of that concern before voting on the revised code using the affirmative procedure. The change to the affirmative procedure gives Parliament extra control. It gives parliamentarians the opportunity to respond if they have concerns, if third parties raise concerns, or if Ofcom itself raises concerns.

--- Later in debate ---
Chris Philp

I thank the hon. Lady for her rapid description of that amendment. We will come to clause 189 in due course. The definition of “content” in that clause is,

“anything communicated by means of an internet service”,

which sounds like it is quite widely drafted. However, we will obviously debate this issue properly when we consider clause 189.

The remaining question—

Dame Maria Miller

I intervene rather than making a subsequent substantive contribution because I am making a very simple point. My hon. Friend the Minister is making a really compelling case about the need for freedom of speech and the need to protect it within the context of newspapers online. However, could he help those who might be listening to this debate today to understand who is responsible if illegal comments are made on newspaper websites? I know that my constituents would be concerned about that, not so much if illegal comments were made about a Member of Parliament or somebody else in the public eye, but if they were made about another individual not in the public eye.

What redress would that individual have? Would it be to ask the newspaper to take down that comment, or would it be that they could find out the identity of the individual who made the comment, or would it be that they could take legal action? If he could provide some clarity on that, it might help Committee members to understand even further why he is taking the position that he is taking.

Chris Philp

I thank my right hon. Friend for that intervention. First, clearly if something illegal is said online about someone, they would have the normal redress to go to the police and the police could seek to exercise their powers to investigate the offence, including requesting the company that hosts the comments—in this case, it would be a newspaper’s or broadcaster’s website—to provide any relevant information that might help to identify the person involved; they might have an account, and if they do not they might have a log-on or IP address. So, the normal criminal investigatory procedures would obviously apply.

Secondly, if the content was defamatory, there is obviously civil recourse for libel, although I realise that in practice only people like Arron Banks can sue for it. And I think there are powers in the civil procedure rules that allow for court orders to be made that require organisations, such as news media websites, to disclose information that would help to identify somebody who is a respondent in a civil case.

Thirdly, there are obviously the voluntary steps that the news publisher might take to remove content. News publishers say that they do that; obviously, their implementation, as we know, is patchy. Nevertheless, there is that voluntary route.

Regarding any legal obligation that may fall on the shoulders of the news publisher itself, I am not sure that I have sufficient legal expertise to comment on that. However, I hope that those first three areas of redress that I have set out give my right hon. Friend some assurance on this point.

Finally, I turn to a question asked by the hon. Member for Aberdeen North. She asked whether the exemption for “one-to-one live aural communications”, as set out in clause 49(2)(d), could inadvertently allow grooming or child sexual exploitation to occur via voice messages that accompany games, for example. The exemption is designed to cover what are essentially phone calls such as Skype conversations—one-to-one conversations that are essentially low-risk.

We believe that the Bill contains other duties to ensure that services are designed to reduce the risk of grooming and to address risks to children, if those risks exist, such as on gaming sites. I would be happy to come back to the hon. Lady with a better analysis and explanation of where those duties sit in the Bill, but there are very strong duties elsewhere in the Bill that impose those obligations to conduct risk assessments and to keep children safe in general. Indeed, the very strongest provisions in the Bill are around stopping child sexual exploitation and abuse, as set out in schedule 6.

Finally, there is a power in clause 174(1) that allows us, as parliamentarians and the Government, to repeal this exemption using secondary legislation. So, if we found in the future that this exemption caused a problem, we could remove it by passing secondary legislation.

--- Later in debate ---
Alex Davies-Jones

I am grateful for the hon. Member’s contribution, and for her support for the amendment and our comments on the clause.

The Bill should be made clearer, and I would appreciate an update on the Minister’s assessment of the provisions in the Bill. Platforms and service providers need clarity if they are to take effective action against illegal content. Gaps in the Bill give rise to serious questions about its overwhelming practical challenges. None of us wants a two-tier internet, in which user experience and platforms’ responsibilities in the UK differ significantly from those in the rest of the world. Clarifying the definition of illegal content and acknowledging the complexity of the situation when content originates abroad are vital if this legislation is to tackle wide-ranging, damaging content online. That is a concern I raised on Second Reading, and a number of witnesses reiterated it during the oral evidence sessions. I remind the Committee of the comments of Kevin Bakhurst from Ofcom, who said:

“We feel it is really important—hopefully this is something the Committee can contribute to—that the definition of ‘illegal content’ is really clear for platforms, and particularly the area of intent of illegality, which at the moment might be quite tricky for the platforms to pick up on.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 8, Q7.]

That has been reiterated by myriad other stakeholders, so I would be grateful for the Minister’s comments.

Dame Maria Miller

I rise to speak on clause 52 stand part, particularly—the Minister will not be surprised—the element in subsection (4)(c) around the offences specified in schedule 7. The debate has been very wide ranging throughout our sittings. It is extraordinary that we need a clause defining what is illegal. Presumably, most people who provide goods and services in this country would soon go out of business if they were not knowledgeable about what is illegal. The Minister is helping the debate very much by setting out clearly what is illegal, so that people who participate in the social media world are under no illusion as to what the Government are trying to achieve through this legislation.

The truth is that the online world has unfolded without a regulatory framework. New offences have emerged, and some of them are tackled in the Bill, particularly cyber-flashing. Existing offences have taken on a new level of harm for their victims, particularly when it comes to taking, making and sharing intimate images without consent. As the Government have already widely acknowledged, because the laws on that are such a patchwork, it is difficult for the enforcement agencies in this country to adequately protect the victims of that heinous crime, who are, as the Minister knows, predominantly women.

--- Later in debate ---
Chris Philp

I thank right hon. and hon. Members who have participated in the debate on this extremely important clause. It is extremely important because the Bill’s strongest provisions relate to illegal content, and the definition of illegal content set out in the clause is the starting point for those duties.

A number of important questions have been asked, and I would like to reply to them in turn. First, I want to speak directly about amendment 61, which was moved by the shadow Minister and which very reasonably and quite rightly asked where in the world a criminal offence physically takes place. She rightly said that in the case of violence against some children, for example, that may happen somewhere else in the world but be transmitted on the internet here in the United Kingdom. On that, I can point to an existing provision in the Bill that does exactly what she wants. Clause 52(9), which appears about two thirds of the way down page 49 of the Bill, states:

“For the purposes of determining whether content amounts to an offence, no account is to be taken of whether or not anything done in relation to the content takes place in any part of the United Kingdom.”

What that is saying is that it does not matter whether the act of concern takes place physically in the United Kingdom or somewhere else, on the other side of the world. That does not matter in looking at whether something amounts to an offence. If it is criminal under UK law but it happens on the other side of the world, it is still in scope. Clause 52(9) makes that very clear, so I think that that provision is already doing what the shadow Minister’s amendment 61 seeks to do.

The shadow Minister asked a second question about the definition of illegal content, whether it involves a specific act and how it interacts with the “systems and processes” approach that the Bill takes. She is right to say that the definition of illegal content applies item by item. However, the legally binding duties in the Bill, which we have already debated in relation to previous clauses, apply to categories of content and to putting in place “proportionate systems and processes”—I think that that is the phrase used. Therefore, although the definition is particular, the duty is more general, and has to be met by putting in place systems and processes. I hope that my explanation provides clarification on that point.

The shadow Minister asked another question about the precise definitions of how the platforms are supposed to decide whether content meets the definition set out. She asked, in particular, questions about how to determine intent—the mens rea element of the offence. She mentioned that Ofcom had had some comments in that regard. Of course, the Government are discussing all this closely with Ofcom, as people would expect. I will say to the Committee that we are listening very carefully to the points that are being made. I hope that that gives the shadow Minister some assurance that the Government’s ears are open on this point.

The next and final point that I would like to come to was raised by all speakers in the debate, but particularly by my right hon. Friend the Member for Basingstoke, and is about violence against women and girls—an important point that we have quite rightly debated previously and come to again now. The first general point to make is that clause 52(4)(d) makes it clear that relevant offences include offences where the intended victim is an individual, so any violence towards and abuse of women and girls is obviously included in that.

As my right hon. Friend the Member for Basingstoke and others have pointed out, women suffer disproportionate abuse and are disproportionately the victims of criminal offences online. The hon. Member for Aberdeen North pointed out how a combination of protected characteristics can make the abuse particularly impactful—for example, if someone is a woman and a member of a minority. Those are important and valid points. I can reconfirm, as I did in our previous debate, that when Ofcom drafts the codes of practice on how platforms can meet their duties, it is at liberty to include such considerations. I echo the words spoken a few minutes ago by my right hon. Friend the Member for Basingstoke: the strong expectation across the House—among all parties here—is that those issues will be addressed in the codes of practice to ensure that those particular vulnerabilities and those compounded vulnerabilities are properly looked at by social media firms in discharging those duties.

My right hon. Friend also made points about intimate image abuse when the intimate images are made without the consent of the subject—the victim, I should say. I would make two points about that. The first relates to the Bill and the second looks to the future and the work of the Law Commission. On the Bill, we will come in due course to clause 150, which relates to the new harmful communications offence, and which will criminalise a communication—the sending of a message—when there is a real and substantial risk of it causing harm to the likely audience and there is intention to cause harm. The definition of “harm” in this case is psychological harm amounting to at least serious distress.

Clearly, if somebody is sending an intimate image without the consent of the subject, it is likely that that will cause harm to the likely audience. Obviously, if someone sends a naked image of somebody without their consent, that is very likely to cause serious distress, and I can think of few reasons why somebody would do that unless it was their intention, meaning that the offence would be made out under clause 150.

My right hon. Friend has strong feelings, which I entirely understand, that to make the measure even stronger the test should not involve intent at all, but should simply be a question of consent. Was there consent or not? If there was no consent, an offence would have been committed, without needing to go on to establish intention as clause 150 provides. As my right hon. Friend has said, Law Commission proposals are being developed. My understanding is that the Ministry of Justice, which is the Department responsible for this offence, is expecting to receive a final report, I am told, over the summer. It would then clearly be open to Parliament to legislate to put the offence into law, I hope as quickly as possible.

Once that happens, through whichever legislative vehicle, it will have two implications. First, the offence will automatically and immediately be picked up by clause 52(4)(d) and brought within the scope of the Bill because it is an offence where the intended victim is an individual. Secondly, there will be a power for the Secretary of State and for Parliament, through clause 176, I think—I am speaking from memory; yes, it is clause 176, not that I have memorised every clause in the Bill—via statutory instrument not only to bring the offence into the regular illegal safety duties, but to add it to schedule 7, which contains the priority offences.

Once that intimate image abuse offence is in law, via whichever legislative vehicle, that will have that immediate effect with respect to the Bill, and by statutory instrument it could be made a priority offence. I hope that gives my right hon. Friend a clear sense of the process by which this is moving forward.

Dame Maria Miller

I thank the Minister for such a clear explanation of his plan. Can he confirm that the Bill is a suitable legislative vehicle? I cannot see why it would not be. I welcome his agreement about the need for additional legislation over and above the communications offence. In the light of the way that nudification software and deepfake technology are advancing, and the challenges that our law enforcement agencies have in interpreting those quite complex notions, a straightforward law making it clear that publishing such images is a criminal offence would not only help law enforcement agencies, but would help the perpetrators to understand that what they are doing is a crime and they should stop.