All 21 Maria Miller contributions to the Online Safety Act 2023

Tue 19th Apr 2022: Online Safety Bill, Commons Chamber (2nd reading)
Tue 24th May 2022: Online Safety Bill (First sitting), Public Bill Committees (Committee stage)
Tue 24th May 2022: Online Safety Bill (Second sitting), Public Bill Committees (Committee stage)
Thu 26th May 2022: Online Safety Bill (Third sitting), Public Bill Committees (Committee stage, 3rd sitting)
Thu 26th May 2022: Online Safety Bill (Fourth sitting), Public Bill Committees (Committee stage, 4th sitting)
Tue 7th Jun 2022 (two contributions)
Thu 9th Jun 2022 (two contributions)
Tue 14th Jun 2022 (two contributions)
Thu 16th Jun 2022 (two contributions)
Tue 21st Jun 2022: Online Safety Bill (Thirteenth sitting), Public Bill Committees (Committee stage, 13th sitting)
Tue 21st Jun 2022
Thu 23rd Jun 2022
Tue 28th Jun 2022 (two contributions)
Tue 12th Jul 2022: Online Safety Bill, Commons Chamber (Report stage, day 1)
Mon 5th Dec 2022
Tue 12th Sep 2023: Online Safety Bill, Commons Chamber (Consideration of Lords amendments)

Online Safety Bill

Maria Miller Excerpts
2nd reading
Tuesday 19th April 2022

Commons Chamber
Mrs Maria Miller (Basingstoke) (Con)

For too long, the tech giants have been able to dismiss the harms they create for the people we represent because they do not take seriously their responsibility for how their products are designed and used, which is why this legislation is vital.

The Bill will start to change the destructive culture in the tech industry. We live simultaneously in online and offline worlds, and we expect the rules and the culture to be the same in both, but at the moment, they are not. When I visited the big tech companies in Silicon Valley as Secretary of State in 2014 to talk about online moderation, which was almost completely absent at that stage, and child abuse images, which were not regularly removed, I rapidly concluded that the only way to solve the problem and the cultural deficit I encountered would be to regulate. I think this Bill has its roots in those meetings, so I welcome it and the Government’s approach.

I am pleased to see that measures on many of the issues on which I have been campaigning in the years since 2014 have come to fruition in this Bill, but there is still room for improvement. I welcome the criminalisation of cyber-flashing, and I pay tribute to Grazia, Clare McGlynn and Bumble for all their work with me and many colleagues in this place.

Wera Hobhouse

Scotland banned cyber-flashing in 2010, but that ban includes a motivation test, rather than just a consent test, so a staggering 95% of cyber-flashing goes unpunished. Does the right hon. Lady agree that we should not make the same mistake?

Mrs Miller

I will come on to that shortly, and the hon. Lady knows I agree with her. This is something the Government need to take seriously.

The second thing I support in this Bill is limiting anonymous online abuse. Again, I pay tribute to the Football Association, with which I have worked closely, Glitch, the Centenary Action Group, Compassion in Politics, Hope not Hate and Kick It Out. They have all done a tremendous job, working with many of us in this place, to get to this point.

Finally, I support preventing children from accessing pornography, although I echo what we heard earlier about it being three years too late. It is shameful that this measure was not enacted earlier.

The Minister knows that three demands are coming his way from me. We need to future-proof our approach to the law in this area. Tech moves quickly—quicker than the Government’s approach to legislation, which leaves us playing whack-a-mole. The devious methods of causing harm change rapidly, as do the motivations of perpetrators, to answer the point raised by the hon. Member for Bath (Wera Hobhouse). What stays the same is the lack of consent from victims, so will the Government please look at that as a way of future-proofing our law? A worrying example of that is deepfake technology that creates pornographic images of women. That is currently totally lawful. Nudification software is commercially available and uses images—only of women—to create nude images. I have already stated publicly that that should be banned. It has been in South Korea and Taiwan, yet our law is playing catch-up.

The second issue that the Government need to address is the fact that they are creating many more victims as a result of this Bill. We need to make sure that victim support is in place to augment the amazing work of organisations such as the Revenge Porn Helpline. Finally, to echo the point made by my hon. Friend the Member for Watford (Dean Russell), let me say that this is a complex area, as we are proving with every speech in this debate. I pay tribute to the Select Committee Chair, who is no longer in his place, and the Joint Committee Chair, but I believe that we need a joint standing committee to scrutinise the implementation of this Bill when it is enacted. This is a world-class piece of legislation to change culture, but we also need other countries to adopt a similar approach. A global approach is needed if this is to work to end the wild west.

Online Safety Bill (First sitting)

Maria Miller Excerpts
Committee stage
Tuesday 24th May 2022

Public Bill Committees
The Chair

Maria Miller has indicated that she would like to ask a question, so if I may, I will bring her in.

Mrs Maria Miller (Basingstoke) (Con)

Not immediately—go on, please.

Alex Davies-Jones

Q Thank you, Chair, and thank you, Maria.

I am just trying to get to the intricacies of this, and of what would happen during the time that it would take for you to recategorise. This platform, which is disseminating harm to both children and adults, would be allowed to carry on while the recategorisation process is under way. There is no mechanism in the Bill to stop that from happening.

Richard Wronka: A really important point here is that we will be regulating that platform from the outset for illegal content and, potentially, for how it protects children on its platform, irrespective of the categorisation approach. That is really important. We will be able to take action, and take action quickly, irrespective of how the platform is categorised. Categorisation really determines whether the adult “legal but harmful” provisions apply. That is the bit that really matters in this context.

It is worth reminding ourselves what those provisions mean: they are more a transparency and accountability measure. Those platforms categorised as category 1 will need to have clear terms and conditions applied to adult “legal but harmful” content, and they will need to implement those consistently. We would expect the really serious and egregious concerns to be picked up by the “illegal” part of the regime, and the protection-of-children part of the regime. The categorisation process may go on. It may take a little time, but we will have tools to act in those situations.

Alex Davies-Jones

Q May I bring you on to the powers of the Secretary of State and the question of the regulator’s independence? The Bill will see the Secretary of State, whoever that may be, have a huge amount of personal direction over Ofcom. Do you have any other experience of being directed by a Secretary of State in this way, and what are the consequences of such an approach?

Kevin Bakhurst: We do have some experience across the various sectors that we regulate, but being directed by the Secretary of State does not happen very often. Specifically on the Bill, our strong feeling is that it is entirely appropriate that the Secretary of State should be able to direct us on matters of national security and terrorist content. However, we have some concerns about the wider direction powers of the Secretary of State, particularly the grounds on which the Secretary of State can direct public policy, and we have expressed those concerns previously.

We feel it is important that the independence of a regulator can be seen to be there and is there in practice. Legally, we feel it important that there is accountability. We have some experience of being taken to judicial review, and there must be accountability for the codes of practice that we put in place. We must be able to show why and how we have created those codes of practice, so that we can be accountable and there is absolute clarity between regulator and Government.

Mrs Miller

Q Thank you very much to the witnesses who have taken the time to be with us today. We are really grateful. You have already alluded to the fact that you have quite extensive experience in regulation, even in social media spaces. I think the Committee would be really interested in your view, based on your experience, about what is not in the Bill that should be.

Kevin Bakhurst: Richard has been leading this process, so he can give more detail on it, but suffice to say, we have been engaging closely with DCMS over the last year or so, and we appreciate the fact that it has taken on board a number of our concerns. What we felt we needed from the Bill was clarity as far as possible, and a balance between clarity and flexibility for this regime, which is a very fast-moving field. We feel, by and large, that the Bill has achieved that.

We still have concerns about one or two areas, to pick up on your question. We feel it is really important—hopefully this is something the Committee can contribute to—that the definition of “illegal content” is really clear for platforms, and particularly the area of intent of illegality, which at the moment might be quite tricky for the platforms to pick up on.

Richard Wronka: I completely agree with Kevin that the Bill as it stands gives us a good framework. I think the pre-legislative scrutiny process has been really helpful in getting us there, and I point out that it is already quite a broad and complex regime. We welcome the introduction of issues such as fraudulent advertising and the regulation of commercial pornographic providers, but I think there is a point about ensuring that the Bill does not expand too much further, because that might raise some practical and operational issues for us.

I completely agree with Kevin that clarity in the Bill regarding illegal content and what constitutes that is really important. An additional area that requires clarity is around some of the complex definitions in the Bill, such as journalistic content and democratically important content. Those are inherently tricky issues, but any extra clarity that Parliament can provide in those areas would be welcome.

Mrs Miller

Q You talk about illegal content and that Ofcom would not have a view on particular laws, but do you think there are harmful areas of content that are not currently covered by the law? I am thinking particularly about the issue of intimate image abuse, which is currently under Law Commission review, with recommendations expected very soon. Have you had any thoughts, particularly in the area of policy, about how you deal with issues that should be against the law but currently are not, given that part of your regulatory process is to determine whether companies are operating within the law?

Richard Wronka: I would start by saying that this is a fluid area. We have had a number of conversations with the Law Commission in particular and with other stakeholders, which has been really helpful. We recognise that the Bill includes four new offences, so there is already some fluidity in this space. We are aware that there are other Law Commission proposals that the Government are considering. Incitement to self-harm and flashing imagery that might trigger epilepsy are a couple of issues that come to mind there. Ultimately, where the criminal law sits is a matter for Parliament. We are a regulator: our role here is to make sure that the criminal law is reflected in the regulatory regime properly, rather than to determine or offer a view on where the criminal law should sit. Linking back to our point just a minute ago, we think it is really important that there is as much clarity as possible about how platforms can take some of those potentially quite tricky decisions about whether content meets the criminal threshold.

Mrs Miller

Q May I press a little further? The four new offences that you talked about, and others, and just the whole approach of regulation will lead more individuals to seek redress and support. You are not responsible for individuals; you are responsible for regulation, but you must have some thoughts on whether the current system of victim support will cope with the changes in the law and the new regulatory process. What might you want to see put in place to ensure that those victims are not all landing at your door, erroneously thinking that Ofcom will provide them with individual redress? Do you have any thoughts on that?

Kevin Bakhurst: One area that is very important, which is in the Bill and is one of our responsibilities, is making sure there is a sufficiently robust and reactive complaints process from the platforms—one that people feel they can complain to and be heard by—and an appeals process. We feel that that is in the Bill. We already receive complaints at Ofcom from people who have issues about platforms and who have gone to the platforms but do not feel their complaints have been properly dealt with or recognised. That is within the video-sharing platform regime. Those individual complaints, although we are not going to be very specific in looking at individual pieces of material per se, are very useful to alert us where there are issues around particular types of offence or harm that the platforms are not seen to be dealing with properly. It will be a really important part of the regime to make sure that platforms provide a complaints process that is easy to navigate and that people can use quite quickly and accessibly.

Richard Wronka: An additional point I would make, building on that, is that this is a really complex ecosystem. We understand that and have spent a lot of the last two or three years trying to get to grips with that complex ecosystem and building relationships with other participants in the ecosystem. It brings in law enforcement, other regulators, and organisations that support victims of crime or online abuse. We will need to find effective ways to work with those organisations. Ultimately, we are a regulator, so there is a limit to what we can do. It is important that those other organisations are able to operate effectively, but that is perhaps slightly outside our role.

Barbara Keeley (Worsley and Eccles South) (Lab)

Q To what extent do you think services should publish publicly the transparency and risk assessments that they will be providing to Ofcom?

Richard Wronka: I think our starting point here is that we think transparency is a really important principle within the regime—a fundamental principle. There are specific provisions in the Bill that speak to that, but more generally we are looking for this regime to usher in a new era of transparency across the tech sector, so that users and other participants in this process can be clearer about what platforms are doing at the moment, how effective that is and what more might be done in the future. That is something that will be a guiding principle for us as we pick up regulation.

Specifically, the Bill provides for transparency reports. Not all services in scope will need to provide transparency reports, but category 1 and 2 services will be required to produce annual transparency reports. We think that is really important. At the moment, risk assessments are not intended to be published—that is not provided for in the Bill—but the transparency reports will show the effectiveness of the systems and processes that those platforms have put in place.

--- Later in debate ---
The Chair

I think Lynn Perry is back. Are you with us, Lynn? [Interruption.] No—okay. We will move on to Maria Miller.

Mrs Miller

Q I have a question for the Children’s Commissioner. You talked just now about doing more on the advocacy of individual cases. I asked a question of Ofcom in the first panel about the issue of support for victims. Its response was that complaints processes will be part of what it will regulate. Do you think that will be enough to answer your concerns, or are you expecting more than simply ensuring that platforms do what they should do?

Dame Rachel de Souza: I absolutely think that we need to look at independent advocacy and go further. I do not think the Bill does enough to respond to individual cases of abuse and to understand issues and concerns directly from children. Children should not have to exhaust platforms’ ineffective complaints routes. It can take days, weeks, months. Even a few minutes or hours of a nude image being shared online can be enormously traumatising for children.

That should inform Ofcom’s policies and regulation. As we know, the risks and harms of the online world are changing constantly. It serves a useful purpose as an early warning mechanism within online safety regulation. I would like to see independent advocacy that allows a proper representation service for children. We need to hear from children directly, and I would like to see the Bill go further on this.

Mrs Miller

Q Is there capacity in the sector to deliver what you are talking about?

Dame Rachel de Souza: I think we need to make capacity. There is some—the NSPCC has its Childline and, as Children’s Commissioner, I have my own advocacy service for children in care. I think this should function in that way, with direct access. So I think that we can create it.

Andy Burrows: May I come in briefly? Our proposals for user advocacy reflect the clear “polluter pays” principle that we think should apply here, to help build and scale up that capacity, but the levy that is covering the direct cost of regulation should also provide really effective user advocacy. That is really important not only to help to give victims what they need in frontline services, but in ensuring that there is a strong counterbalance to some of the largest companies in the world for our sector, which has clear ambition but self-evident constraints.

Dame Rachel de Souza: One of the concerns that has come to me from children—I am talking about hundreds of thousands of children—over the past year is that there is not strong enough advocacy for them and that their complaints are not being met. Girls in particular, following the Everyone’s Invited concerns, have tried so hard to get images down. There is this almost medieval bait-out practice of girls’ images being shared right across platforms. It is horrendous, and the tech firms are not acting quickly enough to get those down. We need proper advocacy and support for children, and I think that they would expect that of us in this groundbreaking Bill.

Kirsty Blackman

Q There has not been a huge amount of discussion of online gaming in the context of the Bill, despite the fact that for many young people that is the way in which they interact with other people online. Do you think the Bill covers online gaming adequately? A lot of interaction in online gaming is through oral communication—voice chat messages. Do you think that it is possible to properly regulate oral communications in gaming?

Dame Rachel de Souza: Good question. I applaud the Bill for what it does cover. We are looking at a Bill that, for the first time, is going to start protecting children’s rights online, so I am really pleased to see that. We have looked a bit at gaming in the past. In terms of harms, obviously the Bill does not cover gaming in full, but it does cover the safety aspects of children’s experience.

It is always good for us to be looking further. Gaming, we know, has some extremely harmful and individualistic issues with it, particularly around money and the profile of potential grooming and safety. In terms of communications, one of the reasons that I am so concerned about encryption and communications online is that it happens through gaming. We need to make sure that those elements are really firm.

Andy Burrows: It is vitally important that the gaming sector is in scope. We know that there are high-risk gaming sites—for example, Twitch—and gaming-adjacent services such as Discord. To go back to my earlier point about the need for cross-platform provisions to apply here, in gaming we can see grooming pathways that can take on a different character from those on social networks, for example, where we might see abuse pathways where that grooming is taking place at the same time, rather than sequentially from a gaming streaming service, say, to a gaming-adjacent platform such as Discord. I think it is very important that a regulator is equipped to understand the dynamics of the harms and how they will perhaps apply differently on gaming services. That is a very strong and important argument for user advocacy.

I would say a couple of things on oral communications. One-to-one oral communications are excluded from the Bill’s scope—legitimately—but we should recognise that there is a grooming risk there, particularly when that communication is embedded in a platform of wider functionality. There is an argument for a platform to consider all aspects of its functionality within the risk assessment process. Proactive scanning is a different issue.

There is a broader challenge for the Bill, and this takes us back to the fundamental objectives and the very welcome design based around systemic risk identification and mitigation. We know that right now, in respect of oral communications and livestream communications, the industry response is not as developed in terms of detecting and disrupting harm as it is for, say, text-based chat. In keeping with the risk assessment process, it should be clear that if platforms want to offer that functionality, they should have to demonstrate through the risk assessment process that they have high-quality, effective arrangements in place to detect and disrupt harm, and that should be the price of admission. If companies cannot demonstrate that, they should not be offering their services, because there is a high risk to children.

--- Later in debate ---
The Chair

I am sorry, I have to interrupt because of time. Maria Miller.

Mrs Miller

Q Two hopefully quick questions. I have been listening carefully. Could you summarise the main changes you will make to your products that your users will notice make them safer, whether they are children or adults? I have heard a lot about problems, but what are the changes you will actually make? Within that, could you talk about how you will improve your complaints system, which earlier witnesses said is inadequate?

Katy Minshall: We would certainly look to engage with Ofcom positively on the requirements it sets out. I am sorry to sound repetitive, but the challenge is that the Bill depends on so many things that do not exist yet and the definitions around what we mean by content harmful to adults or to children. In practice, that makes it challenging to say to you exactly today what approaches we would take. To be clear, we would of course look to continue working with the Government and now Ofcom with the shared objective of making the online space safer for everyone.

Mrs Miller

Q I want to probe you a little on that. Because harmful content is not defined, you are saying you will not commit to any changes beyond that. It is quite a large Bill; surely there are other things you will do differently, no?

Katy Minshall: The lesson of the past three or four years is that we cannot wait for the Bill. We at Twitter are continuing to make changes to our product and our policies to improve safety for everyone, including children.

Mrs Miller

Q So the Bill is irrelevant to you.

Katy Minshall: The Bill is a really important piece of regulation, which is why I was so pleased to come today and share our perspectives. We are continuing to engage positively with Ofcom. What I am trying to say is that until we see the full extent of the requirements and definitions, it is hard to set out exactly what steps we would take with regards to the Bill.

Ben Bradley: To add to that point, it is hard to be specific about some of the changes we would make because a lot of the detail of the Bill defers to Ofcom guidance and the codes of practice. Obviously we all have the duties around child safety and adult safety, but the Ofcom guidance will suggest specific measures that we can take to do that, some of which we may take already, and some of which may go further than what we already do. Once we see the details of the codes, we will be able to give a clearer answer.

Broadly, from a TikTok perspective, through the design of the product and the way we approach safety, we are in a good place for when the new regime comes in, because we are already regulated by Ofcom under the VSP regime, but we would have to wait for the full detail. Beyond some of the companies that you will hear from today, this will touch 20,000 companies and will raise the floor for all the companies that will be regulated under the regime.

Mrs Miller

Q But you cannot give any further detail about specific changes you will make as a result of this legislation because you have not seen the guidance and the codes.

Ben Bradley: Yes, the codes of practice will recommend specific steps that we should take to achieve our duties. Until we see the detail of those codes it is hard to be specific about some of the changes that we would make.

The Chair

Barbara, you have just a couple of minutes.

Online Safety Bill (Second sitting)

Maria Miller Excerpts
Committee stage
Tuesday 24th May 2022

Public Bill Committees
The Chair

You are nodding, Ms Eagelton.

Jessica Eagelton: I agree with Professor McGlynn. Thinking about the broader landscape and intimate image abuse as well, I think there are some significant gaps. There is quite a piecemeal approach at the moment and issues that we are seeing in terms of enforcing measures on domestic abuse as well.

Mrs Maria Miller (Basingstoke) (Con)

Q Thank you to all the panellists; it is incredibly helpful to have you here. The strength of the Bill will really be underpinned by the strength of the criminal law that underpins it, and schedule 7 lists offences that relate to sexual images, including revenge pornography, as priority offences. Can the witnesses say whether they think the law is sufficient to protect women from having their intimate pictures shared without their consent, or indeed whether the Bill will do anything to prevent the making and sharing of deepfake images? What would you like to see?

Professor Clare McGlynn: You make a very good point about how, in essence, criminal offences are now going to play a key part in the obligations of platforms under this Bill. In general, historically, the criminal law has not been a friend to women and girls. The criminal law was not written, designed or interpreted with women’s harms in mind. That means that you have a very piecemeal, confusing, out-of-date criminal law, particularly as regards online abuse, yet that is the basis on which we have to go forward. That is an unfortunate place for us to be, but I think we can strengthen it.

We could strengthen schedule 7 by, for example, including trafficking offences. There are tens of thousands of cases of trafficking, as we know from yourselves and whistleblowers, that platforms could be doing so much more about, but that is not a priority offence. The Obscene Publications Act distribution of unlawful images offence is not included. That means that incest porn, for example, is not a priority offence; it could be if we put obscene publications in that provision. Cyber-flashing, which again companies could take a lot of steps to act against, is not listed as a priority offence. Blackmail—sexual extortion, which has risen exponentially during the pandemic—again is not listed as a priority offence.

Deepfake pornography is a rising phenomenon. It is not an offence in English law to distribute deepfake pornography at the moment. That could be a very straightforward, simple change in the Bill. Only a few words are needed. It is very straightforward to make that a criminal offence, thanks to Scots law, where it is actually an offence to distribute altered images. The way the Bill is structured means the platforms will have to go by the highest standard, so in relation to deepfake porn, it would be interpreted as a priority harm—assuming that schedule 7 is actually altered to include all the Scottish offences, and the Northern Irish ones, which are absent at the moment.

The deepfake example points to a wider problem with the criminal law on online abuse: the laws vary considerably across the jurisdictions. There are very different laws on down-blousing, deepfake porn, intimate image abuse, extreme pornography, across all the different jurisdictions, so among the hundreds of lawyers the platforms are appointing, I hope they are appointing some Scots criminal lawyers, because that is where the highest standard tends to be.

Mrs Miller

Q Would the other panellists like to comment on this?

Jessica Eagelton: I think something that will particularly help in this instance is having that broad code of practice; that is a really important addition that must be made to the Bill. Refuge is the largest specialist provider of gender-based violence services in the country. We have a specialist tech abuse team who specialise in technology-facilitated domestic abuse, and what they have seen is that, pretty consistently, survivors are being let down by the platforms. They wait weeks and weeks for responses—months sometimes—if they get a response at all, and the reporting systems are just not up to scratch.

I think it will help to have the broad code of practice that Janaya mentioned. We collaborated with others to produce a workable example of what that could look like, for Ofcom to hopefully take as a starting point if it is mandated in the Bill. That sets out steps to improve the victim journey through content reporting, for example. Hopefully, via the code of practice, a victim of deepfakes and other forms of intimate image abuse would be able to have a more streamlined, better response from platforms.

I would also like to say, just touching on the point about schedule 7, that from the point of view of domestic abuse, there is another significant gap in that: controlling and coercive behaviour is not listed, but it should be. Controlling and coercive behaviour is one of the most common forms of domestic abuse. It carries serious risk; it is one of the key aggravating factors for domestic homicide, and we are seeing countless examples of that online, so we think that is another gap in schedule 7.

The Chair

Ms Walker?

Janaya Walker: Some of these discussions almost reiterate what I was saying earlier about the problematic nature of this, in that so much of what companies are going to be directed to do will be tied only to the specific schedule 7 offences. There have been lots of discussions about how you respond to some harms that reach a threshold of criminality and others that do not, but that really contrasts with the best practice approach to addressing violence against women and girls, which is really trying to understand the context and all of the ways that it manifests. There is a real worry among violence against women and girls organisations about the minimal response to content that is harmful to adults and children, but will not require taking such a rigorous approach.

Having the definition of violence against women and girls on the face of the Bill allows us to retain those expectations on providers as technology changes and new forms of abuse emerge, because the definition is there. It is VAWG as a whole that we are expecting the companies to address, rather than a changing list of offences that may or may not be captured in criminal law.

--- Later in debate ---
Kirsty Blackman

Q My first question is for Lulu. Do small tech companies have enough staff with technical expertise to be able to fulfil their obligations under the Bill?

Lulu Freemont: It is a great question. One of the biggest challenges is capacity. We hear quite a lot from the smaller tech businesses within our membership that they will have to divert their staff away from existing work to comply with the regime. They do not have compliance teams, and they probably do not have legal counsel. Even at this stage, to try to understand the Bill as it is currently drafted—there are lots of gaps—they are coming to us and saying, “What does this mean in practice?” They do not have the answers, or the capability to identify that. Attendant regulatory costs—thinking about the staff that you have and the cost, and making sure the regulation is proportionate to the need to divert away from business development or whatever work you might be doing in your business—are really fundamental.

Another real risk, and something in the Bill that smaller businesses are quite concerned about, is the potential proposal to extend the senior management liability provisions. We can understand them being in there to enable the regulators to do their job—information requests—but if there is any extension into individual pieces of content, coupled with a real lack of definitions, those businesses might find themselves in the position of restricting access to their services, removing too much content or feeling like they cannot comply with the regime in a proportionate way. That is obviously a very extreme case study. It will be Ofcom’s role to make sure that those businesses are being proportionate and understand the provisions, but the senior management liability does have a real, chilling impact on the smaller businesses within our membership.

Adam Hildreth: One of the challenges that we have seen over the last few years is that you can have a business that is small in revenue but has a huge global user base, with millions of users, so it is not really a small business; it just has not got to the point where it is getting advertisers and getting users to pay for it. I have a challenge on the definition of a small to medium-sized business. Absolutely, for start-ups with four people in a room—or perhaps even still just two—that do not have legal counsel or anything else, we need to make it simple for those types of businesses to ingest and understand what the principles are and what is expected of them. Hopefully they will be able to do quite a lot early on.

The real challenge comes when someone labels themselves as a small business but they have millions of users across the globe—and sometimes actually quite a lot of people working for them. Some of the biggest tech businesses in the world that we all use had tens of people working for them at one point in time, when they had millions of users. That is the challenge, because there is an expectation for the big-tier providers to be spending an awful lot of money, when the small companies are actually directly competing with them. There is a challenge to understanding the definition of a small business and whether that is revenue-focused, employee-focused or about how many users it has—there may be other metrics.

Ian Stevenson: One of the key questions is how much staffing this will actually take. Every business in the UK that processes data is subject to GDPR from day one. Few of them have a dedicated data protection officer from day one; it is a role or responsibility that gets taken on by somebody within the organisation, or maybe somebody on the board who has some knowledge. That is facilitated by the fact that there is a really clear set of requirements there, and there are a lot of services you can buy and consume that help you deliver compliance. If we can get to a point where we have codes of practice that make very clear recommendations, then even small organisations that perhaps do not have that many staff to divert should be able to achieve some of the basic requirements of online safety by buying in the services and expertise that they need. We have seen with GDPR that many of those services are affordable to small businesses.

If we can get the clarity of what is required right, then the staff burden does not have to be that great, but we should all remember that the purpose of the Bill is to stop some of the egregiously bad things that happen to people as a result of harmful content, harmful behaviours and harmful contact online. Those things have a cost in the same way that implementing data privacy has a cost. To come back to Lulu’s point, it has to be proportionate to the business.

Mrs Miller

Q Adam, you said a few moments ago that companies are starting to put safety at the core of what they do, which will be welcome to us all—maybe it should have happened a lot earlier. I know you have worked a lot in that area. Regulators and company owners will have to depend on an ethical culture in their organisations if they are going to abide by the new regulations, because they cannot micromanage and regulators cannot micromanage. Will the Bill do enough to drive that ethical culture? If not, what more could it do or could the industry do? I would be really interested in everybody’s answer to this one, but I will start with Adam.

Adam Hildreth: What we are seeing from the people that are getting really good at this and that really understand it is that they are treating this as a proper risk assessment, at a very serious level, across the globe. When we are talking about tier 1s, they are global businesses. When they do it really well, they understand risk and how they are going to roll out systems, technology, processes and people in order to address that. That can take time. Yes, they understand the risk, who it is impacting and what they are going to do about it, but they still need to train people and develop processes and maybe buy or build technology to do it.

We are starting to see that work being done really well. It is done almost in the same way that you would risk assess anything else: corporate travel, health and safety in the workplace—anything. It should really become one of those pillars. All those areas I have just gone through are regulated. Once you have regulation there, it justifies why someone is doing a risk assessment, and you will get businesses and corporates going through that risk assessment process. We are seeing others that do not do the same level of risk assessment and they do not have that same buy-in.

Mrs Miller

Q Lulu, how do you drive a culture change?

Lulu Freemont: TechUK’s membership is really broad. We have cyber and defence companies in our membership, and large platforms and telcos. We speak on behalf of the sector. We would say that there is a real commitment to safety and security.

To bring it back to regulation, the risk-based approach is very much the right one—one that we think has the potential to really deliver—but we have to think about the tech ecosystem and its diversity. Lots of TechUK members are on the business-to-business side and are thinking about the role that they play in supporting the infrastructure for many of the platforms to operate. They are not entirely clear that they are exempt in the Bill. We understand that it is a very clear policy intention to exempt those businesses, but they do not have the level of legal clarity that they need to understand their role as access facilities within the tech.

That is just one example of a part of the sector that you would not expect to be part of this culture change or regulation but which is being caught in it slightly as an unintended consequence of legal differences or misinterpretations. Coming from that wide-sector perspective, we think that we need clarity on those issues to understand the different functionalities, and each platform and service will be different in their approach to this stuff.

Mrs Miller

Q Ian, how do you drive a culture change in the sector?

Ian Stevenson: I think you have to look at the change you are trying to effect. For many people in the sector, there is a lack of awareness about what happens when the need to consider safety in building features is not put first. Even when you realise how many bad things can happen online, if you do not know what to do about it, you tend not to be able to do anything about it.

If we want to change culture—it is the same for individual organisations as for the sector as a whole—we have to educate people on what the problem is and give them the tools to feel empowered to do something about it. If you educate and empower people, you remove the barrier to change. In some places, an extremely ethical people-centric and safety-focused culture very naturally emerges, but in others, less so. That is precisely where making it a first-class citizen in terms of risk assessment for boards and management becomes so important. When people see management caring about things, that gets pushed out through the organisations.

Kim Leadbeater

Q In your view, what needs to be added or taken away from the Bill to help it achieve the Government’s aim of making the UK

“the safest place in the world to be online”?

Lulu Freemont: First, I want to outline that there are some strong parts in the Bill that the sector really supports. I think the majority of stakeholders would agree that the objectives are the right ones. The Bill tries to strike a balance between safety, free speech and encouraging innovation and investment in the UK’s digital economy. The approach—risk-based, systems-led and proportionate—is the right one for the 25,000 companies that are in scope. As it does not focus on individual pieces of content, it has the potential to be future-proof and to achieve longer-term outcomes.

The second area in the Bill that we think is strong is the prioritisation of illegal content. We very much welcome the clear definitions of illegal content on the face of the Bill, which are incredibly useful for businesses as they start to think about preparing for their risk assessment on illegal content. We really support Ofcom as the appropriate regulator.

There are some parts of the Bill that need specific focus and, potentially, amendments, to enable it to deliver on those objectives without unintended consequences. I have already mentioned a few of those areas. The first is defining harmful content in primary legislation. We can leave it to codes to identify the interpretations around that, but we need definitions of harmful content so that businesses can start to understand what they need to do.

Secondly, we need clarity that businesses will not be required to monitor every piece of content as a result of the Bill. General monitoring is prohibited in other regions, and we have concerns that the Online Safety Bill is drifting away from those norms. The challenges of general monitoring are well known: it encroaches on individual rights and could result in the over-removal of content. Again, we do not think that the intention is to require companies of all sizes to look at every piece of content on their site, but it might be one of the unintended consequences, so we would like an explicit prohibition of general monitoring on the face of the Bill.

We would like to remove the far-reaching amendment powers of the Secretary of State. We understand the need for technical powers, which are best practised within regulation, but taking those further so that the Secretary of State can amend the regime in such an extreme way to align with public policy is of real concern, particularly to smaller businesses looking to confidently put in place systems and processes. We would like some consideration of keeping senior management liability as it is. Extending that further is only going to increase the chilling impact that it is having and the environment it is creating within UK investment. The final area, which I have just spoken about, is clarifying the scope. The business-to-business companies in our membership need clarity that they are not in scope and for that intention to be made clear on the face of the Bill.

We really support the Bill. We think it has the potential to deliver. There are just a few key areas that need to be changed or amended slightly to provide businesses with clarity and reassurances that the policy intentions are being delivered on.

Adam Hildreth: To add to that—Lulu has covered absolutely everything, and I agree—the critical bit is not monitoring individual pieces of content. Once you have done your risk assessment and put in place your systems, processes, people and technology, that is what people are signing up for. They are not signing up for this end assessment where, because you find that one piece of harmful content exists, or maybe many, you have failed to abide by what you are really signing up to.

That is the worry from my perspective: that people do a full risk assessment, implement all the systems, put in place all the people, technology and processes that they need, do the best job they can and have understood what investment they are putting in, and someone comes along and makes a report to a regulator—Ofcom, in this sense—and says, “I found this piece of content there.” That may expose weaknesses, but the very best risk assessments are ongoing ones anyway, where you do not just put it away in a filing cabinet somewhere and say, “That’s done.” The definitions of online harms and harmful content change on a daily basis, even for the biggest social media platforms; they change all the time. There was talk earlier about child sexual abuse material that appears as cartoons, which would not necessarily be defined by certain legislation as illegal. Hopefully the legislation will catch up, but that is where that risk assessment needs to be made again, and policies may need to be changed and everything else. I just hope we do not get to the point where the individual monitoring of content, or content misses, is the goal of the Bill—that the approach taken to online safety is this overall one.

--- Later in debate ---
Barbara Keeley

Q Jared Sine, did you have anything to add?

Jared Sine: Sure. I would add a couple of thoughts. We run our own age verification checks, which we do through the traditional age gate but also through a number of other scans that we run.

Again, online dating platforms are a little different. We warn our users upfront that, as they are going to be meeting people in real life, there is a fine balance between safety and privacy, and we tend to lean a little more towards safety. We announce to our users that we are going to run message scans to make sure there is no inappropriate behaviour. In fact, one of the tools we have rolled out is called “Are you sure? Does this bother you?”, through which our AI looks at the message a user is planning to send and, if it is an inappropriate message, pops up a flag that says, “Are you sure you want to send this?” Then, if they go ahead and send it, the person receiving it at the other end will get a pop-up that says, “This may not be something you want to see. Go ahead and click here if you want to.” If they open it, they then get another pop-up that asks “Does this bother you?” and, if it does, they can report the user immediately.

We think that is an important step to keep our platform safe. We make sure our users know that it is happening, so it is not under the table. However, we think there has to be a balance between safety and privacy, especially when we have users who are meeting in person. We have actually demonstrated on our platforms that this reduces harassment and behaviour that would otherwise be untoward or that you would not want on the platform.

We think that we have to be careful not to tie the hands of industry to be able to come up with technological solutions and advances that can work side by side with third-party tools and solutions. We have third-party ID verification tools that we use. If we identify or believe a user is under the age of 18, we push them through an ID verification process.

The other thing to remember, particularly as it relates to online dating, is that companies such as ours and Bumble have done the right thing by saying “18-plus only on our platforms”. There is no law that says that an online dating platform has to be 18-plus, but we think it is the right thing to do. I am a father of five kids; I would not want kids on my platform. We are very vigilant in taking steps to make sure we are using the latest and greatest tools available to try to make sure that our platforms are safe.

Mrs Miller

Q Rachel, we have, in you, what we are told is a leading, pre-eminent authority on the issue of age verification, so we are listening very carefully to what you say. I am thinking about the evidence we had earlier today, which said that it is reasonably straightforward for a large majority of young people to subvert age verification through the use of VPNs. You have been advocating third-party verification. How could we also deal with this issue of subverting the process through the use of the VPNs?

Dr Rachel O'Connell: I am the author of the technical standard PAS 1296, an age checking code of practice, which is becoming a global standard at the moment. We worked a lot with privacy and security and identity experts. It should have taken nine months, but it took a bit longer. There was a lot of thought that went into it. Those systems were developed to, as I just described, ensure a zero data, zero knowledge kind of model. What they do is enable those verifications to take place and reduce the requirement. There is a distinction between monitoring your systems, as was said earlier, for age verification purposes and abuse management. They are very different. You have to have abuse management systems. It is like saying that if you have a nightclub, you have to have bouncers. Of course you have to check things out. You need bouncers at the door. You cannot let people go into the venue, then afterwards say that you are spotting bad behaviour. You have to check at the door that they are the appropriate age to get into the venue.

Mrs Miller

Q Can they not just hop on a VPN and bypass the whole system anyway?

Dr Rachel O'Connell: I think you guys will be aware of the DCMS programme of work about the verification of children last year. As part of that, there was a piece of research that asked children what they would think about age verification. The predominant thing that came across from young children is that they are really tired of having to deal with weirdos and pervs. It is an everyday occurrence for them.

To just deviate slightly to the business model, my PhD is in forensics and tracking paedophile activity on the internet way back in the ’90s. At that time, guys would have to look for kids. Nowadays, on TikTok and various livestream platforms, the algorithms recognise that an individual—a man, for example—is very interested in looking at content produced by kids. The algorithms see that a couple of times and go, “You don’t have to look anymore. We are going to seamlessly connect you with kids who livestream. We are also going to connect you with other men that like looking at this stuff.”

If you are on these livestream sites at 3 o’clock in the morning, you can see these kids who are having sleepovers or something. They put their phone down to record whatever the latest TikTok dance is, and they think that they are broadcasting to other kids. You would assume that, but what they then hear is the little pops of love hearts coming on to the screen and guys’ voices saying, “Hey sweetie, you look really cute. Lick your lips. Spread your legs.” You know where I am going with this.

The Online Safety Bill should look at the systems and processes that underpin these platforms, because there is gamification of kids. Kids want to become influencers—maybe become really famous. They see the views counter and think, “Wow, there are 200 people looking at us.” Those people are often men, who will co-ordinate their activities at the back. They will push the boys a little bit further, and if a girl is on her own, they will see. If the child does not respond to the request, they will drop off. The kid will think, “Oh my God. Well, maybe I should do it this one time.”

What we have seen is a quadrupling of child sexual abuse material online that has been termed “self-generated”, because the individual offender has not actually produced it. From a psychological perspective, it is a really bad name, but that is a separate topic. Imagine if that was your kid who had been coerced into something that had then been labelled as “self-generated”. The business models that underpin those processes that happen online are certainly something that should really be within scope.

We do not spend enough time thinking about the implications of the use of recommendation engines and so on. I think the idea of the VPN is a bit of a red herring. Children want safety. They do not want to have to deal with this sort of stuff online. There are other elements. If you were a child and felt that you might be a little bit fat, you could go on YouTube and see whether you could diet or something. The algorithms will pick that up also. There is a tsunami of dieting and thinspiration stuff. There is psychological harm to children as a result of the systems and processes that these companies operate.

There was research into age verification solutions and trials run with BT. Basically, the feedback from both parents and children was, “Why doesn’t this exist already?”. If you go into your native EE app where it says, “Manage my family” and put in your first name, last name and mobile number and your child’s first name, last name and date of birth, it is then verified that you are their parent. When the child goes on Instagram or TikTok, they put in their first and last name. The only additional data point is the parent’s mobile number. The parent gets a notification and they say yes or no to access.

There are solutions out there. As others have mentioned, the young people want them and the parents want them. Will people try to work around them? That can happen, but if it is a parent-initiated process or a child-initiated process, you have the means to know the age bands of the users. From a business perspective, it makes a lot of sense because you can have a granular approach to the offerings you give to each of your customers in different age bands.

Nima Elmi: Just to add to what Rachel has said, I think she has articulated extremely well the complexities of the issues around not only age verification, but business models. Ultimately, this is such a complex matter that it requires continued consultation across industry, experts and civil society to identify pragmatic recommendations for industry when it comes to not only verifying the age of their users, but thinking about the nuanced differences between platforms, purposes, functionality and business models, and what that means.

In the context of the work we do here at Bumble, we are clear about our guidelines requiring people to be 18-plus to download our products from app stores, as well as ensuring that we have robust moderation processes to identify and remove under-18s from our platforms. There is an opportunity here for the Bill to go further in providing clarity and guidance on the issue of accessibility of children to services.

Many others have said over the course of today’s evidence that there needs to be a bit more colour put into definitions, particularly when certain sections of the Bill refer to what constitutes a “significant number of users” for determining child accessibility to platforms. Coupled with the fact that age verification or assurance is a complex area in and of itself and the nuance between how social media may engage with it versus a dating or social networking platform, I think that more guidance is very much needed and a much more nuanced approach would be welcome.

None Portrait The Chair
- Hansard -

I have three Members and the Minister to get in before 5 o’clock, so I urge brief questions and answers please.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Ms McDonald?

Rhiannon-Faye McDonald: I was just thinking of what I could add to what Susie has said. My understanding is that cross-platform abuse is difficult to deal with because of the limits on sharing information between different platforms—for example, where a platform has identified an issue or an offender but has not shared that information with other platforms on which someone may continue the abuse. I am not an expert in tech and cannot present you with a solution to that, but I feel that sharing intelligence would be an important part of the solution.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q What risks do end-to-end encrypted platforms pose to children, and how should the Bill seek to mitigate those risks specifically?

Susie Hargreaves: We are very clear that end-to-end encryption should be within scope, as you have heard from other speakers today. Obviously, the huge threat on the horizon is the end-to-end encryption on Messenger, which would result in the loss of millions of images of child sexual abuse. In common with previous speakers, we believe that the technology is there. We need not to demonise end-to-end encryption, which in itself is not bad; what we need to do is ensure that children do not suffer as a consequence. We must have mitigations and safety mechanisms in place so that we do not lose these child sexual abuse images, because that means that we will not be able to find and support those children.

Alongside all the child protection charities, we are looking to ensure that protections equivalent to the current ones are in place in the future. We do not accept that the internet industry cannot put them in place. We know from experts such as Dr Hany Farid, who created PhotoDNA, that those mechanisms and protections exist, and we need to ensure that they are put in place so that children do not suffer as a consequence of the introduction of end-to-end encryption. Rhiannon has her own experiences as a survivor, so I am sure she would agree with that.

Rhiannon-Faye McDonald: I absolutely would. I feel very strongly about this issue, which has been concerning me for quite some time. I do not want to share too much, but I am a victim of online grooming and child sex abuse. There were images and videos involved, and I do not know where they are and who has seen them. I will never know that. I will never have any control over it. It is horrifying. Even though my abuse happened 19 years ago, I still walk down the street wondering whether somebody has seen those images and recognises me from them. It has a lifelong impact on the child, and it impacts on recovery. I feel very strongly that if end-to-end encryption is implemented on platforms, there must be safeguards in place to ensure we can continue to find and remove these images, because I know how important that is to the subject of those images.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q So what needs to change in the Bill to make sure that happens? I am not clear.

Susie Hargreaves: We just want to make sure that the ability to scan in an end-to-end encrypted environment is included in the Bill in some way.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q The ability to scan is there right now—we have got that—so you are just trying to make sure we are standing still, basically. Am I correct in my understanding?

Susie Hargreaves: I think with technology you can never stand still. We do not know what is coming down the line. We have to deal with the here and now, but we also need to be prepared to deal with whatever comes down the line. The answer, “Okay, we will just get people to report,” is not a good enough replacement for the ability to scan for images.

When the privacy directive was introduced in Europe and Facebook stopped scanning for a short period, we lost millions of images. What we know is that we must continue to have those safety mechanisms in place. We need to work collectively to do that, because it is not acceptable to lose millions of images of child sexual abuse and create a forum where people can safely share them without any repercussions, as Rhiannon says. One survivor we talked to in this space said that one of her images had been recirculated 70,000 times. The ability to have a hash of a unique image, go out and find those duplicates and make sure they are removed means that people are not re-victimised on a daily basis. That is essential.
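The mechanism described here, hashing a known image once and then finding every recirculated duplicate, can be illustrated with a short sketch. This is an assumption-laden stand-in: real systems such as PhotoDNA use perceptual hashes that also match resized or re-encoded copies, whereas the cryptographic hash below only matches exact file duplicates, and the hash list is an empty placeholder rather than any real database.

```python
# Illustrative stand-in for hash-list matching of known images. Real deployments
# (e.g. PhotoDNA) use perceptual hashing; SHA-256 here only catches exact copies.
import hashlib
from pathlib import Path

# Placeholder for hashes of images already confirmed and catalogued elsewhere.
KNOWN_HASHES: set[str] = set()


def image_hash(path: Path) -> str:
    """Digest of the file bytes; every exact duplicate produces the same value."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def check_upload(path: Path) -> str:
    """Return the action for an uploaded file: block duplicates of known images."""
    if image_hash(path) in KNOWN_HASHES:
        return "block-and-report"
    return "allow"
```

Because every copy of the same file produces the same digest, a single catalogued hash is enough to find and remove each recirculated duplicate, which is the point the witness makes about repeated re-victimisation.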

Online Safety Bill (Third sitting)

Maria Miller Excerpts
Committee stage & Committee Debate - 3rd sitting
Thursday 26th May 2022

(1 year, 11 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 26 May 2022 - (26 May 2022)
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q187 Good morning to our witnesses. Thank you for joining us today. One of the main criticisms of the Bill is that the vast majority of the detail will not be available until after the legislation is enacted, under secondary legislation and so on. Part of the problem is that we are having difficulty in differentiating the “legal but harmful” content. What impact does that have?

William Perrin: At Carnegie, we saw this problem coming some time ago, and we worked in the other place with Lord McNally on a private Member’s Bill —the Online Harms Reduction Regulator (Report) Bill—that, had it carried, would have required Ofcom to make a report on a wide range of risks and harms, to inform and fill in the gaps that you have described.

Maria Miller Portrait Mrs Maria Miller (Basingstoke) (Con)
- Hansard - -

On a point of order, Ms Rees. There is a gentleman taking photographs in the Gallery.

None Portrait The Chair
- Hansard -

There is no photography allowed here.

William Perrin: Unfortunately, that Bill did not pass and the Government did not quite take the hint that it might be good to do some prep work with Ofcom to provide some early analysis to fill in holes in a framework Bill. The Government have also chosen in the framework not to bring forward draft statutory instruments or to give indications of their thinking in a number of key areas of the Bill, particularly priority harms to adults and the two different types of harms to children. That creates uncertainty for companies and for victims, and it makes the Bill rather hard to scrutinise.

I thought it was promising that the Government brought forward a list of priority offences in schedule 7 —I think that is where it is; I get these things mixed up, despite spending hours reading the thing. That was helpful to some extent, but the burden is on the Government to reduce complexity by filling in some of the blanks. It may well be better to table an amendment to bring some of these things into new schedules, as we at Carnegie have suggested—a schedule 7A for priority harms to adults, perhaps, and a 7B and 7C for children and so on—and then start to fill in some of the blanks in the regime, particularly to reassure victims.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Thank you. I am going to bring Maria Miller in now.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q This evidence session is underlining to me how complicated these issues are. I am really grateful for your expertise, because we are navigating through a lot of issues. With slight trepidation I open the conversation up into another area—the issue of protection for children. One of the key objectives of the legislation is to ensure a higher level of protection for children than for adults. In your view, does the Bill achieve that? I am particularly interested in your views on whether the risks of harm to children should be set out on the face of the Bill, and if so, what harms should be included. Can I bring Mat in here?

Mat Ilic: Thank you so much. The impact of social media in children’s lives has been a feature of our work since 2015, if not earlier; we have certainly researched it from that period. We found that it was a catalyst to serious youth violence and other harms. Increasingly, we are seeing it as a primary issue in lots of the child exploitation and missing cases that we deal with—in fact, in half of the cases we have seen in some of the areas that we work in, it featured as the primary reason rather than a coincidental one. The online harm is the starting point rather than a conduit.

In relation to the legislation, all our public statements on this have been informed by user research. I would say that is one of the central principles to think through in the primary legislation—a safety-by-design focus. We have previously called this the toy car principle, which means any content or product that is designed with children in mind needs to be tested in a way that is explicitly for children, as Mr Moy talked about. It needs to have some age-specific frameworks built in, but we also need to go further than that by thinking about how we might raise the floor, rather than necessarily trying to tackle explicit harms. Our point is that we need to remain focused on online safety for children and the drivers of online harm and not the content.

The question is, how can that be done? One way is the legal design requirement for safety, and how that might play out, as opposed to having guiding principles that companies might adopt. Another way is greater transparency on how companies make particular decisions, and that includes creating or taking down content that pertains to children. I want to underline the point about empowerment for children who have been exposed to or have experienced harm online, or offline as a result of online harm. That includes some kind of recourse to be able to bring forward cases where complaints, or other issues, were not taken seriously by the platforms.

If you read the terms and conditions of any given technology platform, which lots of young people do not do on signing up—I am sure lots of adults do not do that either—you realise that even with the current non-legislative frameworks that the companies deploy to self-regulate, there is not enough enforcement in the process. For example, if I experience some kind of abuse and complain, it might never be properly addressed. We would really chime on the enforcement of the regulatory environment; we would try to raise the floor rather than chase specific threats and harms with the legislation.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q Can I bring Lorna in here? We are talking about moving from content to the drivers of harm. Where would you suggest that should be achieved within the Bill?

Professor Lorna Woods: I think by an overarching risk assessment rather than one that is broken down into the different types of content, because that, in a way, assumes a certain knowledge of the type of content before you can do a risk assessment, so you are into a certain circular mode there. Rather than prejudging types of content, I think it would be more helpful to look at what is there and what the system is doing. Then we could look at what a proportionate response would be—looking, as people have said, at the design and the features. Rather than waiting for content to be created and then trying to deal with it, we could look at more friction at an earlier stage.

If I may add a technical point, I think there is a gap relating to search engines. The draft Bill excluded paid-for content advertising. It seems that, for user-to-user content, this is now in the Bill, bringing it more into line with the current standards for children under the video-sharing platform provisions. That does not apply to search. Search engines have duties only in relation to search content, and search content excludes advertising. That means, as I read it, that search engines would have absolutely no duties to children under their child safety duty in relation to advertising content. You could, for example, target a child with pornography and it would fall outside the regime. I think that is a bit of a gap.

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - - - Excerpts

Q Thank you, witnesses, for your time this morning. I am going to focus initially on journalistic content. Is it fair that the platforms themselves are having to try to define what journalistic content is and, by default, what a journalist is? Do you see a way around this?

William Moy: No, no, yes. First, no, it is not fair to put that all on the platforms, particularly because—I think this is a crucial thing for the Committee across the Bill as a whole—for anything to be done at internet scale, it has to be able to be done by dumb robots. Whatever the internet companies tell you about the abilities of their technology, it is not magic, and it is highly error-prone. For this duty to be meaningful, it has to be essentially exercised in machine learning. That is really important to bear in mind. Therefore, being clear about what it is going to tackle in a way that can be operationalised is important.

To your second point, it is really important in this day and age to question whether journalistic content and journalists equate to one another. I think this has come up in a previous session. Nowadays, journalism, or what we used to think of as journalism, is done by all kinds of people. That includes the same function of scrutiny and informing others and so on. It is that function that we care about—the passing of information between people in a democracy. We need to protect that public interest function. I think it is really important to get at that. I am sure there are better ways of protecting the public interest in this Bill by targeted protections or specifically protecting freedom of expression in specific ways, rather than these very broad, vague and general duties.

Online Safety Bill (Fourth sitting)

Maria Miller Excerpts
Committee stage & Committee Debate - 4th sitting
Thursday 26th May 2022

(1 year, 11 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 26 May 2022 - (26 May 2022)
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q One final question from me. I would like to discuss your thoughts on transparency and how we can make social media companies like Meta be more transparent and open with their data, beyond the measures we currently have in the Bill. For instance, we could create statute to allow academics or researchers in to examine their data. Do you have any thoughts on how this can be incentivised?

Stephen Almond: Transparency is a key foundation of data protection law in and of itself. As the regulator in this space, I would say that there is a significant emphasis within the data protection regime on ensuring that companies are transparent about the processing of personal data that they undertake. We think that that provides proportionate safeguards in this space. I would not recommend an amendment to the Bill on this point, because I would be keen to avoid duplication or an overlap between the regimes, but it is critical; we want companies to be very clear about how people’s personal data is being processed. It is an area that we are going to continue to scrutinise.

Maria Miller Portrait Mrs Maria Miller (Basingstoke) (Con)
- Hansard - -

May I ask a supplementary to that before I come on to my main question?

None Portrait The Chair
- Hansard -

Absolutely.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q Thank you so much for coming along. You spoke in your initial comments to my colleague about encryption. The challenges of encryption around child abuse images have been raised with us previously. How can we balance the need to allow people to have encrypted options, if possible, with the need to ensure that this does not adversely affect organisations such as the Internet Watch Foundation, which does so much good in protecting children and rooting out child abuse imagery?

Stephen Almond: I share your concern about this. To go back to what I was saying before, I think the approach that is set out in the Bill is proportionate and targeted. The granting of, ultimately, backstop powers to Ofcom to issue technology notices and to require services to deal with this horrendous material will have a significant impact. I think this will ensure that the regime operates in a risk-based way, where risks can be identified. There will be the firm expectation on service providers to take action, and that will require them to think about all the potential technological solutions that are available to them, be they content scanning or alternative ways of meeting their safety duties.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q My main question is about child safety, which is a prime objective for the Government in this legislation. Do you feel that the Bill’s definition of “likely to be accessed by children” should be more closely aligned with the one used in the ICO’s age-appropriate design code?

Stephen Almond: The objectives of both the Online Safety Bill and the children’s code are firmly aligned in respect of protecting children online. We have reviewed the definitions and, from our perspective, there are distinctions in the definition that is applied in the Bill and the children’s code, but we find no significant tension between them. My focus at the ICO, working in co-operation with Ofcom, will ultimately be on ensuring that there is clarity for business on how the definitions apply to their services, and that organisations know when they are in scope of the children’s code and what actions they should take.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q Do you think any further aspects of the age-appropriate design code should be incorporated into the Bill?

Stephen Almond: We are not seeking to incorporate further aspects of the code into the Bill. We think it is important that the regimes fit together coherently, but that that is best achieved through regulatory co-operation between the ICO and Ofcom. The incorporation of the children’s code would risk creating some form of regulatory overlap and confusion.

I can give you a strong assurance that we have a good track record of working closely with Ofcom in this area. Last year, the children’s code came into force, and not too long after it, Ofcom’s video-sharing platform regime came into force. We have worked very closely to make sure that those regimes are introduced in a harmonised way and that people understand how they fit together.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q Working closely with Ofcom is really good, but do you think there needs to be a duty to co-operate with Ofcom, or indeed with other regulators—to be specified in the Bill—in case relations become more tense in future?

Stephen Almond: The Bill has, in my view, been designed to work closely alongside data protection law. It supports effective co-operation between us and Ofcom by requiring and setting out a series of duties for Ofcom to consult with the ICO on the development of any codes of practice or formal guidance with an impact on privacy. With that framework in mind, I do not think there is a case to instil further co-operation duties in that way. I hope I can give you confidence that we and Ofcom will be working tirelessly together to promote the safety and privacy of citizens online. It is firmly in our interests and in the interest of society as a whole to do so.

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - - - Excerpts

Q Thank you for joining us, Mr Almond. You stated the aim of making the UK the

“safest place in the world to be online”.

In your view, what needs to be added or taken away from the Bill to achieve that?

Stephen Almond: I am not best placed to comment on the questions of online safety and online harms. You will speak to a variety of different experts who can comment on that point. From my perspective as a digital regulator, one of the most important things will be ensuring that the Bill is responsive to future challenges. The digital world is rapidly evolving, and we cannot necessarily envisage all the developments in technology that will come, or the emergence of new harms. The data protection regime is a principles-based piece of legislation. That gives us a great degree of flexibility and discretion to adapt to novel forms of technology and to provide appropriate guidance as challenges emerge. I really recommend retaining that risk-based, principles-based approach to regulation that is envisaged currently in the Online Safety Bill.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Lynn Perry is on the line, but we have lost her for the moment. I am afraid we are going to have to press on.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q I want to focus on one particular issue, which is anonymity. Kick It Out has done so much with the FA to raise awareness of that issue. I was interested in your views on how the Bill treats that. The Bill mentions anonymity and pseudonymity, but it does so only once. Should the Bill take a clearer stance on online anonymity? Do you have any views on whether people should be able to use the internet fully anonymously, or should they disclose their identity to the platform? Do you have any thoughts on that? You have done a huge amount of work on it.

Sanjay Bhandari: There is quite a lot in that question. In terms of whether people should be fully anonymous or not, it depends on what you mean by fully. I am a lawyer, so I have spent 30 years specialising in the grey, rather than in the black and white. It really does depend on what you mean by fully. In my experience, nothing is absolute. There is no absolute right to freedom of speech; I cannot come in here and shout “Fire!” and make you all panic. There is also no absolute right to anonymity; I cannot use my anonymity online as a cloak to commit fraud. Everything is qualified. It is a question of what is the balance of those qualifications and what those qualifications should be, in the particular context of the problem that we are seeking to address.

The question in this context is around the fact that anonymity online is actually very important in some contexts. If you are gay in a country where that is illegal, being anonymous is a fantastic way to be able to connect with people like you. In a country that has a more oppressive regime, anonymity is another link to the outside world. The point of the Bill is to try to get the balance so that anonymity is not abused. For example, when a football player misses a penalty in a cup final, the point of the Bill is that you cannot create a burner account and instantly send them a message racially abusing them and then delete the account—because that is what happens now. The point of the Bill, which we are certainly happy with in general terms, is to draw a balance in the way that identity verification must be offered as an option, and to give users more power over who they interact with, including whether they wish to engage only with verified accounts.

We will come back and look in more detail at whether we would like more amendments, and we will also work with other organisations. I know that my colleague Stephen Kinsella of Clean up the Internet has been looking at those anonymity provisions and at whether verification should be defined and someone’s status visible on the face of the platforms, for instance. I hope that answers those two or three questions.

Maria Miller Portrait Mrs Miller
- Hansard - -

That is very helpful; thank you.

None Portrait The Chair
- Hansard -

I saw you nodding, Ms Perry. Do you wish to add anything?

Lynn Perry: I agree. The important thing, particularly from the perspective of Barnardo’s as a children’s charity, is the right of children to remain safe and protected online and in no way compromised by privacy or anonymity considerations online. I was nodding along at certain points to endorse the need to ensure that the right balance is struck for protections for those who might be most vulnerable.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q We have heard a lot from other witnesses about the ability of Ofcom to regulate the smaller high-risk platforms. What is your view on that?

Poppy Wood: Absolutely, and I agree with what was said earlier, particularly by groups such as HOPE not hate and Antisemitism Policy Trust. There are a few ways to do this, I suppose. As we are saying, at the moment the small but high-risk platforms just are not really caught in the current categorisation of platforms. Of course, the categories are not even defined in the Bill; we know there are going to be categories, but we do not know what they will be.

I suppose there are different ways to do this. One is to go back to where this Bill started, which was not to have categories of companies at all but to have a proportionality regime, where depending on your size and your functionality you had to account for your risk profile, and it was not set by Ofcom or the Government. The problem of having very prescriptive categories—category 1, category 2A, category 2B—is, of course, that it becomes a race to the bottom in getting out of these regulations without having to comply with the most onerous ones, which of course are category 1.

There is also a real question about search. I do not know how they have wriggled out of this, but it was one of the biggest surprises in the latest version of the Bill that search had been given its own category without many obligations around adult harm. I think that really should be revisited. All the examples that were given earlier today are absolutely the sort of thing we should be worrying about. If someone can google a tractor in their workplace and end up looking at a dark part of the web, there is a problem with search, and I think we should be thinking about those sorts of things. Apologies for the example, but it is a really, really live one and it is a really good thing to think about how search promotes these kinds of content.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q I want to touch on something we have not talked about a lot today, which is enforcement and the enforcement powers in the Bill. There are significant enforcement powers in the Bill, but do our two witnesses here think those enforcement powers are enough? Eva?

Eva Hartshorn-Sanders: Are you specifically asking about the takedown notices and the takedown powers?

Maria Miller Portrait Mrs Miller
- Hansard - -

No, I am talking about director liability and the enforcement on companies.

Eva Hartshorn-Sanders: Right. I think the responsibility on both companies and senior executives is a really critical part of this legislative package. You see how adding liability alongside financial penalties works in health and safety legislation and corporate manslaughter provisions to motivate changes not only within company culture but in the work that they are doing and what they factor into the decisions they make. It is a critical part of this Bill.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q Is there more that could or should be added to the Bill?

Eva Hartshorn-Sanders: I think it is a good start. I would want to have another look at it to say more. There is a review after two years, as set out in clause 149, so there could be a factor that gets added into that, as well.

Maria Miller Portrait Mrs Miller
- Hansard - -

Poppy, do you have anything to add?

Poppy Wood: Yes. I think we could go much further on enforcement. One of the things that I really worry about is that if the platforms make an inadequate risk assessment, there is not much that Ofcom can do about it. I would really like to see powers for Ofcom to say, “Okay, your risk assessment hasn’t met the expectations that we put on you, so we want you to redo it. And while you’re redoing it, we may want to put you into a different category, because we may want to have higher expectations of you.” That way, you cannot start a process where you intentionally make an inadequate risk assessment in order to extend the process of you being properly regulated. I think that is one thing.

Then, going back to the point about categorisation, I think that Ofcom should be given the power to recategorise companies quickly. If you think that a category 2B company should be a category 1 company, what powers are there for Ofcom to do that? I do not believe that there are any for Ofcom to do that, certainly not to do it quickly, and when we are talking about small but high-risk companies, that is absolutely the sort of thing that Ofcom should be able to do—to say, “Okay, you are now acting like a category 1 company.” TikTok, Snapchat—they all started really small and they accelerated their growth in ways that we just could not have predicted. When we are talking about the emergence of new platforms, we need to have a regulator that can account for the scale and the pace at which these platforms grow. I think that is a place where I would really like to see Ofcom focusing.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q I have a question for the Centre for Countering Digital Hate. I raised some of your stats on reporting with Meta—Facebook—when they were here, such as the number of reports that are responded to. They basically said, “This is not true any more; we’re now great”—I am paraphrasing, obviously. Could you please let us know whether the reporting mechanism on major platforms—particularly Facebook—is now completely fixed, or whether there are still lots of issues with it?

Eva Hartshorn-Sanders: There are still lots of issues with it. We recently put a report out on anti-Muslim hatred and found that 90% of the content that was reported was not acted on. That was collectively, across the platforms, so it was not just Facebook. Facebook was in the mid-90s, I think, in terms of its failure to act on that type of harmful content. There are absolutely still issues with it, and this regulation—this law—is absolutely necessary to drive change and the investment that needs to go into it.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Thank you. Maria Miller.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q A great deal of the discussion we are having about this Bill is its scope—what is covered and what is not covered. Many of us will look regularly at newspapers online, particularly the comments sections, which can be quite colourful. Should comments on newspaper publisher platforms be included in the scope of the Bill?

Owen Meredith: Yes, I think they should be included within the news publisher exemption as it is spelt out. As far as I understand, that has always been the intention, since the original White Paper many years ago that led to where we are today. There is a very good reason for that, not least the fact that the comments on news publisher websites are still subject to the responsibility of the editor and the publisher; they are subject to the regulation of the Independent Press Standards Organisation, in the case of those publishers who are regulated under the self-regulation system by IPSO, as the majority of my members are. There is a very different environment in news publisher websites’ comments sections, where you are actively seeking to engage with those and read those as a user, whereas on social media platforms that content can come to you without you wishing to engage with it.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q Can I just probe on that slightly? You say the comments are the responsibility of the editor. Does that mean that if something is published on there that is defamatory, it would then be attributed to the editor?

Owen Meredith: Everything published by the news site is ultimately the responsibility of the editor.

Matt Rogerson: I think there are various cases. I think Delfi is the relevant case in relation to comments, where if a publisher is notified of a defamatory comment within their comments section, they are legally liable for it if they do not take it down. To speak from a Guardian perspective, we would like comments sections to be included within the exemption. The self-regulation we have in place for our comments section has been quite a journey. We undertook quite a big bit of research on all the comments that had been left over an 11-year period. We tightened up significantly the processes that we had in place. We currently use a couple of steps to make sure those comments sections are well moderated. We use machine learning against very tightly defined terms, and then every single comment that is taken down is subject to human review. I think that works in the context of a relatively small website such as The Guardian, but it would be a much bigger challenge for a platform of the size of Facebook.
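The moderation pipeline described here, an automated filter against tightly defined terms with every automated takedown then checked by a person, can be sketched briefly. This is an assumed, simplified illustration: the term list, queue and function names are placeholders, not The Guardian’s actual tooling.

```python
# Simplified sketch of a two-stage comment-moderation pipeline: automated
# matching against tightly defined terms, then human review of every takedown.
# Terms and names are placeholders, not any publisher's real system.
import re

BLOCKED_TERMS = [r"\bexample-slur\b"]                  # tightly defined patterns
PATTERNS = [re.compile(t, re.IGNORECASE) for t in BLOCKED_TERMS]
review_queue: list[str] = []                           # every removal lands here


def moderate(comment: str) -> bool:
    """Return True to publish; False to hold the comment for human review."""
    if any(p.search(comment) for p in PATTERNS):
        review_queue.append(comment)                   # a moderator decides finally
        return False
    return True


print(moderate("Perfectly ordinary comment"))          # True
print(moderate("an example-slur in passing"))          # False, queued for review
print(len(review_queue))                               # 1
```

The scale point in the testimony is visible in the design: the automated stage is cheap to run, but guaranteeing human review of every removal is what becomes hard to sustain at the volume of a platform the size of Facebook.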

None Portrait The Chair
- Hansard -

Kim Leadbeater?

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q One final question from me, because I know others will want to come in. How do you think platforms such as Meta—I know we have used Meta as an example, but there are others—can be incentivised, beyond the statutory duty that we are currently imposing, to publish their data to allow academics and researchers into their platforms to examine exactly what is going on? Or is this the only way?

Frances Haugen: All industries that live in democratic societies must live within democratic processes, so I do believe that it is absolutely essential that we the public, through our democratic representatives like yourself, have mandatory transparency. The only two other paths I currently see towards getting any transparency out of Meta, because Meta has demonstrated that it does not want to give even the slightest slivers of data—for example, how many moderators there are—are via ESG, so we can threaten them with divestment by saying, “Prosocial companies are transparent with their data,” and via litigation. In the United States, sometimes we can get data out of these companies through the discovery process. If we want consistent and guaranteed access to data, we must put it in the Bill, because those two routes are probabilistic—we cannot ensure that we will get a steady, consistent flow of data, which is what we need to have these systems live within a democratic process.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q Turning to the issue of child safety and online abuse with images involving children, what should be added to or removed from the Bill to improve how it protects children online? Have you got any thoughts on that? Some groups have described the Bill’s content as overly broad. Would you make any comments on how effective it will be in terms of online safety for children?

Frances Haugen: I am not well versed on the exact provisions in the Bill regarding child safety. What I can say is that one of the most important things that we need to have in there is transparency around how the platforms in general keep children under the age of 13 off their systems—transparency on those processes—because we know that Facebook is doing an inadequate job. That is the single biggest lever in terms of child safety.

I have talked to researchers at places like Oxford and they talk about how, with social media, one of the critical windows is when children transition through puberty, because they are more sensitive on issues, they do not have great judgment yet and their lives are changing in really profound ways. Having mandatory transparency on what platforms are doing to keep kids off their platforms, and the ability to push for stronger interventions, is vital, because keeping kids off them until they are at least 13, if not 16, is probably the biggest single thing we can do to move the ball down the field for child safety.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q You say that transparency is so important. Can you give us any specifics about particular areas that should be subject to transparency?

Frances Haugen: Specifically for children or across the whole platform?

Maria Miller Portrait Mrs Miller
- Hansard - -

Specifically for children.

Frances Haugen: I will give you an example. Facebook has estimated ages for every single person on the platform, because the reality is that lots of adults also lie about their ages when they join, and advertisers want to target very specific demographics—for example, if you are selling a kit for a 40th birthday, you do not want to mis-target that by 10 years. Facebook has estimated ages for everyone on the platform. It could be required to publish every year, so that we could say, “Hey, for the kids on the platform who you currently believe, using your estimated ages, are 14 years old—based not on how old they say they are, but on your estimate that this person is 14 years old—when did they join the platform? What fraction of your 14-year-olds have been on the platform since they were 10?” That is a vital statistic.

If the platforms were required to publish that every single quarter, we could say, “Wow! You were doing really badly four years ago, and you need to get a lot better.” Those kinds of lagging metrics are a way of allowing the public to grade Facebook’s homework, instead of just trusting Facebook to do a good job.

Facebook already does analyses like this today. They already know that on Facebook Blue, for example, for some age cohorts, 20% of 11-year-olds were on the platform—and back then, not that many kids were online. Today, I would guess a much larger fraction of 11-year-olds are on Instagram. We need to have transparency into how badly they are doing their jobs.
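The lagging metric described in this answer, the share of users the platform itself estimates to be 14 who joined before they were 13, is straightforward to compute once estimated ages and join dates exist. The sketch below is an illustration with invented sample data; the field names and the reporting date are assumptions, not anything a platform actually publishes.

```python
# Illustrative computation of a lagging transparency metric: of users the
# platform estimates to be 14, what share joined before they were 13?
# Sample data and field names are invented for the example.
from dataclasses import dataclass
from datetime import date


@dataclass
class User:
    estimated_age: int    # the platform's own estimate, not the declared age
    joined: date


def share_joined_underage(users: list[User], cohort_age: int = 14,
                          min_age: int = 13,
                          today: date = date(2022, 6, 1)) -> float:
    cohort = [u for u in users if u.estimated_age == cohort_age]
    if not cohort:
        return 0.0
    # Approximate age at joining from today's estimated age and the join date.
    underage = [u for u in cohort
                if cohort_age - (today - u.joined).days // 365 < min_age]
    return len(underage) / len(cohort)


sample = [User(14, date(2018, 3, 1)),   # joined at roughly 10
          User(14, date(2021, 9, 1)),   # joined at roughly 13
          User(17, date(2015, 1, 1))]   # outside the 14-year-old cohort
print(f"{share_joined_underage(sample):.0%} of estimated 14-year-olds joined under 13")
```

Published quarterly, figures of this shape would let outsiders track whether under-13 sign-ups are actually falling, which is the point made above about allowing the public to grade Facebook’s homework.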

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Frances, do you think that the Bill needs to set statutory minimum standards for things such as risk assessments and codes of practice? What will a company such as Facebook do without a minimum standard to go by?

Frances Haugen: It is vital to get into the statute minimum standards for things such as risk assessments and codes of conduct. Facebook has demonstrated time and again—the reality is that other social media platforms have too—that it does the bare minimum to avoid really egregious reputational damage. It does not ensure the level of quality needed for public safety. If you do not put that into the Bill, I worry that it will be watered down by the mountains of lobbyists that Facebook will throw at this problem.

Online Safety Bill (Fifth sitting)

Maria Miller Excerpts
Committee stage
Tuesday 7th June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 7 June 2022 - (7 Jun 2022)
Dan Carden Portrait Dan Carden (Liverpool, Walton) (Lab)
- Hansard - - - Excerpts

It is a pleasure to serve on the Committee. I want to apologise for missing the evidence sessions. Unfortunately, I came down with covid, but I have been following the progress of the Committee.

This is important legislation. We spend so much of our lives online these days, yet there has never been an attempt to regulate the space, or for democratically elected Members to contribute towards its regulation. Clause 1 gives a general outline of what to expect in the Bill. I have no doubt that this legislation is required, but also that it will not get everything right, and that it will have to change over the years. We may see many more Bills of this nature in this place.

I have concerns that some clauses have been dropped, and I hope that there will be future opportunities to amend the Bill, not least with regard to how we educate and ensure that social media companies promote media literacy, so that information that is spread widely online is understood in its context—that it is not always correct or truthful. The Bill, I hope, will go some way towards ensuring that we can rely more on the internet, which should provide a safer space for all its users.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - -

May I join others in welcoming line-by-line scrutiny of the Bill? I am sure that the Minister will urge us to ensure that we do not make the perfect the enemy of the good. This is a very lengthy and complex Bill, and a great deal of time and scrutiny has already gone into it. I am sure that we will all pay due regard to that excellent work.

The hon. Member for Pontypridd is absolutely right to say that in many ways the world is watching what the Government are doing regarding online regulation. This will set a framework for many countries around the world, and we must get it right. We are ending the myth that social media and search engines are not responsible for their content. Their use of algorithms alone demonstrates that, while they may not publish all of the information on their sites, they are the editors at the very least and must take responsibility.

We will no doubt hear many arguments about the importance of free speech during these debates and others. I would like gently to remind people that there are many who feel that their free speech is currently undermined by the way in which the online world operates. Women are subject to harassment and worse online, and children are accessing inappropriate material. There are a number of areas that require specific further debate, particularly around the safeguarding of children, adequate support for victims, ensuring that the criminal law is future-proof within this framework, and ensuring that we pick up on the comments made in the evidence sessions regarding the importance of guidance and codes of practice. It was slightly shocking to hear from some of those giving evidence that the operators did not know what was harmful, as much has been written about the harm caused by the internet.

I will listen keenly to the Minister’s responses on guidance and codes of practice, and secondary legislation more generally, because it is critical to how the Bill works. I am sure we will have many hours of interesting and informed debate on this piece of legislation. While there has already been a great deal of scrutiny, the Committee’s role is pivotal to ensure that the Bill is as good as it can be.

Question put and agreed to.

Clause 1 accordingly ordered to stand part of the Bill.

Clause 2

Key Definitions

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Clause 3 stand part.

That schedules 1 and 2 be the First and Second schedules to the Bill.

Clause 4 stand part.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Thank you, Sir Roger.

I do not want to get sidetracked, but I agree that there is a major parental knowledge gap. Tomorrow’s parents will have grown up on the internet, so in 20 years’ time we will not have that knowledge gap, but today media literacy is lacking, particularly among parents as well as among children. In Scotland, media literacy is embedded in the curriculum; I am not entirely sure what the system is in the rest of the UK. My children are learning media literacy in school, but there is still a media literacy gap for parents. My local authority is doing a media literacy training session for parents tomorrow night, which I am very much looking forward to attending so that I can find out even more about how to keep my children safe online.

I was asking the Minister about the App Store and the Google Play Store. I do not need an answer today, but one at some point would be really helpful. Do the App Store, the Google Play Store and other stores of that nature fall under the definition of search engines or of user-to-user content? The reality is that if somebody creates an app, presumably they are a user. Yes, it has to go through an approval process by Apple or Google, but once it is accepted by them, it is not owned by them; it is still owned by the person who generated it. Therefore, are those stores considered search engines, in that they are simply curating content, albeit moderated content, or are they considered user-to-user services?

That is really important, particularly when we are talking about age verification and children being able to access various apps. The stores are the key gateways where children get apps. Once they have an app, they can use all the online services that are available on it, in line with whatever parental controls parents choose to put in place. I would appreciate an answer from the Minister, but he does not need to provide it today. I am happy to receive it at a later time, if that is helpful.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

I want to pick up on two issues, which I hope the Minister can clarify in his comments at the end of this section.

First, when we took evidence, the Internet Watch Foundation underlined the importance of end-to-end encryption being in scope of the Bill, so that it does not lose the ability to pick up child abuse images, as has already been referred to in the debate. The ability to scan in end-to-end encrypted environments is crucial. Will the Minister clarify whether that is in scope and whether the IWF will be able to continue its important work in safeguarding children?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

A number of people have raised concerns about freedom of speech in relation to end-to-end encryption. Does the right hon. Lady agree with me that there should not be freedom of speech when it comes to child sexual abuse images, and that it is reasonable for those systems to check for child sexual abuse images?

Maria Miller Portrait Dame Maria Miller
- Hansard - -

The hon. Lady is right to pick up on the nuance and the balance that we have to strike in legislation between freedom of speech and the protection of vulnerable individuals and children. I do not think there can be many people, particularly among those here today, who would want anything to trump the safeguarding of children. Will the Minister clarify exactly how the Bill works in relation to such important work?

Secondly, it is important that the Government have made the changes to schedule 2. They have listened closely on the issue of pornography and extended the provisions of the Bill to cover commercial pornography. However, the hon. Member for Pontypridd mentioned nudification software, and I am unclear whether the Bill would outlaw such software, which is designed to sexually harass women. That software takes photographs only of women, because its database relates only to female figures, and makes them appear to be completely naked. Does that software fall in scope of the Bill? If not, will the Minister do something about that? The software is available and we have to regulate it to ensure that we safeguard women’s rights to live without harassment in their day-to-day life.

Dan Carden Portrait Dan Carden
- Hansard - - - Excerpts

This part of the Bill deals with the definitions of services and which services would be exempt. I consider myself a millennial; most people my age or older are Facebook and Twitter users, and people a couple of years younger might use TikTok and other services. The way in which the online space is used by different generations, particularly by young people, changes rapidly. Given the definitions in the Bill, how does the Minister intend to keep pace with the changing ways in which people communicate? Most online games now allow interaction between users in different places, which was not the case a few years ago. Understanding how the Government intend the Bill to keep up with such changes is important. Will the Minister tell us about that?

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I want to make a couple of brief comments. Unfortunately, my hon. Friend the Member for Ochil and South Perthshire is not here as, ironically, he is at the DCMS committee taking evidence on the Online Safety Bill. That is a pretty unfortunate clash of timing, but that is why I am here solo for the morning.

I wanted to make a quick comment on subsection 7. The Minister will have heard the evidence given on schedule 7 and the fact that the other schedules, particularly schedule 6, have a Scotland-specific section detailing the Scottish legislation that applies. Schedule 7 has no Scotland-specific section and does not adequately cover the Scottish legislation. I appreciate that the Minister has tabled amendment 126, which talks about the Scottish and Northern Irish legislation that may be different from England and Wales legislation, but will he give me some comfort that he does intend Scottish-specific offences to be added to schedule 7 through secondary legislation? There is a difference between an amendment on how to add them and a commitment that they will be added if necessary and if he feels that that will add something to the Bill. If he could commit that that will happen, I would appreciate that—obviously, in discussion with Scottish Ministers if amendment 126 is agreed. It would give me a measure of comfort and would assist, given the oral evidence we heard, in overcoming some of the concerns raised about schedule 7 and the lack of inclusion of Scottish offences.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

In many ways, clause 6 is the central meat of the Bill. It brings into play a duty of care, which means that people operating online will be subject to the same rules as the rest of us when it comes to the provision of services. But when it comes to the detail, the guidance and codes that will be issued by Ofcom will play a central role. My question for the Minister is: in the light of the evidence that we received, I think in panel three, where the providers were unable to define what was harmful because they had not yet seen codes of practice from Ofcom, could he update us on when those codes and guidance might be available? I understand thoroughly why they may not be available at this point, and they certainly should not form part of the Bill because they need to be flexible enough to be changed in future, but it is important that we know how the guidance and codes work and that they work properly.

Will the Minister update the Committee on what further consideration he and other Ministers have given to the establishment of a standing committee to scrutinise the implementation of the Bill? Unless we have that in place, it will be difficult to know whether his legislation will work.

Dan Carden Portrait Dan Carden
- Hansard - - - Excerpts

Some of the evidence we heard suggested that the current precedent was that the Secretary of State had very little to do with independent regulators in this realm, but that the Bill overturns that precedent. Does the right hon. Lady have any concerns that the Bill hands too much power to the Secretary of State to intervene and influence regulators that should be independent?

Maria Miller Portrait Dame Maria Miller
- Hansard - -

The hon. Gentleman brings up an important point. We did hear about that in the evidence. I have no doubt the Secretary of State will not want to interfere in the workings of Ofcom. Having been in his position, I know there would be no desire for the Department to get involved in that, but I can understand why the Government might want the power to ensure things are working as they should. Perhaps the answer to the hon. Gentleman’s question is to have a standing committee scrutinising the effectiveness of the legislation and the way in which it is put into practice. That committee could be a further safeguard against what he implies: an unnecessary overreach of the Secretary of State’s powers.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Thank you, Sir Roger, for allowing me to intervene again. I was not expecting the standing committee issue to be brought up at this point, but I agree that there needs to be a post-implementation review of the Bill. I asked a series of written questions to Departments about post-legislative review and whether legislation that the Government have passed has had the intended effect. Most of the Departments that answered could not provide information on the number of post-legislative reviews. Of those that could provide me with the information, none of them had managed to do 100% of the post-implementation reviews that they were supposed to do.

It is important that we know how the Bill’s impact will be scrutinised. I do not think it is sufficient for the Government to say, “We will scrutinise it through the normal processes that we normally use,” because it is clear that those normal processes do not work. The Government cannot say that legislation they have passed has achieved the intended effect. Some of it will have and some of it will not have, but we do not know because we do not have enough information. We need a standing committee or another way to scrutinise the implementation.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

I thank the hon. Lady for raising this point. Having also chaired a Select Committee, I can understand the sensitivities that this might fall under the current DCMS Committee, but the reality is that the Bill’s complexity and other pressures on the DCMS Committee mean that this perhaps should be seen as an exceptional circumstance—in no way is that meant as disrespect to that Select Committee, which is extremely effective in what it does.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I completely agree. Having sat on several Select Committees, I am aware of the tight timescales. There are not enough hours in the day for Select Committees to do everything that they would like to do. It would be unfortunate and undesirable were this matter to be one that fell between the cracks. Perhaps DCMS will bring forward more legislation in future that could fall between the cracks. If the Minister is willing to commit to a standing committee or anything in excess of the normal governmental procedures for review, that would be a step forward from the position that we are currently in. I look forward to hearing the Minister’s views on that.

Online Safety Bill (Sixth sitting)

Maria Miller Excerpts
Committee stage
Tuesday 7th June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 7 June 2022 - (7 Jun 2022)
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Very well; we will debate clause 9 separately. In that case, I will move on to amendments 19 and 20, which seek to address cross-platform risk. Again, we completely agree with the Opposition that cross-platform risk is a critical issue. We heard about it in evidence. It definitely needs to be addressed and covered by the Bill. We believe that it is covered by the Bill, and our legal advice is that it is covered by the Bill, because in clause 8 as drafted—[Interruption.] Bless you—or rather, I bless the shadow Minister, following Sir Roger’s guidance earlier, lest I inadvertently bless the wrong person.

Clause 8 already includes the phrase to which I alluded previously. I am talking about the requirement that platforms risk-assess illegal content that might be encountered

“by means of the service”.

That is a critical phrase, because it means not just on that service itself; it also means, potentially, via that service if, for example, that service directs users onward to illegal content on another site. By virtue of the words,

“by means of the service”,

appearing in clause 8 as drafted, the cross-platform risk that the Opposition and witnesses have rightly referred to is covered. Of course, Ofcom will set out further steps in the code of practice as well.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - -

I was listening very closely to what the Minister was saying and I was hoping that he might be able to comment on some of the evidence that was given, particularly by Professor Lorna Woods, who talked about the importance of risk assessments being about systems, not content. Would the Minister pick up on that point? He was touching on it in his comments, and I was not sure whether this was the appropriate point in the Bill at which to bring it up.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank my right hon. Friend for raising that. The risk assessments and, indeed, the duties arising under this Bill all apply to systems and processes—setting up systems and processes that are designed to protect people and to prevent harmful and illegal content from being encountered. We cannot specify in legislation every type of harmful content that might be encountered. This is about systems and processes. We heard the Chairman of the Joint Committee on the draft Online Safety Bill, our hon. Friend the Member for Folkestone and Hythe (Damian Collins), confirm to the House on Second Reading his belief—his accurate belief—that the Bill takes a systems-and-processes approach. We heard some witnesses saying that as well. The whole point of this Bill is that it is tech-agnostic—to future-proof it, as hon. Members mentioned this morning—and it is based on systems and processes. That is the core architecture of the legislation that we are debating.

Amendments 25 and 26 seek to ensure that user-to-user services assess and mitigate the risk of illegal content being produced via functions of the service. That is covered, as it should be—the Opposition are quite right to raise the point—by the illegal content risk assessment and safety duties in clauses 8 and 9. Specifically, clause 8(5)(d), on page 7 of the Bill—goodness, we are only on page 7 and we have been going for over half a day already—requires services to risk-assess functionalities of their service being used to facilitate the presence of illegal content. I stress the word “presence” in clause 8(5)(d). Where illegal content is produced by a functionality of the service—for example, by being livestreamed—that content will be present on the service and companies must mitigate that risk. The objective that the Opposition are seeking to achieve, and with which we completely agree, is covered in clause 8(5)(d) by the word “presence”. If the content is present, it is covered by that section.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

We discussed personal liability extensively this morning. As we discussed, there is personal liability in relation to providing information, with a criminal penalty of up to two years’ imprisonment, to avoid situations like the one we saw a year or two ago, where one of these companies failed to provide the Competition and Markets Authority with the information that it required.

The shadow Minister pointed out the very high levels of global turnover—$71.5 billion—that these companies have. That means that ultimately they can be fined up to $7 billion for each set of breaches. That is a vast amount of money, particularly if those breaches happen repeatedly. She said that such companies will just set up again if we deny their service. Clearly, small companies can close down and set up again the next day, but gigantic companies, such as Meta—Facebook—cannot do that. That is why I think the sanctions I have pointed to are where the teeth really lie.
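
On the arithmetic behind those figures: the Bill caps fines at 10% of qualifying worldwide revenue, so the $7 billion quoted here appears to correspond to roughly 10% of the $71.5 billion turnover mentioned, on the assumption that the turnover figure is taken as the revenue base:

$$10\% \times \$71.5\ \text{billion} \approx \$7.15\ \text{billion}$$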

I accept the point about governance being important as well; I am not dismissing that. That is why we have personal criminal liability for information provision, with up to two years in prison, and it is why governance is referenced in clause 10. I accept the spirit of the points that have been made, but I think the Bill delivers these objectives as drafted.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

Will my hon. Friend give way?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

One last time, because I am conscious that we need to make some progress this afternoon.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

I have huge sympathy with the point that the Minister is making on this issue, but the hon. Member for Pontypridd is right to drive the point home. The Minister says there will be huge fines, but I think there will also be huge court bills. There will be an awful lot of litigation about how things are interpreted, because so much money will come into play. I just reiterate the importance of the guidance and the codes of practice, because if we do not get those right then the whole framework will be incredibly fragile. We will need ongoing scrutiny of how the Bill works or there will be a very difficult situation.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

My right hon. Friend, as always, makes a very good point. The codes of practice will be important, particularly to enable Ofcom to levy fines where appropriate and then successfully defend them. This is an area that may get litigated. I hope that, should lawyers litigating these cases look at our transcripts in the future, they will see how strongly those on both sides of the House feel about this point. I know that Ofcom will ensure that the codes of practice are properly drafted. We touched this morning on the point about timing; we will follow up with Ofcom to make sure that the promise it made us during the evidence session about the road map is followed through and that those get published in good time.

On the point about the Joint Committee, I commend my right hon. Friend for her persistence—[Interruption.] Her tenacity—that is the right word. I commend her for her tenacity in raising that point. I mentioned it to the Secretary of State when I saw her at lunchtime, so the point that my right hon. Friend made this morning has been conveyed to the highest levels in the Department.

I must move on to the final two amendments, 11 and 13, which relate to transparency. Again, we had a debate about transparency earlier, when I made the point about the duties in clause 64, which I think cover the issue. Obviously, we are not debating clause 64 now but it is relevant because it requires Ofcom—it is not an option but an obligation; Ofcom must do so—to require providers to produce a transparency report every year. Ofcom can say what is supposed to be in the report, but the relevant schedule lists all the things that can be in it, and covers absolutely everything that the shadow Minister and the hon. Member for Worsley and Eccles South want to see in there.

That requirement to publish transparently and publicly is in the Bill, but it is to be found in clause 64. While I agree with the Opposition’s objectives on this point, I respectfully say that those objectives are delivered by the Bill as drafted, so I politely and gently request that the amendments be withdrawn.

--- Later in debate ---
Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

The purpose of this clause is to ensure that children at risk of online harms are given protections from harmful, age-inappropriate content through specific children’s safety duties for user-to-user services likely to be accessed by children.

It is welcome that the Bill contains strong provisions to ensure that service providers act upon and mitigate the risks identified in the required risk assessment, and to introduce protective systems and processes to address what children encounter. This amendment aims to ensure that online platforms are proactive in their attempts to mitigate the opportunity for sex offenders to abuse children.

As we have argued with other amendments, there are missed opportunities in the Bill to be preventive in tackling the harm that is created. The sad reality is that online platforms create an opportunity for offenders to identify, contact and abuse children, and to do so in real time through livestreaming. We know there has been a significant increase in online sexual exploitation during the pandemic. With sex offenders unable to travel or have physical contact with children, online abuse increased significantly.

In 2021, UK law enforcement received a record 97,727 industry reports relating to online child abuse, a 29% increase on the previous year, which is shocking. An NSPCC freedom of information request to police forces in England and Wales last year showed that online grooming offences reached record levels in 2020-21, with the number of sexual communication with a child offences in England and Wales increasing by almost 70% in three years. There has been a deeply troubling trend in internet-facilitated abuse towards more serious sexual offences against children, and the average age of children in child abuse images, particularly girls, is getting younger.

In-person contact abuse moved online because of the opportunity there for sex offenders to continue exploiting children. Sadly, they can do so with little fear of the consequences, because detection and disruption of livestreamed abuse is so low. The duty to protect children from sexual offenders abusing them in real time and livestreaming their exploitation cannot be limited to one part of the internet and tech sector. While much of the abuse might take place on the user-to-user services, it is vital that protections against such abuse are strengthened across the board, including in the search services, as set out in clause 26.

At the moment there is no list of harms in the Bill that must be prioritised by regulated companies. The NSPCC and others have suggested including a new schedule, similar to schedule 7, setting out what the primary priority harms should be. It would be beneficial for the purposes of parliamentary scrutiny for us to consider the types of priority harm that the Government intend the Bill to cover, rather than leaving that to secondary legislation. I hope the Minister will consider that and say why it has not yet been included.

To conclude, while we all hope the Bill will tackle the appalling abuse of children currently taking place online, this cannot be achieved without tackling the conditions in which these harms can take place. It is only by requiring that steps be taken across online platforms to limit the opportunities for sex offenders to abuse children that we can see the prevalence of this crime reduced.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

I rise, hopefully to speak to clause 11 more generally—or will that be a separate stand part debate, Ms Rees?

None Portrait The Chair
- Hansard -

That is a separate debate.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

My apologies. I will rise later.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The Government obviously support the objective of these amendments, which is to prevent children from suffering the appalling sexual and physical abuse that the hon. Member for Worsley and Eccles South outlined in her powerful speech. It is shocking that these incidents have risen in the way that she described.

To be clear, that sort of appalling sexual abuse is covered by clause 9—which we have debated already—on illegal content. As Members would expect, child sexual abuse is defined as one of the items of priority illegal content, which are listed in more detail in schedule 6, where the offences that relate to sexual abuse are enumerated. As child sexual exploitation is a priority offence, services are already obliged through clause 9 to be “proactive” in preventing it from happening. As such, as Members would expect, the requirements contained in these amendments are already delivered through clause 9.

The hon. Member for Worsley and Eccles South also asked when we are going to hear what the primary priority harms to children might be. To be clear, those will not include the sexual exploitation offences, because as Members would also expect, those are already in the Bill as primary illegal offences. The primary priority harms might include material promoting eating disorders and that kind of thing, which is not covered by the criminal matters—the illegal matters. I have heard the hon. Lady’s point that if that list were to be published, or at least a draft list, that would assist Parliament in scrutinising the Bill. I will take that point away and see whether there is anything we can do in that area. I am not making a commitment; I am just registering that I have heard the point and will take it away.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss clause 26 stand part.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

I rise to speak to clause 11, because this is an important part of the Bill that deals with the safety duties protecting children. Many of us here today are spurred on by our horror at the way in which internet providers, platform providers and search engines have acted over recent years, developing their products with no regard for the safety of children, so I applaud the Government for bringing forward this groundbreaking legislation. They are literally writing the book on this, but in doing so, we have to be very careful about the language we use and the way in which we frame our requirements of these organisations. The Minister has rightly characterised these organisations as being entirely driven by finance, not the welfare of their consumers, which must make them quite unique in the world. I can only hope that that will change: presumably, over time, people will not want to use products that have no regard for the safety of those who use them.

In this particular part of the Bill, the thorny issue of age assurance comes up. I would value the Minister’s views on some of the evidence that we received during our evidence sessions about how we ensure that age assurance is effective. Some of us who have been in this place for a while would be forgiven for thinking that we had already passed a law on age assurance. Unfortunately, that law did not seem to come to anything, so let us hope that second time is lucky. The key question is: who is going to make sure that the age assurance that is in place is good enough? Clause 11(3) sets out

“a duty to operate a service using proportionate systems and processes”

that are designed to protect children, but what is a proportionate system? Who is going to judge that? Presumably it will be Ofcom in the short term, and in the long term, I am sure the courts will get involved.

In our evidence, we heard some people advocating very strongly for these sorts of systems to be provided by third parties. I have to say, in a context where we are hearing how irresponsible the providers of these services are, I can understand why people would think that a third party would be a more responsible way forward. Can the Minister help the Committee understand how Ofcom will ensure that the systems used, particularly the age assurance systems, are proportionate—I do not particularly like that word; I would like those systems to be brilliant, not proportionate—and are actually doing what we need them to do, which is safeguard children? For the record, and for the edification of judges who are looking at this matter in future—and, indeed, Ofcom—will he set out how important this measure is within the Bill?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank my right hon. Friend for her remarks, in which she powerfully and eloquently set out how important the clause is to protecting children. She is right to point out that this is a critical area in the Bill, and it has wide support across the House. I am happy to emphasise, for the benefit of those who may study our proceedings in future, that protecting children is probably the single most important thing that the Bill does, which is why it is vital that age-gating, where necessary, is effective.

My right hon. Friend asked how Ofcom will judge whether the systems under clause 11(3) are proportionate to

“prevent children of any age from encountering”

harmful content and so on. Ultimately, the proof of the pudding is in the eating; it has to be effective. When Ofcom decides whether a particular company or service is meeting the duty set out in the clause, the simple test will be one of effectiveness: is it effective and does it work? That is the approach that I would expect Ofcom to take; that is the approach that I would expect a court to take. We have specified that age verification, which is the most hard-edged type of age assurance—people have to provide a passport or something of that nature—is one example of how the duty can be met. If another, less-intrusive means is used, it will still have to be assessed as effective by Ofcom and, if challenged, by the courts.

I think my right hon. Friend was asking the Committee to confirm to people looking at our proceedings our clear intent for the measures to be effective. That is the standard to which we expect Ofcom and the courts to hold those platforms in deciding whether they have met the duties set out in the clause.

--- Later in debate ---
Maria Miller Portrait Dame Maria Miller
- Hansard - -

For clarification, does the Minister anticipate that Ofcom might be able to insist that a third-party provider be involved if there is significant evidence that the measures put in place by a platform are ineffective?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

We have deliberately avoided being too prescriptive about precisely how the duty is met. We have pointed to age verification as an example of how the duty can be met without saying that that is the only way. We would not want to bind Ofcom’s hands, or indeed the hands of platforms. Clearly, using a third party is another way of delivering the outcome. If a platform were unable to demonstrate to Ofcom that it could deliver the required outcome using its own methods, Ofcom may well tell it to use a third party instead. The critical point is that the outcome must be delivered. That is the message that the social media firms, Ofcom and the courts need to hear when they look at our proceedings. That is set out clearly in the clause. Parliament is imposing a duty, and we expect all those to whom the legislation applies to comply with it.

Question put and agreed to.

Clause 11 accordingly ordered to stand part of the Bill.

Clause 12

Adults’ risk assessment duties

Online Safety Bill (Seventh sitting)

Maria Miller Excerpts
Committee stage
Thursday 9th June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 9 June 2022 - (9 Jun 2022)
Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I will speak to new clause 1. Although duties about complaints procedures are welcome, it has been pointed out that service providers’ user complaints processes are often obscure and difficult to navigate—that is the world we are in at the moment. The lack of any external complaints option for individuals who seek redress is worrying.

The Minister has just talked about the super-complaints mechanism—which we will come to later in proceedings—to allow eligible entities to make complaints to Ofcom about a single regulated service if that complaint is of particular importance or affects a particularly large number of service users or members of the public. Those conditions are constraints on the super-complaints process, however.

An individual who felt that they had been failed by a service’s complaints system would have no source of redress. Without redress for individual complaints once internal mechanisms have been exhausted, victims of online abuse could be left with no further options, consumer protections could be compromised, and freedom of expression could be impinged upon for people who felt that their content had been unfairly removed.

Various solutions have been proposed. The Joint Committee recommended the introduction of an online safety ombudsman to consider complaints for which recourse to internal routes of redress had not resulted in resolution and the failure to address risk had led to significant and demonstrable harm. Such a mechanism would give people an additional body through which to appeal decisions after they had come to the end of a service provider’s internal process. Of course, we as hon. Members are all familiar with the ombudsman services that we already have.

Concerns have been raised about the level of complaints such an ombudsman could receive. However, as the Joint Committee noted, complaints would be received only once the service’s internal complaints procedure had been exhausted, as is the case for complaints to Ofcom about the BBC. The new clause seeks to ensure that we find the best possible solution to the problem. There needs to be a last resort for users who have suffered serious harm on services. It is only through the introduction of an external redress mechanism that service providers can truly be held to account for their decisions as they impact on individuals.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - -

I rise to contribute to the stand part debate on clauses 18 and 28. It was interesting, though, to hear the debate on clause 17, because it is right to ask how the complaints services will be judged. Will they work in practice? When we start to look at how to ensure that the legislation works in all eventualities, we need to ensure that we have some backstops for when the system does not work as it should.

It is welcome that there will be clear duties on providers to have operational complaints procedures—complaints procedures that work in practice. As we all know, many of them do not at the moment. As a result, we have a loss of faith in the system, and that is not going to be changed overnight by a piece of legislation. For years, people have been reporting things—in some cases, very serious criminal activity—that have not been acted on. Consumers—people who use these platforms—are not going to change their minds overnight and suddenly start trusting these organisations to take their complaints seriously. With that in mind, I hope that the Minister listened to the points I made on Second Reading about how to give extra support to victims of crime or people who have experienced things that should not have happened online, and will look at putting in place the right level of support.

The hon. Member for Worsley and Eccles South talked about the idea of an ombudsman; it may well be that one should be in place to deal with situations where complaints are not dealt with through the normal processes. I am also quite taken by some of the evidence we received about third-party complaints processes by other organisations. We heard a bit about the revenge porn helpline, which was set up a few years ago when we first recognised in law that revenge pornography was a crime. The Bill creates many new offences, so many more people will be recognised as victims of crime, but we are not yet hearing clearly how the support systems will adequately help that massively increased number of victims to get the help they need.

I will probably talk in more detail about this issue when we reach clause 70, which provides an opportunity to look at the—unfortunately—probably vast fines that Ofcom will be imposing on organisations and how we might earmark some of that money specifically for victim support, whether by funding an ombudsman or helping amazing organisations such as the revenge porn helpline to expand their services.

We must address this issue now, in this Bill. If we do not, all those fines will go immediately into the coffers of the Treasury without passing “Go”, and we will not be able to take some of that money to help those victims directly. I am sure the Government absolutely intend to use some of the money to help victims, but that decision would be at the mercy of the Treasury. Perhaps we do not want that; perhaps we want to make it cleaner and easier and have the money put straight into a fund that can be used directly for people who have been victims of crime or injustice or things that fall foul of the Bill.

I hope that the Minister will listen to that and use this opportunity, as we do in other areas, to directly passport fines for specific victim support. He will know that there are other examples of that that he can look at.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

As the right hon. Member for Basingstoke has mentioned the revenge porn helpline, I will mention the NSPCC’s Report Remove tool for children. It does exactly the same thing, but for younger people—the revenge porn helpline is specifically only for adults. Both those tools together cover the whole gamut, which is massively helpful.

The right hon. Lady’s suggestion about the hypothecation of fines is a very good one. I was speaking to the NSPCC yesterday, and one of the issues that we were discussing was super-complaints. Although super-complaints are great and I am very glad that they are included in the Bill, the reality is that some of the third-sector organisations that are likely to be undertaking super-complaints are charitable organisations that are not particularly well funded. Given how few people work for some of those organisations and the amazing amount of work they do, if some of the money from fines could support not just victims but the initial procedure for those organisations to make super-complaints, it would be very helpful. That is, of course, if the Minister does not agree with the suggestion of creating a user advocacy panel, which would fulfil some of that role and make that support for the charitable organisations less necessary—although I am never going to argue against support for charities: if the Minister wants to hypothecate it in that way, that would be fantastic.

I tabled amendments 78 and 79, but the statement the Minister made about the definition of users gives me a significant level of comfort about the way that people will be able to access a complaints procedure. I am terribly disappointed that the Minister is not a regular Reddit user. I am not, either, but I am well aware of what Reddit entails. I have no desire to sign up to Reddit, but knowing that even browsing the site I would be considered a user and therefore able to report any illegal content I saw, is massively helpful. On that basis, I am comfortable not moving amendments 78 and 79.

On the suggestion of an ombudsman—I am looking at new clause 1—it feels like there is a significant gap here. There are ombudsman services in place for many other areas, where people can put in a complaint and then go to an ombudsman should they feel that it has not been appropriately addressed. As a parliamentarian, I find that a significant number of my constituents come to me seeking support to go to the ombudsman for whatever area it is in which they feel their complaint has not been appropriately dealt with. We see a significant number of issues caused by social media companies, in particular, not taking complaints seriously, not dealing with complaints and, in some cases, leaving illegal content up. Particularly in the initial stages of implementation—in the first few years, before companies catch up and are able to follow the rules put in place by the Bill and Ofcom—a second-tier complaints system that is removed from the social media companies would make things so much better than they are now. It would provide an additional layer of support to people who are looking to make complaints.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

I am sure the hon. Lady will agree with me that it is not either/or—it is probably both. Ultimately, she is right that an ombudsman would be there to help deal with what I think will be a lag in implementation, but if someone is a victim of online intimate image abuse, in particular, they want the material taken down immediately, so we need to have organisations such as those that we have both mentioned there to help on the spot. It has to be both, has it not?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I completely agree. Both those helplines do very good work, and they are absolutely necessary. I would strongly support their continuation in addition to an ombudsman-type service. Although I am saying that the need for an ombudsman would likely be higher in the initial bedding-in years, it will not go away—we will still need one. With NHS complaints, the system has been in place for a long time, and it works pretty well in the majority of cases, but there are still cases it gets wrong. Even if the social media companies behave in a good way and have proper complaints procedures, there will still be instances of them getting it wrong. There will still be a need for a higher level. I therefore urge the Minister to consider including new clause 1 in the Bill.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

As I said explicitly a few moments ago, the hon. Lady is right to point out the fact that the super-complaints process is to address systemic issues. She is right to say that, and I think I made it clear a moment or two ago.

Whether there should be an external ombudsman to enforce individual complaints, rather than just Ofcom enforcing against systemic complaints, is a question worth addressing. In some parts of our economy, we have ombudsmen who deal with individual complaints, financial services being an obvious example. The Committee has asked the question, why no ombudsman here? The answer, in essence, is a matter of scale and of how we can best fix the issue. The volume of individual complaints generated about social media platforms is just vast. Facebook in the UK alone has tens of millions of users—I might get this number wrong, but I think it is 30 million or 40 million users.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

Will the Minister give way?

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I can see that there is substantial demand to comment, so I shall start by giving way to my right hon. Friend the Member for Basingstoke.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

The Minister is doing an excellent job explaining the complex nature of the Bill. Ultimately, however, as he and I know, it is not a good argument to say that this is such an enormous problem that we cannot have a process in place to deal with it. If my hon. Friend looks back at his comments, he will see that that is exactly the point he was making. Although it is possibly not necessary with this clause, I think he needs to give some assurances that later in the Bill he will look at hypothecating some of the money to be generated from fines to address the issues of individual constituents, who on a daily basis are suffering at the hands of the social media companies. I apologise for the length of my intervention.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

It is categorically not the Government’s position that this problem is too big to fix. In fact, the whole purpose of this piece of groundbreaking and world-leading legislation is to fix a problem of such magnitude. The point my right hon. Friend was making about the hypothecation of fines to support user advocacy is a somewhat different one, which we will come to in due course, but there is nothing in the Bill to prevent individual groups from assisting individuals with making specific complaints to individual companies, as they are now entitled to do in law under clauses 17 and 18.

The point about an ombudsman is a slightly different one—if an individual complaint is made to a company and the individual complainant is dissatisfied with the outcome of their individual, particular and personal complaint, what should happen? In the case of financial services, if, for example, someone has been mis-sold a mortgage and they have suffered a huge loss, they can go to an ombudsman who will bindingly adjudicate that individual, single, personal case. The point that I am making is that having hundreds of thousands or potentially millions of cases being bindingly adjudicated on a case-by-case basis is not the right way to tackle a problem of this scale. The right way to tackle the problem is to force the social media companies, by law, to systemically deal with all of the problem, not just individual problems that may end up on an ombudsman’s desk.

That is the power in the Bill. It deals at a systems and processes level, it deals on an industry-wide level, and it gives Ofcom incredibly strong enforcement powers to make sure this actually happens. The hon. Member for Pontypridd has repeatedly called for a systems and processes approach. This is the embodiment of such an approach and the only way to fix a problem of such magnitude.

Online Safety Bill (Eighth sitting)

Maria Miller Excerpts
Committee stage
Thursday 9th June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 9 June 2022 - (9 Jun 2022)
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

On clause 37, it is welcome that Ofcom will have to prepare and issue a code of practice for service providers with duties relating to illegal content in the form of terrorism or child sexual exploitation and abuse content. The introduction of compliance measures relating to fraudulent advertising is also very welcome. We do, however, have some important areas to amend, including the role of different expert groups in assisting Ofcom during its consultation process, which I have already outlined in relation to animal cruelty.

On clause 38, Labour supports the notion that Ofcom must have specific principles to adhere to when preparing the codes of practice, and of course, the Secretary of State must have oversight of those. However, as I will touch on as we proceed, Labour feels that far too much power is given to the Secretary of State of the day in establishing those codes.

Labour believes that schedule 4 is overwhelmingly loose in its language, and we have concerns about the ability of Ofcom—try as it might—to ensure that its codes of practice are both meaningful to service providers and in compliance with the Bill’s legislative requirements. Let me highlight the schedule’s broadness by quoting from it. Paragraph 4 states:

“The online safety objectives for regulated user-to-user services are as follows”.

I will move straight to paragraph 4(a)(iv), which says

“there are adequate systems and processes to support United Kingdom users”.

Forgive me if I am missing something here, but surely an assessment of adequacy is too subjective for these important codes of practice. Moreover, the Bill seems to have failed to consider the wide-ranging differences that exist among so-called United Kingdom users. Once again, there is no reference to future-proofing against emerging technologies. I hope that the Minister will therefore elaborate on how he sees the codes of practice and their principles, objectives and content as fit for purpose. More broadly, it is remarkable that schedule 4 is both too broad in its definitions and too limiting in some areas—we might call it a Goldilocks schedule.

I turn to new clause 20. As we have discussed, a significant majority of online child abuse takes place in private messages. Research from the NSPCC shows that 12 million of the 18.4 million child sexual abuse reports made by Facebook in 2019 related to content shared on private channels. Recent data from the Office for National Statistics shows that private messaging plays a central role in contact between children and people whom they have not met offline before. When children are contacted by someone they do not know, in nearly three quarters of cases that takes place by private message.

Schedule 4 introduces new restrictions on Ofcom’s ability to require a company to use proactive technology to identify or disrupt abuse in private messaging. That will likely restrict Ofcom’s ability to include in codes of practice widely used industry-standard tools such as PhotoDNA and CSAI Match, which detect known child abuse images, and artificial intelligence classifiers to detect self-generated images and grooming behaviour. That raises significant questions about whether the regulator can realistically produce codes of practice that respond to the nature and extent of the child abuse threat.

As it stands, the Bill will leave Ofcom unable to require companies to proactively use technology that can detect child abuse. Instead, Ofcom will be wholly reliant on the use of CSEA warning notices under clause 103, which will enable it to require the use of proactive technologies only where there is evidence that child abuse is already prevalent—in other words, where significant online harm has already occurred. That will necessitate the use of a laborious and resource-intensive process, with Ofcom having to build the evidence to issue CSEA warning notices company by company.

Those restrictions will mean that the Bill will be far less demanding than comparable international legislation in respect of the requirement on companies to proactively detect and remove online child abuse. So much for the Bill being world leading. For example, the EU child abuse legislative proposal published in May sets out clear and unambiguous requirements on companies to proactively scan for child abuse images and grooming behaviour on private messages.

If the regulator is unable to tackle online grooming sufficiently proactively, the impact will be disproportionately felt by girls. NSPCC data shows that an overwhelming majority of criminal offences target girls, with those aged 12 to 15 the most likely to be victims of online grooming. Girls were victims in 83% of offences where data was recorded. Labour recognises that once again there is a tension between our fundamental right to privacy and the Bill’s intention of keeping children safe. This probing new clause is designed to give the Government an opportunity to report on the effectiveness of their proposed approach.

Ultimately, the levels of grooming taking place on private messaging platforms are incredibly serious. I have two important testimonies that are worth placing on the record, both of which have been made anonymous to protect the victims but share the same sentiment. The first is from a girl aged 15. She said:

“I’m in a serious situation that I want to get out of. I’ve been chatting with this guy online who’s like twice my age. This all started on Instagram but lately all our chats have been on WhatsApp. He seemed really nice to begin with, but then he started making me do these things to prove my trust to him, like doing video chats with my chest exposed.”

The second is from a boy aged 17. He said:

“I’ve got a fitness page on Instagram to document my progress but I get a lot of direct messages from weird people. One guy said he’d pay me a lot of money to do a private show for him. He now messages me almost every day asking for more explicit videos and I’m scared that if I don’t do what he says, then he will leak the footage and my life would be ruined”.

Those testimonies go to show how fundamentally important it is for an early assessment to be made of the effectiveness of the Government’s approach following the Bill gaining Royal Assent.

We all have concerns about the use of proactive technology in private messaging and its potential impact on personal privacy. End-to-end encryption offers both risks and benefits to the online environment, but the main concern is based on risk profiles. End-to-end encryption is particularly problematic on social networks because it is embedded in the broader functionality of the service, so all text, DMs, images and live chats could be encrypted. Consequently, its impact on detecting child abuse becomes even greater. There is an even greater risk with Meta threatening to bring in end-to-end encryption for all its services. If platforms cannot demonstrate that they can mitigate those risks to ensure a satisfactory risk profile, they should not be able to proceed with end-to-end encryption until satisfactory measures and mitigations are in place.

Tech companies have made significant efforts to frame this issue in the false binary that any legislation that impacts private messaging will damage end-to-end encryption and will mean that encryption will not work or is broken. That argument is completely false. A variety of novel technologies are emerging that could allow for continued CSAM scanning in encrypted environments while retaining the privacy benefits afforded by end-to-end encryption.

Apple, for example, has developed its NeuralHash technology, which allows for on-device scans for CSAM before a message is sent and encrypted. That client-side implementation—rather than server-side scanning—means that Apple does not learn anything about images that do not match the known CSAM database. Apple’s servers flag accounts that exceed a threshold number of images that match a known database of CSAM image hashes, so that Apple can provide relevant information to the National Centre for Missing and Exploited Children. That process is secure and expressly designed to preserve user privacy.
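
The kind of client-side, threshold-based matching described here can be sketched in outline. The example below is purely illustrative: it uses an ordinary SHA-256 digest as a stand-in for a perceptual hash such as NeuralHash or PhotoDNA, the hash database entry and threshold are invented, and the cryptographic protocols that real deployments use to keep non-matching images private are omitted.

```python
# Illustrative sketch only: client-side, threshold-based hash matching.
# Real systems such as NeuralHash or PhotoDNA use perceptual hashes that are
# robust to resizing and re-encoding, and wrap the comparison in cryptographic
# protocols; SHA-256, the example database and the threshold here are stand-ins.

import hashlib

# Hypothetical database of hashes of known abuse images, supplied by a child
# protection body; the value below is a placeholder, not a real entry.
KNOWN_IMAGE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

FLAG_THRESHOLD = 3  # an account is flagged only once this many matches occur


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; here simply SHA-256 of the raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()


def scan_outgoing_images(images: list[bytes]) -> bool:
    """Scan images on the device before they are encrypted and sent.

    Returns True if the number of matches against the known-image database
    reaches the flagging threshold, otherwise False.
    """
    matches = sum(1 for img in images if image_hash(img) in KNOWN_IMAGE_HASHES)
    return matches >= FLAG_THRESHOLD
```

The design point being made in the debate is that only matches against a fixed database of known material count towards the threshold, and nothing is reported until that threshold is reached.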

Homomorphic encryption technology can perform image hashing on encrypted data without the need to decrypt the data. No identifying information can be extracted and no details about the encrypted image are revealed, but calculations can be performed on the encrypted data. Experts in hash scanning—including Professor Hany Farid of the University of California, Berkeley, who developed PhotoDNA—insist that scanning in end-to-end encrypted environments without damaging privacy will be possible if companies commit to providing the engineering resources to work on it.
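
As a rough illustration of the property being described—that calculations can be performed on encrypted values without decrypting them—the toy example below uses the Paillier scheme, which is additively homomorphic: multiplying two ciphertexts produces an encryption of the sum of the plaintexts. The key sizes are deliberately tiny and insecure, and this is not how a production scanning system would be built; it only demonstrates the underlying mathematical idea.

```python
# Toy Paillier encryption, illustrating homomorphic computation: multiplying two
# ciphertexts yields an encryption of the sum of the plaintexts, so values can
# be combined without decrypting them. The primes are tiny and insecure; this
# demonstrates the mathematical property only.

import math
import random

p, q = 293, 433          # insecure toy primes
n = p * q
n_sq = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)


def L(x: int) -> int:
    return (x - 1) // n


mu = pow(L(pow(g, lam, n_sq)), -1, n)


def encrypt(m: int) -> int:
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq


def decrypt(c: int) -> int:
    return (L(pow(c, lam, n_sq)) * mu) % n


a, b = 42, 1337
c_sum = (encrypt(a) * encrypt(b)) % n_sq   # combined without decrypting anything
assert decrypt(c_sum) == a + b
print("decrypted sum:", decrypt(c_sum))
```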

To move beyond the argument that requiring proactive scanning for CSAM means breaking or damaging end-to-end encryption, amendments to the Bill could provide a powerful incentive for companies to invest in the technology and engineering resources that would allow them to continue scanning while pressing ahead with end-to-end encryption, so that privacy is preserved and the response to online child sexual abuse can continue. It is highly unlikely that some companies will do that unless they have an explicit incentive to do so. Regulation can provide such an incentive, and I urge the Minister to make it possible.

Maria Miller Portrait Mrs Maria Miller (Basingstoke) (Con)
- Hansard - -

It is a pleasure to follow the shadow Minister, who made some important points. I will focus on clause 37 stand part. I pay tribute to the Minister for his incredible work on the Bill, through which he clearly wants to stop harm occurring in the first place. We had a great debate on the matter of victim support. The Bill requires Ofcom to produce a number of codes of practice to help to achieve that important aim.

Clause 37 is clear: it requires codes of practice on illegal content and fraudulent advertising, as well as compliance with “the relevant duties”, and it is on that point that I hope the Minister can help me. Those codes will help Ofcom to take action when platforms do things that they should not, and will, I hope, provide a way for platforms to comply in the first place rather than falling foul of the rules.

How will the codes help platforms that are harbouring material or configuring their services in a way that might be explicitly or inadvertently promoting violence against women and girls? The Minister knows that women are disproportionately the targets of online abuse on social media or other platforms. The impact, which worries me as much as I am sure it worries him, is that women and girls are told to remove themselves from social media as a way to protect themselves against extremely abusive or harassing material. My concern is that the lack of a specific code to tackle those important issues might inadvertently mean that Ofcom and the platforms overlook them.

Would a violence against women and girls code of practice help to ensure that social media platforms were monitored by Ofcom for their work to prevent tech-facilitated violence against women and girls? A number of organisations think that it would, as does the Domestic Abuse Commissioner herself. Those organisations have drafted a violence against women and girls code of practice, which has been developed by an eminent group of specialists—the End Violence Against Women Coalition, Glitch, Carnegie UK Trust, the NSPCC, 5Rights, and Professors Clare McGlynn and Lorna Woods, both of whom gave evidence to us. They believe it should be mandatory for Ofcom to adopt a violence against women and girls code to ensure that this issue is taken seriously and that action is taken to prevent the risks in the first place. Clause 37 talks about codes, but it is not specific on that point, so can the Minister help us? Like the rest of the Committee, he wants to prevent women from experiencing these appalling acts online, and a code of practice could help us deal with that better.

--- Later in debate ---
I hope that clarifies how the Bill operates. As I said, we are giving careful thought to finding ways—which I hope we can—to strengthen those powers in clause 103.
Maria Miller Portrait Dame Maria Miller
- Hansard - -

I think my hon. Friend’s list goes on to page 37, which means there would be a number of different relevant duties that would presumably then be subject to the ability to issue codes of practice. However, the point I was making in my earlier contribution is that this list does not include the issue of violence against women and girls. In looking at this exhaustive list that my hon. Friend has included in the Bill, I must ask whether he might inadvertently be excluding the opportunity for Ofcom to produce a code of practice on the issue of violence against women and girls. Having heard his earlier comments, I felt that he was slightly sympathetic to that idea.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Clearly, and as Members have pointed out, women and girls suffer disproportionately from abuse online; unfortunately, tragically and disgracefully, they are disproportionately victims of such abuse. The duties in the Bill obviously apply to everybody—men and women—but women will obviously disproportionately benefit, because they are disproportionately victims.

Obviously, where there are things that are particular to women, such as particular kinds of abuse that women suffer that men do not, or particular kinds of abuse that girls suffer that boys do not, then we would expect the codes of practice to address those kinds of abuse, because the Bill states that they must keep children safe, in clause 37(10)(b), and adults safe, in clause 37(10)(c). Obviously, women are adults and we would expect those particular issues that my right hon. Friend mentioned to get picked up by those measures.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

My hon. Friend is giving me a chink of light there, in that subsection (10)(c) could actively mean that a code of practice that specifically dealt with violence against women and girls would be admissible as a result of that particular point. I had not really thought of it in that way—am I thinking about it correctly?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

My right hon. Friend makes an interesting point. To avoid answering a complicated question off the cuff, perhaps I should write to her. However, I certainly see no prohibition in these words in the clause that would prevent Ofcom from writing a particular code of practice. I would interpret these words in that way, but I should probably come back to her in writing, just in case I am making a mistake.

As I say, I interpret those words as giving Ofcom the latitude, if it chose to do so, to have codes of practice that were specific. I would not see this clause as prescriptive, in the sense that if Ofcom wanted to produce a number of codes of practice under the heading of “adults”, it could do so. In fact, if we track back to clause 37(3), that says:

“OFCOM must prepare and issue one or more codes of practice”.

That would appear to admit the possibility that multiple codes of practice could be produced under each of the sub-headings, including in this case for adults and in the previous case for children. [Interruption.] I have also received some indication from officials that I was right in my assessment, so hopefully that is the confirmation that my right hon. Friend was looking for.

Question put and agreed to.

Clause 37 accordingly ordered to stand part of the Bill.

Clause 38 ordered to stand part of the Bill.

Schedule 4

Codes of practice under section 37: principles, objectives, content

Amendment proposed: 63, in schedule 4, page 176, line 29, at end insert “and

(x) there are adequate safeguards to monitor cruelty towards humans and animals;”.—(Alex Davies-Jones.)

This amendment would ensure that ensuring adequate safeguards to monitor cruelty towards humans and animals is one of the online safety objectives for user-to-user services.

Question put, That the amendment be made.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I have a quick question about timelines because I am slightly confused about the order in which everything will happen. It is unlikely that the Bill will have been through the full parliamentary process before the summer, yet Ofcom intends to publish information and guidance by the summer, even though some things, such as the codes of practice, will not come in until after the Bill has received Royal Assent. Will the Minister give a commitment that, whether or not the Bill has gone through the whole parliamentary process, Ofcom will be able to publish before the summer?

Will Ofcom be encouraged to publish everything, whether that is guidance, information on its website or the codes of practice, at the earliest point at which they are ready? That will mean that anyone who has to apply those codes of practice or those regulations—people who will have to work within those codes, for example, or charities or other organisations that might be able to make super-complaints—will have as much information as possible, as early as possible, and will be able to prepare to fully implement their work at the earliest possible time. They will need that information in order to be able to gear up to do that.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

I have three short questions for the Minister about clause 40 and the Secretary of State’s powers of direction. Am I in order to cover that?

None Portrait The Chair
- Hansard -

We are not debating clause 40, Dame Maria, but we will come to it eventually.

Online Safety Bill (Ninth sitting)

Maria Miller Excerpts
Committee stage
Tuesday 14th June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 14 June 2022 - (14 Jun 2022)
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am afraid it was not me that cited new information. It was my hon. Friend the Member for Watford who said he had had further discussions with Ministers. I am delighted to hear that he found those discussions enlightening, as I am sure they—I want to say they always are, but let us say they often are.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - -

Before my hon. Friend moves on, can I ask a point of clarification? The hon. Member for Ochil and South Perthshire is right that this is an important point, so we need to understand it thoroughly. I think he makes a compelling argument about the exceptional circumstances. If Ofcom did not agree that a change that was being requested was in line with what my hon. Friend the Minister has said, how would it be able to discuss or, indeed, challenge that?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

My right hon. Friend raises a good question. In fact, I was about to come on to the safeguards that exist to address some of the concerns that have been raised this morning. Let me jump to the fourth of the safeguards, which in many ways is the most powerful and directly addresses my right hon. Friend’s question.

In fact, a change has been made. The hon. Member for Ochil and South Perthshire asked what changes had been made, and one important change—perhaps the change that my hon. Friend the Member for Watford found convincing—was the insertion of a requirement for the codes, following a direction, to go before Parliament and be voted on using the affirmative procedure. That is a change. The Bill previously did not have that in it. We inserted the use of the affirmative procedure to vote on a modified code in order to introduce extra protections that did not exist in the draft of the Bill that the Joint Committee commented on.

I hope my right hon. Friend the Member for Basingstoke will agree that if Ofcom had a concern and made it publicly known, Parliament would be aware of that concern before voting on the revised code using the affirmative procedure. The change to the affirmative procedure gives Parliament extra control. It gives parliamentarians the opportunity to respond if they have concerns, if third parties raise concerns, or if Ofcom itself raises concerns.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank the hon. Lady for her rapid description of that amendment. We will come to clause 189 in due course. The definition of “content” in that clause is,

“anything communicated by means of an internet service”,

which sounds like it is quite widely drafted. However, we will obviously debate this issue properly when we consider clause 189.

The remaining question—

Maria Miller Portrait Dame Maria Miller
- Hansard - -

I intervene rather than making a subsequent substantive contribution because I am making a very simple point. My hon. Friend the Minister is making a really compelling case about the need for freedom of speech and the need to protect it within the context of newspapers online. However, could he help those who might be listening to this debate today to understand who is responsible if illegal comments are made on newspaper websites? I know that my constituents would be concerned about that—not so much if illegal comments were made about a Member of Parliament or somebody else in the public eye, but if they were made about an individual who is not in the public eye.

What redress would that individual have? Would it be to ask the newspaper to take down that comment, or would it be that they could find out the identity of the individual who made the comment, or would it be that they could take legal action? If he could provide some clarity on that, it might help Committee members to understand even further why he is taking the position that he is taking.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank my right hon. Friend for that intervention. First, clearly if something illegal is said online about someone, they would have the normal redress to go to the police and the police could seek to exercise their powers to investigate the offence, including requesting the company that hosts the comments—in this case, it would be a newspaper’s or broadcaster’s website—to provide any relevant information that might help to identify the person involved; they might have an account, and if they do not they might have a log-on or IP address. So, the normal criminal investigatory procedures would obviously apply.

Secondly, if the content was defamatory, there is obviously civil recourse for libel, although I realise that in practice only people like Arron Banks can sue for libel. I think there are also powers in the civil procedure rules that allow for court orders to be made that require organisations, such as news media websites, to disclose information that would help to identify somebody who is a respondent in a civil case.

Thirdly, there are obviously the voluntary steps that the news publisher might take to remove content. News publishers say that they do that; obviously, their implementation, as we know, is patchy. Nevertheless, there is that voluntary route.

Regarding any legal obligation that may fall on the shoulders of the news publisher itself, I am not sure that I have sufficient legal expertise to comment on that. However, I hope that those first three areas of redress that I have set out give my right hon. Friend some assurance on this point.

Finally, I turn to a question asked by the hon. Member for Aberdeen North. She asked whether the exemption for “one-to-one live aural communications”, as set out in clause 49(2)(d), could inadvertently allow grooming or child sexual exploitation to occur via voice messages that accompany games, for example. The exemption is designed to cover what are essentially phone calls, such as Skype conversations: one-to-one conversations that are low-risk.

We believe that the Bill contains other duties to ensure that services are designed to reduce the risk of grooming and to address risks to children, if those risks exist, such as on gaming sites. I would be happy to come back to the hon. Lady with a better analysis and explanation of where those duties sit in the Bill, but there are very strong duties elsewhere in the Bill that impose those obligations to conduct risk assessments and to keep children safe in general. Indeed, the very strongest provisions in the Bill are around stopping child sexual exploitation and abuse, as set out in schedule 6.

Finally, there is a power in clause 174(1) that allows us, as parliamentarians and the Government, to repeal this exemption using secondary legislation. So, if we found in the future that this exemption caused a problem, we could remove it by passing secondary legislation.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I am grateful for the hon. Member’s contribution, and for her support for the amendment and our comments on the clause.

The Bill should be made clearer, and I would appreciate an update on the Minister’s assessment of the provisions in the Bill. Platforms and service providers need clarity if they are to take effective action against illegal content. Gaps in the Bill give rise to serious questions about the overwhelming practical challenges of the Bill. None of us wants a two-tier internet, in which user experience and platforms’ responsibilities in the UK differ significantly from those in the rest of the world. Clarifying the definition of illegal content and acknowledging the complexity of the situation when content originates abroad are vital if this legislation is to tackle wide-ranging, damaging content online. That is a concern I raised on Second Reading, and a number of witnesses reiterated it during the oral evidence sessions. I remind the Committee of the comments of Kevin Bakhurst from Ofcom, who said:

“We feel it is really important—hopefully this is something the Committee can contribute to—that the definition of ‘illegal content’ is really clear for platforms, and particularly the area of intent of illegality, which at the moment might be quite tricky for the platforms to pick up on.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 8, Q7.]

That has been reiterated by myriad other stakeholders, so I would be grateful for the Minister’s comments.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

I rise to speak on clause 52 stand part, particularly—the Minister will not be surprised—the element in subsection (4)(c) around the offences specified in schedule 7. The debate has been very wide-ranging throughout our sittings. It is extraordinary that we need a clause defining what is illegal. Presumably, most people who provide goods and services in this country would soon go out of business if they were not knowledgeable about what is illegal. The Minister is helping the debate very much by setting out clearly what is illegal, so that people who participate in the social media world are under no illusion as to what the Government are trying to achieve through this legislation.

The truth is that the online world has unfolded without a regulatory framework. New offences have emerged, and some of them are tackled in the Bill, particularly cyber-flashing. Existing offences have taken on a new level of harm for their victims, particularly when it comes to taking, making and sharing intimate images without consent. As the Government have already widely acknowledged, because the laws on that are such a patchwork, it is difficult for the enforcement agencies in this country to adequately protect the victims of that heinous crime, who are, as the Minister knows, predominantly women.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank right hon. and hon. Members who have participated in the debate on this extremely important clause. It is extremely important because the Bill’s strongest provisions relate to illegal content, and the definition of illegal content set out in the clause is the starting point for those duties.

A number of important questions have been asked, and I would like to reply to them in turn. First, I want to speak directly about amendment 61, which was moved by the shadow Minister and which very reasonably and quite rightly asked the question about physically where in the world a criminal offence takes place. She rightly said that in the case of violence against some children, for example, that may happen somewhere else in the world but be transmitted on the internet here in the United Kingdom. On that, I can point to an existing provision in the Bill that does exactly what she wants. Clause 52(9), which appears about two thirds of the way down page 49 of the Bill, states:

“For the purposes of determining whether content amounts to an offence, no account is to be taken of whether or not anything done in relation to the content takes place in any part of the United Kingdom.”

What that is saying is that it does not matter whether the act of concern takes place physically in the United Kingdom or somewhere else, on the other side of the world. That does not matter in looking at whether something amounts to an offence. If it is criminal under UK law but it happens on the other side of the world, it is still in scope. Clause 52(9) makes that very clear, so I think that that provision is already doing what the shadow Minister’s amendment 61 seeks to do.

The shadow Minister asked a second question about the definition of illegal content, whether it involves a specific act and how it interacts with the “systems and processes” approach that the Bill takes. She is right to say that the definition of illegal content applies item by item. However, the legally binding duties in the Bill, which we have already debated in relation to previous clauses, apply to categories of content and to putting in place “proportionate systems and processes”—I think that that is the phrase used. Therefore, although the definition is particular, the duty is more general, and has to be met by putting in place systems and processes. I hope that my explanation provides clarification on that point.

The shadow Minister asked another question about the precise definitions of how the platforms are supposed to decide whether content meets the definition set out. She asked, in particular, questions about how to determine intent—the mens rea element of the offence. She mentioned that Ofcom had had some comments in that regard. Of course, the Government are discussing all this closely with Ofcom, as people would expect. I will say to the Committee that we are listening very carefully to the points that are being made. I hope that that gives the shadow Minister some assurance that the Government’s ears are open on this point.

The next and final point that I would like to come to was raised by all speakers in the debate, but particularly by my right hon. Friend the Member for Basingstoke, and is about violence against women and girls—an important point that we have quite rightly debated previously and come to again now. The first general point to make is that clause 52(4)(d) makes it clear that relevant offences include offences where the intended victim is an individual, so any violence towards and abuse of women and girls is obviously included in that.

As my right hon. Friend the Member for Basingstoke and others have pointed out, women suffer disproportionate abuse and are disproportionately the victims of criminal offences online. The hon. Member for Aberdeen North pointed out how a combination of protected characteristics can make the abuse particularly impactful—for example, if someone is a woman and a member of a minority. Those are important and valid points. I can reconfirm, as I did in our previous debate, that when Ofcom drafts the codes of practice on how platforms can meet their duties, it is at liberty to include such considerations. I echo the words spoken a few minutes ago by my right hon. Friend the Member for Basingstoke: the strong expectation across the House—among all parties here—is that those issues will be addressed in the codes of practice to ensure that those particular vulnerabilities and those compounded vulnerabilities are properly looked at by social media firms in discharging those duties.

My right hon. Friend also made points about intimate image abuse when the intimate images are made without the consent of the subject—the victim, I should say. I would make two points about that. The first relates to the Bill and the second looks to the future and the work of the Law Commission. On the Bill, we will come in due course to clause 150, which relates to the new harmful communications offence, and which will criminalise a communication—the sending of a message—when there is a real and substantial risk of it causing harm to the likely audience and there is intention to cause harm. The definition of “harm” in this case is psychological harm amounting to at least serious distress.

Clearly, if somebody is sending an intimate image without the consent of the subject, it is likely that that will cause harm to the likely audience. Obviously, if someone sends a naked image of somebody without their consent, that is very likely to cause serious distress, and I can think of few reasons why somebody would do that unless causing distress was their intention, meaning that the offence would be made out under clause 150.

My right hon. Friend has strong feelings, which I entirely understand, that to make the measure even stronger the test should not involve intent at all, but should simply be a question of consent. Was there consent or not? If there was no consent, an offence would have been committed, without needing to go on to establish intention as clause 150 provides. As my right hon. Friend has said, Law Commission proposals are being developed. My understanding is that the Ministry of Justice, which is the Department responsible for this offence, expects to receive a final report over the summer. It would then clearly be open to Parliament to legislate to put the offence into law, I hope as quickly as possible.

Once that happens, through whichever legislative vehicle, it will have two implications. First, the offence will automatically and immediately be picked up by clause 52(4)(d) and brought within the scope of the Bill because it is an offence where the intended victim is an individual. Secondly, there will be a power for the Secretary of State and for Parliament, through clause 176, I think—I am speaking from memory; yes, it is clause 176, not that I have memorised every clause in the Bill—via statutory instrument not only to bring the offence into the regular illegal safety duties, but to add it to schedule 7, which contains the priority offences.

Once that intimate image abuse offence is in law, via whichever legislative vehicle, that will have that immediate effect with respect to the Bill, and by statutory instrument it could be made a priority offence. I hope that gives my right hon. Friend a clear sense of the process by which this is moving forward.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

I thank the Minister for such a clear explanation of his plan. Can he confirm that the Bill is a suitable legislative vehicle? I cannot see why it would not be. I welcome his agreement about the need for additional legislation over and above the communications offence. In the light of the way that nudification software and deepfakes are advancing, and the challenges that our law enforcement agencies have in interpreting those quite complex notions, a straightforward law making it clear that publishing such images is a criminal offence would not only help law enforcement agencies, but would help the perpetrators to understand that what they are doing is a crime and they should stop.

Online Safety Bill (Tenth sitting)

Maria Miller Excerpts
Committee stage
Tuesday 14th June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 14 June 2022 - (14 Jun 2022)
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am very happy to reply to the various queries that have been made. I will start with the points on vaccine disinformation raised by the hon. Members for Ochil and South Perthshire and for Pontypridd. The Government strongly agree with the points they made about the damaging effects of vaccine misinformation and the fact that many of our fellow citizens have probably died as a result of being misled into refusing the vaccine when it is, of course, perfectly safe. We strongly share the concerns they have articulated.

Over the past two years, the Department for Digital, Culture, Media and Sport has worked together with other Departments to develop a strong operational response to this issue. We have established a counter-disinformation unit within DCMS whose remit is to identify misinformation and work with social media firms to get it taken down. The principal focus of that unit during the pandemic was, of course, covid. In the past three months, it has focused more on the Russia-Ukraine conflict, for obvious reasons.

In some cases, Ministers have engaged directly with social media firms to encourage them to remove content that is clearly inappropriate. For example, in the Russia-Ukraine context, I have had conversations with social media companies that have left up clearly flagrant Russian disinformation. This is, therefore, an area that the Government are concerned about and have been acting on operationally already.

Obviously, we agree with the intention behind the amendment. However, the way to handle it is not to randomly drop an item into the Bill and leave the rest to a statutory instrument. Important and worthy though it may be to deal with disinformation, and specifically harmful health-related disinformation, there are plenty of other important things that one might add that are legal but harmful to adults, so we will not accept the amendment. Instead, we will proceed as planned by designating the list via a statutory instrument. I know that a number of Members of Parliament, probably including members of this Committee, would find it helpful to see a draft list of what those items might be, not least to get assurance that health-related misinformation and disinformation is on that list. That is something that we are considering very carefully, and more news might be forthcoming as the Bill proceeds through Parliament.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - -

My hon. Friend has talked about the Department’s counter-disinformation unit. Do the Government anticipate that that function will continue, or will they expect Ofcom to do it?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The work of the counter-disinformation unit is valuable. We look at these things on a spending review by spending review basis, and as far as I am aware we intend to continue with the counter-disinformation unit over the current spending review period. Clearly, I cannot commit future Ministers in perpetuity, but my personal view—if I am allowed to express it—is that that unit performs a useful function and could valuably be continued into the future. I think it is useful for the Government, as well as Ofcom, to directly have eyes on this issue, but I cannot speak for future Ministers. I can only give my right hon. Friend my own view.

I hope that I have set out my approach. We have heard the calls to publish the list so that parliamentarians can scrutinise it, and we also heard them on Second Reading.

I will now turn to the question raised by my hon. Friend the Member for Don Valley regarding freedom of expression. Those on one side of the debate are asking us to go further and to be clearer, while those on the other side have concerns about freedom of expression. As I have said, I honestly do not think that these legal but harmful provisions infringe on freedom of speech, for three reasons. First, even when the Secretary of State decides to designate content and Parliament approves of that decision through the affirmative procedure—Parliament gets to approve, so the Secretary of State is not acting alone—that content is not being banned. The Bill does not say that content designated as legal but harmful should immediately be struck from every corner of the internet. It simply says that category 1 companies—the big ones—have to do a proper risk assessment of that content and think about it properly.

Secondly, those companies have to have a policy to deal with that content, but that policy is up to them. They could have a policy that says, “It is absolutely fine.” Let us say that health disinformation is on the list, as one would expect it to be. A particular social media firm could have a policy that says, “We have considered this. We know it is risky, but we are going to let it happen anyway.” Some people might say that that is a weakness in the Bill, while others might say that it protects freedom of expression. It depends on one’s point of view, but that is how it works. It is for the company to choose and set out its policy, and the Bill requires it to enforce it consistently. I do not think that the requirements I have laid out amount to censorship or an unreasonable repression of free speech, because the platforms can still set their own terms and conditions.

There is also the general duty to have regard to free speech, which is introduced in clause 19(2). At the moment, no such duty exists. One might argue that the duty could be stronger, as my hon. Friend suggested previously, but it is unarguable that, for the first time ever, there is a duty on the platforms to have regard to free speech.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I thank the Minister for his clarification earlier and his explanation of how the categories of primary priority content and priority content can be updated. That was helpful.

Amendment 62 is excellent, and I am more than happy to support it.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

I have a short comment on clause 56, which is an important clause because it will provide an analysis of how the legislation is working, and that is what Members want to see. On the point that the hon. Member for Pontypridd set out, it is right that Ofcom probably will not report until 2026, given the timeframe for the Bill being enacted. I would not necessarily want Ofcom to report sooner, because system changes take a long time to bed in. It does pose the question, however, of how Parliament will be able to analyse whether the legislation or its approach needs to change between now and 2026. That reiterates the need—which I and other hon. Members have pointed out—for some sort of standing committee to scrutinise the issues. I do not personally think it would be right to get Ofcom to report earlier, because it might be an incomplete report.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I have some brief comments on the clause. The Labour party very much welcomes the addition to user verification duties in the revised Bill. A range of groups, including Clean Up the Internet, have long campaigned for a verification requirement process, so this is a positive step forward.

We do, however, have some concerns about the exact principles and minimum standards for the user verification duty, which I will address when we consider new clause 8. We also have concerns about subsection (2), which states:

“The verification process may be of any kind (and in particular, it need not require documentation to be provided).”

I would be grateful if the Minister could clarify exactly what that process will look like in practice.

Lastly, as Clean Up the Internet has said, we need further clarification on whether users will be given a choice of how they verify and of the verification provider itself. We can all recognise that there are potential downsides to the companies that own the largest platforms—such as Meta, Google, Twitter and ByteDance—developing their own in-house verification processes and making them the only option for users wishing to verify on their platform. Indeed, some users may have reservations about sharing even more personal data with those companies. Users of multiple social media platforms could also find it inconvenient and confusing to go through a different verification process on each platform to achieve the same outcome of confirming their real name.

There is a risk of the largest platforms seeking to leverage their dominance of social media to capture the market for ID verification services, raising competition concerns. I would be grateful if the Minister could confirm his assessment of the potential issues around clause 57 as it stands.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

I rise to welcome clause 57. It is an important part of the Bill and shows the Government acknowledging that anonymity can have a significant impact on the harms that affect victims. There is a catalogue of evidence of the harm done by those posting anonymously. Anonymity appears to encourage abusive behaviour, and there is evidence dating back to 2015 showing that anonymous accounts are more likely to share sexist comments and that online harassment victims are often not able to identify their perpetrators because of the way anonymity works online. The Government are doing an important thing here and I applaud them.

I underline that again by saying that recent research from Compassion in Politics showed that more than one in four people were put off posting on social media because of the fear of abuse, particularly from anonymous posters. Far from the status quo promoting freedom of speech, it actually deters freedom of speech, as we have said in other debates, and it particularly affects women. The Government are to be applauded for this measure.

In the work I was doing with the FA and the Premier League around this very issue, I particularly supported their call for a twin-track approach, under which verified accounts would be the default and people would automatically be able to opt out of receiving posts from unverified accounts. The Bill does not go as far as that, and I can understand the Government’s reasons, but I gently point out that 81% of the people who took part in the Compassion in Politics research would willingly provide identification to get a verified account if it reduced unverified posts. They felt that was important. Some 72% supported the idea if it reduced the amount of anonymous posting.

I am touching on clause 58, but I will not repeat myself when we debate that clause. I hope that it will be possible in the code of practice for Ofcom to point out the clear benefits of having verified accounts by default and perhaps urge responsible providers to do the responsible thing and allow their users to automatically filter out unverified accounts. That is what users want, and it is extraordinary that large consumer organisations do not seem to want to give consumers what they want. Perhaps Ofcom can help those organisations understand what their consumers want, certainly in Britain.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The right hon. Lady’s speech inspired me to stand up and mention a couple of things. My first question is about user empowerment around this clause. The clause applies only to adults. I can understand the issues that there may be with verifying the identity of children, but if that means that children are unable to block unverified accounts because they cannot verify their own account, the internet becomes a less safe place for children than for adults in this context, which concerns me.

To be honest, I do not know how children’s identities could be verified, but giving them access to the filters that would allow them to block unverified accounts, whether or not they are able to verify themselves—because they are children and therefore may not have the identity documentation they need—would be very helpful.

I appreciate the points that the right hon. Member was making, and I completely agree with her on the requirement for user verification, but I have to say that I believe there is a place for anonymity on the internet. I can understand why, for a number of people, that is the only way that they can safely access some of the community support that they need.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

Just for clarity, the twin-track approach does not outlaw anonymity. It just means that people have verified accounts by default; they do not have to opt into it.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I appreciate that clarification. I just wanted to make it absolutely clear that I strongly believe that anonymity is a very good protection, not just for people who intend to do bad on the internet, but for people who are seeking out community, particularly. I think that that is important.

If you will allow me to say a couple of things about the next clause, Sir Roger, Mencap raised the issue of vulnerable users, specifically vulnerable adult users, in relation to the form of identity verification. If the Minister or Ofcom could give consideration to perhaps including travel passes or adult passes, it might make the internet a much easier place to navigate for people who do not have control of their own documentation—they may not have access to their passport, birth certificate, or any of that sort of thing—but who would be able to provide a travel pass, because that is within their ownership.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I wholeheartedly agree with my hon. Friend. As I have said, the amendments would put in place rock-bottom safety measures that could prevent the most appalling abuses on pornography websites, and it is a scandal that, hitherto, they have not been implemented. We have the opportunity to change that today by voting for the amendments and ensuring that these measures are in place. I urge the Minister and Conservative Members to do the right thing.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

I thank the hon. Lady for giving way. I can understand the intent behind what she is saying and I have a huge amount of sympathy for it, but we know as a matter of fact that many of the images that are lodged on these sorts of websites were never intended to be pornographic in the first place. They may be intimate images taken by individuals of themselves—or, indeed, of somebody else—that are then posted as pornographic images. I am slightly concerned that an image such as that may not be caught by the hon. Lady’s amendments. Would she join me in urging the Government to bring forward the Law Commission’s recommendations on the taking, making and sharing of intimate images online without consent, which are far broader? They would probably do what she wants to do but not run into the problem of whether an image was meant to be pornographic in the first place.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I am grateful to the right hon. Member for her intervention. She knows that I have the utmost respect for all that she has tried to achieve in this area in the House along with my right hon. Friend the Member for Kingston upon Hull North.

We feel these amendments would encapsulate the specific issue of image or video content for which consent has not been obtained. Many of these people do not even know that the content has been taken in the first place, and it is then uploaded to these websites. It would be the website’s duty to verify that consent had been obtained and that the people in the video were of the age of consent. That is why we urge hon. Members to back the amendments.

--- Later in debate ---
Maria Miller Portrait Dame Maria Miller
- Hansard - -

The Minister must be careful about using the revenge pornography legislation as an example of protection. He will know well that that legislation requires relationships between the people involved. It is a very specific piece of legislation. It does not cover the sorts of examples that the shadow Minister was giving.

Online Safety Bill (Eleventh sitting)

Maria Miller Excerpts
Committee stage
Thursday 16th June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 16 June 2022 - (16 Jun 2022)
Barbara Keeley Portrait Barbara Keeley (Worsley and Eccles South) (Lab)
- Hansard - - - Excerpts

It is a pleasure to serve with you in the Chair again, Sir Roger. I add my tribute to our former colleague, Jo Cox, on this sad anniversary. Our thoughts are with her family today, including our colleague and my hon. Friend, the Member for Batley and Spen.

We welcome the “polluter pays” principle on which this and the following clauses are founded. Clause 70 establishes a duty for providers to notify Ofcom if their revenue is at or above the specified threshold designated by Ofcom and approved by the Secretary of State. It also creates duties on providers to provide timely notice and evidence of meeting the threshold. The Opposition do not oppose those duties. However, I would be grateful if the Minister could clarify what might lead to a provider or groups of providers being exempt from paying the fee. Subsection (6) establishes that

“OFCOM may provide that particular descriptions of providers of regulated services are exempt”,

subject to the Secretary of State’s approval. Our question is what kinds of services the Minister has in mind for that exemption.

Turning to clauses 71 to 76, as I mentioned, it is appropriate that the cost to Ofcom of exercising its online safety functions is paid through an annual industry fee, charged to the biggest companies with the highest revenues, and that smaller companies are exempt but still regulated. It is also welcome that under clause 71, Ofcom can make reference to factors beyond the provider’s qualifying worldwide revenue when determining the fee that a company must pay. Acknowledging the importance of other factors when computing that fee can allow for a greater burden of the fees to fall on companies whose activities may disproportionately increase Ofcom’s work on improving safety.

My hon. Friend the Member for Pontypridd has already raised our concerns about the level of funding needed for Ofcom to carry out its duties under the Bill. She asked about the creation of a new role: that of an adviser on funding for the online safety regulator. The impact assessment states that the industry fee will need to average around £35 million a year for the next 10 years to pay for operating expenditure. Last week, the Minister referred to a figure of around £88 million that has been announced to cover the first two years of the regime while the industry levy is implemented, and the same figure was used on Second Reading by the Secretary of State. Last October’s autumn Budget and spending review refers on page 115 to

“over £110 million over the SR21 period for the government’s new online safety regime through the passage and implementation of the Online Safety Bill, delivering on the government’s commitment to make the UK the safest place to be online.”

There is no reference to the £88 million figure or to Ofcom in the spending review document. Could the Minister tell us a bit more about that £88 million and the rest of the £110 million announced in the spending review, as it is relevant to how Ofcom is going to be resourced and the industry levy that is introduced by these clauses?

The Opposition feel it is critical that when the Bill comes into force, there is no gap in funding that would prevent Ofcom from carrying out its duties. The most obvious problem is that the level of funding set out in the spending review was determined when the Bill was in draft form, before more harms were brought into scope. The Department for Digital, Culture, Media and Sport has also confirmed that the figure of £34.9 million a year that is needed for Ofcom to carry out its online safety duties was based on the draft Bill.

We welcome many of the additional duties included in the Bill since its drafting, such as on fraudulent advertising, but does the Minister think the same level of funding will be adequate as when the calculation was made, when the Bill was in draft form? Will he reconsider the calculations his Department has made of the level of funding that Ofcom will need for this regime to be effective in the light of the increased workload that this latest version of the Bill introduces?

In March 2021, Ofcom put out a press release stating that 150 people would be employed in the new digital and technology hub in Manchester, but that that number would be reached in 2025. Therefore, as well as the level of resource being based on an old version of the Bill, the timeframe reveals a gap of three years until all the staff are in place. Does the Minister believe that Ofcom will have everything that is needed from the start, and in subsequent years as the levy gets up and going, in order to carry out its duties?

Of course, this will depend on how long the levy might need to be in place. My understanding of the timeframe is that first, the Secretary of State must issue guidance to Ofcom about the principles to be included in the statement of principles that Ofcom will use to determine the fees payable under clause 71. Ofcom must consult with those affected by the threshold amount to inform the final figure it recommends to the Secretary of State, and must produce a statement about what amounts comprise the provider’s qualifying worldwide revenue and the qualifying period. That figure and Ofcom’s guidance must be agreed by the Secretary of State and laid before Parliament. Based on those checks and processes, how quickly does the Minister envisage the levy coming into force?

The Minister said last week that Ofcom is resourced for this work until 2023-24. Will the levy be in place by then to fund Ofcom’s safety work into 2024-25? If not, can the Minister confirm that the Government will cover any gaps in funding? I am sure he will agree, as we all do, that the duties in the Bill must be implemented as quickly as possible, but the necessary funding must also be in place so that Ofcom as a regulator can enforce the safety duty.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - -

I have just a short comment on these clauses. I very much applaud the Government’s approach to the funding of Ofcom through this mechanism. Clause 75 sets out clearly that the fees payable to Ofcom under section 71 should only be

“sufficient to meet, but…not exceed the annual cost to OFCOM”.

That is important when we start to think about victim support. While clearly Ofcom will have a duty to monitor the efficacy of the mechanisms in place on social media platforms, it is not entirely clear to me from the evidence or conversations with Ofcom whether it will see it as part of its duty to ensure that other areas of victim support are financed through those fees.

It may well be that the Minister thinks it more applicable to look at this issue when we consider the clauses on fines, and I plan to come to it at that point, but it would be helpful to understand whether he sees any role for Ofcom in ensuring that there is third-party specialist support for victims of all sorts of crime, including fraud or sexual abuse.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Let me start by associating myself with the remarks by the hon. Member for Worsley and Eccles South. We are in complete concurrence with the concept that the polluter should pay. Where there are regulatory costs caused by the behaviour of the social media firms that necessitates the Bill, it is absolutely right that those costs should fall on them and not on the general taxpayer. I absolutely agree with the principles that she outlined.

The hon. Lady raised a question about clause 70(6) and the potential exemption from the obligation to pay fees. That is a broadly drawn power, and the phrasing used is where

“OFCOM consider that an exemption…is appropriate”

and where the Secretary of State agrees. The Bill is not being prescriptive; it is intentionally providing flexibility in case there are circumstances where levying the fees might be inappropriate or, indeed, unjust. It is possible to conceive of an organisation that somehow exceeds the size threshold, but so manifestly does not need regulation that it would be unfair or unjust to levy the fees. For example, if a charity were, by some accident of chance, to fall into scope, it might qualify. But we expect social media firms to pay these bills, and I would not by any means expect the exemption to be applied routinely or regularly.

On the £88 million and the £110 million that have been referenced, the latter amount is to cover the three-year spending review period, which is the current financial year—2022-23—2023-24 and 2024-25. Of that £110 million, £88 million is allocated to Ofcom in the first two financial years; the remainder is allocated to DCMS for its work over the three-year period of the spending review. The £88 million for Ofcom runs out at the end of 2023-24.

The hon. Lady then asked whether the statutory fees in these clauses will kick in when the £88 million runs out—whether they will be available in time. The answer is yes. We expect and intend that the fees we are debating will become effective in 2024-25, so they will pick up where the £88 million finishes.

Ofcom will set the fees at a level that recoups its costs, so if the Bill becomes larger in scope, for example through amendments in the Commons or the Lords—not that I wish to encourage amendments—and the duties on Ofcom expand, we would expect the fees to be increased commensurately to cover any increased cost that our legislation imposes.

Online Safety Bill (Twelfth sitting)

Maria Miller Excerpts
Committee stage
Thursday 16th June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 16 June 2022 - (16 Jun 2022)
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I will be brief. Labour welcomes clause 110, which addresses the process of starting enforcement. We support the process, particularly the point that ensures that Ofcom must first issue a “provisional notice of contravention” to an entity before it reaches its final decision.

The clause ultimately ensures that the process for Ofcom issuing a provisional notice of contravention can take place only after a full explanation and deadline has been provided for those involved. Thankfully, this process means that Ofcom can reach a decision only after allowing the recipient a fair opportunity to make relevant representations too. The process must be fair for all involved and that is why we welcome the provisions outlined in the clause.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - -

I hope that I am speaking at the right stage of the Bill, and I promise not to intervene at any further stages where this argument could be put forward.

Much of the meat of the Bill is within chapter 6. It establishes what many have called the “polluter pays” principle, under which an organisation that contravenes its duties can be fined—a very important part of the Bill. We are talking about how Ofcom is going to be able to make the provisions that we have set out work in practice. A regulated organisation that fails to stop harm contravenes those duties and will be fined, and fined heavily.

I speak at this point in the debate with slight trepidation, because these issues are also covered in clause 117 and schedule 12, but it is just as relevant to debate the point at this stage. It is difficult to understand where in the Bill the Government set out how the penalties that they can levy as a result of the powers under this clause will be used. Yes, they will be a huge deterrent, and that is good in its own right and important, but surely the real opportunity is to make the person who does the harm pay for righting the wrong that they have created.

That is not a new concept. Indeed, it is one of the objectives that the Government set out in the intentions behind their approach to the draft victims Bill. It is a concept used in the Investigatory Powers Act 2016. It is the concept behind the victims surcharge. So how does this Bill make those who cause harm take greater responsibility for the cost of supporting victims to recover from what they have suffered? That is exactly what the Justice Ministers set out as being so important in their approach to victims. In the Bill, that is not clear to me.

At clause 70, the Minister helpfully set out that there was absolutely no intention for Ofcom to have a role in supporting victims individually. In reply to the point that I made at that stage, he said that the victims Bill would address some of the issues—I am sure that he did not say all the issues, but some of them at least. I do not believe that it will. The victims Bill establishes a code and a duty to provide victim support, but it makes absolutely no reference to how financial penalties on those who cause harm—as set out so clearly in this Bill—will be used to support victims. How will they support victims’ organisations, which do so much to help in particular those who do not end up in court, before a judge, because what they have suffered does not warrant that sort of intervention?

I believe that there is a gap. We heard that in our evidence session, including from Ofcom itself, which identified the need for law enforcement, victim-support organisations and platforms themselves to find what the witnesses described as an effective way for the new “ecosystem” to work. Victim-support organisations went further and argued strongly for the need for victims’ voices to be heard independently. The NSPCC in particular made a very powerful argument for children’s voices needing to be heard and for having independent advocacy. There would be a significant issue with trust levels if we were to rely solely on the platforms themselves to provide such victim support.

There are a couple of other reasons why we need the Government to tease the issue out. We are talking about the most significant culture change imaginable for the online platforms to go through. There will be a lot of good will, I am sure, to achieve that culture change, but there will also be problems along the way. Again referring back to our evidence sessions, the charity Refuge said that reporting systems are “not up to scratch” currently. There is a lot of room for change. We know that Revenge Porn Helpline has seen a continual increase in demand for its services in support of victims, in particular following the pandemic. It also finds revenue and funding a little hand to mouth.

Victim support organisations will have a crucial role in assisting Ofcom with the elements outlined in chapter 6, of which clause 110 is the start, in terms of monitoring the reality for users of how the platforms are performing. The “polluter pays” principle is not working quite as the Government might want it to in the Bill. My solution is for the Minister to consider talking to his colleagues in the Treasury about whether this circle could be squared—whether we could complete the circle—by having some sort of hypothecation of the financial penalties, so that some of the huge amount that will be levied in penalties can be put into a fund that can be used directly to support victims’ organisations. I know that that requires the Department for Digital, Culture, Media and Sport and the Ministry of Justice to work together, but my hon. Friend is incredibly good at collaborative working, and I am sure he will be able to achieve that.

This is not an easy thing. I know that the Treasury would not welcome Committees such as this deciding how financial penalties are to be used, but this is not typical legislation. We are talking about enormous amounts of money and enormous numbers of victims, as the Minister himself has set out when we have tried to debate some of these issues. He could perhaps undertake to raise this issue directly with the Treasury, and perhaps get it to look at how much money is currently going to organisations to support victims of online abuse and online fraud—the list goes on—and to see whether we will have to take a different approach to ensure that the victims we are now recognising get the support he and his ministerial colleagues want to see.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

First, on the substance of the clause, as the shadow Minister said, the process of providing a provisional notice of contravention gives the subject company a fair chance to respond and put its case, before the full enforcement powers are brought down on its head, and that is of course only reasonable, given how strong and severe these powers are. I am glad there is once again agreement between the two parties.

I would like to turn now to the points raised by my right hon. Friend the Member for Basingstoke, who, as ever, has made a very thoughtful contribution to our proceedings. Let me start by answering her question as to what the Bill says about where fines that are levied will go. We can discover the answer to that question in paragraph 8 of schedule 12, which appears at the bottom of page 206 and the top of page 207—in the unlikely event that Members had not memorised that. If they look at that provision, they will see that the Bill as drafted provides that fines that are levied under the powers provided in it and that are paid to Ofcom get paid over to the Consolidated Fund, which is essentially general Treasury resources. That is where the money goes under the Bill as drafted.

My right hon. Friend asks whether some of the funds could be, essentially, hypothecated and diverted directly to pay victims. At the moment, the Government are dealing with victims, or pay for services supporting victims, not just via legislation—the victims Bill—but via expenditure that, I think, is managed by the Ministry of Justice to support victims and organisations working with victims in a number of ways. I believe that the amount earmarked for this financial year is in excess of £300 million, which is funded just via the general spending review. That is the situation as it is today.

I am happy to ask colleagues in Government the question that my right hon. Friend raises. It is really a matter for the Treasury, so I am happy to pass her idea on to it. But I anticipate a couple of responses coming from the Treasury in return. I would anticipate it first saying that allocating money to a particular purpose, including victims, is something that it likes to do via spending reviews, where it can balance all the demands on Government revenue, viewed in the round.

Secondly, it might say that the fine income is very uncertain; we do not know what it will be. One year it could be nothing; the next year it could be billions and billions of pounds. It depends on the behaviour of these social media firms. In fact, if the Bill does its job and they comply with the duties as we want and expect them to, the fines could be zero, because the firms do what they are supposed to. Conversely, if they misbehave, as they have been doing until now, the fines could be enormous. If we rely on hypothecation of these fines as a source for funding victim services, it might be that, in a particular year, we discover that there is no income, because no fines have been levied.

Online Safety Bill (Thirteenth sitting)

Maria Miller Excerpts
Committee stage & Committee Debate - 13th sitting
Tuesday 21st June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 21 June 2022 - (21 Jun 2022)
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

We support clause 132, which ensures that Ofcom is required to understand and measure public opinion concerning providers of regulated services, as well as the experiences and interests of those using the regulated services in question. The Bill in its entirety is very much a learning curve for us all, and I am sure we all agree that, as previously maintained, the world really is watching as we seek to develop and implement the legislation. That is why it is vital that Ofcom is compelled to conduct and arrange its own research to ensure that we are getting an accurate picture of how our regulatory framework is affecting people. I stress to the Minister that it is imperative that Ofcom consults all service providers—big and small—a point that the CBI stressed to me in recent meetings.

We also welcome the provisions outlined in subsection (2) that confirm that Ofcom must include a statement of its research in its annual report to the Secretary of State and the devolved Administrations. It is important that Ofcom, as a regulator, takes a research-led approach, and Labour is pleased to see these provisions included in the Bill.

We welcome the inclusion of clause 133, which extends the communications panel’s remit to include online safety. This will mean that the panel is able to give advice on matters relating to different types of online content under the Bill, and on the impacts of online content on UK users of regulated services. It is a welcome step forward, so we have not sought to amend the clause.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - -

I want to make one short comment about clauses 132 and 133, which are really important. There is no intention to interfere with or fetter the way that Ofcom operates, but there is an obligation on this Committee, and on Parliament, to indicate what we would expect to see from Ofcom by way of the clauses, because they are an essential part of the transparency that we are trying to inject into the sector.

Research about users’ experiences is hugely important, and such reports contain important insights into how platforms are used, and the levels of misinformation and disinformation that people are exposed to. Ofcom already produces highly authoritative reports on various aspects of the online world, including the finding that three in four adults do not think about whether the online information that they see is truthful. Indeed, one in three adults believes that all or most information that they find online is truthful. We know that there is a significant gap between consumers’ perception and reality, so it is important to ensure that research has good exposure among those using the internet.

We do not often hear about the problems of how the online world works, and the level of disinformation and inaccuracy is not well known, so will the Minister elaborate on how he expects Ofcom to ensure that people are aware of the reality of the online world? Platforms will presumably be required to have regard to the content of Ofcom reports, but will Ofcom be required to publicise its reports? It is not clear that such a duty is in the Bill at the moment, so does the Minister expect Ofcom to have a role in educating people, especially children, about the problem of inaccurate data or other aspects of the online world?

We know that a number of platforms spend a great deal of money on going into schools and talking about their products, which may or may not entail accurate information. Does Ofcom not have an important role to play in this area? Educating users about the changes in the Bill would be another potential role for Ofcom in order to recalibrate users’ expectations as to what they might reasonably expect platforms to offer as a result of the legislation. It is important that we have robust regulatory frameworks in place, and this Bill clearly does that. However, it also requires users to be aware of the changes that have been made so that they can report the problems they experience in a timely manner.

Online Safety Bill (Fourteenth sitting)

Maria Miller Excerpts
Committee stage
Tuesday 21st June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 21 June 2022 - (21 Jun 2022)
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I would like to make a couple of comments. The shadow Minister mentioned education and prevention projects, which are key. In Scotland, our kids’ sex, health and relationship education in schools teaches consent from the earliest possible age. That is vital. We have a generation of men who think it is okay to send these images and not seek consent. As the shadow Minister said, the problem is everywhere. So many women have received images that they had no desire to see. They did not ask for them, and they did not consent to receive them, but they get them.

Requiring someone to prove the intent behind the offence is just impossible. It is so unworkable, and that makes it really difficult. This is yet another issue that makes it clear that we need to have reference to violence against women and girls on the face of the Bill. If that were included, we would not be making such a passionate case here. We would already have a code of conduct and assessments that have to take place on the basis of the specific harm to women and girls from such offences. We would not be making the case so forcefully because it would already be covered.

I wish the Minister would take on board how difficult it is for women and girls online, how much of an issue this specific action causes and how much pain and suffering it causes. It would be great if the Minister could consider moving somewhat on this issue in order to protect women and girls.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - -

I want to make sure that the record is clear that while I did receive a dick pic, I am not a millennial. That shows how widespread this problem is. My children would want that on the record.

Research done by YouGov showed that half of millennial women have been sent a photo of a penis, and that nine in 10 women who have ever received such a picture did not want to have it sent to them. To anybody who is trying to—I do not feel anybody today is—advocate that this is a small issue or a minority problem, the data suggest that it is not.

For the record, I think the reason I was sent that picture was not sexual at all. I think it was intimidatory. I was sitting in a train carriage on my way into Parliament on a hot day, and I think it was sent as intimidation because I could not leave that carriage and I had, in error, left my AirDrop on. Okay, that was my fault, but let us not victim blame.

I very much welcome the Minister’s approach, because he is the first person to take forward a series of new offences that are needed to clarify the law as it affects people in this area. As he was talking, I was reflecting on his use of the word “clarity”, and I think he is absolutely right. He is rightly looking to the Law Commission as the expert for how we interpret and how we get the most effective law in place.

Although we are not talking about the intimate image abuse recommendations in this part of the Bill, I draw to the Committee’s attention that I, and others, will have received an email from the Law Commission today setting out that it will bring forward its recommendations next month. I hope that that means that the Minister will bring forward something concrete to us about those particular offences in the coming weeks. He is right that when it comes to cyber-flashing, we need to get it right. We need to make sure that we follow the experts. The Law Commission was clear when it undertook its review that the current law does not adequately address these issues. I was pleased when it made that recommendation.

A great many people have looked at these issues, and I pay tribute to each and every one of them, though they come to slightly different conclusions about how we interpret the Law Commission’s recommendations and how we move forward. Professor Clare McGlynn is an expert. Bumble has done work on this; my hon. Friend the Member for Brecon and Radnorshire (Fay Jones) has done a great deal of work too, and I recognise her contribution.

The offence is particularly pernicious because it is as prevalent as indecent exposure. It is right that the offence is recognised in the Sexual Offences Act 2003 as a result. As the hon. Member for Pontypridd said, it is another form of gendered crime online. On the harm that it causes, she referenced the evidence that we got from Professor McGlynn about Gaia Pope, which was particularly concerning. I do not think any of us in the Committee would argue that this is not the most serious of offences, and I commend the Minister for bringing forward a serious set of recommendations to tackle it.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

It is a fair question. There might be circumstances in which somebody simply misjudges a situation—has not interpreted it correctly—and ends up committing a criminal offence; stumbling into it almost by accident. Most criminal offences require some kind of mens rea—some kind of intention to commit a criminal offence. If a person does something by accident, without intention, that does not normally constitute a criminal offence. Most criminal offences on the statute book require the person committing the offence to intend to do something bad. If we replace the word “intent” with “without consent”, the risk is that someone who does something essentially by accident will have committed a criminal offence.

I understand that the circumstances in which that might happen are probably quite limited, and the context of the incidents that the hon. Member for Pontypridd and my right hon. Friend the Member for Basingstoke have described would generally support the fact that there is a bad intention, but we have to be a little careful not accidentally to draw the line too widely. If a couple are exchanging images, do they have to consent prior to the exchange of every single image? We have to think carefully about such circumstances before amending the clause.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

I have to say, just as an aside, that the Minister has huge levels of empathy, so I am sure that he can put himself into the shoes of someone who receives such an image. I am not a lawyer, but I know that there is a concept in law of acting recklessly, so if someone acts recklessly, as my hon. Friend has set out in his Bill, they can be committing a criminal offence. That is why I thought he might want to consider not having the conditional link between the two elements of subsection (1)(b), but instead having them as an either/or. If he goes back to the Law Commission’s actual recommendations, rather than the interpretation he was given by the MOJ, he will see that they set out that one of the conditions should be that defendants who are posting in this way are likely to cause harm. If somebody is acting in a way that is likely to cause harm, they would be transgressing. The Bill acknowledges that somebody can act recklessly. It is a well-known concept in law that people can be committing an offence if they act recklessly—reckless driving, for example. I wonder whether the Minister might think about that, knowing how difficult it would be to undertake what the hon. Member for Pontypridd is talking about, as it directly contravenes the Law Commission’s recommendations. I do not think what I am suggesting would contravene the Law Commission’s recommendations.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I will commit to consider the clause further, as my right hon. Friend has requested. It is important to do so in the context of the Law Commission’s recommendations, but she has pointed to wording in the Law Commission’s original report that could be used to improve the drafting here. I do not want to make a firm commitment to change, but I will commit to considering whether the clause can be improved upon. My right hon. Friend referred to the “likely to cause harm” test, and asked whether recklessness as to whether someone suffers alarm, distress or humiliation could be looked at as a separate element. We need to be careful; if we sever that from sexual gratification, we need to have some other qualification on sexual gratification. We might have sexual gratification with consent, which would be fine. If we severed them, we would have to add another qualification.

It is clear that there is scope for further examination of clause 156. That does not necessarily mean it will be possible to change it, but it is worth examining it further in the light of the comments made by my right hon. Friend. The testimony we heard from witnesses, the testimony of my right hon. Friend and what we heard from the hon. Member for Pontypridd earlier do demonstrate that this is a widespread problem that is hugely distressing and intrusive and that it represents a severe violation. It does need to be dealt with properly.

We need to be cognisant of the fact that in some communities there is a culture of these kinds of pictures being freely exchanged between people who have not met or communicated before—on some dating websites, for example. We need to draft the clause in such a way that it does not inadvertently criminalise those communities—I have been approached by members of those communities who are concerned.

Online Safety Bill (Fifteenth sitting)

Maria Miller Excerpts
Committee stage
Thursday 23rd June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 23 June 2022 - (23 Jun 2022)
Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I very much agree with my hon. Friend. She is quite right: we have to remember that we do not see these things as children and young people do.

The user advocacy body that my hon. Friend has just spoken in support of could also shine a light on the practices that are most harmful to children by using data, evidence and specialist expertise to point to new and emerging areas of harm. That would enable the regulator to ensure its risk profiles and regulatory approach remain valid and up to date. In his evidence, Andy Burrows of the NSPCC highlighted the importance of an advocacy body acting as an early warning system:

“Given the very welcome systemic approach of the regime, that early warning function is particularly important, because there is the potential that if harms cannot be identified quickly, we will see a lag where whole regulatory cycles are missed. User advocacy can help to plug that gap, meaning that harms are identified at an earlier stage, and then the positive design of the process, with the risk profiles and company risk assessments, means that those harms can be built into that particular cycle.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 16, Q22.]

The provision in the new clause is comparable to those that already exist in many other sectors. For example, Citizens Advice is the statutory user advocate for consumers of energy and the postal services, and there are similar arrangements representing users of public transport. Establishing a children’s user advocacy body would ensure that the most vulnerable online users of all—children at risk of online sexual abuse—receive equivalent protections to customers of post offices or passengers on a bus.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - -

The hon. Lady will recall the issue that I raised earlier in the Committee’s deliberations, regarding the importance of victim support that gives people somewhere to go other than the platforms. I think that is what she is now alluding to. Does she not believe that the organisations that are already in place, with the right funding—perhaps from the fines coming from the platforms themselves—would be in a position to do this almost immediately, and that we should not have to set up yet another body, or have I misunderstood what she has said?

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I do not think that the right hon. Lady has misunderstood what I said. I said that the new clause would allow the Secretary of State to appoint a new or existing body as the statutory user advocate, so it could very much be either.

New clause 3 would also rebalance the interests of children against the vocal and well-resourced regulated companies. I think that is a key argument for having an advocacy body. Without such a counterbalance, large tech companies could attempt to capture independent expert voices, fund highly selective research with the intent to skew the evidence base, and then challenge regulatory decisions with the evidence base they have created.

Those tactics are not new; similar tactics are used in other regulated sectors, such as the tobacco industry. In line with other sectors, the user advocacy body should be funded by a levy on regulated companies. That would be in line with the “polluter pays” principle in part 6 and would be neutral to the Exchequer—another reason to accept it. Compared with the significant benefits and improved outcomes it would create, the levy would represent only a minimal additional burden on companies.

There is strong support for the creation of a user advocate. Research by the NSPCC shows that 88% of UK adults who responded to a YouGov survey think that it is necessary for the Bill to introduce a requirement for an independent body that can protect the interests of children at risk of online harms, including grooming and child sexual abuse.

It is also a popular option among children. YoungMinds has said that young people do not feel they are being included enough in the drafting of the Bill. It evidenced that with its own research, which found that almost 80% of young people aged 11 to 25 surveyed had never even heard of the Bill.

A young woman told the NSPCC why she felt a children’s advocacy body is needed. She is a survivor of online grooming, and it is worth sharing what she said in full, because it is powerful and we have not shared the voices of young people enough. She said:

“When I was 13, a man in his 30s contacted me on Facebook. I added him because you just used to add anyone on Facebook. He started messaging me and I liked the attention. We’d speak every day, usually late at night for hours at a time…He started asking for photos, so I sent some. Then he asked for some explicit photos, so I did that too, and he reciprocated…In my eyes, telling anyone in my life about this man was not an option. We need to stop putting the responsibility on a vulnerable child to prevent crime and start living in a world which puts keeping children safe first. That means putting child safety at the heart of policy. I want a statutory child user advocacy body funded by the industry levy. This would play a vital role in advocating for children’s rights in regulatory debates. Being groomed made me feel incredibly vulnerable, isolated, and weak. I felt I had no one who was on my side. Having a body stand up for the rights of children in such a vulnerable position is invaluable…it is so rare that voices like mine have a chance to be heard by policy makers. Watching pre legislative debates I’ve been struck by how detached from my lived experience they can be”—

that is very much the point that my hon. Friend the Member for Batley and Spen made—

“and indeed the lived experiences of thousands of others. If we want to protect children, we need to understand and represent what they need.”

I hope that the Committee will recognise the bravery of that young woman in speaking about her experiences as a survivor of online grooming. I hope that the Minister will respect the insights she offers and consider the merits of having a user advocacy body to support children and young people experiencing harms online.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I absolutely agree. I had not thought about it in those terms, but the hon. Member is right that the new clause gives greater importance to those protected characteristics and lays that out in the Bill.

I appreciate that, under the risk assessment duties set out in the Bill, organisations have to look at protected characteristics in groups and at individuals with those protected characteristics, which I welcome, but I also welcome the inclusion of protected characteristics in the new clause in relation to the duties of the advocacy body. I think that is really important, especially, as the hon. Member for Batley and Spen just said, in relation to the positive aspects of the internet. It is about protecting free speech for children and young people and enabling them to find community and enjoy life online and offline.

Will the Minister give serious consideration to the possibility of a user advocacy body? Third sector organisations are calling for that, and I do not think Ofcom could possibly have the expertise to match such a body.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

I want briefly to interject to underline the point I made in my intervention on the hon. Member for Worsley and Eccles South. I welcome the discussion about victims’ support, which picks up on what we discussed on clause 110. At that point I mentioned the NSPCC evidence that talked about the importance of third party advocacy services, due to the lack of trust in the platforms, as well as for some of the other reasons that the hon. Members for Worsley and Eccles South, for Batley and Spen, and for Aberdeen North have raised.

When we discussed clause 110, the Minister undertook to think about the issue seriously and to talk to the Treasury about whether funding could be taken directly from fines rather than those all going into the Treasury coffers. I hope the debate on new clause 3 will serve to strengthen his resolve, given the strength of support for such a measure, whether that is through a formal user advocacy service or by using existing organisations. I hope he uses the debate to strengthen his arguments about such a measure with the Treasury.

I will not support the new clause tabled by the hon. Member for Worsley and Eccles South, because I think the Minister has already undertaken to look at this issue. As I say, I hope this discussion strengthens his resolve to do so.

--- Later in debate ---
Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I wholeheartedly agree with what the hon. Member for Aberdeen North just said, but I wish to emphasise some elements because it seems to me that the Minister was not listening, although he has listened to much that has been said. I made some specific points, used quotes and brought forward some evidence. He feels that children have been consulted in the drafting of the Bill; I cited a YoungMinds survey that showed that that was very much not what young people feel. YoungMinds surveyed a large group of young people and a very large proportion of them had not even heard of the Bill.

The evidence of the young survivor of online grooming was very powerful. She very much wanted a user-advocacy body and spoke strongly about that. The Minister is getting it wrong if he thinks that somebody in that situation, who has been groomed, would go to a parent. The quote that I cited earlier was:

“Being groomed made me feel incredibly vulnerable, isolated, and weak. I felt I had no one who was on my side.”

There were clearly adults in her life she could have gone to, but she did not because she was in that vulnerable position—a position of weakness. That is why some kind of independent advocacy body for children is so important.

I do not think children and young people do feel consulted about the Bill because the organisations and charities are telling us that. I join all Opposition Members in supporting and paying tribute to the remarkable job that the Children’s Commissioner does. I quoted her setting out her worries about the Bill. I quoted her saying that

“the Bill does not do enough to respond to individual cases of abuse and that it needs to do more to understand issues and concerns directly from children.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 16, Q22.]

That is what she said. She did not say, “I’m the person charged with doing this. I’m the person who has the resource and my office has the resource.”

Maria Miller Portrait Dame Maria Miller
- Hansard - -

I hope that I did not in any way confuse the debate earlier, because these two things are very separate. The idea of a user-advocacy service and individual victim support are two separate issues. The Minister has already taken up the issue of victim support, which is what the Children’s Commissioner was talking about, but that is separate from advocacy, which is much broader and not necessarily related to an individual problem.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Indeed, but the Children’s Commissioner was very clear about certain elements being missing in the Bill, as are the NSPCC and other organisations. It is just not right for the Minister to land it back with the Children’s Commissioner as part of her role, because she has to do so many other things. The provisions in the Bill in respect of a parent or adult assisting a young person in a grooming situation are a very big concern. The Children’s Commissioner cited her own survey of 2,000 children, a large proportion of whom had not succeeded in getting content about themselves removed. From that, we see that she understands that the problem exists. We will push the new clause to a Division.

Question put, That the clause be read a Second time.

Online Safety Bill (Sixteenth sitting)

Maria Miller Excerpts
Committee stage
Tuesday 28th June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 28 June 2022 - (28 Jun 2022)
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The shadow Minister is right to raise the issue of women and girls being disproportionately—one might say overwhelmingly—the victims of certain kinds of abuse online. We heard my right hon. Friend the Member for Basingstoke, the shadow Minister and others set that out in a previous debate. The shadow Minister is right to raise the issue.

Tackling violence against women and girls has been a long-standing priority of the Government. Indeed, a number of important new offences have already been and are being created, with protecting women principally in mind—the offence of controlling or coercive behaviour, set out in the Serious Crime Act 2015 and amended in the Domestic Abuse Act 2021; the creation of a new stalking offence in 2012; a revenge porn offence in 2015; and an upskirting offence in 2019. All of those offences are clearly designed principally to protect women and girls, who are overwhelmingly the victims of those offences. Indeed, the cyber-flashing offence created by clause 156—the first time we have ever had such an offence in this jurisdiction—will, again, overwhelmingly benefit women and girls, who are the victims of that offence.

All of the criminal offences I have mentioned—even if they are not mentioned in schedule 7, which I will come to in a moment—will automatically flow into the Bill via the provisions of clause 52(4)(d). Criminal offences where the victim is an individual, which these clearly all are, automatically flow into the provisions of the Bill, including the offences I just listed, which have been created particularly with women in mind.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - -

I hope that my hon. Friend will discuss the Law Commission’s recommendations on intimate image abuse. When I raised this issue in an earlier sitting, he was slightly unsighted by the fact that the recommendations were about to come out—I can confirm again that they will come out on 7 July, after some three years of deliberation. It is unfortunate that that will be a week after the end of the Committee’s deliberations, and I hope that the timing will not preclude the Minister from mopping it up in his legislation.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank my right hon. Friend for her question and for her tireless work in this area. As she says, the intimate image abuse offence being worked on is an extremely important piece in the jigsaw puzzle to protect women, particularly as it has as its threshold—at least in the previous draft—consent, without any test of intent, which addresses some points made by the Committee previously. As we have discussed before, it is a Ministry of Justice lead, and I am sure that my right hon. Friend will make representations to MOJ colleagues to elicit a rapid confirmation of its position on the recommendations, so that we can move to implement them as quickly as possible.

I remind the Committee of the Domestic Abuse Act 2021, which was also designed to protect women. Increased penalties for stalking and harassment have been introduced, and we have ended the automatic early release of violent and sex offenders from prison—something I took through Parliament as a Justice Minister a year or two ago. Previously, violent and sex offenders serving standard determinate sentences were often released automatically at the halfway point of their sentence, but we have now ended that practice. Rightly, a lot has been done outside the Bill to protect women and girls.

Let me turn to what the Bill does to further protect women and girls. Schedule 7 sets out the priority offences—page 183 of the Bill. In addition to all the offences I have mentioned previously, which automatically flow into the illegal safety duties, we have set out priority offences whereby companies must not just react after the event, but proactively prevent the offence from occurring in the first place. I can tell the Committee that many of them have been selected because we know that women and girls are overwhelmingly the victims of such offences. Line 21 lists the offence of causing

“intentional harassment, alarm or distress”.

Line 36 mentions the offence of harassment, and line 37 the offence of stalking. Those are obviously offences where women and girls are overwhelmingly the victims, which is why we have picked them out and put them in schedule 7—to make sure they have the priority they deserve.

Online Safety Bill (Seventeenth sitting)

Maria Miller Excerpts
Committee stage
Tuesday 28th June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 28 June 2022 - (28 Jun 2022)
None Portrait The Chair
- Hansard -

Before I ask Alex Davies-Jones whether she wishes to press the new clause to a vote, I thank you all for the very respectful way in which you have conducted proceedings. It is much appreciated. Let me say on behalf of Sir Roger and myself that it has been an absolute privilege to co-chair this Bill Committee.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - -

On a point of order, Ms Rees. On behalf of the Back Benchers, I thank you and Sir Roger for your excellent chairpersonships, and the Minister and shadow Ministers for the very courteous way in which proceedings have taken place. It has been a great pleasure to be a member of the Bill Committee.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I am content with the Minister’s assurance that the provisions of new clause 41 are covered in the Bill, and therefore do not wish to press it to a vote. I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Schedule 2

Recovery of OFCOM’s initial costs

Recovery of initial costs

1 (1) This Schedule concerns the recovery by OFCOM of an amount equal to the aggregate of the amounts of WTA receipts which, in accordance with section 401(1) of the Communications Act and OFCOM’s statement under that section, are retained by OFCOM for the purpose of meeting their initial costs.

(2) OFCOM must seek to recover the amount described in sub-paragraph (1) (“the total amount of OFCOM’s initial costs”) by charging providers of regulated services fees under this Schedule (“additional fees”).

(3) In this Schedule—

“initial costs” means the costs incurred by OFCOM before the day on which section 75 comes into force on preparations for the exercise of their online safety functions;

“WTA receipts” means the amounts described in section 401(1)(a) of the Communications Act which are paid to OFCOM (certain receipts under the Wireless Telegraphy Act 2006).

Recovery of initial costs: first phase

2 (1) The first phase of OFCOM’s recovery of their initial costs is to take place over a period of several charging years to be specified in regulations under paragraph 7 (“specified charging years”).

(2) Over that period OFCOM must, in aggregate, charge providers of regulated services additional fees of an amount equal to the total amount of OFCOM’s initial costs.

(3) OFCOM may not charge providers additional fees in respect of any charging year which falls before the first specified charging year.

(4) OFCOM may require a provider to pay an additional fee in respect of a charging year only if the provider is required to pay a fee in respect of that year under section 71 (and references in this Schedule to charging providers are to be read accordingly).

(5) The amount of an additional fee payable by a provider is to be calculated in accordance with regulations under paragraph 7.

Further recovery of initial costs

3 (1) The second phase of OFCOM’s recovery of their initial costs begins after the end of the last of the specified charging years.

(2) As soon as reasonably practicable after the end of the last of the specified charging years, OFCOM must publish a statement specifying—

(a) the amount which is at that time the recoverable amount (see paragraph 6), and

(b) the amounts of the variables involved in the calculation of the recoverable amount.

(3) OFCOM’s statement must also specify the amount which is equal to that portion of the recoverable amount which is not likely to be paid or recovered. The amount so specified is referred to in sub-paragraphs (4) and (5) as “the outstanding amount”.

(4) Unless a determination is made as mentioned in sub-paragraph (5), OFCOM must, in aggregate, charge providers of regulated services additional fees of an amount equal to the outstanding amount.

(5) The Secretary of State may, as soon as reasonably practicable after the publication of OFCOM’s statement, make a determination specifying an amount by which the outstanding amount is to be reduced, and in that case OFCOM must, in aggregate, charge providers of regulated services additional fees of an amount equal to the difference between the outstanding amount and the amount specified in the determination.

(6) Additional fees mentioned in sub-paragraph (4) or (5) must be charged in respect of the charging year immediately following the last of the specified charging years (“year 1”).

(7) The process set out in sub-paragraphs (2) to (6) is to be repeated in successive charging years, applying those sub-paragraphs as if—

(a) in sub-paragraph (2), the reference to the end of the last of the specified charging years were to the end of year 1 (and so on for successive charging years);

(b) in sub-paragraph (6), the reference to year 1 were to the charging year immediately following year 1 (and so on for successive charging years).

(8) Any determination by the Secretary of State under this paragraph must be published in such manner as the Secretary of State considers appropriate.

(9) Sub-paragraphs (4) and (5) of paragraph 2 apply to the charging of additional fees under this paragraph as they apply to the charging of additional fees under that paragraph.

(10) The process set out in this paragraph comes to an end in accordance with paragraph 4.

End of the recovery process

4 (1) The process set out in paragraph 3 comes to an end if a statement by OFCOM under that paragraph records that—

(a) the recoverable amount is nil, or

(b) all of the recoverable amount is likely to be paid or recovered.

(2) Or the Secretary of State may bring that process to an end by making a determination that OFCOM are not to embark on another round of charging providers of regulated services additional fees.

(3) The earliest time when such a determination may be made is after the publication of OFCOM’s first statement under paragraph 3.

(4) A determination under sub-paragraph (2)—

(a) must be made as soon as reasonably practicable after the publication of a statement by OFCOM under paragraph 3;

(b) must be published in such manner as the Secretary of State considers appropriate.

(5) A determination under sub-paragraph (2) does not affect OFCOM’s power—

(a) to bring proceedings for the recovery of the whole or part of an additional fee for which a provider became liable at any time before the determination was made, or

(b) to act in accordance with the procedure set out in section 120 in relation to such a liability.

Providers for part of a year only

5 (1) For the purposes of this Schedule, the “provider” of a regulated service, in relation to a charging year, includes a person who is the provider of the service for part of the year.

(2) Where a person is the provider of a regulated service for part of a charging year only, OFCOM may refund all or part of an additional fee paid to OFCOM under paragraph 2 or 3 by that provider in respect of that year.

Calculation of the recoverable amount

6 For the purposes of a statement by OFCOM under paragraph 3, the “recoverable amount” is given by the formula—

C – (F – R) – D

where—

C is the total amount of OFCOM’s initial costs,

F is the aggregate amount of the additional fees received by OFCOM at the time of the statement in question,

R is the aggregate amount of the additional fees received by OFCOM that at the time of the statement in question have been, or are due to be, refunded (see paragraph 5(2)), and

D is the amount specified in a determination made by the Secretary of State under paragraph 3 (see paragraph 3(5)) at a time before the statement in question or, where more than one such determination has been made, the sum of the amounts specified in those determinations.

If no such determination has been made before the statement in question, D = 0.

Regulations about recovery of initial costs

7 (1) The Secretary of State must make regulations making such provision as the Secretary of State considers appropriate in connection with the recovery by OFCOM of their initial costs.

(2) The regulations must include provision as set out in sub-paragraphs (3), (4) and (6).

(3) The regulations must specify the total amount of OFCOM’s initial costs.

(4) For the purposes of paragraph 2, the regulations must specify—

(a) the charging years in respect of which additional fees are to be charged, and

(b) the proportion of the total amount of initial costs which OFCOM must seek to recover in each of the specified charging years.

(5) The following rules apply to provision made in accordance with sub-paragraph (4)(a)—

(a) the initial charging year may not be specified;

(b) only consecutive charging years may be specified;

(c) at least three charging years must be specified;

(d) no more than five charging years may be specified.

(6) The regulations must specify the computation model that OFCOM must use to calculate fees payable by individual providers of regulated services under paragraphs 2 and 3 (and that computation model may be different for different charging years).

(7) The regulations may make provision about what OFCOM may or must do if the operation of this Schedule results in them recovering more than the total amount of their initial costs.

(8) The regulations may amend this Schedule or provide for its application with modifications in particular cases.

(9) Before making regulations under this paragraph, the Secretary of State must consult—

(a) OFCOM,

(b) providers of regulated user-to-user services,

(c) providers of regulated search services,

(d) providers of internet services within section 67(2), and

(e) such other persons as the Secretary of State considers appropriate.

Interpretation

8 In this Schedule—

“additional fees” means fees chargeable under this Schedule in respect of the recovery of OFCOM’s initial costs;

“charging year” has the meaning given by section 76;

“initial charging year” has the meaning given by section 76;

“initial costs” has the meaning given by paragraph 1(3), and the “total amount” of initial costs means the amount described in paragraph 1(1);

“recoverable amount” has the meaning given by paragraph 6;

“specified charging year” means a charging year specified in regulations under paragraph 7 for the purposes of paragraph 2.” —(Chris Philp.)

This new Schedule requires Ofcom to seek to recover their costs which they have incurred (before clause 75 comes into force) when preparing to take on functions as the regulator of services under the Bill by charging fees to providers of services.

Brought up, read the First and Second time, and added to the Bill.
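
For readers working through the fee-recovery mechanism, the calculation in paragraph 6 of the new schedule is a simple running balance. The sketch below is purely illustrative: the figures are hypothetical and the function name is an editorial invention, not anything taken from the Bill or from Ofcom.

```python
# Illustrative sketch of the "recoverable amount" formula in paragraph 6
# of the new schedule: C - (F - R) - D. All figures below are hypothetical.

def recoverable_amount(initial_costs, fees_received, fees_refunded, determinations):
    """C - (F - R) - D, where D is the sum of any Secretary of State
    determinations made before the statement in question (0 if none)."""
    return initial_costs - (fees_received - fees_refunded) - sum(determinations)

# Hypothetical example: £100m initial costs, £60m additional fees received,
# £5m of those fees refunded, and one £10m reduction determined by the
# Secretary of State.
print(recoverable_amount(100_000_000, 60_000_000, 5_000_000, [10_000_000]))
# -> 35000000: the recoverable amount OFCOM would report in its next statement.
```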

Online Safety Bill

Maria Miller Excerpts
Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- View Speech - Hansard - -

My hon. Friend did not have the joy of being on the Bill Committee, as I did with my hon. Friend the Member for Croydon South (Chris Philp), who was the Minister at that point. The point that my hon. Friend has just made about free speech is so important for women and girls who are not able to go online because of the violent abuse that they receive, and that has to be taken into account by those who seek to criticise the Bill. We have to make sure that people who currently feel silenced do not feel silenced in future and can participate online in the way that they should be able to do. My hon. Friend is making an excellent point and I welcome him to his position.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

My right hon. Friend is entirely right on that point. The structure of the Bill is very simple. There is a legal priority of harms, and things that are illegal offline will be regulated online at the level of the criminal threshold. There are protections for freedom of speech and there is proper transparency about harmful content, which I will come on to address.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I thank the hon. Gentleman, who is absolutely right. In Committee, we debated at length the impact search engines have, and they should be included in the Bill’s categorisation of difficult issues. In one recent example on a search engine, the imagery that comes up when we search for desk ornaments is utterly appalling and needs to be challenged and changed. If we are to truly tackle antisemitism, racism and extremist content online, then the provisions need to be included in the Bill, and journalistic exemptions should not apply to this type of content. Often, they operate more discreetly and are less likely to attract sanctions. Furthermore, any amendment will provide no answer to the many extremist publishers who seek to exploit the terms of the exemption. For those reasons, we need to go further.

The amendments are not a perfect or complete solution. Deficiencies remain, and the amendments do not address the fact that the exemption continues to exclude dozens of independent local newspapers around the country on the arbitrary basis that they have no fixed address. The Independent Media Association, which represents news publishers, describes the news publisher criteria as

“punishing quality journalism with high standards”.

I hope the Minister will reflect further on that point. As a priority, we need to ensure that the exemption cannot be exploited by bad actors. We must not give a free pass to those propagating racist, misogynistic or antisemitic harm and abuse. By requiring some standards of accountability for news providers, however modest, the amendments are an improvement on the Bill as drafted. In the interests of national security and the welfare of the public, we must support the amendments.

Finally, I come to a topic that I have spoken about passionately in this place on a number of occasions and that is extremely close to my heart: violence against women and girls. Put simply, in their approach to the Bill the Government are completely failing and falling short in their responsibilities to keep women and girls safe online. Labour has been calling for better protections for some time now, yet still the Government are failing to see the extent of the problem. They have only just published an initial indicative list of priority harms to adults, in a written statement that many colleagues may have missed. While it is claimed that this will add to scrutiny and debate, the final list of harms will not be on the face of the Bill but will be included in secondary legislation after the Bill has received Royal Assent. Non-designated content that is harmful will not require action on the part of service providers, even though by definition it is still extremely harmful. How can that be acceptable?

Many campaigners have made the case that protections for women and girls are not included in the draft Bill at all, a concern supported by the Petitions Committee in its report on online abuse. Schedule 7 includes a list of sexual offences and aggravated offences, but the Government have so far made no concessions here and the wider context of violence against women and girls has not been addressed. That is why I urge the Minister to carefully consider our new clause 3, which seeks to finally name violence against women and girls as a priority harm. The Minister’s predecessor said in Committee that women and girls receive “disproportionate” levels of abuse online. The Minister in his new role will likely be well briefed on the evidence, and I know this is an issue he cares passionately about. The case has been put forward strongly by hon. Members on all sides of the House, and the message is crystal clear: women and girls must be protected online, and we see this important new clause as the first step.

Later on, we hope to see the Government move further and acknowledge that there must be a code of practice on tackling violence against women and girls content online.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

The hon. Lady raises the issue of codes of practice. She will recall that in Committee we talked about that specifically and pressed the then Minister on that point. It became very clear that Ofcom would be able to issue a code of practice on violence against women and girls, which she talked about. Should we not be seeking an assurance that Ofcom will do that? That would negate the need to amend the Bill further.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I welcome the right hon. Lady’s comments. We did discuss this at great length in Committee, and I know she cares deeply and passionately about this issue, as do I. It is welcome that Ofcom can issue a code of practice on violence against women and girls, and we should absolutely be urging it to do that, but we also need to make it a fundamental aim of the Bill. If the Bill is to be truly world leading, if it is truly to make us all safe online, and if we are finally to begin to tackle the scourge of violence against women and girls in all its elements—not just online but offline—then violence against women and girls needs to be named as a priority harm in the Bill. We need to take the brave new step of saying that enough is enough. Words are not enough. We need actions, and this is an action the Minister could take.

Online Safety Bill

Maria Miller Excerpts
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I will make a bit of progress, because I am testing Mr Speaker’s patience.

We are making a number of technical amendments to ensure that the new communications offences are targeted and effective. New clause 52 seeks to narrow the exemptions for broadcast and wireless telegraphy licence holders and providers of on-demand programme services, so that the licence holder is exempt only to the extent that communication is within the course of a licensed activity. A separate group of technical amendments ensure that the definition of sending false and threatening communications will capture all circumstances—that is far wider than we have at the moment.

We propose a number of consequential amendments to relevant existing legislation to ensure that new offences operate consistently with the existing criminal law. We are also making a number of wider technical changes to strengthen the enforcement provisions and ensure consistency with other regulatory frameworks. New clause 42 ensures that Ofcom has the power to issue an enforcement notice to a former service provider, guarding against service providers simply shutting down their business and reappearing in a slightly different guise to avoid regulatory sanction. A package of Government amendments will set out how the existing video-sharing platform regime will be repealed and the transitional provisions that will apply to those providers as they transition to the online safety framework.

Finally, new clause 40 will enable the CMA to share information with Ofcom for the purpose of facilitating Ofcom’s online safety functions. That will help to ensure effective co-operation between Ofcom and the CMA.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - -

I thank my hon. Friend for giving way. In the past 40 minutes or so, he has demonstrated the complexity of the changes that are being proposed for the Bill, and he has done a very good job in setting that out. However, will he join me and many other right hon. and hon. Members who feel strongly that a Standing Committee should look at the Bill’s implementation, because of the complexities that he has so clearly demonstrated? I know that is a matter for the House rather than our consideration of the Bill, but I hope that other right hon. and hon. Members will join me in looking for ways to put that right. We need to be able to scrutinise the measures on an ongoing basis.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Indeed, there will be, and are, review points in the Bill. I have no doubt that my right hon. Friend will raise that on other occasions as well.

I want to ensure that there is plenty of time for Members to debate the Bill at this important stage, and I have spoken for long enough. I appreciate the constructive and collaborative approach that colleagues have taken throughout the Bill’s passage.

--- Later in debate ---
Julian Knight Portrait Julian Knight
- Hansard - - - Excerpts

My right hon. Friend is correct. We spoke privately before this debate, and he said this is almost five Bills in one. There will be a patchwork of legislation, and there is a time limit. This is a carry-over Bill, and we have to get it on the statute book.

This Bill is not perfect by any stretch of the imagination, and I take the Opposition’s genuine concerns about legal but harmful material. The shadow Minister mentioned the tragic case of Molly Russell. I heard her father being interviewed on the “Today” programme, and he spoke about how at least three quarters of the content he had seen that had prompted that young person to take her life had been legal but harmful. We have to stand up, think and try our best to ensure there is a safer space for young people. This Bill does part of that work, but only part. The work will be done in the execution of the Bill, through the wording on age verification and age assurance.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

Given the complexities of the Bill, and given the Digital, Culture, Media and Sport Committee’s other responsibilities, will my hon. Friend join me in saying there should be a special Committee, potentially of both Houses, to keep this area under constant review? That review, as he says, is so badly needed.

Julian Knight Portrait Julian Knight
- Hansard - - - Excerpts

I thank my right hon. Friend for her question, which I have previously addressed. The problem is the precedent it would set. Any special Committee set up by a Bill would be appointed by the Whips, so we might as well forget about the Select Committee system. This is not a huge concern for the Digital, Culture, Media and Sport Committee, because the advent of any such special Committee would probably be beyond the next general election, and I am not thinking to that timeframe. I am concerned about the integrity of Parliament. The problem is that if we do that in this Bill, the next Government will come along and do it with another Bill and then another Bill. Before we know it, we will have a Select Committee system that is Whips-appointed and narrow in definition, and that cuts across something we all vote for.

There are means by which we can have legislative scrutiny—that is the point I am making in my speech. I would very much welcome a Committee being set up after a year, temporarily, to carry out post-legislative scrutiny. My Committee has a Sub-Committee on disinformation and fake news, which could also look at this Bill going forward. So I do not accept my right hon. Friend’s point, but I appreciate completely the concerns about our needing proper scrutiny in this area. We must also not forget that any changes to Ofcom’s parameters can be put in a statutory instrument, which can be prayed against by the Opposition, and thus we would have the scrutiny of the whole House in debate, which is preferable to having a Whips-appointed Committee.

I have gone into quite a bit of my speech there, so I am grateful for that intervention in many respects. I am not going to touch on every aspect of this issue, but I urge right hon. and hon. Members in all parts of the House to think about the fact that although this is far from perfect legislation and it is a shame that we have not found a way to work through the legal but harmful material issue, we have to understand the parameters we are working in, in the real world, with these companies. We need to see that there is a patchwork of legislation, and the biggest way in which we can effectively let the social media companies know they have skin in the game in society—a liberal society that created them—is through competition legislation, across other countries and other jurisdictions. I am talking about our friends in the European Union and in the United States. We are working together closely now to come up with a suite of competition legislation. That is how we will be able to cover off some of this going forward. I will be supporting this Bill tonight and I urge everyone to do so, because, frankly, after five years I have had enough.

--- Later in debate ---
I am not on my pity pot here; this is not about me. It is happening all over Scotland. Women in work are being forced out of employment. If Governments north and south of the border are to tackle online harms, we must follow through with responsible legislation. Only last week, the First Minister of Scotland, who denied any validity to the concerns I raised in 2019, eventually admitted they were true. But her response must be to halt her premature and misguided legislation, which is without any protection for the trans community, women or girls. We must make the connection from online harms all the way through to meaningful legislation at every stage.
Maria Miller Portrait Dame Maria Miller
- View Speech - Hansard - -

I rise to speak to the seven new clauses in my name and those of right hon. and hon. Members from across the House. The Government have kindly said publicly that they are minded to listen to six of the seven amendments that I have tabled on Report. I hope they will listen to the seventh, too, once they have heard my compelling arguments.

First, I believe it is important that we discuss these amendments, because the Government have not yet tabled amendments. It is important that we in this place understand the Government’s true intention on implementing the Law Commission review in full before the Bill completes its consideration.

Secondly, the law simply does not properly recognise as a criminal offence the posting online of intimate images—whether real or fake—without consent. Victims say that having a sexual image of them posted online without their consent is akin to a sexual assault. Indeed, Clare McGlynn went even further by saying that there is a big difference between a physical sexual assault and one committed online: victims are always rediscovering the online images and waiting for them to be redistributed, and cannot see when the abuse will be over. In many ways, it is even more acute.

Just in case anybody in the Chamber is unaware of the scale of the problem after the various contributions that have been made, in the past five years more than 12,000 people reported to the revenge porn helpline almost 200,000 pieces of content that fall into that category. Indeed, since 2014 there have been 28,000 reports to the police of intimate images being distributed without consent.

The final reason why I believe it is important that we discuss the new clauses is that Ofcom will be regulating online platforms based on their adherence to the criminal law, among other things. It is so important that the criminal law actually recognises where criminal harm is done, but at the moment, when it comes to intimate image abuse, it does not. Throughout all the stages of the Bill’s passage, successive Ministers have said very positive things to me about the need to address this issue in the criminal law, but we still have not seen pen being put to paper, so I hope the Minister will forgive me for raising this yet again so that he can respond.

New clauses 45 to 50 simply seek to take the Law Commission’s recommendations on intimate image abuse and put them into law as far as the scope of the Bill will allow. New clause 45 would create a base offence for posting explicit images online without consent. Basing the offence on consent, or the lack of it, makes it comparable with three out of four offences already recognised in the Sexual Offences Act 2003. Subsection (10) of the new clause recognises that it is a criminal offence to distribute fake images, deepfakes or images using nudification software, which are currently not covered in law at all.

New clauses 46 and 47 recognise cases where there is a higher level of culpability for the perpetrator, where they intend to cause alarm, distress or humiliation. Two in three victims report that they know the perpetrator, as a current or former partner. In evidence to the Public Bill Committee, on which I was very pleased to serve, we heard from the Angelou Centre and Imkaan that some survivors of this dreadful form of abuse are also at risk of honour-based violence. There are yet more layers of abuse.

New clause 48 would make it a crime to threaten to share an intimate image—this can be just as psychologically destructive as actually sharing it—and to use the image to coerce, control or manipulate the victim. I pay real tribute to the team from the Law Commission, under the leadership of Penney Lewis, who did an amazing job of work over three years on their inquiry to collect this information. In the responses to the inquiry there were four mentions of suicide or contemplated suicide as a result of threats to share these sorts of images online without consent. Around one in seven young women and one in nine young men have experienced a threat to share an intimate or sexual image. One in four calls to the Revenge Porn Helpline relate to threats to share. The list of issues goes on. In 2020 almost 3,000 people, mostly men, received demands for money related to sexual images—“sextortion”, as it is called. This new clause would make it clear that such threats are criminal, the police need to take action and there will be proper protection for victims in law.

New clauses 49 and 50 would go further. The Law Commission is clear that intimate image abuse is a type of sexual offending. Therefore, victims should have the same protection afforded to those of other sexual offences. That is backed up by the legal committee of the Council of His Majesty’s District Judges, which argues that it is appropriate to extend automatic lifetime anonymity protections to victims, just as they would be extended to victims of offences under the Modern Slavery Act 2015. Women’s Aid underlined that point, recognising that black and minoritised women are also at risk of being disowned, ostracised or even killed if they cannot remain anonymous. The special measures in these new clauses provide for victims in the same way as the Domestic Abuse Act 2021.

I hope that my hon. Friend the Minister can confirm that the Government intend to introduce the Law Commission’s full recommendations into the Bill, and that those in scope will be included before the Bill reaches its next stage in the other place. I also hope that he will outline how those measures not in scope of the Bill—specifically on the taking and making of sexual images without consent, which formed part of the Law Commission’s recommendations—will be addressed in legislation swiftly. I will be happy to withdraw my new clauses if those undertakings are made today.

Finally, new clause 23, which also stands in my name, is separate from the Law Commission’s recommendations. It would require a proportion of the fines secured by Ofcom to be used to fund victims’ services. I am sure that the Treasury thinks that it is an innovative way of handling things, although one could argue that it did something similar only a few days ago with regard to the pollution of waterways by water companies. I am sure that the Minister might want to refer to that.

The Bill identifies that many thousands more offences are committed as crimes than are currently recognised within law. I hope that the Minister can outline how appropriate measures will be put in place to ensure support for victims, who will now, possibly for the first time, have some measures in place to assist them. I raised earlier the importance of keeping the Bill and its effectiveness under review. I hope that the House will think about how we do that materially, so we do not end up having another five or 10 years without such a Bill and having to play catch-up in such a complex area.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

All that I can confirm is that we will work with my hon. Friend and with colleagues in the Home Office to make sure that this legislation works in the way that she intends.

We share with my right hon. Friend the Member for Basingstoke (Dame Maria Miller) the concern about the abuse of deepfake images and the need to tackle the sharing of intimate images where the intent is wider than that covered by current offences. We have committed to bring forward Government amendments in the Lords to do just that, and I look forward to working with her to ensure that, again, we get that part of the legislation exactly right.

We also recognise the intent behind my right hon. Friend’s amendment to provide funding for victim support groups via the penalties paid by entities for failing to comply with the regulatory requirements. Victim and survivor support organisations play a critical role in providing support and tools to help people rebuild their lives. That is why the Government continue to make record investments in this area, increasing the funding for victim and witness support services to £192 million a year by 2024-25. We want victim support services to provide consistency for victims requiring support.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

I thank my hon. Friend for giving way and for his commitment to look at this matter before the Bill reaches the House of Lords. Can he just clarify to me that it is his intention to implement the Law Commission’s recommendations that are within the scope of the Bill prior to the Bill reaching the House of Lords? If that is the case, I am happy to withdraw my amendments.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I cannot confirm today at what stage we will legislate. We will continue to work with my right hon. Friend and the Treasury to ensure that we get this exactly right. We will, of course, give due consideration to the Law Commission’s recommendations.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

Unless I am mistaken, no other stages of the Bill will come before the House where this can be discussed. Either it will be done or it will not. I had hoped that the Minister would answer in the affirmative.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I understand. We are ahead of the Lords on publication, so yes is the answer.

I have two very quick points for my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright). He was right to speak about acting with humility. We will bring forward amendments for recommittal to amend the approach to category 1 designation, covering not just the smaller companies that he was talking about, but companies that are pushing at that barrier to get to category 1. I very much take his point that the process could be delayed unduly, and we want to make sure that we do not get the unintended consequences that he describes. I look forward to working with him to get the changes to the Bill to work exactly as he intends.

Finally, let me go back to the point that my right hon. Friend the Member for Haltemprice and Howden made about encrypted communications. We are not talking about banning end-to-end encryption or about breaking encryption—for the reasons set out about open banking and other areas. The amendment would leave Ofcom powerless to protect thousands of children and could leave unregulated spaces online for offenders to act, and we cannot therefore accept that.

Online Safety Bill

Maria Miller Excerpts
Consideration of Lords amendments
Tuesday 12th September 2023

(7 months, 2 weeks ago)

Commons Chamber
Read Full debate Online Safety Act 2023 Read Hansard Text Watch Debate Read Debate Ministerial Extracts Amendment Paper: Commons Consideration of Lords Amendments as at 12 September 2023 - (12 Sep 2023)
Once again, I wish to thank all the Members who have put together a good piece of legislation. In the spirit of generosity, let me say that the Government have tried their very best on a tricky issue, and I give credit to those on both sides of the House for this step in the right direction.
Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- View Speech - Hansard - -

This Bill may well have been with us since April 2021 and been subject to significant change, but it remains a Bill about keeping people safer online and it remains groundbreaking. I welcome it back after scrutiny in the Lords and join others in paying tribute to those who have campaigned for social media platforms to release information following the death of a child. I am pleased that some are able to be with us today to hear this debate and the commitment to that issue.

This will never be a perfect Bill, but we must recognise that it is good enough and that we need to get it on to the statute book. The Minister has helped by saying clearly that this is not the endgame and that scrutiny will be inherent in the future of this legislation. I hope that he will heed the comments of my hon. Friend the Member for Folkestone and Hythe (Damian Collins), who encouraged him to set up a bespoke Committee, which was one of the recommendations from the initial scrutiny of the Bill.

I will confine my remarks to the Government’s Lords amendment 263 and those surrounding it, which inserted into the Bill the amendments I tabled on Report. They relate to the sharing of intimate images online, including deepfakes, without consent. I wish wholeheartedly to say thank you to the Minister, who always listens intently, to the Minister of State, Ministry of Justice, my right hon. Friend the Member for Charnwood (Edward Argar), who has recently joined him, and to the Secretary of State for Science, Innovation and Technology. They have all not only listened to the arguments on intimate image abuse, but acted. The changes today are no less a testament to their commitment to this Bill than those in any other area. Focusing on children’s safety is very important, but the safety of adults online is also important. We started on a journey to address intimate image abuse way back in 2015, with the Criminal Justice and Courts Act 2015, and we have since learned how to provide that protection much better, mostly through the work of the Law Commission and its report on how we should be tackling intimate image abuse online.

The Bill, as it has been amended, has been changed fundamentally on the treatment of intimate image abuse, in line with the debate on Report in this place. That has created four new offences. The base offence removes the idea of intent to cause distress entirely and relies only on whether there was consent from the person appearing in the image. Two more serious offences do include intent, with one being sending an image with intent to cause alarm and distress. We also now have the offence of threatening to share an image, which will protect people from potential blackmail, particularly from an abusive partner. That will make a huge difference for victims, who are still overwhelmingly women.

In his closing comments, will the Minister address the gaps that still exist, particularly around the issue of the images themselves, which, because of the scope of the Bill, will not become illegal? He and his colleagues have indicated that more legislation might be in the planning stages to address those particular recommendations by the Law Commission. Perhaps he could also comment on something that the Revenge Porn Helpline is increasingly being told by victims, which is that online platforms will not remove an image even though it may have been posted illegally, and that will not change in the future. Perhaps he can give me and those victims who might be listening today some comfort that either there are ways of addressing that matter now or that he will address it in the very near future.

Richard Burgon Portrait Richard Burgon (Leeds East) (Lab)
- View Speech - Hansard - - - Excerpts

As we reflect on the Bill today, it is important to say that it has been improved as it has progressed through Parliament. That is due in no small measure to Members from across the parties—both here and in the other place—who have engaged very collegiately, and to individuals and groups outside this place, particularly the Samaritans and those who have lived experience of the consequences of the dangers of the internet.

People from my constituency have also been involved, including the family of Joe Nihill, whom I have mentioned previously. At the age of 23, Joe took his own life after accessing dangerous suicide-related online content. His mother, Catherine, and sister-in-law, Melanie, have bravely campaigned to use the Online Safety Bill as an opportunity to ensure that what happened to Joe so tragically does not happen to others. I thank the Minister and his team for meeting Joe’s mother, his sister-in-law and me, and for listening to what we had to say. I recognise that, as a result, the Bill has improved, in particular with the Government’s acceptance of Lords amendment 391, which was first tabled by Baroness Morgan of Cotes. It is welcome that the Government have accepted the amendment, which will enable platforms to be placed in category 1 based on their functionality, even if they do not have a large reach. That is important, because some of the worst and most dangerous online suicide and self-harm related material appears on smaller platforms rather than the larger ones.

I also welcome the fact that the Bill creates a new communications offence of encouraging or assisting self-harm and makes such content a further priority for action, which is important. The Bill provides an historic opportunity to ensure that tackling suicide and self-harm related online content does not end with it becoming law. I urge the Government to listen very carefully to what the Samaritans have said. As my hon. Friend the shadow Minister asked, will the Government commit to a review of the legislation to ensure that it has met the objective of making our country the safest place in the world in which to go online? Importantly, can the Government confirm when the consultation on the new offence of encouraging or assisting self-harm will take place?

As I mentioned in an intervention, it is clear that the Government want to tackle harmful suicide and self-harm related content with the Bill, but, as we have heard throughout our discussions, the measures do not go far enough. The Samaritans were correct to say that the Bill represents a welcome advance and has improved recently, but it still falls short in relation to dangerous suicide and self-harm online content. How will the Government engage with people who have lived experience—people such as Melanie and Catherine—to ensure that the new laws make things better? Nobody wants the implementation of the Bill to be the end of the matter. We must redouble our efforts to make the internet as safe a place as possible, reflect on the experiences of my constituents, Joe Nihill and his family, and understand that there is a lot of dangerous suicide and self-harm related content out there. We are talking about people who exploit the vulnerable, regardless of their age.

I urge all those who are following the progress of the Bill and who look at this issue not to make the mistake of thinking that dangerous online suicide and self-harm related content is somehow a matter of freedom of speech. It is not a freedom of speech issue; it is an issue of protecting people.

--- Later in debate ---
Siobhan Baillie Portrait Siobhan Baillie
- Hansard - - - Excerpts

I am interested in that intervention, but I fear it would lead us into a very long discussion and I want to keep my comments focused on my amendment. However, it would be interesting to hear from the Minister in response to that point, because it is a huge topic for debate.

On the point about whether someone is real or not real online, I believe passionately that it should not only be famous people or those who can afford it who are able to show that they are a real and verified person. I say, “Roll out the blue ticks”—or the equivalents—and not just to make the social media platforms more money; as we have seen, we need it as a safety mechanism and a personal responsibility mechanism.

All the evidence and endless polling show that the public want to know who is and who is not real online, and it does not take rocket science to understand why. Dealing with faceless, anonymous accounts is very scary and anonymous abusers are terrifying. Parents are worried that they do not know who their children are speaking to, and anonymous, unverified accounts cannot be traced if details are not held.

That is before we get to how visible verification can help to tackle fraud. We should empower people to avoid fake accounts. We know that people are less likely to engage with an unverified account, and visible verification would make it easier to catch scammers. Fraud was the most common form of crime in 2022, with 41% of all crimes being fraud, 23% of all reported fraud being initiated on social media and 80% of fraud being cyber-related. We can imagine just how fantastically clever the scams will become through AI.

Since we started this process, tech companies have recognised the value of identity verification to the public, so much so that it is now sold on Twitter as blue ticks, and the Government understand the benefits of identity verification options. They have done a huge amount of work on that. I thank them for agreeing to two of the three pillars of my campaign, and I believe we can get there on visibility; I know from discussions with Government that Ofcom will be looking carefully at that.

Making things simple for social media users is incredibly important. For the user verification provisions in this Bill to fulfil their potential and prevent harm, including illegal harm, we believe that users need to be able to see who is and is not verified—that is, who is a real person—and all the evidence says that that is what the public wants.

While Ministers in this place and the other place have resisted putting visible verification on the face of the Bill, I am grateful to the Government for their work on this. After a lot of to-ing and fro-ing, we are reassured that the Bill as now worded gives Ofcom the powers to do what the public wants and what we are suggesting through codes and guidance. We hope that Ofcom will consider the role of anonymous, inauthentic and non-verified accounts as it prepares its register of risks relating to illegal content and in its risk profiles.

Maria Miller Portrait Dame Maria Miller
- Hansard - -

I pay tribute to the way my hon. Friend has focused on this issue through so many months and years. Does she agree that, in light of the assurances that she has had from the Minister, this is just the sort of issue that either a stand-alone committee or some kind of scrutiny group could keep an eye on? If those guidelines do not work as the Minister is hoping, the action she has suggested will need to be taken.

Siobhan Baillie Portrait Siobhan Baillie
- Hansard - - - Excerpts

Absolutely. Given the fast nature of social media and the tech world, and how quickly they adapt—often for their own benefit, sadly—I think that a committee with that focus could work.

To wrap up, I thank MPs from across the House, and you, Madam Deputy Speaker, for your grace today. I have had help from my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) in particular, for which I am very grateful. In the other place, Lord Clement-Jones, Lord Stevenson, Baroness Morgan, Baroness Fall and Baroness Wyld have all been absolutely excellent in pushing through these matters. I look forward to hearing what the Minister says, and thank everybody for their time.