All 9 Debates between Kirsty Blackman and Maria Miller

Online Safety Bill (Fifteenth sitting) - Thu 23rd Jun 2022
Online Safety Bill (Fourteenth sitting) - Tue 21st Jun 2022
Online Safety Bill (Tenth sitting) - Tue 14th Jun 2022
Online Safety Bill (Eighth sitting) - Thu 9th Jun 2022
Online Safety Bill (Seventh sitting) - Thu 9th Jun 2022
Online Safety Bill (Fifth sitting) - Tue 7th Jun 2022
Online Safety Bill (Fourth sitting) - Thu 26th May 2022
Online Safety Bill (Second sitting) - Tue 24th May 2022
Online Safety Bill (First sitting) - Tue 24th May 2022

Online Safety Bill (Fifteenth sitting)

Debate between Kirsty Blackman and Maria Miller
Kirsty Blackman

I absolutely agree. I had not thought about it in those terms, but the hon. Member is right that the new clause gives greater importance to those protected characteristics and lays that out in the Bill.

I appreciate that, under the risk assessment duties set out in the Bill, organisations have to look at protected characteristics in groups and at individuals with those protected characteristics, which I welcome, but I also welcome the inclusion of protected characteristics in the new clause in relation to the duties of the advocacy body. I think that is really important, especially, as the hon. Member for Batley and Spen just said, in relation to the positive aspects of the internet. It is about protecting free speech for children and young people and enabling them to find community and enjoy life online and offline.

Will the Minister give serious consideration to the possibility of a user advocacy body? Third sector organisations are calling for that, and I do not think Ofcom could possibly have the expertise to match such a body.

Dame Maria Miller

I want briefly to interject to underline the point I made in my intervention on the hon. Member for Worsley and Eccles South. I welcome the discussion about victims’ support, which picks up on what we discussed on clause 110. At that point I mentioned the NSPCC evidence that talked about the importance of third party advocacy services, due to the lack of trust in the platforms, as well as for some of the other reasons that the hon. Members for Worsley and Eccles South, for Batley and Spen, and for Aberdeen North have raised.

When we discussed clause 110, the Minister undertook to think about the issue seriously and to talk to the Treasury about whether funding could be taken directly from fines rather than those all going into the Treasury coffers. I hope the debate on new clause 3 will serve to strengthen his resolve, given the strength of support for such a measure, whether that is through a formal user advocacy service or by using existing organisations. I hope he uses the debate to strengthen his arguments about such a measure with the Treasury.

I will not support the new clause tabled by the hon. Member for Worsley and Eccles South, because I think the Minister has already undertaken to look at this issue. As I say, I hope this discussion strengthens his resolve to do so.

Online Safety Bill (Fourteenth sitting)

Debate between Kirsty Blackman and Maria Miller
Committee stage
Tuesday 21st June 2022

Public Bill Committees
Kirsty Blackman

I would like to make a couple of comments. The shadow Minister mentioned education and prevention projects, which are key. In Scotland, our kids’ sex, health and relationship education in schools teaches consent from the earliest possible age. That is vital. We have a generation of men who think it is okay to send these images and not seek consent. As the shadow Minister said, the problem is everywhere. So many women have received images that they had no desire to see. They did not ask for them, and they did not consent to receive them, but they get them.

Requiring someone to prove the intent behind the offence is just impossible. It is so unworkable, and that makes it really difficult. This is yet another issue that makes it clear that we need to have reference to violence against women and girls on the face of the Bill. If that were included, we would not be making such a passionate case here. We would already have a code of conduct and assessments that have to take place on the basis of the specific harm to women and girls from such offences. We would not be making the case so forcefully because it would already be covered.

I wish the Minister would take on board how difficult it is for women and girls online, how much of an issue this specific behaviour is and how much pain and suffering it causes. It would be great if the Minister could consider moving somewhat on this issue in order to protect women and girls.

Dame Maria Miller (Basingstoke) (Con)

I want to make sure that the record is clear that while I did receive a dick pic, I am not a millennial. That shows how widespread this problem is. My children would want that on the record.

Research done by YouGov showed that half of millennial women have been sent a photo of a penis, and that nine in 10 women who have ever received such a picture did not want to have it sent to them. To anybody who is trying to—I do not feel anybody today is—advocate that this is a small issue or a minority problem, the data suggest that it is not.

For the record, I think the reason I was sent that picture was not sexual at all. I think it was intimidatory. I was sitting in a train carriage on my way into Parliament on a hot day, and I think it was sent as intimidation because I could not leave that carriage and I had, in error, left my AirDrop on. Okay, that was my fault, but let us not victim blame.

I very much welcome the Minister’s approach, because he is the first person to take forward a series of new offences that are needed to clarify the law as it affects people in this area. As he was talking, I was reflecting on his use of the word “clarity”, and I think he is absolutely right. He is rightly looking to the Law Commission as the expert for how we interpret and how we get the most effective law in place.

Although we are not talking about the intimate image abuse recommendations in this part of the Bill, I draw to the Committee’s attention that I, and others, will have received an email from the Law Commission today setting out that it will bring forward its recommendations next month. I hope that that means that the Minister will bring forward something concrete to us about those particular offences in the coming weeks. He is right that when it comes to cyber-flashing, we need to get it right. We need to make sure that we follow the experts. The Law Commission was clear when it undertook its review that the current law does not adequately address these issues. I was pleased when it made that recommendation.

A great many people have looked at these issues, and I pay tribute to each and every one of them, though they come to slightly different conclusions about how we interpret the Law Commission’s recommendations and how we move forward. Professor Clare McGlynn is an expert. Bumble has done work on this; my hon. Friend the Member for Brecon and Radnorshire (Fay Jones) has done a great deal of work too, and I recognise her contribution.

The offence is particularly pernicious because it is as prevalent as indecent exposure, which is rightly recognised as an offence in the Sexual Offences Act 2003. As the hon. Member for Pontypridd said, it is another form of gendered crime online. On the evidence of harm that it causes, she referenced the evidence that we got from Professor McGlynn about Gaia Pope. That was particularly concerning. I do not think any of us in the Committee would argue that this is not the most serious of offences, and I commend the Minister for bringing forward a serious set of recommendations to tackle it.

Online Safety Bill (Tenth sitting)

Debate between Kirsty Blackman and Maria Miller
Committee stage
Tuesday 14th June 2022

Public Bill Committees
Kirsty Blackman

I thank the Minister for his clarification earlier and his explanation of how the categories of primary priority content and priority content can be updated. That was helpful.

Amendment 62 is excellent, and I am more than happy to support it.

Dame Maria Miller

I have a short comment on clause 56, which is an important clause because it will provide an analysis of how the legislation is working, and that is what Members want to see. On the point that the hon. Member for Pontypridd set out, it is right that Ofcom probably will not report until 2026, given the timeframe for the Bill being enacted. I would not necessarily want Ofcom to report sooner, because system changes take a long time to bed in. It does pose the question, however, of how Parliament will be able to analyse whether the legislation or its approach needs to change between now and 2026. That reiterates the need—which I and other hon. Members have pointed out—for some sort of standing committee to scrutinise the issues. I do not personally think it would be right to get Ofcom to report earlier, because it might be an incomplete report.

--- Later in debate ---
Dame Maria Miller

I rise to welcome clause 57. It is an important part of the Bill and shows the Government acknowledging that anonymity can have a significant impact on the harms that affect victims. There is a catalogue of evidence of the harm done by those posting anonymously. Anonymity appears to encourage abusive behaviour, and there is evidence dating back to 2015 showing that anonymous accounts are more likely to share sexist comments and that online harassment victims are often not able to identify their perpetrators because of the way anonymity works online. The Government are doing an important thing here and I applaud them.

I underline that again by saying that recent research from Compassion in Politics showed that more than one in four people were put off posting on social media because of the fear of abuse, particularly from anonymous posters. Far from the status quo promoting freedom of speech, it actually deters freedom of speech, as we have said in other debates, and it particularly affects women. The Government are to be applauded for this measure.

In the work I was doing with the FA and the Premier League on this very issue, I particularly supported their call for a twin-track approach to verified accounts, under which verification would be the default and people would automatically be able to opt out of receiving posts from unverified accounts. The Bill does not go as far as that, and I can understand the Government’s reasons, but I gently point out that 81% of the people who took part in the Compassion in Politics research would willingly provide identification to get a verified account if it reduced unverified posts. They felt that was important. Some 72% supported the idea if it reduced the amount of anonymous posting.

I am touching on clause 58, but I will not repeat myself when we debate that clause. I hope that it will be possible in the code of practice for Ofcom to point out the clear benefits of having verified accounts by default and perhaps urge responsible providers to do the responsible thing and allow their users to automatically filter out unverified accounts. That is what users want, and it is extraordinary that large consumer organisations do not seem to want to give consumers what they want. Perhaps Ofcom can help those organisations understand what their consumers want, certainly in Britain.

Kirsty Blackman

The right hon. Lady’s speech inspired me to stand up and mention a couple of things. My first question is about user empowerment in relation to this clause. The clause applies only to adults. I can understand the issues that there may be with verifying the identity of children, but if that means that children are unable to block unverified accounts because they cannot verify their own account, the internet becomes a less safe place for children than for adults in this context, which concerns me.

To be honest, I do not know how children’s identities could be verified, but giving them access to the filters that would allow them to block unverified accounts, whether or not they are able to verify themselves—because they are children and therefore may not have the identity documentation they need—would be very helpful.

I appreciate the points that the right hon. Member was making, and I completely agree with her on the requirement for user verification, but I have to say that I believe there is a place for anonymity on the internet. I can understand why, for a number of people, that is the only way that they can safely access some of the community support that they need.

Dame Maria Miller

Just for clarity, the twin-track approach does not outlaw anonymity. It just means that people have verified accounts by default; they do not have to opt into it.

Kirsty Blackman

I appreciate that clarification. I just wanted to make it absolutely clear that I strongly believe anonymity is a very good protection, not just for people who intend to do harm on the internet, but particularly for people who are seeking out community. I think that is important.

If you will allow me to say a couple of things about the next clause, Sir Roger, Mencap raised the issue of vulnerable users, specifically vulnerable adult users, in relation to the form of identity verification. If the Minister or Ofcom could give consideration to perhaps including travel passes or adult passes, it might make the internet a much easier place to navigate for people who do not have control of their own documentation—they may not have access to their passport, birth certificate, or any of that sort of thing—but who would be able to provide a travel pass, because that is within their ownership.

Online Safety Bill (Eighth sitting)

Debate between Kirsty Blackman and Maria Miller
Committee stage
Thursday 9th June 2022

Public Bill Committees
Kirsty Blackman

I have a quick question about timelines because I am slightly confused about the order in which everything will happen. It is unlikely that the Bill will have been through the full parliamentary process before the summer, yet Ofcom intends to publish information and guidance by the summer, even though some things, such as the codes of practice, will not come in until after the Bill has received Royal Assent. Will the Minister give a commitment that, whether or not the Bill has gone through the whole parliamentary process, Ofcom will be able to publish before the summer?

Will Ofcom be encouraged to publish everything, whether that is guidance, information on its website or the codes of practice, at the earliest point at which they are ready? That will mean that anyone who has to apply those codes of practice or those regulations—people who will have to work within those codes, for example, or charities or other organisations that might be able to make super-complaints—will have as much information as possible, as early as possible, and will be able to prepare to fully implement their work at the earliest possible time. They will need that information in order to be able to gear up to do that.

Dame Maria Miller

I have three short questions for the Minister about clause 40 and the Secretary of State’s powers of direction. Am I in order to cover that?

Online Safety Bill (Seventh sitting)

Debate between Kirsty Blackman and Maria Miller
Dame Maria Miller (Basingstoke) (Con)

I rise to contribute to the stand part debate on clauses 18 and 28. It was interesting, though, to hear the debate on clause 17, because it is right to ask how the complaints services will be judged. Will they work in practice? When we start to look at how to ensure that the legislation works in all eventualities, we need to ensure that we have some backstops for when the system does not work as it should.

It is welcome that there will be clear duties on providers to have operational complaints procedures—complaints procedures that work in practice. As we all know, many of them do not at the moment. As a result, we have a loss of faith in the system, and that is not going to be changed overnight by a piece of legislation. For years, people have been reporting things—in some cases, very serious criminal activity—that have not been acted on. Consumers—people who use these platforms—are not going to change their mind overnight and suddenly start trusting these organisations to take their complaints seriously. With that in mind, I hope that the Minister listened to the points I made on Second Reading about how to give extra support to victims of crimes or people who have experienced things that should not have happened online, and will look at putting in place the right level of support.

The hon. Member for Worsley and Eccles South talked about the idea of an ombudsman; it may well be that one should be in place to deal with situations where complaints are not dealt with through the normal processes. I am also quite taken by some of the evidence we received about third-party complaints processes by other organisations. We heard a bit about the revenge porn helpline, which was set up a few years ago when we first recognised in law that revenge pornography was a crime. The Bill creates a lot more victims of crime and recognises them as victims, but we are not yet hearing clearly how the support systems will adequately help that massively increased number of victims to get the help they need.

I will probably talk in more detail about this issue when we reach clause 70, which provides an opportunity to look at the—unfortunately—probably vast fines that Ofcom will be imposing on organisations and how we might earmark some of that money specifically for victim support, whether by funding an ombudsman or helping amazing organisations such as the revenge porn helpline to expand their services.

We must address this issue now, in this Bill. If we do not, all those fines will go immediately into the coffers of the Treasury without passing “Go”, and we will not be able to take some of that money to help those victims directly. I am sure the Government absolutely intend to use some of the money to help victims, but that decision would be at the mercy of the Treasury. Perhaps we do not want that; perhaps we want to make it cleaner and easier and have the money put straight into a fund that can be used directly for people who have been victims of crime or injustice or things that fall foul of the Bill.

I hope that the Minister will listen to that and use this opportunity, as we do in other areas, to directly passport fines for specific victim support. He will know that there are other examples of that that he can look at.

Kirsty Blackman

As the right hon. Member for Basingstoke has mentioned the revenge porn helpline, I will mention the NSPCC’s Report Remove tool for children. It does exactly the same thing, but for younger people—the revenge porn helpline is specifically only for adults. Both those tools together cover the whole gamut, which is massively helpful.

The right hon. Lady’s suggestion about the hypothecation of fines is a very good one. I was speaking to the NSPCC yesterday, and one of the issues that we were discussing was super-complaints. Although super-complaints are great and I am very glad that they are included in the Bill, the reality is that some of the third-sector organisations that are likely to be undertaking super-complaints are charitable organisations that are not particularly well funded. Given how few people work for some of those organisations and the amazing amount of work they do, if some of the money from fines could support not just victims but the initial procedure for those organisations to make super-complaints, it would be very helpful. That is, of course, if the Minister does not agree with the suggestion of creating a user advocacy panel, which would fulfil some of that role and make that support for the charitable organisations less necessary—although I am never going to argue against support for charities: if the Minister wants to hypothecate it in that way, that would be fantastic.

I tabled amendments 78 and 79, but the statement the Minister made about the definition of users gives me a significant level of comfort about the way that people will be able to access a complaints procedure. I am terribly disappointed that the Minister is not a regular Reddit user. I am not, either, but I am well aware of what Reddit entails. I have no desire to sign up to Reddit, but knowing that even browsing the site I would be considered a user and therefore able to report any illegal content I saw, is massively helpful. On that basis, I am comfortable not moving amendments 78 and 79.

On the suggestion of an ombudsman—I am looking at new clause 1—it feels like there is a significant gap here. There are ombudsman services in place for many other areas, where people can put in a complaint and then go to an ombudsman should they feel that it has not been appropriately addressed. As a parliamentarian, I find that a significant number of my constituents come to me seeking support to go to the ombudsman for whatever area it is in which they feel their complaint has not been appropriately dealt with. We see a significant number of issues caused by social media companies, in particular, not taking complaints seriously, not dealing with complaints and, in some cases, leaving illegal content up. Particularly in the initial stages of implementation—in the first few years, before companies catch up and are able to follow the rules put in place by the Bill and Ofcom—a second-tier complaints system that is removed from the social media companies would make things so much better than they are now. It would provide an additional layer of support to people who are looking to make complaints.

Dame Maria Miller

I am sure the hon. Lady will agree with me that it is not either/or—it is probably both. Ultimately, she is right that an ombudsman would be there to help deal with what I think will be a lag in implementation, but if someone is a victim of online intimate image abuse, in particular, they want the material taken down immediately, so we need to have organisations such as those that we have both mentioned there to help on the spot. It has to be both, has it not?

Kirsty Blackman

I completely agree. Both those helplines do very good work, and they are absolutely necessary. I would strongly support their continuation in addition to an ombudsman-type service. Although I am saying that the need for an ombudsman would likely be higher in the initial bedding-in years, it will not go away—we will still need one. With NHS complaints, the system has been in place for a long time, and it works pretty well in the majority of cases, but there are still cases it gets wrong. Even if the social media companies behave in a good way and have proper complaints procedures, there will still be instances of them getting it wrong. There will still be a need for a higher level. I therefore urge the Minister to consider including new clause 1 in the Bill.

Online Safety Bill (Fifth sitting)

Debate between Kirsty Blackman and Maria Miller
Kirsty Blackman

Thank you, Sir Roger.

I do not want to get sidetracked, but I agree that there is a major parental knowledge gap. Tomorrow’s parents will have grown up on the internet, so in 20 years’ time we will not have that knowledge gap, but today media literacy is lacking, particularly among parents as well as among children. In Scotland, media literacy is embedded in the curriculum; I am not entirely sure what the system is in the rest of the UK. My children are learning media literacy in school, but there is still a media literacy gap for parents. My local authority is running a media literacy training session for parents tomorrow night, which I am very much looking forward to attending so that I can find out even more about how to keep my children safe online.

I was asking the Minister about the App Store and the Google Play Store. I do not need an answer today, but one at some point would be really helpful. Do the App Store, the Google Play Store and other stores of that nature fall under the definition of search engines or of user-to-user content? The reality is that if somebody creates an app, presumably they are a user. Yes, it has to go through an approval process by Apple or Google, but once it is accepted by them, it is not owned by them; it is still owned by the person who generated it. Therefore, are those stores considered search engines, in that they are simply curating content, albeit moderated content, or are they considered user-to-user services?

That is really important, particularly when we are talking about age verification and children being able to access various apps. The stores are the key gateways where children get apps. Once they have an app, they can use all the online services that are available on it, in line with whatever parental controls parents choose to put in place. I would appreciate an answer from the Minister, but he does not need to provide it today. I am happy to receive it at a later time, if that is helpful.

Dame Maria Miller

I want to pick up on two issues, which I hope the Minister can clarify in his comments at the end of this section.

First, when we took evidence, the Internet Watch Foundation underlined the importance of end-to-end encryption being in scope of the Bill, so that it does not lose the ability to pick up child abuse images, as has already been referred to in the debate. The ability to scan content in end-to-end encrypted environments is crucial. Will the Minister clarify whether that is in scope, and whether the IWF will be able to continue its important work in safeguarding children?

Kirsty Blackman

A number of people have raised concerns about freedom of speech in relation to end-to-end encryption. Does the right hon. Lady agree that there should not be freedom of speech when it comes to child sexual abuse images, and that it is reasonable for those systems to check for child sexual abuse images?

Dame Maria Miller

The hon. Lady is right to pick up on the nuance and the balance that we have to strike in legislation between freedom of speech and the protection of vulnerable individuals and children. I do not think there can be many people, particularly among those here today, who would want anything to trump the safeguarding of children. Will the Minister clarify exactly how the Bill works in relation to such important work?

Secondly, it is important that the Government have made the changes to schedule 2. They have listened closely on the issue of pornography and extended the provisions of the Bill to cover commercial pornography. However, the hon. Member for Pontypridd mentioned nudification software, and I am unclear whether the Bill would outlaw such software, which is designed to sexually harass women. That software works only on photographs of women, because its database relates only to female figures, and it makes them appear to be completely naked. Does that software fall within the scope of the Bill? If not, will the Minister do something about that? The software is available and we have to regulate it to ensure that we safeguard women’s rights to live without harassment in their day-to-day life.

--- Later in debate ---
Kirsty Blackman

I want to make a couple of brief comments. Unfortunately, my hon. Friend the Member for Ochil and South Perthshire is not here as, ironically, he is at the DCMS Committee taking evidence on the Online Safety Bill. That is a pretty unfortunate clash of timing, but that is why I am here solo for the morning.

I wanted to make a quick comment on schedule 7. The Minister will have heard the evidence given on it and the fact that the other schedules, particularly schedule 6, have a Scotland-specific section detailing the Scottish legislation that applies. Schedule 7 has no Scotland-specific section and does not adequately cover the Scottish legislation. I appreciate that the Minister has tabled amendment 126, which covers the Scottish and Northern Irish legislation that may differ from England and Wales legislation, but will he give me some comfort that he does intend Scottish-specific offences to be added to schedule 7 through secondary legislation? There is a difference between an amendment setting out how to add them and a commitment that they will be added if necessary and if he feels that doing so will add something to the Bill. If he could commit that that will happen—obviously, in discussion with Scottish Ministers if amendment 126 is agreed—I would appreciate it. It would give me a measure of comfort and would assist, given the oral evidence we heard, in overcoming some of the concerns raised about schedule 7 and the lack of inclusion of Scottish offences.

Dame Maria Miller

In many ways, clause 6 is the central meat of the Bill. It brings into play a duty of care, which means that people operating online will be subject to the same rules as the rest of us when it comes to the provision of services. But when it comes to the detail, the guidance and codes that will be issued by Ofcom will play a central role. My question for the Minister is: in the light of the evidence that we received, I think in panel three, where the providers were unable to define what was harmful because they had not yet seen codes of practice from Ofcom, could he update us on when those codes and guidance might be available? I understand thoroughly why they may not be available at this point, and they certainly should not form part of the Bill because they need to be flexible enough to be changed in future, but it is important that we know how the guidance and codes work and that they work properly.

Will the Minister update the Committee on what further consideration he and other Ministers have given to the establishment of a standing committee to scrutinise the implementation of the Bill? Unless we have that in place, it will be difficult to know whether his legislation will work.

--- Later in debate ---
Dame Maria Miller

The hon. Gentleman brings up an important point. We did hear about that in the evidence. I have no doubt the Secretary of State will not want to interfere in the workings of Ofcom. Having been in his position, I know there would be no desire for the Department to get involved in that, but I can understand why the Government might want the power to ensure things are working as they should. Perhaps the answer to the hon. Gentleman’s question is to have a standing committee scrutinising the effectiveness of the legislation and the way in which it is put into practice. That committee could be a further safeguard against what he implies: an unnecessary overreach of the Secretary of State’s powers.

Kirsty Blackman

Thank you, Sir Roger, for allowing me to intervene again. I was not expecting the standing committee issue to be brought up at this point, but I agree that there needs to be a post-implementation review of the Bill. I asked a series of written questions to Departments about post-legislative review and whether legislation that the Government have passed has had the intended effect. Most of the Departments that answered could not provide information on the number of post-legislative reviews. Of those that could provide me with the information, none of them had managed to do 100% of the post-implementation reviews that they were supposed to do.

It is important that we know how the Bill’s impact will be scrutinised. I do not think it is sufficient for the Government to say, “We will scrutinise it through the normal processes that we normally use,” because it is clear that those normal processes do not work. The Government cannot say that legislation they have passed has achieved the intended effect. Some of it will have and some of it will not have, but we do not know because we do not have enough information. We need a standing committee or another way to scrutinise the implementation.

Dame Maria Miller

I thank the hon. Lady for raising this point. Having also chaired a Select Committee, I can understand the sensitivities around this falling under the current DCMS Committee, but the reality is that the Bill’s complexity and other pressures on the DCMS Committee mean that this should perhaps be seen as an exceptional circumstance—in no way is that meant as disrespect to that Select Committee, which is extremely effective in what it does.

Kirsty Blackman

I completely agree. Having sat on several Select Committees, I am aware of the tight timescales. There are not enough hours in the day for Select Committees to do everything that they would like to do. It would be unfortunate and undesirable were this matter to be one that fell between the cracks. Perhaps DCMS will bring forward more legislation in future that could fall between the cracks. If the Minister is willing to commit to a standing committee or anything in excess of the normal governmental procedures for review, that would be a step forward from the position that we are currently in. I look forward to hearing the Minister’s views on that.

Online Safety Bill (Fourth sitting)

Debate between Kirsty Blackman and Maria Miller
Committee stage & Committee Debate - 4th sitting
Thursday 26th May 2022

Public Bill Committees
Mrs Miller

Poppy, do you have anything to add?

Poppy Wood: Yes. I think we could go much further on enforcement. One of the things that I really worry about is that if the platforms make an inadequate risk assessment, there is not much that Ofcom can do about it. I would really like to see powers for Ofcom to say, “Okay, your risk assessment hasn’t met the expectations that we put on you, so we want you to redo it. And while you’re redoing it, we may want to put you into a different category, because we may want to have higher expectations of you.” That way, you cannot start a process where you intentionally make an inadequate risk assessment in order to extend the process of you being properly regulated. I think that is one thing.

Then, going back to the point about categorisation, I think that Ofcom should be given the power to recategorise companies quickly. If you think that a category 2B company should be a category 1 company, what powers are there for Ofcom to do that? I do not believe that there are any for Ofcom to do that, certainly not to do it quickly, and when we are talking about small but high-risk companies, that is absolutely the sort of thing that Ofcom should be able to do—to say, “Okay, you are now acting like a category 1 company.” TikTok, Snapchat—they all started really small and they accelerated their growth in ways that we just could not have predicted. When we are talking about the emergence of new platforms, we need to have a regulator that can account for the scale and the pace at which these platforms grow. I think that is a place where I would really like to see Ofcom focusing.

Kirsty Blackman

Q I have a question for the Centre for Countering Digital Hate. I raised some of your stats on reporting with Meta—Facebook—when they were here, such as the number of reports that are responded to. They basically said, “This is not true any more; we’re now great”—I am paraphrasing, obviously. Could you please let us know whether the reporting mechanism on major platforms—particularly Facebook—is now completely fixed, or whether there are still lots of issues with it?

Eva Hartshorn-Sanders: There are still lots of issues with it. We recently put a report out on anti-Muslim hatred and found that 90% of the content that was reported was not acted on. That was collectively, across the platforms, so it was not just Facebook. Facebook was in the mid-90s, I think, in terms of its failure to act on that type of harmful content. There are absolutely still issues with it, and this regulation—this law—is absolutely necessary to drive change and the investment that needs to go into it.

Online Safety Bill (Second sitting)

Debate between Kirsty Blackman and Maria Miller
Kirsty Blackman

Q My first question is for Lulu. Do small tech companies have enough staff with technical expertise to be able to fulfil their obligations under the Bill?

Lulu Freemont: It is a great question. One of the biggest challenges is capacity. We hear quite a lot from the smaller tech businesses within our membership that they will have to divert their staff away from existing work to comply with the regime. They do not have compliance teams, and they probably do not have legal counsel. Even at this stage, to try to understand the Bill as it is currently drafted—there are lots of gaps—they are coming to us and saying, “What does this mean in practice?” They do not have the answers, or the capability to identify that. Attendant regulatory costs—thinking about the staff that you have and the cost, and making sure the regulation is proportionate to the need to divert away from business development or whatever work you might be doing in your business—are really fundamental.

Another real risk, and something in the Bill that smaller businesses are quite concerned about, is the potential proposal to extend the senior management liability provisions. We can understand them being in there to enable the regulators to do their job—information requests—but if there is any extension into individual pieces of content, coupled with a real lack of definitions, those businesses might find themselves in the position of restricting access to their services, removing too much content or feeling like they cannot comply with the regime in a proportionate way. That is obviously a very extreme case study. It will be Ofcom’s role to make sure that those businesses are being proportionate and understand the provisions, but the senior management liability does have a real, chilling impact on the smaller businesses within our membership.

Adam Hildreth: One of the challenges that we have seen over the last few years is that you can have a business that is small in revenue but has a huge global user base, with millions of users, so it is not really a small business; it just has not got to the point where it is getting advertisers and getting users to pay for it. I have a challenge on the definition of a small to medium-sized business. Absolutely, for start-ups with four people in a room—or perhaps even still just two—that do not have legal counsel or anything else, we need to make it simple for those types of businesses to ingest and understand what the principles are and what is expected of them. Hopefully they will be able to do quite a lot early on.

The real challenge comes when someone labels themselves as a small business but they have millions of users across the globe—and sometimes actually quite a lot of people working for them. Some of the biggest tech businesses in the world that we all use had tens of people working for them at one point in time, when they had millions of users. That is the challenge, because there is an expectation for the big-tier providers to be spending an awful lot of money, when the small companies are actually directly competing with them. There is a challenge to understanding the definition of a small business and whether that is revenue-focused, employee-focused or about how many users it has—there may be other metrics.

Ian Stevenson: One of the key questions is how much staffing this will actually take. Every business in the UK that processes data is subject to GDPR from day one. Few of them have a dedicated data protection officer from day one; it is a role or responsibility that gets taken on by somebody within the organisation, or maybe somebody on the board who has some knowledge. That is facilitated by the fact that there are a really clear set of requirements there, and there are a lot of services you can buy and consume that help you deliver compliance. If we can get to a point where we have codes of practice that make very clear recommendations, then even small organisations that perhaps do not have that many staff to divert should be able to achieve some of the basic requirements of online safety by buying in the services and expertise that they need. We have seen with GDPR that many of those services are affordable to small business.

If we can get the clarity of what is required right, then the staff burden does not have to be that great, but we should all remember that the purpose of the Bill is to stop some of the egregiously bad things that happen to people as a result of harmful content, harmful behaviours and harmful contact online. Those things have a cost in the same way that implementing data privacy has a cost. To come back to Lulu’s point, it has to be proportionate to the business.

Mrs Miller

Q Adam, you said a few moments ago that companies are starting to put safety at the core of what they do, which will be welcome to us all—maybe it should have happened a lot earlier. I know you have worked a lot in that area. Regulators and company owners will have to depend on an ethical culture in their organisations if they are going to abide by the new regulations, because they cannot micromanage and regulators cannot micromanage. Will the Bill do enough to drive that ethical culture? If not, what more could it do or could the industry do? I would be really interested in everybody’s answer to this one, but I will start with Adam.

Adam Hildreth: What we are seeing from the people that are getting really good at this and that really understand it is that they are treating this as a proper risk assessment, at a very serious level, across the globe. When we are talking about tier 1s, they are global businesses. When they do it really well, they understand risk and how they are going to roll out systems, technology, processes and people in order to address that. That can take time. Yes, they understand the risk, who it is impacting and what they are going to do about it, but they still need to train people and develop processes and maybe buy or build technology to do it.

We are starting to see that work being done really well. It is done almost in the same way that you would risk assess anything else: corporate travel, health and safety in the workplace—anything. It should really become one of those pillars. All those areas I have just gone through are regulated. Once you have regulation there, it justifies why someone is doing a risk assessment, and you will get businesses and corporates going through that risk assessment process. We are seeing others that do not do the same level of risk assessment and they do not have that same buy-in.

--- Later in debate ---
Mrs Miller

Q So what needs to change in the Bill to make sure that happens? I am not clear.

Susie Hargreaves: We just want to make sure that the ability to scan in an end-to-end encrypted environment is included in the Bill in some way.

Kirsty Blackman

Q The ability to scan is there right now—we have got that—so you are just trying to make sure we are standing still, basically. Am I correct in my understanding?

Susie Hargreaves: I think with technology you can never stand still. We do not know what is coming down the line. We have to deal with the here and now, but we also need to be prepared to deal with whatever comes down the line. The answer, “Okay, we will just get people to report,” is not a good enough replacement for the ability to scan for images.

When the privacy directive was introduced in Europe and Facebook stopped scanning for a short period, we lost millions of images. What we know is that we must continue to have those safety mechanisms in place. We need to work collectively to do that, because it is not acceptable to lose millions of images of child sexual abuse and create a forum where people can safely share them without any repercussions, as Rhiannon says. One survivor we talked to in this space said that one of her images had been recirculated 70,000 times. The ability to have a hash of a unique image, go out and find those duplicates and make sure they are removed means that people are not re-victimised on a daily basis. That is essential.

Online Safety Bill (First sitting)

Debate between Kirsty Blackman and Maria Miller
Mrs Miller

Q Is there capacity in the sector to deliver what you are talking about?

Dame Rachel de Souza: I think we need to make capacity. There is some—the NSPCC has its Childline and, as Children’s Commissioner, I have my own advocacy service for children in care. I think this should function in that way, with direct access. So I think that we can create it.

Andy Burrows: May I come in briefly? Our proposals for user advocacy reflect the clear “polluter pays” principle that we think should apply here, to help build and scale up that capacity, but the levy that is covering the direct cost of regulation should also provide really effective user advocacy. That is really important not only to help to give victims what they need in frontline services, but in ensuring that there is a strong counterbalance to some of the largest companies in the world for our sector, which has clear ambition but self-evident constraints.

Dame Rachel de Souza: One of the concerns that has come to me from children—I am talking about hundreds of thousands of children—over the past year is that there is not strong enough advocacy for them and that their complaints are not being met. Girls in particular, following the Everyone’s Invited concerns, have tried so hard to get images down. There is this almost medieval bait-out practice of girls’ images being shared right across platforms. It is horrendous, and the tech firms are not acting quickly enough to get those down. We need proper advocacy and support for children, and I think that they would expect that of us in this groundbreaking Bill.

Kirsty Blackman

Q There has not been a huge amount of discussion of online gaming in the context of the Bill, despite the fact that for many young people that is the way in which they interact with other people online. Do you think the Bill covers online gaming adequately? A lot of interaction in online gaming is through oral communication—voice chat messages. Do you think that it is possible to properly regulate oral communications in gaming?

Dame Rachel de Souza: Good question. I applaud the Bill for what it does cover. We are looking at a Bill that, for the first time, is going to start protecting children’s rights online, so I am really pleased to see that. We have looked a bit at gaming in the past. In terms of harms, obviously the Bill does not cover gaming in full, but it does cover the safety aspects of children’s experience.

It is always good for us to be looking further. Gaming, we know, has some extremely harmful and individualistic issues with it, particularly around money and the profile of potential grooming and safety. In terms of communications, one of the reasons that I am so concerned about encryption and communications online is that it happens through gaming. We need to make sure that those elements are really firm.

Andy Burrows: It is vitally important that the gaming sector is in scope. We know that there are high-risk gaming sites—for example, Twitch—and gaming-adjacent services such as Discord. To go back to my earlier point about the need for cross-platform provisions to apply here, in gaming we can see grooming pathways that can take on a different character from those on social networks, for example, where we might see abuse pathways where that grooming is taking place at the same time, rather than sequentially, from a gaming streaming service, say, to a gaming-adjacent platform such as Discord. I think it is very important that a regulator is equipped to understand the dynamics of the harms and how they will perhaps apply differently on gaming services. That is a very strong and important argument for user advocacy.

I would say a couple of things on oral communications. One-to-one oral communications are excluded from the Bill’s scope—legitimately—but we should recognise that there is a grooming risk there, particularly when that communication is embedded in a platform of wider functionality. There is an argument for a platform to consider all aspects of its functionality within the risk assessment process. Proactive scanning is a different issue.

There is a broader challenge for the Bill, and this takes us back to the fundamental objectives and the very welcome design based around systemic risk identification and mitigation. We know that right now, in respect of oral communications and livestream communications, the industry response is not as developed in terms of detecting and disrupting harm as it is for, say, text-based chat. In keeping with the risk assessment process, it should be clear that if platforms want to offer that functionality, they should have to demonstrate through the risk assessment process that they have high-quality, effective arrangements in place to detect and disrupt harm, and that should be the price of admission. If companies cannot demonstrate that, they should not be offering their services, because there is a high risk to children.