All 15 Barbara Keeley contributions to the Online Safety Act 2023

Online Safety Bill (First sitting)

Barbara Keeley Excerpts
Committee stage
Tuesday 24th May 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 24 May 2022
Mrs Miller

Q May I press a little further? The four new offences that you talked about, and others, and just the whole approach of regulation will lead more individuals to seek redress and support. You are not responsible for individuals; you are responsible for regulation, but you must have some thoughts on whether the current system of victim support will cope with the changes in the law and the new regulatory process. What might you want to see put in place to ensure that those victims are not all landing at your door, erroneously thinking that Ofcom will provide them with individual redress? Do you have any thoughts on that?

Kevin Bakhurst: One area that is very important and which is in the Bill and one of our responsibilities is to make sure there is a sufficiently robust and reactive complaints process from the platforms—one that people feel they can complain to and be heard—and an appeals process. We feel that that is in the Bill. We already receive complaints at Ofcom from people who have issues about platforms and who have gone to the platforms but do not feel their complaints have been properly dealt with or recognised. That is within the video-sharing platform regime. Those individual complaints, although we are not going to be very specific in looking at individual pieces of material per se, are very useful to alert us where there are issues around particular types of offence or harm that the platforms are not seen to be dealing with properly. It will be a really important part of the regime to make sure that platforms provide a complaints process that is easy to navigate and that people can use quite quickly and accessibly.

Richard Wronka: An additional point I would make, building on that, is that this is a really complex ecosystem. We understand that and have spent a lot of the last two or three years trying to get to grips with that complex ecosystem and building relationships with other participants in the ecosystem. It brings in law enforcement, other regulators, and organisations that support victims of crime or online abuse. We will need to find effective ways to work with those organisations. Ultimately, we are a regulator, so there is a limit to what we can do. It is important that those other organisations are able to operate effectively, but that is perhaps slightly outside our role.

Barbara Keeley (Worsley and Eccles South) (Lab)

Q To what extent do you think services should publish publicly the transparency and risk assessments that they will be providing to Ofcom?

Richard Wronka: I think our starting point here is that we think transparency is a really important principle within the regime—a fundamental principle. There are specific provisions in the Bill that speak to that, but more generally we are looking for this regime to usher in a new era of transparency across the tech sector, so that users and other participants in this process can be clearer about what platforms are doing at the moment, how effective that is and what more might be done in the future. That is something that will be a guiding principle for us as we pick up regulation.

Specifically, the Bill provides for transparency reports. Not all services in scope will need to provide transparency reports, but category 1 and 2 services will be required to produce annual transparency reports. We think that is really important. At the moment, risk assessments are not intended to be published—that is not provided for in the Bill—but the transparency reports will show the effectiveness of the systems and processes that those platforms have put in place.

Barbara Keeley

Q That was to be my next question: do you think it is an issue that category 1 services will not have to publish child risk assessments? It seems to me that it would be better if they did.

Richard Wronka: I think what is important for us as a regulator is that we are able to access those risk assessments; and for the biggest services, the category 1 services, we would be expecting to do that routinely through a supervisory approach. We might even do that proactively, or where services have come to us for dialogue around those—

Barbara Keeley

Q But would it not improve transparency if they did have to publish them? Why would they not want to publish them?

Richard Wronka: Some services may wish to publish the risk assessments. There is nothing in the Bill or in our regulated approach that would prevent that. At the moment, I do not see a requirement in the Bill to do that. Some services may have concerns about the level of confidential information in there. The important point for us is that we have access to those risk assessments.

Kevin Bakhurst: Picking up on the risk assessments, it is a tricky question because we would expect those assessments to be very comprehensive and to deal with issues such as how algorithms function, and so on. There is a balance between transparency, which, as Richard says, we will drive across the regime—to address information that can harm, or people who are trying to behave badly online or to game the system—and what the regulator needs in practical terms. I am sure the platforms will be able to talk to you more about that.

Barbara Keeley

Q May I ask some follow-up questions about resources and timing once the Bill has gone through? You said you are going to open a new digital and technology hub in Manchester, with the creation of 150 jobs. I have a couple of questions on that. Do you think that what is set out in the proposal will be enough? Will you have the resources to carry out the duties set out in the Bill? This is a follow-up point from my colleague’s question earlier.

There is also a question of timing. The reports suggested that the new hub and jobs will come into play in 2025. I am sure that everyone here wants to see the Bill taking effect sooner. Ofcom will need to do a lot of reviews and reporting in the first year after the Bill receives Royal Assent. How will that be possible if people are not in post until 2025?

Kevin Bakhurst: They are both big questions. I will take the first part and maybe Richard can take the second one about the timing. On the resourcing, it is important to say publicly that we feel strongly that, very unusually, we have had funding from Government to prepare for this regime. I know how unusual that is; I was at a meeting with the European regulators last week, and we are almost unique in that we have had funding and in the level of funding that we have had.

The funding has meant that we are already well advanced in our preparations. We have a team of around 150 people working on online safety across the organisation. A number are in Manchester, but some are in London or in our other offices around the UK. It is important to say that that funding has helped us to get off to a really strong start in recruiting people across the piece—not just policy people. Importantly, we have set up a new digital function within Ofcom and recruited a new chief technology officer, who came from Amazon Alexa, to head up that function.

The funding has allowed us to really push hard into this space, which is not easy, and to recruit some of the skills we feel we need to deliver this regime as effectively and rapidly as possible. I know that resourcing is not a matter within the Bill; it is a separate Treasury matter. Going forward though, we feel that, in the plans, we have sufficient resourcing to deliver what we are being asked to deliver. The team will probably double in size by the time we actually go live with the regime. It is a significant number of people.

Some significant new duties have been added in, such as fraudulent advertising, which we need to think carefully about. That is an important priority for us. It requires a different skillset. It was not in the original funding plan. If there are significant changes to the Bill, it is important that we remain alive to having the right people and the right number of people in place while trying to deliver with maximum efficiency. Do you want to talk about timing, Richard?

Richard Wronka: All I would add to that, Kevin, is that we are looking to front-load our recruitment so that we are ready to deliver on the Bill’s requirements as quickly as possible once it receives Royal Assent and our powers commence. That is the driving motivation for us. In many cases, that means recruiting people right now, in addition to the people we have already recruited to help with this.

Clearly there is a bit of a gating process for the Bill, so we will need a settled legislative framework and settled priority areas before we can get on with the consultation process. We will look to run that consultation process as swiftly as possible once we have those powers in place. We know that some stakeholders are very keen to see the Bill in place and others are less enthusiastic, so we need to run a robust process that will stand the test of time.

The Bill itself points us towards a phased process. We think that illegal content, thanks to the introduction of priority illegal content in the Bill, with those priority areas, is the area on which we can make the quickest progress as soon as the Bill achieves Royal Assent.

The Chair

Thank you. I intend to bring in the Minister at about 10 o’clock. Kirsty Blackman, Kim Leadbeater and Dean Russell have indicated that they wish to ask questions, so let us try to keep to time.

--- Later in debate ---
The Chair

And on the screen—[Interruption.] Uh-oh, it has frozen. We will have to come back to that. We will take evidence from the witnesses in the room until we have sorted out the problem with the screen.

Barbara Keeley

Q Do you think there is enough in the Bill to make sure that the voices of children at risk of online harms are heard? There is a super-complaints mechanism, but do you think it goes far enough for children, and are you confident that the regime will be able to quickly respond to new and emerging harms to children? Could Andy Burrows start?

Andy Burrows: Thank you for the question. We think that more could be built into the Bill to ensure that children’s needs and voices can be fed into the regime.

One of the things that the NSPCC would particularly like to see is provision for statutory user advocacy arrangements, drawing on the examples that we see in multiple other regulated sectors, where we have a model by which the levy on the firms that will cover the costs of the direct regulation also provides for funded user advocacy arrangements that can serve as a source of expertise, setting out children’s needs and experiences.

A comparison here would be the role that Citizens Advice plays in the energy and postal markets as the user voice and champion. We think that would be really important in bolstering the regulatory settlement. That can also help to provide an early warning function—particularly in a sector that is characterised by very rapid technological and market change—to identify new and emerging harms, and bolster and support the regulator in that activity. That, for us, feels like a crucial part of this jigsaw.

Given the very welcome systemic approach of the regime, that early warning function is particularly important, because there is the potential that if harms cannot be identified quickly, we will see a lag where whole regulatory cycles are missed. User advocacy can help to plug that gap, meaning that harms are identified at an earlier stage, and then the positive design of the process, with the risk profiles and company risk assessments, means that those harms can be built into that particular cycle.

Dame Rachel de Souza: I was very pleased when the Government asked me, when I came into the role, to look at what more could be done to keep children safe online and to make sure that their voices went right through the passage of the Bill. I am committed to doing that. Obviously, as Children’s Commissioner, my role is to elevate children’s voices. I was really pleased to convene a large number of charities, internet safety organisations and violence against women and girls experts in a joint briefing to MPs to try to get children’s voices over.

I worry that the Bill does not do enough to respond to individual cases of abuse and that it needs to do more to understand issues and concerns directly from children. Children should not have to exhaust the platforms’ ineffective complaints routes, which can take days, weeks or even months. I have just conducted a survey of 2,000 children and asked them about their experiences in the past month. Of those 2,000 children, 50% had seen harmful content and 40% had tried to get content about themselves removed and had not succeeded. For me, there is something really important about listening to children and taking their complaints into account. I know you have a busy day, but that is the key point that I want to get across.

The Chair

Lynn Perry is back on the screen—welcome. Would you like to introduce yourself for the record and then answer the question? [Interruption.] Oh, she has gone again. Apparently the problem is at Lynn’s end, so we will just have to live with it; there is nothing we can do on this side.

Barbara Keeley

Q Is the Bill future-proof? If you think it is not, how can we ensure that it is responsive to future risks and harms?

Andy Burrows: The systemic regime is important. That will help to ensure that the regime can be future-proofed; clearly, it is important that we are not introducing a set of proposals and then casting them in aspic. But there are ways that the Bill could be more strongly future-proofed, and that links to ensuring that the regime can effectively map on to the dynamics of the child sexual abuse problem in particular.

Let me give a couple of examples of where we think the Bill could be bolstered. One is around placing a duty on companies to consider the cross-platform nature of harm when performing their risk assessment functions, and having a broad, overarching duty to ask companies to work together to tackle the child sexual abuse threat. That is very important in terms of the current dynamics of the problem. We see, for example, very well-established grooming pathways, where abusers will look to exploit the design features of open social networks, such as on Instagram or Snapchat, before moving children and abuse on to perhaps live-streaming sites or encrypted messaging sites.

The cross-platform nature of the threat is only going to intensify in the years ahead as we start to look towards the metaverse, for example. It is clear that the metaverse will be built on the basis of being cross-platform and interdependent in nature. We can also see the potential for unintended consequences from other regulatory regimes. For example, the Digital Markets Act recently passed by the EU has provisions for interoperability. That effectively means that if I wanted to send you a message on platform A, you could receive it on platform B. There is a potential unintended consequence there that needs to be mitigated; we need to ensure that there is a responsibility to address the harm potential that could come from more interoperable services.

This is a significant area where the Bill really can be bolstered to address the current dynamics of the problem and ensure that legislation is as effective as it possibly can be. Looking to the medium to long term, it is crucial to ensure that we have arrangements that are commensurate to the changing nature of technology and the threats that will emerge from that.

Dame Rachel de Souza: A simple answer from me: of course we cannot future-proof it completely, because of the changing nature of online harms and technology. I talked to a large number of 16 to 21-year-olds about what they wished their parents had known about technology and what they had needed to keep them safe, and they listed a range of things. No. 1 was age assurance—they absolutely wanted good age assurance.

However, the list of harms and things they were coming across—cyber-flashing and all this—is very much set in time. It is really important that we deal with those things, but they are going to evolve and change. That is why we have to build in really good cross-platform work, which we have been talking about. We need these tech companies to work together to be able to stay live to the issues. We also need to make sure that we build in proper advocacy and listen to children and deal with the issues that come up, and that the Bill is flexible enough to be able to grow in that way. Any list is going to get timed out. We need to recognise that these harms are there and that they will change.

The Chair

I will bring in Kim Leadbeater and then Maria Miller and Kirsty Blackman, but I will definitely bring in the Minister at 10.45 am.

--- Later in debate ---
Barbara Keeley

Q I want to ask about the many tragic cases of teenagers who have died by suicide after viewing self-harm material online. Do you think coroners have sufficient powers to access digital data after the death of a child, and should parents have the right to access their children’s digital data following their death?

Dame Rachel de Souza: Baroness Kidron has done some fantastic work on this, and I really support her work. I want to tell you why. I am a former headteacher—I worked for 30 years in schools as a teacher and headteacher. Only in the last five or six years did I start seeing suicides of children and teenagers; I did not see them before. In the year just before I came to be Children’s Commissioner, there was a case of a year 11 girl from a vulnerable family who had a relationship with a boy, and it went all over the social media sites. She looked up self-harm material, went out to the woods and killed herself. She left a note that basically said, “So there. Look what you’ve done.”

It was just horrendous, having to pick up the family and the community of children around her, and seeing the long-term effects of it on her siblings. We did not see things like that before. I am fully supportive of Baroness Kidron and 5Rights campaigning on this issue. It is shocking to read about the enormous waiting and wrangling that parents must go through just to get their children’s information. It is absolutely shocking. I think that is enough from me.

Andy Burrows: I absolutely agree. One of the things we see at the NSPCC is the impact on parents and families in these situations. I think of Ian Russell, whose daughter Molly took her own life, and the extraordinarily protracted process it has taken to get companies to hand over her information. I think of the anguish and heartbreak that comes with this process. The Bill is a fantastic mechanism to be able to redress the balance in terms of children and families, and we would strongly support the amendments around giving parents access to that data, to ensure that this is not the protracted process that it currently all too often is.

Barbara Keeley

Just quickly, do coroners have sufficient powers? Should they have more powers to access digital data after the death of a child?

Andy Burrows: We can see what a protracted process it has been. There have been improvements to the process. It is currently a very lengthy process because of the mutual legal assistance treaty arrangements—MLAT, as they are known—by which injunctions have to be sought to get data from US companies. It has taken determination from some coroners to pursue cases, very often going up against challenges. It is an area where we think the arrangements could certainly be streamlined and simplified. The balance here should shift toward giving parents and families access to the data, so that the process can be gone through quickly and everything can be done to ease the heartbreak for families having to go through those incredibly traumatic situations.

Alex Davies-Jones

Q Very briefly, Dame Rachel, I will build on what you were just saying, based on your experience as a headteacher. When I make my school visits, the teachers overwhelmingly tell me how, on a daily basis, they have to deal with the fallout from an issue that has happened online or on social media. On that matter, the digital media literacy strategy is being removed from the Bill. What is your thinking on that? How important do you see a digital media literacy strategy being at the heart of whatever policy the Government try to make regarding online safety for children?

Dame Rachel de Souza: There is no silver bullet. This is now a huge societal issue and I think that some of the things that I would want to say would be about ensuring that we have in our educational arsenal, if you like, a curriculum that has a really strong digital media literacy element. To that end, the Secretary of State for Education has just asked me to review how online harms and digital literacy are taught in schools—reviewing not the curriculum, but how good the teaching is and what children think about how the subject has been taught, and obviously what parents think, too.

I would absolutely like to see the tech companies putting some significant funding into supporting education of this kind; it is exactly the kind of thing that they should be working together to provide. So we need to look at this issue from many aspects, not least education.

Obviously, in a dream world I would like really good and strong digital media literacy in the Bill, but actually it is all our responsibility. I know from my conversations with Nadhim Zahawi that he is very keen that this subject is taught through the national curriculum, and very strongly.

--- Later in debate ---
The Chair

Barbara, you have just a couple of minutes.

Barbara Keeley

Q Can I ask about children’s risk assessments? Who in your organisation will write the children’s risk assessments, and at what level in your organisation will they be signed off?

Katy Minshall: At present, we have a range of risk assessment processes. We have a risk committee of the board. We do risk assessments when we make a change about—

Barbara Keeley

Q No, I mean the children’s risk assessment you will have to do as part of what the Bill will bring in.

Katy Minshall: At present, we do not have a specific individual designated to do the children’s risk assessment. The key question is how much does Ofcom’s guidance on risk assessments—once we see it—intersect with our current processes versus changes we would need to make to our risk assessment processes?

Barbara Keeley

Q Okay. At what level in the organisation do you anticipate children’s risk assessment would be signed off? Clearly, this is a very important aspect of the Bill.

Katy Minshall: I would have to go away and review the Bill. I do not know whether a specific level is set out in the Bill, but we would want to engage with the regulation and requirements set for companies such as Twitter. However, it would be expected that that is what we would—

Barbara Keeley

Q Do you think it should be signed off at a senior level—board level—in your organisation?

Katy Minshall: Already all the biggest decisions that we make as a company are signed off at the most senior level. We report to our chief executive, Parag Agrawal, and then to the board. As I say, there is a risk committee of the board, so I expect that we would continue to make those decisions at the highest level.

Ben Bradley: It is broadly the same from a TikTok perspective. Safety is a priority for every member of the team, regardless of whether they are in a specific trust and safety function. In terms of risk assessments, we will see from the detail of the Bill at what level they need to be signed off, but our CEO has been clear in interviews that trust and safety is a priority for him and everyone at TikTok, so it would be something to which we are all committed.

Barbara Keeley

Do you think you would be likely to sign it off at the board level—

The Chair

Sorry, I have to interrupt you there. I call the Minister.

Online Safety Bill (Second sitting)

Barbara Keeley Excerpts
Committee stage
Tuesday 24th May 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 24 May 2022
The Chair

Thank you very much. Ms Foreman, do you want to add anything to that? You do not have to.

Becky Foreman: I do not have anything to add.

Barbara Keeley (Worsley and Eccles South) (Lab)

Q I want to come back to transparency, which we touched on with my colleague Alex Davies-Jones earlier. Clearly, it is very important, and I think we could take a big step forward with the Bill. I want to ask you about child risk assessments, and whether they should be available publicly. I also want to ask about reports on the measures that you will have to take, as platforms, to manage the risks and mitigate the impact of harm. Harm is occurring at the moment—for example, content that causes harm is being left up. We heard earlier from the NSPCC that Facebook would not take down birthday groups for eight, nine and 10-year-old children, when it is known what purpose those birthday groups were serving for those young children. I guess my question on transparency is, “Can’t you do much better, and should there be public access to reports on the level of harm?”

Richard Earley: There are quite a few different questions there, and I will try to address them as briefly as I can. On the point about harmful Facebook groups, if a Facebook group is dedicated to breaking any of our rules, we can remove that group, even if no harmful content has been posted in it. I understand that was raised in the context of breadcrumbing, so trying to infer harmful intent from innocuous content. We have teams trying to understand how bad actors circumvent our rules, and to prevent them from doing that. That is a core part of our work, and a core part of what the Bill needs to incentivise us to do. That is why we have rules in place to remove groups that are dedicated to breaking our rules, even if no harmful content is actually posted in them.

On the question you asked about transparency, the Bill does an admirable job of trying to balance different types of transparency. There are some kinds of transparency that we believe are meaningful and valid to give to users. I gave the example a moment ago of explaining why a piece of content was removed and which of our community standards it broke. There is other transparency that we think is best given in a more general sense. We have our transparency report, as I said, where we give the figures for how much content we remove, how much of it we find ourselves—

Barbara Keeley

Q I am not talking here about general figures for what you have removed. I am talking about giving real access to the data on the risks of harm and the measures to mitigate harm. You could make those reports available to academics—we could find a way of doing that—and that would be very valuable. Surely what we want to do is to generate communities, including academics and people who have the aim of improving things, but you need to give them access to the data. You are the only ones who have access to the data, so it will just be you and Ofcom. A greater community out there who can help to improve things will not have that access.

Richard Earley: I completely agree. Apologies for hogging more time, but I think you have hit on an important point there, which is about sharing information with researchers. Last year, we gave data to support the publishing of more than 400 independent research projects, carried out along the lines you have described here. Just yesterday, we announced an expansion of what is called our Facebook open research tool, which expands academics’ ability to access data about advertising.

Barbara Keeley

Q My question is, will you publish the risk assessment and the measures you are taking to mitigate?

Richard Earley: Going back to how the Bill works, when it comes to—

Barbara Keeley

No, I am not just asking about the Bill. Will you do that?

Richard Earley: We have not seen the Ofcom guidance on what those risk assessments should contain yet, so it is not possible to say. I think more transparency should always be the goal. If we can publish more information, we will do so.

Barbara Keeley

Q It would be good to have that goal. Can I come to you, Katie O’Donovan?

Katie O'Donovan: To begin with, I would pick up on the importance of transparency. We at Google and YouTube publish many reports on a quarterly or annual basis to help understand the actions we are taking. That ranges from everything on YouTube, where we publish by country the content we have taken down, why we have taken it down, how it was detected and the number of appeals. That is incredibly important information. It is good for researchers and others to have access to that.

We also do things around ads that we have removed and legal requests from different foreign Governments, which again has real validity. I think it is really important that Ofcom will have access to how we work through this—

Barbara Keeley

Q I was not just asking about Ofcom; I was wanting to go further than that and have wider access.

Katie O'Donovan: I do not want to gloss over the Ofcom point; I want to dwell on it for a second. In anticipation of this Bill, we were able to have conversations with Ofcom about how we work, the risks that we see and how our systems detect that. Hopefully, that is very helpful for Ofcom to understand how it will audit and regulate us, but it also informs how we need to think and improve our systems. I do think that is important.

We make a huge amount of training data available at Google. We publish a lot of shared APIs to help people understand what our data is doing. We are very open to publishing and working with academics.

It is difficult to give a broad statement without knowing the detail of what that data is. One thing I would say—it always sounds a bit glib when people in my position say this—is that, in some cases, we do need to be limited in explaining exactly how our systems work to detect bad content. On YouTube, you have very clear community guidelines, which we know we have to publish, because people have a right to know what content is allowed and what is not, but we will find people who go right up to the line of that content very deliberately and carefully—they understand that, almost from a legal perspective. When it comes to fraudulent services and our ads, we have also seen people pivot the way that they attempt to defraud us. There need to be some safe spaces to share that information. Ofcom is helpful for that too.

The Chair

Okay. Kim Leadbeater, one very quick question. We must move on—I am sorry.

--- Later in debate ---
Kim Leadbeater

That is fine.

Professor Clare McGlynn: I know that there was a discussion this morning about age assurance, which obviously targets children’s access to pornography. I would emphasise that age assurance is not a panacea for the problems with pornography. We are so worried about age assurance only because of the content that is available online. The pornography industry is quite happy with age verification measures. It is a win-win for them: they get public credibility by saying they will adopt it; they can monetise it, because they are going to get more data—especially if they are encouraged to develop age verification measures, which of course they have been; that really is putting the fox in charge of the henhouse—and they know that it will be easily evaded.

One of the most recent surveys of young people in the UK was of 16 and 17-year-olds: 50% of them had used a VPN, which avoids age verification controls, and 25% more knew about that, so 75% of those older children knew how to evade age assurance. This is why the companies are quite happy—they are going to make money. It will stop some people stumbling across it, but it will not stop most older children accessing pornography. We need to focus on the content, and when we do that, we have to go beyond age assurance.

You have just heard Google talking about how it takes safety very seriously. Rape porn and incest porn are one click away on Google. They are freely and easily accessible. There are swathes of that material on Google. Twitter is hiding in plain sight, too. I know that you had a discussion about Twitter this morning. I, like many, thought, “Yes, I know there is porn on Twitter,” but I must confess that until doing some prep over the last few weeks, I did not know the nature of that porn. For example, “Kidnapped in the wood”; “Daddy’s little girl comes home from school; let’s now cheer her up”; “Raped behind the bin”—this is the material that is on Twitter. We know there is a problem with Pornhub, but this is what is on Twitter as well.

As the Minister mentioned this morning, Twitter says you have to be 13, and you have to be 18 to try to access much of this content, but you just put in whatever date of birth is necessary—it is that easy—and you can get all this material. It is freely and easily accessible. Those companies are hiding in plain sight in that sense. The age verification and age assurance provisions, and the safety duties, need to be toughened up.

To an extent, I think this will come down to the regulator. Is the regulator going to accept Google’s SafeSearch as satisfying the safety duties? I am not convinced, because of the easy accessibility of the rape and incest porn I have just talked about. I emphasise that incest porn is not classed as extreme pornography, so it is not a priority offence, but there are swathes of that material on Pornhub as well. In one of the studies that I did, we found that one in eight titles on the mainstream pornography sites described sexually violent material, and the incest material was the highest category in that. There is a lot of that around.

Barbara Keeley

Q We are talking here about pornography when it is hosted on mainstream websites, as opposed to pornographic websites. Could I ask you to confirm what more, specifically, you think the Bill should do to tackle pornography on mainstream websites, as you have just been describing with Twitter? What should the Bill be doing here?

Professor Clare McGlynn: In many ways, it is going to be up to the regulator. Is the regulator going to deem that things such as SafeSearch, or Twitter’s current rules about sensitive information—which rely on the host to identify their material as sensitive—satisfy their obligations to minimise and mitigate the risk? That is, in essence, what it will all come down to.

Are they going to take the terms and conditions of Twitter, for example, at face value? Twitter’s terms and conditions do say that they do not want sexually violent material on there, and they even say that it is because they know it glorifies violence against women and girls, but this material is there and does not appear to get swiftly and easily taken down. Even when you try to block it—I tried to block some cartoon child sexual abuse images, which are easily available on there; you do not have to search for them very hard, it literally comes up when you search for porn—it brings you up five or six other options in case you want to report them as well, so you are viewing them as well. Just on the cartoon child sexual abuse images, before anyone asks, they are very clever, because they are just under the radar of what is actually a prohibited offence.

It is not necessarily that there is more that the Bill itself could do, although the code of practice would ensure that they have to think about these things more. They have to report on their transparency and their risk assessments: for example, what type of content are they taking down? Who is making the reports, and how many are they upholding? But it is then on the regulator as to what they are going to accept as acceptable, frankly.

Barbara Keeley

Do any other panellists want to add to that?

Janaya Walker: Just to draw together the questions about pornography and the question you asked about children, I wanted to highlight one of the things that came up earlier, which was the importance of media literacy. We share the view that that has been rolled back from earlier versions of the draft Bill.

There has also been a shift, in that the emphasis of the draft Bill was also talking about the impact of harm. That is really important when we are talking about violence against women and girls, and what is happening in the context of schools and relationship and sex education. Where some of these things like non-consensual image sharing take place, the Bill as currently drafted talks about media literacy and safe use of the service, rather than the impact of such material and really trying to point to the collective responsibility that everyone has as good digital citizens—in the language of Glitch—in terms of talking about online violence against women and girls. That is an area in which the Bill could be strengthened from the way it is currently drafted.

Jessica Eagelton: I completely agree with the media literacy point. In general, we see very low awareness of what tech abuse is. We surveyed some survivors and did some research last year—a public survey—and almost half of survivors told no one about the abuse they experienced online at the hands of their partner or former partner, and many of the survivors we interviewed did not understand what it was until they had come to Refuge and we had provided them with support. There is an aspect of that to the broader media literacy point as well: increasing awareness of what is and is not unacceptable behaviour online, and encouraging members of the public to report that and call it out when they see it.

Barbara Keeley

Q Thank you. Can I ask for a bit more detail on a question that you touched on earlier with my colleague Kirsty Blackman? It is to Professor McGlynn, really. I think you included in your written evidence to the Committee a point about using age and consent verification for pornography sites for people featured in the content of the site—not the age verification assurance checks on the sites, but for the content. Could I just draw out from you whether that is feasible, and would it be retrospective for all videos, or just new ones? How would that work?

Professor Clare McGlynn: Inevitably, it would have to work from any time that that requirement was put in place, in reality. That measure is being discussed in the Canadian Parliament at the moment—you might know that Pornhub’s parent company, MindGeek, is based in Canada, which is why they are doing a lot of work in that regard. The provision was also put forward by the European Parliament in its debates on the Digital Services Act. Of course, any of these measures are possible; we could put it into the Bill that that will be a requirement.

Another way of doing it, of course, would be for the regulator to say that one of the ways in which Pornhub, for example—or XVideos or xHamster—should ensure that they are fulfilling their safety duties is by verifying the age and consent of those featured in the videos that are uploaded. The flipside of that is that we could also introduce an offence for uploading a video and falsely representing that the person in the video had given their consent to that. That would mirror offences in the Fraud Act 2006.

The idea is really about introducing some element of friction so that there is a break before images are uploaded. For example, with intimate image abuse, which we have already talked about, the revenge porn helpline reports that for over half of the cases of such abuse that it deals with, the images go on to porn websites. So those aspects are really important. It is not just about all porn videos; it is also about trying to reduce the distribution of non-consensual videos.

Nick Fletcher (Don Valley) (Con)

Q I think that it would have been better to hear from you three before we heard from the platforms this morning. Unfortunately, you have opened my eyes to a few things that I wish I did not have to know about—I think we all feel the same.

I am concerned about VPNs. Will the Bill stop anyone accessing through VPNs? Is there anything we can do about that? I googled “VPNs” to find out what they were, and apparently there is a genuine need for them when using public networks, because it is safer. Costa Coffee suggests that people do so, for example. I do not know how we could work that.

You have obviously educated me, and probably some of my colleagues, about some of the sites that are available. I do not mix in circles where I would be exposed to that, but obviously children and young people do and there is no filter. If I did know about those things, I would probably not speak to my colleagues about it, because that would probably not be a good thing to do, but younger people might think it is quite funny to talk about. Do you think there is an education piece there for schools and parents? Should these platforms be saying to them, “Look, this is out there, even though you might not have heard of it—some MPs have not heard of it.” We ought to be doing something to protect children by telling parents what to look out for. Could there be something in the Bill to force them to do that? Do you think that would be a good idea? There is an awful lot there to answer—sorry.

Professor Clare McGlynn: On VPNs, I guess it is like so much technology: obviously it can be used for good, but it can also be used to evade regulations. My understanding is that individuals will be able to use a VPN to avoid age verification. On that point, I emphasise that in recent years Pornhub, at the same time as it was talking to the Government about developing age verification, was developing its own VPN app. At the same time it was saying, “Of course we will comply with your age verification rules.”

Don’t get me wrong: the age assurance provisions are important, because they will stop people stumbling across material, which is particularly important for the very youngest. In reality, 75% know about VPNs now, but once it becomes more widely known that this is how to evade it, I expect that all younger people will know how to do so. I do not think there is anything else you can do in the Bill, because you are not going to outlaw VPNs, for the reasons you identified—they are actually really important in some ways.

That is why the focus needs to be on content, because that is what we are actually concerned about. When you talk about media literacy and understanding, you are absolutely right, because we need to do more to educate all people, including young people—it does not just stop at age 18—about the nature of the pornography and the impact it can have. I guess that goes to the point about media literacy as well. It does also go to the point about fully and expertly resourcing sex and relationships education in school. Pornhub has its own sex education arm, but it is not the sex education arm that I think many of us would want to be encouraging. We need to be doing more in that regard.

--- Later in debate ---
The Chair

We also have Dr Rachel O’Connell, who is the CEO of TrustElevate. Good afternoon.

Barbara Keeley

Q Does the Bill differentiate enough between services that have different business models? If not, what do you think are the consequences of the lack of differentiation, and where could more differentiation be introduced? Shall we start with you, Jared Sine?

Jared Sine: Sure—thank you for the question. Business models play a pretty distinct role in the incentives of the companies. When we talk to people about Match Group and online dating, we try to point out a couple of really important things that differentiate what we do in the dating space from what many technology companies are doing in the social media space. One of those things is how we generate our revenue. The overwhelming majority of it is subscription-based, so we are focused not on time on platform or time on device, but on whether you are having a great experience, because if you are, you are going to come back and pay again, or you are going to continue your subscription with us. That is a really big differentiator, in terms of the business model and where incentives lie, because we want to make sure they have a great experience.

Secondly, we know we are helping people meet in real life. Again, if people are to have a great experience on our platforms, they are going to have to feel safe on them, so that becomes a really big focus for us.

Finally, we are more of a one-to-one platform, so people are not generally communicating to large groups, so that protects us from a lot of the other issues you see on some of these larger platforms. Ultimately, what that means is that, for our business to be successful, we really have to focus on safety. We have to make sure users come, have a good, safe experience, and we have to have tools for them to use and put in place to empower themselves so that they can be safe and have a great experience. Otherwise, they will not come back and tell their friends.

The last thing about our platforms is that ultimately, if they are successful, our users leave them because they are engaged in a relationship, get married or just decide they are done with dating all together—that happens on occasion, too. Ultimately, our goal is to make sure that people have that experience, so safety becomes a core part of what we do. Other platforms are more focused on eyeballs, advertising sales and attention—if it bleeds, it leads—but those things are just not part of the equation for us.

Barbara Keeley

Q And do you think the Bill differentiates enough? If not, what more could be done in it?

Jared Sine: We are very encouraged by the Bill. We think it allows for different codes of conduct or policy, as it relates to the various different types of businesses, based on the business models. That is exciting for us because we think that ultimately those things need to be taken into account. What are the drivers and the incentives in place for those businesses? Let us make sure that we have regulations in place that address those needs, based on the approaches of the businesses.

The Chair

Nima, would you like go next?

Nima Elmi: Thank you very much for inviting me along to this discussion. Building on what Jared said, currently the Bill is not very clear in terms of references to categorisations of services. It clusters together a number of very disparate platforms that have different platform designs, business models and corporate aims. Similarly to Match Group, our platform is focused much more on one-to-one communications and subscription-based business models. There is an important need for the Bill to acknowledge these different types of platforms and how they engage with users, and to ensure appropriate guidance from Ofcom on how they should be categorised, rather than clustering together a rather significant number of companies that have very different business aims in this space.

The Chair

Dr O’Connell, would you like to answer?

Dr Rachel O'Connell: Absolutely. I think those are really good points that you guys have raised. I would urge a little bit of caution around that though, because I think about Yellow Tinder, which was the Tinder for teens, which has been rebranded as Yubo. It transgresses: it is a social media platform; it enables livestreaming of teens to connect with each other; it is ultimately for dating. So there is a huge amount of risk. It is not a subscription-based service.

I get the industry drive to say, “Let’s differentiate and let’s have clarity”, but in a Bill, essentially the principles are supposed to be there. Then it is for the regulator, in my view, to say, at a granular level, that when you conduct a risk impact assessment, you understand whether the company has a subscription-based business model, so the risk is lower, and also if there is age checking to make sure those users are 18-plus. However, you must also consider that there are teen dating sites, which would definitely fall under the scope of this Bill and the provisions that it is trying to make to protect kids and to reduce the risk of harm.

While I think there is a need for clarity, I would urge caution. For the Bill to have some longevity, being that specific about the categorisations will have some potential unintended consequences, particularly as it relates to children and young people.

Barbara Keeley

Q The next question is really about age verification, which you have touched on, so let us start with you, Dr O’Connell. What do you think the Bill should contain to enable age verification or the age assurance needed to protect children online?

Dr Rachel O'Connell: There is a mention of age assurance in the Bill. There is an opportunity to clarify that a little further, and also to bring age verification services under the remit of the Bill, as they are serving and making sure that they are mitigating risk. There was a very clear outline by Elizabeth Denham when we were negotiating the Digital Economy Act in relation to age verification and adult content sites; she was very specific when she came to Committee and said it should be a third party conducting the checks. If you want to preserve privacy and security, it should be a third-party provider that runs the checks, rather than companies saying, “You know what? We’ll track everybody for the purposes of age verification.”

There needs to be a clear delineation, which currently in clause 50 is not very clear. I would recommend that that be looked at again and that some digital identity experts be brought into that discussion, so that there is a full appreciation. Currently, there is a lot of latitude for companies to develop their own services in-house for age verification, without, I think, a proper risk assessment of what that might mean for end users in terms of eroding their privacy.

Barbara Keeley

Q TikTok were talking to us earlier about their age verification. If companies do it themselves rather than it being a third party, where does that fall down?

Dr Rachel O'Connell: That means you have to track and analyse people’s activities and you are garnering a huge amount of data. If you are then handling people under the age of 13, under the Data Protection Act, you must obtain parental consent prior to processing data. By definition, you have to gather the data from parents. I have been working in this space for 25 years. I remember, in 2008, when the Attorneys General brought all the companies together to consider age verification as part of the internet safety technical task force, the arguments of industry—I was in industry at the time—were that it would be overly burdensome and a privacy risk. Looking back through history, industry has said that it does not want to do that. Now, there is an incentive to potentially do that, because you do not have to pay for a third party to do it, but what are the consequences for the erosion of privacy and so on?

I urge people to think carefully about that, in particular when it comes to children. It would require tracking children’s activities over time. We do not want our kids growing up in a surveillance society where they are being monitored like that from the get-go. The advantage of a third-party provider is that they can have a zero data model. They can run the checks without holding the data, so you are not creating a data lake. The parent or child provides information that can be hashed on the device and checked against data sources that are hashed, which means there is no knowledge. It is a zero data model.

The information resides on the user’s device, which is pretty cool. The checks are done, but there is no exposure and no potential for man-in-the-middle attacks. The company then gets a token that says “This person is over 18”, or “This person is below 12. We have verified parental responsibility and that verified parent has given consent.” You are dealing with tokens that do not contain any personal information, which is a far better approach than companies developing things in-house.
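[To make the flow Dr O'Connell outlines concrete, here is a minimal illustrative sketch in Python. Everything in it is hypothetical: the salt, the record format, the reference data and the token fields are assumptions for illustration, not any real provider's scheme. It only shows the shape of a zero data check: details hashed on the user's device, compared against pre-hashed reference records held by a third-party verifier, with nothing but an age-band token returned to the platform.]

```python
from typing import Optional
import hashlib
import hmac

# Hypothetical sketch of the "zero data" age check described above: the raw
# name and date of birth are hashed on the user's device, the third-party
# verifier holds only pre-hashed reference records (no plain-text data lake),
# and the platform receives a token carrying an age band and consent flag
# but no personal information.

SHARED_SALT = b"example-salt"  # assumed to be agreed between device app and verifier


def hash_on_device(full_name: str, date_of_birth: str) -> str:
    """Hash the user's details locally; the raw values never leave the device."""
    record = f"{full_name}|{date_of_birth}".encode("utf-8")
    return hmac.new(SHARED_SALT, record, hashlib.sha256).hexdigest()


# Reference records the verifier can match against, stored only as hashes.
VERIFIER_HASHED_RECORDS = {
    hash_on_device("Alex Example", "2012-05-01"): {"age_band": "under_13", "parental_consent_verified": True},
    hash_on_device("Sam Example", "1990-03-12"): {"age_band": "over_18", "parental_consent_verified": None},
}


def issue_token(device_hash: str) -> Optional[dict]:
    """Return an age-band token with no personal data, or None if no match is found."""
    match = VERIFIER_HASHED_RECORDS.get(device_hash)
    if match is None:
        return None
    return dict(match)  # contains only the age band and consent flag


if __name__ == "__main__":
    token = issue_token(hash_on_device("Alex Example", "2012-05-01"))
    print(token)  # {'age_band': 'under_13', 'parental_consent_verified': True}
```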

Barbara Keeley

Q I think the TikTok example was looking at materials and videos and seeing whether they mention school or birthdays as a way of verifying age. As you say, that does involve scanning the child’s data.

The Chair

Q Can I see if Ms Elmi wants to come in? She tends to get left out on a limb, on the screen. Are you okay down there? Do you need to come in on this, or are you happy?

Nima Elmi: Yes, I am. I have nothing to add.

Barbara Keeley

Q Jared Sine, did you have anything to add?

Jared Sine: Sure. I would add a couple of thoughts. We run our own age verification scans, which we do through the traditional age gate but also through a number of other scans that we run.

Again, online dating platforms are a little different. We warn our users upfront that, as they are going to be meeting people in real life, there is a fine balance between safety and privacy, and we tend to lean a little more towards safety. We announce to our users that we are going to run message scans to make sure there is no inappropriate behaviour. In fact, one of the tools we have rolled out is called “Are you sure? Does this bother you?”, through which our AI looks at the message a user is planning to send and, if it is an inappropriate message, a flag will pop up that says, “Are you sure you want to send this?” Then, if they go ahead and send it, the person receiving it at the other end will get a pop-up that says, “This may not be something you want to see. Go ahead and click here if you want to.” If they open it, they then get another pop-up that asks “Does this bother you?” and, if it does, you can report the user immediately.

We think that is an important step to keep our platform safe. We make sure our users know that it is happening, so it is not under the table. However, we think there has to be a balance between safety and privacy, especially when we have users who are meeting in person. We have actually demonstrated on our platforms that this reduces harassment and behaviour that would otherwise be untoward or that you would not want on the platform.

We think that we have to be careful not to tie the hands of industry to be able to come up with technological solutions and advances that can work side by side with third-party tools and solutions. We have third-party ID verification tools that we use. If we identify or believe a user is under the age of 18, we push them through an ID verification process.

The other thing to remember, particularly as it relates to online dating, is that companies such as ours and Bumble have done the right thing by saying “18-plus only on our platforms”. There is no law that says that an online dating platform has to be 18-plus, but we think it is the right thing to do. I am a father of five kids; I would not want kids on my platform. We are very vigilant in taking steps to make sure we are using the latest and greatest tools available to try to make sure that our platforms are safe.

Maria Miller Portrait Mrs Miller
- Hansard - - - Excerpts

Q Rachel, we have, in you, what we are told is a leading, pre-eminent authority on the issue of age verification, so we are listening very carefully to what you say. I am thinking about the evidence we had earlier today, which said that it is reasonably straightforward for a large majority of young people to subvert age verification through the use of VPNs. You have been advocating third-party verification. How could we also deal with this issue of subverting the process through the use of VPNs?

Dr Rachel O'Connell: I am the author of the technical standard PAS 1296, an age checking code of practice, which is becoming a global standard at the moment. We worked a lot with privacy and security and identity experts. It should have taken nine months, but it took a bit longer. There was a lot of thought that went into it. Those systems were developed to, as I just described, ensure a zero data, zero knowledge kind of model. What they do is enable those verifications to take place and reduce the requirement. There is a distinction between monitoring your systems, as was said earlier, for age verification purposes and abuse management. They are very different. You have to have abuse management systems. It is like saying that if you have a nightclub, you have to have bouncers. Of course you have to check things out. You need bouncers at the door. You cannot let people go into the venue, then afterwards say that you are spotting bad behaviour. You have to check at the door that they are the appropriate age to get into the venue.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q I have one last question. Rhiannon, a suggestion was made earlier by Dr Rachel O’Connell about age verification and only allowing children to interact with other children whose age is verified within a certain area. Do you think that would help to prevent online grooming?

Rhiannon-Faye McDonald: It is very difficult. While I feel strongly about protecting children from encountering perpetrators, I also recognise that children need to have freedoms and the ability to use the internet in the ways that they like. I think if that was implemented and it was 100% certain that no adult could pose as a 13-year-old and therefore interact with actual 13-year-olds, that would help, but I think it is tricky.

Susie Hargreaves: One of the things we need to be clear about, particularly where we see children groomed—we are seeing younger and younger children—is that we will not ever sort this just with technology; the education piece is huge. We are now seeing children as young as three in self-generated content, and we are seeing children in bedrooms and domestic settings being tricked, coerced and encouraged into engaging in very serious sexual activities, often using pornographic language. Actually, a whole education piece needs to happen. We can put filters and different technology in place, but remember that the IWF acts after the event—by the time we see this, the crime has been committed, the image has been shared and the child has already been abused. We need to bump up the education side, because parents, carers, teachers and children themselves have to be able to understand the dangers of being online and be supported to build their resilience online. They are definitely not to be blamed for things that happen online. From Rhiannon’s own story, how quickly it can happen, and how vulnerable children are at the moment—I don’t know.

Rhiannon-Faye McDonald: For those of you who don’t know, it happened very quickly to me, within the space of 24 hours, from the start of the conversation to the perpetrator coming to my bedroom and sexually assaulting me. I have heard other instances where it has happened much more quickly than that. It can escalate extremely quickly.

Just to add to Susie’s point about education, I strongly believe that education plays a huge part in this. However, we must be very careful in how we educate children, so that the focus is not on how to keep themselves safe, because that puts the responsibility on them, which in turn increases the feelings of responsibility when things do go wrong. That increased feeling of responsibility makes it less likely that they will disclose that something has happened to them, because they feel that they will be blamed. It will decrease the chance that children will tell us that something has happened.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

Q Just to follow up on a couple of things, mainly with Susie Hargreaves. You mentioned reporting mechanisms and said that reporting will be a step forward. However, the Joint Committee on the draft Bill recommended that the highest-risk services should have to report quarterly data to Ofcom on the results of their child sexual exploitation and abuse removal systems. What difference would access to that kind of data make to your work?

Susie Hargreaves: We already work with the internet industry. They currently take our services and we work closely with them on things such as engineering support. They also pay for our hotline, which is how we find child sexual abuse. However, the difference it would make is that we hope then to be able to undertake work where we are directly working with them to understand the level of their reports and data within their organisations.

At the moment, we do not receive that information from them. It is very much that we work on behalf of the public and they take our services. However, if we were suddenly able to work directly with them—have information about the scale of the issue within their own organisations and work more directly on that—then that would help to feed into our work. It is a very iterative process; we are constantly developing the technology to deal with the current threats.

It would also help us by giving us more intelligence and by allowing us to share that information, on an aggregated basis, more widely. It would certainly also help us to understand that they are definitely tackling the problem. We do believe that they are tackling the problem, because it is not in their business interests not to, but it just gives a level of accountability and transparency that does not exist at the moment.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

Q You also said earlier that there was nothing in the Bill on co-designation—nothing to recognise the Internet Watch Foundation’s 25 years of experience. Do you still expect to be co-designated as a regulator by Ofcom, and if so, what do you expect your role to be?

Susie Hargreaves: At the moment, there is nothing on the face of the Bill on co-designation. We do think that child sexual abuse is different from other types of harm, and when you think about the huge number of harms, and the scale and complexity of the Bill, Ofcom has so much to work with.

We have been working with Ofcom for the past year to look at exactly what our role would be. However, because we are the country’s experts on dealing with child sexual abuse material, because we have the relationships with the companies, and because we are an internationally renowned organisation, we are able to have that trusted relationship and then undertake a number of functions for Ofcom. We could help to undertake specific investigations, help update the code, or provide that interface between Ofcom and the companies where we undertake that work on their behalf.

We very much feel that we should be doing that. It is not about being self-serving, but about recognising the track record of the organisation and the fact that the relationships and technology are in place. We are already experts in this area, so we are able to work directly with those companies because we already work with them and they trust us. Basically, we have a memorandum of understanding with the CPS and the National Police Chiefs’ Council that protects our staff from prosecution, but the companies all work with us on a voluntary basis. They already work with us, they trust our data, and we have that unique relationship with them.

We are able to provide that service to take the pressure off Ofcom because we are the experts in the field. We would like that clarified because we want this to be right for children from day one—you cannot get it wrong when dealing with child sexual abuse. We must not undo or undermine the work that has happened over the last 25 years.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

Q Just to be clear, is there uncertainty somewhere in there? I am just trying to comprehend.

Susie Hargreaves: There is uncertainty, because we do not know exactly what our relationship with Ofcom is going to be. We are having discussions and getting on very well, but we do not know anything about what the relationship will be or what the criteria and timetable for the relationship are. We have been working on this for nearly five years. We have analysts who work every single day looking at child sexual abuse; we have 70 members of staff, and about half of them look at child sexual abuse every day. They are dealing with some of the worst material imaginable, they are already in a highly stressful situation and they have clear welfare needs; uncertainty does not help. What we are looking for is certainty and clarity that child sexual abuse is so important that it is included on the face of the Bill, and that should include co-designation.

None Portrait The Chair
- Hansard -

Thank you. One question from Kim Leadbeater.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Barbara Keeley?

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

Q I have a really simple question. You have touched on the balance between free speech rights and the rights of people who are experiencing harassment, but does the Bill do enough to protect human rights?

Ellen Judson: At the moment, no. The rights that are discussed in the Bill at the minute are quite limited: primarily, it is about freedom of expression and privacy, and the way that protections around privacy have been drafted is less strong than for those around freedom of expression. Picking up on the question about setting precedents, if we have a Bill that is likely to lead to more content moderation and things like age verification and user identity verification, and if we do not have strong protections for privacy and anonymity online, we are absolutely setting a bad precedent. We would want to see much more integration with existing human rights legislation in the Bill.

Kyle Taylor: All I would add is that if you look at the exception for content of democratic importance, and the idea of “active political issue”, right now, conversion therapy for trans people—that has been described by UN experts as torture—is an active political issue. Currently, the human rights of trans people are effectively set aside because we are actively debating their lives. That is another example of how minority and marginalised people can be negatively impacted by this Bill if it is not more human rights-centred.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Let me start with this concept—this suggestion, this claim—that there is special protection for politicians and journalists. I will come to clause 50, which is the recognised news publisher exemption, in a moment, but I think you are referring to clauses 15 and 16. If we turn to those clauses and read them carefully, they do not specifically protect politicians and journalists, but “content of democratic importance” and “journalistic content”. It is about protecting the nature of the content, not the person who is speaking it. Would you accept that?

Ellen Judson: I accept that that is what the Bill currently says. Our point was thinking about how it will be implemented in practice. If platforms are expected to prove to a regulator that they are taking certain steps to protect content of democratic importance—in the explanatory notes, that is content related to Government policy and political parties—and they are expected to prove that they are taking a special consideration of journalistic content, the most straightforward way for them to do that will be in relation to journalists and politicians. Given that it is such a broad category and definition, that seems to be the most likely effect of the regime.

Kyle Taylor: It is potentially—

Online Safety Bill (Sixth sitting)

Barbara Keeley Excerpts
Committee stage
Tuesday 7th June 2022

(1 year, 11 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 7 June 2022 - (7 Jun 2022)
None Portrait The Chair
- Hansard -

With this it will be convenient to discuss amendment 28, in clause 10, page 9, line 18, at end insert—

“(ba) matters relating to CSEA content including—

(i) the level of illegal images blocked at the upload stage and number and rates of livestreams of CSEA in public and private channels terminated; and

(ii) the number and rates of images and videos detected and removed by different tools, strategies and/or interventions.”

This amendment requires the children’s risk assessment to consider matters relating to CSEA content.

Barbara Keeley Portrait Barbara Keeley (Worsley and Eccles South) (Lab)
- Hansard - -

As this is the first time I have spoken in the Committee, may I say that it is a pleasure to serve with you in the Chair, Ms Rees? I agree with my hon. Friend the Member for Pontypridd that we are committed to improving the Bill, despite the fact that we have some reservations, which we share with many organisations, about some of the structure of the Bill and some of its provisions. As my hon. Friend has detailed, there are particular improvements to be made to strengthen the protection of children online, and I think the Committee’s debate on this section is proving fruitful.

Amendment 28 is a good example of where we must go further if we are to achieve the goal of the Bill and protect children from harm online. The amendment seeks to require regulated services to assess their level of risk based, in part, on the frequency with which they are blocking, detecting and removing child sexual exploitation and abuse content from their platforms. By doing so, we will be able to ascertain the reality of their overall risk and the effectiveness of their existing response.

The addition of livestreamed child sexual exploitation and abuse content not only acknowledges first-generation CSEA content, but recognises that livestreamed CSEA content happens on both public and private channels, and that they require different methods of detection.

Furthermore, amendment 28 details the practical information needed to assess whether the action being taken by a regulated service is adequate in countering the production and dissemination of CSEA content, in particular first-generation CSEA content. Separating the rates of terminated livestreams of CSEA in public and private channels is important, because those rates may vary widely depending on how CSEA content is generated. By specifying tools, strategies and interventions, the amendment would ensure that the systems in place to detect and report CSEA are adequate, and that is why we would like it to be part of the Bill.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The Government support the spirit of amendments 17 and 28, which seek to achieve critical objectives, but the Bill as drafted delivers those objectives. In relation to amendment 17 and cross-platform risk, clause 8 already sets out harms and risks—including CSEA risks—that arise by means of the service. That means through the service to other services, as well as on the service itself, so that is covered.

Amendment 28 calls for the risk assessments expressly to cover illegal child sexual exploitation content, but clause 8 already requires that to happen. Clause 8(5) states that the risk assessment must cover the

“risk of individuals who are users of the service encountering…each kind of priority illegal content”.

If we follow through the definition of priority illegal content, we find all those CSEA offences listed in schedule 6. The objective of amendment 28 is categorically delivered by clause 8(5)(b), referencing onwards to schedule 6.

--- Later in debate ---
Children’s Risk Assessment duties
Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I beg to move amendment 15, in clause 10, page 8, line 41, at end insert—

“(4A) A duty for the children’s risk assessment to be approved by either—

(a) the board of the entity; or, if the organisation does not have a board structure,

(b) a named individual who the provider considers to be a senior manager of the entity, who may reasonably be expected to be in a position to ensure compliance with the children’s risk assessment duties, and reports directly into the most senior employee of the entity.”

This amendment seeks to ensure that regulated companies’ boards or senior staff have responsibility for children’s risk assessments.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Amendment 11, in clause 10, page 9, line 2, at end insert—

“(5A) A duty to publish the children’s risk assessment and proactively supply this to OFCOM.”

This amendment creates a duty to publish the children’s risk assessment and supply it to Ofcom.

Amendment 27, in clause 10, page 9, line 25, after “facilitating” insert “the production of illegal content and”

This amendment requires the children’s risk assessment to consider the production of illegal content.

Clause 10 stand part.

Amendment 16, in clause 25, page 25, line 10, at end insert—

“(3A) A duty for the children’s risk assessment to be approved by either—

(a) the board of the entity; or, if the organisation does not have a board structure,

(b) a named individual who the provider considers to be a senior manager of the entity, who may reasonably be expected to be in a position to ensure compliance with the children’s risk assessment duties, and reports directly into the most senior employee of the entity.”

This amendment seeks to ensure that regulated companies’ boards or senior staff have responsibility for children’s risk assessments.

Amendment 13, in clause 25, page 25, line 13, at end insert—

“(4A) A duty to publish the children’s risk assessment and proactively supply this to OFCOM.”

This amendment creates a duty to publish the children’s risk assessment and supply it to Ofcom.

Amendment 32, in clause 25, page 25, line 31, after “facilitating” insert “the production of illegal content and”

This amendment requires the children’s risk assessment to consider risks relating to the production of illegal content.

Clause 25 stand part.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I will speak to other amendments in this group as well as amendment 15. The success of the Bill’s regulatory framework relies on regulated companies carefully risk-assessing their platforms. Once risks have been identified, the platform can concentrate on developing and implementing appropriate mitigations. However, up to now, boards and top executives have not taken the risk to children seriously. Services have either not considered producing risk assessments or, if they have done so, they have been of limited efficacy and failed to identify and respond to harms to children.

In evidence to the Joint Committee, Frances Haugen explained that many of the corporate structures involved are flat, and accountability for decision making can be obscure. At Meta, that means teams will focus only on delivering against key commercial metrics, not on safety. Children’s charities have also noted that corporate structures in the large technology platforms reward employees who move fast and break things. Those companies place incentives on increasing return on investment rather than child safety. An effective risk assessment and risk mitigation plan can impact on profit, which is why we have seen so little movement from companies to take the measures themselves without the duty being placed on them by legislation.

It is welcome that clause 10 introduces a duty to risk-assess user-to-user services that are likely to be accessed by children. But, as my hon. Friend the Member for Pontypridd said this morning, it will become an empty, tick-box exercise if the Bill does not also introduce the requirement for boards to review and approve the risk assessments.

The Joint Committee scrutinising the draft Bill recommended that the risk assessment be approved at board level. The Government rejected that recommendation on the grounds that Ofcom could include that in its guidance on producing risk assessments. As with much of the Bill, it is difficult to blindly accept promised safeguards when we have not seen the various codes of practice and guidance materials. The amendments would make sure that decisions about and awareness of child safety went right to the top of regulated companies. The requirement to have the board or a senior manager approve the risk assessment will hardwire the safety duties into decision making and create accountability and responsibility at the most senior level of the organisation. That should trickle down the organisation and help embed a culture of compliance across it. Unless there is a commitment to child safety at the highest level of the organisation, we will not see the shift in attitude that is urgently needed to keep children safe, and which I believe every member of the Committee subscribes to.

On amendments 11 and 13, it is welcome that we have risk assessments for children included in the Bill, but the effectiveness of that duty will be undermined unless the risk assessments can be available for scrutiny by the public and charities. In the current version of the Bill, risk assessments will only be made available to the regulator, which we debated on an earlier clause. Companies will be incentivised to play down the likelihood of currently emerging risks because of the implications of having to mitigate against them, which may run counter to their business interests. Unless the risk assessments are published, there will be no way to hold regulated companies to account, nor will there be any way for companies to learn from one another’s best practice, which is a very desirable aim.

The current situation shows that companies are unwilling to share risk assessments even when requested. In October 2021, following the whistleblower disclosures made by Frances Haugen, the National Society for the Prevention of Cruelty to Children led a global coalition of 60 child protection organisations that urged Meta to publish its risk assessments, including its data privacy impact assessments, which are a legal requirement under data protection law. Meta refused to share any of its risk assessments, even in relation to child sexual abuse and grooming. The company argued that risk assessments were live documents and it would not be appropriate for it to share them with any organisation other than the Information Commissioner’s Office, to whom it has a legal duty to disclose. As a result, civil society organisations and the charities that I talked about continue to be in the dark about whether and how Meta has appropriately identified online risk to children.

Making risk assessments public would support the smooth running of the regime and ensure its broader effectiveness. Civil society and other interested groups would be able to assess and identify any areas where a company might not be meeting its safety duties and make full, effective use of the proposed super-complaints mechanism. It will also help civil society organisations to hold the regulated companies and the regulator, Ofcom, to account.

As we have seen from evidence sessions, civil society organisations are often at the forefront of understanding and monitoring the harms that are occurring to users. They have an in-depth understanding of what mitigations may be appropriate and they may be able to support the regulator to identify any obvious omissions. The success of the systemic risk assessment process will be significantly underpinned by and reliant upon the regulator’s being able to rapidly and effectively identify new and emerging harms, and it is highly likely that the regulator will want to draw on civil society expertise to ensure that it has highly effective early warning functions in place.

However, civil society organisations will be hampered in that role if they remain unable to determine what, if anything, companies are doing to respond to online threats. If Ofcom is unable to rapidly identify new and emerging harms, the resulting delays could mean entire regulatory cycles where harms were not captured in risk profiles or company risk assessments, and an inevitable lag between harms being identified and companies being required to act upon them. It is therefore clear that there is a significant public value to publishing risk assessments.

Amendments 27 and 32 are almost identical to the suggested amendments to clause 8 that we discussed earlier. As my hon. Friend the Member for Pontypridd said in our discussion about amendments 25, 26 and 30, the duty to carry out a suitable and sufficient risk assessment could be significantly strengthened by preventing the creation of illegal content, not only preventing individuals from encountering it. I know the Minister responded to that point, but the Opposition did not think that response was fully satisfactory. This is just as important for children’s risk assessments as it is for illegal content risk assessments.

Online platforms are not just where abusive material is published. Sex offenders use mainstream web platforms and services as tools to commit child sexual abuse. This can be seen particularly in the livestreaming of child sexual exploitation. Sex offenders pay to direct and watch child sexual abuse in real time. The Philippines is a known hotspot for such abuse and the UK has been identified by police leads as the third-largest consumer of livestreamed abuse in the world. What a very sad statistic that our society is the third-largest consumer of livestreamed abuse in the world.

Ruby is a survivor of online sexual exploitation in the Philippines, although Ruby is not her real name; she recently addressed a group of MPs about her experiences. She told Members how she was trafficked into sexual exploitation aged 16 after being tricked and lied to about the employment opportunities she thought she would be getting. She was forced to perform for paying customers online. Her story is harrowing. She said:

“I blamed myself for being trapped. I felt disgusted by every action I was forced to do, just to satisfy customers online. I lost my self-esteem and I felt very weak. I became so desperate to escape that I would shout whenever I heard a police siren go by, hoping somebody would hear me. One time after I did this, a woman in the house threatened me with a knife.”

Eventually, Ruby was found by the Philippine authorities and, after a four-year trial, the people who imprisoned her and five other girls were convicted. She said it took many years to heal from the experience, and at one point she nearly took her own life.

It should be obvious that if we are to truly improve child protection online we need to address the production of new child abuse material. In the Bill, we have a chance to address not only what illegal content is seen online, but how online platforms are used to perpetrate abuse. It should not be a case of waiting until the harm is done before taking action.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

As the hon. Lady said, we discussed in the groupings for clauses 8 and 9 quite a few of the broad principles relating to children, but I will none the less touch on some of those points again because they are important.

On amendment 27, under clause 8 there is already an obligation on platforms to put in place systems and processes to reduce the risk that their services will be used to facilitate the presence of illegal content. As that includes the risk of illegal content being present, including that produced via the service’s functionality, the terrible example that the hon. Lady gave is already covered by the Bill. She is quite right to raise that example, because it is terrible when such content involving children is produced, but such cases are expressly covered in the Bill as drafted, particularly in clause 8.

Amendment 31 covers a similar point in relation to search. As I said for the previous grouping, search does not facilitate the production of content; it helps people to find it. Clearly, there is already an obligation on search firms to stop people using search engines to find illegal content, so the relevant functionality in search is already covered by the Bill.

Amendments 15 and 16 would expressly require board member sign-off for risk assessments. I have two points to make on that. First, the duties set out in clause 10(6)(h) in relation to children’s risk assessments already require the governance structures to be properly considered, so governance is directly addressed. Secondly, subsection (2) states that the risk assessment has to be “suitable and sufficient”, so it cannot be done in a perfunctory or slipshod way. Again, Ofcom must be satisfied that those governance arrangements are appropriate. We could invent all the governance arrangements in the world, but the outcome needs to be delivered and, in this case, to protect children.

Beyond governance, the most important things are the sanctions and enforcement powers that Ofcom can use if those companies do not protect children. As the hon. Lady said in her speech, we know that those companies are not doing enough to protect children and are allowing all kinds of terrible things to happen. If those companies continue to allow those things to happen, the enforcement powers will be engaged, and they will be fined up to 10% of their global revenue. If they do not sort it out, they will find that their services are disconnected. Those are the real teeth that will ensure that those companies comply.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I know that the Minister listened to Frances Haugen and to the members of charities. The charities and civil society organisations that are so concerned about this point do not accept that the Bill addresses it. I cannot see how his point addresses what I said about board-level acceptance of that role in children’s risk assessments. We need to change the culture of those organisations so that they become different from how they were described to us. He, like us, was sat there when we heard from the big platform providers, and they are not doing enough. He has had meetings with Frances Haugen; he knows what they are doing. It is good and welcome that the regulator will have the powers that he mentions, but that is just not enough.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I agree with the hon. Lady that, as I said a second ago, those platforms are not doing enough to protect children. There is no question about that at all, and I think there is unanimity across the House that they are not doing enough to protect children.

I do not think the governance point is a panacea. Frankly, I think the boards of these companies are aware of what is going on. When these big questions arise, they go all the way up to Mark Zuckerberg. It is not as if Mark Zuckerberg and the directors of companies such as Meta are unaware of these risks; they are extremely aware of them, as Frances Haugen’s testimony made clear.

We do address the governance point. As I say, the risk assessments do need to explain how governance matters are deployed to consider these things—that is in clause 10(6)(h). But for me, it is the sanctions—the powers that Ofcom will have to fine these companies billions of pounds and ultimately to disconnect their service if they do not protect our children—that will deliver the result that we need.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

The Minister is talking about companies of such scale that even fines of billions will not hurt them. I refer him to the following wording in the amendments:

“a named individual who the provider considers to be a senior manager of the entity, who may reasonably be expected to be in a position to ensure compliance with the children’s risk assessment duties”.

That is the minimum we should be asking. We should be asking these platforms, which are doing so much damage and have had to be dragged to the table to do anything at all, to be prepared to appoint somebody who is responsible. The Minister tries to gloss over things by saying, “Oh well, they must be aware of it.” The named individual would have to be aware of it. I hope he understands the importance of his role and the Committee’s role in making this happen. We could make this happen.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

As I say, clause 10 already references the governance arrangements, but my strong view is that the only thing that will make these companies sit up and take notice—the only thing that will make them actually protect children in a way they are currently not doing—is the threat of billions of pounds of fines and, if they do not comply even after being fined at that level, the threat of their service being disconnected. Ultimately, that is the sanction that will make these companies protect our children.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Barbara Keeley, do you have anything to add?

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

All I have to add is the obvious point—I am sure that we are going to keep running into this—that people should not have to look to a transcript to see what the Minister’s and Parliament’s intention was. It is clear what the Opposition’s intention is—to protect children. I cannot see why the Minister will not specify who in an organisation should be responsible. It should not be a question of ploughing through transcripts of what we have talked about here in Committee; it should be obvious. We have the chance here to do something different and better. The regulator could specify a senior level.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Clearly, we are legislating here to cover, as I think we said this morning, 25,000 different companies. They all have different organisational structures, different personnel and so on. To anticipate the appropriate level of decision making in each of those companies and put it in the Bill in black and white, in a very prescriptive manner, might not adequately reflect the range of people involved.

--- Later in debate ---

Division 10

Ayes: 7


Labour: 5
Scottish National Party: 2

Noes: 9


Conservative: 9

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I beg to move amendment 72, in clause 10, page 9, line 24, after “characteristic” insert “or characteristics”.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Amendment 73, in clause 10, page 9, line 24, after “group” insert “or groups”.

Amendment 85, in clause 12, page 12, line 22, leave out subsection (d) and insert—

“(d) the level of risk of harm to adults presented by priority content that is harmful to adults which particularly affects individuals with certain characteristics or members of certain groups;”.

This amendment would recognise the intersectionality of harms.

Amendment 74, in clause 12, page 12, line 24, after “characteristic” insert “or characteristics”.

Amendment 75, in clause 12, page 12, line 24, after “group” insert “or groups”.

Amendment 71, in clause 83, page 72, line 12, at end insert—

“(1A) For each of the above risks, OFCOM shall identify and assess the level of risk of harm which particularly affects people with certain characteristics or membership of a group or groups.”

This amendment requires Ofcom as part of its risk register to assess risks of harm particularly affecting people with certain characteristics or membership of a group or groups.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

May I say—this might be a point of order—how my constituency name is pronounced? I get a million different versions, but it is Worsley, as in “worse”. It is an unfortunate name for a great place.

I will speak to all the amendments in the group together, because they relate to how levels of risk are assessed in relation to certain characteristics. The amendments are important because small changes to the descriptions of risk assessment will help to close a significant gap in protection.

Clauses 10 and 12 introduce a duty on regulated companies to assess harms to adults and children who might have an innate vulnerability arising from being a member of a particular group or having a certain characteristic. However, Ofcom is not required to assess harms to people other than children who have that increased innate vulnerability. Amendment 71 would require Ofcom to assess risks of harm particularly affecting people with certain characteristics or membership of a group or groups as part of its risk register. That would reduce the regulatory burden if companies had Ofcom’s risk assessment to base their work on.

Getting this right is important. The risk management regime introduced by the Bill should not assume that all people are at the same risk of harm—they are clearly not. Differences in innate vulnerability increase the incidence and impact of harm, such as by increasing the likelihood of encountering content or of that content being harmful, or heightening the impact of the harm.

It is right that the Bill emphasises the vulnerability of children, but there are other, larger groups with innate vulnerability to online harm. As we know, that often reflects structural inequalities in society.

For example, women will be harmed in circumstances where men might not be, and they could suffer some harms that have a more serious impact than they might for men. A similar point can be made for people with other characteristics. Vulnerability is then compounded by intersectional issues—people might belong to more than one high-risk group—and I will come to that in a moment.

The initial Ofcom risk assessment introduced by clause 83 is not required to consider the heightened risks to different groups of people, but companies are required to assess that risk in their own risk assessments for children and adults. They need to be given direction by an assessment by Ofcom, which amendment 71 would require.

Amendments 72 to 75 address the lack of recognition in these clauses of intersectionality issues. They are small amendments in the spirit of the Bill’s risk management regime. As drafted, the Bill refers to a singular “group” or “characteristic” for companies to assess for risk. However, some people are subject to increased risks of harm arising from being members of more than one group. Companies’ risk assessments for children and adults should reflect intersectionality, and not just characteristics taken individually. Including the plural of “group” and “characteristic” in appropriate places would achieve that.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I will first speak to our amendment 85, which, like the Labour amendment, seeks to ensure that the Bill is crystal clear in addressing intersectionality. We need only consider the abuse faced by groups of MPs to understand why that is necessary. Female MPs are attacked online much more regularly than male MPs, and the situation is compounded if they have another minority characteristic. For instance, if they are gay or black, they are even more likely to be attacked. In fact, the MP who is most likely to be attacked is black and female. There are very few black female MPs, so it is not because of sheer numbers that they are at such increased risk of attack. Those with a minority characteristic are at higher risk of online harm, but the risk facing those with more than one minority characteristic is substantially higher, and that is what the amendment seeks to address.

I have spoken specifically about people being attacked on Twitter, Facebook and other social media platforms, but people in certain groups face an additional significant risk. If a young gay woman does not have a community around her, or if a young trans person does not know anybody else who is trans, they are much more likely to use the internet to reach out, to try to find people who are like them, to try to understand. If they are not accepted by their family, school or workplace, they are much more likely to go online to find a community and support—to find what is out there in terms of assistance—but using the internet as a vulnerable, at-risk person puts them at much more significant risk. This goes back to my earlier arguments about people requiring anonymity to protect themselves when using the internet to find their way through a difficult situation in which they have no role models.

It should not be difficult for the Government to accept this amendment. They should consider it carefully and understand that all of us on the Opposition Benches are making a really reasonable proposal. This is not about saying that someone with only one protected characteristic is not at risk; it is about recognising the intersectionality of risk and the fact that the risk faced by those who fit into more than one minority group is much higher than that faced by those who fit into just one. This is not about taking anything away from the Bill; it is about strengthening it and ensuring that organisations listen.

We have heard that a number of companies are not providing the protection that Members across the House would like them to provide against child sexual abuse. The governing structures, risk assessments, rules and moderation at those sites are better at ensuring that the providers make money than they are at providing protection. When regulated providers assess risk, it is not too much to ask them to consider not just people with one protected characteristic but those with multiple protected characteristics.

As MPs, we work on that basis every day. Across Scotland and the UK, we support our constituents as individuals and as groups. When protected characteristics intersect, we find ourselves standing in Parliament, shouting strongly on behalf of those affected and giving them our strongest backing, because we know that that intersection of harms is the point at which people are most vulnerable, in both the real and the online world. Will the Minister consider widening the provision so that it takes intersectionality into account and not only covers people with one protected characteristic but includes an over and above duty? I genuinely do not think it is too much for us to ask providers, particularly the biggest ones, to make this change.

None Portrait The Chair
- Hansard -

Barbara Keeley?

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I have nothing to add. I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Clause 10 ordered to stand part of the Bill.

Clause 11

Safety duties protecting children

None Portrait The Chair
- Hansard -

We now come to amendment 95, tabled by the hon. Member for Upper Bann, who is not on the Committee. Does anyone wish to move the amendment? If not, we will move on.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I beg to move amendment 29, in clause 11, page 10, line 20, at end insert—

“(c) prevent the sexual or physical abuse of a child by means of that service.”

This amendment establishes a duty to prevent the sexual or physical abuse of a child by means of a service.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss amendment 33, in clause 26, page 26, line 18, at end insert—

“(c) prevent the sexual or physical abuse of a child by means of that service.”

This amendment establishes a duty to prevent the sexual or physical abuse of a child by means of a service.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

The purpose of this clause is to ensure that children at risk of online harms are given protections from harmful, age-inappropriate content through specific children’s safety duties for user-to-user services likely to be accessed by children.

It is welcome that the Bill contains strong provisions to ensure that service providers act upon and mitigate the risks identified in the required risk assessment, and to introduce protective systems and processes to address what children encounter. This amendment aims to ensure that online platforms are proactive in their attempts to mitigate the opportunity for sex offenders to abuse children.

As we have argued with other amendments, there are missed opportunities in the Bill to be preventive in tackling the harm that is created. The sad reality is that online platforms create an opportunity for offenders to identify, contact and abuse children, and to do so in real time through livestreaming. We know there has been a significant increase in online sexual exploitation during the pandemic. With sex offenders unable to travel or have physical contact with children, online abuse increased significantly.

In 2021, UK law enforcement received a record 97,727 industry reports relating to online child abuse, a 29% increase on the previous year, which is shocking. An NSPCC freedom of information request to police forces in England and Wales last year showed that online grooming offences reached record levels in 2020-21, with the number of sexual communications with a child offences in England and Wales increasing by almost 70% in three years. There has been a deeply troubling trend in internet-facilitated abuse towards more serious sexual offences against children, and the average age of children in child abuse images, particularly girls, is trending to younger ages.

In-person contact abuse moved online because of the opportunity there for sex offenders to continue exploiting children. Sadly, they can do so with little fear of the consequences, because detection and disruption of livestreamed abuse is so low. The duty to protect children from sexual offenders abusing them in real time and livestreaming their exploitation cannot be limited to one part of the internet and tech sector. While much of the abuse might take place on the user-to-user services, it is vital that protections against such abuse are strengthened across the board, including in the search services, as set out in clause 26.

At the moment there is no list of harms in the Bill that must be prioritised by regulated companies. The NSPCC and others have suggested including a new schedule, similar to schedule 7, setting out what the primary priority harms should be. It would be beneficial for the purposes of parliamentary scrutiny for us to consider the types of priority harm that the Government intend the Bill to cover, rather than leaving that to secondary legislation. I hope the Minister will consider that and say why it has not yet been included.

To conclude, while we all hope the Bill will tackle the appalling abuse of children currently taking place online, this cannot be achieved without tackling the conditions in which these harms can take place. It is only by requiring that steps be taken across online platforms to limit the opportunities for sex offenders to abuse children that we can see the prevalence of this crime reduced.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

I rise, hopefully to speak to clause 11 more generally—or will that be a separate stand part debate, Ms Rees?

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The Government obviously support the objective of these amendments, which is to prevent children from suffering the appalling sexual and physical abuse that the hon. Member for Worsley and Eccles South outlined in her powerful speech. It is shocking that these incidents have risen in the way that she described.

To be clear, that sort of appalling sexual abuse is covered in clause 9—which we have debated already—which covers illegal content. As Members would expect, child sexual abuse is defined as one of the items of priority illegal content, which are listed in more detail in schedule 6, where the offences that relate to sexual abuse are enumerated. As child sexual exploitation is a priority offence, services are already obliged through clause 9 to be “proactive” in preventing it from happening. As such, as Members would expect, the requirements contained in these amendments are already delivered through clause 9.

The hon. Member for Worsley and Eccles South also asked when we are going to hear what the primary priority harms to children might be. To be clear, those will not include the sexual exploitation offences, because as Members would also expect, those are already in the Bill as primary illegal offences. The primary priority harms might include material promoting eating disorders and that kind of thing, which is not covered by the criminal matters—the illegal matters. I have heard the hon. Lady’s point that if that list were to be published, or at least a draft list, that would assist Parliament in scrutinising the Bill. I will take that point away and see whether there is anything we can do in that area. I am not making a commitment; I am just registering that I have heard the point and will take it away.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss clause 26 stand part.

Online Safety Bill (Seventh sitting)

Barbara Keeley Excerpts
Committee stage
Thursday 9th June 2022

(1 year, 11 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 9 June 2022 - (9 Jun 2022)
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I absolutely agree. I certainly do not think I am suggesting that the bigger platforms such as Twitter and Facebook will reduce their reporting mechanisms as a result of how the Bill is written. However, it is possible that newer or smaller platforms, or anything that starts after this legislation comes, could limit the ability to report on the basis of these clauses.

Barbara Keeley Portrait Barbara Keeley (Worsley and Eccles South) (Lab)
- Hansard - -

Good morning, Ms Rees.

It is important that users of online services are empowered to report harmful content, so that it can be removed. It is also important for users to have access to complaints procedures when wrong moderation decisions have been made. Reporting and complaint mechanisms are integral to ensuring that users are safe and that free speech is upheld, and we support these provisions in the Bill.

Clauses 17 and 18, and clauses 27 and 28, are two parts of the same process: content reporting by individual users, and the handling of content reported as a complaint. However, it is vital that these clauses create a system that works. That is the key point that Labour Members are trying to make, because the wild west system that we have at the moment does not work.

It is welcome that the Government have proposed a system that goes beyond the users of the platform and introduces a duty on companies. However, companies have previously failed to invest enough money in their complaints systems for the scale at which they are operating in the UK. The duties in the Bill are an important reminder to companies that they are part of a wider society that goes beyond their narrow shareholder interest.

One example of why this change is so necessary, and why Labour Members are broadly supportive of the additional duties, is the awful practice of image abuse. With no access to sites on which their intimate photographs are being circulated, victims of image abuse have very few if any routes to having the images removed. Again, the practice of image abuse has increased during the pandemic, including through revenge porn, which the Minister referred to. The revenge porn helpline reported that its case load more than doubled between 2019 and 2020.

These clauses should mean that people can easily report content that they consider to be either illegal, or harmful to children, if it is hosted on a site likely to be accessed by children, or, if it is hosted on a category 1 platform, harmful to adults. However, the Minister needs to clarify how these service complaints systems will be judged and what the performance metrics will be. For instance, how will Ofcom enforce against a complaint?

In many sectors of the economy, even with long-standing systems of regulation, companies can have tens of millions of customers reporting content, but that does not mean that any meaningful action can take place. The hon. Member for Aberdeen North has just told us how often she reports on various platforms, but what action has taken place? Many advocacy groups of people affected by crimes such as revenge porn will want to hear, in clear terms, what will happen to material that has been complained about. I hope the Minister can offer that clarity today.

Transparency in reporting will be vital to analysing trends and emerging types of harm. It is welcome that in schedule 8, which we will come to later, transparency reporting duties apply to the complaints process. It is important that as much information as possible is made public about what is going on in companies’ complaints and reporting systems. As well as the raw number of complaints, reporting should include what is being reported or complained about, as the Joint Committee on the draft Bill recommended last year. Again, what happens to the reported material will be an important metric on which to judge companies.

Finally, I will mention the lack of arrangements for children. We have tabled new clause 3, which has been grouped for discussion with other new clauses at the end of proceedings, but it is relevant to mention it now briefly. The Children’s Commissioner highlighted in her oral evidence to the Committee how children had lost faith in complaints systems. That needs to be changed. The National Society for the Prevention of Cruelty to Children has also warned that complaints mechanisms are not always appropriate for children and that a very low proportion of children have ever reported content. A child-specific user advocacy body could represent the interests of child users and support Ofcom’s regulatory decisions. That would represent an important strengthening of protections for users, and I hope the Government will support it when the time comes.

Jane Stevenson Portrait Jane Stevenson (Wolverhampton North East) (Con)
- Hansard - - - Excerpts

I rise briefly to talk about content reporting. I share the frustrations of the hon. Member for Aberdeen North. The way I read the Bill was that it would allow users and affected persons, rather than “or” affected persons, to report content. I hope the Minister can clarify that that means affected persons who might not be users of a platform. That is really important.

Will the Minister also clarify the use of human judgment in these decisions? Many algorithms are not taking down some content at the moment, so I would be grateful if he clarified that there is a need for platforms to provide a genuine human judgment on whether content is harmful.

--- Later in debate ---
None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Amendment 78, in clause 28, page 28, line 28, leave out “affected” and replace with “any other”

This amendment allows those who do not fit the definition of “affected person” to make a complaint about search content which they consider to be illegal.

Amendment 79, in clause 28, page 28, line 30, leave out “affected” and replace with “any other”

This amendment allows those who do not fit the definition of “affected person” to make a complaint about search content which they consider not to comply with sections 24, 27 or 29.

Clause 28 stand part.

New clause 1—Report on redress for individual complaints

“(1) The Secretary of State must publish a report assessing options for dealing with appeals about complaints made under—

(a) section 18; and

(b) section 28

(2) The report must—

(a) provide a general update on the fulfilment of duties about complaints procedures which apply in relation to all regulated user-to-user services and regulated search services;

(b) assess which body should be responsible for a system to deal with appeals in cases where a complainant considers that a complaint has not been satisfactorily dealt with; and

(c) provide options for how the system should be funded, including consideration of whether an annual surcharge could be imposed on user-to-user services and search services.

(3) The report must be laid before Parliament within six months of the commencement of this Act.”

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I will speak to new clause 1. Although duties about complaints procedures are welcome, it has been pointed out that service providers’ user complaints processes are often obscure and difficult to navigate—that is the world we are in at the moment. The lack of any external complaints option for individuals who seek redress is worrying.

The Minister has just talked about the super-complaints mechanism—which we will come to later in proceedings—to allow eligible entities to make complaints to Ofcom about a single regulated service if that complaint is of particular importance or affects a particularly large number of service users or members of the public. Those conditions are constraints on the super-complaints process, however.

An individual who felt that they had been failed by a service’s complaints system would have no source of redress. Without redress for individual complaints once internal mechanisms have been exhausted, victims of online abuse could be left with no further options, consumer protections could be compromised, and freedom of expression could be impinged upon for people who felt that their content had been unfairly removed.

Various solutions have been proposed. The Joint Committee recommended the introduction of an online safety ombudsman to consider complaints for which recourse to internal routes of redress had not resulted in resolution and the failure to address risk had led to significant and demonstrable harm. Such a mechanism would give people an additional body through which to appeal decisions after they had come to the end of a service provider’s internal process. Of course, we as hon. Members are all familiar with the ombudsman services that we already have.

Concerns have been raised about the level of complaints such an ombudsman could receive. However, as the Joint Committee noted, complaints would be received only once the service’s internal complaints procedure had been exhausted, as is the case for complaints to Ofcom about the BBC. The new clause seeks to ensure that we find the best possible solution to the problem. There needs to be a last resort for users who have suffered serious harm on services. It is only through the introduction of an external redress mechanism that service providers can truly be held to account for their decisions as they impact on individuals.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - - - Excerpts

I rise to contribute to the stand part debate on clauses 18 and 28. It was interesting, though, to hear the debate on clause 17, because it is right to ask how the complaints services will be judged. Will they work in practice? When we start to look at how to ensure that the legislation works in all eventualities, we need to ensure that we have some backstops for when the system does not work as it should.

It is welcome that there will be clear duties on providers to have operational complaints procedures—complaints procedures that work in practice. As we all know, many of them do not at the moment. As a result, we have a loss of faith in the system, and that is not going to be changed overnight by a piece of legislation. For years, people have been reporting things—in some cases, very serious criminal activity—that have not been acted on. Consumers—people who use these platforms—are not going to change their mind overnight and suddenly start trusting these organisations to take their complaints seriously. With that in mind, I hope that the Minister listened to the points I made on Second Reading about how to give extra support to victims of crimes or people who have experienced things that should not have happened online, and will look at putting in place the right level of support.

The hon. Member for Worsley and Eccles South talked about the idea of an ombudsman; it may well be that one should be in place to deal with situations where complaints are not dealt with through the normal processes. I am also quite taken by some of the evidence we received about third-party complaints processes by other organisations. We heard a bit about the revenge porn helpline, which was set up a few years ago when we first recognised in law that revenge pornography was a crime. The Bill creates a lot more victims of crime and recognises them as victims, but we are not yet hearing clearly how the support systems will adequately help that massively increased number of victims to get the help they need.

I will probably talk in more detail about this issue when we reach clause 70, which provides an opportunity to look at the—unfortunately—probably vast fines that Ofcom will be imposing on organisations and how we might earmark some of that money specifically for victim support, whether by funding an ombudsman or helping amazing organisations such as the revenge porn helpline to expand their services.

We must address this issue now, in this Bill. If we do not, all those fines will go immediately into the coffers of the Treasury without passing “Go”, and we will not be able to take some of that money to help those victims directly. I am sure the Government absolutely intend to use some of the money to help victims, but that decision would be at the mercy of the Treasury. Perhaps we do not want that; perhaps we want to make it cleaner and easier and have the money put straight into a fund that can be used directly for people who have been victims of crime or injustice or things that fall foul of the Bill.

I hope that the Minister will listen to that and use this opportunity, as we do in other areas, to directly passport fines for specific victim support. He will know that there are other examples of that that he can look at.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Let me develop the point before I give way. Our first line of defence is Ofcom enforcing the clause, but we have a couple of layers of additional defence. One of those is the super-complaints mechanism, which I have mentioned before. If a particular group of people, represented by a body such as the NSPCC, feel that their legitimate complaints are being infringed systemically by the social media platform, and that Ofcom is failing to take the appropriate action, they can raise that as a super-complaint to ensure that the matter is dealt with.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

Will the Minister give way?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I should give way to the hon. Member for Aberdeen North first, and then I will come to the shadow Minister.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

A later clause gives Ofcom the ability to levy the fees and charges it sees as necessary and appropriate to ensure that it can deliver the duties. Ofcom will have the power to set those fees at a level to enable it to do its job properly, as Parliament would wish it to do.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

This is the point about individual redress again: by talking about super-complaints, the Minister seems to be agreeing that it is not there. As I said earlier, for super-complaints to be made to Ofcom, the issue has to be of particular importance or to impact a particularly large number of users, but that does not help the individual. We know how much individuals are damaged; there must be a system of external redress. The point about internal complaints systems is that we know that they are not very good, and we require a big culture change to change them, but unless there is some mechanism thereafter, I cannot see how we are giving the individual any redress—it is certainly not through the super-complaints procedure.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

As I said explicitly a few moments ago, the hon. Lady is right to point out the fact that the super-complaints process is to address systemic issues. She is right to say that, and I think I made it clear a moment or two ago.

Whether there should be an external ombudsman to enforce individual complaints, rather than just Ofcom enforcing against systemic complaints, is a question worth addressing. In some parts of our economy, we have ombudsmen who deal with individual complaints, financial services being an obvious example. The Committee has asked the question, why no ombudsman here? The answer, in essence, is a matter of scale and of how we can best fix the issue. The volume of individual complaints generated about social media platforms is just vast. Facebook in the UK alone has tens of millions of users—I might get this number wrong, but I think it is 30 million or 40 million users.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Yes. My hon. Friend hits the nail on the head. If there is a systemic problem and a platform fails to act appropriately not just in one case, but in a number of them, we have, as she has just described, the super-complaints process in clauses 140 to 142. Even under the Bill as drafted, without any changes, if a platform turns out to be systemically ignoring reasonable complaints made by the public and particular groups of users, the super-complainants will be able to do exactly as she describes. There is a mechanism to catch this—it operates not at individual level, but at the level of groups of users, via the super-complaint mechanism—so I honestly feel that the issue has been addressed.

When the numbers are so large, I think that the super-complaint mechanism is the right way to push Ofcom if it does not notice. Obviously, the first line of defence is that companies comply with the Bill. The second line of defence is that if they fail to do so, Ofcom will jump on them. The third line of defence is that if Ofcom somehow does not notice, a super-complaint group—such as the NSPCC, acting for children—will make a super-complaint to Ofcom. We have three lines of defence, and I submit to the Committee that they are entirely appropriate.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

Will the Minister give way?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I was about to sit down, but of course I will give way.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

The Minister said that the Opposition had not tabled an amendment to bring in an ombudsman.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

On this clause.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

On this clause. What we have done, however—we are debating it now—is to table a new clause to require a report on redress for individual complaints. The Minister talks about clause 149 and a process that will kick in between two and five years away, but we have a horrendous problem at the moment. I and various others have described the situation as the wild west, and very many people—thousands, if not millions, of individuals—are being failed very badly. I do not see why he is resisting our proposal for a report within six months of the commencement of the Act, which would enable us to start to see at that stage, not two to five years down the road, how these systems—he is putting a lot of faith in them—were turning out. I think that is a very sound idea, and it would help us to move forward.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The third line of defence—the super-complaint process—is available immediately, as I set out a moment ago. In relation to new clause 1, which the hon. Lady mentioned a moment ago, I think six months is very soon for a Bill of this magnitude. The two-to-five-year timetable under the existing review mechanism in clause 149 is appropriate.

Although we are not debating clause 149, I hope, Ms Rees, that you will forgive me for speaking about it for a moment. If Members turn to pages 125 and 126 and look at the matters covered by the review, they will see that they are extraordinarily comprehensive. In effect, the review covers the implementation of all aspects of the Bill, including the need to minimise the harms to individuals and the enforcement and information-gathering powers. It covers everything that Committee members would want to be reviewed. No doubt as we go through the Bill we will have, as we often do in Bill Committee proceedings, a number of occasions on which somebody tables an amendment to require a review of x, y or z. This is the second such occasion so far, I think, and there may be others. It is much better to have a comprehensive review, as the Bill does via the provisions in clause 149.

Question put and agreed to.

Clause 18 accordingly ordered to stand part of the Bill.

Clause 19

Duties about freedom of expression and privacy

Question proposed, That the clause stand part of the Bill.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Clause 19, on user-to-user services, and its associated clause 29, which relates to search services, specify a number of duties in relation to freedom of expression and privacy. In carrying out their safety duties, in-scope companies will be required by clause 19(2) to have regard to the importance of protecting users’ freedom of expression and privacy.

Let me pause for a moment on this issue. There has been some external commentary about the Bill’s impact on freedom of expression. We have already seen, via our discussion of a previous clause, that there is nothing in the Bill that compels the censorship of speech that is legal and not harmful to children. I put on the record again the fact that nothing in the Bill requires the censorship of legal speech that poses no harm to children.

We are going even further than that. As far as I am aware, for the first time ever there will be a duty on social media companies, via clause 19(2), to have regard to freedom of speech. There is currently no legal duty at all on platforms to have regard to freedom of speech. The clause establishes, for the first time, an obligation to have regard to freedom of speech. It is critical that not only Committee members but others more widely who consider the Bill should bear that carefully in mind. Besides that, the clause speaks to the right to privacy. Existing laws already speak to that, but the clause puts it in this Bill as well. Both duties are extremely important.

In addition, category 1 service providers—the really big ones—will need proactively to assess the impact of their policies on freedom of expression and privacy. I hope all Committee members will strongly welcome the important provisions I have outlined.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

As the Minister says, clauses 19 and 29 are designed to provide a set of balancing provisions that will require companies to have regard to freedom of expression and privacy when they implement their safety duties. However, it is important that companies cannot use privacy and free expression as a basis to argue that they can comply with regulation in less substantive ways. That is a fear here.

Category 1 providers will need to undertake an impact assessment to determine the impact of their product and safety decisions on freedom of expression, but it is unclear whether that applies only in respect of content that is harmful to adults. Unlike with the risk assessments for the illegal content and child safety duties set out in part 3, chapter 2, these clauses do not set expectations about whether risk assessments are of a suitable and sufficient quality. It is also not clear what powers Ofcom has at its disposal to challenge any assessments that it considers insufficient or that reach an inappropriate or unreasonable assessment of how to balance fundamental rights. I would appreciate it if the Minister could touch on that when he responds.

The assumption underlying these clauses is that privacy and free expression may need to act as a constraint on safety measures, but I believe that that is seen quite broadly as simplistic and potentially problematic. To give one example, a company could argue that the importance of end-to-end encryption for free expression and privacy justifies any adverse impact on users’ safety. The subjects of child abuse images, which could more easily be shared because of such a decision, would see their safety and privacy rights weakened. Such an argument fails to take account of the broader nuance of the issues at stake. Impacts on privacy and freedom of expression should therefore be considered across a range of groups rather than assuming an overarching right that applies equally to all users.

Similarly, it will be important that Ofcom understands and delivers its functions in relation to these clauses in a way that reflects the complexity and nuance of the interplay of fundamental rights. It is important to recognise that positive and negative implications for privacy and freedom of expression may be associated with any compliance decision. I think the Minister implied that freedom of speech was a constant positive, but it can also have negative connotations.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I am pleased that the clause is in the Bill, and I think it is a good one to include. Can the Minister reaffirm what he said on Tuesday about child sexual abuse, and the fact that the right to privacy does not trump the ability—particularly with artificial intelligence—to search for child sexual abuse images?

--- Later in debate ---
None Portrait The Chair
- Hansard -

With this it will be convenient to discuss clause 30 stand part.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

Record-keeping and review duties on in-scope services make up an important function of the regulatory regime that we are discussing today. Platforms will need to report all harms identified and the action taken in response to them, in line with regulation. The requirements to keep records of the action taken in response to harm will be vital in supporting the regulator to make effective decisions about regulatory breaches and whether company responses are sufficient. That will be particularly important in monitoring platforms’ responses through risk assessments—an area where some charities are concerned that we will see under-reporting of harms to evade regulation.

Evidence of under-reporting can be seen in the various transparency reports that are currently being published voluntarily by sites, where we are not presented with the full picture and scale of harm and the action taken to address that harm is thus obscured.

As with other risk assessments, the provisions in clauses 20 and 30 could be strengthened through a requirement on in-scope services to publish their risk assessments. We have made that point many times. Greater transparency would allow researchers and civil society to track harms and hold services to account.

--- Later in debate ---
None Portrait The Chair
- Hansard -

I call Kirsty Blackman to move amendment 22. [Interruption.] Sorry—my bad, as they say. I call Barbara Keeley to move amendment 22.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I beg to move amendment 22, in clause 31, page 31, line 17, leave out subsection (3).

This amendment removes the condition that applies a child use test to a service or part of a service.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Clause stand part.

Clause 32 stand part.

That schedule 3 be the Third schedule to the Bill.

Clause 33 stand part.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

The purpose of the amendment is to remove the child use test from the children’s access assessment and to make sure that any service likely to be accessed by children is within the scope of the child safety duty. The amendment is supported by the NSPCC and other children’s charities.

Children require protection wherever they are online. I am sure that every Committee member believes that. The age-appropriate design code from the Information Commissioner’s Office requires all services that are likely to be accessed by children to provide high levels of data protection and privacy. Currently, the Bill will regulate only user-to-user and search services that have a significant number of child users or services for which children form a significant part of their user base. It will therefore not apply to all services that fall within the scope of the ICO’s code, creating a patchwork of regulation that could risk uncertainty, legal battles and unnecessary complexity. It might also create a perverse incentive for online services to stall the introduction of their child safety measures until Ofcom has the capacity to investigate and reach a determination on the categorisation of their sites.

The inclusion of a children’s access assessment in the Bill may result in lower standards of protection, with highly problematic services such as Telegram and OnlyFans able to claim that they are excluded from the child safety duties because children do not account for a significant proportion of their user base. However, evidence has shown that children have been able to access those platforms.

Other services will remain out of the scope of the Bill as currently drafted. They include harmful blogs that promote life-threatening behaviours, such as pro-anorexia sites with provider-generated rather than user-generated content; some of the most popular games among children that do not feature user-generated content but are linked to increasing gambling addiction among children, and through which some families have lost thousands of pounds; and other services with user-generated content that is harmful but does not affect an appreciable number of children. That risks dozens, hundreds or even thousands of children falling unprotected.

Parents have the reasonable expectation that, under the new regime introduced by the Bill, children will be protected wherever they are online. They cannot be expected to be aware of exemptions or distinctions between categories of service. They simply want their children to be protected and their rights upheld wherever they are.

As I say, children have the right to be protected from harmful content and activity by any platform that gives them access. That is why the child user condition in clause 31 should be deleted from the Bill. As I have said, the current drafting could leave problematic platforms out of scope if they were to claim that they did not have a significant number of child users. It should be assumed that platforms are within the scope of the child safety duties unless they can provide evidence that children cannot access their sites, for example through age verification tools.

Although clause 33 provides Ofcom with the power to determine that a platform is likely to be accessed by children, this will necessitate Ofcom acting on a company-by-company basis to bring problematic sites back into scope of the child safety duties. That will take considerable time, and it will delay children receiving protection. It would be simpler to remove the child user condition from clause 31, as I have argued.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am concerned to ensure that children are appropriately protected, as the hon. Lady sets out. Let me make a couple of points in that area before I address that point.

The hon. Lady asked another question earlier, about video content. She gave the example of TikTok videos being viewed or accessed not directly on TikTok but via some third-party means, such as a WhatsApp message. First, it is worth emphasising again that in order to count as a user, a person does not have to be registered and can simply be viewing the content. Secondly, if someone is viewing something through another service, such as WhatsApp—the hon. Lady used the example of browsing the internet on another site—the duty will bite at the level of WhatsApp, and it will have to consider the content that it is providing access to. As I said, someone does not have to be registered with a service in order to count as a user of that service.

On amendment 22, there is a drafting deficiency, if I may put it politely—this is a point of drafting rather than of principle. The amendment would simply delete subsection (3), but there would still be references to the “child user condition”—for example, the one that appears on the same page of the Bill at line 11. If the amendment were adopted as drafted, it would end up leaving references to “child user condition” in the Bill without defining what it meant, because we would have deleted the definition.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

Is the Minister coming on to say that he is accepting what we are saying here?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

No, is the short answer. I was just mentioning in passing that there is that drafting issue.

On the principle, it is worth being very clear that, when it comes to content or matters that are illegal, that applies to all platforms, regardless of size, where children are at all at risk. In schedule 6, we set out a number of matters—child sexual exploitation and abuse, for example—as priority offences that all platforms have to protect children from proactively, regardless of scale.

--- Later in debate ---
Other areas include gambling, which the shadow Minister mentioned. There is separate legislation—very strong legislation—that prohibits children from being involved in gambling. That stands independently of this Bill, so I hope that the Committee is assured—

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

The Minister has not addressed the points I raised. I specifically raised—he has not touched on this—harmful pro-anorexia blogs, which we know are dangerous but are not in scope, and games that children access that increase gambling addiction. He says that there is separate legislation for gambling addiction, but families have lost thousands of pounds through children playing games linked to gambling addiction. There are a number of other services that do not affect an appreciable number of children, and the drafting causes them to be out of scope.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

rose—[Interruption.]

Online Safety Bill (Eighth sitting)

Barbara Keeley Excerpts
Committee stage
Thursday 9th June 2022

(1 year, 11 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 9 June 2022 - (9 Jun 2022)
Duties about fraudulent advertising: Category 1 services
Barbara Keeley Portrait Barbara Keeley (Worsley and Eccles South) (Lab)
- Hansard - -

I beg to move amendment 23, in clause 34, page 33, line 41, after “service” insert “that targets users”.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Amendment 24, in clause 35, page 34, line 34, after “service” insert “that targets users”.

New clause 5—Duty to distinguish paid-for advertisements

“(1) A provider of a Category 2A service must operate the service using systems and processes designed to clearly distinguish to users of that service paid-for advertisements from all other content appearing in or via search results of the service.

(2) The systems and processes described under subsection (1)—

(a) must include clearly displaying the words “paid-for advertisement” next to any paid-for advertisement appearing in or via search results of the service, and

(b) may include measures such as but not limited to the application of colour schemes to paid-for advertisements appearing in or via search results of the service.

(3) The reference to paid-for advertisements appearing “in or via search results of a search service” does not include a reference to any advertisements appearing as a result of any subsequent interaction by a user with an internet service other than the search service.

(4) If a person is the provider of more than one Category 2A service, the duties set out in this section apply in relation to each such service.

(5) The duties set out in this section extend to the design, operation and use of a Category 2A service that hosts paid-for advertisements targeted at users of that service in the United Kingdom.

(6) For the meaning of “Category 2A service”, see section 81 (register of categories of services).

(7) For the meaning of “paid-for advertisement”, see section 189 (interpretation: general).”

New clause 6—Duty to verify advertisements

“(1) A provider of a Category 2A service must operate an advertisement verification process for any relevant advertisement appearing in or via search results of the service.

(2) In this section, “relevant advertisement” means any advertisement for a service or product to be designated in regulations made by the Secretary of State.

(3) The verification process under subsection (1) must include a requirement for advertisers to demonstrate that they are authorised by a UK regulatory body.

(4) In this section, “UK regulatory body” means a UK regulator responsible for the regulation of a particular service or product to be designated in regulations made by the Secretary of State.

(5) If a person is the provider of more than one Category 2A service, the duties set out in this section apply in relation to each such service.

(6) For the meaning of “Category 2A service”, see section 81 (register of categories of services).

(7) Regulations under this section shall be made by statutory instrument.

(8) A statutory instrument containing regulations under this section may not be made unless a draft of the instrument has been laid before and approved by resolution of each House of Parliament.”

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I begin by thanking my hon. Friend the Member for Washington and Sunderland West (Mrs Hodgson) for her work on drafting these amendments and others relating to this chapter, which I will speak to shortly. She has campaigned excellently over many years in her role as chair of the all-party parliamentary group on ticket abuse. I attended the most recent meeting of that group back in April to discuss what we need to see changed in the Bill to protect people from scams online. I am grateful to those who have supported the group and the anti-ticket touting campaign for their insights.

It is welcome that, after much flip-flopping, the Government have finally conceded to Labour’s calls and those of many campaign groups to include a broad duty to tackle fraudulent advertising on search engines through chapter 5 of part 3 of the Bill. We know that existing laws to protect consumers in the online world have failed to keep pace with the actors attempting to exploit them, and that is particularly true of scams and fraudulent advertisements.

Statistics show a steep increase in this type of crime in the online world, although those figures are likely to be a significant underestimate and do not capture the devastating emotional impact that scams have on their victims. The scale of the problem is large and it is growing.

The Financial Conduct Authority estimates that fraud costs the UK up to £190 billion a year, with 86% of that fraud committed online. We know those figures are increasing. The FCA more than doubled the number of scam warnings it issued between 2019 and 2020, while UK Finance data shows that there has been a significant rise in cases across all scam types as criminals adapt to targeting victims online. The pandemic, which led to a boom in internet shopping, created an environment ripe for exploitation. Reported incidents of scams and fraud have increased by 41% since before the pandemic, with one in 10 of us now victims of fraud.

Being scammed can cause serious psychological harm. Research by the Money and Mental Health Policy Institute suggests that three in 10 online scam victims felt depressed as a result of being scammed, while four in 10 said they felt stressed. Clearly, action to tackle the profound harms that result from fraudulent advertising is long overdue.

This Bill is an important opportunity but, as with other issues the Government are seeking to address, we need to see changes if it is to be successful. Amendments 23 and 24 are small and very simple, but would have a profound impact on the ability of the Bill to prevent online fraud from taking place and to protect UK users.

As currently drafted, the duties set out in clauses 34 and 35 for category 1 and 2A services extend only to the design, operation and use of a category 1 or 2A service in the United Kingdom. Our amendments would mean that the duties extended to the design, operation and use of a category 1 or 2A service that targets users in the United Kingdom. That change would make the Bill far more effective, because it would reduce the risk of a company based overseas being able to target UK consumers without any action being taken against them—being allowed to target the public fraudulently without fear of disruption.

That would be an important change, because paid-for advertisements function by the advertiser stating where in the world, by geographical location, they wish to target consumers. For instance, a company would be able to operate from Hong Kong and take out paid-for advertisements to target consumers just in one particular part of north London. The current wording of the Bill does not acknowledge the fact that internet services can operate from anywhere in the world and use international boundaries to circumvent UK legislation.

Other legislation has been successful in tackling scams across borders. I draw the Committee’s attention to the London Olympic Games and Paralympic Games Act 2006, which made it a crime to sell a ticket to the Olympics into the black market anywhere in the world, rather than simply in the UK where the games took place. I suggest that we should learn from the action taken to regulate the Olympics back in 2012 and implement the same approach through amendments 23 and 24.

New clause 5 was also tabled by my hon. Friend the Member for Washington and Sunderland West, who will be getting a lot of mentions this afternoon.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

New clause 5 would tackle one of the reasons people become subject to fraud online by introducing a duty for search engines to ensure that all paid-for search advertisements are made to look distinct from non-paid-for search results. When bad actors are looking to scam consumers, they often take out paid-for advertising on search results, so that they can give consumers the false impression that their websites are official and trustworthy.

Paid search results occur when companies pay a charge to have their site appear at the top of search results. This is valuable to them because it is likely to direct consumers towards their site. The new clause would stop scam websites buying their way to the top of a search result.

Let me outline some of the consequences of not distinguishing between paid-for and not-paid-for advertisements, because they can be awful. Earlier this year, anti-abortion groups targeted women who were searching online for a suitable abortion clinic. The groups paid for the women to have misleading adverts at the top of their search that directed them towards an anti-abortion centre rather than a clinic. One woman who knew that she wanted to have an abortion went on researching where she could have the procedure. Her search for a clinic on Google led her to an anti-abortion centre that she went on to contact and visit. That was because she trusted the top search results on Google, which were paid for. The fact that it was an advertisement was indicated only by the two letters “AD” appearing in very small font underneath the search headline and description.

Another example was reported by The Times last year. Google had been taking advertising money from scam websites selling Premier League football tickets, even though the matches were taking place behind closed doors during lockdown. Because these advertisements appeared at the top of search results, it is entirely understandable that people looking for football tickets were deceived into believing that they would be able to attend the games, which led to them being scammed.

There have been similar problems with passport renewals. As colleagues will be very aware, people have been desperately trying to renew their passports amid long delays because of the backlog of cases. This is a target for fraudsters, who take out paid advertisements to offer people assistance with accessing passport renewal services and then scam them.

New clause 5 would end this practice by ensuring that search engines provide clear messaging to show that the user is looking at a paid-for advertisement, by stating that clearly and through other measures, such as a separate colour scheme. A duty to distinguish paid-for advertising is present in many other areas of advertising. For example, when we watch TV, there is no confusion between what is a programme and what is an advert; the same is true of radio advertising; and when someone is reading a newspaper or magazine, the line between journalism and the advertisements that fund the paper is unmistakable.

We cannot continue to have these discrepancies and be content with the internet being a wild west. Therefore, it is clear that advertising on search engines needs to be brought into line with advertising in other areas, with a requirement on search engines to distinguish clearly between paid-for and organic results.

New clause 6 is another new clause tabled by my hon. Friend the Member for Washington and Sunderland West. It would protect consumers from bad actors trying to exploit them online by placing a duty on search engines to verify adverts before they accept them. That would mean that, before their adverts were allowed to appear in a paid-for search result, companies would have to demonstrate that they were authorised by a UK regulatory body designated by the Secretary of State.

This methodology for preventing fraud is already in process for financial crime. Google only accepts financial services advertisements from companies that are a member of the Financial Conduct Authority. This gives companies a further incentive to co-operate with regulators and it protects consumers by preventing companies that are well-known for their nefarious activities from dominating search results and then misleading consumers. By extending this best practice to all advertisements, search engines would no longer be able to promote content that is fake or fraudulent after being paid to do so.

Without amending the Bill in this way, we risk missing an opportunity to tackle the many forms of scamming that people experience online, one of which is the world of online ticketing. In my role as shadow Minister for the arts and civil society, I have worked on this issue and been informed by the expertise of my hon. Friend the Member for Washington and Sunderland West.

In the meeting of the all-party parliamentary group on ticket abuse in April, we heard about the awful consequences of secondary ticket reselling practices. Ticket reselling websites, such as Viagogo, are rife with fraud. Large-scale ticket touts dominate the resale site, and Viagogo has a well-documented history of breaching consumer protection laws. Those breaches include a number of counts of fraud for selling non-existent tickets. Nevertheless, Viagogo continues to take out paid-for advertisements with Google and is continually able to take advantage of consumers by dominating search results and commanding false trust.

If new clause 6 is passed, then secondary ticketing websites such as Viagogo would have to be members of a regulatory body responsible for secondary ticketing, such as the Society of Ticket Agents and Retailers, or STAR. Viagogo would then have to comply with STAR standards for its business model to be successful.

I have used ticket touting as an example, but the repercussions of this change would be wider than that. Websites that sell holidays and flights, such as Skyscanner, would have to be a member of the relevant regulatory group, for example the Association of British Travel Agents. People would be able to go to football matches, art galleries and music festivals without fearing that they are getting ripped off or have been issued with fake tickets.

I will describe just a few examples of the poor situation we are in at the moment, to illustrate the need for change. The most heartbreaking one is of an elderly couple who bought two tickets from a secondary ticketing website to see their favourite artist, the late Leonard Cohen, to celebrate their 70th wedding anniversary. When the day came around and they arrived at the venue, they were turned away and told they had been sold fake tickets. The disappointment they must have felt would have been very hard to bear. In another instance, a British soldier serving overseas decided to buy his daughter concert tickets because he could not be with her on her birthday. When his daughter went along to the show, she was turned away at the door and told she could not enter because the tickets had been bought through a scam site and were invalid.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank the hon. Member for Aberdeen North for her latter remarks. We made an important addition to the Bill after listening to parliamentarians across the House and to the Joint Committee, which many people served on with distinction. I am delighted that we have been able to make that significant move. We have heard a lot about how fraudulent advertising can affect people terribly, particularly more vulnerable people, so that is an important addition.

Amendments 23 and 24 seek to make it clear that where the target is in the UK, people are covered. I am happy to assure the Committee that that is already covered, because the definitions at the beginning of the Bill—going back to clause 3(5)(b), on page 3—make it clear that companies are in scope, both user-to-user and search, if there is a significant number of UK users or where UK users form one of the target markets, or are the only target market. Given the reference to “target markets” in the definitions, I hope that the shadow Minister will withdraw the amendment, because the matter is already covered in the Bill.

New clause 5 raises important points about the regulation of online advertising, but that is outside the purview of what the Bill is trying to achieve. The Government are going to work through the online advertising programme to tackle these sorts of issues, which are important. The shadow Minister is right to raise them, but they will be tackled holistically by the online advertising programme, and of course there are already codes of practice that apply and are overseen by the Advertising Standards Authority. Although these matters are very important and I agree with the points that she makes, there are other places where those are best addressed.

New clause 6 is about the verification process. Given that the Bill is primary legislation, we want to have the core duty to prevent fraudulent advertising in the Bill. How that is implemented in this area, as in many others, is best left to Ofcom and its codes of practice. When Ofcom publishes the codes of practice, it might consider such a duty, but we would rather leave Ofcom, as the expert regulator, with the flexibility to implement that via the codes of practice and leave the hard-edged duty in the Bill as drafted.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

We are going to press amendments 23 and 24 to a vote because they are very important. I cited the example of earlier legislation that considered it important, in relation to selling tickets, to include the wording “anywhere in the world”. We know that ticket abuses happen with organisations in different parts of the world.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The hon. Lady is perfectly entitled to press to a vote whatever amendments she sees fit, but in relation to amendments 23 and 24, the words she asks for,

“where the UK is a target market”,

are already in the Bill, in clause 3(5)(b), on page 3, which sets out the definitions at the start. I will allow the hon. Lady a moment to look at where it states:

“United Kingdom users form one of the target markets for the service”.

That applies to user-to-user and to search, so it is covered already.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

The problem is that we are getting into the wording of the Bill. As with the child abuse clause that we discussed before lunch, there are limitations. Clause 3 states that a service has links with the United Kingdom if

“the service has a significant number of United Kingdom users”.

It does not matter if a person is one of 50, 100 or 1,000 people who get scammed by some organisation operating in another part of the world. The 2006 Act dealing with the sale of Olympic tickets recognised that was important, and we also believe it is important. We have to find a way of dealing with ticket touting and ticket abuse.

Turning to fraudulent advertising, I have given examples and been supported very well by the hon. Member for Aberdeen North. It is not right that vulnerable people are repeatedly taken in by search results, which is the case right now. The reason we have tabled all these amendments is that we are trying to protect vulnerable people, as with every other part of the Bill.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

That is of course our objective as well, but let me just return to the question of the definitions. The hon. Lady is right that clause 3(5)(a) says

“a significant number of United Kingdom users”,

but paragraph (b) just says,

“United Kingdom users form one of the target markets”.

There is no significant number qualification in paragraph (b), and to put it beyond doubt, clause 166(1) makes it clear that service providers based outside the United Kingdom are within the scope of the Bill. To reiterate the point, where the UK is a target market, there is no size qualification: the service provider is in scope, even if it has only one user.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

Does the Minister want to say anything about the other points I made about advertisements?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Not beyond the points I made previously, no.

Question put, That the amendment be made.

--- Later in debate ---
None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Amendment 45, in clause 35, page 34, line 2, leave out subsection (1) and insert—

“(1) A provider of a Category 2A service must operate the service using proportionate systems and processes designed to—

(a) prevent individuals from encountering content consisting of fraudulent advertisements by means of the service;

(b) minimise the length of time for which any such content is present;

(c) where the provider is alerted by a person to the presence of such content, or becomes aware of it in any other way, swiftly take down such content.”

This amendment brings the fraudulent advertising provisions for Category 2A services in line with those for Category 1 services.

Government amendments 91 to 94.

Clause 35 stand part.

Amendment 44, in clause 36, page 35, line 10, at end insert—

“(4A) An offence under Part 3 of the Consumer Protection from Unfair Trading Regulations 2008.”

This amendment adds further offences to those which apply for the purposes of the Bill’s fraudulent advertising provisions.

Clause 36 stand part.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I am aware that the Minister has reconsidered the clause and tabled a Government amendment that is also in this group, with the same purpose as our amendment 45. That is welcome, as there was previously no justifiable reason why the duties on category 1 services and category 2A services were misaligned.

All three of the duties on category 1 services introduced by clause 34 are necessary to address the harm caused by fraudulent and misleading online adverts. Service providers need to take proportionate but effective action to prevent those adverts from appearing or reappearing, and when they do appear, those service providers need to take them down swiftly. The duties on category 2A services were much weaker, only requiring them to minimise the risk of individuals encountering content consisting of fraudulent advertisements in or via search results of the service. There was no explicit reference to prevention, even though that is vital, or any explicit requirement to act quickly to take harmful adverts down.

That difference would have created an opportunity for fraudsters to exploit by focusing on platforms with lesser protections. It could have resulted in an increase in fraud enabled by paid-for advertising on search services, which would have undermined the aims of the Bill. I am glad that the Government have recognised this and will require the same proactive, preventative response to harmful ads from regulated search engines as is required from category 1 services.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

As we have heard already, these clauses are very important because they protect people from online fraudulent advertisements for the first time—something that the whole House quite rightly called for. As the shadow Minister said, the Government heard Parliament’s views on Second Reading, and the fact that the duties in clause 35 were not as strongly worded as those in clause 34 was recognised. The Government heard what Members said on Second Reading and tabled Government amendments 91 to 94, which make the duties on search firms in clause 35 as strong as those on user-to-user firms in clause 34. Opposition amendment 45 would essentially do the same thing, so I hope we can adopt Government amendments 91 to 94 without needing to move amendment 45. It would do exactly the same thing—we are in happy agreement on that point.

I listened carefully to what the shadow Minister said on amendment 44. The example she gave at the end of her speech—the poor lady who was induced into sending money, which she thought was being sent to pay off creditors but was, in fact, stolen—would, of course, be covered by the Bill as drafted, because it would count as an act of fraud.

The hon. Lady also talked about some other areas that were not fraud, such as unfair practices, misleading statements or statements that were confusing, which are clearly different from fraud. The purpose of clause 35 is to tackle fraud. Those other matters are, as she says, covered by the Consumer Protection from Unfair Trading Regulations 2008, which are overseen and administered by the Competition and Markets Authority. While matters to do with unfair, misleading or confusing content are serious—I do not seek to minimise their importance—they are overseen by a different regulator and, therefore, better handled by the CMA under its existing regulations.

If we introduce this extra offence to the list in clause 36, we would end up having a bit of regulatory overlap and confusion, because there would be two regulators involved. For that reason, and because those other matters—unfair, misleading and confusing advertisements—are different to fraud, I ask that the Opposition withdraw amendment 44 and, perhaps, take it up on another occasion when the CMA’s activities are in the scope of the debate.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

No, we want to press this amendment to a vote. I have had further comment from the organisations that I quoted. They believe that we do need the amendment because it is important to stop harmful ads going up in the first place. They believe that strengthened provisions are needed for that. Guidance just puts the onus for protecting consumers on the other regulatory regimes that the Minister talked about. The view of organisations such as StepChange is that those regimes—the Advertising Standards Authority regime—are not particularly strong.

The regulatory framework for financial promotions is fragmented. FCA-regulated firms are clearly under much stronger obligations than those that fall outside FCA regulations. I believe that it would be better to accept the amendment, which would oblige search engines and social media giants to prevent harmful and deceptive ads from appearing in the first place. The Minister really needs to take on board the fact that in this patchwork, this fragmented world of different regulatory systems, some of the existing systems are clearly failing badly, and the strong view of expert organisations is that the amendment is necessary.

Question put and agreed to.

Clause 34 accordingly ordered to stand part of the Bill.

Clause 35

Duties about fraudulent advertising: Category 2A services

Amendments made: 91, in clause 35, page 34, line 3, leave out from “to” to end of line 5 and insert—

“(a) prevent individuals from encountering content consisting of fraudulent advertisements in or via search results of the service;

(b) if any such content may be encountered in or via search results of the service, minimise the length of time that that is the case;

(c) where the provider is alerted by a person to the fact that such content may be so encountered, or becomes aware of that fact in any other way, swiftly ensure that individuals are no longer able to encounter such content in or via search results of the service.”

This amendment alters the duty imposed on providers of Category 2A services relating to content consisting of fraudulent advertisements so that it is in line with the corresponding duty imposed on providers of Category 1 services by clause 34(1).

Amendment 92, in clause 35, page 34, line 16, leave out “reference” and insert “references”.

This amendment is consequential on Amendment 91.

Amendment 93, in clause 35, page 34, line 18, leave out “is a reference” and insert “are references”.

This amendment is consequential on Amendment 91.

Amendment 94, in clause 35, page 34, line 22, leave out

“does not include a reference”

and insert “do not include references”.—(Chris Philp.)

This amendment is consequential on Amendment 91.

Clause 35, as amended, ordered to stand part of the Bill.

Clause 36

Fraud etc offences

Amendment proposed: 44, in clause 36, page 35, line 10, at end insert—

“(4A) An offence under Part 3 of the Consumer Protection from Unfair Trading Regulations 2008.”—(Barbara Keeley.)

This amendment adds further offences to those which apply for the purposes of the Bill’s fraudulent advertising provisions.

Question put, That the amendment be made.

Online Safety Bill (Ninth sitting)

Barbara Keeley Excerpts
Committee stage
Tuesday 14th June 2022

(1 year, 11 months ago)

Public Bill Committees
None Portrait The Chair
- Hansard -

Barbara Keeley, do you wish to speak to the clause?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Given that the clause is clearly uncontentious, I will be extremely brief.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Schedule 5 has already been debated, so we will proceed straight—

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

No, it hasn’t. We did not get a chance to speak to either schedule 5 or schedule 6.

None Portrait The Chair
- Hansard -

Sorry; they were in the group, so we have to carry on.

Schedules 5 and 6 agreed to.

Ordered, That further consideration be now adjourned.—(Steve Double.)

Online Safety Bill (Tenth sitting)

Barbara Keeley Excerpts
Committee stage
Tuesday 14th June 2022

Public Bill Committees
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Clause 58, which was touched on in our last debate, simply sets out Ofcom’s duty to publish guidance for category 1 services to assist them in complying with the user identification duty set out in clause 57. We have probably covered the main points, so I will say nothing further.

Question put and agreed to.

Clause 58 accordingly ordered to stand part of the Bill.

Clause 59

Requirement to report CSEA content to the NCA

Question proposed, That the clause stand part of the Bill.

Barbara Keeley Portrait Barbara Keeley (Worsley and Eccles South) (Lab)
- Hansard - -

You are really moving us at pace, Sir Roger. It is a pleasure to serve in Committee with you in the Chair.

It is welcome that regulated services will have to report all child sexual exploitation and abuse material that they detect on their platform. The Government’s decision to move away from the approach of a regulatory code of practice to a mandatory reporting requirement is an important improvement to the draft Bill.

For companies to report child sexual exploitation and abuse material correctly to the mandatory reporting body, they will need access to accurate datasets that will determine whether something that they are intending to report is child sexual exploitation and abuse content. What guidance will be made available to companies so that they can proactively detect CSEA, and what plans are in place to assist companies to identify potential CSEA that has not previously been identified? The impact assessment mentions that, for example, BT is planning to use the Internet Watch Foundation’s hash list, which is compliant with UK law enforcement standards, to identify CSEA proactively. Hashing is a technology used to prevent access to known CSEA; a hash is a unique string of letters and numbers which is applied to an image and which can then be matched every time a user attempts to upload a known illegal image to a platform. It relies, however, on CSEA already having been detected. What plans are in place to assist companies to identify potential CSEA?
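To make the hash-matching idea concrete, here is a minimal sketch in Python. The function names, the flat in-memory set and the use of SHA-256 are illustrative assumptions only—real deployments rely on curated hash lists such as the Internet Watch Foundation’s and on perceptual hashing—but the sketch shows why the technique only catches material that has already been identified.

```python
import hashlib

# Hypothetical hash list; in practice this would be supplied and updated by a
# body such as the Internet Watch Foundation, not hard-coded in the platform.
KNOWN_CSEA_HASHES: set[str] = set()

def is_known_illegal_image(image_bytes: bytes) -> bool:
    """Return True if the uploaded image matches a hash on the known list."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_CSEA_HASHES

def handle_upload(image_bytes: bytes) -> str:
    # Exact hash matching blocks re-uploads of known images, but by definition
    # it cannot detect CSEA that has never previously been identified.
    if is_known_illegal_image(image_bytes):
        return "blocked_and_reported"
    return "allowed"
```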

Finally, it is important that the introduction of mandatory reporting does not adversely affect existing international reporting structures. Many of the largest platforms in the scope of the Bill are US-based and are required under US law to report CSEA material detected on their platforms to the National Center for Missing & Exploited Children, which ensures that information relevant to UK law enforcement is referred to UK agencies for investigation.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

To answer the shadow Minister’s question about the duty to detect CSEA proactively—because, as she says, we have to detect it before we can report it—I confirm that there are already duties in the Bill to prevent and detect CSEA proactively, because CSEA is a priority offence in the schedule 6 list of child sexual exploitation and abuse offences, and there is a duty on companies to prevent those offences proactively. In preventing them proactively, they will by definition identify them. That part of her question is well covered.

The hon. Lady also asked about the technologies available to those companies, including hash matching—comparing images against a known database of child sexual exploitation images. A lot of technology is being developed that can proactively spot child sexual exploitation in new images that are not on the hash matching database. For example, some technology combines age identification with nude image identification; by putting them together, we can identify sexual exploitation of children in images that are new and are not yet in the database.
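The combination described here can be pictured as two classifier outputs feeding a single decision. The sketch below uses invented names, scores and thresholds purely for illustration; it is not any accredited technology referred to in clause 103, nor a system any platform has disclosed.

```python
from dataclasses import dataclass

@dataclass
class ImageSignals:
    apparent_age: float   # estimated age in years from an age-estimation model
    nudity_score: float   # 0.0-1.0 confidence from a nude-image classifier

def flag_for_review(signals: ImageSignals,
                    age_threshold: float = 18.0,
                    nudity_threshold: float = 0.8) -> bool:
    """Flag previously unseen images when both signals point to potential CSEA.

    Unlike hash matching, this can surface new material, but it is
    probabilistic, so the output is a flag for human review rather than a
    final determination.
    """
    return (signals.apparent_age < age_threshold
            and signals.nudity_score >= nudity_threshold)
```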

To ensure that such new technology can be used, we have the duties under clause 103, which gives Ofcom the power to mandate—to require—the use of certain accredited technologies in fighting not just CSEA, but terrorism. I am sure that we will discuss that more when we come to that clause. Combined, the requirement to proactively prevent CSEA and the ability to specify technology under clause 103 will mean that companies will know about the content that they now, under clause 59, have to report to the National Crime Agency. Interestingly, the hon. Member for Worsley and Eccles South mentioned that that duty already exists in the USA, so it is good that we are matching that requirement in our law via clause 59, which I hope that the Committee will agree should stand part of the Bill.

Question put and agreed to.

Clause 59 accordingly ordered to stand part of the Bill.

Clause 60

Regulations about reports to the NCA

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to consider clause 61 stand part.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

The additional regulations created by the Secretary of State in connection with the reports will have a lot resting on them. It is vital that they receive the appropriate scrutiny when the time comes. For example, the regulations must ensure that referrals to the National Crime Agency made by companies are of a high quality, and that requirements are easy to comply with. Prioritising the highest risk cases will be important, particularly where there is an immediate threat to the safety and welfare of a child.

Clause 60 sets out that the Secretary of State’s regulations must include

“provision about cases of particular urgency”.

Does the Minister have an idea what that will look like? What plans are in place to ensure that law enforcement can prioritise the highest risk and harm cases?

Under the new arrangements, the National Crime Agency as the designated body, the Internet Watch Foundation as the appropriate authority for notice and takedown in the UK, and Ofcom as the regulator for online harms will all hold a vast amount of information on the scale of the threat posed by child sexual exploitation and illegal content. How will the introduction of mandatory reporting assist those three organisations in improving their understanding of how harm manifests online? How does the Minister envisage the organisations working together to share information to better protect children online?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I am glad that clause 60 will be in the Bill and that there will be a duty to report to the NCA. On subsection (3), though, I would like the Minister to clarify that if the Secretary of State believes that the Scottish Ministers would be appropriate people to consult, they would consult them, and the same for the Northern Ireland Executive.

I would appreciate the Minister explaining how clause 61 will work in a Scottish context, because that clause talks about the Crime and Courts Act 2013. Does a discussion need to be had with Scottish Ministers, and perhaps Northern Ireland Ministers as well, to ensure that information sharing takes place seamlessly with devolved areas with their own legal systems, to the same level as within England and Wales? If the Minister does not have an answer today, which I understand that he may not in detail, I am happy to hear from him later; I understand that it is quite a technical question.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The hon. Member for Worsley and Eccles South asks about the prioritisation of reports made to the NCA under the new statutory provisions. The prioritisation of investigations is an operational matter for the NCA, acting as a law enforcement body. I do not think it would be right either for myself as a Minister or for Parliament as a legislative body to specify how the NCA should conduct its operational activities. I imagine that it would pursue the most serious cases as a matter of priority, and if there is evidence of any systemic abuse it would also prioritise that, but it really is a matter for the NCA, as an operationally independent police force, to decide for itself. I think it is fairly clear that the scope of matters to be contained in these regulations is fairly comprehensive, as one would expect.

On the questions raised by the hon. Member for Aberdeen North, the Secretary of State might consult Scottish Ministers under clause 63(6)(c), particularly those with responsibility for law enforcement in Scotland, and the same would apply to other jurisdictions. On whether an amendment is required to cover any matters to do with the procedures in Scotland equivalent to the matter covered in clause 61, we do not believe that any equivalent change is required to devolved Administration law. However, in order to be absolutely sure, we will get the hon. Lady written confirmation on that point.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I am not sure that the Minister has answered my question on clause 60. I think we all agree that law enforcement agencies can decide their own priorities, quite rightly, but clause 60(2)(d) sets out that the Secretary of State’s regulations must include

“provision about cases of particular urgency”.

I asked the Minister what that would look like.

Also, we think it is pretty important that the National Crime Agency, the Internet Watch Foundation and Ofcom work together on mandatory reporting. I asked him how he envisaged them working together to share information, because the better they do that, the more children are protected.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I apologise for missing those two points. On working together, the hon. Lady is right that agencies such as the Internet Watch Foundation and others should co-operate closely. There is already very good working between the Internet Watch Foundation, law enforcement and others—they seem to be well networked together and co-operating closely. It is appropriate to put on the record that Parliament, through this Committee, thinks that co-operation should continue. That communication and the sharing of information on particular images is obviously critical.

As the clause states, the regulations can set out expedited timeframes in cases of particular urgency. I understand that to mean cases where there might be an immediate risk to a child’s safety, or where somebody might be at risk in real time, as opposed to something historic—for example, an image that might have been made some time ago. In cases where it is believed abuse is happening at the present time, there is an expectation that the matter will be dealt with immediately or very close to immediately. I hope that answers the shadow Minister’s questions.

Question put and agreed to.

Clause 60 accordingly ordered to stand part of the Bill.

Clause 61 ordered to stand part of the Bill.

Clause 62

Offence in relation to CSEA reporting

--- Later in debate ---
Barbara Keeley Portrait Barbara Keeley
- Hansard - -

Clause 63 sets out that the CSEA content required to be reported must have been published, generated, uploaded or shared in the UK or by a UK citizen, or must involve a child in the UK. Subsection (6) requires services to provide evidence of such a link to the UK, which might be quite difficult in some circumstances. I would appreciate the Minister outlining what guidance and support will be made available to regulated services to ensure that they can fulfil their obligations to evidence that link.

Takeovers, mergers and acquisitions are commonplace in the technology industry, and many companies are bought out by others based overseas, particularly in the United States. Once a regulated service has been bought out by a company based abroad, what plans are in place to ensure either that the company continues to report to the National Crime Agency or that it can transition to another mandatory reporting structure, as may be required in another country in the future? That is particularly relevant as we know that the European Union is seeking to introduce mandatory reporting requirements in the coming years.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

If the child had been in the UK when the offence was committed, that would ordinarily be subject to UK criminal law, because the crime would have been committed in the UK. The test is: where was the child or victim at the time the offence was committed? As I said a moment ago, however, the definition of “UK-linked” is particularly wide and includes

“the place where the content was published, generated, uploaded or shared.”

The word “generated”—I am reading from clause 63(6)(a), at the top of page 56—is clearly in the past tense and would include the circumstance that the hon. Lady described.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

What the Minister has said is helpful, but the question I asked was about what guidance and support will be made available to regulated services. We all want this to work, because it is one of the most important aspects of the Bill—many aspects are important. He made it clear to us that the definitions are quite wide, both the general definitions and that of “UK-linked” content. The point of the question was, given the possible difficulties in some circumstances, what guidance and support will be made available?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I anticipate that the National Crime Agency will issue best practice guidance. A fair amount of information about the requirements will also be set out in the regulations that the Secretary of State will issue under clause 60, which we have already debated. So it is a combination of those regulations and National Crime Agency best practice guidance. I hope that answers the question.

Finally, on companies being taken over, if a company ceases to be UK-linked, we would expect it to continue to discharge its reporting duties, which might include reporting not just in the UK but to its domestic reporting agency—we have already heard the US agency described and referenced.

I hope that my answers demonstrate that the clause is intended to be comprehensive and effective. It should ensure that the National Crime Agency gets all the information it needs to investigate and prosecute CSEA in order to keep our children safe.

Question put and agreed to.

Clause 62, as amended, accordingly ordered to stand part of the Bill.

Clause 63 ordered to stand part of the Bill.

Clause 64

Transparency reports about certain Part 3 services

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I beg to move amendment 54, in clause 64, page 56, line 29, leave out “Once” and insert “Twice”.

This amendment would change the requirement for transparency report notices from once a year to twice a year.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Clause stand part.

Amendment 55, in schedule 8, page 188, line 42, at end insert—

“31A The notice under section 64(1) must require the provider to provide the following information about the service—

(a) the languages in which the service has safety systems or classifiers;

(b) details of how human moderators employed or engaged by the provider are trained and supported;

(c) the process by which the provider takes decisions about the design of the service;

(d) any other information that OFCOM considers relevant to ensuring the safe operation of the service.”

This amendment sets out details of information Ofcom must request be provided in a transparency report.

That schedule 8 be the Eighth schedule to the Bill.

Clause 65 stand part.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

The duties on regulated services set out in the clause are welcome. Transparency reports will be a vital tool for holding platforms to account and for understanding the true drivers of online harm. However, asking platforms to submit transparency reports only once a year does not reflect how rapidly, as we know, the online world changes. As we have seen time and again, the online environment can shift significantly in a matter of months, if not weeks. We have seen that in the rise of disinformation about covid, which we have talked about, and in the accelerated growth of platforms such as TikTok.

Increasing the frequency of transparency reports from once a year to twice a year would ensure that platforms stay on the pulse of emergent risks, allowing Ofcom to do the same in turn. The amendment would also mean that companies focus on safety, rather than just profit. As has been touched on repeatedly, that is the culture change that we want to bring about. It would go some way towards preventing complacency about reporting harms, perhaps forcing companies to revisit the nature of harm analysis, management and reduction. In order for this regime to be world-leading and ambitious—I keep hearing the Minister using those words about the Bill—we must demand the most that we can from the highest-risk services, including on the important duty of transparency reporting.

Moving to clauses 64 and 65 stand part, transparency reporting by companies and Ofcom is important for analysing emerging harms, as we have discussed. However, charities have pointed out that platforms have a track record of burying documents and research that point to risk of harm in their systems and processes. As with other risk assessments and reports, such documents should be made public, so that platforms cannot continue to hide behind a veil of secrecy. As I will come to when I speak to amendment 55, the Bill must be ambitious and bold in what information platforms are to provide as part of the clause 64 duty.

Clause 64(3) states that, once issued with a notice by Ofcom, companies will have to produce a transparency report, which must

“be published in the manner and by the date specified in the notice.”

Can the Minister confirm that that means regulated services will have to publish transparency reports publicly, not just to Ofcom? Can he clarify that that will be done in a way that is accessible to users, similarly to the requirements on services to make their terms of service and other statements clear and accessible? Some very important information will be included in those reports that will be critical for researchers and civil society when analysing trends and harms. It is important that the data points outlined in schedule 8 capture the information needed for those organisations to make an accurate analysis.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

The evidence we heard from Frances Haugen set out how important transparency is. If internet and service providers have nothing to hide, transparency is surely in their interests as well. From my perspective, there is little incentive for the Government not to support the amendment, if they want to help civil society, researchers, academics and so on in improving a more regulated approach to transparency generally on the internet, which I am sure we all agree is a good thing.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I very much agree. We cannot emphasise that enough, and it is useful that my hon. Friend has set that out, adding to what I was saying.

Amendment 55 sets out the details of the information that Ofcom must request to be provided in a transparency report in new paragraph 31A. First, transparency disclosures required by the Bill should include how large companies allocate resources to tackling harm in different languages—an issue that was rightly raised by the hon. Member for Ochil and South Perthshire. As we heard from Frances Haugen, many safety systems at Meta have only a subset of detection systems for languages other than English. Languages such as Welsh have almost no safety systems live on Facebook. It is neither fair nor safe.

When we consider that more than 250 languages are spoken in London alone, the inconsistency of safety systems becomes very concerning. Charities have warned that people accessing Facebook in different languages are being exposed to very different levels of risk, with some versions of Facebook having few or none of the safety systems that protect other versions of the site in different languages.

When giving evidence to the Committee last month, Richard Earley disclosed that Meta regulated only 70 languages. Given that around 3 billion people use Facebook on a monthly basis across the world, that is clearly inadequate.

Dean Russell Portrait Dean Russell
- Hansard - - - Excerpts

One of the things we found on the Joint Committee last year was the consistent message that we should not need to put this Bill in place. I want to put on the record my continued frustration that Meta and the other social media platforms are requiring us to put this Bill in place because they are not doing the monitoring, engaging in that way or putting users first. I hope that the process of going through the Bill has helped them to see the need for more monitoring. It is disappointing that we have had to get to this point. The UK Government are having to lead the world by putting this Bill in place—it should not be necessary. I hope that the companies do not simply follow what we are putting forward, but go much further and see that it is imperative to change the way they work and support their users around the world.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I thank the hon. Gentleman and I agree. It is a constant frustration that we need this Bill. We do need it, though. In fact, amendment 55 would really assist with that, by requiring those services to go further in transparency reporting and to disclose

“the languages in which the service has safety systems or classifiers”.

We need to see what they are doing on this issue. It is an easily reported piece of information that will have an outsized impact on safety, even for English speakers. It will help linguistic groups in the multilingual UK and around the world.

Reporting on language would not be a big burden on companies. In her oral evidence, Frances Haugen told the Committee that large platforms can trivially produce this additional data merely by changing a single line of code when they do their transparency reports. We must not become wrapped up in the comfort of the language we all speak and ignore the gaping loophole left for other languages, which allows harms to slip through.
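The “single line of code” point can be illustrated with a toy transparency-report builder; the field names and the shape of the report are assumptions made for illustration, not anything a platform or Ofcom has specified.

```python
from collections import Counter

def build_transparency_report(moderation_log: list[dict]) -> dict:
    """Build a minimal report from a log of moderation decisions."""
    report = {"items_actioned": len(moderation_log)}
    # The kind of one-line addition described in evidence: break the same
    # counts down by the language each item was reviewed in.
    report["items_actioned_by_language"] = dict(
        Counter(item["language"] for item in moderation_log)
    )
    return report
```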

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

To start with, it is worth saying that clause 64 is extremely important. In the course of debating earlier clauses, Opposition Members rightly and repeatedly emphasised how important it is that social media platforms are compelled to publish information. The testimony that Frances Haugen gave to the Joint Committee and to this Committee a few weeks ago demonstrates how important that is. Social media platforms are secretive and are not open. They seek to disguise what is going on, even though the impact of what they are doing has a global effect. So the transparency power in clause 64 is a critical part of the Bill and will dramatically transform the insights of parliamentarians, the wider public, civil society campaigners and academics. It will dramatically open up the sense of what is going on inside these companies, so it is extremely important indeed.

Amendment 54 seeks to increase the frequency of transparency reporting from once a year to twice a year. To be honest, we do not want to do this unreasonably frequently, and our sense is that once a year, rather than twice a year, is the right regularity. We therefore do not support the amendment. However, Members will notice that there is an ability in clause 64(12) for the Secretary of State, by regulation, to

“amend subsection (1) so as to change the frequency of the transparency reporting process.”

If it turns out in due course that once a year is not enough and we would like to do it more frequently—for example, twice a year—there is the power for those regulations to be used so that the reporting occurs more frequently. The frequency is not set in stone.

I turn to amendment 55, which sets out a number of topics that would be included in reporting. It is important to say that, as a quick glance at schedule 8 shows, the remit of the reports is already extremely wide in scope. Hon. Members will see that paragraph 5 specifies that reports can cover

“systems and processes for users to report content which they consider to be illegal”

or “harmful”, and so on. Paragraph 6 mentions:

“The systems and processes that a provider operates to deal with illegal content, content that is harmful to children”,

and so on. Therefore, the topics that amendment 55 speaks to are already covered by the schedule, and I would expect such things to be reported on. We have given Ofcom the explicit powers to do that and, rather than prescribe such details in the Bill, we should let Ofcom do its job. It certainly has the powers to do such things—that is clearly set out in the schedule—and I would expect, and obviously the Opposition would expect, that it will do so. On that basis, I will gently resist amendments 54 and 55.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

On amendment 55, I want to come back to the Minister on two points about languages that were made by the hon. Member for Aberdeen North. I think most people would be shocked to discover that safety systems do not operate in many languages, so people using a language other than English may not be protected. I also think that people will be shocked about, as I outlined, the employment of moderators and how badly they are paid and trained. There are factories full of people doing that important task.

I recommend that the Minister think again about requiring transparency reports to include details of the human moderators who are employed or engaged and of how they are trained and supported. It is a bit like when we find out about factories producing various items under appalling conditions in other parts of the world—we need transparency on these issues to make people do something about it. These platforms will not do anything about it. Under questioning from my hon. Friend the Member for Pontypridd, Richard Earley admitted that he had no idea how many human moderators were working for Facebook. That is appalling and we must do something about it.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I obviously have sympathy with the objectives, but the topics covered in schedule 8, which include the systems and processes for responding to illegal and harmful content and so on, give Ofcom the power to do what the hon. Member requires. On the language point, the risk assessments that companies are required to do are hard-edged duties in the Bill, and they will have to include an assessment of languages used in the UK, which is a large number of languages—obviously, it does not include languages spoken outside the UK. So the duty to risk-assess languages already exists. I hope that gives the hon. Member reassurance. She is making a reasonable point, and I would expect that, in setting out transparency requirements, Ofcom will address it. I am sure that it will look at our proceedings to hear Parliament’s expectations, and we are giving it those powers, which are clearly set out in schedule 8.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I will just make a final point. The Bill gives Ofcom these powers when it already has so much to do; we keep returning to how much will ride on Ofcom’s decisions. Our amendments would make explicit the requirement for transparency reporting on the language issue, and on the employment of human moderators and how they are trained and supported. Ofcom has enough other things to be doing, so if we do not point these matters out, they risk being overlooked. As with so many of our amendments, we are simply asking for these points to be drawn out specifically so that they happen.

Question put, That the amendment be made.

Online Safety Bill (Eleventh sitting)

Barbara Keeley Excerpts
Committee stage
Thursday 16th June 2022

Public Bill Committees
None Portrait The Chair
- Hansard -

With this it will be convenient to discuss clauses 71 to 76 stand part.

Barbara Keeley Portrait Barbara Keeley (Worsley and Eccles South) (Lab)
- Hansard - -

It is a pleasure to serve with you in the Chair again, Sir Roger. I add my tribute to our former colleague, Jo Cox, on this sad anniversary. Our thoughts are with her family today, including our colleague and my hon. Friend, the Member for Batley and Spen.

We welcome the “polluter pays” principle on which this and the following clauses are founded. Clause 70 establishes a duty for providers to notify Ofcom if their revenue is at or above the specified threshold designated by Ofcom and approved by the Secretary of State. It also creates duties on providers to provide timely notice and evidence of meeting the threshold. The Opposition do not oppose those duties. However, I would be grateful if the Minister could clarify what might lead to a provider or groups of providers being exempt from paying the fee. Subsection (6) establishes that

“OFCOM may provide that particular descriptions of providers of regulated services are exempt”,

subject to the Secretary of State’s approval. Our question is what kinds of services the Minister has in mind for that exemption.

Turning to clauses 71 to 76, as I mentioned, it is appropriate that the cost to Ofcom of exercising its online safety functions is paid through an annual industry fee, charged to the biggest companies with the highest revenues, and that smaller companies are exempt but still regulated. It is also welcome that under clause 71, Ofcom can make reference to factors beyond the provider’s qualifying worldwide revenue when determining the fee that a company must pay. Acknowledging the importance of other factors when computing that fee can allow for a greater burden of the fees to fall on companies whose activities may disproportionately increase Ofcom’s work on improving safety.

My hon. Friend the Member for Pontypridd has already raised our concerns about the level of funding needed for Ofcom to carry out its duties under the Bill. She asked about the creation of a new role: that of an adviser on funding for the online safety regulator. The impact assessment states that the industry fee will need to average around £35 million a year for the next 10 years to pay for operating expenditure. Last week, the Minister referred to a figure of around £88 million that has been announced to cover the first two years of the regime while the industry levy is implemented, and the same figure was used on Second Reading by the Secretary of State. Last October’s autumn Budget and spending review refers on page 115 to

“over £110 million over the SR21 period for the government’s new online safety regime through the passage and implementation of the Online Safety Bill, delivering on the government’s commitment to make the UK the safest place to be online.”

There is no reference to the £88 million figure or to Ofcom in the spending review document. Could the Minister tell us a bit more about that £88 million and the rest of the £110 million announced in the spending review, as it is relevant to how Ofcom is going to be resourced and the industry levy that is introduced by these clauses?

The Opposition feel it is critical that when the Bill comes into force, there is no gap in funding that would prevent Ofcom from carrying out its duties. The most obvious problem is that the level of funding set out in the spending review was determined when the Bill was in draft form, before more harms were brought into scope. The Department for Digital, Culture, Media and Sport has also confirmed that the figure of £34.9 million a year that is needed for Ofcom to carry out its online safety duties was based on the draft Bill.

We welcome many of the additional duties included in the Bill since its drafting, such as those on fraudulent advertising, but does the Minister think that a level of funding calculated when the Bill was still in draft form will remain adequate? Will he reconsider his Department’s calculations of the funding that Ofcom will need for this regime to be effective, in the light of the increased workload that this latest version of the Bill introduces?

In March 2021, Ofcom put out a press release stating that 150 people would be employed in the new digital and technology hub in Manchester, but that that number would be reached in 2025. Therefore, as well as the level of resource being based on an old version of the Bill, the timeframe reveals a gap of three years until all the staff are in place. Does the Minister believe that Ofcom will have everything that is needed from the start, and in subsequent years as the levy gets up and going, in order to carry out its duties?

Of course, this will depend on how long the levy might need to be in place. My understanding of the timeframe is that, first, the Secretary of State must issue guidance to Ofcom about the principles to be included in the statement of principles that Ofcom will use to determine the fees payable under clause 71. Ofcom must consult those affected by the threshold amount to inform the final figure it recommends to the Secretary of State, and must produce a statement about what amounts comprise the provider’s qualifying worldwide revenue and the qualifying period. That figure and Ofcom’s guidance must be agreed by the Secretary of State and laid before Parliament. Based on those checks and processes, how quickly does the Minister envisage the levy coming into force?

The Minister said last week that Ofcom is resourced for this work until 2023-24. Will the levy be in place by then to fund Ofcom’s safety work into 2024-25? If not, can the Minister confirm that the Government will cover any gaps in funding? I am sure he will agree, as we all do, that the duties in the Bill must be implemented as quickly as possible, but the necessary funding must also be in place so that Ofcom as a regulator can enforce the safety duty.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - - - Excerpts

I have just a short comment on these clauses. I very much applaud the Government’s approach to the funding of Ofcom through this mechanism. Clause 75 sets out clearly that the fees payable to Ofcom under section 71 should only be

“sufficient to meet, but…not exceed the annual cost to OFCOM”.

That is important when we start to think about victim support. While clearly Ofcom will have a duty to monitor the efficacy of the mechanisms in place on social media platforms, it is not entirely clear to me from the evidence or conversations with Ofcom whether it will see it as part of its duty to ensure that other areas of victim support are financed through those fees.

It may well be that the Minister thinks it more applicable to look at this issue when we consider the clauses on fines, and I plan to come to it at that point, but it would be helpful to understand whether he sees any role for Ofcom in ensuring that there is third-party specialist support for victims of all sorts of crime, including fraud or sexual abuse.

--- Later in debate ---
Barbara Keeley Portrait Barbara Keeley
- Hansard - -

Before the Minister gets past this point—I think he has reached the point of my question—the fees do not kick in for two years. The figure is £88 million, but the point I was making is that the scope of the Bill has already increased. I asked about this during the evidence session with Ofcom. Fraudulent advertising was not included before, so there are already additional powers for Ofcom that need to be funded. I was questioning whether the original estimate will be enough for those two years.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I assume that the hon. Lady is asking about the £88 million.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

indicated assent.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

That covers the preparatory work rather than the actual enforcement work that will follow. For the time being, we believe that it is enough, but of course we always maintain an active dialogue with Ofcom.

Finally, there was a question from my right hon. Friend the Member for Basingstoke, who asked how victims will be supported and compensated. As she said, Ofcom will always pay attention to victims in its work, but we should make it clear that the fees we are debating in these clauses are designed to cover only Ofcom’s costs and not those of third parties. I think the costs of victim support and measures to support victims are funded separately via the Ministry of Justice, which leads in this area. I believe that a victims Bill is being prepared that will significantly enhance the protections and rights that victims have—something that I am sure all of us will support.

Question put and agreed to.

Clause 70 accordingly ordered to stand part of the Bill.

Clauses 71 to 76 ordered to stand part of the Bill.

Clause 77

General duties of OFCOM under section 3 of the Communications Act

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I want to remind Committee members of what my hon. Friend is talking about. I refer to the oral evidence we heard from Danny Stone, from the Antisemitism Policy Trust, on these small, high-harm platforms. He laid out examples drawn from the work of the Community Security Trust, which released a report called “Hate Fuel”. The report looked at

“various small platforms and highlighted that, in the wake of the Pittsburgh antisemitic murders, there had been 26 threads…with explicit calls for Jews to be killed. One month prior to that, in May 2020, a man called Payton Gendron found footage of the Christchurch attacks. Among this was legal but harmful content, which included the “great replacement” theory, GIFs and memes, and he went on a two-year journey of incitement.”

A week or so before the evidence sitting,

“he targeted and killed 10 people in Buffalo. One of the things that he posted was:

‘Every Time I think maybe I shouldn’t commit to an attack I spend 5 min of /pol/’—

which is a thread on the small 4chan platform—

‘then my motivation returns’.”

Danny Stone told us that the kind of material we are seeing, which is legal but harmful, is inspiring people to go out and create real-world harm. When my hon. Friend the Member for Pontypridd asked him how to amend this approach, he said:

“You would take into account other things—for example, characteristics are already defined in the Bill, and that might be an option”.––[Official Report, Online Safety Public Bill Committee, 26 May 2022; c. 128, Q203-204.]

I do hope that, as my hon. Friend urges, the Minister will look at all these options, because this is a very serious matter.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I completely agree with my hon. Friend. The evidence we heard from Danny Stone from the Antisemitism Policy Trust clearly outlined the real-world harm that legal but harmful content causes. Such content may be legal, but it causes mass casualties and harm in the real world.

There are ways that we can rectify that in the Bill. Danny Stone set them out in his evidence and the SNP amendments, which the Labour Front Bench supports wholeheartedly, outline them too. I know the Minister wants to go further; he has said as much himself to this Committee and on the Floor of the House. I urge him to support some of the amendments, because it is clear that such changes can save lives.

Schedule 10 outlines the regulations specifying threshold conditions for categories of part 3 services. Put simply, as the Minister knows, Labour has concerns about the Government’s plans to allow thresholds for each category to be set out in secondary legislation. As we have said before, the Bill has already faced significant delays at the hands of the Government and we have real concerns that a reliance on secondary legislation further kicks the can down the road.

We also have concerns that the current system of categorisation is inflexible in so far as we have no understanding of how it will work if a service is required to shift from one category to another, and how long that would take. How exactly will that work in practice? Moreover, how long would Ofcom have to preside over such decisions?

We all know how fast-moving the online space is, with new technologies and ways of functioning popping up very often. Will the Minister clarify how he expects the re-categorisation process to occur in practice? The Minister must accept that his Department has been tone deaf on this point. Rather than an arbitrary size cut-off, the regulator must use risk levels to determine which category a platform should fall into, so that harmful and dangerous content does not slip through the net.

Labour welcomes clause 81, which sets out Ofcom’s duties in establishing a register of categories of certain part 3 services. As I have repeated throughout the passage of the Bill, having a level of accountability and transparency is central to its success. However, we have slight concerns that the wording in subsection (1), which stipulates that the register be established

“as soon as reasonably practicable”,

could be ambiguous and does not give us the certainty we require. Given the huge amount of responsibility the Bill places on Ofcom, will the Minister confirm exactly what he believes the stipulation means in practice?

Finally, we welcome clause 82. It clarifies that Ofcom has a duty to maintain the all-important register. However, we share the same concerns I previously outlined about the timeframe in which Ofcom will be compelled to make such changes. We urge the Minister to move as quickly as he can, to urge Ofcom to do all they can and to make these vital changes.

Online Safety Bill (Twelfth sitting)

Barbara Keeley Excerpts
Committee stage
Thursday 16th June 2022

Public Bill Committees
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I understand the shadow Minister’s point. We all heard from Frances Haugen about the social media firms’ well-documented reluctance—to put it politely—to open themselves up to external scrutiny. Making that happen is a shared objective. We have already discussed several times the transparency obligations enshrined in clause 64. Those will have a huge impact in ensuring that the social media firms open up a lot more and become more transparent. That will not be an option; they will be compelled to do that. Ofcom is obliged under clause 64 to publish the guidance around those transparency reports. That is all set in train already, and it will be extremely welcome.

Researchers’ access to information is covered in clause 136, which the amendments seek to amend. As the shadow Minister said, our approach is first to get Ofcom to prepare a report into how that can best be done. There are some non-trivial considerations to do with personal privacy and protecting people’s personal information, and there are questions about who counts as a valid researcher. When just talking about it casually, it might appear obvious who is or is not a valid researcher, but we will need to come up with a proper definition of “valid researcher” and what confidentiality obligations may apply to them.

Barbara Keeley Portrait Barbara Keeley (Worsley and Eccles South) (Lab)
- Hansard - -

This is all sorted in the health environment because of the personal data involved—there is no data more personal than health data—and a trusted and safe environment has been created for researchers to access personal data.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

This data is a little different—the two domains do not directly correspond. In the health area, there has been litigation—an artificial intelligence company is currently engaged in litigation with an NHS hospital trust about a purported breach of patient data rules—so even in that long-established area, there is uncertainty and recent, or perhaps even current, litigation.

We are asking for the report to be done to ensure that those important issues are properly thought through. Once they are, Ofcom has the power under clause 136 to lay down guidance on providing access for independent researchers to do their work.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I thank the hon. Member for Pontypridd for laying out her case in some detail, though nowhere near the level of detail that these people have to experience while providing moderation. She has given a very good explanation of why she is asking for the amendment and new clause to be included in the Bill. Concerns are consistently being raised, particularly by the Labour party, about the impact on the staff members who have to deal with this content. I do not think the significance of this issue for those individuals can be overstated. If we intend the Bill to have the maximum potential impact and reduce harm to the highest number of people possible, it makes eminent sense to accept this amendment and new clause.

There is a comparison with other areas in which we place similar requirements on other companies. The Government require companies that provide annual reports to undertake an assessment in those reports of whether their supply chain uses child labour or unpaid labour, or whether their factories are safe for people to work in—if they are making clothes, for example. It would not be an overly onerous request if we were to widen those requirements to take account of the fact that so many of these social media companies are subjecting individuals to trauma that results in them experiencing PTSD and having to go through a lengthy recovery process, if they ever recover. We have comparable legislation, and that is not too much for us to ask. Unpaid labour, or people being paid very little in other countries, is not that different from what social media companies are requiring of their moderators, particularly those working outside the UK and the US in countries where there are less stringent rules on working conditions. I cannot see a reason for the Minister to reject the provision of this additional safety for employees who are doing an incredibly important job that we need them to be doing, in circumstances where their employer is not taking any account of their wellbeing.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

As my hon. Friend the Member for Pontypridd has pointed out, there is little or no transparency about one of the most critical ways in which platforms tackle harms. Human moderators are on the frontline of protecting children and adults from harmful content. They must be well resourced, trained and supported in order to fulfil that function, or the success of the Bill’s aims will be severely undermined.

I find it shocking that platforms offer so little data on human moderation, either because they refuse to publish it or because they do not know it. For example, in evidence to the Home Affairs Committee, William McCants from YouTube could not give precise statistics for its moderator team after being given six days’ notice to find the figure, because many moderators were employed or operated under third-party auspices. For YouTube’s global counter-terrorism lead to be unaware of the detail of how the platform is protecting its users from illegal content is shocking, but it is not uncommon.

In evidence to this Committee, Meta’s Richard Earley was asked how many of Meta’s 40,000 human moderators were outsourced to remove illegal content and disinformation from the platform. My hon. Friend the Member for Pontypridd said:

“You do not have the figures, so you cannot tell me.”

Richard Earley replied:

“I haven’t, no, but I will be happy to let you know afterwards in our written submission.”

Today, Meta submitted its written evidence to the Committee. It included no reference to human content moderators, despite its promise.

The account that my hon. Friend gave just now shows why new clause 11 is so necessary. Meta’s representative told this Committee in evidence:

“Everyone who is involved in reviewing content at Meta goes through an extremely lengthy training process that lasts multiple weeks, covering not just our community standards in total but also the specific area they are focusing on, such as violence and incitement.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 45, Q76.]

But now we know from whistleblowers such as Daniel, whose case my hon. Friend described, that that is untrue. What is happening to Daniel and the other human moderators is deeply concerning. There are powerful examples of the devastating emotional impact that can occur because human moderators are not monitored, trained and supported.

There are risks of platforms shirking responsibility when they outsource moderation to third parties. Stakeholders have raised concerns that a regulated company could argue that an element of its service is not in the scope of the regulator because it is part of a supply chain. We will return to that issue when we debate new clause 13, which seeks to ensure enforcement of liability for supply chain failures that amount to a breach of one of the specified duties.

Platforms, in particular those supporting user-to-user generated content, employ those services from third parties. Yesterday, I met Danny Stone, the chief executive of the Antisemitism Policy Trust, who described the problem of antisemitic GIFs. Twitter would say, “We don’t supply GIFs. The responsibility is with GIPHY.” GIPHY, as part of the supply chain, would say, “We are not a user-to-user platform.” If someone searched Google for antisemitic GIFs, the results would contain multiple entries saying, “Antisemitic GIFs—get the best GIFs on GIPHY. Explore and share the best antisemitic GIFs.”

One can well imagine a scenario in which a company captured by the regulatory regime established by the Bill argues that an element of its service is not within the ambit of the regulator because it is part of a supply chain presented by, but not necessarily the responsibility of, the regulated service. The contracted element, which I have just described by reference to Twitter and GIPHY, supported by an entirely separate company, would argue that it was providing a business-to-business service that is not user-generated content but content designed and delivered at arm’s length and provided to the user-to-user service to deploy for its users.

I suggest that dealing with this issue would involve a lengthy, costly and unhelpful legal process, during which systems would not be effectively regulated and complex contract law would be invoked—the same may apply in relation to moderators and what my hon. Friend the Member for Pontypridd described; there are a number of lawsuits involved in Daniel’s case.

We recognise in UK legislation that there are concerns and issues surrounding supply chains. Under the Bribery Act 2010, for example, a company is liable if anyone performing services for or on the company’s behalf is found culpable for specific actions. These issues on supply chain liability must be resolved if the Bill is to fulfil its aim of protecting adults and children from harm.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

May I first say a brief word about clause stand part, Sir Roger?

--- Later in debate ---
Barbara Keeley Portrait Barbara Keeley
- Hansard - -

The Minister has not commented on the problem I raised of the contracted firm in the supply chain not being covered by the regulations under the Bill—the problem of Twitter and the GIFs, whereby the GIFs exist and are used on Twitter, but Twitter says, “We’re not responsible for them; it’s that firm over there.” That is the same thing, and new clause 11 would cover both.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am answering slightly off the cuff, but I think the point the hon. Lady is raising—about where some potentially offensive or illegal content is produced on one service and then propagated or made available by another—is one we debated a few days ago. I think the hon. Member for Aberdeen North raised that question, last week or possibly the week before. I cannot immediately turn to the relevant clause—it will be in our early discussions in Hansard about the beginning of the Bill—but I think the Bill makes it clear that where content is accessed through another platform, which is the example that the hon. Member for Worsley and Eccles South just gave, the platform through which the content is made available is within the scope of the Bill.

Question put, That the amendment be made.

Online Safety Bill (Thirteenth sitting)

Barbara Keeley Excerpts
Committee stage & Committee Debate - 13th sitting
Tuesday 21st June 2022

Public Bill Committees
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

In this clause we are specifically talking about access to information for researchers. Obviously, the transparency matters were covered in clauses 64 and 135. There is consensus across both parties that access to information for bona fide academic researchers is important. The clause lays out a path to take us in the direction of providing that access by requiring Ofcom to produce a report. We debated the matter earlier. The hon. Member for Worsley and Eccles South—I hope I got the pronunciation right this time—

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The hon. Lady made some points about the matter in an earlier sitting, as the shadow Minister just said. It is an area we are giving some careful thought to, because it is important that it is properly academically researched. Although Ofcom is being well resourced, as we have discussed, with lots of money and the ability to levy fees, we understand that it does not have a monopoly on wisdom—as good a regulator as it is. It may well be that a number of academics could add a great deal to the debate by looking at some of the material held inside social media firms. The Government recognise the importance of the matter, and some thought is being given to these questions, but at least we can agree that clause 136 as drafted sets out a path that leads us in this important direction.

Question put and agreed to.

Clause 136 accordingly ordered to stand part of the Bill.

Clause 137

OFCOM’s reports

--- Later in debate ---
None Portrait The Chair
- Hansard -

With this, it will be convenient to consider clause 139 stand part.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

Good morning, Ms Rees. It is a pleasure to serve on the Committee with you in the Chair. Clause 138 allows companies to make appeals against Ofcom’s decisions regarding the categorisation of services within categories 1, 2A or 2B.

We have argued, many times, that we believe the Government’s size-based approach to categorisation is flawed. Our preference for an approach based on risk is backed up by the views of multiple stakeholders and the Joint Committee. It was encouraging to hear last week of the Minister’s intention to look again at the issues of categorisation, and I hope we will see movement on that on Report.

Clause 138 sets out that where a regulated provider has filed an appeal, they are exempt from carrying out the duties in the Bill that normally apply to services designated as category 1, 2A or 2B. That is concerning, given that there is no timeframe in which the appeals process must be concluded.

While the right to appeal is important, it is feasible that many platforms will raise appeals about their categorisation to delay the start of their duties under the Bill. I understand that the platforms will still have to comply with the duties that apply to all regulated services, but for a service that has been classified by Ofcom as high risk, it is potentially dangerous that none of the risk assessments or measures to assess harm will be completed while the appeal is taking place. Does the Minister agree that the appeals process must be concluded as quickly as possible to minimise the risk? Will he consider putting a timeframe on that?

Clause 139 allows for appeals against decisions by Ofcom to issue notices about dealing with terrorism and child sexual abuse material, as well as a confirmation decision or a penalty notice. As I have said, in general the right to appeal is important. However, would an appeals system work if, for example, a company were appealing against a notice under clause 103? In what circumstances does the Minister imagine that a platform would appeal a notice by Ofcom requiring the platform to use accredited technology to identify child sexual abuse content and swiftly take down that content? It is vital that appeals processes are concluded as rapidly as possible, so that we do not risk people being exposed to harmful or dangerous content.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The shadow Minister has set out the purpose of the clauses, which provide for, in clause 138, appeal rights for decisions relating to registration under clause 81 and, in clause 139, appeals against Ofcom notices.

I agree that it is important that judicial decisions in this area get made quickly. I note that the appeals go directly to the relevant upper tribunal, which is a higher tier of the tribunal system and tends to be a little less congested than the first-tier tribunal, which often gets used for some first-instance matters. I hope that appeals going to the upper tribunal, directly to that more senior level, will provide some comfort.

On putting in a time limit, the general principle is that matters concerning listing are reserved to the judiciary. I recall from my time as a Minister in the Ministry of Justice that the judiciary guards its independence fiercely. Whether it is the Senior President of Tribunals or the Lord Chief Justice, they consider listing matters to be the preserve of the judiciary, not the Executive or the legislature. Compelling the judiciary to hear a case in a certain time might well be considered to infringe on such principles.

We can agree, however—I hope the people making those listing decisions hear that we believe, that Parliament believes—that it is important to do this quickly, in particular where there is a risk of harm to individuals. Where there is risk to individuals, especially children, but more widely as well, those cases should be heard very expeditiously indeed.

The hon. Member for Worsley and Eccles South also asked about the basis on which appeals might be made and decided. I think that is made fairly clear. For example, clause 139(3) makes it clear that, in deciding an appeal, the upper tribunal will use the same principles as would be applied by the High Court to an application for judicial review—so, standard JR terms—which in the context of notices served or decisions made under clause 103 might include whether the power had been exercised in conformity with statute. If the power were exercised or purported to be exercised in a manner not authorised by statute, that would be one ground for appeal; or if a decision were considered so grossly unreasonable that no reasonable decision maker could make it, that might be grounds for appeal as well.

I caution the Committee, however: I am not a lawyer and my interpretation of judicial review principles should not be taken as definitive. Lawyers will advise their clients when they come to apply the clause in practice and they will not take my words in Committee as definitive when it comes to determining “standard judicial review principles”—those are well established in law, regardless of my words just now.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

There is a concern that platforms might raise appeals about their categorisation in order to delay the start of their duties under the Bill. How would the Minister act if that happened—if a large number of appeals were pending and the duties under the Bill therefore did not commence?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Clearly, resourcing of the upper tribunal is a matter decided jointly by the Lord Chancellor and the Secretary of State for Justice, in consultation with the Lord Chief Justice, and, in this case, the Senior President of Tribunals. Parliament would expect the resourcing of that part of the upper tribunal to be such that cases could be heard in an expedited manner. Particularly where cases concern the safety of the public—and particularly of children—we expect that to be done as quickly as it can.

Question put and agreed to.

Clause 138 accordingly ordered to stand part of the Bill.

Clause 139 ordered to stand part of the Bill.

Clause 140

Power to make super-complaints

--- Later in debate ---
Barbara Keeley Portrait Barbara Keeley
- Hansard - -

The Bill currently specifies that super-complaints can be made to Ofcom by bodies representing users or members of the public. The addition of consumer representatives through the amendments is important. Consumer representatives are a key source of information about the widespread harms to users of the online services that would be regulated by this legislation. We support the amendments, which would add consumers to the list of entities eligible to make super-complaints.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Clearly, we want the super-complaint function to be as effective as possible and for groups of relevant people, users or members of the public to be able to be represented by an eligible entity to raise super-complaints. I believe we are all on the same page in wanting to do that. If I am honest, I am a little confused as to what the addition of the term “consumers” will add. The term “users” is defined quite widely, via clause 140(6), which then refers to clause 181, where, as debated previously, a “user” is defined widely to include anyone using a service, whether registered or not. So if somebody stumbles across a website, they count as a user, but the definition being used in clause 140 about bringing super-complaints also includes “members of the public”—that is, regular citizens. Even if they are not a user of that particular service, they could still be represented in bringing a complaint.

Given that, by definition, “users” and “members of the public” already cover everybody in the United Kingdom, I am not quite sure what the addition of the term “consumers” adds. By definition, consumers are a subset of the group “users” or “members of the public”. It follows that in seeking to become an eligible entity, no eligible entity will purport to act for everybody in the United Kingdom; they will always be seeking to define some kind of subset of people. That might be children, people with a particular vulnerability or, indeed, consumers, who are one such subset of “members of the public” or “users”. I do not honestly understand what the addition of the word “consumers” adds here when everything is covered already.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I beg to move amendment 77, in clause 140, page 121, line 9, leave out subsection (2).

This amendment removes the test that complaints have to be of particular importance in order to be admissible.

When I first read clause 140, subsection (2) raised a significant number of red flags for me. The subsection might be reasonable if we did not have giant companies—social media platforms particularly—that significant numbers of people across the UK use regularly. Facebook might be counted as a “single regulated service”, but 85% of UK residents—57.1 million people—had a Facebook account earlier this year. Twitter is used by 28% of people living in the UK, which is 19 million users. TikTok is at 19%, which is significantly less, but still a very high number of people—13 million users. I can understand the decision that a super-complaint picking on one certain company might be a bit extreme, but it does not make sense when we are considering the Facebooks of this world.

If someone is making a complaint about a single regulated service and that service is Facebook, Twitter, TikTok or another large platform—or a new, yet-to-be-created platform—that significant numbers of people use, there is no justification for treating that complaint differently just because it is against a single entity. When a complaint is made against Facebook—I am picking on Facebook because 85% of the UK public are members of it; it is an absolute behemoth—I would like there to be no delay in its being taken to Ofcom. I would like Ofcom not to have to check and justify that the complaint is “of particular importance”.

Subsection (2)(a) states that one of the tests of the complaint should be that it “is of particular importance” or, as subsection (2)(b) notes, that it

“relates to the impacts on a particularly large number of users of the service or members of the public.”

I do not understand what

“large number of users of the service”

would mean. Does a large number of the users of Facebook mean 50% of its users? Does it mean 10%? What is a large number? Is that in percentage terms, or is it something that is likely to impact 1 million people? Is that a large number? The second part—

“large number…of members of the public”—

is again difficult to define. I do not think there is justification for this additional hoop just because the complaint relates to a single regulated service.

Where a complaint relates to a very small platform that is not causing significant illegal harm, I understand that Ofcom may want to consider whether it will accept, investigate and give primacy and precedence to that. If the reality is that the effect is non-illegal, fairly minor and impacts a fairly small number of people, in the order of hundreds instead of millions, I can understand why Ofcom might not want to give that super-complaint status and might not want to carry out the level of investigation and response necessary for a super-complaint. But I do not see any circumstances in which Ofcom could justify rejecting a complaint against Facebook simply because it is a complaint against a single entity. The reality is that if something affects one person on Facebook, it will affect significantly more than one person on Facebook because of Facebook’s absolutely massive user base. Therefore this additional hoop is unrealistic.

Paragraph (a), about the complaint being “of particular importance”, is too woolly. Does it relate only to complaints about things that are illegal? Does it relate only to things that are particularly urgent—something that is happening now and that is having an impact today? Or is there some other criterion that we do not yet know about?

I would very much appreciate it if the Minister could give some consideration to amendment 77, which would simply remove subsection (2). If he is unwilling to remove that subsection, I wonder whether we could meet halfway and whether, let us say, category 1 providers could all be excluded from the “single provider” exemption, because they have already been assessed by Ofcom to have particular risks on their platforms. That group is wider than the three names that I have mentioned, and I think that that would be a reasonable and realistic decision for the Government—and direction for Ofcom—to take. It would be sensible.

If the Government believe that there is more information—more direction—that they could add to the clause, it would be great if the Minister could lay some of that out here and let us know how he intends subsection (2) to operate in practice and how he expects Ofcom to use it. I get that people might want it there as an additional layer of protection, but I genuinely do not imagine that it can be justified in the case of the particularly large providers, where there is significant risk of harm happening.

I will illustrate that with one last point. The Government specifically referred earlier to when Facebook—Meta—stopped proactively scanning for child sexual abuse images because of an issue in Europe. The Minister mentioned the significant amount of harm and the issues that were caused in a very small period. And that was one provider—the largest provider that people use and access. That massive amount of harm can be caused in a very small period. I do not support allowing Meta or any other significantly large platform to have a “get out of jail” card. I do not want them to be able to go to Ofcom and say, “Hey, Ofcom, we’re challenging you on the basis that we don’t think this complaint is of particular importance” or “We don’t think the complaint relates to the impacts on a particularly large number of users of the service or members of the public.” I do not want them to have that ability to wriggle out of things because this subsection is in the Bill, so any consideration that the Minister could give to improving clause 140 and subsection (2) would be very much appreciated.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

We support the SNP’s amendment 77, moved by the hon. Member for Aberdeen North. The super-complaints mechanism introduced by clause 140 is a useful device for reporting numerous, widespread concerns about the harm caused by multiple or single services or providers. Subsection (1) includes the conditions on the subjects of super-complaints, which can relate to one or more services. However, as the hon. Member has pointed out, that is caveated by subsection (2), under which a super-complaint that refers to a single service or provider must prove, as she has just outlined, that it is “of particular importance” or

“relates to the impacts on a particularly large number of users of the service or members of the public.”

Given the various hoops through which a super-complaint already has to jump, it is not clear why the additional conditions are needed. Subsection (2) significantly muddies the waters and complicates the provisions for super-complaints. For instance, how does the Minister expect Ofcom to decide whether the complaint is of particular importance? What criteria does he expect the regulator to use? Why include it as a metric in the first place when the super-complaint has already met the standards set out in subsection (1)?

Online Safety Bill (Fourteenth sitting)

Barbara Keeley Excerpts
Committee stage
Tuesday 21st June 2022

(1 year, 11 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 21 June 2022 - (21 Jun 2022)
None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Amendment 153, in clause 141, page 121, line 32, after “140” insert

“, which must include the requirement that OFCOM must respond to such complaints within 90 days”

Clauses 141 and 142 stand part.

Barbara Keeley Portrait Barbara Keeley (Worsley and Eccles South) (Lab)
- Hansard - -

Good afternoon, Ms Rees. The importance of an effective complaints procedure has been argued strongly by many people who have given oral and written evidence to this Committee and indeed by Committee members. It is welcome that clause 140 introduces a super-complaints mechanism to report multiple, widespread concerns about the harm caused by services, but the lack of redress for individuals has been raised repeatedly.

This is a David and Goliath situation, with platforms holding all the power, while individuals are left to navigate the often complex and underfunded internal complaints systems provided by the platforms. This is what the London School of Economics and Political Science has called the

“current imbalance between democratic, ‘people’ power and the power of platforms.”

As we argued on new clause 1, there is a clear need to consider a route for redress at an individual level. The current situation is unsatisfactory for people who feel they have been failed by a service’s complaints system and who find themselves with no source of redress.

The current situation is also unsatisfactory for the regulator. Kevin Bakhurst from Ofcom told the right hon. Member for Basingstoke during our evidence sessions:

“Those individual complaints, although we are not going to be very specific in looking at individual pieces of material per se, are very useful to alert us where there are issues around particular types of offence or harm that the platforms are not seen to be dealing with properly.”––[Official Report, Online Safety Public Bill Committee, 24 May; c.9-10, Q9.]

An external redress process was recommended by the Joint Committee on the draft Bill and has been suggested by multiple stakeholders. Our new clause would make sure that we find the best possible solution to the problem. I hope the Minister reconsiders these points and supports new clause 1 when the time comes to vote on it.

As I have argued previously, organisations will not be able to make full and effective use of the super-complaints system unless the platforms’ risk assessments are published in full. The Opposition’s amendments 11 and 13 sought to address that issue, and I am disappointed that the Government failed to grasp their importance. There is now a real risk that civil society and other groups will not be able to assess and identify the areas where a company may not be meeting its safety duties. How does the Minister expect organisations making super-complaints to identify and argue that a service is causing harm to its users if they have no access to the company’s own analysis and mitigation strategy? Not including a duty to publish risk assessments leaves a gaping hole in the Bill and risks undermining the super-complaints mechanism. I hope that the Minister will reconsider his opposition to this important transparency mechanism in future stages of the Bill.

For powers about super-complaints to be meaningful, there must be a strict deadline for Ofcom to respond to them, and we will support the SNP amendment if it is pushed to a vote. The Enterprise Act 2002 gives a 90-day deadline for the Competition and Markets Authority to respond. Stakeholders have suggested a similar deadline for responding to super-complaints as an effective mechanism to ensure action from the regulator. I urge the Minister to consider this addition, either in the Bill with this amendment, or in the secondary legislation that the clause requires.

Clauses 141 and 142 relate to the structures around super-complaints. Clause 141 appears to be more about handing over powers to the Secretary of State than ensuring a fair system of redress. The Opposition have said repeatedly how we feel about the powers being handed over to the Secretary of State. Clause 142 includes necessary provisions on the creation and publication of guidance by Ofcom, which we do not oppose. Under clause 141, Ofcom will have to provide evidence of the validity of the super-complaint and the super-complainant within a stipulated timeframe. However, there is little in the Bill about what will happen when a super-complaint is made, and much of the detail on how that process will work has been left to secondary legislation.

Does the Minister not think that it is strange to leave it up to the Secretary of State to determine how Ofcom is to deal with super-complaints? How does he envisage the system working, and what powers does he think Ofcom will need to be able to assert itself in relation to super-complaints? It seems odd to leave the answers to those important questions out of the Bill.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - - - Excerpts

I appreciate the support from the Opposition in relation to amendment 153. I want to talk about amendment 153, but also about some of the issues there are with clauses 140 and 141—not so much 142. Clause 140(3) allows the Secretary of State to make regulations in relation to working out who an eligible entity is for making super-complaints. The Minister has helpfully been very clear that the definition is likely to be pretty wide—the definition of groups that are working on behalf of consumers is likely to be wide. The regulations that are made in this section are going to be made under the draft affirmative procedure. Although secondary legislation is not brilliant, the affirmative procedure will allow more scrutiny than the negative procedure. I appreciate that the Minister has chosen—or the people drafting the Bill have chosen—that way forward for deciding on the eligible entity.

I am concerned that, when it comes to clause 141(1), the regulations setting out how the complaints process will work—the regulation-level detail—will be made under the negative procedure rather than under the draft affirmative procedure. I have the Delegated Powers and Regulatory Reform Committee memorandum, which tells us about each of the delegated powers of the Bill, and the justification for them. I understand that the Department is referring to the Police Super-complaints (Designation and Procedure) Regulations 2018, which were made under the negative procedure. However, I am not convinced that in the Policing and Crime Act 2017 we were left with quite so little information about what would be included in those complaints. I think the justification for the negative procedure is not great, especially given the concerns raised about the over-reach of the Secretary of State’s power and the amount of influence they have on Ofcom.

I think clause 142 is fine; it makes sense that Ofcom is able to make guidance. I would have liked to see the regulation part involve more input from parliamentarians. If there is not going to be more input from parliamentarians, there should at least be more in the Bill about how the complaints procedure would work. The reason we have tabled amendment 153 is to ensure that Ofcom provides a response. That response does not have to be a final response saying, “We have investigated everything and these are the findings.” I understand that that may take some time. However, Ofcom must provide a response to super-complainants within 90 days. Even if it were to provide that information in the terms laid out in clause 141(2)(d)—whether a complaint is within clause 140, or is admissible under clause 140, or whether an entity is an eligible entity—and we were to commit Ofcom to provide that information within 90 days, that would be better than the current drafting, which contains no time limits at all. It is not specified: the Bill does not say that Ofcom has to deal with the complaint within a certain length of time.

A quick response from Ofcom is important for a number of reasons. I expect that those people who are bringing super-complaints are likely to be third sector organisations. Such organisations do not have significant or excessive budgets. They will be making difficult choices about where to spend their money. If they are bringing forward a super-complaint, they will be doing it on the basis that they think it is incredibly important and it is worth spending their finite funding on legal advice in order to bring forward that super-complaint. If there is an unnecessary delay before Ofcom even recognises whether the complaint is eligible, charities may spend money unnecessarily on building up a further case for the next stages of the super-complaint. They should be told very quickly, “No, we are not accepting this” or “Yes, we are accepting this”.

Ofcom has the ability to levy fees so that it can provide the service that we expect it to provide as a result of the Bill. It will have a huge amount of extra work compared with its current work. It needs to be able to levy fees in order to fulfil its functions. If there is no timeline and it says, “We want to levy fees because we want to be able to respond on a 90-day basis”, it would not be beyond companies to come back and say, “That is unrealistic—you should not be charging us extra fees in order for you to have enough people to respond within a 90-day period to super-complaints.”

If Ofcom is to be able to levy fees effectively to provide the level of service that we would all—including, I am sure, the Minister—like to see to super-complainants who are making very important cases on behalf of members of the public and people who are being harmed by content online, and to give Ofcom that backing when it is setting the structures and levying the fees, it would be sensible for the Minister to make some commitments about the timelines for super-complaints.

In earlier clauses of the Bill, primacy is given to complaints to social media platforms, for example—to regulated providers—about freedom of speech. The Bill says that they are to give such complaints precedence. They are to deal with them as important and, where some content has been taken down, quickly. That precedence is written into the Bill. Such urgency is not included in these three clauses on super-complaints in the way I would like to see. The Bill should say that Ofcom has to deal with super-complaints quickly. I do not mean it should do that by doing a bad job. I mean that it should begin to investigate quickly, work out whether it is appropriate to investigate it under the super-complaints procedure, and then begin the investigation.

In some cases, stuff will be really urgent and will need to be dealt with very quickly, especially if, for example, it includes child sexual abuse images. That would need to be dealt with in a matter of hours or days, rather than any longer period.

I would like to see some sort of indication given to Ofcom about the timelines that we are expecting it to work to. Given the amount of work that third sector organisations have put in to support this Bill and try to make it better, this is a fairly easy amendment for the Minister to accede to—an initial response by Ofcom within a 90-day period; we are not saying overnight—so that everyone can be assured that the internet is, as the Minister wishes, a much safer place.

--- Later in debate ---
None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Clauses 151 to 155 stand part.

Clause 157 stand part.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

Part 10 of the Bill sets out three new offences involving harmful, false or threatening communications. Clause 156 includes a new offence of cyber-flashing, to which my hon. Friend the Member for Pontypridd will speak shortly.

For many years, charities have been calling for an update to the offences included in the Malicious Communications Act 1988 and the Communications Act 2003. Back in 2018, the Law Commission pointed out that using the criminal law to deal with harmful online conduct was hindered by several factors, including limited law enforcement capacity to pursue the scale of abusive communications, what the commission called a “persistent cultural tolerance” of online abuse, and difficulties in striking a balance between protecting people from harm and maintaining rights of freedom of expression—a debate that we keep coming to in Committee and one that is still raging today. Reform of the legislation governing harmful online communications is welcome—that is the first thing to say—but the points laid out by the Law Commission in 2018 still require attention if the new offences are to result in the reduction of harm.

My hon. Friend the Member for Batley and Spen spoke about the limited definition of harm, which relates to psychological harm but does not protect against all harms resulting from messages received online, including those that are physical. We also heard from the hon. Member for Ochil and South Perthshire about the importance of including an offence of encouraging or assisting self-harm, which we debated last week with schedule 7. I hope that the Minister will continue to consider the merits of new clause 36 when the time comes to vote on it.

Those are important improvements about what should constitute an offence, but we share the concerns of the sector about the extent to which the new offences will result in prosecution. The threshold for committing one of the offences in clause 150 is high. When someone sends the message, there must be

“a real and substantial risk that it would cause harm to a likely audience”,

and they must have

“no reasonable excuse for sending the message.”

The first problem is that the threshold of having to prove the intention to cause distress is an evidential threshold. Finding evidence to prove intent is notoriously difficult. Professor Clare McGlynn’s oral evidence to the Committee was clear:

“We know from the offence of non-consensual sending of sexual images that it is that threshold that limits prosecutions, but we are repeating that mistake here with this offence.”

Professor McGlynn highlighted the story of Gaia Pope. With your permission, Ms Rees, I will make brief reference to it, in citing the evidence given to the Committee. In the past few weeks, it has emerged that shortly before Gaia Pope went missing, she was sent indecent images through Facebook, which triggered post-traumatic stress disorder from a previous rape. Professor McGlynn said:

“We do not know why that man sent her those images, and I guess my question would be: does it actually matter why he sent them? Unfortunately, the Bill says that why he sent them does matter, despite the harm it caused, because it would only be a criminal offence if it could be proved that he sent them with the intention of causing distress or for sexual gratification and being reckless about causing distress.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 58, Q105.]

The communications offences should be grounded upon consent rather than the motivation of the perpetrator. That is a clear omission in the Bill, which my hon. Friend the Member for Pontypridd will speak more about in relation to our amendments 41 and 42 to clause 156. The Government must act or risk missing a critical opportunity to tackle the harms resulting from communications offences.

We then come to the problem of the “reasonable excuse” defence and the “public interest” defence. Clause 150(5) sets out that the court must consider

“whether the message is, or is intended to be, a contribution to a matter of public interest”.

The wording in the clause states that this should not “determine the point”. If that is the case, why does the provision exist? Does the Minister recognise that there is a risk of the provision being abused? In response to a question from the hon. Member for Aberdeen North, the Minister previously said:

“Clause 150…does not give a get-out-of-jail-free card”.––[Official Report, Online Safety Public Bill Committee, 7 June 2022; c. 275.]

Could he lay out what the purpose of this “matter of public interest” defence is? Combined with the reasonable excuse defence in subsection (1), the provisions risk sending the wrong message when it comes to balancing harms, particularly those experienced by women, of which we have already heard some awful examples.

There is a difference in the threshold of harm between clause 150, on harmful communications offences, and clause 151, on false communications offences. To constitute a false communications offence, the message sender must have

“intended the message, or the information in it, to cause non-trivial psychological or physical harm to a likely audience”.

To constitute a harmful communications offence, the message sender must have

“intended to cause harm to a likely audience”

and there must have been

“a real and substantial risk that it would cause harm to a likely audience”.

Will the Minister set out the Government’s reasoning for that distinction? We need to get these clauses right because people have been let down by inadequate legislation and enforcement on harmful online communications offences for far too long.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Let me start by saying that many of these clauses have been developed in careful consultation with the Law Commission, which has taken a great deal of time to research and develop policy in this area. It is obviously quite a delicate area, and it is important to make sure that we get it right.

The Law Commission is the expert in this kind of thing, and it is right that the Government commissioned it, some years ago, to work on these provisions, and it is right that, by and large, we follow its expert advice in framing these offences, unless there is a very good reason not to. That is what we have done—we have followed the Law Commission’s advice, as we would be expected to do. The clauses replace previous offences—for example, those in the Malicious Communications Act 1988—and update and improve those provisions in the form we see them in the Bill.

The shadow Minister, the hon. Member for Worsley and Eccles South, asked a number of questions about the drafting of the clauses and the thresholds that have to be met for an offence to be committed. We are trying to strike a balance between criminalising communications that deserve to be criminalised and not criminalising communications that people would consider should fall below the criminal threshold. There is obviously a balance to strike in doing that. We do not want to infringe free speech by going too far and having legitimate criticism and debate being subject to criminal sanctions. There is a balance to strike here between, on the one hand, public protection and where the criminal law sits and, on the other hand, free speech and people expressing themselves. That is why clause 150 is constructed as it is, on the advice of the Law Commission.

As the hon. Member set out, the offence is committed only where there is a “real and substantial risk” that the likely audience would suffer harm. Harm is defined as

“psychological harm amounting to at least serious distress.”

Serious distress is quite a high threshold—it is a significant thing, not something trivial. It is important to make that clear.

The second limb is that there is an intention to cause harm. Intention can in some circumstances be difficult to prove, but there are also acts that are so obviously malicious that there can be no conceivable motivation or intention other than to cause harm, where the communication is so obviously malfeasant. In those cases, establishing intent is not too difficult.

In a number of specific areas, such as intimate image abuse, my right hon. Friend the Member for Basingstoke and others have powerfully suggested that establishing intent is an unreasonably high threshold, and that the bar should be set simply at consent. For the intimate image abuse offence, the bar is set at the consent level, not at intent. That is being worked through by the Law Commission and the Ministry of Justice, and I hope that it will be brought forward as soon as possible, in the same way as the epilepsy trolling offence that we discussed a short while ago. That work on intimate image abuse is under way, and consent, not intent, is the test.

For the generality of communications—the clause covers any communications; it is incredibly broad in scope—it is reasonable to have the intent test to avoid criminalising what people would consider to be an exercise of free speech. That is a balance that we have tried to strike. The intention behind the appalling communications that we have heard in evidence and elsewhere is clear: it is inconceivable that there was any other motivation or intention than to cause harm.

There are some defences—well, not defences, but conditions to be met—in clause 150(1)(c). The person must have “no reasonable excuse”. Subsection (5) makes it clear that

“In deciding whether a person has a reasonable excuse…one of the factors that a court must consider (if it is relevant in a particular case) is whether the message is, or is intended to be, a contribution to a matter of public interest (but that does not determine the point)”

of whether there is a reasonable excuse—it simply has to be taken into account by the court and balanced against the other considerations. That qualification has been put in for reasons of free speech.

There is a delicate balance to strike between criminalising what should be criminal and, at the same time, allowing reasonable free speech. There is a line to draw, and that is not easy, but I hope that, through my comments and the drafting of the clause, the Committee will see that that line has been drawn and a balance struck in a carefully calibrated way. I acknowledge that the matter is not straightforward, but we have addressed it with advice from the Law Commission, which is expert in this area. I commend clause 150 to the Committee.

The other clauses in this group are a little less contentious. Clause 151 sets out a new false communication offence, and I think it is pretty self-explanatory as drafted. The threatening communications offence in clause 152 is also fairly self-explanatory—the terms are pretty clear. Clause 153 contains interpretative provisions. Clause 154 sets out the extra-territorial application, and clause 155 sets out the liability of corporate officers. Clause 157 repeals some of the old offences that the new provisions replace.

Those clauses—apart from clause 150—are all relatively straightforward. I hope that, in following the Law Commission’s advice, we have struck a carefully calibrated balance in the right place.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I would like to take the Minister back to the question I asked about the public interest defence. There is a great deal of concern that a lot of the overlaying elements create loopholes. He did not answer specifically the question of the public interest defence, which, combined with the reasonable excuse defence, sends the wrong message.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The two work together. On the reasonable excuse condition, for the offence to have been committed, it has to be established that there was no reasonable excuse. The matter of public interest condition—I think the hon. Lady is referring to subsection (5)—simply illustrates one of the ways in which a reasonable excuse can be established, but, as I said in my remarks, it is not determinative. It does not mean that someone can say, “There is public interest in what I am saying,” and they automatically have a reasonable excuse—it does not work automatically like that. That is why in brackets at the end of subsection (5) it says

“but that does not determine the point”.

That means that if a public interest argument was mounted, a magistrate or a jury, in deciding whether the condition in subsection (1)(c)—the “no reasonable excuse” condition—had been met, would balance the public interest argument, but it would not be determinative. A balancing exercise would be performed. I hope that provides some clarity about the way that will operate in practice.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

That was about as clear as mud, actually, but let us leave it there.

Question put and agreed to.

Clause 150 accordingly ordered to stand part of the Bill.

Clauses 151 to 155 ordered to stand part of the Bill.

Clause 156

Sending etc photograph or film of genitals

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I beg to move amendment 41, in clause 156, page 131, line 15, at end insert—

“(za) B has not consented for A to share the photograph or film with B, or”.

This amendment makes it an offence to send an image of genitals to another person if the recipient has not given consent to receive the image.

--- Later in debate ---
None Portrait The Chair
- Hansard -

With this it will be convenient to discuss schedule 13.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

We have argued that changes to the legislation are long overdue to protect people from the harms caused by online communications offences. The clause and schedule 13 include necessary amendments to the legislation, so we do not oppose them standing part of the Bill.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The clause cross-references schedule 13 and sets out amendments to existing legislation consequential on the communications offences in part 10. Schedule 13 has a number of consequential amendments, divided broadly into two parts. It makes various changes to the Sexual Offences Act 2003, amends the Regulatory Enforcement and Sanctions Act 2008 in relation to the Malicious Communications Act 1988, and makes various other changes, all of which are consequential on the clauses we have just debated. I therefore commend clause 158 and its associated schedule 13 to the Committee.

Question put and agreed to.

Clause 158 accordingly ordered to stand part of the Bill.

Schedule 13 agreed to.

Clause 159

Providers that are not legal persons

Question proposed, That the clause stand part of the Bill.

Online Safety Bill (Fifteenth sitting)

Barbara Keeley Excerpts
Committee stage
Thursday 23rd June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 23 June 2022 - (23 Jun 2022)
Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - - - Excerpts

It is a pleasure to serve under your chairmanship, Sir Roger. Clause 168 is a very short and straightforward clause. Ofcom will be required to publish a variety of documents under the Online Safety Bill. The clause simply requires that this be done in a way that is appropriate and likely to bring it to the attention of any audience who are going to be affected by it. Ofcom is already familiar with this type of statutory obligation through existing legislation, such as the Digital Economy Act 2017, which places similar obligations on Ofcom. Ofcom is well versed in publishing documents in a way that is publicly accessible. Clause 168 puts the obligation on to a clear statutory footing.

Barbara Keeley Portrait Barbara Keeley (Worsley and Eccles South) (Lab)
- Hansard - -

As the Minister said, clause 168 rightly sets out that the material the Bill requires Ofcom to publish must be published in a way that will bring it to the attention of any audience likely to be affected by it. It will be important that all the guidance is published in a way that is easily available and accessible, including for people who are not neurotypical or who experience digital exclusion. I think we would all agree, after the work we have done on the Bill, that the subjects are complex and the landscape is difficult to understand. I hope Ofcom will make its documents as accessible as possible.

Question put and agreed to.

Clause 168 accordingly ordered to stand part of the Bill.

Clause 169

Service of notices

Question proposed, That the clause stand part of the Bill.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Clause 169 sets out the process for the service of any notice under the Bill, including notices to deal with child sexual exploitation and abuse or terrorism content, information notices, enforcement notices, penalty notices and public statement notices to providers of regulated services both within and outside the United Kingdom. The clause sets out that Ofcom may give a notice to a person by handing it to them, leaving it at the person’s last known address, sending it by post to that address or sending it by email to the person’s email address. It provides clarity regarding who Ofcom must give notice to in respect of different structures. For example, notice may be given to an officer of a body corporate.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

As the Minister said, clause 169 sets out the process of issuing notices or decisions by Ofcom. It mostly includes provisions about how Ofcom is to contact the company, which seem reasonable. The Opposition do not oppose clause 169.

Question put and agreed to.

Clause 169 accordingly ordered to stand part of the Bill.

Clause 170

Repeal of Part 4B of the Communications Act

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to consider clauses 171 and 172.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Clause 170 repeals the video-sharing platform regime. While the VSP and online safety regimes have similar objectives, the new framework in the Bill will be broader and will apply to a wider range of online platforms. It is for this reason that we will repeal the VSP regime and transition those entities regulated as VSPs across to the online safety regime, which is broader and more effective in its provisions. The clause simply sets out the intention to repeal the VSP regime.

Clause 171 repeals part 3 of the Digital Economy Act 2017. As we have discussed previously, the Online Safety Bill now captures all online sites that display pornography, including commercial pornography sites, social media sites, video sharing platforms, forums and search engines. It will provide much greater protection to children than the Digital Economy Act. The Digital Economy Act was criticised for not covering social media platforms, which this Bill does cover. By removing that section from the Digital Economy Act, we are laying the path to regulate properly and more comprehensively.

Finally, in this group, clause 172 amends section 1B of the Protection of Children Act 1978 and creates a defence to the offence of making an indecent photograph of a child for Ofcom, its staff and those assisting Ofcom in exercising its online safety duties. Clearly, we do not want to criminalise Ofcom staff while they are discharging their duties under the Bill that we are imposing on them, so it is reasonable to set out that such a defence exists. I hope that provides clarity to the Committee on the three clauses.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

The provisions in clauses 170 to 172, as the Minister has said, repeal or amend existing laws for the purposes of the Bill. As Labour supports the need to legislate on the issue of online safety, we will not oppose the clauses. However, I want to note that the entire process, up until the final abandonment of part 3 of the Digital Economy Act under clause 171, appears shambolic. It has been five years now since that part of the Act could have been implemented, which means five years during which children could have been better protected from the harms of pornographic content.

When the Government eventually admitted that part 3 was being ditched, the Minister at the time, the hon. Member for Boston and Skegness (Matt Warman), said that the Government would seek to take action on pornography more quickly than on other parts of the online harms regime. Stakeholders and charities have expressed concerns that we could now see a delay to the implementation of the duties on pornographic content providers, which is similar to the postponement and eventual abandonment of part 3 of the Digital Economy Act. I appreciate that the Minister gave some reassurance of his

“desire to get this done as quickly as possible”—[Official Report, Online Safety Bill Committee, 9 June 2022; c. 308.]

in our debate on clauses 31 to 33, but would it not be better to set out timeframes in the Bill?

Under clause 193, it appears that the only clauses in part 5 to be enacted once the Bill receives Royal Assent will be the definitions—clause 66 and clause 67(4)—and not the duties. That is because Ofcom is expected to issue a call for evidence, after which draft proposals for consultation are published, which then need to be agreed by the Secretary of State and laid before Parliament. There are opportunities there for delays and objections at any stage and, typically, enforcement will be implemented only in a staged fashion, from monitoring to supervision. The consultations and safeguarding processes are necessary to make the guidance robust; we understand that. However, children cannot wait another three years for protections, having been promised protection under part 3 of the Digital Economy Act five years ago, which, as I have said, was never implemented.

The provisions on pornography in part 5 of the Bill require no secondary legislation so they should be implemented as quickly as possible to minimise the amount of time children continue to be exposed to harmful content. It would be irresponsible to wait any longer than absolutely necessary, given the harms already caused by this drawn-out process.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - - - Excerpts

Thank you, Sir Roger, for chairing this meeting this morning. I want to agree with the Opposition’s points about the timing issue. If an Act will repeal another one, it needs to make sure that there is no gap in the middle and, if the repeal takes place on one day, that the Bill’s provisions that relate to that are in force and working on the same day, rather than leaving a potential set-up time gap.

On clause 170 and repealing the part of the Communications Act 2003 on video-sharing platform services, some concerns have been raised that the requirements in the Online Safety Bill do not exactly mirror the same provisions in the video-sharing platform rules. I am not saying necessarily or categorically that the Online Safety Bill is less strong than the video-sharing platform rules currently in place. However, if the legislation on video-sharing platform services is repealed and the Online Safety Act, as it will be, becomes the main way of regulating video-sharing platforms, there is a concern that there could be a degradation in the protections provided on those platforms and an increase in some of the issues and concerns we have seen raised. Will the Minister keep that under review and consider how that could be improved? We do not want to see this getting worse simply because one regime has been switched for another that, as the Minister said, is broader and has stronger protections. Will he keep under review whether that turns out to be the case when the Act has bedded in, when Ofcom has the ability to take action and properly regulate—particularly, in this case, video-sharing platforms?

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

So far as I am aware, all the costs incurred by Ofcom in relation to the duties in the Bill can be recouped by way of fees. If that is not correct, I will write to the hon. Lady saying so, but my understanding is that any relevant Ofcom cost will be in the scope of the fees.

Question put and agreed to.

New clause 42 accordingly read a Second time, and added to the Bill.

New Clause 43

Payment of sums into the Consolidated Fund

“(1) Section 400 of the Communications Act (destination of penalties etc) is amended as follows.

(2) In subsection (1), after paragraph (i) insert—

‘(j) an amount paid to OFCOM in respect of a penalty imposed by them under Chapter 6 of Part 7 of the Online Safety Act 2022;

(k) an amount paid to OFCOM in respect of an additional fee charged under Schedule (Recovery of OFCOM’s initial costs) to the Online Safety Act 2022.’

(3) In subsection (2), after ‘applies’ insert ‘(except an amount mentioned in subsection (1)(j) or (k))’.

(4) After subsection (3) insert—

‘(3A) Where OFCOM receive an amount mentioned in subsection (1)(j) or (k), it must be paid into the Consolidated Fund of the United Kingdom.’

(5) In the heading, omit ‘licence’.”—(Chris Philp.)

This new clause provides that additional fees charged to providers under NS2 must be paid into the Consolidated Fund. The Bill already provided that monetary penalties must be paid into the Consolidated Fund, and those provisions are now placed in this clause.

Brought up, read the First and Second time, and added to the Bill.

New Clause 3

Establishment of Advocacy Body

“(1) There is to be a body corporate (‘the Advocacy Body’) to represent interests of child users of regulated services.

(2) A ‘child user’—

(a) means any person aged 17 years or under who uses or is likely to use regulated internet services; and

(b) includes both any existing child user and any future child user.

(3) The work of the Advocacy Body may include—

(a) representing the interests of child users;

(b) the protection and promotion of these interests;

(c) any other matter connected with those interests.

(4) The ‘interests of child users’ means the interest of children in relation to the discharge by any regulated company of its duties under this Act, including—

(a) safety duties about illegal content, in particular CSEA content;

(b) safety duties protecting children;

(c) ‘enforceable requirements’ relating to children.

(5) The Advocacy Body must have particular regard to the interests of child users that display one or more protected characteristics within the meaning of the Equality Act 2010.

(6) The Advocacy Body will be defined as a statutory consultee for OFCOM’s regulatory decisions which impact upon the interests of children.

(7) The Secretary of State may appoint an organisation known to represent children to be designated the functions under this Act, or may create an organisation to carry out the designated functions.”—(Barbara Keeley.)

This new clause creates a new advocacy body for child users of regulated internet services.

Brought up, and read the First time.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I beg to move, That the clause be read a Second time.

New clause 3 would make provision for a statutory user advocacy body representing the interests of children. It would also allow the Secretary of State to appoint a new or existing body as the statutory user advocate. A strong, authoritative and well-resourced voice that can speak for children in regulatory debates would ensure that complex safeguarding issues are well understood, and would also actively inform the regulator’s decisions.

Charities have highlighted that the complaints and reporting mechanisms in the Bill may not always be appropriate for children. Ofcom’s own evidence shows that only 14% of 12 to 15-year-old children have ever reported content. Children who are most at risk of online harms may find it incredibly challenging to complete a multi-stage reporting and complaints process. Dame Rachel de Souza told the Committee:

“I worry that the Bill does not do enough to respond to individual cases of abuse and that it needs to do more to understand issues and concerns directly from children. Children should not have to exhaust the platforms’ ineffective complaints routes, which can take days, weeks or even months. I have just conducted a survey of 2,000 children and asked them about their experiences in the past month. Of those 2,000 children, 50% had seen harmful content and 40% had tried to get content about themselves removed and had not succeeded. For me, there is something really important about listening to children and taking their complaints into account.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 16, Q22.]

A children’s advocacy body would be able to support children in navigating redress mechanisms that are fundamentally targeted at adults. Given how many children now use the internet, that is an essential element that is missing from the Bill. That is why the super-complaints mechanism needs to be strengthened with specific arrangements for children, as advocated by the National Society for the Prevention of Cruelty to Children and other children’s organisations. A statutory user advocacy body could support the regulator, as well as supporting child users. It would actively promote the interests of children in regulatory decision making and offer support by ensuring that an understanding of children’s behaviour and safeguarding is front and centre in its approach.

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - - - Excerpts

My hon. Friend is making a really valid point. As I look around the room—I mean this with no disrespect to anybody—I see that we are all of an age at which we do not understand the internet in the same way that children and young people do. Surely, one of the key purposes of the Bill is to make sure that children and young people are protected from harms online, and as the Children’s Commissioner said in her evidence, their voices have to be heard. I am sure that, like me, many Members present attend schools as part of their weekly constituency visits, and the conversations we have with young people are some of the most empowering and important parts of this job. We have to make sure that the voices of the young people who we all represent are heard in this important piece of legislation, and it is really important that we have an advocacy body to ensure that.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I very much agree with my hon. Friend. She is quite right: we have to remember that we do not see these things as children and young people do.

The user advocacy body that my hon. Friend has just spoken in support of could also shine a light on the practices that are most harmful to children by using data, evidence and specialist expertise to point to new and emerging areas of harm. That would enable the regulator to ensure its risk profiles and regulatory approach remain valid and up to date. In his evidence, Andy Burrows of the NSPCC highlighted the importance of an advocacy body acting as an early warning system:

“Given the very welcome systemic approach of the regime, that early warning function is particularly important, because there is the potential that if harms cannot be identified quickly, we will see a lag where whole regulatory cycles are missed. User advocacy can help to plug that gap, meaning that harms are identified at an earlier stage, and then the positive design of the process, with the risk profiles and company risk assessments, means that those harms can be built into that particular cycle.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 16, Q22.]

The provision in the new clause is comparable to those that already exist in many other sectors. For example, Citizens Advice is the statutory user advocate for consumers of energy and the postal services, and there are similar arrangements representing users of public transport. Establishing a children’s user advocacy body would ensure that the most vulnerable online users of all—children at risk of online sexual abuse—receive equivalent protections to customers of post offices or passengers on a bus.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - - - Excerpts

The hon. Lady will recall the issue that I raised earlier in the Committee’s deliberations, regarding the importance of victim support that gives people somewhere to go other than the platforms. I think that is what she is now alluding to. Does she not believe that the organisations that are already in place, with the right funding—perhaps from the fines coming from the platforms themselves—would be in a position to do this almost immediately, and that we should not have to set up yet another body, or have I misunderstood what she has said?

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I do not think that the right hon. Lady has misunderstood what I said. I said that the new clause would allow the Secretary of State to appoint a new or existing body as the statutory user advocate, so it could very much be either.

New clause 3 would also rebalance the interests of children against the vocal and well-resourced regulated companies. I think that is a key argument for having an advocacy body. Without such a counterbalance, large tech companies could attempt to capture independent expert voices, fund highly selective research with the intent to skew the evidence base, and then challenge regulatory decisions with the evidence base they have created.

Those tactics are not new; similar tactics are used in other regulated sectors, such as the tobacco industry. In line with other sectors, the user advocacy body should be funded by a levy on regulated companies. That would be in line with the “polluter pays” principle in part 6 and would be neutral to the Exchequer—another reason to accept it. Compared with the significant benefits and improved outcomes it would create, the levy would represent only a minimal additional burden on companies.

There is strong support for the creation of a user advocate. Research by the NSPCC shows that 88% of UK adults who responded to a YouGov survey think that it is necessary for the Bill to introduce a requirement for an independent body that can protect the interests of children at risk of online harms, including grooming and child sexual abuse.

It is also a popular option among children. YoungMinds has said that young people do not feel they are being included enough in the drafting of the Bill. It evidenced that with its own research, which found that almost 80% of young people aged 11 to 25 surveyed had never even heard of the Bill.

A young woman told the NSPCC why she felt a children’s advocacy body is needed. She is a survivor of online grooming, and it is worth sharing what she said in full, because it is powerful and we have not shared the voices of young people enough. She said:

“When I was 13, a man in his 30s contacted me on Facebook. I added him because you just used to add anyone on Facebook. He started messaging me and I liked the attention. We’d speak every day, usually late at night for hours at a time…He started asking for photos, so I sent some. Then he asked for some explicit photos, so I did that too, and he reciprocated…In my eyes, telling anyone in my life about this man was not an option. We need to stop putting the responsibility on a vulnerable child to prevent crime and start living in a world which puts keeping children safe first. That means putting child safety at the heart of policy. I want a statutory child user advocacy body funded by the industry levy. This would play a vital role in advocating for children’s rights in regulatory debates. Being groomed made me feel incredibly vulnerable, isolated, and weak. I felt I had no one who was on my side. Having a body stand up for the rights of children in such a vulnerable position is invaluable…it is so rare that voices like mine have a chance to be heard by policy makers. Watching pre legislative debates I’ve been struck by how detached from my lived experience they can be”—

that is very much the point that my hon. Friend the Member for Batley and Spen made—

“and indeed the lived experiences of thousands of others. If we want to protect children, we need to understand and represent what they need.”

I hope that the Committee will recognise the bravery of that young woman in speaking about her experiences as a survivor of online grooming. I hope that the Minister will respect the insights she offers and consider the merits of having a user advocacy body to support children and young people experiencing harms online.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I read new clause 3 in conjunction with the starred new clause 44, because it makes sense to consider the funding of the advocacy body, and the benefits of that funding, when discussing the merits of such a body. Part of that is because the funding of the advocacy body, and the fact that it needs to be funded, is key to its operation, and a key reason why we need it.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The new clause asks for an additional body. It is not saying the Children’s Commissioners should be done away with. The Children’s Commissioners do an amazing job, as we have recognised, but the No. 1 priority, certainly for the Children’s Commissioner in Scotland, is to protect the human rights of children; it is not to protect children online, which is what the user advocacy body would do. The body would specifically give the benefit of its experience and specifically use its resources, time and energy to advocate between Ofcom, children and children’s organisations and groups.

The Minister is right that the Bill takes massive steps forward in protecting children online, and he is right that the Children’s Commissioners do a very good job. The work done by the Children’s Commissioners in giving us evidence on behalf of children and children’s organisations has been incredibly powerful and incredibly helpful, but there is still a layer missing. If this Bill is to be future-proof, if it is to work and if it is not to put an undue burden on charitable organisations, we need a user advocacy body. The Minister needs to consider that.

I appreciate that the Government provide money to victim support organisations, which is great, but I am also making a case about potential victims. If the money only goes to those who support people who have already been harmed, it will not allow them to advocate to ensure that more people are not harmed. It will allow them to advocate on the behalf of those who have been harmed—absolutely—but it will not effectively tackle potential and emerging harms. It is a key place where the Bill misses out. I am quite disappointed that the Minister has not recognised that something may be lacking and is so keen to defend his position, because it seems to me that the position of the Opposition is so obviously the right one.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I wholeheartedly agree with what the hon. Member for Aberdeen North just said, but I wish to emphasise some elements because it seems to me that the Minister was not listening, although he has listened to much that has been said. I made some specific points, used quotes and brought forward some evidence. He feels that children have been consulted in the drafting of the Bill; I cited a YoungMinds survey that showed that that was very much not what young people feel. YoungMinds surveyed a large group of young people and a very large proportion of them had not even heard of the Bill.

The evidence of the young survivor of online grooming was very powerful. She very much wanted a user-advocacy body and spoke strongly about that. The Minister is getting it wrong if he thinks that somebody in that situation, who has been groomed, would go to a parent. The quote that I cited earlier was:

“Being groomed made me feel incredibly vulnerable, isolated, and weak. I felt I had no one who was on my side.”

There were clearly adults in her life she could have gone to, but she did not because she was in that vulnerable position—a position of weakness. That is why some kind of independent advocacy body for children is so important.

I do not think children and young people feel consulted about the Bill, because that is what the organisations and charities are telling us. I join all Opposition Members in supporting and paying tribute to the remarkable job that the Children’s Commissioner does. I quoted her setting out her worries about the Bill. I quoted her saying that

“the Bill does not do enough to respond to individual cases of abuse and that it needs to do more to understand issues and concerns directly from children.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 16, Q22.]

That is what she said. She did not say, “I’m the person charged with doing this. I’m the person who has the resource and my office has the resource.”

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

I hope that I did not in any way confuse the debate earlier, because these two things are very separate. The idea of a user-advocacy service and individual victim support are two separate issues. The Minister has already taken up the issue of victim support, which is what the Children’s Commissioner was talking about, but that is separate from advocacy, which is much broader and not necessarily related to an individual problem.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

Indeed, but the Children’s Commissioner was very clear about certain elements being missing in the Bill, as are the NSPCC and other organisations. It is just not right for the Minister to land it back with the Children’s Commissioner as part of her role, because she has to do so many other things. The provisions in the Bill in respect of a parent or adult assisting a young person in a grooming situation are a very big concern. The Children’s Commissioner cited her own survey of 2,000 children, a large proportion of whom had not succeeded in getting content about themselves removed. From that, we see that she understands that the problem exists. We will push the new clause to a Division.

Question put, That the clause be read a Second time.

Online Safety Bill (Sixteenth sitting)

Barbara Keeley Excerpts
Committee stage
Tuesday 28th June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 28 June 2022 - (28 Jun 2022)
None Portrait The Chair
- Hansard -

Good morning, ladies and gentlemen. Please be kind enough to make sure that your mobile phones are switched off.

New Clause 4

Duty to disclose information to OFCOM

“(1) This section sets out the duties to disclose information to OFCOM which apply in relation to all regulated user-to-user services.

(2) A regulated user-to-user service must disclose to OFCOM anything relating to that service of which that regulator would reasonably expect notice.

(3) This includes —

(a) any significant changes to its products or services which may impact upon its performance of its safety duties;

(b) any significant changes to its moderation arrangements which may impact upon its performance of its safety duties;

(c) any significant breaches in respect of its safety duties.”—(Barbara Keeley.)

This new clause creates a duty to disclose information to Ofcom.

Brought up, and read the First time.

Barbara Keeley Portrait Barbara Keeley (Worsley and Eccles South) (Lab)
- Hansard - -

I beg to move, That the clause be read a Second time.

Good morning, Sir Roger. The new clause would require regulated companies to disclose proactively to the regulator material changes in their operations that may impact on safety, and any significant breaches of their safety duties. Category 1 services should be under regulatory duties to disclose proactively to the regulator matters about which the regulator could reasonably expect to be informed. For example, companies should notify Ofcom about significant changes to their products and services, or to their moderation arrangements, that may impact on the child abuse threat and their response to it. A similar proactive duty already applies in the financial services sector. The Financial Conduct Authority handbook states:

“A firm must deal with its regulators in an open and cooperative way, and must disclose to the FCA appropriately anything relating to the firm of which that regulator would reasonably expect notice.”

The scope of the duty we are suggesting could be drawn with sufficient clarity so that social media firms properly understand their requirements and companies do not face unmanageable reporting burdens. Such companies should also be subject to red flag disclosure requirements, whereby they would be required to notify the regulator of any significant lapses in, or changes to, systems and processes that compromise children’s safety or could put them at risk. For example, if regulation had been in place over the last 12 months, Facebook might reasonably have been expected to report on the technology and staffing issues to which it attributes its reduced detection of child abuse content.

Experience from the financial services sector demonstrates the importance of disclosure duties as a means of regulatory intelligence gathering. Perhaps more importantly, they provide a useful means of hard-wiring regulatory compliance into company decisions on the design and operation of their sites.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - - - Excerpts

Thank you for chairing this meeting, Sir Roger. I have a quick question for the Minister that relates to the new clause, which is a reasonable request for a duty on providers to disclose information to Ofcom. We would hope that the regulator had access to that information, and if companies are making significant changes, it is completely reasonable that they should have to tell Ofcom.

I do not have any queries or problems with the new clause; it is good. My question for the Minister is—I am not trying to catch anyone out; I genuinely do not know the answer—if a company makes significant changes to something that might impact on its safety duties, does it have to do a new risk assessment at that point, or does it not have to do so until the next round of risk assessments? I do not know the answer, but it would be good if the direction of travel was that any company making drastic changes that massively affected security—for example, Snapchat turning on the geolocation feature when it did an update—would have to do a new risk assessment at that point, given that significant changes would potentially negatively impact on users’ safety and increase the risk of harm on the platform.

Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - - - Excerpts

It is a pleasure, as always, to serve under your chairmanship, Sir Roger. As the hon. Member for Worsley and Eccles South said, the new clause is designed to introduce a duty on providers to notify Ofcom of anything that Ofcom could reasonably be expected to be notified of.

The Bill already has extremely strong information disclosure provisions. I particularly draw the Committee’s attention to clause 85, which sets out Ofcom’s power to require information by provision of an information notice. If Ofcom provides an information notice—the particulars of which are set out in clause 86—the company has to abide by that request. As the Committee will recall, the strongest sanctions are reserved for the information duties, extending not only to fines of up to 10% of global revenue or service discontinuation—unplugging the website, as it were—but also to personal criminal liability for named executives, with prison sentences of up to two years. We take those information duties extremely seriously, which is why the sanctions are as strong as they are.

The hon. Member for Aberdeen North asked what updates would occur if there were a significant design change. I draw the Committee’s attention to clause 10, which deals with children’s risk assessment duties, but there are similar duties in relation to illegal content and the safety of adults. The duty set out in clause 10(2), which cross-refers to schedule 3, makes this clear: the relevant words are “suitable and sufficient”. Clearly, if there were a massive design change that would, in this case, adversely affect children, the risk assessment would not be suitable and sufficient if it were not updated to reflect that design change. I hope that answers the hon. Lady’s question.

Turning to the particulars of the new clause, if we incentivise companies to disclose information they have not been asked for by Ofcom, there is a danger that they might, through an excessive desire to comply, over-disclose and provide a torrent of information that would not be very helpful. There might also be a risk that some companies that are not well intentioned would deliberately dump enormous quantities of data in order to hide things within it. The shadow Minister, the hon. Member for Worsley and Eccles South, mentioned an example from the world of financial services, but the number of companies potentially within the scope of the Bill is so much larger than even the financial services sector. Some 25,000 companies may be in scope, a number that is much larger—probably by one order of magnitude, and possibly by two—than the financial services sector regulated by the FCA. That disparity in scale makes a significant difference.

Given that there are already strong information provision requirements in the Bill, particularly clause 85, and because of the reasons of scale that I have mentioned, I will respectfully resist the new clause.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

We believe that the platforms need to get into disclosure proactively, and that this is a reasonable clause, so we will push it to a vote.

Question put, That the clause be read a Second time.

--- Later in debate ---
Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I beg to move, That the clause be read a Second time.

Throughout these debates it has been clear that we agree on both sides that the Online Safety Bill must be a regime that promotes the highest levels of transparency. This will ensure that platforms can be held accountable for their systems and processes. Like other regulated industries, they must be open and honest with the regulator and the public about how their products work and how they keep users safe.

As we know, platforms duck and dive to avoid sharing information that could make life more difficult for them or cast them in a dim light. The Bill must give them no opportunity to shirk their responsibilities. The Bill enables the largest platforms to carry out a risk assessment safe in the knowledge that it may never see the light of day. Ofcom can access such information if it wants, but only following a lengthy process and as part of an investigation. This creates no incentive for platforms to carry out thorough and proper risk assessments. Instead, platforms should have to submit these risk assessments to Ofcom not only on request but as a matter of course. Limiting this requirement to only the largest platforms will not overload Ofcom, but will give it the tools and information it needs to oversee an effective regime.

In addition, the public have a right to know the risk profile of the services they use. This happens in all other regulated industries, with consumers having easy access to the information they need to make informed decisions about the products they use. At present, the Bill does not give users the information they deserve about what to expect online. Parents in particular will be empowered by information about the risk level of platforms their children use. Therefore, it is imperative that risk assessments are made publicly available, as well as submitted to the regulator as a matter of course.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I have a couple of comments on the point about parental empowerment. I have been asked by my children for numerous apps. I have a look at them and think, “I don’t know anything about this app. I have never seen or heard of it before, and I have no idea of the level of user-to-user functionality in this app.” Nowhere is there a requirement for this information to be set out. There is nowhere that parents can easily find this information.

With iPhones, if a kid wants an app, they have to request it from their parent, and the parent needs to approve whether or not they get it. I find myself baffled by some of them because they are not ones that I have ever heard of or come across. To find out whether they have that level of functionality, I have to download and use the app myself in the way that, hopefully, my children would use it, in order to judge whether it is safe for them.

A requirement for category 1 providers to be up front and explain the risks and how they manage them, and even how people interact with their services, would increase the ability of parents to be media literate. We can be as media literate as we like, but if the information is not there and we cannot find it anywhere, we end up having to make incredibly restrictive decisions in relation to our children’s ability to use the internet, which we do not necessarily want to make. We want them to be able to have fun, and the information being there would be very helpful, so I completely agree on that point.

My other point is about proportionality. The Opposition moved new clause 4, relating to risk assessments, and I did not feel able to support it on the basis of the arguments that the Minister made about proportionality. He made the case that Ofcom would receive 25,000 risk assessments and would be swamped by the number that it might receive. This new clause balances that, and has the transparency that is needed.

It is completely reasonable for us to put the higher burden of transparency on category 1 providers and not on other providers because they attract the largest market share. A huge percentage of the risk that might happen online happens with category 1 providers, so I am completely happy to support this new clause, which strikes the right balance. It answers the Minister’s concerns about Ofcom being swamped, because only category 1 providers are affected. Asking those providers to put the risk assessment on their site is the right thing to do. It will mean that there is far more transparency and that people are better able to make informed decisions.

--- Later in debate ---
So what are the issues with the new clause? First, for the reasons that I have set out, the Bill already addresses the point. However, exposing the entire risk assessment publicly also carries some risks itself. For example, if the risk assessment identifies weaknesses or vulnerabilities in the service—ways that malfeasant people could exploit it to get at children or do something else that we would consider harmful—then exposing to everybody, including bad actors, the ways of beating the system and doing bad things on the service would not necessarily be in the public interest. A complete disclosure could help those looking to abuse and exploit the systems. That is why the transparency duties in clause 64 and the duties to publish accessible summaries in clauses 11 and 13 meet the objectives—the quite proper objectives—of the shadow Minister, the hon. Member for Worsley and Eccles South, and the hon. Member for Aberdeen North, without running the risks that are inherent in new clause 9, which I would therefore respectfully and genuinely resist.
Barbara Keeley Portrait Barbara Keeley
- Hansard - -

The Minister seems to be resisting so many measures that have been put forward that would improve transparency, particularly by making information publicly available. As I made clear, the public have a right to know the risk profile of the services they use. We have debated this issue reasonably exhaustively now. Therefore, I will press the new clause to a Division.

Question put, That the clause be read a Second time.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

I tabled new clause 17 in relation to protected characteristics because of some of the points made by Danny Stone in the evidence session about the algorithmic prompts in search functions. I missed that session because unfortunately, at the time, I was in the Chamber, responding to the Chancellor of the Exchequer.

We have an issue with search functions, and specifically with the algorithmic prompts they offer. There is a problem if someone types in something potentially derogatory relating to a person with a protected characteristic. For example, if someone were to type “Jews are”, the results that they get with those algorithmic prompts can be overwhelmingly racist, overwhelmingly antisemitic, overwhelmingly discriminatory. The algorithm should not be pushing those things.

To give organisations like Google some credit, if something like that is highlighted to them, they will address it. Some of them take a long time to sort it, but they will have a look at it, consider sorting it and, potentially, sort it. But that is not good enough. By that point, the damage is done. By that point, the harm has been put into people’s minds. By that point, someone who is from a particular group and has protected characteristics has already seen that Google—or any other search provider—is pushing derogatory terms at people with protected characteristics.

I know that the prompts work like that because of artificial intelligence; firms are not intentionally writing these terms in order to push them towards people, but the AI allows that to happen. If such companies are going to be using artificial intelligence—some kind of software algorithm—they have a responsibility to make sure that none of the content they are generating on the basis of user searches is harmful. I asked Google about this issue during one of our evidence sessions, and the response they gave was, “Oh, algorithmic prompts are really good, so we should keep them”—obviously I am paraphrasing. I do not think that is a good enough argument. I do not think the value that is added by algorithmic prompts is enough to counter the harm that is caused by some of those prompts.

As such, the new clause specifically excludes protected characteristics from any algorithm that is used in a search engine. The idea is that if a person starts to type in something about any protected characteristic, no algorithmic prompt will appear, and they will just be typing in whatever they were going to type in anyway. They will not be served with any negative, harmful, discriminatory content, because no algorithmic prompt will come up. The new clause would achieve that across the board for every protected characteristic term. Search engines would have to come up with a list of such terms and exclude all of them from the work of the algorithm in order to provide that layer of protection for people.

I do not believe that that negative content could be in any way balanced by the potential good that could arise from somebody being able to type “Jews are” and getting a prompt that says “funny”. That would be a lovely, positive thing for people to see, but the good that could be caused by those prompts is outweighed by the negativity, harm and pain that is caused by the prompts we see today, which platforms are not quick enough to act on.

As I say, the harm is done by the time the report is made; by the time the concern is raised, the harm has already happened. New clause 17 would prevent that harm from ever happening. It would prevent anybody from ever being injured in any way by an algorithmic prompt from a search engine. That is why I have tabled that new clause, in order to provide a level of protection for any protected characteristic as defined under the Equality Act 2010 when it comes to search engine prompts.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

The problem underlying the need for this new clause is that under the Bill, search services will not have to address or risk assess legal harm to adults on their sites, while the biggest user-to-user services will. As Danny Stone of the Antisemitism Policy Trust told us in evidence, that includes sites such as Google and Microsoft Bing, and voice search assistants including Amazon’s Alexa and Apple’s Siri. Search services rightly highlight that the content returned by a search is not created or published by them, but as the hon. Member for Aberdeen North has said, algorithmic indexing, promotion and search prompts provided in the search bar are their responsibility. As she has pointed out, and as we have heard in evidence sessions, those algorithms can cause significant harm.

Danny Stone told us on 26 May:

“Search returns are not necessarily covered because, as I say, they are not the responsibility of the internet companies, but the systems that they design as to how those things are indexed and the systems to prevent them going to harmful sites by default are their responsibility, and at present the Bill does not address that.”––[Official Report, Online Safety Public Bill Committee, 26 May 2022; c. 130, Q207.]

The hon. Member for Aberdeen North mentioned the examples from Microsoft Bing that Danny gave in his evidence—“Jews are” and “gays are”. He gave other examples of answers that were returned by search services, such as using Amazon Alexa to search, “Is George Soros evil?” The response was, “Yes, he is.” “Are the White Helmets fake?” “Yes, they are set up by an ex-intelligence officer.” The issue is that the search prompts that the hon. Member has talked about are problematic, because just one person giving an answer to Amazon could prompt that response. The second one, about the White Helmets, was a comment on a website that was picked up. Clearly, that is an issue.

Danny Stone’s view is that it would be wise to have something that forces search companies to have appropriate risk assessments in place for the priority harms that Parliament sets, and to enforce those terms and conditions consistently. It is not reasonable to exempt major international and ubiquitous search services from risk assessing and having a policy to address the harms caused by their algorithms. We know that leaving it up to platforms to sort this out themselves does not work, which is why Labour is supporting the new clause proposed by our SNP colleague.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

I think you are probably getting fed up with me, Sir Roger, so I will try my best not to speak for too long. The new clause is one of the most sensible ones we have put forward. It simply allows Ofcom to ask regulated services to submit to Ofcom

“a specific piece of research held by the service”

or

“all research the service holds”

on a specific topic. It also allows Ofcom to produce a report into

“how regulated services commission, collate, publish and make use of research.”

The issues that we heard raised by Frances Haugen about the secretive nature of these very large companies gave us a huge amount of concern. Providers will have to undertake risk assessments on the basis of the number of users they have, the risk of harm to those users and what percentage of their users are children. However, Ofcom is just going to have to believe the companies when they say, “We have 1 million users,” unless it has the ability to ask for information that proves the risk assessments undertaken are adequate and that nothing is being hidden by those organisations. In order to find out information about a huge number of the platforms, particularly ones such as Facebook, we have had to rely on undercover researchers posing as other people, submitting reports and seeing how they come out.

We cannot rely on these companies, which are money-making entities. They exist to make a profit, not to make our lives better. In some cases they very much do make our lives better—in some cases they very much do not—but that is not their aim. Their aim is to try to make a profit. It is absolutely in their interests to underplay the number of users they have and the risk faced by people on their platforms. It is very much in their interest to underplay how the algorithms are firing content at people, taking them into a negative or extreme spiral. It is also in their interests to try to hide that from Ofcom, so that they do not have to put in the duties and mitigations that keep people safe.

We are not asking those companies to make the information public, but if we require them to provide to Ofcom their internal research, whether on the gender or age of their users, or on how many of their users are viewing content relating to self-harm, it will raise their standards. It will raise the bar and mean that those companies have to act in the best interests—or as close as they can get to them—of their users. They will have to comply with what is set out in the Bill and the directions of Ofcom.

I see no issue with that. Ofcom is not going to share the information with other companies, so that they could subvert competition law. Ofcom is a regulator; it literally does not do that. Our proposal would mean that Ofcom has the best, and the most, information in order to take sensible decisions to properly regulate the platforms. It is not a difficult provision for the Minister to accept.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

The transparency requirements set out in the Bill are welcome but limited. Numerous amendments have been tabled by the Opposition and by our colleagues in the SNP to increase transparency, so that we can all be better informed about the harms around us, and so that the regulator can determine what protections are needed for existing and emerging harms. This new clause is another important provision in that chain and I speak in support of it.

We know that there is research being undertaken all the time by companies that is never published—neither publicly nor to the regulator. As the hon. Member for Aberdeen North said, publishing research undertaken by companies is an issue championed by Frances Haugen, whose testimony last month the Committee will remember. A few years ago, Frances Haugen brought to the public’s attention the extent to which research is held by companies such as Facebook—as it was called then—and never reaches the public realm.

Billions of members of the public are unaware that they are being tracked and monitored by social media companies as subjects in their research studies. The results of those studies are only published when revealed by brave whistleblowers. However, their findings could help charities, regulators and legislators to recognise harms and help to make the internet a safer place. For example, Frances Haugen leaked one Facebook study that found that a third of teenage girls said Instagram made them feel worse about their bodies. Facebook’s head of safety, Antigone Davis, fielded questions on this issue from United States Senators last September. She claimed that the research on the impact of Instagram and Facebook on children’s health was “not a bombshell”. Senator Richard Blumenthal responded:

“I beg to differ with you, Ms Davis, this research is a bombshell. It is powerful, gripping, riveting evidence that Facebook knows of the harmful effects of its site on children and that it has concealed those facts and findings.”

It is this kind of cover-up that new clause 19 seeks to prevent.

I remind the Committee of one more example that Frances Haugen illustrated to us in her evidence last month. Meta conducts frequent analyses of the estimated age of its users, which is often different from the ages they submit when registering, both among adults and children. Frances told us that Meta does this so that adverts can be targeted more effectively. However, if Ofcom could request this data, as the new clause would require, it would give an important insight into how many under-13s were in fact creating accounts on Facebook. Ofcom should be able to access such information, so I hope hon. Members and the Minister will support the new clause as a measure to increase transparency and support greater protections for children.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Let me start by saying that I completely agree with the premise of the new clause. First, I agree that these large social media companies are acting principally for motives of their own profit and not the public good. Secondly, I agree with the proposition that they are extremely secretive, and do not transparently and openly disclose information to the public, the Government or researchers, and that is a problem we need to solve. I therefore wholeheartedly agree with the premise of the hon. Member for Aberdeen North’s new clause and her position.

However, I am honestly a bit perplexed by the two speeches we have just heard, because the Bill sets out everything the hon. Members for Aberdeen North and for Worsley and Eccles South asked for in unambiguous, black and white terms on the face of the Bill—or black and green terms, because the Bill is published on green paper.

Clause 85 on page 74 outlines the power Ofcom has to request information from the companies. Clause 85(1) says very clearly that Ofcom may require a person

“to provide them with any information”—

I stress the word “any”—

“that they require for the purpose of exercising, or deciding whether to exercise, any of their online safety functions.”

Ofcom can already request anything of these companies.

For the avoidance of doubt, clause 85(5) lists the various purposes for which Ofcom can request information, and clause 85(5)(l)—on page 75, line 25—includes

“the purpose of carrying out research, or preparing a report, in relation to online safety matters”.

Ofcom can request anything, expressly including requesting information to carry out research, which is exactly what the hon. Member for Aberdeen North quite rightly asks for.

The hon. Lady then said, “What if they withhold information or, basically, lie?” Clause 92 on page 80 sets out the situation when people commit an offence. The Committee will see that clause 92(3)(a) states that a person “commits an offence” if

“the person provides information that is false in a material respect”.

Again, clause 92(5)(a) states that a person “commits an offence” if

“the person suppresses, destroys or alters, or causes or permits the suppression, destruction or alteration of, any information required to be provided.”

In short, if the person or company who receives the information request lies, or falsifies or destroys information, they are committing an offence that will trigger not only civil sanctions—under which the company can pay a fine of up to 10% of global revenue or be disconnected—but also a personal offence that is punishable by up to two years in prison.

I hope I have demonstrated that clauses 85 and 92 already clearly contain the powers for Ofcom to request any information, and that if people lie, destroy information or suppress information, as the hon. Member for Aberdeen North rightly says they do at the moment, that will be a criminal offence with full sanctions available. I hope that demonstrates to the Committee’s satisfaction that the Bill does this already, and that it is important that it does so for the reasons that the hon. Lady set out.

--- Later in debate ---
Brought up, and read the First time.
Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I beg to move, That the clause be read a Second time.

New clause 24 would enable users to bring civil proceedings against providers when they fail to meet their duties under part 3 of the Bill. As has been said many times, power is currently skewed significantly against individuals and in favour of big corporations, leading people to feel that they have no real ability to report content or complain to companies because, whenever they do, there is no response and no action. We have discussed how the reporting, complaints and super-complaints mechanisms in the Bill could be strengthened, as well as the potential merits of an ombudsman, which we argued should be considered when we debated new clause 1.

In tabling this new clause, we are trying to give users the right to appeal through another route—in this case, the courts. As the Minister will be aware, that was a recommendation of the Joint Committee, whose report stated:

“While we recognise the resource challenges both for individuals in accessing the courts and the courts themselves, we think the importance of issues in this Bill requires that users have a right of redress in the courts. We recommend the Government develop a bespoke route of appeal in the courts to allow users to sue providers for failure to meet their obligations under the Act.”

The Government’s response to that recommendation was that the Bill would not change the current situation, which allows individuals to

“seek redress through the courts in the event that a company has been negligent or is in breach of its contract with the individual.”

It went on to note:

“Over time, as regulatory precedent grows, it will become easier for individuals to take user-to-user services to court when necessary.”

That seems as close as we are likely to get to an admission that the current situation for individuals is far from easy. We should not have to wait for the conclusion of the first few long and drawn-out cases before it becomes easier for people to fight companies in the courts.

Some organisations have rightly pointed out that a system of redress based on civil proceedings in the courts risks benefiting those with the resources to sue—as we know, that is often the case. However, including that additional redress system on the face of the Bill should increase pressure on companies to fulfil their duties under part 3, which will hopefully decrease people’s need to turn to the redress mechanism.

If we want the overall system of redress to be as strong as possible, individuals must have the opportunity to appeal failures of a company’s duty of care as set out in the Bill. The Joint Committee argued that the importance of the issues dealt with by the Bill requires that users have a right of redress in the courts. The Government did not respond to that criticism in their formal response, but it is a critical argument. A balancing act between proportionate restrictions and duties versus protections against harms is at the heart of this legislation, and has been at the heart of all our debates. Our position is in line with that of the Joint Committee: these issues are too important to deny individuals the right to appeal failures of duty by big companies through the courts.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I agree with the shadow Minister’s point that it is important to make sure social media firms are held to account, which is the entire purpose of the Bill. I will make two points in response to the proposed new clause, beginning with the observation that the first part of its effect is essentially to restate an existing right. Obviously, individuals are already at liberty to seek redress through the courts where a company has caused that individual to suffer loss through negligence or some other behaviour giving rise to grounds for civil liability. That would, I believe, include a breach of that company’s terms of service, so simply restating in legislation a right that already exists as a matter of law and common law is not necessary. We do not do declaratory legislation that just repeats an existing right.

Secondly, the new clause creates a new right of action that does not currently exist, which is a right of individual action if the company is in breach of one of the duties set out in part 3 of the Bill. Individuals being able to sue for a breach of a statutory duty that we are creating is not the way in which we are trying to construct enforcement under the Bill. We will get social media firms to comply through Ofcom acting as the regulator, rather than via individuals litigating these duties on a case-by-case basis. A far more effective way of dealing with the problems, as we discussed previously when we debated the ombudsman, is to get Ofcom to deal with this on behalf of the whole public on a systemic basis, funded not by individual litigants’ money, which is what would happen, at least in the first instance, if they had to proceed individually. Ofcom should act on behalf of us all collectively—this should appeal to socialists—using charges levied from the industry itself.

That is why we want to enforce against these companies using Ofcom, funded by the industry and acting on behalf of all of us. We want to fix these issues not just on an individual basis but systemically. Although I understand the Opposition’s intent, the first part simply declares what is already the law, and the second bit takes a different route from the one that the Bill takes. The Bill’s route is more comprehensive and will ultimately be more effective. Perhaps most importantly of all, the approach that the Bill takes is funded by the fees charged on the polluters—the social media firms—rather than requiring individual citizens, at least in the first instance, to put their hand in their own pocket, so I think the Bill as drafted is the best route to delivering these objectives.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I will say a couple of things in response to the Minister. It is individuals who are damaged by providers breaching their duties under part 3 of the Bill. I understand the point about—

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

Yes, but it is not systems that are damaged; it is people. As I said in my speech, the Government’s response that, as regulatory precedent grows, it will become easier over time for individuals to take user-to-user services to court where necessary clearly shows that the Government think it will happen. What we are saying is: why should it wait? The Minister says it is declaratory, but I think it is important, so we will put the new clause to a vote.

Question put, That the clause be read a Second time.

--- Later in debate ---
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

I agree with the hon. Member wholeheartedly. It should be Parliament that is assessing the effectiveness of the Bill. The Committee has discussed many times how groundbreaking the Bill could be, how difficult it has been to regulate the internet for the first time, the many challenges encountered, the relationship between platforms and regulator and how other countries will be looking at the legislation as a guide for their own regulations. Once this legislation is in place, the only way we can judge how well it is tackling harm in the UK is with clear public reports detailing information on what harms have been prevented, who has intervened to remove that harm, and what role the regulator—in this case Ofcom—has had in protecting us online.

New clause 25 will place a number of important obligations on Ofcom to provide us with that crucial information. First, Ofcom will report annually to Parliament on the overall effectiveness of the Act. That report will allow Ofcom to explore fully where the Act is working, where it could be tightened and where we have left gaps. Throughout the Bill we are heaping considerable responsibility on to Ofcom, and it is only right that Ofcom is able to feed back publicly and state clearly where its powers allow it to act, and where it is constrained and in need of assistance.

Secondly, new clause 25 will compel Ofcom to monitor, collate and publish figures relating to the number of harms removed by category 1 services, which is an important indicator for us to know the scale of the issue and that the Act is working.

Thirdly, we need to know how often Ofcom is intervening, compared with how often the platforms themselves are acting. That crucial figure will allow us to assess the balance of regulation, which assists not only us in the UK but countries looking at the legislation as a guide for their own regulation.

Finally, Ofcom will detail the harms removed by type to identify any areas where the Act may be falling short, and where further attention may be needed.

I hope the Committee understands why this information is absolutely invaluable, when we have previously discussed our concerns that this groundbreaking legislation will need constant monitoring. I hope it will also understand why the information needs to be transparent in order to instil trust in the online space, to show the zero-tolerance approach to online harms, and to show countries across the globe that the online space can be effectively regulated to protect citizens online. Only Parliament, as the legislature, can be an effective monitor of that information. I hope I can count on the Government’s support for new clause 25.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I speak in support of new clause 25. As my hon. Friend has argued, transparency is critical to the Bill. It is too risky to leave information and data about online harms unpublished. That is why we have tabled several amendments to the Bill to increase reporting, both to the regulator and publicly.

New clause 25 is an important addition that would offer an overview of the effectiveness of the Bill and act as a warning bell for any unaddressed historical or emerging harms. Not only would such a report benefit legislators, but the indicators included in the report would be helpful for both Ofcom and user advocacy groups. We cannot continue to attempt to regulate the internet blind. We must have the necessary data and analysis to be sure that the provisions in the Bill are as effective as they can be. I hope the Minister can support this new clause.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The idea that a report on Ofcom’s activities be delivered to Parliament so that it can be considered is an excellent one. In fact, it is such an excellent idea that it has been set out in statute since 2002: the Office of Communications Act 2002 already requires Ofcom to provide a report to the Secretary of State on the carrying out of all of its functions, which will include the new duties we are giving Ofcom under the Bill. The Secretary of State must then lay that report before each House of Parliament. That is a well-established procedure for Ofcom and for other regulatory bodies. It ensures the accountability of Ofcom to the Department and to Parliament.

I was being slightly facetious there, because the hon. Member for Batley and Spen is quite right to raise the issue. However, the duty she is seeking to create via new clause 25 is already covered by the duties in the Office of Communications Act. The reports that Ofcom publish under that duty will include their new duties under the Bill. Having made that clear, I trust that new clause 25 can be withdrawn.

Online Safety Bill (Seventeenth sitting)

Barbara Keeley Excerpts
Committee stage
Tuesday 28th June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 28 June 2022 - (28 Jun 2022)
Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

This is another attempt to place a higher bar and more requirements on regulated services that are likely to cause the most serious risks of harm. The Minister has consistently said that he is keen to consider regulating the companies and platforms that have the highest potential risk of harm more strictly than the normal regime would allow. Some of the platforms would not be category 1 on the basis that they have a small number of members, but the potential for harm—radicalisation, extremism, severe damage to people or extreme pornography—is very high.

I am not yet happy that the Minister has provided an adequate answer to the question about the regulation of the highest-risk platforms that do not meet the category 1 thresholds. If he is unwilling to accept this amendment or any of the other amendments tabled by the Opposition on this specific issue, I hope that he will give consideration to a Government amendment on Report or when the Bill goes through the House of Lords in order that this loose end can be tied up.

As I have said before—I do not want to go too much over comments that I have made previously—it is reasonable for us to have a higher bar and a stricter regulatory regime for specific platforms that Ofcom will easily be able to identify and that create the highest harm. Again, as I have said, this is another way of going about it. The new clause suggests that if Ofcom assesses that a service poses a very high risk of harm, it might, notwithstanding the categorisation of that service, require it to perform the children’s risk assessment duties and the safety duties protecting children. This is specifically about the children’s risk assessment.

I have previously raised concerns about not being able to accurately assess the number of child users that a service has. I am still not entirely comfortable that platforms will be able to accurately assess the number of child users they have, and therefore they might not be subject to the child user requirements, because they have underplayed or understated the number of children using their service, or because there are only a few hundred children using the service, which is surely massively concerning for the wellbeing of those few hundred children.

I hope the Minister can give us some comfort that he is not just considering what action to take, but that he will take some sort of action on Report or when the Bill proceeds through the House of Lords.

Barbara Keeley Portrait Barbara Keeley (Worsley and Eccles South) (Lab)
- Hansard - -

It is a pleasure to serve with you in the Chair again, Ms Rees. I rise to speak in support of new clause 27.

We have argued that the Government’s approach to categorising services fails to take account of the harms that could result from smaller services. I understand that a risk-based approach rather than a size-based approach is being considered, and that is welcome. The new clause would go some way to improving the categorisation of services as it stands. It is critical that there are ways for Ofcom to assess companies’ risk of harm to users and to place additional duties on them even when they lie outside the category to which they were initially assigned. Ofcom should be able to consult any organisation that it sees fit to consult, including user advocacy groups and civil society, in assessing whether a service poses

“a very high risk of harm”.

Following that, Ofcom should have powers to impose the strictest duties on the companies that expose adults to the most dangerous harms. That should always be proportionate to the risk of harm.

Labour supports the new clause and the arguments made by the hon. Member for Aberdeen North.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

The new clause attempts to address an asymmetry in the Bill in relation to the lack of user empowerment features for child users. As far as I am aware, there is no requirement for user empowerment functions for child users in the Bill. The new clause would require that if a service has to have user empowerment features in place for adults, then

“OFCOM may require a service to provide equivalent features designed specifically for child users.”

Ofcom would be able then to provide guidance on how those user empowerment features for child users would work.

This provision is especially important for the fairly small number of platforms and providers that are very much aimed at children, and where the vast majority of users are children. We are not talking about Facebook, for example, although if Facebook did have child user empowerment, it would be a good thing. I am thinking about organisations and games such as Roblox, which is about 70% children; Fortnite, although it has quite a lot of adult users too; and Minecraft, which has significant numbers of child users. On those platforms that are aimed at children, not having a child-centred, child-focused user empowerment requirement is an oversight. It is missing from the Bill.

It is important that adults have the ability to make privacy choices about how they use sites and to make choices about some of the content that they can see on a site by navigating the user empowerment functions that exist. But it is also important for children to have that choice. I do not see why adults should be afforded that level of choice and flexibility over the way that they use platforms and the providers that they engage with, but children should not. We are not just talking here about kids who are eight: we are talking about children far older, and for whom adult-centred, adult-written user empowerment functions may not be the best option or as easy to access as ones that are specifically focused on and designed for children.

I have had a discussion with the National Society for the Prevention of Cruelty to Children about the user empowerment functions for child users. We have previously discussed the fact that complaints features have to be understandable by the users of services, so if the Minister is unwilling to accept the new clause, will he give some consideration to what happens when the provider of the platform is marketing that platform to children?

The Roblox website is entirely marketed as a platform for children. It is focused in that way, so will the Minister consider whether Ofcom should be able to require differential user empowerment functions, particularly in cases where the overwhelming majority of users are children? Also, it would not be beyond the wit of man for platforms such as Facebook to have two differential user empowerment functions based on whether somebody is under the age of 18—whether they are a child or an adult—because users tell Facebook their date of birth when signing up. We have talked a lot about age verification and the ways in which that could work.

I would appreciate it if the Minister would consider this important matter. It is something that is lacking at the moment, and we are doing our children a disservice by not providing them with the same functionality that we are providing, or requiring, for adult users.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

Labour argued in favour of greater empowerment provisions for children during the debate on new clause 3, which would have brought in a user advocacy body for children. YoungMinds has pointed out that many young people are unaware of the Bill, and there has been little engagement with children regarding its design. I am sure members of the Committee would agree that the complexity of the Bill is evidence enough of that.

New clause 28 would make the online world more accessible for children and increase their control over the content they see. We know that many children use category 1 services, so they should be entitled to the same control over harmful content as adults. As such, Labour supports the new clause.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank the hon. Member for Aberdeen North for her, as ever, thoughtful comments on the new clause. She has already referred to the user empowerment duties for adults set out in clause 57, and is right to say that those apply only to adults, as is made clear in the very first line of subsection (1) near the bottom of page 52.

As always, the hon. Lady’s analysis of the Bill is correct: the aim of those empowerment duties is to give adults more control over the content they see and the people with whom they interact online. One of the reasons why those empowerment duties have been crafted specifically for adults is that, as we discussed in a freedom of expression context, the Bill does not ultimately censor free speech regarding content that is legal but potentially harmful. Platforms can continue to display that information if their policies allow, so we felt it was right to give adults more choice over whose content they see, given that it could include content that is harmful but falls on the right side of the legal threshold.

As Members would expect, the provisions of the Bill in relation to children are very different from the provisions for adults. There are already specific provisions in the Bill that relate to children, requiring all social media companies whose platforms are likely to be accessed by children—not just the big ones—to undertake comprehensive risk assessments and protect children from any kind of harmful activity. If we refer to the children’s risk assessment duties in clause 10, and specifically clause 10(6)(e), we see that those risk assessments include an assessment looking at the content that children will encounter and—critically—who they might encounter online, including adults.

To cut to the chase and explain why user empowerment has been applied to adults but not children, the view was taken that children are already protected a lot more than adults through the child risk assessment duties and child safety duties. They therefore do not need the user empowerment provisions, because all of them—regardless of whether they choose to be verified or not—are already being protected from harmful content by the much stronger provisions in the Bill relating to children. That is why it was crafted as it is.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

I mentioned this in earlier consideration. The issue was raised with me by Mencap, specifically in relation to the people it represents who have learning disabilities and who have a right to access the internet just as we all do. They should be empowered to use the internet with a level of safety and be able to access complaints, to make content reports and to use the user empowerment functions. Everybody who is likely to use the platforms should be able to access and understand those functions.

Will the Minister make it clear that he expects Ofcom, when drafting guidance about the user empowerment functions and their accessibility, the content reporting and the complaints procedures, to consult people about how those things work? Will he make it clear that he hopes Ofcom will take into account the level of accessibility? This is not just about writing things in plain English—or whatever that campaign is about writing things in a way that people can understand—it is about actually speaking to groups that represent people with learning disabilities to ensure that content reporting, the empowerment functions and the complaints procedures are accessible, easy to find and easy to understand, so that people can make the complaints that they need to make and can access the internet on an equal and equitable basis.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I rise to speak in support of the new clause. Too often people with learning disabilities are left out of discussions about provisions relevant to them. People with learning disabilities are disproportionately affected by online harms and can receive awful abuse online.

At the same time, Mencap has argued that social media platforms enable people with learning disabilities to develop positive friendships and relationships. It is therefore even more important that people with learning disabilities do not lose out on the features described in clause 14, which allow them to control the content to which they are exposed. It is welcome that clauses 17, 18, 27 and 28 specify that reporting and complaints procedures must be easy to access and use.

The Bill, however, should go further to ensure that the duties on complaints and reporting explicitly cater to adults with learning disabilities. In the case of clause 14 on user empowerment functions, it must be made much clearer that those functions are easy to access and use. The new clause would be an important step towards ensuring that the Bill benefits everyone who experiences harms online, including people with learning disabilities. Labour supports the new clause.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank the hon. Member for Aberdeen North once again for the thoughtfulness with which she has moved her new clause. To speak first to the existing references to accessibility in the Bill, let me start with user empowerment in clause 14.

Clause 14(4) makes it clear that the features included in “a service in compliance” with the duty in this clause must be made available to all adult users. I stress “all” because, by definition, that includes people with learning disabilities or others with characteristics that mean they may require assistance. When it comes to content reporting duties, clause 17(2)—line 6 of page 17—states that it has to be easy for any “affected persons” to report the content. They may be people who are disabled or have a learning difficulty or anything else. Clause 17(6)(d) further makes it clear that adults who are “providing assistance” to another adult are able to raise content reporting issues.

There are references in the Bill to being easy to report and to one adult assisting another. Furthermore, clause 18(2)(c), on page 18, states that the complaints system has to be

“easy to use (including by children)”.

It also makes it clear through the definition of “affected person”, which we have spoken about, that an adult assisting another adult is allowed to make a complaint on behalf of the second adult. Those things have been built into the structure of the Bill.

Furthermore, to answer the question from the hon. Member for Aberdeen North, I am happy to put on record that Ofcom, as a public body, is subject to the public sector equality duty, so by law it must take into account the ways in which people with certain characteristics, such as learning disabilities, may be impacted when performing its duties, including writing the codes of practice for user empowerment, redress and complaints duties. I can confirm, as the hon. Member requested, that Ofcom, when drafting its codes of practice, will have to take accessibility into account. It is not just a question of my confirming that to the Committee; it is a statutory duty under the Equality Act 2010 and the public sector equality duty that flows from it.

I hope that the words of the Bill, combined with that statutory public sector equality duty, make it clear that the objectives of new clause 29 are met.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

The Minister mentioned learning difficulties. That is not what we are talking about. Learning difficulties are things such as dyslexia and attention deficit hyperactivity disorder. Learning disabilities are lifelong intellectual impairments and very different things—that is what we are talking about.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am very happy to accept the shadow Minister’s clarification. The way that clauses 14, 17 and 18 are drafted, and the public sector equality duty, include the groups of people she referred to, but I am happy to acknowledge and accept her clarification.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

That is fine, but I have a further point to make. The new clause would be very important to all those people who support people with learning disabilities. So much of the services that people use do not take account of people’s learning disabilities. I have done a huge amount of work to try to support people with learning disabilities over the years. This is a very important issue to me.

There are all kinds of good examples, such as easy-read versions of documents, but the Minister said when batting back this important new clause that the expression “all adult users” includes people with learning disabilities. That is not the case. He may not have worked with a lot of people with learning disabilities, but they are excluded from an awful lot. That is why I support making that clear in the Bill.

We on the Opposition Benches have said repeatedly that some things are not covered by an all-encompassing grouping. That is certainly the case here. Some things need to be named for themselves, such as violence against women and girls. That is why this is an excellent new clause that we support.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I thank the Minister, particularly for providing the clarification that I asked for about who is likely to be consulted or taken into account when Ofcom is writing the codes of practice. Notwithstanding that, and particularly given the rather excellent speech from the shadow Minister, the hon. Member for Worsley and Eccles South, I am keen to press the new clause to a vote.

Question put, That the clause be read a Second time.

--- Later in debate ---
Barbara Keeley Portrait Barbara Keeley
- Hansard - -

As we heard from the hon. Member for Ochil and South Perthshire, new clauses 38 to 40 would align the duties on pornographic content so that both user-to-user sites and published pornography sites are subject to robust duties that are relevant to the service. Charities have expressed concerns that many pornography sites might slip through the net because their content does not fall under the definition of “pornographic content” in clause 66. The new clauses aim to address that. They are based on the duties placed on category 1 services, but they recognise the unique harms that can be caused by pornographic content providers, some of which the hon. Member graphically described with the titles that he gave. The new clauses also contain some important new duties that are not currently in the Bill, including the transparency arrangements in new clause 39 and important safeguards in new clause 40.

The Opposition have argued time and again for publishing duties when it comes to risk assessments. New clause 39 would introduce a duty to summarise in the terms of service the findings of the most recent adult risk assessments of a service. That is an important step towards making risk assessments publicly accessible, although Labour’s preference would be for them to be published publicly and in full, as I argued in the debate on new clause 9, which addressed category 1 service risk assessments.

New clause 40 would introduce measures to prevent the upload of illegal content, such as by allowing content uploads only from verified content providers, and by requiring all uploaded content to be reviewed. If the latter duty were accepted, there would need to be proper training and support for any human content moderators. We have heard during previous debates about the awful circumstances of human content moderators. They are put under such pressure for that low-paid work, and we do not want to encourage that.

New clause 40 would also provide protections for those featured in such content, including the need for written consent and identity and age verification. Those are important safeguards that the Labour party supports. I hope the Minister will consider them.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank the hon. Member for Ochil and South Perthshire for raising these issues with the Committee. It is important first to make it clear that websites providing user-to-user services are covered in part 3 of the Bill, under which they are obliged to protect children and prevent illegal content, including some forms of extreme pornography, from circulating. Such websites are also obliged to prevent children from accessing those services. For user-to-user sites, those matters are all comprehensively covered in part 3.

New clauses 38, 39 and 40 seek to widen the scope of part 5 of the Bill, which applies specifically to commercial pornography sites. Those are a different part of the market. Part 5 is designed to close a loophole in the original draft of the Bill that was identified by the Joint Committee, on which the hon. Member for Ochil and South Perthshire and my hon. Friend the Member for Watford served. Protecting children from pornographic content on commercial porn sites had been wrongly omitted from the original draft of the Bill. Part 5 of the Bill as currently drafted is designed to remedy that oversight. That is why the duties in part 5 are narrowly targeted at protecting children in the commercial part of the market.

A much wider range of duties is placed by part 3 on the user-to-user part of the pornography market. The user-to-user services covered by part 3 are likely to include the largest sites with the least control; as the content is user generated, there is no organising mind—whatever gets put up, gets put up. It is worth drawing the distinction between the services covered in part 3 and part 5 of the Bill.

In relation to part 5 services publishing their own material, Parliament can legislate, if it chooses to, to make some of that content illegal, as it has done in some areas—some forms of extreme pornography are illegal. If Parliament thinks that the line is drawn in the wrong place and needs to be moved, it can legislate to move that line as part of the general legislation in this area.

I emphasise most strongly that user-to-user sites, which are probably what the hon. Member for Ochil and South Perthshire was mostly referring to, are comprehensively covered by the duties in part 3. The purpose of part 5, which was a response to the Joint Committee’s report, is simply to stop children viewing such content. That is why the Bill has been constructed as it has.

Question put, That the clause be read a Second time.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Thank you.

Barbara Keeley Portrait Barbara Keeley
- Hansard - -

I will take this opportunity, as my hon. Friend has done, to add a few words of thanks. She has already thanked all the people in this place whom we should be thanking, including the Clerks, who have done a remarkable job over the course of our deliberations with advice, drafting and support to the Chair. I also thank the stakeholder organisations. This Bill is uniquely one in which the stakeholders—the children’s charities and all those other organisations—have played an incredible part. I know from meetings that they have already advertised that those organisations will continue to play that part over the coming weeks, up until Report. It has been fantastic.

Finally, I will mention two people who have done a remarkable amount of work: my researcher Iona and my hon. Friend’s researcher Freddie, who have done a huge amount to help us prepare speaking notes. It is a big task, because this is a complex Bill. I add my thanks to you, Ms Rees, for the way you have chaired this Committee. Please thank Sir Roger on our behalf as well.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Seeing as we are not doing spurious points of order, I will also take the opportunity to express our thanks. The first one is to the Chairs: thank you very much, Ms Rees and Sir Roger, for the excellent work you have done in the Chair. This has been a very long Bill, and the fact that you have put up with us for so long has been very much appreciated.

I thank all the MPs on the Committee, particularly the Labour Front-Bench team and those who have been speaking for the Labour party. They have been very passionate and have tabled really helpful amendments—it has been very good to work with the Labour team on the amendments that we have put together, particularly the ones we have managed to agree on, which is the vast majority. We thank Matt Miller, who works for my hon. Friend the Member for Ochil and South Perthshire. He has been absolutely wonderful. He has done an outstanding amount of work on the Bill, and the amazing support that he has given us has been greatly appreciated. I also thank the Public Bill Office, especially for putting up with the many, many amendments we submitted, and for giving us a huge amount of advice on them.

Lastly, I thank the hundreds of organisations that got in touch with us, and the many people who took the time to scrutinise the Bill, raise their concerns, and bring those concerns to us. Of those hundreds of people and organisations, I particularly highlight the work of the National Society for the Prevention of Cruelty to Children. Its staff have been really helpful to work with, and I have very much appreciated their advice and support in drafting our amendments.

Online Safety Bill

Barbara Keeley Excerpts
Adam Afriyie Portrait Adam Afriyie
- View Speech - Hansard - - - Excerpts

I echo the concerns expressed by the right hon. Member for Kingston upon Hull North (Dame Diana Johnson). Some appalling abuses are taking place online, and I hope that the Bill goes some way to address them, to the extent that that is possible within the framework that it sets up. I greatly appreciate the right hon. Lady’s comments and her contribution to the debate.

I have a tight and narrow point for the Minister. In amendment 56, I seek to ensure that only pornographic material is caught by the definition in the Bill. My concern is that we catch these abuses online, catch them quickly and penalise them harshly, but also that sites that may display, for example, works of art featuring nudes—or body positivity community sites, of which there are several—are not inadvertently caught in our desire to clamp down on illegal pornographic sites. Perhaps the Minister will say a few words about that in his closing remarks.

Barbara Keeley Portrait Barbara Keeley (Worsley and Eccles South) (Lab)
- View Speech - Hansard - -

I rise to speak to this small group of amendments on behalf of the Opposition. Despite everything that is going on at the moment, we must remember that this Bill has the potential to change lives for the better. It is an important piece of legislation, and we cannot miss the opportunity to get it right. I would like to join my hon. Friend the Member for Pontypridd (Alex Davies-Jones) in welcoming the Under-Secretary of State for Digital, Culture, Media and Sport, the hon. Member for Folkestone and Hythe (Damian Collins) to his role. His work as Chair of the Joint Committee on this Bill was an important part of the pre-legislative scrutiny process, and I look forward to working in collaboration with him to ensure that this legislation does as it should in keeping us all safe online. I welcome the support of the former Minister, the hon. Member for Croydon South (Chris Philp), on giving access to data to academic researchers and on looking at the changes needed to deal with the harm caused by the way in which algorithmic prompts work. It was a pity he was not persuaded by the amendments in Committee, but better late than never.

--- Later in debate ---
Damian Collins Portrait Damian Collins
- View Speech - Hansard - - - Excerpts

It is because the proposal would not make such content more illegal than it is now. It is already illegal and there are already legal duties on companies to act. The regulator’s job is to ensure they have the systems in place to do that effectively, and that is what the Bill sets out. We believe that the Bill addresses the serious issue that the right hon. Lady raises in her amendments. That legal requirement is there, as is the ability to have the systems in place.

If I may, I will give a different example based on the fraud example given by the shadow Minister, the hon. Member for Worsley and Eccles South (Barbara Keeley). On the Joint Committee that scrutinised the Bill, we pushed hard to have fraudulent ads included within the scope of the Bill, which has been one of the important amendments to it. The regulator can consider what systems the company should have in place to identify fraud, but also what technologies it employs to make it far less likely that fraud would be there in the first place. Google has a deal with the Financial Conduct Authority, whereby it restricts non-accredited companies from advertising on its platform. That makes it far less likely that fraud will appear because, if the system works, only properly recognised organisations will be advertising.

Facebook does not have such a system in place. As a consequence, since the Google system went live, we have seen a dramatic drop in fraud ads on Google, but a substantial increase in fraud ads on Facebook and platforms such as Instagram. That shows that if we have the right systems in place, we can have a better outcome and change the result. The job of the regulator with illegal pornography and other illegal content should be to look at those systems and say, “Do the companies have the right technology to deliver the result that is required?” If they do not, that would still be a failure of the codes.

Barbara Keeley Portrait Barbara Keeley
- View Speech - Hansard - -

The Minister is quoting a case that I quoted in Committee, and the former Minister, the hon. Member for Croydon South (Chris Philp), would not accept amendments on this issue. We could have tightened up on fraudulent advertising. If Google can do that for financial ads, other platforms can do it. We tabled an amendment that the Government did not accept. I do not know why this Minister is quoting something that we quoted in Committee—I know he was not there, but he needs to know that we tried this and the former Minister did not accept what we called for.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I am quoting that case merely because it is a good example of how, if we have better systems, we can get a better result. As part of the codes of practice, Ofcom will be able to look at some of these other systems and say to companies, “This is not just about content moderation; it is about having better systems that detect known illegal activity earlier and prevent it from getting on to the platform.” It is not about how quickly it is removed, but how effective companies are at stopping it ever being there in the first place. That is within the scope of regulation, and my belief is that those powers exist at the moment and therefore should be used.