All 20 Kim Leadbeater contributions to the Online Safety Act 2023


Tue 19th Apr 2022
Online Safety Bill
Commons Chamber

2nd reading
Tue 24th May 2022
Tue 24th May 2022
Thu 26th May 2022
Online Safety Bill (Third sitting)
Public Bill Committees

Committee stage: 3rd sitting
Thu 26th May 2022
Online Safety Bill (Fourth sitting)
Public Bill Committees

Committee stage: 4th sitting
Tue 7th Jun 2022
Tue 7th Jun 2022
Thu 9th Jun 2022
Thu 9th Jun 2022
Tue 14th Jun 2022
Tue 14th Jun 2022
Tue 21st Jun 2022
Online Safety Bill (Thirteenth sitting)
Public Bill Committees

Committee stage: 13th sitting
Tue 21st Jun 2022
Thu 23rd Jun 2022
Tue 28th Jun 2022
Tue 12th Jul 2022
Online Safety Bill
Commons Chamber

Report stage (day 1)
Mon 5th Dec 2022
Tue 13th Dec 2022
Online Safety Bill (First sitting)
Public Bill Committees

Committee stage (re-committed clauses and schedules): 1st sitting
Tue 13th Dec 2022
Online Safety Bill (Second sitting)
Public Bill Committees

Committee stage (re-committed clauses and schedules): 2nd sitting
Tue 17th Jan 2023

Online Safety Bill

Kim Leadbeater Excerpts
2nd reading
Tuesday 19th April 2022


Commons Chamber
Dean Russell (Watford) (Con)

I had the great privilege of sitting on the Joint Committee on the draft Bill before Christmas and working with the Chair, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), fantastic Members from across both Houses and amazing witnesses.

We heard repeated stories of platforms profiting from pain and prejudice. One story that really affected me was that of Zach Eagling, a heroic young boy who has cerebral palsy and epilepsy and who was targeted with flashing images by cruel trolls to trigger seizures. Those seizures have been triggered for other people with epilepsy, affecting their lives and risking not just harm, but potentially death, depending on their situation. That is why I and my hon. Friend the Member for Stourbridge (Suzanne Webb)—and all members of the Joint Committee, actually, because this was in our report—backed Zach’s law.

Kim Leadbeater (Batley and Spen) (Lab)

Ten-year-old Zach is a child in my constituency who has, as the hon. Member said, cerebral palsy and epilepsy, and he has been subjected to horrendous online abuse. I hope that the Minister can provide clarity tonight and confirm that Zach’s law—which shows that not just psychological harm and distress, but physical harm can be created as a result of online abuse and trolling—will be covered in the Bill.

Dean Russell

My understanding—hopefully this will be confirmed from the Dispatch Box—is that Zach’s law will be covered by clause 150 in part 10, on communications offences, but I urge the Ministry of Justice to firm that up further.

One thing that really came through for me was the role of algorithms. The only analogy that I can find in the real world for the danger of algorithms is narcotics. This is about organisations that targeted harmful content at people to get them more addicted to harm and to harmful content. By doing that, they numbed the senses of people who were using technology and social media, so that they engaged in practices that did them harm, turning them against not only others, but themselves. We heard awful stories about people doing such things as barcoding—about young girls cutting themselves—which was the most vile thing to hear, especially as a parent myself. There was also the idea that it was okay to be abusive to other people and the fact that it became normalised to hurt oneself, including in ways that cannot be undone in future.

That leads on to a point about numbing the senses. I am really pleased that in debating the Bill today we have talked about the metaverse, because the metaverse is not just some random technology that we might talk about; it is about numbing the senses. It is about people putting on virtual reality headsets and living in a world that is not reality, even if it is for a matter of minutes or hours. As we look at these technologies and at virtual reality, my concern is that children and young people will be encouraged to spend more time in worlds that are not real and that could include more harmful content. Such worlds are increasingly accurate in their reality, in the impact that they can have and in their capability for user-to-user engagement.

I therefore think that although at the moment the Bill includes Meta and the metaverse, we need to look at it almost as a tech platform in its own right. We will not get everything right at first; I fully support the Bill as it stands, but as we move forward we will need to continue to improve it, test it and adapt it as new technologies come out. That is why I very much support the idea of a continuing Joint Committee specifically on online safety, so that as time goes by the issues can be scrutinised and we can look at whether Ofcom is delivering in its role. Ultimately, we need to use the Bill as a starting point to prevent harm now and for decades to come.

Online Safety Bill (First sitting)

Kim Leadbeater Excerpts
Committee stage
Tuesday 24th May 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 24 May 2022
Kirsty Blackman

Q My last question is about future-proofing the Bill. Obviously, an awful lot of things will happen in the online world that do not currently happen there, and some of those we cannot foresee. Do you think the Bill is wide enough and flexible enough to allow changes to be made so that new and emerging platforms can be regulated?

Kevin Bakhurst: Overall, we feel that it is. By and large, the balance between certainty and flexibility in the Bill is probably about right and will allow some flexibility in future, but it is very hard to predict what other harms may emerge. We will remain as flexible as possible.

Richard Wronka: There are some really important updating tools in the Bill. The ability for the Secretary of State to introduce new priority harms or offences—with the approval of Parliament, of course—is really important.

Kim Leadbeater (Batley and Spen) (Lab)

Q Ofcom is required to produce certain codes, for example on terrorism, but others that were floated in the Green Paper are no longer in the Bill. Are you working on such codes, for example on hate crime and wider harm, and if not, what happens in the meantime? I guess that links to my concerns about the democratic importance and journalistic content provisions in the Bill, to which you have alluded. They are very vague protections and I am concerned that they could be exploited by extremists who suddenly want to identify as a journalist or a political candidate. Could you say a little about the codes and about those two particular clauses and what more you think we could do to help you with those?

Richard Wronka: I will cover the codes first. You are absolutely right that the Bill requires Ofcom to publish codes of practice, particularly on CSEA and on terror, as well as on fraudulent advertising and other areas. We are doing the work right now so that we are ready to progress with that process as soon as we get powers and duties, because it is really important that we are ready to move as quickly as possible. We will set out further detail on exactly how we plan to do that in a roadmap document that we are looking to publish before the summer break, so that will provide some of the detail.

A really important point here is that the Bill quite rightly covers a wide set of harms. We are mindful that a code which tried to cover every single harm could be counterproductive and confusing for platforms, even for those that want to comply and do the right thing. One of the balancing acts for us as we produce that code framework will be to get the right coverage for all the issues that everyone is rightly concerned about, but to do that in a way that is streamlined and efficient, so that services can apply the provisions of those codes.

Kevin Bakhurst: Shall I pick up on the second bit very quickly? I think you are right; this is one of our central concerns about the definitions. As far as possible, this should be a matter for Parliament. It is really important that Parliament has a view on this. Ultimately, the regulator will take a view based on what Parliament says. We have some experience in this area, but as Richard said, we recognise the challenge—it is extremely complex. We can see the policy intent of doing it, quite rightly, and the importance of enshrining freedom of expression as far as possible, but Parliament can help to add clarity and, as you rightly say, be aware of some of the potential loopholes. At the moment, someone could describe themselves as a citizen journalist; where does that leave us? I am not quite sure. Parliament could help to clarify that, and we would be grateful.

Dean Russell

Q Do the powers in the Bill cover enough to ensure that people will not be sent flashing images if they have photosensitive epilepsy?

Richard Wronka: This picks up the point we discussed earlier, which is that I understand that the Government are considering proposals from the Law Commission to criminalise the sending of those kinds of images. It would not be covered by the illegal content duties as things stand, but if the Government conclude that it is right to criminalise those issues, it would automatically be picked up by the Bill.

Even so, the regime is not, on the whole, going to be able to pick up every instance of harm. It is about making sure that platforms have the right systems and processes. Where there is clear harm to individuals, we would expect those processes to be robust. We know there is work going on in the industry on that particular issue to try and drive forward those processes.

--- Later in debate ---
The Chair

I will bring in Kim Leadbeater and then Maria Miller and Kirsty Blackman, but I will definitely bring in the Minister at 10.45 am.

Kim Leadbeater

Q Thank you, Ms Rees, and thank you to the witnesses. Many websites host pornography without necessarily being pornographic websites, meaning that children can easily stumble across it. Does the Bill do enough to tackle pornography when it is hosted on mainstream websites?

Dame Rachel de Souza: I have argued hard to get pornographic sites brought into the Bill. That is something very positive about the Bill, and I was really pleased to see that. Why? I have surveyed more than half a million children in my Big Ask survey and spoken recently to 2,000 children specifically about this issue. They are seeing pornography, mainly on social media sites—Twitter and other sites. We know the negative effects of that, and it is a major concern.

I am pleased to see that age assurance is in the Bill. We need to challenge the social media companies—I pull them together and meet them every six months—on getting this stuff off their sites and making sure that under-age children are not on their sites seeing some of these things. You cannot go hard enough in challenging the social media companies to get pornography off their sites and away from children.

Andy Burrows: Just to add to that, I would absolutely echo that we are delighted that part 5 of the Bill, with measures around commercial pornography, has been introduced. One of our outstanding areas of concern, which applies to pornography but also more broadly, is around clause 26, the children’s access assessment, where the child safety duties will apply not to all services but to services where there is a significant number of child users or children comprise a significant part of the user base. That would seem to open the door to some small and also problematic services being out of scope. We have expressed concerns previously about whether OnlyFans, for example, which is a very significant problem as a user-generated site with adult content, could be out of scope. Those are concerns that I know the Digital, Culture, Media and Sport Committee has recognised as well. We would very much like to see clause 26 removed from the Bill, which would ensure that we have a really comprehensive package in this legislation that tackles both commercial pornography and user-generated material.

The Chair

I think Lynn Perry is back. Are you with us, Lynn? [Interruption.] No—okay. We will move on to Maria Miller.

--- Later in debate ---
Kirsty Blackman

Q Just one more question. We know that women and minorities face more abuse online than men do. Is that something that you have found in your experience, particularly Twitter? What are you doing to ensure that the intersectionality of harms is considered in the work that you are doing to either remove or downgrade content?

Katy Minshall: That is absolutely the case and it has been documented by numerous organisations and research. Social media mirrors society and society has the problems you have just described. In terms of how we ensure intersectionality in our policies and approaches, we are guided by our trust and safety council, which is a network of dozens of organisations around the world, 10 of which are here in the UK, and which represents different communities and different online harms issues. Alongside our research and engagement, the council ensures that when it comes to specific policies, we are constantly considering a range of viewpoints as we develop our safety solutions.

Kim Leadbeater

Q Thank you, Chair, and thank you to the witnesses. I share your concerns about the lack of clarity regarding the journalistic content and democratic content exemptions. Do you think those exemptions should be removed entirely, or can you suggest what we might do to make them clearer in the Bill?

Katy Minshall: At the very least, there must be tighter definitions. I am especially concerned when it comes to the news publisher exemption. The Secretary of State has indicated an amendment that would mean that services like Twitter would have to leave such content up while an appeals process is ongoing. There is no timeline given. The definition in the Bill of a news publisher is, again, fairly vague. If Ben and I were to set up a news website, nominally have some standards and an email address where people could send complaints, that would enable it to be considered a news publisher under the Bill. If we think about some of the accounts that have been suspended from social media over the years, you can absolutely see them creating a news website and saying, “I have a case to come back on,” to Twitter or TikTok or wherever it may be.

Ben Bradley: We share those concerns. There are already duties to protect freedom of expression in clause 19. Those are welcome. It is the breadth of the definition of journalistic and democratic content that is a concern for us, particularly when it comes to things like the expedited and dedicated appeals mechanism, which those people would be able to claim if their content was removed. We have already seen people like Tommy Robinson on the far right present themselves as journalists or citizen journalists. Giving them access to a dedicated and expedited appeals mechanism is an area of concern.

There are different ways you could address that, such as greater clarity in those definitions and removing subjective elements. At the minute, whether content is journalistic turns on whether the user considers it to be journalistic; it is not an objective criterion but rests on their belief about their own content.

Also, if you look at something like the dedicated and expedited appeals mechanism, could you hold that in reserve so that, if a platform were found to be failing in its duties to journalistic content or in its freedom of expression duties, Ofcom could say, as it can in other areas of the Bill, “Okay, we believe that you need to create this dedicated mechanism, because you have failed to protect those duties”? That would, I think, minimise the risk of exploitation of that mechanism.

Kim Leadbeater

That is really helpful, thank you. A quick question—

The Chair

I am sorry, I have to interrupt because of time. Maria Miller.

Online Safety Bill (Second sitting)

Kim Leadbeater Excerpts
Committee stage
Tuesday 24th May 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 24 May 2022
The Chair

Okay. Kim Leadbeater, one very quick question. We must move on—I am sorry.

Kim Leadbeater (Batley and Spen) (Lab)

Q Okay, I will try to be very quick. The draft Bill contained a proposed new media literacy duty. That seems to have now disappeared. What are your digital media literacy strategies?

Becky Foreman: We have a range of strategies. One thing I would point to is research that we conduct every year and have done for a number of years called the digital civility index. It is a set of research that speaks to teens and adults in a number of countries around the world to understand what harms they are concerned about online and to ascertain whether those harms are increasing or decreasing and how they vary between different geographies. That is one way in which we are trying to make more data and information available to the general public about the type of harms they might come across online and whether they are increasing or decreasing.

Richard Earley: We work with a range of different organisations in the UK and internationally. One that I would like to draw attention to is the Economist Educational Foundation’s Burnet News Club. We have supported them with increased funding, with the aim of reaching 10% of all state schools with a really immersive and impressive programme that enables young people to understand digital literacy, digital numeracy and the media. We are also currently members of the Department for Digital, Culture, Media and Sport’s media literacy taskforce, which has been working to build on the strategy that the Government published.

Overall, there is a really important role for us as platforms to play here. We regularly commission and start new programmes in this space. What is also really important is to have more guidance from Government and civil society organisations that we work with on what is effective, so that we can know where we can put our resources and boost the greatest work.

Katie O'Donovan: Thank you for the question. It is really important. We were disappointed to see the literacy focus lost in the Bill.

We really take the issue seriously. We know there is an absolute responsibility for us when it comes to product, and an absolute responsibility when it comes to policy. Even within the safest products and with the most impressive and on-it parents, people can be exposed to content in ways that are surprising and shocking. That is why you need this holistic approach. We have long invested in a programme that we run with the non-governmental organisation Parent Zone called “Be internet legends”. When we developed that, we did it with the PSHE Association to make sure it was totally compliant with the national curriculum. We regularly review that to check that it is actually making a difference. We did some recent research with MORI and got some really good results back.

We used to deliver that programme face to face in schools up and down the country. Obviously, the pandemic stopped that. We went online and while we did not enjoy it quite as much, we were able to reach real scale and it was really effective. Along with doing the assemblies, which are now back in person, we deliver a pack for teachers so they can also take that up at scale. We run similar programmes through YouTube with teenagers. It is absolutely incumbent on us to do more, but it must be part of the debate, because if you rely just on technological solutions, you will end up reducing access to lawful information, with some of the harms still being prevalent and people not having the skills to navigate them.

The Chair

I am sorry, but I must move on. Minister, I am afraid you only have five minutes.

--- Later in debate ---
Kirsty Blackman

Q If someone has consented to take part in pornography and they later change their mind and would like it to be taken down, do you think they should have the right to ask a porn website, for example, to take it down?

Professor Clare McGlynn: That is quite challenging not only for pornography platforms but for sex workers, in that if you could participate in pornography but at any time thereafter withdraw your consent, it is difficult to understand how a pornography company and the sex worker would be able to make a significant amount of money. The company would be reluctant to invest because it might have to withdraw the material at any time. In my view, that is quite a challenge. I would not go down that route, because what it highlights is that the industry can be exploitative and that is where the concern comes from. I think there are other ways to deal with an exploitative porn industry and other ways to ensure that the material online has the full consent of participants. You could put some of those provisions into the Bill—for example, making the porn companies verify the age and consent of those who are participating in the videos for them to be uploaded. I think that is a better way to deal with that, and it would ensure that sex workers themselves can still contract to perform in porn and sustain their way of life.

Kim Leadbeater

Q Thank you very much—this is extremely interesting and helpful. You have covered a lot of ground already, but I wonder whether there is anything specific you think the Bill should be doing more about, to protect girls—under-18s or under-16s—in particular?

Janaya Walker: A lot of what we have discussed in terms of naming violence against women and girls on the face of the Bill includes children. We know that four in five offences of sexual communications with a child involved girls, and a lot of child abuse material is targeted at girls specifically. The Bill as a whole takes a very gender-neutral approach, which we do not think is helpful; in fact, we think it is quite harmful to trying to reduce the harm that girls face online.

This goes against the approach taken in the Home Office violence against women and girls strategy and its domestic abuse plan, as well as the gold-standard treaties the UK has signed up to, such as the Istanbul convention, which we signed and have recently committed to ratifying. The convention states explicitly that domestic laws, including on violence against women and girls online, need to take a very gendered approach. Currently, it is almost implied, with references to specific characteristics. We think that in addressing the abuse that girls, specifically, experience, we need to name girls. To clarify, the words “women”, “girls”, “gender” and “sex” do not appear in the Bill, and that is a problem.

Jessica Eagelton: May I add a point that is slightly broader than your question? Another thing that the Bill does not do at the moment is provide for specialist victim support for girls who are experiencing online abuse. There has been some discussion about taking a “polluter pays” approach; where platforms are not compliant with the duties, for example, a percentage of the funds that go to the regulator could go towards victim support services, such as the revenge porn helpline and Refuge’s tech abuse team, that provide support to victims of abuse later on.

Professor Clare McGlynn: I can speak to pornography. Do you want to cover that separately, or shall I do that now?

Kim Leadbeater

That is fine.

Professor Clare McGlynn: I know that there was a discussion this morning about age assurance, which obviously targets children’s access to pornography. I would emphasise that age assurance is not a panacea for the problems with pornography. We are so worried about age assurance only because of the content that is available online. The pornography industry is quite happy with age verification measures. It is a win-win for them: they get public credibility by saying they will adopt it; they can monetise it, because they are going to get more data—especially if they are encouraged to develop age verification measures, which of course they have been; that really is putting the fox in charge of the henhouse—and they know that it will be easily evaded.

One of the most recent surveys of young people in the UK was of 16 and 17-year-olds: 50% of them had used a VPN, which avoids age verification controls, and 25% more knew about that, so 75% of those older children knew how to evade age assurance. This is why the companies are quite happy—they are going to make money. It will stop some people stumbling across it, but it will not stop most older children accessing pornography. We need to focus on the content, and when we do that, we have to go beyond age assurance.

You have just heard Google talking about how it takes safety very seriously. Rape porn and incest porn are one click away on Google. They are freely and easily accessible. There are swathes of that material on Google. Twitter is hiding in plain sight, too. I know that you had a discussion about Twitter this morning. I, like many, thought, “Yes, I know there is porn on Twitter,” but I must confess that until doing some prep over the last few weeks, I did not know the nature of that porn. For example, “Kidnapped in the wood”; “Daddy’s little girl comes home from school; let’s now cheer her up”; “Raped behind the bin”—this is the material that is on Twitter. We know there is a problem with Pornhub, but this is what is on Twitter as well.

As the Minister mentioned this morning, Twitter says you have to be 13, and you have to be 18 to try to access much of this content, but you just put in whatever date of birth is necessary—it is that easy—and you can get all this material. It is freely and easily accessible. Those companies are hiding in plain sight in that sense. The age verification and age assurance provisions, and the safety duties, need to be toughened up.

To an extent, I think this will come down to the regulator. Is the regulator going to accept Google’s SafeSearch as satisfying the safety duties? I am not convinced, because of the easy accessibility of the rape and incest porn I have just talked about. I emphasise that incest porn is not classed as extreme pornography, so it is not a priority offence, but there are swathes of that material on Pornhub as well. In one of the studies that I did, we found that one in eight titles on the mainstream pornography sites described sexually violent material, and the incest material was the highest category in that. There is a lot of that around.

Barbara Keeley

Q We are talking here about pornography when it is hosted on mainstream websites, as opposed to pornographic websites. Could I ask you to confirm what more, specifically, you think the Bill should do to tackle pornography on mainstream websites, as you have just been describing with Twitter? What should the Bill be doing here?

Professor Clare McGlynn: In many ways, it is going to be up to the regulator. Is the regulator going to deem that things such as SafeSearch, or Twitter’s current rules about sensitive information—which rely on the host to identify their material as sensitive—satisfy their obligations to minimise and mitigate the risk? That is, in essence, what it will all come down to.

Are they going to take the terms and conditions of Twitter, for example, at face value? Twitter’s terms and conditions do say that they do not want sexually violent material on there, and they even say that it is because they know it glorifies violence against women and girls, but this material is there and does not appear to get swiftly and easily taken down. Even when you try to block it—I tried to block some cartoon child sexual abuse images, which are easily available on there; you do not have to search for them very hard, it literally comes up when you search for porn—it brings you up five or six other options in case you want to report them as well, so you are viewing them as well. Just on the cartoon child sexual abuse images, before anyone asks, they are very clever, because they are just under the radar of what is actually a prohibited offence.

It is not necessarily that there is more that the Bill itself could do, although the code of practice would ensure that they have to think about these things more. They have to report on their transparency and their risk assessments: for example, what type of content are they taking down? Who is making the reports, and how many are they upholding? But it is then on the regulator as to what they are going to accept as acceptable, frankly.

--- Later in debate ---
Mrs Miller

Q Ian, how do you drive a culture change in the sector?

Ian Stevenson: I think you have to look at the change you are trying to effect. For many people in the sector, there is a lack of awareness about what happens when the need to consider safety in building features is not put first. Even when you realise how many bad things can happen online, if you do not know what to do about it, you tend not to be able to do anything about it.

If we want to change culture—it is the same for individual organisations as for the sector as a whole—we have to educate people on what the problem is and give them the tools to feel empowered to do something about it. If you educate and empower people, you remove the barrier to change. In some places, an extremely ethical people-centric and safety-focused culture very naturally emerges, but in others, less so. That is precisely where making it a first-class citizen in terms of risk assessment for boards and management becomes so important. When people see management caring about things, that gets pushed out through the organisations.

Kim Leadbeater

Q In your view, what needs to be added or taken away from the Bill to help it achieve the Government’s aim of making the UK “the safest place in the world to be online”?

Lulu Freemont: First, I want to outline that there are some strong parts in the Bill that the sector really supports. I think the majority of stakeholders would agree that the objectives are the right ones. The Bill tries to strike a balance between safety, free speech and encouraging innovation and investment in the UK’s digital economy. The approach—risk-based, systems-led and proportionate—is the right one for the 25,000 companies that are in scope. As it does not focus on individual pieces of content, it has the potential to be future-proof and to achieve longer-term outcomes.

The second area in the Bill that we think is strong is the prioritisation of illegal content. We very much welcome the clear definitions of illegal content on the face of the Bill, which are incredibly useful for businesses as they start to think about preparing for their risk assessment on illegal content. We really support Ofcom as the appropriate regulator.

There are some parts of the Bill that need specific focus and, potentially, amendments, to enable it to deliver on those objectives without unintended consequences. I have already mentioned a few of those areas. The first is defining harmful content in primary legislation. We can leave it to codes to identify the interpretations around that, but we need definitions of harmful content so that businesses can start to understand what they need to do.

Secondly, we need clarity that businesses will not be required to monitor every piece of content as a result of the Bill. General monitoring is prohibited in other regions, and we have concerns that the Online Safety Bill is drifting away from those norms. The challenges of general monitoring are well known: it encroaches on individual rights and could result in the over-removal of content. Again, we do not think that the intention is to require companies of all sizes to look at every piece of content on their site, but it might be one of the unintended consequences, so we would like an explicit prohibition of general monitoring on the face of the Bill.

We would like to remove the far-reaching amendment powers of the Secretary of State. We understand the need for technical powers, which are best practised within regulation, but taking those further so that the Secretary of State can amend the regime in such an extreme way to align with public policy is of real concern, particularly to smaller businesses looking to confidently put in place systems and processes. We would like some consideration of keeping senior management liability as it is. Extending that further is only going to increase the chilling impact that it is having and the environment it is creating within UK investment. The final area, which I have just spoken about, is clarifying the scope. The business-to-business companies in our membership need clarity that they are not in scope and for that intention to be made clear on the face of the Bill.

We really support the Bill. We think it has the potential to deliver. There are just a few key areas that need to be changed or amended slightly to provide businesses with clarity and reassurances that the policy intentions are being delivered on.

Adam Hildreth: To add to that—Lulu has covered absolutely everything, and I agree—the critical bit is not monitoring individual pieces of content. Once you have done your risk assessment and put in place your systems, processes, people and technology, that is what people are signing up for. They are not signing up for this end assessment where, because you find that one piece of harmful content exists, or maybe many, you have failed to abide by what you are really signing up to.

That is the worry from my perspective: that people do a full risk assessment, implement all the systems, put in place all the people, technology and processes that they need, do the best job they can and have understood what investment they are putting in, and someone comes along and makes a report to a regulator—Ofcom, in this sense—and says, “I found this piece of content there.” That may expose weaknesses, but the very best risk assessments are ongoing ones anyway, where you do not just put it away in a filing cabinet somewhere and say, “That’s done.” The definitions of online harms and harmful content change on a daily basis, even for the biggest social media platforms; they change all the time. There was talk earlier about child sexual abuse material that appears as cartoons, which would not necessarily be defined by certain legislation as illegal. Hopefully the legislation will catch up, but that is where that risk assessment needs to be made again, and policies may need to be changed and everything else. I just hope we do not get to the point where the individual monitoring of content, or content misses, is the goal of the Bill—that the approach taken to online safety is this overall one.

The Chair

Thank you. I call the Minister.

--- Later in debate ---
Kirsty Blackman

Q Finally, do you think it would be desirable for Ofcom to consider a system with more consistency in parental controls, so that parents can always ensure that their children cannot talk to anybody outside their circle? Would that be helpful?

Dr Rachel O'Connell: There is a history of parental controls, and only 36% of parents use them. Ofcom research consistently says that it is 70%, but in reality, it is lower. When using age verification, the parents are removing the ability to watch everything. It is a platform; they are providing the digital playground. In the same way, when you go on swings and slides, there is bouncy tarmac because you know the kids are going to use them. It is like creating that health and safety environment in a digital playground.

When parents receive a notification that their child wants to access something, there could be a colour-coded nutrition-style thing for social media, livestreaming and so on, and the parents could make an informed choice. It is then up to the platform to maintain that digital playground and run those kinds of detection systems to see if there are any bad actors in there. That is better than parental controls because the parent is consenting and it is the responsibility of the platform to create the safer environment. It is not the responsibility of the parent to look over the child’s shoulder 24/7 when they are online.

Kim Leadbeater

Q The age verification stuff is really interesting, so thank you to our witnesses. On violence against women and girls, clauses 150 to 155 set out three new communications offences. Do you think those offences will protect women from receiving offensive comments, trolling and threats online? What will the Bill mean for changing the way you manage those risks on your platforms?

Jared Sine: I do not know the specific provisions but I am familiar with the general concept of them. Any time you put something in law, it can either be criminalised or have enforcement behind it, and I think that helps. Ultimately, it will be up to the platforms to come up with innovative technologies or systems such as “Are You Sure?” and “Does This Bother You?” which say that although the law says x, we are going to go beyond that to find tools and systems that make it happen on our platform. Although I think it is clearly a benefit to have those types of provisions in law, it will really come down to the platforms taking those extra steps in the future. We work with our own advisory council, which includes the founder of the #MeToo movement, REIGN and others, who advise us on how to make platforms safer for those things. That is where the real bread gets buttered, so to speak.

Kim Leadbeater

Q Do you think that is consistent across your industry? It sounds like you are taking a very proactive approach to it.

Jared Sine: We are proactive about it, and I know our colleagues and friends over at Bumble are proactive about it as well. Our heads of trust and safety both came from the same company—Uber—before coming to us, so I know that they compare notes quite regularly. Because of the way the legislation is set up, there can be codes of conduct applying specifically to online dating, and to the extent that that technology exists, you need to deploy it.

--- Later in debate ---
The Chair

Thank you. One question from Kim Leadbeater.

Kim Leadbeater

Q Thank you for your very powerful testimony, Rhiannon. I appreciate that could not have been easy. Going back to the digital literacy piece, it feels like we were talking about digital literacy in the Bill when it started coming through, and that has been removed now. How important do you think it is that we have a digital literacy strategy, and that we hold social media providers in particular to having a strategy on digital education for young people?

Rhiannon-Faye McDonald: It is incredibly important that we have this education piece. Like Susie said, we cannot rely on technology or any single part of this to solve child sexual abuse, and we cannot rely on the police to arrest their way out of the problem. Education really is the key. That is education in all areas—educating the child in an appropriate way and educating parents. We hold parenting workshops. Parents are terrified; they do not know what to do, what platforms are doing what, or what to do when things go wrong. They do not even know how to talk to children about the issue; it is embarrassing for them and they cannot bring it up. Educating parents is a huge thing. Companies have a big responsibility there. They should have key strategies in place on how they are going to improve education.

Chris Philp

Q Can I start by thanking both Rhiannon-Faye and Susie for coming and giving evidence, and for all the work they are doing in this area? I know it has been done over many years in both cases.

I would like to pick up on a point that has arisen in the discussion so far—the point that Susie raised about the risks posed by Meta introducing end-to-end encryption, particularly on the Facebook Messenger service. You have referenced the fact that huge numbers of child sexual exploitation images are identified by scanning those communications, leading to the arrests of thousands of paedophiles each year. You also referenced the fact that when this was temporarily turned off in Europe owing to the privacy laws there—briefly, thankfully—there was a huge loss of information. We will come on to the Bill in a minute, but as technology stands now, if Meta did proceed with end-to-end encryption, would that scanning ability be lost?

Susie Hargreaves: Yes. It would not affect the Internet Watch Foundation, but it would affect the National Centre for Missing and Exploited Children. Facebook, as a US company, has a responsibility to do mandatory reporting to NCMEC, which will be brought in with the Bill in this country. Those millions of images would be lost, as of today, if they brought end-to-end encryption in now.

--- Later in debate ---
The Chair

We are playing “Beat the clock”. I am going to ask for brief answers and brief questions, please. I will take one question from Kim Leadbeater and one from Barbara Keeley.

Kim Leadbeater

Q Gosh, right. I think we are clear that your view is that these two exceptions could potentially do more harm than good. The ideal scenario from your perspective would be to remove them, but again, the challenge is how we balance the freedom of speech issue with protecting the rights of people online who are vulnerable to abuse and harassment. How would you respond to those who say that the Bill risks setting an unwitting precedent for non-democratic countries that would seek to restrict the freedom of expression of their citizens?

Ellen Judson: There is absolutely a risk of over-moderation, and of the Bill incentivising over-moderation, particularly because of the very heavy content focus. Even with illegal content, there is a very broad range of content that companies are expected proactively to monitor for, even when the technical systems to identify that content reliably at scale are perhaps not in place. I absolutely understand and share the concern about over-moderation.

Our response would be that we should look to strengthen the freedom of expression duties currently in the Bill. At the moment, there is a quite vague duty to have regard to the importance of freedom of expression, but it is not at all clear what that would actually mean, and what would be expected from the platforms. One change we would want would be for rights—including freedom of expression and privacy—to be included in the online safety objectives, and to establish that part of the purpose of this regime is to ensure that services are being designed to protect and promote human rights, including freedom of expression. We think that would be a way to bring freedom of expression much more into the centre of the regime and the focus of the Bill, without having to have those add-on exemptions after the fact.

Kyle Taylor: And it creates a level playing field—it says, “These rules apply to everyone equally.”

On the second point, authoritarian—absolutely—but the other area that is really important is fragile democracies. For example, if you look at Hungary, just last week Viktor Orbán said, “You know what you need? Your own media.” If we are setting a standard that says it is totally fine to exempt people in politics and media, then for those fragile democracies that control most aspects of information sharing, we are explicitly saying that it is okay to privilege them over others. That is a very dangerous precedent to set when we have the opportunity to set best global standards here with the Bill.

The Chair

Barbara Keeley?

Online Safety Bill (Third sitting)

Kim Leadbeater Excerpts
Committee stage & Committee Debate - 3rd sitting
Thursday 26th May 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 26 May 2022
Mrs Miller

Q Can I bring Lorna in here? We are talking about moving from content to the drivers of harm. Where would you suggest that should be achieved within the Bill?

Professor Lorna Woods: I think by an overarching risk assessment rather than one that is broken down into the different types of content, because that, in a way, assumes a certain knowledge of the type of content before you can do a risk assessment, so you are into a certain circular mode there. Rather than prejudging types of content, I think it would be more helpful to look at what is there and what the system is doing. Then we could look at what a proportionate response would be—looking, as people have said, at the design and the features. Rather than waiting for content to be created and then trying to deal with it, we could look at more friction at an earlier stage.

If I may add a technical point, I think there is a gap relating to search engines. The draft Bill excluded paid-for content advertising. It seems that, for user-to-user content, this is now in the Bill, bringing it more into line with the current standards for children under the video-sharing platform provisions. That does not apply to search. Search engines have duties only in relation to search content, and search content excludes advertising. That means, as I read it, that search engines would have absolutely no duties to children under their children safety duty in relation to advertising content. You could, for example, target a child with pornography and it would fall outside the regime. I think that is a bit of a gap.

Kim Leadbeater (Batley and Spen) (Lab)

Q Thank you, witnesses, for your time this morning. I am going to focus initially on journalistic content. Is it fair that the platforms themselves are having to try to define what journalistic content is and, by default, what a journalist is? Do you see a way around this?

William Moy: No, no, yes. First, no, it is not fair to put that all on the platforms, particularly because—I think this a crucial thing for the Committee across the Bill as a whole—for anything to be done at internet scale, it has to be able to be done by dumb robots. Whatever the internet companies tell you about the abilities of their technology, it is not magic, and it is highly error-prone. For this duty to be meaningful, it has to be essentially exercised in machine learning. That is really important to bear in mind. Therefore, being clear about what it is going to tackle in a way that can be operationalised is important.

To your second point, it is really important in this day and age to question whether journalistic content and journalists equate to one another. I think this has come up in a previous session. Nowadays, journalism, or what we used to think of as journalism, is done by all kinds of people. That includes the same function of scrutiny and informing others and so on. It is that function that we care about—the passing of information between people in a democracy. We need to protect that public interest function. I think it is really important to get at that. I am sure there are better ways of protecting the public interest in this Bill by targeted protections or specifically protecting freedom of expression in specific ways, rather than these very broad, vague and general duties.

Kim Leadbeater

Q Is there a body that sets out a framework around journalistic standards that the Bill could possibly refer to?

William Moy: No.

William Perrin: At Carnegie, in our earliest work on this in 2018, we were very clear that this Bill should not be a route to regulating the press and media beyond what the social settlement was. Many people are grumpy about that settlement, and many people are happy with it, but it is a classic systems tension. We welcome the Government’s attempt to carve journalism out one way or another, but there is still a great problem in defining journalists and journalism.

I think some of the issues around news provider organisations do give a sense in the Bill of a heavy-duty organisation, not some fly-by-night thing that has been set up to evade the rules. As Will was pointing out, the issue then comes down to individual journalists, who are plying their trade in new ways that the new media allows them to do. I remember many years ago, when I ran a media business, having a surreal meeting at DCMS during Leveson, where I had to explain to them what a blogger was. Sadly, we have not quite yet got that precision on how one achieves the intended effect around, in particular, individual journalists.

Professor Lorna Woods: I emphasise what Mr Moy said about the fact that this is going to have to be a system. It is not a decision on every individual item of content, and it is not about a decision on individual speakers. It is going to be about how the characteristics that we care about—the function of journalism—are recognised in an automated system.

On the drafting of the Bill, I wonder whether there is any overlap between the user-generated content and citizen journalism in clause 16 and the recognition in clause 15 of user-generated content in relation to democratic speech. I am not sure whether one is not a subset of the other.

Kim Leadbeater

Q What would you change about clauses 15 and 16? Is there an argument that they should not be there at all?

Professor Lorna Woods: I have to confess that I have not really looked at them in great detail, although I have read them. I do not think they work, but I have not got to a solution because that is actually quite a difficult thing to define.

William Moy: I should declare an interest in clause 15 and the news publisher content exemption, because Full Fact would be covered by that exemption. I do not welcome that; I find it very awkward that we could be fact-checking things and some of the people we are fact-checking would not be covered by the exemption.

It is regrettable that we are asking for those exemptions in the Bill. The Bill should protect freedom of expression for everyone. Given the political reality of that clause, it does not do the job that it tries to do. The reason why is essentially because you can set yourself up to pass the test in that clause very easily. The Minister asked about that in a previous session and recognised that there is probably room to tighten the drafting, and I am very happy to work with his officials and talk about how, if that is Parliament’s political intention, we can do it in as practical a way as possible.

Kim Leadbeater

Q Thank you. How could the Bill protect people who are involved in elections, be they parliamentary candidates, people standing in local elections, staff, or election officers? Could that be worked on, and where would it go in the Bill?

William Perrin: The Bill is a risk-management regime. As part of a risk-management regime, one should routinely identify people who are at high risk and high-risk events, where they intersect and how you assess and mitigate that risk. As someone who was a civil servant for 15 years and has worked in public policy since, I hugely respect the functioning of the election process. At the very extreme end, we have seen hideous events occur in recent years, but there is also the routine abuse of politicians and, to some extent, an attempt to terrorise women politicians off certain platforms, which has been quite grotesque.

I feel that there is a space, within the spirit of the Bill as a risk-management regime, to draw out the particular risks faced by people who participate in elections. They are not just candidates and office holders, as you say, but the staff who administer elections—we saw the terrible abuse heaped on them in recent American elections; let us hope that that does not come across here—and possibly even journalists, who do the difficult job of reporting on elections, which is a fundamental part of democracy.

The best way to address those issues might be to require Ofcom to produce a straightforward code of practice—particularly for large, category 1 platforms—so that platforms regard elections and the people who take part in them as high-risk events and high-harm individuals, and take appropriate steps. One appropriate step would be to do a forward look at what the risks might be and when they might arise. Every year, the BBC produces an elections forward look to help it manage the particular risks of public service broadcasting around elections. Could a platform be asked to produce and publish an elections forward look, discussing with people who take part in elections their experience of the risks that they face and how best to mitigate them in a risk-management regime? That could also involve the National Police Chiefs’ Council, which already produces guidance at each election.

We are sitting here having this discussion in a highly fortified, bomb-proof building surrounded by heavily armed police. I do not think any member of the public would begrudge Members of Parliament and the people who come here that sort of protection. We sometimes hear the argument that MPs should not be recognised as special or get special protection. I do not buy that; no one begrudges the security here. It is a simple step to ask platforms to do a risk assessment that involves potential victims of harm, and to publish it and have a dialogue with those who take part, to ensure that the platforms are safe places for democratic discussion.

Kim Leadbeater

Q Thank you. Just to finish, you are right that the point people have made is, “Why should MPs or elected officials be any different from anybody else?” I understand that. What worries me, from some of the work I have done, is that this is about not just the safety of human beings but the impact on democracy. Threatening and abusive behaviour directed at elected politicians can affect the way they feel about doing their job, and that worries me. Do you think it should be a specific stand-alone offence to send harmful or threatening communications to elected people—MPs, councillors, Mayors or police and crime commissioners? Do you think that warrants a separate, stand-alone offence?

William Perrin: The Government have, to their credit, introduced in this Bill offences of sending messages with the intent to harm, but it will take many years for them to work their way through CPS guidance and to establish a body of case law so that it is understood how they are applied. Of course, these cases are heard in magistrates courts, so they do not get reported very well.

One of the reasons we are here discussing this is that the criminal law has failed to provide adequate measures of public protection across social media. If the criminal law and the operation of the police and the CPS worked, we would not need to have this discussion. This discussion is about a civil regulatory regime to make up for the inadequacies in the working of the criminal law, and about making it work a little smoother. We see that in many areas of regulated activity. I would rather get a quicker start by doing some risk assessment and risk mitigation before, in many years’ time, one gets to an effective operational criminal offence. I note that the Government suggested such an offence a few years ago, but I am not quite clear where it got to.

William Moy: To echo Ms Leadbeater’s call for a holistic approach to this, treating as criminal some of the abuse that MPs receive is entirely appropriate. The cost to all of us of women and people of colour being deterred from public life is real and serious. There is also the point that the Bill deals only with personal harms, and a lot of the risk to elections is risk to the democratic system as a whole. You are absolutely right to highlight that that is a gap in what the Bill is doing. We think, certainly from a misinformation point of view, that you cannot adequately address the predictable misinformation and disinformation campaigns around elections simply by focusing on personal harm.

Kim Leadbeater

Thank you.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

Q Thank you to the witnesses for joining us and giving us such thorough and clear responses to the various questions. I want to start on a topic that William Perrin and William Moy touched on—the exemption for recognised news publishers, set out in clause 50. You both said you have some views on how that is drafted. As you said, I asked questions on Tuesday about whether there are ways in which it could be improved to avoid loopholes—not that I am suggesting there are any, by the way. Mr Perrin and Mr Moy, could you elaborate on the specific areas where you think it might be improved?

William Moy: Essentially, the tests are such that almost anyone could pass them. Without opening the Bill, you have to have a standards code, which you can make up for yourself, a registered office in the UK and so on. It is not very difficult for a deliberate disinformation actor to pass the set of tests in clause 50 as they currently stand.

--- Later in debate ---
Alex Davies-Jones

Q Thank you. The Antisemitism Policy Trust has made the case that search services should be eligible for inclusion as a high-risk category. Is that still your position? What is the danger, currently, of excluding them from that provision?

Danny Stone: Very much so. You heard earlier about the problems with advertising. I recognise that search services are not the same as user-to-user services, so there does need to be some different thinking. However, at present, they are not required to address legal harms, and the harms are there.

I appeared before the Joint Committee on the draft Bill and talked about Microsoft Bing, which, in its search bar, was prompting people with “Jews are” and then a rude word. You look at “Gays are”, today, and it is prompting people with “Gays are using windmills to waft homosexual mists into your home”. That is from the search bar. The first return is a harmful article. Do the same in Google, for what it’s worth, and you get “10 anti-gay myths debunked.” They have seen this stuff. I have talked to them about it. They are not doing the work to try to address it.

Last night, using Amazon Alexa, I searched “Is George Soros evil?” and the response was, “Yes, he is. According to an Alexa Answers contributor, every corrupt political event.” “Are the White Helmets fake?” “Yes, they are set up by an ex-intelligence officer.” The problem with that is that the search prompts—the things that you are being directed to; the systems here—are problematic, because one person could give an answer to Amazon and that prompts the response. The second one, about the White Helmets, was a comment on a website that led Alexa to give that answer.

Search returns are not necessarily covered because, as I say, they are not the responsibility of the internet companies, but the systems that they design as to how those things are indexed and the systems to prevent them going to harmful sites by default are their responsibility, and at present the Bill does not address that. Something that forces those search companies to have appropriate risk assessments in place for the priority harms that Parliament sets, and to enforce those terms and conditions consistently, would be very wise.

Kim Leadbeater

Q Thank you to the witnesses for joining us today. The Bill contains duties to protect content of “democratic importance” and “journalistic content”. What is your view of these measures and their likely effectiveness?

Liron Velleman: These are both pretty dangerous clauses. We are very concerned about what I would probably be kind and call their unintended consequences. They are loopholes that could allow some of the most harmful and hateful actors to spread harm on social media. I will take “journalistic” first and then move on to “democratic”.

A number of companies mentioned in the previous evidence session are outlets that could be media publications just by adding a complaints system to their website. There is a far-right outlet called Urban Scoop that is run by Tommy Robinson. They just need to add a complaints system to their website and then they would be included as a journalist. There are a number of citizen journalists who specifically go to our borders to harass people who are seeking refuge in this country. They call themselves journalists; Tommy Robinson himself calls himself a journalist. These people have been specifically taken off platforms because they have repeatedly broken the terms of service of those platforms, and we see this as a potential avenue for them to make the case that they should return.

We also see mainstream publications falling foul of the terms of service of social media companies. If I take the example of the Christchurch massacre, social media companies spent a lot of time trying to take down both the livestream of the attack in New Zealand and the manifesto of the terrorist, but the manifesto was then put on the Daily Mail website—you could download the manifesto straight from the Daily Mail website—and the livestream was on the Daily Mirror and The Sun’s websites. We would be in a situation where social media companies could take that down from anyone else, but they would not be able to take it down from those news media organisations. I do not see why we should allow harmful content to exist on the platform just because it comes from a journalist.

On “democratic”, it is still pretty unclear what the definition of democratic speech is within the Bill. If we take it to be pretty narrow and just talk about elected officials and candidates, we know that far-right organisations that have been de-platformed from social media companies for repeatedly breaking the terms of service—groups such as Britain First and, again, Tommy Robinson—are registered with the Electoral Commission. Britain First ran candidates in the local elections in 2022 and they are running in the Wakefield by-election, so, by any measure, they are potentially of “democratic importance”, but I do not see why they should be allowed to break terms of service just because they happen to have candidates in elections.

If we take it on a wider scale and say that it is anything of “democratic importance”, anyone who is looking to cause harm could say, “A live political issue is hatred of the Muslim community.” Someone could argue that that or the political debate around the trans community in the UK is a live political debate, and that would allow anyone to go on the platform and say, “I’ve got 60 users and I’ve got something to say on this live political issue, and therefore I should be on the platform,” in order to cause that harm. To us, that is unacceptable and should be removed from the Bill. We do not want a two-tier internet where some people have the right to be racist online, so we think those two clauses should be removed.

Stephen Kinsella: At Clean up the Internet this is not our focus, although the proposals we have made, which we have been very pleased to see taken up in the Bill, will certainly introduce friction. We keep coming back to friction being one of the solutions. I am not wearing this hat today, but I am on the board of Hacked Off, and if Hacked Off were here, I think they would say that the solution—although not a perfect solution—might be to say that a journalist, or a journalistic outlet, will be one that has subjected itself to proper press regulation by a recognised press regulator. We could then possibly take quite a lot of this out of the scope of social media regulation and leave it where I think it might belong, with proper, responsible press regulation. That would, though, lead on to a different conversation about whether we have independent press regulation at the moment.

Kim Leadbeater

Q I think someone has alluded to this already, but should the comments section on news publisher platforms be included in the scope of the Bill?

Danny Stone: I feel quite strongly that they should. I think this is about clauses 39(2) and (5). When they had an exemption last time, we were told that they were already regulated because various newspapers have their own systems under IPSO or whatever it might be. There was a written question in the House from Emma Hardy, and the Government responded that they had no data—no assessment of moderator system effectiveness or of the harms caused. The Secretary of State told the DCMS Select Committee that he was confident that these platforms have appropriate moderation policies in place, but was deeply sceptical about IPSO involvement. The Law Commission said that it was not going to give a legal exemption to comments boards because they host an abundance of harmful material and abuse, and there are articles in, say, The Times:

“Pro-Kremlin trolls have infiltrated the reader comments on the websites of news organisations, including The Times, the Daily Mail and Fox News, as part of a ‘major influence operation’”.

A number of years ago, we worked—through the all-party parliamentary group against antisemitism, to which we provide the secretariat—on a piece with the Society of Editors on comment moderation on websites, so there have been efforts in the past, but this is a place where there is serious harm caused. You can go on The Sun or wherever now and find comments that will potentially be read by millions of people, so having some kind of appropriate risk assessment, minimum standard or quality assurance in respect of comments boards would seem to be a reasonable step. If it does not get into the Bill, I would in any event urge the Minister to develop some guidance or work with the industry to ensure they have some of those standards in place, but ideally, you would want to lose that carve-out in the Bill.

Kim Leadbeater

Q Thank you. Stephen, just to finish—

The Chair

Just a short question.

Kim Leadbeater

Yes, sorry. Is there a body that sets a framework around journalistic standards that the Bill could refer to?

Stephen Kinsella: Obviously, there are the regulators. There is IMPRESS and IPSO, at the very least. I am afraid that I do not know the answer; there must also be journalistic trade bodies, but the regulators would probably be the first port of call for me.

Kim Leadbeater

Thank you.

Caroline Ansell (Eastbourne) (Con)

Q May I ask about anonymity? It is mentioned in the Bill, but only once. Do you think there is a need for more expansive coverage of this issue? Do you think people should be able to use the internet while remaining anonymous, and if not, to whom would users disclose their identity? Would it be to the platform, or would it be more publicly than that?

Stephen Kinsella: There are a few questions there, obviously. I should say that we are happy with the approach in the Bill. We always felt that focusing on anonymity was the wrong place to start. Instead, we thought that a positive right to be verified, and then a right to screen out replies and posts from unverified accounts, was the way to go.

In terms of who one should make the disclosure to, or who would provide the verification, our concern was always that we did not want to provide another trove of data that the platforms could use to target us with adverts and otherwise monetise. While we have tried to be agnostic on the solution—again, we welcome the approach in the Bill, which is more about principles and systems than trying to pick outcomes—there are third-party providers out there that could provide one-stop verification. Some of them, for instance, rely on the open banking principles. The good thing about the banks is that under law, under the payment services directive and others, we are the owners of our own data. It is a much greyer area whether we are the owners of the data that the social media platforms hold on us, so using that data that the banks have—there is a solution called One ID, for instance—they will provide verification, and you could then use that to open your social media accounts without having to give that data to the platforms.

I saw in the evidence given to you on Tuesday that it was claimed that 80% of users are reluctant to give their data to platforms. We were surprised by that, and so we looked at it. They chose their words carefully. They said users were reluctant to give their data to “certain websites”. What they meant was porn sites. In the polling they were referring to, the question was specifically about willingness to share data with porn sites, and people are, understandably, reluctant to do that. When using open banking or other systems, there are good third-party providers, I would suggest, for verification.

Online Safety Bill (Fourth sitting)

Kim Leadbeater Excerpts
Committee stage & Committee Debate - 4th sitting
Thursday 26th May 2022


Public Bill Committees
Mrs Miller

Q Working closely with Ofcom is really good, but do you think there needs to be a duty to co-operate with Ofcom, or indeed with other regulators—to be specified in the Bill—in case relations become more tense in future?

Stephen Almond: The Bill has, in my view, been designed to work closely alongside data protection law. It supports effective co-operation between us and Ofcom by requiring and setting out a series of duties for Ofcom to consult with the ICO on the development of any codes of practice or formal guidance with an impact on privacy. With that framework in mind, I do not think there is a case to instil further co-operation duties in that way. I hope I can give you confidence that we and Ofcom will be working tirelessly together to promote the safety and privacy of citizens online. It is firmly in our interests and in the interest of society as a whole to do so.

Kim Leadbeater (Batley and Spen) (Lab)

Q Thank you for joining us, Mr Almond. You stated the aim of making the UK the

“safest place in the world to be online”.

In your view, what needs to be added or taken away from the Bill to achieve that?

Stephen Almond: I am not best placed to comment on the questions of online safety and online harms. You will speak to a variety of different experts who can comment on that point. From my perspective as a digital regulator, one of the most important things will be ensuring that the Bill is responsive to future challenges. The digital world is rapidly evolving, and we cannot necessarily envisage all the developments in technology that will come, or the emergence of new harms. The data protection regime is a principles-based piece of legislation. That gives us a great degree of flexibility and discretion to adapt to novel forms of technology and to provide appropriate guidance as challenges emerge. I really recommend retaining that risk-based, principles-based approach to regulation that is envisaged currently in the Online Safety Bill.

Kim Leadbeater

Q There has been much talk about trying to future-proof the Bill. Is there anything you could recommend that should be in the Bill to try to help with that?

Stephen Almond: Again, I would say that the most important thing I can recommend around this is to retain that flexibility within the Bill. I know that a temptation will emerge to offer prescription, whether for the purpose of giving companies clarity today or for addressing present harms, but it is going to be really important to make sure that there is due flexibility to enable the legislation to be responsive to future harms.

Kim Leadbeater

Q Under clause 40, the Secretary of State can modify codes of practice to reflect public policy. How do you respond to criticism that this provision risks undermining the independence of the regulator?

Stephen Almond: Ultimately, it is for Ofcom to raise any concerns about the impact of the regime, as set out by its ability to apply its duties appropriately, independently and with due accountability to Parliament and the public. As a regulator, I would say that it is important to have a proper and proportionate degree of independence, so that businesses and the public can have trust in how regulation is carried out. Ultimately though, it is for Government and Parliament to determine what the right level of independence is.

Kim Leadbeater

Q You have no concerns about that.

Stephen Almond: No.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

Q Mr Almond, welcome to the Committee. Thank you for joining us this afternoon. Can I start with co-operation? You mentioned a moment ago in answer to Maria Miller that co-operation between regulators, particularly in this context the ICO and Ofcom, was going to be very important. Would you describe the co-operative work that is happening already and that you will be undertaking in the future, and comment on the role that the Digital Regulation Cooperation Forum has in facilitating that?

Stephen Almond: Thank you very much. I will start by explaining the Digital Regulation Cooperation Forum. It is a voluntary, not statutory, forum that brings together ourselves, Ofcom, the Competition and Markets Authority and the Financial Conduct Authority—some of the regulators with the greatest interest in digital regulation—to make sure that we have a coherent approach to the regulation of digital services in the interests of the public and indeed the economy.

We are brought together through our common interest. We do not require a series of duties or statutory frameworks to make us co-operate, because the case for co-operation is very, very clear. We will deliver better outcomes by working together and by joining up where our powers align. I think that is what you are seeing in practice in some of the work we have done jointly—for example, around the implementation of the children’s code alongside Ofcom’s implementation of the video-sharing platform regime. A joined-up approach to questions about, for example, how you assure the age of children online is really important. That gives me real confidence in reassuring the Committee that the ICO, Ofcom and other digital regulators will be able to take a very joined-up approach to regulating in the context of the new online safety regime.

--- Later in debate ---
Kirsty Blackman

Q In terms of online gaming, and predators moving children from more mainstream to less regulated platforms, do you think there are improvements in the Bill that relate to that, or do you think more can be done?

Lynn Perry: Grooming does happen within gaming, and we know that online video games offer some user-to-user interaction. Users sometimes have the ability to create content within platforms, which is in scope for the Bill. The important thing will be enforcement and compliance in relation to those provisions. We work with lots of children and young people who have been sexually exploited and abused, and who have had contact through gaming sites. It is crucial that this area is in focus from the perspective of building in, by design, safety measures that stop perpetrators being able to communicate directly with children.

Private messaging is another area for focus. We also consider it important for Ofcom to have regulatory powers to compel firms to use technology that could identify child abuse and grooming.

Kim Leadbeater

Q If I could address one question to each witness, that would be fantastic. I do a lot of work with women in sport, including football. Obviously, we have the Women’s Euros coming up, and I have my Panini sticker album at the ready. Do you think the Bill could do more to address the pervasive issue of online threats of violence and abuse against women and girls, including those directed at women in sport, be they players, officials or journalists?

Sanjay Bhandari: I can see that there is something specific in the communications offences and that first limb around threatening communications, which will cover a lot of the things we see directed at female football pundits, like rape threats. It looks as though it would come under that. With our colleagues in other civil society organisations, particularly Carnegie UK Trust, we are looking at whether more should be done specifically about tackling misogyny and violence against women and girls. It is something that we are looking at, and we will also work with our colleagues in other organisations.

The Chair

Q Ms Perry, do you want to add anything to that?

Lynn Perry: When we were looking at children and young people’s access to harmful pornographic content, one thing we were particularly concerned about related to seeing extreme harmful and violent content, often perpetrated towards women. In respect of younger children, violence against women and girls and gender-based violence considerations, it is something that we are concerned about in that context.

--- Later in debate ---
Kim Leadbeater

Q Do you have any thoughts on the Bill committing to a statutory user advocacy body representing the interests of children? If you do, how do you think that that could be funded?

Lynn Perry: I am sorry—that was a question about advocacy, I think.

Kim Leadbeater

Yes, the idea of having a statutory user advocacy body that would represent the interests of children. This is something that has been talked about. Is that something you have any thoughts about?

Lynn Perry: We certainly have a lot of representation from children and young people directly. Last year, we worked with more than 380,000 children and young people. We think that advocacy and representation on behalf of children and young people can be used to powerful effect. Making sure that the voices of children and young people, their views, wishes and experiences, are heard and influence legislation that could safeguard and protect them effectively is something that we are supportive of.

Kim Leadbeater

Q Should the Bill commit to that?

Lynn Perry: As a recommendation, we think that could only strengthen the protections of children.

Chris Philp

Q Picking up that last point about representation for particular groups of users including children, Ms Perry, do you agree that the ability to designate organisations that can make super-complaints might be an extremely valuable avenue, in particular for organisations that represent user groups such as children? Organisations such as yours could get designated and then speak on behalf of children in a formal context. You could raise super-complaints with the regulator on behalf of the children you speak for. Is that something to welcome? Would it address the point made by my colleague, Kim Leadbeater, a moment ago?

Lynn Perry: We would welcome provision to be able to bring particularly significant evidence of concern. That is certainly something that organisations, large charities in the sector and those responsible for representing the rights of children and young people would welcome. On some of these issues, we work in coalition to make representations on behalf of children and young people, as well as of parents and carers, who also raise some concerns. The ability to do that and to strengthen the response is something that would be welcomed.

--- Later in debate ---
Caroline Ansell

Q Eva, there is just one reference to anonymity in the Bill currently. Do you think there is an opportunity to express a fuller, more settled opinion and potentially expand on that juxtaposition?

Eva Hartshorn-Sanders: I heard the advice that the representative of the Information Commissioner’s Office gave earlier—he feels that the balance is right at the moment. It is important to incorporate freedom of speech and privacy within this framework in a democratic country. I do not think we need to add anything more than that.

Kim Leadbeater

Q Thank you to the witnesses for joining us this afternoon. May I ask for your views on the clauses on journalistic content exemption and democratic content exemption? Do you think that these measures are likely to be effective?

Poppy Wood: I know you have spoken a lot about this over the past few days, but the content of democratic importance clause is a layer of the Bill that makes the Bill very complicated and hard to implement. My concern about these layers of free speech—whether it is the journalistic exemption, the news media exemption or the content of democratic importance clause—is that, as you heard from the tech companies, they just do not really know what to do with it. What we need is a Bill that can be implemented, so I would definitely err on the side of paring back the Bill so that it is easy to understand and clear. We should revisit anything that causes confusion or is obscure.

The clause on content of democratic importance is highly problematic—not just because it makes the Bill hard to implement and we are asking the platforms to decide what democratic speech is, but because I think it will become a gateway for the sorts of co-ordinated disinformation that we spoke about earlier. Covid disinformation for the past two years would easily have been a matter of public policy, and I think the platforms, because of this clause, would have said, “Well, if someone’s telling you to drink hydroxychloroquine as a cure for covid, we can’t touch that now, because it’s content of democratic importance.”

I have another example. In 2018, Facebook said that it had identified and taken down a Facebook page called “Free Scotland 2014”. In 2018—four years later—Facebook identified it. It was a Russian/Iranian-backed page that was promoting falsehoods in support of Scottish independence using fake news websites, with articles about the Queen and Prince Philip wanting to give themselves a pay rise by stealing from the poor. It was total nonsense, but that is easily content of democratic importance. Even though it was backed by fake actors—as we have said, I do not think there is anything in the Bill to preclude that at the moment, or at least to get the companies to focus on it—in 2014, that content would have been content of democratic importance, and the platforms took four years to take it down.

I think this clause would mean that that stuff became legitimate. It would be a major loophole for hate and disinformation. The best thing to do is to take that clause out completely. Clause 15(3) talks about content of democratic importance applying to speech across a diverse range of political opinion. Take that line in that subsection and put it in the freedom of expression clause—clause 19. What you then have is a really beefed-up freedom of expression clause that talks about political diversity, but you do not have layers on top of it that mean bad actors can promote hate and disinformation. I would say that is a solution, and that will make the Bill much easier to implement.

Kim Leadbeater

Q Thank you, Poppy. Eva?

Eva Hartshorn-Sanders: I think the principle behind the duty is correct and that they should consider the democratic importance of content when they are making moderation decisions, but what we know from our work is that misinformation and disinformation on social media poses a real threat to elections and democracies around the world. As an international organisation, we have studied the real harms caused by online election disinformation in countries like the US. We saw websites like The Gateway Pundit profit from Google ads to the tune of over $1 million while spreading election disinformation. That has led to real-world death threats sent to election officials and contributed to the events of 6 January. It is not something we want to see replicated in the UK.

The problem with the democratic importance duty is that it is framed negatively about preventing platforms from removing content, rather than positively about addressing content that undermines elections. That is concerning because it is the latter that has proved to be damaging in the real world. I think where we are getting to is that there should be a positive duty on platforms to act on content that is designed and intended to undermine our democracy and our elections.

To add to that, the Joint Committee on the draft Bill looked specifically at having misinformation and disinformation on elections and public health on the face of the Bill rather than leaving it to secondary legislation. That is a position that we would support. The type of harm we have seen over the last couple of years through covid is a known harm and it is one that we should be addressing. It has led to the deaths of millions of people around the world.

Kim Leadbeater

Q That is really helpful; thank you. You raised the point about the abuse that was directed at election officials in America. Do you think it should almost be a stand-alone offence to send harmful or threatening communications to elected people—MPs, councillors, mayors or police and crime commissioners—or possibly even election officials, the people who are involved in the democratic process, because of the risk that that abuse and threats could have on democracy?

Eva Hartshorn-Sanders: Obviously abuse is unacceptable, and there have been real issues with that globally and, as I know from the work we have done with MPs here, in the UK, including through the misogyny research. I guess this is the balance—people may have concerns about legitimate political decisions that are being made—but that is why you have an independent regulator who can assess that content.

Kim Leadbeater

Q Poppy, do you have any thoughts on that?

Poppy Wood: We are seeing people who put themselves forward in public life receiving all sorts of horrible abuse, which was cited as a big reason for women and people of colour removing themselves from public life in recent elections. My understanding is that the threatening communications offences brought in under the illegal duties will probably cover quite a lot of that. The idea that Eva just gave of an election risk assessment or something might, coupled with the threatening communications offences, mean that you are accounting for how your platform promotes that sort of hate.

One of the things that you would want to try to avoid is making better protections for politicians than for everyone else, but I think that threatening communications already covers some of that stuff. Coupled with an elections risk assessment, that would hopefully mean that there are mitigating effects on the risks identified in those risk assessments to tackle the sorts of things that you were just talking about.

Eva Hartshorn-Sanders: Just to add to that, from our work on “Don’t Feed the Trolls”, we know that a lot of these hate campaigns are quite co-ordinated. There is a whole lot of supporting evidence behind that. They will often target people who raise themselves up in whatever position, whether elected or a different type. The misogyny report we have just done had a mix of women who were celebrities or just had a profile and a large Instagram following and who were, again, subject to that abuse.

Kim Leadbeater

Q Should there be more in the Bill with a specific reference to violence against women and girls, abuse and threats, and misogyny?

Eva Hartshorn-Sanders: There are definitely parts of the Bill that could be strengthened in that area. Part of that relates to incels and how they are treated, or not, as a terrorist organisation; or how small sites might be treated under the Bill. I can elaborate on that if you like.

The Chair

Thank you. Minister.

--- Later in debate ---
The Chair

Kim Leadbeater?

Kim Leadbeater

Q Thank you, Chair, and thank you to the witnesses. I just want to clarify something. We were talking about the journalistic content definition as it is. You are saying that you do not think it is reasonable to expect service providers to identify journalistic content using the definition contained in the Bill. Do you think the Bill should be clearer about what it means by journalistic content and journalism?

Matt Rogerson: My point is that for news publishers there is a lack of definition in the journalistic content exemption, and that platforms without the exemption would have to identify whether every piece of content on their platform was journalism, so it would be very difficult for the platforms to implement. That is why for trusted news brands such as the BBC, The Times, and The Guardian, the news media exemption is really important.

What we do not know, and what Gavin Millar suggested in his paper to Index on Censorship, is how that journalistic content exemption will be interpreted by the platforms. His fear in the paper is that the current definition means that the content has to be UK-linked. It could mean, for example, that a blog or a journalist that talks about issues in the Gulf or Ukraine would not be seen as journalistic content and therefore would not be able to take advantage of the systems that the platforms put in place. I think his view is that it should be in line with the article 10 definition of journalistic content, which would seem to make sense.

Owen Meredith: If I could add to that, speaking from my members’ perspective, they would all fall under the recognised news publisher definition. I think that is why it is an important definition. It is not an easy thing to get right, and I think the Department has done a good job in drafting the Bill. I think it captures everyone we would expect it to capture. I think actually it does set a relatively high bar for anyone else who is seeking to use that. I do not think it is possible for someone to simply claim that they are a recognised news publisher if they are operating in a way that we would not expect of such a person or entity. I think it is very important that that definition is clear. I think it is clear and workable.

Kim Leadbeater

Q I suppose there are two separate clauses there. There is the news publisher clause and the journalistic content clause. Just so I am clear, you are happy with the news publisher clause?

Owen Meredith: Yes.

Matt Rogerson: Yes.

Kim Leadbeater

Q What about the journalistic content clause? This is an expression that was new to me—this idea of a citizen journalist. I do not even know what that means. Are we confident that this clause, which talks about journalistic content, is the worrying one?

Owen Meredith: Matt spoke to this a little bit, but from my perspective, my focus has been on making sure that the recognised news publisher clause is right, because everything that my members publish is journalistic content. Therefore, the bulk of journalistic content that is out there will be covered by that. I think where there are elements of what else could be considered journalistic content, the journalistic content clause will pick those up.

Kim Leadbeater

Q As journalists, does that worry you?

Matt Rogerson: I wish I was a journalist.

Kim Leadbeater

Sorry, as representatives of journalists.

Matt Rogerson: It worries me in the sense that we want a plural media ecosystem in this country, and we want individuals who are journalists to have their content published on platforms, so that it can be read by the 50% of the UK population that get their news from Facebook. I think it is potentially problematic that they won’t be able to publish on that platform if they talk about issues that are in the “legal but harmful” bucket of harms, as defined after the Bill is passed. I think there is concern for those groups.

There are suggestions for how you could change the clause to enable them to have more protection. As I say, Gavin Millar has outlined that in his paper. Even then, once you have got that in place, if you have a series of legal but harmful harms that are relatively unclear, the challenge for the platforms will be interpreting that and interpreting it against the journalistic content clause.

Kim Leadbeater

Q My only concern is that someone who just decides to call themselves a journalist will be able to say what they want.

Owen Meredith: I do not think that would be allowable under the Bill, because of the distinction between a recognised news publisher publishing what we would all recognise as journalistic content, versus the journalistic content exemption. I think that is why they are treated differently.

Chris Philp

Q Can I start by clarifying a comment that Owen Meredith made at the very beginning? You were commenting on where you would like the Bill to go further in protecting media organisations, and you said that you wanted there to be a wholesale exemption for recognised news publishers. I think there already is a wholesale exemption for recognised news publishers. The area where the Government have said they are looking at going further is in relation to what some people call a temporary “must carry” provision, or a mandatory right of appeal for recognised news publishers. Can I just clarify that that is what you meant?

Owen Meredith: Yes. I think the issue is how that exemption will work in practice. I think that what the Government have said they are looking at and will bring forward does address how it will operate in practice.

--- Later in debate ---
None Portrait The Chair

Mr Lewis, you were nodding.

Martin Lewis: I was nodding—I was smiling and thinking, “If it makes you feel any better, Tim, I have pictures of me that tell people to invest money that are clearly fake, because I don’t do any adverts, and it still is an absolute pain in the backside for me to get them taken down, having sued Facebook.” So, if your members want to feel any sense of comradeship, they are not alone in this; it is very difficult.

I think the interesting thing is about that volumetric algorithm. Of course, we go back to the fact that these big companies like to err on the side of making money and err away from the side of protecting consumers, because those two, when it comes to scams, are diametrically opposed. The sooner we tidy it up, the better. You could have a process where once there has been a certain number of reports—I absolutely get Tim’s point that in certain cases there is not a big enough volume—the advert is taken down and then the company has to proactively decide to put it back up and effectively say, “We believe this is a valid advert.” Then the system would certainly work better, especially if you bring down the required number of reports. At the moment, I think, there tends to be an erring on the side of, “Keep it up as long as it’s making us money, unless it absolutely goes over the top.”

Many tech experts have shown me adverts with my face in them on various social media platforms. They say it would take them less than five minutes to write a program to screen them out, but those adverts continue to appear. We just have to be conscious here that there is often a move towards self-regulation. Let me be plain, as I am giving evidence. I do not trust any of these companies to have the user and the consumer interest at heart when it comes to their advertising; what they have at heart is their own profits, so if we want to stop them, we have to make this Bill robust enough to stop them, because that is the only way it will stop. Do not rely on them trying to do good, because they are trying to make profit and they will err on the side of that over the side of protecting individuals from scam adverts.

Kim Leadbeater Portrait Kim Leadbeater

Q I thank the witnesses for coming. In terms of regulation, I was going to ask whether you believe that Ofcom is the most suitable regulator to operate in this area. You have almost alluded to the fact that you might not. On that basis, should we specify in the Bill a duty for Ofcom to co-operate with other regulators—for example, the Competition and Markets Authority, the Financial Conduct Authority, Action Fraud or whoever else?

Tim Fassam: I believe that would be helpful. I think Ofcom is the right organisation to manage the relationship with the platforms, because it is going to be much broader than the topics we are talking about in our session, but we do think the FCA, Action Fraud and potentially the CMA should be able to direct, and be very clear with Ofcom, that action needs to be taken. Ofcom should have the ability to ask for things to be reviewed to see whether they break the rules.

The other area where we think action probably needs to be taken is where firms are under investigation, because the Bill assumes it is clear cut whether something is fraud, a scam, a breach of the regulations or not. In some circumstances, that can take six months or a year to establish through investigation. We believe that if, for example, the FCA feels that something is high risk, it should be able to ask Ofcom to suspend an advert, or a firm from advertising, pending an investigation to assess whether it is a breach of the regulation.

Rocio Concha: I agree that Ofcom is the right regulator, the main regulator, but it needs to work with the other regulators—with the FCA, ASA and CMA—to enforce the Bill effectively. There is another area. Basically, we need to make sure that Ofcom and all the regulators involved have the right resources. When the initial version of the Bill was published, Ofcom got additional resources to enable it to enforce the Bill. But the Bill has increased in scope, because now it includes fraud and fraudulent advertising. We need to make sure that Ofcom has the right resources to enforce the full Bill effectively. That is something that the Government really need to consider.

Martin Lewis: I was going to make exactly that point, but it has just been made brilliantly so I will not waste your time.

Chris Philp Portrait Chris Philp

Q I thank the witnesses for joining us this afternoon, and particularly Martin Lewis for his campaigning in this area.

I will start by agreeing with the point that Martin Lewis made a minute or two ago—that we cannot trust these companies to work on their own. Mr Lewis, I am not sure whether you have had a chance to go through clause 34, which we inserted into the Bill following your evidence to the Joint Committee last year. It imposes a duty on these companies to take steps and implement systems to

“prevent individuals from encountering content consisting of fraudulent advertisements”.

There is a clear duty to stop them from doing this, rather as you were asking a minute ago when you described the presentation. Does that strong requirement in clause 34, to stop individuals from encountering fraudulent advertisement content, meet the objective that you were asking for last year?

Martin Lewis: Let me start by saying that I am very grateful that you have put it in there and thankful that the Government have listened to our campaign. What I am about to say is not intended as criticism.

It is very difficult to know how this will work in practice. The issue is all about thresholds. How many scam adverts can we stomach? I still have, daily—even from the platform that I sued, never mind the others—tens of reports directly to me of scam adverts with my face on. Even though there is a promise that we will try to mitigate that, the companies are not doing it. We have to have a legitimate understanding that we are not going to have zero scam adverts on these platforms; unless they were to pre-vet, which I do not think they will, the way they operate means that will not happen.

I am not a lawyer but my concern is that the Bill should make it clear, and that any interpretation of the Bill from Ofcom should be clear, about exactly what threshold of scam adverts is acceptable—we know that they are going to happen—and what threshold is not acceptable. I do not have the expertise to answer your question; I have to rely on your expertise to do that. But I ask the Committee to think properly about what the threshold level should be.

What is and is not acceptable? What counts as “doing everything they can”? They are going to get big lawyers involved if you say there must be zero scam adverts—that is not going to happen. How many scam adverts are acceptable and how many are not? I am so sorry to throw that back as a question when I am a witness, but I do not have the expertise to answer. But that is my concern: I am not 100% convinced of the threshold level that you are setting.

--- Later in debate ---
Maria Miller Portrait Mrs Miller

Specifically for children.

Frances Haugen: I will give you an example. Facebook has estimated ages for every single person on the platform, because the reality is that lots of adults also lie about their ages when they join, and advertisers want to target very specific demographics—for example, if you are selling a kit for a 40th birthday, you do not want to mis-target that by 10 years. Facebook has estimated ages for everyone on the platform. It could be required to publish every year, so that we could say, “Hey, there are four kids on the platform who you currently believe, using your estimated ages, are 14 years old—based not on how old they say they are, but on your estimate that this person is 14 years old. When did they join the platform? What fraction of your 14-year-olds have been on the platform since they were 10?” That is a vital statistic.

If the platforms were required to publish that every single quarter, we could say, “Wow! You were doing really badly four years ago, and you need to get a lot better.” Those kinds of lagging metrics are a way of allowing the public to grade Facebook’s homework, instead of just trusting Facebook to do a good job.

Facebook already does analyses like this today. They already know that on Facebook Blue, for example, for some age cohorts, 20% of 11-year-olds were on the platform—and back then, not that many kids were online. Today, I would guess a much larger fraction of 11-year-olds are on Instagram. We need to have transparency into how badly they are doing their jobs.

Kim Leadbeater Portrait Kim Leadbeater

Q Frances, do you think that the Bill needs to set statutory minimum standards for things such as risk assessments and codes of practice? What will a company such as Facebook do without a minimum standard to go by?

Frances Haugen: It is vital to get into the statute minimum standards for things such as risk assessments and codes of conduct. Facebook has demonstrated time and again—the reality is that other social media platforms have too—that it does the bare minimum to avoid really egregious reputational damage. It does not ensure the level of quality needed for public safety. If you do not put that into the Bill, I worry that it will be watered down by the mountains of lobbyists that Facebook will throw at this problem.

Kim Leadbeater Portrait Kim Leadbeater

Q Thank you. You alluded earlier to the fact that the Bill contains duties to protect content of democratic importance and journalistic content. What is your view on those measures and their likely effectiveness?

Frances Haugen: I want to reiterate that AI struggles to do even really basic tasks. For example, Facebook’s own document said that it only took down 0.8% of violence-inciting content. Let us look at a much broader category, such as content of democratic importance—if you include that in the Bill, I guarantee you that the platforms will come back to you and say that they have no idea how to implement the Bill. There is no chance that AI will do a good job of identifying content of democratic importance at any point in the next 30 years.

The second question is about carve-outs for media. At a minimum, we need to greatly tighten the standards for what counts as a publication. Right now, I could get together with a friend and start a blog and, as citizen journalists, get the exact same protections as an established, thoughtful, well-staffed publication with an editorial board and other forms of accountability. Time and again, we have seen countries such as Russia use small media outlets as part of their misinformation and disinformation strategies. At a minimum, we need to really tighten that standard.

We have even seen situations where they will use very established publications, such as CNN. They will take an article that says, “Ukrainians destroyed a bunch of Russian tanks,” and intentionally have their bot networks spread that out. They will just paste the link and say, “Russia destroyed a bunch of tanks.” People briefly glance at the snippet, they see the picture of the tank, they see “CNN”, and they think, “Ah, Russia is winning.” We need to remember that even real media outlets can be abused by our enemies to manipulate the public.

Caroline Ansell Portrait Caroline Ansell

Q Good afternoon, Frances. I want to ask you about anonymity and striking a balance. We have heard variously that anonymity affords some users safe engagement and actually reduces harm, while for others anonymity has been seen to fuel abuse. How do you see the balance, and how do you see the Bill striving to achieve that?

Frances Haugen: It is important for people to understand what anonymity really is and what it would really mean to have confirmed identities. Platforms already have a huge amount of data on their users. We bleed information about ourselves on to these platforms. It is not about whether the platforms could identify people to the authorities; it is that they choose not to do that.

Secondly, if we did, say, mandate IDs, platforms would have two choices. The first would be to require IDs, so that every single user on their platform would have to have an ID that is verifiable via a computer database—you would have to show your ID and the platform would confirm it off the computer. Platforms would suddenly lose users in many countries around the world that do not have well-integrated computerised databases. The platforms will come back to you and say that they cannot lose a third or half of their users. As long as they are allowed to have users from countries that do not have those levels of sophisticated systems, users in the UK will just use VPNs—a kind of software that allows you to kind of teleport to a different place in the world—and pretend to be users from those other places. Things such as ID identification are not very effective.

Lastly, we need to remember that there is a lot of nuance in things like encryption and anonymity. As a whistleblower, I believe there is a vital need for having access to private communications, but I believe we need to view these things in context. There is a huge difference between, say, Signal, which is open source and anyone in the world can read the code for it—the US Department of Defence only endorses Signal for its employees, because it knows exactly what is being used—and something like Messenger. Messenger is very different, because we have no idea how it actually works. Facebook says, “We use this protocol,” but we cannot see the code; we have no idea. It is the same for Telegram; it is a private company with dubious connections.

If people think that they are safe and anonymous, but they are not actually anonymous, they can put themselves at a lot of risk. The secondary thing is that when we have anonymity in context with more sensitive data—for example, Instagram and Facebook act like directories for finding children—that is a very different context for having anonymity and privacy from something like Signal, where you have to know someone’s phone number in order to contact them.

These things are not cut-and-dried, black-or-white issues. I think it is difficult to have mandatory identity. I think it is really important to have privacy. We have to view them in context.

Online Safety Bill (Fifth sitting)

Kim Leadbeater Excerpts
Committee stage
Tuesday 7th June 2022


Public Bill Committees
Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)

Will the hon. Lady give way?

Kim Leadbeater Portrait Kim Leadbeater

You are making some really important points about the world of the internet and online gaming for children and young people. That is where we need some serious consideration of obligations on providers around media literacy for both children and grown-ups. Many people with children know that this is a really dangerous space for young people, but we are not quite sure we have enough information to understand what the threats, risks and harms are. That point about media literacy, particularly in regard to the gaming world, is really important.

None Portrait The Chair

Order. Before we proceed, the same rules apply in Committee as on the Floor of the House to this extent: the Chair is “you”, and you speak through the Chair, so it is “the hon. Lady”. [Interruption.] One moment.

While I am on my feet, I should perhaps have said earlier, and will now say for clarification, that interventions are permitted in exactly the same way as they are on the Floor of the House. In exactly the same way, it is up to the Member who has the Floor to decide whether to give way or not. The difference between these debates and those on the Floor of the House is of course that on the Floor of the House a Member can speak only once, whereas in Committee you have the opportunity to come back and speak again if you choose to do so. Once the Minister is winding up, that is the end of the debate. The Chair would not normally admit, except under exceptional circumstances, any further speech, as opposed to an intervention.

--- Later in debate ---
Chris Philp Portrait Chris Philp

To reassure the hon. Member on the point about doing the risk assessment, all the companies have to do the risk assessment. That obligation is there. Ofcom can request any risk assessment. I would expect, and I think Parliament would expect, it to request risk assessments either where it is concerned about risk or where the platform is particularly large and has a very high reach—I am thinking of Facebook and companies like that. But hon. Members are talking here about requiring Ofcom to receive and, one therefore assumes, to consider, because what is the point of receiving an assessment unless it considers it? Receiving it and just putting it on a shelf without looking at it would be pointless, obviously. Requiring Ofcom to receive and look at potentially 25,000 risk assessments strikes me as a disproportionate burden. We should be concentrating Ofcom’s resources—and it should concentrate its activity, I submit—on those companies that pose a significant risk and those companies that have a very high reach and large numbers of users. I suggest that, if we imposed an obligation on it to receive and to consider risk assessments for tiny companies that pose no risk, that would not be the best use of its resources, and it would take away resources that could otherwise be used on those companies that do pose risk and that have larger numbers of users.

Kim Leadbeater Portrait Kim Leadbeater

Just to be clear, we are saying that the only reason why we should not be encouraging the companies to do the risk assessment is that Ofcom might not be able to cope with dealing with all the risk assessments. But surely that is not a reason not to do it. The risk assessment is a fundamental part of this legislation. We have to be clear that there is no point in the companies having those risk assessments if they are not visible and transparent.

Chris Philp Portrait Chris Philp

All the companies have to do the risk assessment, for example for the “illegal” duties, where they are required to by the Bill. For the “illegal” duties, that is all of them; they have to do those risk assessments. The question is whether they have to send them to Ofcom—all of them—even if they are very low risk or have very low user numbers, and whether Ofcom, by implication, then has to consider them, because it would be pointless to require them to be sent if they were not then looked at. We want to ensure that Ofcom’s resources are pointed at the areas where the risks arise. Ofcom can request any of these. If Ofcom is concerned—even a bit concerned—it can request them.

Hon. Members are then making a slightly adjacent point about transparency—about whether the risk assessments should be made, essentially, publicly available. In relation to comprehensive public disclosure, there are legitimate questions about public disclosure and about getting to the heart of what is going on in these companies in the way in which Frances Haugen’s whistleblower disclosures did. But we also need to be mindful of what we might call malign actors—people who are trying to circumvent the provisions of the Bill—in relation to some of the “illegal” provisions, for example. We do not want to give them so much information that they know how they can circumvent the rules. Again, there is a balance to strike between ensuring that the rules are properly enforced and having such a high level of disclosure that people seeking to circumvent the rules are able to work out how to do so.

Online Safety Bill (Sixth sitting)

Kim Leadbeater Excerpts
Committee stage
Tuesday 7th June 2022


Public Bill Committees
Chris Philp Portrait Chris Philp

As I say, clause 10 already references the governance arrangements, but my strong view is that the only thing that will make these companies sit up and take notice—the only thing that will make them actually protect children in a way they are currently not doing—is the threat of billions of pounds of fines and, if they do not comply even after being fined at that level, the threat of their service being disconnected. Ultimately, that is the sanction that will make these companies protect our children.

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)

As my hon. Friend the Member for Worsley and Eccles South has said, the point here is about cultural change, and the way to do that is through leadership. It is not about shutting the gate after the horse has bolted. Fining the companies might achieve something, but it does not tackle the root of the problem. It is about cultural change and leadership at these organisations. We all agree across the House that they are not doing enough, so how do we change that culture? It has to come from leadership.

Chris Philp Portrait Chris Philp

Yes, and that is why governance is addressed in the clause as drafted. But the one thing that will really change the way the leadership of these companies thinks about this issue is the one thing they ultimately care about—money. The reason they allow unsafe content to circulate and do not rein in or temper their algorithms, and the reason we are in this situation, which has arisen over the last 10 years or so, is that these companies have consistently prioritised profit over protection. Ultimately, that is the only language they understand—it is that and legal compulsion.

While the Bill rightly addresses governance in clause 10 and in other clauses, as I have said a few times, what has to happen to make this change occur is the compulsion that is inherent in the powers to fine and to deny service—to pull the plug—that the Bill also contains. The thing that will give reassurance to our constituents, and to me as a parent, is knowing that for the first time ever these companies can properly be held to account. They can be fined. They can have their connection pulled out of the wall. Those are the measures that will protect our children.

--- Later in debate ---
Duties to protect content of democratic importance
Kim Leadbeater Portrait Kim Leadbeater

I beg to move amendment 105, in clause 15, page 14, line 33, after “ensure” insert “the safety of people involved in UK elections and”.

None Portrait The Chair

With this it will be convenient to discuss amendment 106, in clause 37, page 25, line 31, at end insert—

"(2A) OFCOM must prepare and issue a code of practice for providers of Category 1 and 2(a) services describing measures recommended for the purpose of compliance with duties set out in section 15 concerning the safety of people taking part in elections."

Kim Leadbeater Portrait Kim Leadbeater

I rise to speak to amendments 105 and 106, in my name, on protecting democracy and democratic debate.

Within the Bill, there are significant clauses intended to prevent the spread of harm online, to protect women and girls against violence and to help prevent child sexual exploitation, while at the same time protecting the right of journalists to do their jobs. Although those clauses are not perfect, I welcome them.

The Bill is wide-ranging. The Minister talked on Second Reading about the power in clause 150 to protect another group—those with epilepsy—from being trolled with flashing images. That subject is close to my heart due to the campaign for Zach’s law—Zach is a young boy in my constituency. I know we will return to that important issue later in the Committee, and I thank the Minister for his work on it.

In protecting against online harm while preserving fundamental rights and values, we must also address the threats posed to those involved in the democratic process. Let me be clear: this is not self-serving. It is about not just MPs but all political candidates locally and nationally and those whose jobs facilitate the execution of our democratic process and political life: the people working on elections or for those elected to public office at all levels across the UK. These people must be defended from harm not only for their own protection, but to protect our democracy itself and, with it, the right of all our citizens to a political system capable of delivering on their priorities free from threats and intimidation.

Many other groups in society are also subjected to a disproportionate amount of targeted abuse, but those working in and around politics sadly receive more than almost any other people in this country, with an associated specific set of risks and harms. That does not mean messages gently, or even firmly, requesting us to vote one way or another—a staple of democratic debate—but messages of hate, abuse and threats intended to scare people in public office, grind them down, unfairly influence their voting intentions or do them physical and psychological harm. That simply cannot be an acceptable part of political life.

As I say, we are not looking for sympathy, but we have a duty to our democracy to try to stamp that out from our political discourse. Amendment 105 would not deny anybody the right to tell us firmly where we are going wrong—quite right, too—but it is an opportunity to draw the essential distinction between legitimately holding people in public life to account and illegitimate intimidation and harm.

The statistics regarding the scale of online abuse that MPs receive are shocking. In 2020, a University of Salford study found that MPs received over 7,000 abusive or hate-filled tweets a month. Seven thousand separate messages of harm a month on Twitter alone directed at MPs is far too many, but who in this room does not believe that the figure is almost certainly much higher today? Amnesty conducted a separate study in 2017 looking at the disproportionate amount of abuse that women and BAME MPs faced online, finding that my right hon. Friend the Member for Hackney North and Stoke Newington (Ms Abbott) was the recipient of almost a third of all the abusive tweets analysed, as alluded to already by the hon. Member for Edinburgh—

Kirsty Blackman Portrait Kirsty Blackman

Aberdeen North.

Kim Leadbeater Portrait Kim Leadbeater

I knew that. [Laughter.]

Five years later, we continue to see significant volumes of racist, sexist and homophobic hate-filled abuse and threats online to politicians of all parties. That is unacceptable in itself, but we must ask whether this toxic environment helps to keep decent people in politics or, indeed, attracts good people into politics, so that our democracy can prosper into the future across the political spectrum. The reality we face is that our democracy is under attack online each and every day, and every day we delay acting is another day on which abuse becomes increasingly normalised or is just seen as part of the job for those who have put themselves forward for public service. This form of abuse harms society as a whole, so it deserves specific consideration in the Bill.

While elected Members and officials are not a special group of people deserving of more legal protections than anyone else, we must be honest that the abuse they face is distinct and specific to those roles and directly affects our democracy itself. It can lead to the most serious physical harm, with two Members of Parliament having been murdered in the last six years, and many others face death threats or threats of sexual or other violence on a daily basis. However, this is not just about harm to elected representatives; online threats are often seen first, and sometimes only, by their members of staff. They may not be the intended target, but they are often the people harmed most. I am sure we all agree that that is unacceptable and cannot continue.

All of us have probably reported messages and threats to social media platforms and the police, with varying degrees of success in terms of having them removed or the individuals prosecuted. Indeed, we sadly heard examples of that from my hon. Friend the shadow Minister. Often we are told that nothing can be done. Currently, the platforms look at their own rules to determine what constitutes freedom of speech or expression and what is hateful speech or harm. That fine line moves. There is no consistency across platforms, and we therefore urgently need more clarity and a legal duty in place to remove that content quickly.

Amendment 105 would explicitly include in the Bill protection and consideration for those involved in UK elections, whether candidates or staff. Amendment 106 would go further and place an obligation on Ofcom to produce a code of practice, to be issued to the platforms. It would define what steps platforms must take to protect those involved in elections and set out what content is acceptable or unacceptable to be directed at them.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman

I want to make a few comments on the amendment. As a younger female parliamentarian, I find that I am often asked to speak to young people about becoming an MP or getting involved in politics. I find it difficult to say to young women, “Yes, you should do this,” and most of the reason for that is what people are faced with online. It is because a female MP cannot have a Twitter account without facing abuse. I am sure male MPs do as well, but it tends to be worse for women.

We cannot engage democratically and with constituents on social media platforms without receiving abuse and sometimes threats as well. It is not just an abusive place to be—that does not necessarily meet the threshold for illegality—but it is pretty foul and toxic. There have been times when I have deleted Twitter from my phone because I just need to get away from the vile abuse that is being directed towards me. I want, in good conscience, to be able to make an argument to people that this is a brilliant job, and it is brilliant to represent constituents and to make a difference on their behalf at whatever level of elected politics, but right now I do not feel that I am able to do that.

When my footballing colleague, the hon. Member for Batley and Spen, mentions “UK elections” in the amendment, I assume she means that in the widest possible way—elections at all levels.

Kim Leadbeater Portrait Kim Leadbeater

indicated assent.

Kirsty Blackman Portrait Kirsty Blackman

Sometimes we miss out the fact that although MPs face abuse, we have a level of protection as currently elected Members. Even if there were an election coming up, we have a level of security protection and access that is much higher than for anybody else challenging a candidate or standing in a council or a Scottish Parliament election. As sitting MPs, we already have an additional level of protection because of the security services we have in place. We need to remember, and I assume this is why the amendment is drawn in a pretty broad way, that everybody standing for any sort of elected office faces significant risk of harm—again, whether or not that meets the threshold for illegality.

There are specific things that have been mentioned. As has been said, epilepsy is specifically mentioned as an area where specific harm occurs. Given the importance of democracy, which is absolutely vital, we need to have a democratic system where people are able to stand in elections and make their case. That is why we have election addresses and a system where the election address gets delivered through every single person’s door. There is an understanding and acceptance by people involved in designing democratic processes that the message of all candidates needs to get out there. If the message of all candidates cannot get out there because some people are facing significant levels of abuse online, then democracy is not acting in the way that it should be. These amendments are fair and make a huge amount of sense. They are protecting the most important tenets of democracy and democratic engagement.

I want to say something about my own specific experiences. We have reported people to the police and have had people in court over the messages they have sent, largely by email, which would not be included in the Bill, but there have also been some pretty creepy ones on social media that have not necessarily met the threshold. As has been said, it is my staff who have had to go to court and stand in the witness box to explain the shock and terror they have felt on seeing the email or the communication that has come in, so I think any provision should include that.

Finally, we have seen situations where people working in elections—this is not an airy-fairy notion, but something that genuinely happened—have been photographed and those pictures have been shared on social media, and they have then been abused as a result. They are just doing their job, handing out ballot papers or standing up and announcing the results on the stage, and they have to abide by the processes that are in place now. In order for us to have free and fair elections that are run properly and that people want to work at and support, we need to have that additional level of protection. The hon. Member for Batley and Spen made a very reasonable argument and I hope the Minister listened to it carefully.

--- Later in debate ---
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - -

The Minister makes a really valid point and is right about the impact on the individual. The point I am trying to make with the amendments is that this is about the impact on the democratic process, which is why I think it fits in with clause 15. It is not about how individuals feel; it is about the impact that that has on behaviours, and about putting the emphasis and onus on platforms to decide what is of democratic importance. In the evidence we had two weeks ago, the witnesses certainly did not feel comfortable with putting the onus on platforms. If we were to have a code of practice, we would at least give them something to work with on the issue of what is of democratic importance. It is about the impact on democracy, not just the harm to the individual involved.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Clearly, if a communication is sufficiently offensive that it meets the criminal threshold, it is covered, and that would obviously harm the democratic process as well. If a communication was sufficiently offensive that it breached the harmful communication offence in clause 150, it would also, by definition, harm the democratic process, so communications that are damaging to democracy would axiomatically be caught by one thing or the other. I find it difficult to imagine a communication that might be considered damaging to democracy but that would not meet one of those two criteria, so that it was not illegal and would not meet the definition of a harmful communication.

My main point is that the existing provisions in the Bill address the kinds of behaviours that were described in those two speeches—the illegal content provisions, and the new harmful communication offence in clause 150. On that basis, I hope the hon. Member for Batley and Spen will withdraw the amendment, safe in the knowledge that the Bill addresses the issue that she rightly and reasonably raises.

Question put, That the amendment be made.

Online Safety Bill (Seventh sitting)

Kim Leadbeater Excerpts
Committee stage
Thursday 9th June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 9 June 2022 - (9 Jun 2022)
Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

Will the Minister give way?

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - -

Will the Minister give way?

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

It is categorically not the Government’s position that this problem is too big to fix. In fact, the whole purpose of this piece of groundbreaking and world-leading legislation is to fix a problem of such magnitude. The point my right hon. Friend was making about the hypothecation of fines to support user advocacy is a somewhat different one, which we will come to in due course, but there is nothing in the Bill to prevent individual groups from assisting individuals with making specific complaints to individual companies, as they are now entitled to do in law under clauses 17 and 18.

The point about an ombudsman is a slightly different one—if an individual complaint is made to a company and the individual complainant is dissatisfied with the outcome of their individual, particular and personal complaint, what should happen? In the case of financial services, if, for example, someone has been mis-sold a mortgage and they have suffered a huge loss, they can go to an ombudsman who will bindingly adjudicate that individual, single, personal case. The point that I am making is that having hundreds of thousands or potentially millions of cases being bindingly adjudicated on a case-by-case basis is not the right way to tackle a problem of this scale. The right way to tackle the problem is to force the social media companies, by law, to systemically deal with all of the problem, not just individual problems that may end up on an ombudsman’s desk.

That is the power in the Bill. It deals at a systems and processes level, it deals on an industry-wide level, and it gives Ofcom incredibly strong enforcement powers to make sure this actually happens. The hon. Member for Pontypridd has repeatedly called for a systems and processes approach. This is the embodiment of such an approach and the only way to fix a problem of such magnitude.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - -

I associate myself with the comments of the right hon. Member for Basingstoke. Surely, if we are saying that this is such a huge problem, that is an argument for greater stringency and having an ombudsman. We cannot say that this is just about systems. Of course it is about systems, but online harms—we have heard some powerful examples of this—are about individuals, and we have to provide redress and support for the damage that online harms do to them. We have to look at systemic issues, as the Minister is rightly doing, but we also have to look at individual cases. The idea of an ombudsman and greater support for charities and those who can support victims of online crime, as mentioned by the hon. Member for Aberdeen North, is really important.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank the hon. Lady for her thoughtful intervention. There are two separate questions here. One is about user advocacy groups helping individuals to make complaints to the companies. That is a fair point, and no doubt we will debate it later. The ombudsman question is different; it is about whether to have a right of appeal against decisions by social media companies. Our answer is that, rather than having a third-party body—an ombudsman—effectively acting as a court of appeal against individual decisions by the social media firms, because of the scale of the matter, the solution is to compel the firms, using the force of law, to get this right on a systemic and comprehensive basis.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I would have been quite happy to move the amendment, but I do not think the Opposition would have been terribly pleased with me if I had stolen it. I have got my name on it, and I am keen to support it.

As I have said, I met the NSPCC yesterday, and we discussed how clause 31(3) might work, should the Minister decide to keep it in the Bill and not accept the amendment. There are a number of issues with the clause, which states that the child user condition is met if

“a significant number of children”

are users of the service, or if the service is

“likely to attract a significant number of users who are children”.

I do not understand how that could work. For example, a significant number of people who play Fortnite are adults, but a chunk of people who play it are kids. If some sort of invisible percentage threshold is applied in such circumstances, I do not know whether that threshold will be met. If only 20% of Fortnite users are kids, and that amounts only to half a million children, will that count as enough people to meet the child access assessment threshold?

Fortnite is huge, but an appropriate definition is even more necessary for very small platforms and services. With the very far-right sites that we have mentioned, it may be that only 0.5% of their users are children, and that may amount only to 2,000 children—a very small number. Surely, because of the risk of harm if children access these incredibly damaging and dangerous sites that groom people for terrorism, they should have a duty to meet the child access requirement threshold, if only so that we can tell them that they must have an age verification process—they must be able to say, “We know that none of our users are children because we have gone through an age verification process.” I am keen for children to be able to access the internet and meet their friends online, but I am keen for them to be excluded from these most damaging sites. I appreciate the action that the Government have taken in relation to pornographic content, but I do not think that this clause allows us to go far enough in stopping children accessing the most damaging content that is outwith pornographic content.

The other thing that I want to raise is about how the number of users will be calculated. The Minister made it very clear earlier on, and I thank him for doing so, that an individual does not have to be a registered user to be counted as a user of a site. People can be members of TikTok, for example, only if they are over 13. TikTok has some hoops in place—although they are not perfect—to ensure that its users are over 13, and to be fair, it does proactively remove users that it suspects are under 13, particularly if they are reported. That is a good move.

My child is sent links to TikTok videos through WhatsApp, however. He clicks on the links and is able to watch the videos, which will pop up in the WhatsApp mini-browser thing or in the Safari browser. He can watch the videos without signing up as a registered user of TikTok and without using the platform itself—the videos come through Safari, for example, rather than through the app. Does the Minister expect that platforms will count those people as users? I suggest that the majority of people who watch TikTok by those means are doing so because they do not have a TikTok account. Some will not have accounts because they are under 13 and are not allowed to by TikTok or by the parental controls on their phones.

My concern is that, if the Minister does not provide clarity on this point, platforms will count just the number of registered users, and will say, “It’s too difficult for us to look at the number of unregistered users, so in working out whether we meet the criteria, we are not even going to consider people who do not access our specific app or who are not registered users in some way, shape or form.” I have concerns about the operation of the provisions and about companies using that “get out of jail free” card. I genuinely believe that the majority of those who access TikTok other than through its platform are children and would meet the criteria. If the Minister is determined to keep subsection (3) and not accept the amendment, I feel that he should make it clear that those users must be included in the counting by any provider assessing whether it needs to fulfil the child safety duties.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - -

I agree with the hon. Lady’s important point, which feeds into the broader question of volume versus risk—no matter how many children see something that causes harm and damage, one is one too many—and the categorisation of service providers into category 1, category 2A and category 2B. The depth of the risk is the problem, rather than the number of people who might be affected. The hon. Lady also alluded to age verification—I am sure we will come to that at some point—which is another can of worms. The important point, which she made well, is about volume versus risk. The point is not how many children see something; even if only a small number of children see something, the damage has been done.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I absolutely agree. In fact, I have tabled an amendment to widen category 1 to include sites with the highest risk of harm. The Minister has not said that he agrees with my amendment specifically, but he seems fairly amenable to increasing and widening some duties to include the sites of highest risk. I have also tabled another new clause on similar issues.

I am glad that these clauses are in the Bill—a specific duty in relation to children is important and should happen—but as the shadow Minister said, clause 31(3) is causing difficulty. It is causing difficulty for me and for organisations such as the NSPCC, which is unsure how the provisions will operate and whether they will do so in the way that the Government would like.

I hope the Minister will answer some of our questions when he responds. If he is not willing to accept the amendment, will he give consideration to how the subsection could be amended in the future—we have more stages, including Report and scrutiny in the other place—to ensure that there is clarity and that the intention behind the provision is followed through, rather than remaining an intention that is not actually translated into law?

Online Safety Bill (Eighth sitting)

Kim Leadbeater Excerpts
Committee stage
Thursday 9th June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 9 June 2022 - (9 Jun 2022)
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

In terms of the secondary processes that kick in after the AI has scanned the data, I assume it will be up to Ofcom and the provider to discuss what happens then. Once the AI identifies something, does it automatically get sent to the National Crime Agency, or does it go through a process of checking to ensure the AI has correctly identified something? I agree with what the Minister has reiterated on a number of occasions; if it is child sexual abuse material then I have no problem with somebody’s privacy being invaded in order for that to be taken to the relevant authorities and acted on.

I want to make one last point. The wording of new clause 20 is about a report on those proactive technologies. It is about requiring Ofcom to come up with and justify the use of those proactive technologies. To give the hon. Member for Wolverhampton North East some reassurance, it is not saying, “This will definitely happen.” I assume that Ofcom will be able to make the case—I am certain it will be able to—but it will have to justify it in order to be able to require those companies to undertake that use.

My key point is about the future-proofing of this, ensuring that it is not just a one-off, and that, if Ofcom makes a designation about the use of proactive technologies, it is able to make a re-designation or future designation, should new proactive technologies come through, so that we can require those new proactive technologies to be used to identify things that we cannot identify with the current proactive technologies.

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - -

I want to associate myself with the comments of the right hon. Member for Basingstoke and the hon. Member for Aberdeen North, and to explore the intersection between the work we are doing to protect children and the violence against women and girls strategy. There is one group, girls, to whom both apply. We know that they are sadly one of the most vulnerable groups for online harm and abuse, and we must do everything we can to protect them. Having a belt and braces approach, with a code of conduct requirement for the violence against women and girls strategy, plus implementing new clause 20 on this technology that can protect girls in particular, although not exclusively, is a positive thing. Surely the more thorough we are in the preventive approach, the better, rather than taking action after it is too late.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I agree 100%. The case that the shadow Minister, the hon. Member for Pontypridd, made and the stories she highlighted about the shame that is felt show that we are not just talking about a one-off impact on people’s lives, but potentially years of going through those awful situations and then many years to recover, if they ever do, from the situations they have been through.

I do not think there is too much that we could do, too many codes of practice we could require or too many compliances we should have in place. I also agree that girls are the most vulnerable group when considering this issue, and we need to ensure that this Bill is as fit for purpose as it can be and meets the Government’s aim of trying to make the internet a safe place for children and young people. Because of the additional risks that there are for girls in particular, we need additional protections in place for girls. That is why a number of us in this room are making that case.

Online Safety Bill (Ninth sitting)

Kim Leadbeater Excerpts
Committee stage
Tuesday 14th June 2022

(1 year, 10 months ago)

Public Bill Committees
Read Full debate Online Safety Act 2023 Read Hansard Text Read Debate Ministerial Extracts Amendment Paper: Public Bill Committee Amendments as at 14 June 2022 - (14 Jun 2022)
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

There is a difference between random individuals posting stuff on Facebook, as opposed to content generated by what we have defined as a “recognised news publisher”. We will debate that in a moment. We recognise that is different in the Bill. Although the Opposition are looking to make amendments to clause 50, they appear to accept that the press deserve special protection. Article 10 case law deriving from the European convention on human rights also recognises that the press have a special status. In our political discourse we often refer generally to the importance of the freedom of the press. We recognise that the press are different, and the press have made the case—both newspapers and broadcasters, all of which now have websites—that their reader engagement is an integral part of that free speech. There is a difference between that and individuals chucking stuff on Facebook outside of the context of a news article.

There is then a question about whether, despite that, those comments are still sufficiently dangerous that they merit regulation by the Bill—a point that the shadow Minister, the hon. Member for Pontypridd, raised. There is a functional difference between comments made on platforms such as Facebook, Twitter, TikTok, Snapchat or Instagram, and comments made below the line on a news website, whether it is The Guardian, the Daily Mail, the BBC—even The National. The difference is that on social media platforms, which are the principal topic of the Bill, there is an in-built concept of virality—things going viral by sharing and propagating content widely. The whole thing can spiral rapidly out of control.

Virality is an inherent design feature in social media sites. It is not an inherent design feature of the comments we get under the news website of the BBC, The Guardian or the Daily Mail. There is no way of generating virality in the same way as there is on Facebook and Twitter. Facebook and Twitter are designed to generate massive virality in a way that comments below a news website are not. The reach, and the ability for them to grow exponentially, is orders of magnitude lower on a news website comment section than on Facebook. That is an important difference, from a risk point of view.

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - -

This issue comes down to a fundamental point—are we looking at volume or risk? There is no difference between an individual—a young person in this instance—seeing something about suicide or self-harm on a Facebook post or in the comments section of a newspaper article. The volume—whether it goes viral or not—does not matter if that individual has seen that content and it has directed them to somewhere that will create serious harm and lead them towards dangerous behaviour. The volume is not the point.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The hon. Lady raises an important philosophical question that underpins much of the Bill’s architecture. All the measures are intended to strike a balance. Where there are things that are at risk of leading to illegal activity, and things that are harmful to children, we are clamping down hard, but in other areas we are being more proportionate. For example, the legal but harmful to adults duties only apply to category 1 companies, and we are looking at whether that can be extended to other high-risk companies, as we debated earlier. In the earlier provisions that we debated, about “have regard to free speech”, there is a balancing exercise between the safety duties and free speech. A lot of the provisions in the Bill have a sense of balance and proportionality. In some areas, such as child sexual exploitation and abuse, there is no balance. We just want to stop that—end of story. In other areas, such as matters that are legal but harmful and touch on free speech, there is more of a balancing exercise.

In this area of news publisher content, we are again striking a balance. We are saying that the inherent harmfulness of those sites, owing to their functionality—they do not go viral in the same way—is much lower. There is also an interaction with freedom of the press, as I said earlier. Thus, we draw the balance in a slightly different way. To take the example of suicide promotion or self-harm content, there is a big difference between stumbling across something in comment No. 74 below a BBC article, versus the tragic case of Molly Russell—the 14-year-old girl whose Instagram account was actively flooded, many times a day, with awful content promoting suicide. That led her to take her own life.

I think the hon. Member for Batley and Spen would probably accept that there is a functional difference between a comment that someone has to scroll down a long way to find and probably sees only once, and being actively flooded with awful content. In having regard to those different arguments—the risk and the freedom of the press—we try to strike a balance. I accept that they are not easy balances to strike, and that there is a legitimate debate to be had on them. However, that is the reason that we have adopted this approach.

--- Later in debate ---
“Recognised news publisher”
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - -

I beg to move amendment 107, in clause 50, page 46, line 46, leave out from end to end of clause and insert

“is a member of an approved regulator (as defined in section 42 of the Crime and Courts Act 2013).”

This amendment expands the definition of a recognised news publisher to incorporate any entity that is a member of an approved regulator.

The primary purpose of the Bill is to protect social media users from harm, and it will have failed if it does not achieve that. Alongside that objective, the Bill must protect freedom of expression and, in particular, the freedom of the press, which I know we are all committed to upholding and defending. However, in evaluating the balance between freedom of the press and the freedom to enjoy the digital world without encountering harm, the Bill as drafted has far too many loopholes and risks granting legal protection to those who wish to spread harmful content and disinformation in the name of journalism.

Amendment 107 will address that imbalance and protect the press and us all from harm. The media exemption in the Bill is a complete exemption, which would take content posted by news publishers entirely out of the scope of platforms’ legal duties to protect their users. Such a powerful exemption must be drafted with care to ensure it is not open to abuse. However, the criteria that organisations must meet to qualify for the exemption, which are set out in clause 50, are loose and, in some cases, almost meaningless. They are open to abuse, they are ambiguous and they confer responsibility on the platforms themselves to decide which publishers meet the Bill’s criteria and which do not.

In evidence that we heard recently, it was clear that the major platforms do not believe it is a responsibility they should be expected to bear, nor do they have the confidence or feel qualified to do so. Furthermore, David Wolfe, chairman of the Press Recognition Panel, has advised that the measure represents a threat to press freedom. I agree.

Opening the gates for any organisation to declare themselves a news publisher by obtaining a UK address, jotting down a standards code on the back of an envelope and inviting readers to send an email if they have any complaints is not defending the press; it is opening the profession up to abuse and, in the long term, risks weakening its rights and protections.

Let us discuss those who may wish to exploit that loophole and receive legal protection to publish harmful content. A number of far-right websites have made white supremacist claims and praised Holocaust deniers. Those websites already meet several of the criteria for exemption and could meet the remaining criteria overnight. The internet is full of groups that describe themselves as news publishers but distribute profoundly damaging and dangerous material designed to promote extremist ideologies and stir up hatred.

We can all think of high-profile individuals who use the internet to propagate disinformation, dangerous conspiracy theories and antisemitic, Islamophobic, homophobic or other forms of abuse. They might consider themselves journalists, but the genuine professionals whose rights we want to protect beg to differ. None of those individuals should be free to publish harmful material as a result of exemptions that are designed for quite a different purpose. Is it really the Government’s intention that any organisation that meets their loose criteria, as defined in the Bill, should be afforded the sacrosanct rights and freedoms of the press that we all seek to defend?

I turn to disinformation, and to hostile state actors who wish to sow the seeds of doubt and division in our politics and our civic life. The Committee has already heard that Russia Today is among those expected to benefit from the exemption. I have a legal opinion from Tamsin Allen, a senior media lawyer at Bindmans LLP, which notes that,

“were the bill to become law in its present form, Russia Today would benefit from the media exemption. The exemption for print and online news publications is so wide that it would encompass virtually all publishers with multiple contributors, an editor and some form of complaints procedure and standards code, no matter how inadequate. I understand that RT is subject to a standards code in Russia and operates a complaints procedure. Moreover, this exemption could also apply to a publisher promoting hate or violence, providing it met the (minimal) standards set out in the bill and constituted itself as a ‘news’ or ‘gossip’ publication. The only such publications which would not be exempt are those published by organisations proscribed under the Terrorism Act.”

If hostile foreign states can exploit this loophole in the Bill to spread disinformation to social media users in the UK, that is a matter of national security and a threat to our freedom and open democracy. The requirement to have a UK address offers little by way of protection. International publishers spreading hate, disinformation or other forms of online harm could easily set up offices in the UK to qualify for this exemption and instantly make the UK the harm capital of the world. For those reasons, the criteria must change.

We heard from several individuals in evidence that the exemption should be removed entirely from the Bill, but we are committed to freedom of the press as well as providing proper protections from harm. Instead of removing the exemption, I propose a change to the qualifying criteria to ensure that credible publishers can access it while extremist and harmful publishers cannot.

My amendment would replace the convoluted list of requirements with a single and simple requirement for the platforms to follow and adhere to: that all print and online media that seeks to benefit from the exemption should be independently regulated under the royal charter provisions that this House has already legislated for. If, as the Bill already says, broadcast media should be defined in this way, why not print media too? Unlike the Government’s criteria, the likes of Russia Today, white supremacist blogs and other deeply disturbing extremist publications simply could not satisfy this requirement. If they were ever to succeed in signing up to such a regulator, they would swiftly be expelled for repeated standards breaches.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank the hon. Member for Batley and Spen for her speech. There is agreement across the House, in this Committee and in the Joint Committee that the commitment to having a free press in this country is extremely important. That is why recognised news publishers are exempted from the provisions of the Bill, as the hon. Lady said.

The clause, as drafted, has been looked at in some detail over a number of years and debated with news publishers and others. It is the best attempt that we have so far collectively been able to come up with to provide a definition of a news publisher that does not infringe on press freedom. The Government are concerned that if the amendment were adopted, it would effectively require news publishers to register with a regulator in order to benefit from the exemption. That would constitute the imposition of a mandatory press regulator by the back door. I put on record that this Government do not support any kind of mandatory or statutory press regulation, in any form, for reasons of freedom of the press. Despite what has been said in previous debates, we think to do that would unreasonably restrict the freedom of the press in this country.

While I understand its intention, the amendment would drive news media organisations, both print and broadcast, into the arms of a regulator, because they would have to join one in order to get the exemption. We do not think it is right to create that obligation. We have reached the philosophical position that statutory or mandatory regulation of the press is incompatible with press freedom. We have been clear about that general principle and cannot accept the amendment, which would violate that principle.

In relation to hostile states, such as Russia, I do not think anyone in the UK press would have the slightest objection to us finding ways to tighten up on such matters. As I have flagged previously, thought is being given to that issue, but in terms of the freedom of the domestic press, we feel very strongly that pushing people towards a regulator is inappropriate in the context of a free press.

The characterisation of these provisions is a little unfair, because some of the requirements are not trivial. The requirement in clause 50(2)(f) is that there must be a person—I think it includes a legal person as well as a natural person—who has legal responsibility for the material published, which means that, unlike with pretty much everything else that appears on the internet, there is an identified person with legal responsibility. That is a very important requirement. Some of the other requirements, such as having a registered address and a standards code, are relatively easy to meet, but the point about legal responsibility is very important. For that reason, I respectfully resist the amendment.

Kim Leadbeater

I will not push the amendment to a vote, but it is important to continue this conversation, and I encourage the Minister to consider the matter as the Bill proceeds. I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

John Nicolson

I beg to move amendment 86, in clause 50, page 47, line 3, after “material” insert—

“or special interest news material”.

Online Safety Bill (Tenth sitting)
Committee stage
Tuesday 14th June 2022
Public Bill Committees
John Nicolson

New clause 36 seeks to criminalise the encouragement or assistance of a suicide. Before I move on to the details of the new clause, I would like to share the experience of a Samaritans supporter, who said:

“I know that every attempt my brother considered at ending his life, from his early 20s to when he died in April, aged 40, was based on extensive online research. It was all too easy for him to find step-by-step instructions so he could evaluate the effectiveness and potential impact of various approaches and, most recently, given that he had no medical background, it was purely his ability to work out the quantities of various drugs and likely impact of taking them in combination that equipped him to end his life.”

It is so easy when discussing the minutiae of the Bill to forget its real-world impact. I have worked with Samaritans on the new clause, and I use that quote with permission. It is the leading charity in trying to create a suicide-safer internet. It is axiomatic to say that suicide and self-harm have a devastating impact on people’s lives. The Bill must ensure that the online space does not aid the spreading of content that would promote this behaviour in any way.

There has rightly been much talk about how children are affected by self-harm content online. However, it should be stressed that they do not suffer exclusively because of that content. Between 2011 and 2015, 151 patients who died by suicide were known to have visited websites that encouraged suicide or shared information about methods of harm, and 82% of those patients were aged over 25. It is likely that, as the Bill stands, suicide-promoting content will be covered on category 1 services, as it will be designated as harmful. Crucially, unless this amendment is passed, that content will not be covered on smaller sites. As Samaritans has identified, it is precisely in these smaller fora and websites that harm proliferates. The 151 patients who took their own lives after visiting harmful websites may have been part of a handful of people using those sites, which would not fall under the definition of category 1, as I am sure the Minister will confirm.

Kim Leadbeater (Batley and Spen) (Lab)

The hon. Gentleman makes a very important point, which comes to the nub of a lot of the issues we face with the Bill: the issue of volume versus risk. Does he agree that one life lost to suicide is one life too many? We must do everything that we can in the Bill to prevent every single life being lost through suicide, which is the aim of his amendment.

John Nicolson

I do, of course, agree. As anyone who has lost a family member to suicide knows, the effect on the family lasts a lifetime. This is yet another amendment on which I feel we should depart from the pantomime of so much parliamentary procedure, where both sides fundamentally agree on things but Ministers go through the torturous process of trying to tell us that every single amendment proposed by any outside body or any Opposition Member, whether from the SNP or the Labour party, has been considered by the ministerial team and is already incorporated or covered by the Bill. They would not be human if that were the case. Would it not be refreshing if there were a slight change in tactic, and just occasionally the Minister said, “Do you know what? That is a very good point. I think I will incorporate it into the Bill”?

None of us on the Opposition Benches seeks to make political capital out of any of the things we propose. All of us, on both sides of the House, are here with the best of intentions, to try to ensure that we get the best possible Bill. We all want to be able to vote for the Bill at the end of the day. Indeed, as I said, I have worked with two friends on the Conservative Benches—with the hon. Member for Watford on the Joint Committee on the draft Bill and with the hon. Member for Wolverhampton North East on the Select Committee on Digital, Culture, Media and Sport—and, as we know, they have both voted for various proposals. It is perhaps part of the frustration of the party system here that people are forced to go through the hoops and pretend that they do not really agree with things that they actually do agree with.

Let us try to move on with this, in a way that we have not done hitherto, and see if we can agree on amendments. We will withdraw amendments if we are genuinely convinced that they have already been considered by the Government. On the Government side, let them try to accept some of our amendments—just begin to accept some—if, as with this one, they think they have some merit.

I was talking about Samaritans, and exactly what it wants to do with the Bill. It is concerned about harmful content after the Bill is passed. This feeds into potentially the most important aspect of the Bill: it does not mandate risk assessments based exclusively on risk. By adding in the qualifications of size and scope, the Bill wilfully lets some of the most harmful content slip through its fingers—wilfully, but I am sure not deliberately. Categorisation will be covered by a later amendment, tabled by my hon. Friend the Member for Aberdeen North, so I shall not dwell on it now.

In July 2021, the Law Commission for England and Wales recommended the creation of a new narrow offence of the “encouragement or assistance” of serious self-harm with “malicious intent”. The commission identified that there is

“currently no offence that adequately addresses the encouragement of serious self-harm.”

The recommendation followed acknowledgement that

“self-harm content online is a worrying phenomenon”

and should have a

“robust fault element that targets deliberate encouragement of serious self-harm”.

Currently, there are no provisions of the Bill to create a new offence of assisting or encouraging self-harm.

In conclusion, I urge the Minister to listen not just to us but to the expert charities, including Samaritans, to help people who have lived experience of self-harm and suicide who are calling for regulation of these dangerous sites.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

It is, as always, a great pleasure to serve under your chairmanship, Sir Roger. The hon. Member for Ochil and South Perthshire made an observation in passing about the Government’s willingness to listen and respond to parliamentarians about the Bill. We listened carefully to the extensive prelegislative scrutiny that the Bill received, including from the Joint Committee on which he served. As a result, we have adopted 66 of the changes that that Committee recommended, including on significant things such as commercial pornography and fraudulent advertising.

If Members have been listening to me carefully, they will know that the Government are doing further work or are carefully listening in a few areas. We may have more to say on those topics as the Bill progresses; it is always important to get the drafting of the provisions exactly right. I hope that that has indicated to the hon. Gentleman our willingness to listen, which I think we have already demonstrated well.

On new clause 36, it is important to mention that there is already a criminal offence of inciting suicide. It is a schedule 7 priority offence, so the Bill already requires companies to tackle content that amounts to the existing offence of inciting suicide. That is important. We would expect material that encourages children to self-harm to be listed as a primary priority harm relating to children, where, again, there is a proactive duty to protect them. We have not yet published that primary priority harm list, but it would be reasonable to expect that material encouraging children to self-harm would be on it. Again, although we have not yet published the list of content that will be on the adult priority harm list—obviously, I cannot pre-empt the publication of that list—one might certainly wish for content that encourages adults to self-harm to appear on it too.

The hon. Gentleman made the point that duties relating to adults would apply only to category 1 companies. Of course, the ones that apply to children would apply to all companies where there was significant risk, but he is right that were that priority harm added to the adult legal but harmful list, it would apply only to category 1 companies.

Kim Leadbeater

Will the Minister give way?

Chris Philp

In a second, but I may be about to answer the hon. Lady’s question.

Those category 1 companies are likely to be small in number, as I think the shadow Minister said, but I would imagine—I do not have the exact number—that they cover well over 90% of all traffic. However, as I hinted on the Floor of the House on Second Reading—we may well discuss this later—we are thinking about including platforms that may not meet the category 1 size threshold but none the less pose high-level risks of harm. If that is done—I stress “if”—it will address the point raised by the hon. Member for Ochil and South Perthshire. That may answer the point that the hon. Member for Batley and Spen was going to raise, but if not, I happily give way.

Kim Leadbeater

It kind of does, but the Minister has raised some interesting points about children and adults and the risk of harm. To go back to the work of Samaritans, it is really important to note that suicide is the biggest killer of young people aged 16 to 24, so it transcends the barrier between children and adults. With the right hon. Member for Basingstoke, the hon. Member for Aberdeen North, and the shadow Minister, my hon. Friend the Member for Pontypridd, we have rightly talked a lot about women, but it is really important to note that men account for three quarters of all suicides. Men aged between 45 and 49 are most at risk of suicide—the rate among that group has been persistently high for years. It is important that we bring men into the discussion about suicide.

Chris Philp

I am grateful for the element of gender balance that the hon. Member has introduced, and she is right to highlight the suicide risk. Inciting suicide is already a criminal offence under section 2 of the Suicide Act 1961 and we have named it a priority offence. Indeed, it is the first priority offence listed under schedule 7—it appears a third of the way down page 183—for exactly the reason she cited, and a proactive duty is imposed on companies by paragraph 1 of schedule 7.

On amendment 142 and the attendant new clause 36, the Government agree with the sentiment behind them—namely, the creation of a new offence of encouraging or assisting serious self-harm. We agree with the substance of the proposal from the hon. Member for Ochil and South Perthshire. As he acknowledged, the matter is under final consideration by the Law Commission and our colleagues in the Ministry of Justice. The offence initially proposed by the Law Commission was wider in scope than that proposed under new clause 36. The commission’s proposed offence covered the offline world, as well as the online one. For example, the new clause as drafted would not cover assisting a person to self-harm by providing them with a bladed article because that is not an online communication. The offence that the Law Commission is looking at is broader in scope.

The Government have agreed in principle to create an offence based on the Law Commission recommendation in separate legislation, and once that is done the scope of the new offence will be wider than that proposed in the new clause. Rather than adding the new clause and the proposed limited new offence to this Bill, I ask that we implement the offence recommended by the Law Commission, the wider scope of which covers the offline world as well as the online world, in separate legislation. I would be happy to make representations to my colleagues in Government, particularly in the MOJ, to seek clarification about the relevant timing, because it is reasonable to expect it to be implemented sooner rather than later. Rather than rushing to introduce that offence with limited scope under the Bill, I ask that we do it properly as per the Law Commission recommendation.

Once the Law Commission recommendation is enacted in separate legislation, to which the Government have already agreed in principle, it will automatically flow through to be incorporated into clause 52(4)(d), which relates to illegal content, and under clause 176, the Secretary of State may, subject to parliamentary approval, designate the new offence as a priority offence under schedule 7 via a statutory instrument. The purpose of amendment 142 can therefore be achieved through an SI.

The Government publicly and entirely agree with the intention behind proposed new clause 36, but I think the way to do this is to implement the full Law Commission offence as soon as we can and then, if appropriate, add it to schedule 7 by SI. The Government agree with the spirit of the hon. Gentleman’s proposal, but I believe that the Government already have a plan to do a more complete job in creating the new offence.

--- Later in debate ---
Alex Davies-Jones

It is absolutely right that the Government have included a commitment to children in the form of defining primary priority content that is harmful. We all know of the dangerous harms that exist online for children, and while the Opposition support the overarching aims of the Bill, we feel the current definitions do not go far enough—that is a running theme with this Bill.

The Bill does not adequately address the risks caused by the design—the functionalities and features of services themselves—or those created by malign contact with other users, which we know to be an immense problem. Research has found that online grooming of young girls has soared by 60% in the last three years—and four in five victims are girls. We also know that games increasingly have addictive gambling-style features. Those without user-to-user functionalities, such as Subway Surfers, which aggressively promotes in-app purchases, are currently out of scope of the Bill.

Lastly, research by Parent Zone found that 91% of children say that loot boxes are available in the games they play and 40% have paid to open one. That is not good enough. I urge the Minister to consider his approach to tackling harmful content and the impact that it can have in all its forms. When considering how children will be kept safe under the new regime, we should consider concerns flagged by some of the civil society organisations that work with them. Organisations such as the Royal College of Psychiatrists, The Mix, YoungMinds and the Mental Health Foundation have all been instrumental in their calls for the Government to do more. While welcoming the intention to protect children, they note that it is not clear at present how some categories of harm, including material that damages people’s body image, will be regulated—or whether it will be regulated at all.

While the Bill does take steps to tackle some of the most egregious, universally damaging material that children currently see, it does not recognise the harm that can be done through the algorithmic serving of material that, through accretion, will cause harm to children with particular mental health vulnerabilities. For example, beauty or fitness-related content could be psychologically dangerous to a child recovering from an eating disorder. Research from the Mental Health Foundation shows how damaging regular exposure to material showing conventionally perfect images of bodies, often digitally edited and unattainable, is to children and young people.

This is something that matters to children, with 84% of those questioned in a recent survey by charity The Mix saying the algorithmic serving of content was a key issue that the Bill should address. Yet in its current form it does not give children full control over the content they see. Charities also tell us about the need to ensure that children are exposed to useful content. We suggest that the Government consider a requirement for providers to push material on social media literacy to users and to provide the option to receive content that can help with recovery where it is available, curated by social media companies with the assistance of trusted non-governmental organisations and public health bodies. We also hope that the Government can clarify that material damaging to people’s body image will be considered a form of harm.

Additionally, beyond the issue of the content itself that is served to children, organisations including YoungMinds and the Royal College of Psychiatrists have raised the potential dangers to mental health inherent in the way services can be designed to be addictive.

Kim Leadbeater

My hon. Friend raises an important point about media literacy, which we have touched on a few times during this debate. We have another opportunity here to talk about how important it is to think about media literacy within the scope of the Bill. Media literacy obligations on platforms, to help responsibly educate children and adults about the risks online, have been removed, and I think we need to put them back into the Bill at every opportunity. We must not lose sight of that.

Alex Davies-Jones

I completely agree with my hon. Friend. She is right to talk about the lack of a social and digital media strategy within the Bill, and the need to educate children and adults about the harmful content that we see online. How to stay safe online in all its capacities is absolutely fundamental to the Bill. We cannot have an Online Safety Bill without teaching people how to be safe online. That is important for how children and young people interact online. We know that they chase likes and the self-esteem buzz they get from notifications popping up on their phone or device. That can be addictive, as has been highlighted by mental health and young persons’ charities.

I urge the Minister to address those issues and to consider how the Government can go further, whether through this legislation or further initiatives, to help to combat some of those issues.

--- Later in debate ---
Alex Davies-Jones

We all know that managing harmful content, unlike illegal content, is more about implementing systems that prevent people from encountering it rather than removing it entirely. At the moment, there are no duties on the Secretary of State to consult anyone other than Ofcom ahead of making regulations under clauses 53 and 54. We have discussed at length the importance of transparency, and surely the Minister can agree that the process should be widened, as we have heard from those on the Government Back Benches.

Labour has said time and again that it should not be for the Secretary of State of the day to determine what constitutes harmful content for children or adults. Without the important consultation process outlined in amendment 62, there are genuine concerns that that could lead to a damaging precedent whereby a Secretary of State, not Parliament, has the ability to determine what information is harmful. We all know that the world is watching as we seek to work together on this important Bill, and Labour has genuine concerns that without a responsible consultation process, as outlined in amendment 62, we could inadvertently be suggesting to the world that this fairly dogmatic approach is the best way forward.

Amendment 62 would require the Secretary of State to consult other stakeholders before making regulations under clauses 53 and 54. As has been mentioned, we risk a potentially dangerous course of events if there is no statutory duty on the Secretary of State to consult others when determining the definition of harmful content. Let me draw the Minister’s attention to the overarching concerns of stakeholders across the board. Many are concerned that harmful content for adults requires the least oversight, although there are potential gaps that mean that certain content—such as animal abuse content—could completely slip through the net. The amendment is designed to ensure that sufficient consultation takes place before the Secretary of State makes important decisions in directing Ofcom.

Kim Leadbeater

On that point, I agree wholeheartedly with my hon. Friend. It is important that the Secretary of State consults campaign organisations that have expertise in the relevant areas. Much as we might want the Secretary of State to be informed on every single policy issue, that is unrealistic. It is also important to acknowledge the process that we have been through with the Bill: the expertise of organisations has been vital in some of the decisions that we have had to make. My hon. Friend gave a very good example, and I am grateful to animal welfare groups for their expertise in highlighting the issue of online abuse of animals.

Alex Davies-Jones

I completely agree with my hon. Friend. As parliamentarians we are seen as experts in an array of fields. I do not purport to be an expert in all things, as it is more a jack of all trades role, and it would be impossible for one Secretary of State to be an expert in everything from animal abuse to online scam ads, from fraud to CSAM and terrorism. That is why it is fundamental that the Secretary of State consults with experts and stakeholders in those fields, for whom these things are their bread and butter—their day job every day. I hope the Minister can see that regulation of the online space is a huge task to take on for us all. It is Labour’s view that any Secretary of State would benefit from the input of experts in specific fields. I urge him to support the amendment, especially given the wider concerns we have about transparency and power sharing in the Bill.

It is welcome that clause 56 will force Ofcom, as the regulator, to carry out important reviews that will assess the extent to which content is harmful to children and adults when broadly appearing on user-to-user services. As we have repeatedly said, transparency must be at the heart of our approach. While Labour does not formally oppose the clause, we have concerns about subsection (5), which states:

“The reports must be published not more than three years apart.”

The Minister knows that the Bill has been long awaited, and we need to see real, meaningful change and updates now. Will he tell us why it contains a three-year provision?

--- Later in debate ---
Barbara Keeley

The duties on regulated services set out in the clause are welcome. Transparency reports will be a vital tool to hold platforms to account for understanding the true drivers of online harm. However, asking platforms to submit transparency reports once a year does not reflect how rapidly we know the online world changes. As we have seen time and again, the online environment can shift significantly in a matter of months, if not weeks. We have seen that in the rise of disinformation about covid, which we have talked about, and in the accelerated growth of platforms such as TikTok.

Increasing the frequency of transparency reports from annual to biannual will ensure that platforms stay on the pulse of emergent risks, allowing Ofcom to do the same in turn. The amendment would also mean that companies focus on safety, rather than just profit. As has been touched on repeatedly, that is the culture change that we want to bring about. It would go some way towards preventing complacency about reporting harms, perhaps forcing companies to revisit the nature of harm analysis, management and reduction. In order for this regime to be world-leading and ambitious—I keep hearing the Minister using those words about the Bill—we must demand the most that we can from the highest-risk services, including on the important duty of transparency reporting.

Moving to clauses 64 and 65 stand part, transparency reporting by companies and Ofcom is important for analysing emerging harms, as we have discussed. However, charities have pointed out that platforms have a track record of burying documents and research that point to risk of harm in their systems and processes. As with other risk assessments and reports, such documents should be made public, so that platforms cannot continue to hide behind a veil of secrecy. As I will come to when I speak to amendment 55, the Bill must be ambitious and bold in what information platforms are to provide as part of the clause 64 duty.

Clause 64(3) states that, once issued with a notice by Ofcom, companies will have to produce a transparency report, which must

“be published in the manner and by the date specified in the notice.”

Can the Minister confirm that that means regulated services will have to publish transparency reports publicly, not just to Ofcom? Can he clarify that that will be done in a way that is accessible to users, similarly to the requirements on services to make their terms of service and other statements clear and accessible? Some very important information will be included in those reports that will be critical for researchers and civil society when analysing trends and harms. It is important that the data points outlined in schedule 8 capture the information needed for those organisations to make an accurate analysis.

Kim Leadbeater

The evidence we heard from Frances Haugen set out how important transparency is. If internet and service providers have nothing to hide, transparency is surely in their interests as well. From my perspective, there is little incentive for the Government not to support the amendment, if they want to help civil society, researchers, academics and so on in improving a more regulated approach to transparency generally on the internet, which I am sure we all agree is a good thing.

Barbara Keeley

I very much agree. We cannot emphasise that enough, and it is useful that my hon. Friend has set that out, adding to what I was saying.

Amendment 55 sets out the details of the information that Ofcom must request to be provided in a transparency report in new paragraph 31A. First, transparency disclosures required by the Bill should include how large companies allocate resources to tackling harm in different languages—an issue that was rightly raised by the hon. Member for Ochil and South Perthshire. As we heard from Frances Haugen, many safety systems at Meta have only a subset of detection systems for languages other than English. Languages such as Welsh have almost no safety systems live on Facebook. It is neither fair nor safe.

When we consider that more than 250 languages are spoken in London alone, the inconsistency of safety systems becomes very concerning. Charities have warned that people accessing Facebook in different languages are being exposed to very different levels of risk, with some versions of Facebook having few or none of the safety systems that protect other versions of the site in different languages.

When giving evidence to the Committee last month, Richard Earley disclosed that Meta regulated only 70 languages. Given that around 3 billion people use Facebook on a monthly basis across the world, that is clearly inadequate.

--- Later in debate ---
Alex Davies-Jones

I agree with the hon. Member and welcome her intervention. We will be discussing these issues time and again during our proceedings. What is becoming even more apparent is the need to include women and girls in the Bill, call out violence against women and girls online for what it is, and demand that the Government go further to protect women and girls. This is yet another example of where action needs to happen. I hope the Minister is hearing our pleas and that this will happen at some point as we make progress through the Bill.

More needs to be done to tackle this problem. Pornography websites need to verify that every individual in pornographic videos published on their site is an adult and gave their permission for the video to be published, and enable individuals to withdraw their consent for pornography of them to remain on the site. These are rock-bottom safety measures for preventing the most appalling abuses on pornography websites.

Kim Leadbeater

I add my voice to the arguments made by my hon. Friend and the hon. Member for Aberdeen North. Violence against women and girls is a fundamental issue that the Bill needs to address. We keep coming back to that, and I too hope that the Minister hears that point. My hon. Friend has described some of the most horrific harms. Surely, this is one area where we have to be really clear. If we are to achieve anything with the Bill, this is an area that we should be working on.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I wholeheartedly agree with my hon. Friend. As I have said, the amendments would put in place rock-bottom safety measures that could prevent the most appalling abuses on pornography websites, and it is a scandal that, hitherto, they have not been implemented. We have the opportunity to change that today by voting for the amendments and ensuring that these measures are in place. I urge the Minister and Conservative Members to do the right thing.

Online Safety Bill (Thirteenth sitting)

Kim Leadbeater Excerpts
Committee stage & Committee Debate - 13th sitting
Tuesday 21st June 2022

(1 year, 10 months ago)

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 21 June 2022 - (21 Jun 2022)
Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - -

It is fantastic to hear that those other things are happening—that is all well and good—but surely we should explicitly call out disinformation and misinformation in the Online Safety Bill. The package of other measures that the Minister mentions is fantastic, but I think they have to be in the Bill.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The hon. Lady says that those measures should be in the Bill—more than they already are—but as I have pointed out, the way in which the legal architecture of the Bill works means that the mechanisms to do that would be adding a criminal offence to schedule 7 as a priority offence, for example, or using a statutory instrument to designate the relevant kind of harm as a priority harm, which we plan to do in due course for a number of harms. The Bill can cover disinformation with the use of those mechanisms.

We have not put the harmful to adults content in the Bill; it will be set out in statutory instruments. The National Security Bill is still progressing through Parliament, and we cannot have in schedule 7 of this Bill an offence that has not yet been passed by Parliament. I hope that that explains the legal architecture and mechanisms that could be used under the Bill to give force to those matters.

On amendment 57, the Government feel that six months is a very short time within which to reach clear conclusions, and that 18 months is a more appropriate timeframe in which to understand how the Bill is bedding in and operating. Amendment 58 would require Ofcom to produce a code of practice on system-level disinformation. To be clear, the Bill already requires Ofcom to produce codes of practice that set out the steps that providers will take to tackle illegal content— I mentioned the new National Security Bill, which is going through Parliament—and harmful content, which may, in some circumstances, include disinformation.

Disinformation that is illegal or harmful to individuals is in scope of the duties set out in the Bill. Ofcom’s codes of practice will, as part of those duties, have to set out the steps that providers should take to reduce harm to users that arises from such disinformation. Those steps could include content-neutral design choices or interventions of other kinds. We would like Ofcom to have a certain amount of flexibility in how it develops those codes of practice, including by being able to combine or disaggregate those codes in ways that are most helpful to the general public and the services that have to pay regard to them. That is why we have constructed them in the way we have. I hope that provides clarity about the way that disinformation can be brought into the scope of the Bill and how that measure then flows through to the codes of practice. I gently resist amendments 57 and 58 while supporting the clause standing part of the Bill.

Question put, That the amendment be made.

Online Safety Bill (Fourteenth sitting)

Kim Leadbeater Excerpts
Committee stage
Tuesday 21st June 2022

(1 year, 10 months ago)

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 21 June 2022 - (21 Jun 2022)
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

It did not, but there were examples of disinformation, misinformation and the spreading of falsehoods, and none of these powers existed at the time. It seems weird—if I can use that term—that these exist now. Surely, the more appropriate method would be for the Secretary of State to write a letter to Ofcom to which it had to have regard. As it stands, this dangerous clause ensures the Secretary of State has the power to interfere with day-to-day enforcement. Ultimately, it significantly undermines Ofcom’s overall independence, which we truly believe should be at the heart of the Bill.

With that in mind, I will now speak to our crucial new clause 10, which instead would give Ofcom the power to take particular steps, where it considers that there is a threat to the health and safety of the public or national security, without the need for direction from the Secretary of State. Currently, there is no parliamentary scrutiny of the powers outlined in clause 146; it says only that the Secretary of State must publish their reasoning unless national security is involved. There is no urgency threshold or requirement in the clause. The Secretary of State is not required to take advice from an expert body, such as Public Health England or the National Crime Agency, in assessing reasonable grounds for action. The power is also not bounded by the Bill’s definition of harm.

These instructions do two things. First, they direct Ofcom to use its quite weak media literacy duties to respond to the circumstances. Secondly, a direction turns on a power for Ofcom to ask a platform to produce a public statement about what the platform is doing to counter the circumstances or threats in the direction order—that is similar in some ways to the treatment of harm to adults. This is trying to shame a company into doing something without actually making it do it. The power allows the Secretary of State directly to target a given company. There is potential for the misuse of such an ability.

The explanatory notes say:

“the Secretary of State could issue a direction during a pandemic to require OFCOM to: give priority to ensuring that health misinformation and disinformation is effectively tackled when exercising its media literacy function; and to require service providers to report on the action they are taking to address this issue.”

Recent experience of the covid pandemic and the Russian invasion of Ukraine suggests that the Government can easily legislate when required in an emergency and can recall Parliament. The power in the Bill is a strong power, cutting through regulatory independence and targeting individual companies to evoke quite a weak effect. It is not being justified as an emergency power where the need to move swiftly is paramount. Surely, if a heavier-duty action is required in a crisis, the Government can legislate for that and explain to Parliament why the power is required in the context of a crisis.

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - -

It is really important to make sure that the Bill does not end up being a cover for the Secretary of State of the day to significantly interfere with the online space, both now and in the future. At the moment, I am not satisfied that the Secretary of State’s powers littered through the Bill are necessary. I share other hon. Members’ concerns about what this could mean for both the user experience and online safety more broadly. I hope my hon. Friend agrees that the Minister needs to provide us—not just us here today, but civil society and others who might be listening—with more reassurance that the Secretary of State’s powers really are necessary.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I completely agree with my hon. Friend. We talk time and again about this Bill being world leading, but with that comes a responsibility to show global leadership. Other countries around the world will be looking to us, and this Parliament, when they adopt their own, similar legislation, and we need to be mindful of that when looking at what powers we give to a Secretary of State—particularly in overruling any independence of Ofcom or Parliament’s sovereignty for that matter.

New clause 10 provides a viable alternative. The Minister knows that this is an area where even his Back Benchers are divided. He must closely consider new clause 10 and recognise that placing power in Ofcom’s hands is an important step forward. None of us wants to see a situation where the Secretary of State is able to influence the regulator. We feel that, without this important clause and concession, the Government could be supporting a rather dangerous precedent in terms of independence in regulatory systems more widely.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I welcome the shadow Minister’s support for this review clause, which is important. I will not add to her comments.

Question put and agreed to.

Clause 149 accordingly ordered to stand part of the Bill.

Clause 150

Harmful communications offence

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - -

I beg to move amendment 112, in clause 150, page 127, line 28, at end insert “and;

(b) physical harm that has been acquired as a consequence of receiving the content of a message sent online.”

This amendment would expand the definition of harm for the purposes of the harmful communications offence to incorporate physical harm resulting from messages received online.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss amendment 113, in clause 150, page 127, line 28, at end insert “; or

(b) physical harm resulting from an epileptic seizure, where the seizure has been triggered by the intentional sending of flashing images to a person with epilepsy.”

--- Later in debate ---
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - -

I move the amendment in my name and will speak to amendment 113, which is in the name of the hon. Member for Blackpool North and Cleveleys (Paul Maynard).

The amendment would put into effect Zach’s law in full. Zach, as many Members know, is an amazing, energetic and bright young boy from my constituency. I had the absolute pleasure of visiting Zach and his mum Clare at their home in Hartshead a few weeks ago. We chatted about school and his forthcoming holiday, and he even invited me to the pub. However, Zach also has epilepsy.

Disgustingly, he was trolled online a few years ago and sent flashing images by bullies, designed to trigger his condition and give him an epileptic seizure, a seizure that not only would cause him and his family great distress, but can be extremely dangerous and cause Zach significant psychological and physical harm. I know that we are all united in our disgust at such despicable actions and committed to ensuring that this type of unbelievable online bullying is against the law under the Bill.

On Second Reading, I raised the matter directly with the Minister and I am glad that he pointed to clause 150 and stated very explicitly that subsection (4) will cover the type of online harm that Zach has encountered. However, we need more than just a commitment at the Dispatch Box by the Minister, or verbal reassurances, to protect Zach and the 600,000 other people in the UK with epilepsy.

The form of online harm that Zach and others with epilepsy have suffered causes more than just “serious distress”. Members know that the Bill as drafted lists

“psychological harm amounting to at least serious distress”

as a qualifying criterion of the offence. However, I believe that does not accurately and fully reflect the harm that epilepsy trolling causes, and that it leaves a significant loophole that none of us here wish to see exploited.

For many people with epilepsy, the harm caused by this vicious online trolling is not only psychological but physical too. Seizures are not benign events. They can result in broken bones, concussion, bruises and cuts, and in extreme cases can be fatal. It is simply not right to argue that physical harm is intrinsically intertwined with psychological harm. They are different harms with different symptoms. While victims may experience both, that is not always the case.

Professor Sander, medical director of the Epilepsy Society and professor of neurology at University College London Hospitals NHS Foundation Trust, who is widely considered one of the world’s leading experts on epilepsy, has said:

“Everyone experiences seizures differently. Some people may be psychologically distressed by a seizure and not physically harmed. Others may be physically harmed but not psychologically distressed. This will vary from person to person, and sometimes from seizure to seizure depending on individual circumstances.”

Amendment 112 will therefore expand the scope of clause 150 and insert on the face of the Bill that an offence will also be committed under the harmful communications clause when physical harm has occurred as a consequence of receiving a message sent online with malicious intent. In practical terms, if a person with epilepsy were to receive a harmful message online that triggers their epilepsy and they subsequently fall off their chair and hit their head, that physical harm will be proof of a harmful communication offence, without the need to prove any serious psychological distress that may have been caused.

This simple but effective amendment, supported by the Epilepsy Society, will ensure that the horrific trolling that Zach and others with epilepsy have had to endure will be covered in full by the Bill. That will mean that the total impact that such trolling has on the victims is reflected beyond solely psychological distress, so there can be no ambiguity and nowhere for those responsible for sending these images and videos to hide.

I am aware that the Minister has previously pointed to the possibility of a standalone Bill—a proposal that is under discussion in the Ministry of Justice. That is all well and good, but that should not delay our action when the Bill before us is a perfectly fit legislative vehicle to end epilepsy trolling, as the Law Commission report recommended.

I thank colleagues from across the House for the work they have done on this important issue. I sincerely hope that the amendment is one instance where we can be united in this Committee. I urge the Minister to adopt amendment 112, to implement Zach’s law in full and to provide the hundreds of thousands of people across the UK living with epilepsy the legal protections they need to keep them safe online. Nothing would give me greater pleasure than to call at Zach’s house next time I am in the area and tell him that this is the case.

Dean Russell Portrait Dean Russell (Watford) (Con)
- Hansard - - - Excerpts

May I praise the hon. Member for Batley and Spen for such an eloquent and heartfelt explanation of the reason why this amendment to the Bill is so important?

I have been campaigning on Zach’s law for the past nine months. I have spoken to Zach multiple times and have worked closely with my hon. Friend the Member for Stourbridge (Suzanne Webb) in engaging directly with Facebook, Twitter and the big platforms to try to get them to do something, because we should not need to have a law to stop them sending flashing images. We had got quite far a few months ago, but now that seems to have stalled, which is very frustrating.

I am stuck between my heart and my head on this amendment. My heart says we need to include the amendment right now, sort it out and get it finalised. However, my head says we have got to get it right. During the Joint Committee on the draft Bill before Christmas and in the evidence sessions for this Bill, we heard that if the platforms want to use a loophole and get around things, they will. I have even seen that with regard to the engagements and the promises we have had.

Dean Russell Portrait Dean Russell
- Hansard - - - Excerpts

That is an excellent point. I have yet to make up my mind which way to vote if the amendment is pressed to a vote; I do not know whether this is a probing amendment. Having spoken to the Epilepsy Society and having been very close to this issue for many months, for me to feel comfortable, I want the Minister not just to say, as he has said on the Floor of the House, to me personally, in meetings and recently here, that the clause should cover epilepsy, and does seem to, and that he is very confident of that, but to give some assurance that we will change the law in some form.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - -

I am incredibly grateful for the hon. Member’s comments and contribution. I agree wholeheartedly. We need more than a belief and an intention. There is absolutely no reason why we cannot have this in black and white in the Bill. I hope he can find a way to do the right thing today and vote for the amendment.

Dean Russell Portrait Dean Russell
- Hansard - - - Excerpts

The phrase “Do the right thing” is at the heart of this. My hon. Friend the Member for Ipswich (Tom Hunt) presented the Flashing Images Bill yesterday. A big part of this is about justice. I am conscious that we have got to get the balance right; stopping this happening has an impact on the people who choose to do this. I am keen to hear what the Minister says. We have got to get this right. I am keen to get some assurances, which will very much sway my decision on the vote today.

--- Later in debate ---
The question then arises which legislative vehicle the offence will go in. I am aware of the private Member’s Bill, but it will take a very long time and we probably would not want to rely on it, so I am in the process of getting cross-Government agreement on which legislative vehicle will be used. I do not want to say any more about that now, because it is still subject to collective agreement, but I am expecting to come back to the House on Report and confirm which Bill the measure will go in.
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - -

I genuinely appreciate the Minister’s comments, but why would we spend more time doing other pieces of legislation when we can do it right here and right now? The amendment will solve the problem without causing any more pain or suffering over a long period of time.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

One of the pieces of legislation that could be used is this Bill, because it is in scope. If the hon. Lady can bear with me until Report, I will say more about the specific legislative vehicle that we propose to use.

On the precise wording to be used, I will make a couple of points about the amendments that have been tabled—I think amendment 113 is not being moved, but I will speak to it anyway. Amendment 112, which was tabled by the hon. Member for Batley and Spen, talks about bringing physical harm in general into the scope of clause 150. Of course, that goes far beyond epilepsy trolling, because it would also bring into scope the existing offence of assisting or encouraging suicide, so there would be duplicative law: there would be the existing offence of assisting or encouraging suicide and the new offence, because a communication that encouraged physical harm would do the same thing.

If we included all physical harm, it would duplicate the proposed offence of assisting or encouraging self-harm that is being worked on by the Ministry of Justice and the Law Commission. It would also duplicate offences under the Offences Against the Person Act 1861, because if a communication caused one person to injure another, there would be duplication between the offence that will be created by clause 150 and the existing offence. Clearly, we cannot have two offences that criminalise the same behaviour. To the point made by the hon. Member for Aberdeen North, it would not be right to create two epilepsy trolling offences. We just need one, but it needs to be right.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

That is a good question, and it ties into my next point. Clearly, amendment 113 is designed to create a two-sentence epilepsy trolling offence. When trying to create a brand-new offence—in this case, epilepsy trolling—it is unlikely that two sentences’ worth of drafting will do the trick, because a number of questions need to be addressed. For example, the drafting will need to consider what level of harm should be covered and exactly what penalty would be appropriate. If it was in clause 150, the penalty would be two years, but it might be higher or lower, which needs to be addressed. The precise definitions of the various terms need to be carefully defined as well, including “epilepsy” and “epileptic seizures” in amendment 113, which was tabled by my hon. Friend the Member for Blackpool North and Cleveleys. We need to get proper drafting.

My hon. Friend the Member for Eastbourne mentioned that the Epilepsy Society had some thoughts on the drafting. I know that my colleagues in the Ministry of Justice and, I am sure, the office of the parliamentary counsel, would be keen to work with experts from the Epilepsy Society to ensure that the drafting is correct. Report will likely be before summer recess—it is not confirmed, but I am hoping it will be—and getting the drafting nailed down that quickly would be challenging.

I hope that, in a slightly indirect way, that answers the question. We do not have collective agreement about the precise legislative vehicle to use; however, I hope it addresses the questions about how the timing and the choreography could work.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - -

We have talked a lot about the Epilepsy Society this afternoon, and quite rightly too, as it is the expert in this field. My understanding is that it is perfectly happy with the language in this amendment—

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Which one?

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - -

Amendment 112. I think that the Epilepsy Society feels that this would be covered. I am also confused, because the Minister said previously that it was his belief and intention that this clause would cover epilepsy trolling, but he is now acknowledging that it does not. Why would we not, therefore, just accept the amendment that covers it and save everybody a lot of time?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Representations have been made by the three Members here that epilepsy deserves its own stand-alone offence, and the Government have just agreed to do that, so take that as a win. On why we would not just accept amendment 112, it may well cover epilepsy, and may well cover it to the satisfaction of the Epilepsy Society, but it also, probably inadvertently, does a lot more than that. It creates a duplication with the offence of assisting or encouraging suicide.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - -

Surely that is almost a bonus?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

No, it is not a bonus, because we cannot have two different laws that criminalise the same thing. We want to have laws that are, essentially, mutually exclusive. If a person commits a particular act, it should be clear which Act the offence is being committed under. Imagine that there were two different offences for the same act with different sentences—one is two years and one is 10 years. Which sentence does the judge then apply? We do not want to have law that overlaps, where the same act is basically a clear offence under two different laws. Just by using the term “physical harm”, amendment 112 creates that. I accept that it would cover epilepsy, but it would also cover a whole load of other things, which would then create duplication.

That is why the right way to do this is essentially through a better drafted version of amendment 113, which specifically targets epilepsy. However, it should be done with drafting that has been done properly—with respect to my hon. Friend the Member for Blackpool North and Cleveleys, who drafted the amendment—with definitions that are done properly, and so on. That is what we want to do.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

My hon. Friend makes an extremely powerful point that is incapable of being improved upon.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Or perhaps it is.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - -

It is wonderful that we have such consensus on this issue. I am grateful to colleagues for that. I am very concerned about the pressures on parliamentary time, and the fact that we are kicking this issue down the road again. We could take action today to get the process moving. That is what Zach and his family want and what other people who have been subjected to this hideous bullying want. Without a firm timeframe for another way of getting this done, I am struggling to understand why we cannot do this today.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The progress that the campaign has made, with the clear commitment from the Government that we are going to legislate for a specific epilepsy trolling offence, is a huge step forward. I entirely understand the hon. Lady’s impatience. I have tried to be as forthcoming as I can be about likely times, in answer to the question from the hon. Member for Aberdeen North, within the constraints of what is currently collectively agreed, beyond which I cannot step.

Amendment 112 will sort out the epilepsy, but unfortunately it will create duplicative criminal law. We cannot let our understandable sense of urgency end up creating a slightly dysfunctional criminal statute book. There is a path that is as clear as it reasonably can be. Members of the Committee will probably have inferred the plan from what I said earlier. This is a huge step forward. I suggest that we bank the win and get on with implementing it.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I think that is probably a good place to leave my comments. I can offer public testimony of my hon. Friend’s tenacity in pursuing this issue.

I ask the hon. Member for Batley and Spen to withdraw the amendment. I have given the reasons why: because it would create duplicative criminal law. I have been clear about the path forward, so I hope that on that basis we can work together to get this legislated for as a new offence, which is what she, her constituent and my hon. Friends the Members for Watford and for Eastbourne and others have been calling for.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - -

I appreciate the Minister’s comments and the support from across the House. I would like to push the amendment to a vote.

Question put, That the amendment be made.

Online Safety Bill (Fifteenth sitting)

Kim Leadbeater Excerpts
Committee stage
Thursday 23rd June 2022

(1 year, 10 months ago)

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 23 June 2022 - (23 Jun 2022)
Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

New clause 3 would make provision for a statutory user advocacy body representing the interests of children. It would also allow the Secretary of State to appoint a new or existing body as the statutory user advocate. A strong, authoritative and well-resourced voice that can speak for children in regulatory debates would ensure that complex safeguarding issues are well understood, and would also actively inform the regulator’s decisions.

Charities have highlighted that the complaints and reporting mechanisms in the Bill may not always be appropriate for children. Ofcom’s own evidence shows that only 14% of 12 to 15-year-old children have ever reported content. Children who are most at risk of online harms may find it incredibly challenging to complete a multi-stage reporting and complaints process. Dame Rachel de Souza told the Committee:

“I worry that the Bill does not do enough to respond to individual cases of abuse and that it needs to do more to understand issues and concerns directly from children. Children should not have to exhaust the platforms’ ineffective complaints routes, which can take days, weeks or even months. I have just conducted a survey of 2,000 children and asked them about their experiences in the past month. Of those 2,000 children, 50% had seen harmful content and 40% had tried to get content about themselves removed and had not succeeded. For me, there is something really important about listening to children and taking their complaints into account.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 16, Q22.]

A children’s advocacy body would be able to support children with redress mechanisms that are fundamentally targeted at adults. Given how many children now use the internet, that is an essential element that is missing from the Bill. That is why the super-complaints mechanism needs to be strengthened with specific arrangements for children, as advocated by the National Society for the Prevention of Cruelty to Children and other children’s organisations. A statutory user advocacy body could support the regulator, as well as supporting child users. It would actively promote the interests of children in regulatory decision making and offer support by ensuring that an understanding of children’s behaviour and safeguarding is front and centre in its approach.

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - -

My hon. Friend is making a really valid point. As I look around the room—I mean this with no disrespect to anybody—I see that we are all of an age at which we do not understand the internet in the same way that children and young people do. Surely, one of the key purposes of the Bill is to make sure that children and young people are protected from harms online, and as the Children’s Commissioner said in her evidence, their voices have to be heard. I am sure that, like me, many Members present attend schools as part of their weekly constituency visits, and the conversations we have with young people are some of the most empowering and important parts of this job. We have to make sure that the voices of the young people who we all represent are heard in this important piece of legislation, and it is really important that we have an advocacy body to ensure that.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I very much agree with my hon. Friend. She is quite right: we have to remember that we do not see these things as children and young people do.

The user advocacy body that my hon. Friend has just spoken in support of could also shine a light on the practices that are most harmful to children by using data, evidence and specialist expertise to point to new and emerging areas of harm. That would enable the regulator to ensure its risk profiles and regulatory approach remain valid and up to date. In his evidence, Andy Burrows of the NSPCC highlighted the importance of an advocacy body acting as an early warning system:

“Given the very welcome systemic approach of the regime, that early warning function is particularly important, because there is the potential that if harms cannot be identified quickly, we will see a lag where whole regulatory cycles are missed. User advocacy can help to plug that gap, meaning that harms are identified at an earlier stage, and then the positive design of the process, with the risk profiles and company risk assessments, means that those harms can be built into that particular cycle.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 16, Q22.]

The provision in the new clause is comparable to those that already exist in many other sectors. For example, Citizens Advice is the statutory user advocate for consumers of energy and the postal services, and there are similar arrangements representing users of public transport. Establishing a children’s user advocacy body would ensure that the most vulnerable online users of all—children at risk of online sexual abuse—receive equivalent protections to customers of post offices or passengers on a bus.

--- Later in debate ---
Kim Leadbeater

The hon. Lady is making some excellent points. I wholeheartedly agree with her about funding for bodies that might be able to support the advocacy body or act as part of it. She makes a really important point, which we have not focused on enough during the debate, about the positive aspects of the internet. It is very easy to get bogged down in all the negative stuff, which a lot of the Bill focuses on, but she is right that the internet provides a safe space, particularly for young people, to seek out their own identity. Does she agree that the new clause is important because it specifically refers to protected characteristics and to the Equality Act 2010? I am not sure where else that appears in the Bill, but it is important that it should be there. We are thinking not just about age, but about gender, disability and sexual orientation, which is why this new clause could be really important.

Kirsty Blackman

I absolutely agree. I had not thought about it in those terms, but the hon. Member is right that the new clause gives greater importance to those protected characteristics and lays that out in the Bill.

I appreciate that, under the risk assessment duties set out in the Bill, organisations have to look at protected characteristics in groups and at individuals with those protected characteristics, which I welcome, but I also welcome the inclusion of protected characteristics in the new clause in relation to the duties of the advocacy body. I think that is really important, especially, as the hon. Member for Batley and Spen just said, in relation to the positive aspects of the internet. It is about protecting free speech for children and young people and enabling them to find community and enjoy life online and offline.

Will the Minister give serious consideration to the possibility of a user advocacy body? Third sector organisations are calling for that, and I do not think Ofcom could possibly have the expertise to match such a body.

--- Later in debate ---
I hope those comments make it clear that we already have a statutory advocate: the Children’s Commissioner. Clause 140 contains facilities for other organisations besides our existing statutory advocate to formally and legally raise with Ofcom issues that may arise. Ofcom is bound to reply—it is not optional. Ofcom has to listen to complaints and it has to respond.
Kim Leadbeater

I agree wholeheartedly about the importance of the role of the Children’s Commissioner and she does a fantastic job, but is it not testament to the fact that there is a need for this advocacy body that she is advocating for it and thinks it is a really good idea? The Children Act 2004 is a fantastic Act, but that was nearly 20 years ago and the world has changed significantly since then. The Bill shows that. The fact that she is advocating for it may suggest that she sees the need for a separate entity.

Chris Philp

There is a danger if we over-create statutory bodies with overlapping responsibilities. I just read out the current statutory functions of the Children’s Commissioner under the 2004 Act. If we were to agree to the new clause, we would basically be creating a second statutory advocate or body with duties that are the same as some of those that the Children’s Commissioner already exercises. I read from section 2 of the Act, where those duties are set out. I do not think that having two people with conflicting or competing duties would be particularly helpful.

Online Safety Bill (Sixteenth sitting)

Kim Leadbeater Excerpts
Committee stage
Tuesday 28th June 2022

Public Bill Committees
Kim Leadbeater (Batley and Spen) (Lab)

I rise to speak in favour of new clauses 14 to 16, on media literacy. As we have discussed in Committee, media literacy is absolutely vital to ensure that internet users are aware of the tools available to protect themselves. Knowledge and understanding of the risks online, and how to protect against them, are the first line of defence for us all.

We all know that the Bill will not eliminate all risk online, and it will not entirely clean up the internet. Therefore, ensuring that platforms have robust tools in place, and that users are aware of them, is one of the strongest tools in the Bill to protect internet users. As my hon. Friend the Member for Pontypridd said, including the new clauses in the Bill would help to ensure that we all make decisions based on sound evidence, rather than on poorly informed opinions that can harm not just individuals but democracy itself. The new clauses, which would place a duty on Ofcom to promote media literacy and publish a strategy, are therefore crucial.

I am sure we all agree about the benefits of public health information that informs us of the role of a healthy diet and exercise, and of ways that we can adopt a healthier lifestyle. I do not want to bring up the sensitive subject of the age of members of the Committee, as it got me into trouble with some of my younger colleagues last week, but I am sure many of us will remember the Green Cross Code campaign, the stop smoking campaigns, the anti-drink driving ads, and the powerful campaign to promote the wearing of seatbelts—“Clunk click every trip”. These were publicly funded and produced information campaigns that have stuck in our minds and, I am sure, protected thousands of lives across the country. They laid out the risks and clearly stated the actions we all need to take to protect ourselves.

When it comes to online safety, we need a similar mindset to inform the public of the risks and how we can mitigate them. Earlier in Committee, the right hon. Member for Basingstoke, a former Secretary of State for Digital, Culture, Media and Sport, shared her experience of cyber-flashing and the importance of knowing how to turn off AirDrop to prevent such incidents from occurring in the first place. I had no idea about this simple change that people can make to protect themselves from such an unpleasant experience. That is the type of situation that could be avoided with an effective media literacy campaign, which new clauses 14 to 16 would legislate for.

I completely agree that platforms have a significant duty to design and implement tools for users to protect themselves while using platforms’ services. However, I strongly believe that only a publicly funded organisation such as Ofcom can effectively promote their use, explain the dangers of not using them and target such information at the most vulnerable internet users. That is why I wholeheartedly support these vital new clauses.

Chris Philp

The Government obviously recognise and support the intent behind the new clause, which is to make sure that work is undertaken by Ofcom specifically, and the Government more widely, on media literacy. That is important for the reasons laid out by the hon. Members for Aberdeen North and for Batley and Spen.

Ofcom already has a statutory duty to promote media literacy in relation to electronic media, which includes everything in scope of the Bill and more beyond. That is set out in the Communications Act 2003, so the statutory duty exists already. The duty proposed in new clause 14 is actually narrower in scope than the existing statutory duty on Ofcom, and I do not think it would be a very good idea to give Ofcom an online literacy duty with a narrower scope than the one it has already. For that reason, I will resist the amendment, because it narrows the duties rather than widens them.

I would also point out that a number of pieces of work are being done non-legislatively. The campaigns that the hon. Member for Batley and Spen mentioned—dating often, I think, back to the 1980s—were of course done on a non-legislative basis and were just as effective for it. In that spirit, Ofcom published “Ofcom’s approach to online media literacy” at the end of last year, which sets out how Ofcom plans to expand, and is expanding, its media literacy programmes, which cover many of the objectives specified in the new clause. Therefore, Ofcom itself has acted already—just recently—via that document.

Finally, I have two points about what the Government are doing. First, about a year ago the Government published their own online media literacy strategy, which has been backed with funding and is being rolled out as we speak. When it comes to disinformation more widely, which we have debated previously, we also have the counter-disinformation unit working actively on that area.

Therefore, through the Communications Act 2003, the statutory basis exists already, and on a wider basis than in these new clauses; and, through the online media literacy strategy and Ofcom’s own approach, as recently set out, this important area is well covered already.

--- Later in debate ---
Chris Philp

I completely understand and accept the point that there are groups of people in society who suffer disproportionate harms, as we have debated previously, and that obviously includes women and girls. There are of course other groups as well, such as ethnic minorities or people whose sexual orientation makes them the target of completely unacceptable abuse in a way that other groups do not suffer.

I accept the point about having this “on the face of the Bill”. We have debated this. That is why clauses 10 and 12 use the word “characteristic”—we debated this word previously. The risk assessment duties, which are the starting point for the Bill’s provisions, must specifically and expressly—it is on the face of the Bill—take into account characteristics, first and foremost gender, but also racial identity, sexual orientation and so on. Those characteristics must be expressly addressed by the risk assessments for adults and for children, in order to make sure that the special protections or vulnerabilities or the extra levels of abuse people with those characteristics suffer are recognised and addressed. That is why those provisions are in the Bill, in clauses 10 and 12.

A point was raised about platforms not responding to complaints raised about abusive content that has been put online—the victim complains to the platform and nothing happens. The hon. Members for Pontypridd and for Aberdeen North are completely right that this is a huge problem that needs to be addressed. Clause 18(2) places a duty—they have to do it; it is not optional—on these platforms to operate a complaints procedure that is, in paragraph (c),

“easy to access, easy to use (including by children)”

and that, in paragraph (b),

“provides for appropriate action to be taken”.

They must respond. They must take appropriate action. That is a duty under clause 18. If they do not comply with that duty on a systemic basis, they will be enforced against. The shadow Minister and the hon. Member for Aberdeen North are quite right. The days of the big platforms simply ignoring valid complaints from victims have to end, and the Bill will end them.

Kim Leadbeater

I am extremely impressed by the Minister’s knowledge of the Bill, as I have been throughout the Committee’s sittings. It is admirable to see him flicking from page to page, finding where the information about violence against women and girls is included, but I have to concur with the hon. Member for Aberdeen North and my Front-Bench colleagues. There is surely nothing to be lost by specifically including violence against women and girls on the face of the Bill.

Chris Philp

I hope I have made very clear in everything I have said, which I do not propose to repeat, that the way the Bill operates, in several different areas, and the way the criminal law has been constructed over the past 10 years, building on the work of previous Governments, is that it is designed to make sure that the crimes committed overwhelmingly against women and girls are prioritised. I think the Bill does achieve the objective of providing that protection, which every member of this Committee wishes to see delivered. I have gone through it in some detail. It is woven throughout the fabric of the Bill, in multiple places. The objective of new clause 23 is more than delivered.

In conclusion, we will be publishing a list of harms, including priority harms for children and adults, which will then be legislated for in secondary legislation. The list will be constructed with the vulnerability of women and girls particularly in mind. When Committee members see that list, they will find it reassuring on this topic. I respectfully resist the new clause, because the Bill is already incredibly strong in this important area as it has been constructed.

--- Later in debate ---
Brought up, and read the First time.
Kim Leadbeater

I beg to move, That the clause be read a Second time.

New clause 25 would place an obligation on Ofcom to report annually to Parliament with an update on the effectiveness of the Online Safety Bill, which would also indicate Ofcom’s ability to implement the measures in the Bill to tackle online harms.

As we have discussed, chapter 7 of the Bill compels Ofcom to compile and issue reports on various aspects of the Bill as drafted. Some of those reports are to be made public by Ofcom, and others are to be issued to the Secretary of State, who must subsequently lay them before Parliament. However, new clause 25 would place a direct obligation on Ofcom to be transparent to Parliament about the scale of harms being tackled, the type of harms encountered and the effectiveness of the Bill in achieving its overall objectives.

The current proposal in clause 135 for an annual transparency report is not satisfactory. Those transparency reports are not required to be laid before Parliament. The clause places vague obligations on reporting patterns, and it will not give Parliament the breadth of information needed to allow us to assess the Online Safety Bill’s effectiveness.

Clause 149 is welcome. It will ensure that a review conducted by the Secretary of State in consultation with Ofcom is placed before Parliament. However, that review is a one-off that will provide just a small snapshot of the Bill’s effectiveness. It may not fully reflect Ofcom’s concerns as the regulator, and most importantly it will not disclose the data and information that Parliament needs to accurately assess the impact of the Bill.

--- Later in debate ---
Kim Leadbeater

I agree with the hon. Member wholeheartedly. It should be Parliament that is assessing the effectiveness of the Bill. The Committee has discussed many times how groundbreaking the Bill could be, how difficult it has been to regulate the internet for the first time, the many challenges encountered, the relationship between platforms and regulator and how other countries will be looking at the legislation as a guide for their own regulations. Once this legislation is in place, the only way we can judge how well it is tackling harm in the UK is with clear public reports detailing information on what harms have been prevented, who has intervened to remove that harm, and what role the regulator—in this case Ofcom—has had in protecting us online.

New clause 25 will place a number of important obligations on Ofcom to provide us with that crucial information. First, Ofcom will report annually to Parliament on the overall effectiveness of the Act. That report will allow Ofcom to explore fully where the Act is working, where it could be tightened and where we have left gaps. Throughout the Bill we are heaping considerable responsibility on to Ofcom, and it is only right that Ofcom is able to feed back publicly and state clearly where its powers allow it to act, and where it is constrained and in need of assistance.

Secondly, new clause 25 will compel Ofcom to monitor, collate and publish figures relating to the number of harms removed by category 1 services, which is an important indicator for us to know the scale of the issue and that the Act is working.

Thirdly, we need to know how often Ofcom is intervening, compared with how often the platforms themselves are acting. That crucial figure will allow us to assess the balance of regulation, which assists not only us in the UK but countries looking at the legislation as a guide for their own regulation.

Finally, Ofcom will detail the harms removed by type to identify any areas where the Act may be falling short, and where further attention may be needed.

I hope the Committee understands why this information is absolutely invaluable, when we have previously discussed our concerns that this groundbreaking legislation will need constant monitoring. I hope it will also understand why the information needs to be transparent in order to instil trust in the online space, to show the zero-tolerance approach to online harms, and to show countries across the globe that the online space can be effectively regulated to protect citizens online. Only Parliament, as the legislature, can be an effective monitor of that information. I hope I can count on the Government’s support for new clause 25.

Barbara Keeley

I speak in support of new clause 25. As my hon. Friend has argued, transparency is critical to the Bill. It is too risky to leave information and data about online harms unpublished. That is why we have tabled several amendments to the Bill to increase reporting, both to the regulator and publicly.

New clause 25 is an important addition that would offer an overview of the effectiveness of the Bill and act as a warning bell for any unaddressed historical or emerging harms. Not only would such a report benefit legislators, but the indicators included in the report would be helpful for both Ofcom and user advocacy groups. We cannot continue to attempt to regulate the internet blind. We must have the necessary data and analysis to be sure that the provisions in the Bill are as effective as they can be. I hope the Minister can support this new clause.

Chris Philp

The idea that a report on Ofcom’s activities be delivered to Parliament so that it can be considered is an excellent one. In fact, it is such an excellent idea that it has been set out in statute since 2002: the Office of Communications Act 2002 already requires Ofcom to provide a report to the Secretary of State on the carrying out of all of its functions, which will include the new duties we are giving Ofcom under the Bill. The Secretary of State must then lay that report before each House of Parliament. That is a well-established procedure for Ofcom and for other regulatory bodies. It ensures the accountability of Ofcom to the Department and to Parliament.

I was being slightly facetious there, because the hon. Member for Batley and Spen is quite right to raise the issue. However, the duty she is seeking to create via new clause 25 is already covered by the duties in the Office of Communications Act. The reports that Ofcom publish under that duty will include their new duties under the Bill. Having made that clear, I trust that new clause 25 can be withdrawn.

Kim Leadbeater

I would like to press new clause 25 to a Division. It is important that it is included in the Bill.

Question put, That the clause be read a Second time.

Online Safety Bill

Kim Leadbeater Excerpts
Mr Deputy Speaker (Mr Nigel Evans)

I am grateful to the right hon. Member for Kingston upon Hull North (Dame Diana Johnson) for keeping her powder dry and deferring her speech until the next group of amendments, so Members now have five minutes each.

Kim Leadbeater (Batley and Spen) (Lab)

I rise to speak in favour of amendments 15 to 19 in the names of my hon. Friends and, later, amendments 11 and 12 in the name of the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright).

As we discussed at great length in Committee—my first Bill Committee; a nice simple one to get me started—the Bill has a number of critical clauses to address the atrocious incidence of child sexual exploitation online. Amendments 15 to 19 are aimed at strengthening those protections and helping to ensure that the internet is a safer place for every young person. Amendments 15 and 16 will bring into scope tens of millions of interactions with accounts that actively enable the discovery and sharing of child abuse material. Amendments 17 to 19 will tackle the issue of cross-platform abuse, where abuse starts on one platform and continues on another. These are urgent measures that children’s charities and advocacy groups have long called for, and I seriously hope this House will support them.

Last week, along with the shadow Minister and the then Minister, I attended an extremely moving reception hosted by one of those organisations, the NSPCC. It included a speech by Rachel, a mother of a victim of online grooming and child sexual exploitation. She outlined in a very powerful way how her son Ben was forced from the age of 13 to take and share photos of himself that he did not want to, and to enter Skype chats with multiple men. He was then blackmailed with those images and subjected to threats of violence to his family. Rachel said to us:

“We blamed ourselves and I thought we had failed…I felt like I hadn’t done enough to protect our children”.

I want to say to you, Rachel, that you did not fail Ben. Responsibility for what happened to Ben lies firmly with the perpetrators of these heinous crimes, but what did fail Ben and has failed our young people for far too long is the lack of urgency and political will to regulate the wild west of the internet. No one is pretending that this is an easy task, and we are dealing with a highly complex piece of legislation, but if we are to protect future Bens we have to strengthen this Bill as much as possible.

Another young woman, Danielle, spoke during the NSPCC event. She had been a victim of online CSE that had escalated into horrific real-world physical and sexual abuse. She told us how she has to live with the fear that her photos may appear online and be shared without her knowledge or control. She is a strong young woman who is moving on with her life with huge resilience, but her trauma is very real. Amendment 19 would ensure that proportionate measures are in place to prevent the encountering or dissemination of child abuse content—for example, through intelligence sharing of new and emerging threats. This will protect Danielle and people like her, giving them some comfort that measures are in place to stop the spread of these images and to place far more onus on the platforms to get on top of this horrific practice.

Amendments 11 and 12, in the name of the right hon. and learned Member for Kenilworth and Southam, will raise the threshold for non-broadcast media outlets to benefit from the recognised news publisher exemption by requiring that such publishers are subject to complaints procedures that are both suitable and sufficient. I support those amendments, which, while not perfect, are a step forward in ensuring that this exception is protected from abuse.

I am also pleased that the Government have listened to some of my and other Members’ concerns and have now agreed to bring forward amendments at a later stage to exclude sanctioned publishers such as Russia Today from accessing this exemption. However, there are hundreds if not thousands of so-called news publishers across the internet that pose a serious threat, from the far right and also from Islamist, antisemitic and dangerous conspiratorial extremism. We must act to ensure that journalistic protections are not abused by those wishing to spread harm. Let us be clear that this is as much about protecting journalism as it is about protecting users from harm.

We cannot overstate the seriousness of getting this right. Carving out protections within the Bill creates a risk that if we do not get the criteria for this exemption right, harmful and extremist websites based internationally will simply establish offices in the UK, just so that they too can access this powerful new protection. Amendments 11 and 12 will go some way towards ensuring that news publishers are genuine, but I recognise that the amendments are not the perfect solution and that more work is needed as the Bill progresses in the other place.

In closing, I hope that we can find consensus today around the importance of protecting children online and restricting harmful content. It is not always easy, but I know we can find common ground in this place, as we saw during the Committee stage of the Bill when I was delighted to gain cross-party support to secure the introduction of Zach’s law, inspired by my young constituent Zach Eagling, which will outlaw the dreadful practice of epilepsy trolling online.

Online Safety Bill

Kim Leadbeater Excerpts
Madam Deputy Speaker (Dame Rosie Winterton)

Order. Just a quick reminder: I know it is extremely difficult, and I do not want to interrupt hon. Members when they are making their speeches, but it is important that we try to address the amendments that are before us today. There will be a separate debate on whether to recommit the Bill and on the other ideas, so they can be addressed at that point. As I say, it is important to relate remarks to the amendments that are before us.

Kim Leadbeater (Batley and Spen) (Lab)

I apologise for having left the debate for a short time; I had committed to speaking to a room full of young people about the importance of political education, which felt like the right thing to do, given the nature of the debate and the impact that the Bill will have on our young people.

I am extremely relieved that we are continuing to debate the Bill, despite the considerable delays that we have seen; as I mentioned in this House previously, it is long overdue. I acknowledge that it is still groundbreaking in its scope and extremely important, but we must now ensure that it works, particularly for children and vulnerable adults, and that it goes some way to cleaning up the internet for everyone by putting users first and holding platforms to account.

On new clause 53, I put on record my thanks to the Government for following through with their commitments to me in Committee to write Zach’s law in full into the Bill. My constituent Zach Eagling and his mum Clare came into Parliament a few weeks ago, and I know that hon. Members from both sides of the House were pleased to meet him to thank him for his incredible campaign to make the vile practice of epilepsy trolling completely illegal, with a maximum penalty of a five-year prison sentence. The inspirational Zach, his mum and the Epilepsy Society deserve enormous praise and credit for their incredible campaign, which will now protect the 600,000 people living with epilepsy in the UK. I am delighted to report that Zach and his mum have texted me to thank all hon. Members for their work on that.

I will raise three areas of particular concern with the parts of the Bill that we are focusing on. First, on director liability, the Bill includes stiff financial penalties for platforms that I hope will force them to comply with these regulations, but until the directors of these companies are liable and accountable for ensuring that their platforms comply and treat the subject with the seriousness it requires, I do not believe that we will see the action needed to protect children and all internet users.

Ultimately, if platforms enforce their own terms and conditions, remove illegal content and comply with the legal but harmful regulations—as they consistently tell us that they will—they have nothing to worry about. When we hear the stories of harm committed online, however, and when we hear from the victims and their families about the devastation that it causes, we must be absolutely watertight in ensuring that those who manage and operate the platforms take every possible step to protect every user on their platform.

We must ensure that, to the directors of those companies, this is a personal commitment as part of their role and responsibility. As we saw with health and safety regulations, direct liability is the most effective way to ensure that companies implement such measures and are scrupulous in reviewing them. That is why I support new clause 17 and thank my right hon. Friend the Member for Barking (Dame Margaret Hodge) for her tireless and invaluable work on this subject.

Let me turn to media literacy—a subject that I raised repeatedly in Committee. I am deeply disappointed that the Government have removed the media literacy duty that they previously committed to introducing. Platforms can boast of all the safety tools they have to protect users, talk about them in meetings, publicise them in press releases and defend them during Committee hearings, but unless users know that they are there and know exactly how to use them, and unless they are being used, their existence is pointless.

ONLINE SAFETY BILL (First sitting)

Kim Leadbeater Excerpts
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees
Kirsty Blackman

Thank you, Sir Roger. I absolutely agree with the hon. Member for Warrington North. The platform works by stitching things together, so a video could have a bit of somebody else’s video in it, and that content ends up being shared and disseminated more widely.

This is not an attack on every algorithm. I am delighted to see lots of videos of cats—it is wonderful, and it suits me down to the ground—but the amendment asks platforms to analyse how those processes contribute to the development of habit-forming behaviour and to mitigate the harm caused to children by habit-forming features in the service. It is not saying, “You can’t use algorithms” or “You can’t use anything that may encourage people to linger on your site.” The specific issue is addiction—the fact that people will get sucked in and stay on platforms for hours longer than is healthy.

There is a demographic divide here. There is a significant issue when we compare children whose parents are engaged in these issues and spend time—and have the time to spend—assisting them to use the internet. There is a divide between the experiences of those children online and the experiences of children who are generally not nearly as well off, whose parents may be working two or three jobs to try to keep their homes warm and keep food on the table, so the level of supervision those children have may be far lower. We have a parental education gap, where parents are not able to instruct or teach their children a sensible way to use these things. A lot of parents have not used things such as TikTok and do not know how it works, so they are unable to teach their children.

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - -

Does the hon. Lady agree that this feeds into the problem we have with the lack of a digital media literacy strategy in the Bill, which we have, sadly, had to accept? However, that makes it even more important that we protect children wherever we have the opportunity to do so, and this amendment is a good example of where we can do that.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The hon. Lady makes an excellent point. This is not about mandating that platforms stop doing these things; it is about ensuring that they take this issue into account and that they agree—or that we as legislators agree—with the Royal College of Psychiatrists that we have a responsibility to tackle it. We have a responsibility to ask Ofcom to tackle it with platforms.

This comes back to the fact that we do not have a user advocacy panel, and groups representing children are not able to bring emerging issues forward adequately and effectively. Because of the many other inadequacies in the Bill, that is even more important than it was. I assume the Minister will not accept my amendment—that generally does not happen in Bill Committees—but if he does not, it would be helpful if he could give Ofcom some sort of direction of travel so that it knows it should take this issue into consideration when it deals with platforms. Ofcom should be talking to platforms about habit-forming features and considering the addictive nature of these things; it should be doing what it can to protect children. This threat has emerged only in recent years, and things will not get any better unless we take action.

--- Later in debate ---
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - -

Does the hon. Lady agree that people out there in the real world have absolutely no idea what a platform’s terms of service are, so we are being expected to make a judgment on something about which we have absolutely no knowledge?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Absolutely. The amendment I tabled regarding the accessibility of terms of service was designed to ensure that if the Government rely on terms of service, children can access those terms of service and are able to see what risks they are putting themselves at. We know that in reality children will not read these things. Adults do not read these things. I do not know what Twitter’s terms of service say, but I do know that Twitter managed to change its terms of service overnight, very easily and quickly. Companies could just say, “I’m a bit fed up with Ofcom breathing down my neck on this. I’m just going to change my terms of service, so that Ofcom will not take action on some of the egregious harm that has been done. If we just change our terms of service, we don’t need to bother. If we say that we are not going to ban transphobia on our platform—if we take that out of the terms of service—we do not need to worry about transphobia on our platform. We can just let it happen, because it is not in our terms of service.”

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Clause 12 is extremely important because it outlines the platforms’ duties in relation to keeping adults safe online. The Government’s attempts to remove the clause through an amendment that thankfully has not been selected are absolutely shocking. In addressing Government amendments 18, 23, 24, 25, 32, 33 and 39, I must ask the Minister: exactly how will this Bill do anything to keep adults safe online?

In the original clause 12, companies had to assess the risk of harm to adults and the original clause 13 outlined the means by which providers had to report these assessments back to Ofcom. This block of Government amendments will make it impossible for any of us—whether that is users of a platform or service, researchers or civil society experts—to understand the problems that arise on these platforms. Labour has repeatedly warned the Government that this Bill does not go far enough to consider the business models and product design of platforms and service providers that contribute to harm online. By tabling this group of amendments, the Government are once again making it incredibly difficult to fully understand the role of product design in perpetuating harm online.

We are not alone in our concerns. Colleagues from Carnegie UK Trust, who are a source of expertise to hon. Members across the House when it comes to internet regulation, have raised their concerns over this grouping of amendments too. They have raised specific concerns about the removal of the transparency obligation, which Labour has heavily pushed for in previous Bill Committees.

Previously, service providers had been required to inform customers of the harms their risk assessment had detected, but the removal of this risk assessment means that users and consumers will not have the information to assess the nature of the risk on the platform. The Minister may point to the Government’s approach in relation to the new content duties in platforms’ and providers’ terms of service, but we know that there are risks arising from the fact that there is no minimum content specified for the terms of service for adults, although of course all providers will have to comply with the illegal content duties.

This approach, like the entire Bill, is already overly complex—that is widely recognised by colleagues across the House and is the view of many stakeholders too. In tabling this group of amendments, the Minister is showing his ignorance. Does he really think that all vulnerabilities to harm online simply disappear at the age of 18? By pushing these amendments, which seek to remove these protections from harmful but legal content to adults, the Minister is, in effect, suggesting that adults are not susceptible to harm and therefore risk assessments are simply not required. That is an extremely narrow-minded view to take, so I must push the Minister further. Does he recognise that many young, and older, adults are still highly likely to be impacted by suicide and self-harm messaging, eating disorder content, disinformation and abuse, which will all be untouched by these amendments?

Labour has been clear throughout the passage of the Bill that we need to see more, not less, transparency and protection from online harm for all of us—whether adults or children. These risk assessments are absolutely critical to the success of the Online Safety Bill and I cannot think of a good reason why the Minister would not support users in being able to make an assessment about their own safety online.

We have supported the passage of the Bill, as we know that keeping people safe online is a priority for us all and we know that the perfect cannot be the enemy of the good. The Government have made some progress towards keeping children safe, but they clearly do not consider it their responsibility to do the same for adults. Ultimately, platforms should be required to protect everyone: it does not matter whether they are a 17-year-old who falls short of being legally deemed an adult in this country, an 18-year-old or even an 80-year-old. Ultimately, we should all have the same protections and these risk assessments are critical to the online safety regime as a whole. That is why we cannot support these amendments. The Government have got this very wrong and we have genuine concerns that this wholesale approach will undermine how far the Bill will go to truly tackling harm online.

I will also make comments on clause 55 and the other associated amendments. I will keep my comments brief, as the Minister is already aware of my significant concerns over his Department’s intention to remove adult safety duties more widely. In the previous Bill Committee, Labour made it clear that it supports, and thinks it most important, that the Bill should clarify specific content that is deemed to be harmful to adults. We have repeatedly raised concerns about missing harms, including health misinformation and disinformation, but really this group of amendments, once again, will touch on widespread concerns that the Government’s new approach will see adults online worse off. The Government’s removal of the “legal but harmful” sections of the Online Safety Bill is a major weakening—not a strengthening—of the Bill. Does the Minister recognise that the only people celebrating these decisions will be the executives of big tech firms, and online abusers? Does he agree that this delay shows that the Government have bowed to vested interests over keeping users and consumers safe?

Labour is not alone in having these concerns. We are all pleased to see that child safety duties are still present in the Bill, but the NSPCC, among others, is concerned about the knock-on implications that may introduce new risks to children. Without adult safety duties in place, children will be at greater risk of harm if platforms do not identify and protect them as children. In effect, these plans will now place a significant greater burden on platforms to protect children than adults. As the Bill currently stands, there is a significant risk of splintering user protections that can expose children to adult-only spaces and harmful content, while forming grooming pathways for offenders, too.

The reality is that these proposals to deal with harms online for adults rely on the regulator ensuring that social media companies enforce their own terms and conditions. We already know and have heard that that can have an extremely damaging impact for online safety more widely, and we have only to consider the very obvious and well-reported case study involving Elon Musk’s takeover of Twitter to really get a sense of how damaging that approach is likely to be.

In late November, Twitter stopped taking action against tweets that violated its coronavirus misinformation rules. The company had suspended at least 11,000 accounts under that policy, which was designed to remove accounts posting demonstrably false or misleading content relating to covid-19 that could lead to harm. The company operated a five-strike policy, and the impact on public health around the world of removing that policy is likely to be tangible. The situation also raises questions about the platform’s other misinformation policies. As of December 2022, they remain active, but for how long remains unclear.

Does the Minister recognise that as soon as they are inconvenient, platforms will simply change their terms and conditions, and terms of service? We know that simply holding platforms to account for their terms and conditions will not constitute robust enough regulation to deal with the threat that these platforms present, and I must press the Minister further on this point.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - -

My hon. Friend is making an excellent speech. I share her deep concerns about the removal of these clauses. The Government have taken this tricky issue of the concept of “legal but harmful”—it is a tricky issue; we all acknowledge that—and have removed it from the Bill altogether. I do not think that is the answer. My hon. Friend makes an excellent point about children becoming 18; the day after they become 18, they are suddenly open to lots more harmful and dangerous content. Does she also share my concern about the risks of people being drawn towards extremism, as well as disinformation and misinformation?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

My hon. Friend makes a valid point. This is not just about misinformation and disinformation; it is about leading people to really extreme, vile content on the internet. As we all know, that is a rabbit warren. The danger does not disappear the moment a 17-year-old turns 18, as if on their 18th birthday they suddenly become exempt from the effects of this horrendous content. The rules need to be there to protect all of us.

As we have heard, terms and conditions can change overnight. Stakeholders have raised the concern that, if faced with a clearer focus on their terms of service, platforms and providers may choose to make their terms of service shorter, in an attempt to cut out harmful material that, if left undealt with, they may be held liable for.

In addition, the fact that there is no minimum requirement in the regime means that companies have complete freedom to set terms of service for adults, which may not reflect the risks to adults on that service. At present, service providers do not even have to include terms of service in relation to the list of harmful content proposed by the Government for the user empowerment duties—an area we will come on to in more detail shortly as we address clause 14. The Government’s approach and overreliance on terms of service, which as we know can be so susceptible to rapid change, is the wrong approach. For that reason, we cannot support these amendments.

I would just say, finally, that none of us was happy with the term “legal but harmful”. It was a phrase we all disliked, and it did not encapsulate exactly what the content is or includes. Throwing the baby out with the bathwater is not the way to tackle that situation. My hon. Friend the Member for Batley and Spen is right that this is a tricky area, and it is difficult to get it right. We need to protect free speech, which is sacrosanct, but we also need to recognise that there are so many users on the internet who do not have access to free speech as a result of being piled on or shouted down. Their free speech needs to be protected too. We believe that the clauses as they stand in the Bill go some way to making the Bill a meaningful piece of legislation. I urge the Minister not to strip them out, to do the right thing and to keep them in the Bill.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Why do the Government now think that there is a risk to free speech? If Ofcom never had that power, if it was never an issue, why are the Government bothered about that risk—it clearly was not a risk—to free speech? If that was never a consideration, it obviously was not a risk to free speech, so I am now even more confused as to why the Government have decided that they will have to strip this measure out of the Bill because of the risk to free speech, because clearly it was not a risk in this situation. This is some of the most important stuff in the Bill for the protection of adults, and the Government are keen to remove it.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - -

The hon. Member is making an excellent and very passionate speech, and I commend her for that. Would she agree with one of my concerns, which is about the message that this sends to the public? It is almost as if the Government were acknowledging that there was a problem with legal but harmful content—we can all, hopefully, acknowledge that it is a problem, even though we know it is a tricky one to tackle—but, by removing these clauses from the Bill, are now sending the message that, “We were trying to clean up the wild west of the internet, but, actually, we are not that bothered anymore.”

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The hon. Lady is absolutely right. We have all heard from organisations and individuals who have had their lives destroyed as a result of “legal but harmful”—I don’t have a better phrase for it—content online and of being radicalised by being driven deeper and deeper into blacker and blacker Discord servers, for example, that are getting further and further right wing.

A number of the people who are radicalised—who are committing terror attacks, or being referred to the Prevent programme because they are at risk of committing terror attacks—are not so much on the far-right levels of extremism any more, or those with incredible levels of religious extremism, but are in a situation where they have mixed or unclear ideological drivers. It is not the same situation as it was before, because people are being radicalised by the stuff that they find online. They are being radicalised into situations where they “must do something”—they “must take some action”—because of the culture change in society.

ONLINE SAFETY BILL (Second sitting)

Kim Leadbeater Excerpts
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees
Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I have been clear about where I set the line. [Interruption.] I have said that if something is illegal, it is illegal. The terms of service of the platforms largely cover the list that we are talking about. As my hon. Friend the Member for Folkestone and Hythe and I have both said, the terms of service of the vast majority of platforms—the big category 1 platforms—set a higher bar than was in our original Bill. The hon. Member for Luton North talked about whether we should have more evidence. I understand that the pre-legislative scrutiny committee heard evidence and came to a unanimous conclusion that the “legal but harmful” conditions should not be in the Bill.

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - -

A few moments ago, the Minister compared the online world to the real world. Does he agree that they are not the same? Sadly, the sort of thing that someone says in the pub on a Friday night to two or three of their friends is very different from someone saying something dangerously harmful online that can reach millions and billions of people in a very short space of time. The person who spoke in the pub might get up the following morning and regret what they said, but no harm was done. Once something is out there in the online world, very serious damage can be done very quickly.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The hon. Lady makes a good point. I talked about the offline world rather than the real world, but clearly that can happen. That is where the balance has to be struck, as we heard from my hon. Friend the Member for Don Valley. It is not black and white; it is a spectrum of greys. Any sensible person can soon see when they stray into areas that we have talked about such as holocaust denial and extremism, but we do not want to penalise people who invariably are testing their freedom of expression.

It is a fine balance, but I think that we have reached the right balance between protecting freedom of expression and protecting vulnerable adults by having three layers of checks. The first is illegality. The second is enforcing the terms of service, which provide a higher bar than we had in the original Bill for the vast majority of platforms, so that we can see right at the beginning how they will be enforced by the platforms. If they change them and do not adhere to them, Ofcom can step in. Ofcom can step in at any point to ensure that they are being enforced. The third is a safety net.

--- Later in debate ---
Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I totally appreciate the point that the hon. Lady makes, which is a different one. For gambling, the inducement to act straightaway often comes in the form of advertising. It usually comes in the form of free bets and immediate inducements to act. People who have self-excluded should not be targeted in that way. We need to ensure that that is rigorously enforced on online platforms too.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - -

It is a pleasure to serve under your chairship, Dame Angela. It is lovely to be back in a Public Bill Committee with many familiar faces—and a few new ones, including the Minister. However, after devoting many weeks earlier this year to the previous Committee, I must admit that it is with some frustration that we are back here with the Government intent on further weakening their Bill.

Throughout the passage of the Bill, I have raised a number of specific concerns, from democratic and journalistic exemptions, to age verification, recognised news publishers, advocacy bodies and media literacy. On clause 14, while I support the principles of Government amendments 15 and 16, I draw the Minister’s attention to the importance of amendment (a) to amendment 15 and amendment (a) to amendment 16. He has already said that he is sympathetic to those amendments. Let me try to convince him to turn that sympathy into action.

I will focus primarily on an issue that is extremely important to me and to many others: extremism and radicalisation. However, while I will focus on the dangers of extremism and radicalisation, be it right-wing, Islamist, incel or other, the dangers that I am about to set out—the chain of events that leads to considerable harm online—are the same for self-harm content, eating disorder content, health disinformation, climate change disinformation or any dangerous, hateful material directed at people based on their sex, sexual orientation, ethnicity, religion or other characteristics.

Such content is not just deeply offensive and often wholly inaccurate; it is dangerous and vile and serves only to spread harm, misinformation and conspiracy. To be clear, such content is not about a social media user stating how upset and angry they are about the football result, or somebody disagreeing legitimately and passionately about a political issue. It is not the normal, everyday social media content that most people see on their feeds.

This is content that is specifically, carefully and callously designed to sit just below the criminal threshold, yet that can still encourage violence, self-harm or worse. It is content used by extremists of all types that lures vulnerable people in, uses social media likes and comments to create the illusion of legitimacy and popularity, and then directly targets those most likely to be susceptible, encouraging them either to commit harm or to move on to smaller but high-harm platforms that may fall out of the scope of the Bill. This is not free speech; it is content that can act as a dangerous gateway to radicalisation and extremism. The Government know how dangerous it is because their own report from His Majesty’s Prison and Probation Service last year found:

“The Internet appears to be playing an increasingly prominent role in radicalisation processes of those convicted of extremist offences in England and Wales.”

Hon. Members will understand my deep and personal interest in this matter. Since the murder of my sister, a Member of this House, six and a half years ago by a far-right extremist, I have worked hard to bring communities and people together in the face of hatred. Some of that work has included meeting former extremists and discussing how they were radicalised. Those conversations were never easy, but what became very clear to me was that such people are not born extremists. Their radicalisation starts somewhere, and it is often somewhere that appears to be completely innocent, such as a Facebook group about issues or problems in their community, a Twitter discussion about current affairs or the state of the country, or even a page for supporters of their football team.

One day, a comment is posted that is not illegal and is not hate speech, but that references a conspiracy or a common trope. It is an ideological remark placed there to test the water. The conversation moves on and escalates. More disturbing or even violent comments start to be made. They might be accompanied by images or videos, leading those involved down a more sinister path. Nothing yet is illegal, but clearly—I hope we would all agree—it is unacceptable.

The number of contributors reduces, but a few remain. No warnings are presented, no flags are raised and it appears like normal social media content. However, the person reading it might be lonely or vulnerable, and now feels that they have found people to listen to them. They might be depressed or unhappy and looking to blame their situation on something or someone. They might feel that nobody understands them, but these people seem to.

The discussion is then taken to a more private place, to the smaller but more harmful platforms that may fall outside the scope of the Bill, but that will now become the go-to place for spreading extremism, misinformation and other harmful content. The radicalisation continues there—harder to track, harder to monitor and harder to stop. Let us remember, however, that all of that started with those legal but harmful comments being witnessed. They were clearly unacceptable, but mainstream social media give them legitimacy. The Online Safety Bill will do nothing to stop that.

Unfortunately, that chain of events occurs far too often. It is a story told many times, about how somebody vulnerable is lured in by those wishing to spread their hatred. It is hosted by major social media platforms. Hon. Members may remember the case of John, a teenager radicalised online and subsequently sentenced. His story was covered by The Guardian last year. John was feeling a sense of hopelessness, which left him susceptible to the messaging of the far right. Aged 15, he felt “written off”: he was in the bottom set at school, with zero exam expectations, and feeling that his life opportunities would be dismal. The far right, however, promised him a future. John became increasingly radicalised by an online barrage of far-right disinformation. He said:

“I was relying on the far right for a job. They were saying that when they got power they would be giving jobs to people like me”.

John now says:

“Now I know the posts were all fake, but the 15-year-old me didn’t bother to fact-check.”

For some people in the room, that might seem like a totally different world. Thankfully, for most of us, it is. However, if Members take the time to see some of that stuff online, it is extremely disturbing and alarming. It is a world that we do not understand, but we have to be aware that it exists. The truth, as we can see, is that such groups use popular online platforms to lure in young people and give them a sense of community. One white nationalist group actively targets younger recruits and recently started Call of Duty warcraft gaming tournaments for its supporters. Let us be clear: John was 15, but he could easily have been 18, 19 or indeed significantly older.

John was radicalised by the far right, but we know that similar methods are used by Islamist extremists. A 2020 report from New York University’s Centre for Global Affairs stated:

“The age of social media has allowed ISIS to connect with a large-scale global audience that it would not be able to reach without it...Through strategic targeting, ISIS selects those who are most vulnerable and susceptible to radicalization”.

That includes those who are

“searching for meaning or purpose in their life, feeling anger and…alienated from society”.

The ages that are most vulnerable are 15 to 25.

Social media platforms allow ISIS to present its propaganda as mainstream news at little to no cost. Preventing that harm and breaking those chains of radicalisation is, however, possible, and the Bill could go much further to put the responsibility not on the user, but on the platforms. I believe that those platforms need unique regulation, because social media interaction is fundamentally different from real-life social interaction.

Social media presents content to us as if it is the only voice and viewpoint. On social media, people are far more likely to say things that they never would in person. On social media, those views spread like wildfire in a way that they would not in real life. On social media, algorithms find such content and pump it towards us, in a way that can become overwhelming and that can provide validity and reassurance where doubt might otherwise set in.

Allowing that content to remain online without warnings, or allowing it to be visible to all users unless they go searching through their settings to turn it off—which is wholly unrealistic—is a dereliction of duty and a missed opportunity to clean up the platforms and break the chains of radicalisation. As I set out, the chain of events is not unique to one form of radicalisation or hateful content. The same online algorithms that present extremist content to users also promote negative body image, eating disorders, and self-harm and suicide content.

I hope the Committee realises why I am so impassioned about “legal but harmful” clauses, and why I am particularly upset that a few Conservative Members appear to believe that such content should remain unchecked online because of free speech, with full knowledge that it is exactly that content that serves as the gateway for people to self-harm and to be radicalised. That is not free speech.

--- Later in debate ---
That is not the way to go; we should be writing in the protections. We should be starting from the point of view that no one wants to see content on the promotion of suicide; if they do, they can tick a box to see it. We should start from that point of view: allowing people to opt in if they want to see free speech in an untrammelled way on whatever platform it is.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - -

I will speak briefly in favour of amendments 102 and 103. As I mentioned a few moments ago, legal but harmful content can act as the gateway to dangerous radicalisation and extremism. Such content, hosted by mainstream social media platforms, should not be permitted unchecked online. I appreciate that for children the content will be banned, but I strongly believe that such content should be hidden by default for all adult users, as the amendments would ensure.

The chain of events that leads to radicalisation, as I spelt out, relies on groups and individuals reaching people unaware that they are being radicalised. The content is posted in otherwise innocent Facebook groups, forums or Twitter threads. Adding a toggle, hidden somewhere in users’ settings, which few people know about or use, will do nothing to stop that. It will do nothing to stop the harmful content from reaching vulnerable and susceptible users.

We, as legislators, have an obligation to prevent, at root, that harmful content from reaching and drawing in those who are vulnerable and susceptible to the misinformation and conspiracy spouted by vile groups and individuals wishing to spread their harm. The only way that we can make meaningful progress is by putting the responsibility squarely on platforms, to ensure that by default users do not come across the content in the first place.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

In the previous debate, I talked about amendment 15, which brought in a lot of protections against content that encourages and promotes, or provides instruction for, self-harm, suicide or eating disorders, and against content that is abusive or incites hate on the basis of race, religion, disability, sex, gender reassignment or sexual orientation. We have also placed a duty on the largest platforms to offer adults the option to filter out unverified users if they so wish. That is a targeted approach that reflects areas where vulnerable users in particular could benefit from having greater choice and control. I come back to the fact that that is the third shield and an extra safety net. A lot of the extremes we have heard about, which have been used as debating points, as important as they are, should very much be wrapped up by the first two shields.

We have a targeted approach, but it is based on choice. It is right that adult users have a choice about what they see online and who they interact with. It is right that this choice lies in the hands of those adults. The Government mandating that these tools be on by default goes against the central aim of users being empowered to choose for themselves whether they want to reduce their engagement with some kinds of legal content.

We have been clear right from the beginning that it is not the Government’s role to say what legal content adults should or should not view online or to incentivise the removal of legal content. That is why we removed the adult legal but harmful duties in the first place. I believe we are striking the right balance between empowering adult users online and protecting freedom of expression. For that reason, I am not able to accept the amendments from the hon. Member for Pontypridd.

--- Later in debate ---
Alex Davies-Jones

As we know, this clause requires providers of relevant services to publish annual transparency reports and sets out Ofcom’s powers in relation to those reports. The information set out in transparency reports is intended to help users to understand the steps that providers are taking to help keep them safe and to provide Ofcom with the information required to hold them to account.

These duties on regulated services are very welcome indeed. Labour has long held the view that mandatory transparency reporting and reporting mechanisms are vital to hold platforms to account, and to understand the true nature of how online harm is driven and perpetuated on the internet.

I will reiterate the points made in previous Committee sittings about our concerns over the regularity of these transparency reports. I note that, sadly, those provisions remain unchanged, so the reports will have to be submitted to Ofcom only annually. It is important that the Minister truly considers the rapid rate at which the online world can change and develop, so I urge him to reconsider this point and make the reports twice-yearly. Labour firmly believes that increasing the frequency of the transparency reports will ensure that platforms and services keep their finger on the pulse, and are forced to be aware of and act on emergent risks. In turn, that would compel Ofcom to do the same in its role as industry regulator.

I must also put on the record some of our concerns about subsections (12) and (13), which state that the Secretary of State of the day could amend by regulation the frequency of the transparency reporting, having consulted Ofcom first. I hope that the Minister can reassure us that this approach will not result in our ending up in a position where, perhaps because of Ofcom’s incredible workload, transparency reporting becomes even less frequent than an annual occurrence. We need to see more transparency, not less, so I really hope that he can reassure me on this particular point.

Kim Leadbeater

Does my hon. Friend agree that transparency should be at the heart of this Bill and that the Government have missed an opportunity to accelerate the inclusion of a provision in the Bill, namely the requirement to give researchers and academics access to platform data? Data access must be prioritised in the Bill and without such prioritisation the UK will fall behind the rest of Europe in safety, research and innovation. The accessibility and transparency of that data from a research perspective are really important.

Alex Davies-Jones

I completely agree with my hon. Friend. We both made the point at length in previous sittings of the Committee about the need to ensure transparency, access to the data, and access to reporting for academics, civil society and researchers.

That also goes to the point that it is not for this Committee or this Minister—it is not in his gift—to determine something that we have all discussed in this place at length, which is the potential requirement for a standalone Committee specifically to consider online harm. Such a Committee would look at whether this legislation is actively doing what we need it to do, whether it needs to be reviewed, whether it could look at the annual reports from Ofcom to determine the length and breadth of harm on the internet, and whether or not this legislation is actually having an impact. That all goes to the heart of transparency, openness and the review that we have been talking about.

I want to go further and raise concerns about how public the reports will be, as we have touched on. The Government claim that their so-called triple shield approach will give users of platforms and services more power and knowledge to understand the harms that they may discover online. That is in direct contradiction to the Bill’s current approach, which does not provide any clarity about exactly how the transparency reports will be made available to the public. In short, we feel that the Government are missing a significant opportunity. We have heard many warnings about what can happen when platforms are able to hide behind a veil of secrecy. I need only point to the revelations of whistleblowers, including Frances Haugen, to highlight the importance of that point.

As the Bill stands, once Ofcom has issued a notice, companies will have to produce a transparency report that

“must…be published in the manner and by the date specified in the notice”.

I want to press the Minister on that and ask him to clarify the wording. We are keen for the reports to be published publicly and in an accessible way, so that users, civil society, researchers and anyone else who wants to see them can make sense of them. The information contained in the transparency reports is critical to analysing trends and harms, so I hope that the Minister will clarify those points in his response.

Kim Leadbeater

Does my hon. Friend agree that if the Government are to achieve their objective—which we all share—for the Bill to be world-leading legislation, we cannot rely on whistleblowers to tell us what is really going on in the online space? That is why transparency is vital. This is the perfect opportunity to provide that transparency, so that we can do some proper research into what is going on out there. We cannot rely on whistleblowers to give us such information.

Alex Davies-Jones

My hon. Friend is absolutely right. We want the Bill to work. We have always wanted the Bill to work. We want it to achieve its aim of keeping children, adults and everyone who uses the internet safe from the harms that are perpetrated there. If there is no transparency, how will we know whether the platforms are covertly breaking the rules, hiding content and getting round them? That is what they do; we know it, because we have heard it from whistleblowers, but we cannot rely on whistleblowers alone to highlight exactly what happens behind the closed doors of the platforms.

We need the transparency and the reports to be made public, so that we can see whether the legislation is working. If that does not happen, although we have waited five years, we will need another piece of legislation to fix it. We know that the Bill is not perfect, and the Minister knows that—he has said so himself—but, ultimately, we need to know that it works. If it does not, we have a responsibility as legislators to put something in place that does. Transparency is the only way in which we will figure that out.

Kirsty Blackman

This may be the only place in the Bill where I do not necessarily agree wholeheartedly with the Labour Front Benchers. I agree with the vast majority of what was said, but I have some concerns about making mandatory the requirement for transparency reports to be public in all circumstances, because there are circumstances in which that would simply highlight loopholes, allowing people to exploit them in a way that we do not want them to do.

On the regularity of reporting specifically, and on some level of transparency: given that the Minister is keen on the commercial imperative and on ensuring that people are safe, we need a higher level of transparency than we currently see from the platforms. There is a very good case for some of the transparency reporting to be made public, and in particular for the very largest platforms to be required to publish it, or at least sections of it.

I want to talk about the speed of change to terms of service, and about proportionality. If Ofcom could request transparency reporting only annually, imagine that it had received transparency information three days before Elon Musk took over Twitter. Twitter was a completely different place three days later, yet Ofcom would have been unable to ask for more transparency information for a whole year, by which point a significant amount of damage could have been done. We have seen that terms of service can change quickly, and Ofcom would not have the flexibility to ask for an updated transparency report, even if drastic changes were made to a service.

Another thing slightly concerns me about doing this annually and not allowing a bit more flexibility. Let us say that a small platform that none of us has ever heard of, such as Mastodon, shoots to prominence overnight. Let us also say that Mastodon, as a small platform, was already regulated, and that Ofcom had requested transparency information from it shortly before Elon Musk took over Twitter and people migrated to Mastodon. Mastodon would now be facing very different issues from those it had when it had a small number of users, compared with the significant number it has now. It would have changed dramatically, yet Ofcom would not have the flexibility to seek that information. We know that platforms in the online world can have sudden, stellar increases in popularity overnight; some have been bubbling along for ages with hardly anybody using them, and not all of them are brand-new platforms that suddenly shoot to prominence. The lack of flexibility is a problem.

Lastly, I agree about researchers being able to access the transparency information provided. It is really important that we recognise that Ofcom is not the only expert. Ofcom has a huge amount of expertise, and it is massively increasing its staff numbers to cope with these issues, but the reality is that those staff are not academic researchers. They cannot look at the issues in the way academic researchers would, and they are not necessarily the foremost experts in fields such as child protection. That is not to take away from the expertise in Ofcom, but we could allow it to ask a regulated group of researchers to look at the information and point out any issues that may not have been spotted, particularly given the volume of transparency reports that there are likely to be.

Kim Leadbeater

The hon. Lady makes an important point. In terms of transparency, the question for me is, what are the Government worried about? Surely part of the Bill is about finding out what is really going on, and the only way that we will do that is by having access to the information. The more transparency, the better. The hon. Lady is right that having experts who can research what is going on is fundamental. If there is a concern around the workload for Ofcom, that is a separate issue that the Minister needs to address, but surely the more work that is done in terms of research and transparency, the better.

Kirsty Blackman

We have seen that just from the people from external organisations who have contacted us about the Bill. The expertise they have brought to the table, which we ourselves lack, has significantly improved the debate and, hopefully, the Bill. Even before that, the consultations that took place encouraged the Minister to make the Bill better. Surely that is why the pre-legislative scrutiny Committee looked at the Bill: to improve it and to get expert advice. I still think that specific access to expertise to analyse the transparency reports has not been covered adequately.

Online Safety Bill

Kim Leadbeater (Batley and Spen) (Lab)

Does my hon. Friend agree that, as we discussed in the Bill Committee, there is clear evidence that legal but harmful content is often the gateway to far more dangerous radicalisation and extremism, be it far-right, Islamist, incel or other? Will she therefore join me in supporting amendment 43 to ensure that by default such content is hidden from all adult users?

Alex Davies-Jones

I completely support my hon. Friend’s comments and I was pleased to see her champion that cause in the Bill Committee. Of course I support amendment 43, tabled in the names of SNP colleagues, to ensure that the toggle is on by default. Abhorrent material is being shared and amplified—that is the key point, amplified—online by algorithms and by the processes and systems in place. It is obvious that the Government just do not get that. That said, there is a majority in Parliament and in the country for strengthening the Online Safety Bill, and Labour has been on the front foot in arguing for a stronger Bill since First Reading last year.

It is also important to recognise the sheer number of amendments and changes we have seen to the Bill so far. Even today, there are many more amendments tabled by the Government. If that does not give an indication of the mess they have made of getting this legislation over the line in a fit and proper state, I do not know what does.

I have said it before, and I am certain I will say it again, but we need to move forward with this Bill, not backward. That is why, despite significant Government delay, we will support the Bill’s Third Reading, as each day of inaction allows more harm to spread online. With that in mind, I too will make some progress.

I will first address new clause 1, tabled in my name and that of my hon. Friend the Member for Manchester Central (Lucy Powell). This important addition to the Bill will go some way to addressing the gaps around support for individual complaints. We in the Opposition have repeatedly queried Ministers and the Secretary of State on the mechanisms available to individuals who wish to appeal or complain. That is why new clause 1 is so important. It is vital that platforms’ complaints procedures are fit for purpose, and this new clause will finally see the Secretary of State publish a report on the options available to individuals.

We already know that the Bill in its current form fails to consider an appropriate avenue for individual complaints. This is a classic case of David and Goliath, and it is about time those platforms went further in giving their users a transparent, effective complaints process. That substantial lack of transparency underpins so many of the issues Labour has with the way the Government have handled—or should I say mishandled—the Bill so far, and it makes the process by which the Government proceeded to remove the all-important clauses on legal but harmful content, in a quiet room on Committee Corridor just before Christmas, even more frustrating.

That move put the entire Bill at risk. Important sections that would have put protections in place to prevent content such as health and foreign-state disinformation, the promotion of self-harm, and online abuse and harassment from being actively pushed and promoted were rapidly removed by the Government. That is not good enough, and it is why Labour has tabled a series of amendments, including new clauses 4, 5, 6 and 7, that we think would go some way towards correcting the Government’s extremely damaging approach.

Under the terms of the Bill as currently drafted, platforms could set whatever terms and conditions they want and change them at will. We saw that in Elon Musk’s takeover at Twitter, when he lifted the ban on covid disinformation overnight because of his own personal views. Our intention in tabling new clause 4 is to ensure that platforms are not able to simply avoid safety duties by changing their terms and conditions whenever they see fit. This group of amendments would give Ofcom the power to set minimum standards for platforms’ terms and conditions, and to direct platforms to change them if they do not meet those standards.