Kim Leadbeater debates involving the Department for Digital, Culture, Media & Sport during the 2019 Parliament

Oral Answers to Questions

Kim Leadbeater Excerpts
Thursday 22nd February 2024

Commons Chamber
Lucy Frazer

I would just point out that the Labour party has said absolutely nothing in relation to what it would do, so to stand up here and say that we have no plan is absolutely unacceptable. I can be absolutely clear that we do have a plan. We have worked very hard with the sectors. We have already set out in our White Paper the steps that we are taking on a very important aspect in relation to transparency. I will continue to work with the sector on all these areas to ensure that this extremely complex matter comes to a satisfactory conclusion for the creative industries.

Kim Leadbeater (Batley and Spen) (Lab)

10. For what reason charity lotteries have annual sales limits.

Jamie Stone (Caithness, Sutherland and Easter Ross) (LD)

16. If she will take steps to remove annual sales limits on charity lotteries.

The Parliamentary Under-Secretary of State for Culture, Media and Sport (Stuart Andrew)

The limits for society lotteries allow them to raise funding for charities but to remain distinct from other forms of gambling and from the national lottery. The limits were last increased in 2020, but I am aware that some operators want to see the limits raised or removed entirely. It is important that any decisions that are made are based on strong evidence. As such, I have commissioned research in this area, which I hope we will review by the end of the year.

Kim Leadbeater

The People’s Postcode Lottery funds some brilliant organisations across Batley and Spen, including the fantastic Rainbow Baby Bank in Heckmondwike and the Game Changerz youth provision in Birstall. However, the current sales limits prevent the People’s Postcode Lottery from giving away even more grants to worthy community organisations across the country, in all our constituencies. Will the Minister therefore explain why casinos and bookies, for example, do not face a sales limit but charity lotteries, which are low risk and fund so many valuable local charities, face that barrier?

Stuart Andrew

As I said at the reception that the People’s Postcode Lottery held the other night, it was my privilege to set up a society lottery when I worked in a hospice. I recognise the value of such lotteries to charities, and I am aware of the issues that the PPL has raised. I have worked with the Gambling Commission to suggest ways that it can grow under the current framework, as it is the largest brand in the sector, but as I say, I want to see more research. We need to understand what the potential harms are, and what the potential effects are on the national lottery. There is not enough data at the moment. That is why I am commissioning independent research, so that we can make decisions based on evidence.

Online Safety Bill

Kim Leadbeater Excerpts
Kim Leadbeater (Batley and Spen) (Lab)

Does my hon. Friend agree that, as we discussed in the Bill Committee, there is clear evidence that legal but harmful content is often the gateway to far more dangerous radicalisation and extremism, be it far-right, Islamist, incel or other? Will she therefore join me in supporting amendment 43 to ensure that by default such content is hidden from all adult users?

Alex Davies-Jones

I completely support my hon. Friend’s comments and I was pleased to see her champion that cause in the Bill Committee. Of course I support amendment 43, tabled in the names of SNP colleagues, to ensure that the toggle is on by default. Abhorrent material is being shared and amplified—that is the key point, amplified—online by algorithms and by the processes and systems in place. It is obvious that the Government just do not get that. That said, there is a majority in Parliament and in the country for strengthening the Online Safety Bill, and Labour has been on the front foot in arguing for a stronger Bill since First Reading last year.

It is also important to recognise the sheer number of amendments and changes we have seen to the Bill so far. Even today, there are many more amendments tabled by the Government. If that does not give an indication of the mess they have made of getting this legislation over the line in a fit and proper state, I do not know what does.

I have said it before, and I am certain I will say it again, but we need to move forward with this Bill, not backward. That is why, despite significant Government delay, we will support the Bill’s Third Reading, as each day of inaction allows more harm to spread online. With that in mind, I too will make some progress.

I will first address new clause 1, tabled in my name and that of my hon. Friend the Member for Manchester Central (Lucy Powell). This important addition to the Bill will go some way to address the gaps around support for individual complaints. We in the Opposition have repeatedly queried Ministers and the Secretary of State on the mechanisms available for individuals who wish to appeal complaints. That is why new clause 1 is so important. It is vital that platforms’ complaints procedures are fit for purpose, and this new clause will finally see the Secretary of State publishing a report on the options available to individuals.

We already know that the Bill in its current form fails to consider an appropriate avenue for individual complaints. This is a classic case of David and Goliath, and it is about time those platforms went further in giving their users a transparent, effective complaints process. That substantial lack of transparency underpins so many of the issues Labour has with the way the Government have handled—or should I say mishandled—the Bill so far, and it makes the process by which the Government proceeded to remove the all-important clauses on legal but harmful content, in a quiet room on Committee Corridor just before Christmas, even more frustrating.

That move put the entire Bill at risk. Important sections that would have put protections in place to prevent content such as health and foreign-state disinformation, the promotion of self-harm, and online abuse and harassment from being actively pushed and promoted were rapidly removed by the Government. That is not good enough, and it is why Labour has tabled a series of amendments, including new clauses 4, 5, 6 and 7, that we think would go some way towards correcting the Government’s extremely damaging approach.

Under the terms of the Bill as currently drafted, platforms could set whatever terms and conditions they want and change them at will. We saw that in Elon Musk’s takeover at Twitter, when he lifted the ban on covid disinformation overnight because of his own personal views. Our intention in tabling new clause 4 is to ensure that platforms are not able to simply avoid safety duties by changing their terms and conditions whenever they see fit. This group of amendments would give Ofcom the power to set minimum standards for platforms’ terms and conditions, and to direct platforms to change them if they do not meet those standards.

ONLINE SAFETY BILL (Second sitting)

Kim Leadbeater Excerpts
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 13 December 2022
Paul Scully

I have been clear about where I set the line. [Interruption.] I have said that if something is illegal, it is illegal. The terms of service of the platforms largely cover the list that we are talking about. As my hon. Friend the Member for Folkestone and Hythe and I have both said, the terms of service of the vast majority of platforms—the big category 1 platforms—set a higher bar than was in our original Bill. The hon. Member for Luton North talked about whether we should have more evidence. I understand that the pre-legislative scrutiny committee heard evidence and came to a unanimous conclusion that the “legal but harmful” conditions should not be in the Bill.

Kim Leadbeater (Batley and Spen) (Lab)

A few moments ago, the Minister compared the online world to the real world. Does he agree that they are not the same? Sadly, the sort of thing that someone says in the pub on a Friday night to two or three of their friends is very different from someone saying something dangerously harmful online that can reach millions and billions of people in a very short space of time. The person who spoke in the pub might get up the following morning and regret what they said, but no harm was done. Once something is out there in the online world, very serious damage can be done very quickly.

Paul Scully

The hon. Lady makes a good point. I talked about the offline world rather than the real world, but clearly that can happen. That is where the balance has to be struck, as we heard from my hon. Friend the Member for Don Valley. It is not black and white; it is a spectrum of greys. Any sensible person can soon see when they stray into areas that we have talked about such as holocaust denial and extremism, but we do not want to penalise people who invariably are testing their freedom of expression.

It is a fine balance, but I think that we have reached the right balance between protecting freedom of expression and protecting vulnerable adults by having three layers of checks. The first is illegality. The second is enforcing the terms of service, which provide a higher bar than we had in the original Bill for the vast majority of platforms, so that we can see right at the beginning how they will be enforced by the platforms. If they change them and do not adhere to them, Ofcom can step in. Ofcom can step in at any point to ensure that they are being enforced. The third is a safety net.

--- Later in debate ---
Damian Collins

I totally appreciate the point that the hon. Lady makes, which is a different one. For gambling, the inducement to act straightaway often comes in the form of advertising. It usually comes in the form of free bets and immediate inducements to act. People who have self-excluded should not be targeted in that way. We need to ensure that that is rigorously enforced on online platforms too.

Kim Leadbeater

It is a pleasure to serve under your chairship, Dame Angela. It is lovely to be back in a Public Bill Committee with many familiar faces—and a few new ones, including the Minister. However, after devoting many weeks earlier this year to the previous Committee, I must admit that it is with some frustration that we are back here with the Government intent on further weakening their Bill.

Throughout the passage of the Bill, I have raised a number of specific concerns, from democratic and journalistic exemptions, to age verification, recognised news publishers, advocacy bodies and media literacy. On clause 14, while I support the principles of Government amendments 15 and 16, I draw the Minister’s attention to the importance of amendment (a) to amendment 15 and amendment (a) to amendment 16. He has already said that he is sympathetic to those amendments. Let me try to convince him to turn that sympathy into action.

I will focus primarily on an issue that is extremely important to me and to many others: extremism and radicalisation. However, while I will focus on the dangers of extremism and radicalisation, be it right-wing, Islamist, incel or other, the dangers that I am about to set out—the chain of events that leads to considerable harm online—are the same for self-harm content, eating disorder content, health disinformation, climate change disinformation or any dangerous, hateful material directed at people based on their sex, sexual orientation, ethnicity, religion or other characteristics.

Such content is not just deeply offensive and often wholly inaccurate; it is dangerous and vile and serves only to spread harm, misinformation and conspiracy. To be clear, such content is not about a social media user stating how upset and angry they are about the football result, or somebody disagreeing legitimately and passionately about a political issue. It is not the normal, everyday social media content that most people see on their feeds.

This is content that is specifically, carefully and callously designed to sit just below the criminal threshold, yet that can still encourage violence, self-harm or worse. It is content used by extremists of all types that lures vulnerable people in, uses social media likes and comments to create the illusion of legitimacy and popularity, and then directly targets those most likely to be susceptible, encouraging them either to commit harm or to move on to smaller but high-harm platforms that may fall out of the scope of the Bill. This is not free speech; it is content that can act as a dangerous gateway to radicalisation and extremism. The Government know how dangerous it is because their own report from His Majesty’s Prison and Probation Service last year found:

“The Internet appears to be playing an increasingly prominent role in radicalisation processes of those convicted of extremist offences in England and Wales.”

Hon. Members will understand my deep and personal interest in this matter. Since the murder of my sister, a Member of this House, six and a half years ago by a far-right extremist, I have worked hard to bring communities and people together in the face of hatred. Some of that work has included meeting former extremists and discussing how they were radicalised. Those conversations were never easy, but what became very clear to me was that such people are not born extremists. Their radicalisation starts somewhere, and it is often somewhere that appears to be completely innocent, such as a Facebook group about issues or problems in their community, a Twitter discussion about current affairs or the state of the country, or even a page for supporters of their football team.

One day, a comment is posted that is not illegal and is not hate speech, but that references a conspiracy or a common trope. It is an ideological remark placed there to test the water. The conversation moves on and escalates. More disturbing or even violent comments start to be made. They might be accompanied by images or videos, leading those involved down a more sinister path. Nothing yet is illegal, but clearly—I hope we would all agree—it is unacceptable.

The number of contributors reduces, but a few remain. No warnings are presented, no flags are raised and it appears like normal social media content. However, the person reading it might be lonely or vulnerable, and now feels that they have found people to listen to them. They might be depressed or unhappy and looking to blame their situation on something or someone. They might feel that nobody understands them, but these people seem to.

The discussion is then taken to a more private place, to the smaller but more harmful platforms that may fall outside the scope of the Bill, but that will now become the go-to place for spreading extremism, misinformation and other harmful content. The radicalisation continues there—harder to track, harder to monitor and harder to stop. Let us remember, however, that all of that started with those legal but harmful comments being witnessed. They were clearly unacceptable, but mainstream social media give them legitimacy. The Online Safety Bill will do nothing to stop that.

Unfortunately, that chain of events occurs far too often. It is a story told many times, about how somebody vulnerable is lured in by those wishing to spread their hatred. It is hosted by major social media platforms. Hon. Members may remember the case of John, a teenager radicalised online and subsequently sentenced. His story was covered by The Guardian last year. John was feeling a sense of hopelessness, which left him susceptible to the messaging of the far right. Aged 15, he felt “written off”: he was in the bottom set at school, with zero exam expectations, and feeling that his life opportunities would be dismal. The far right, however, promised him a future. John became increasingly radicalised by an online barrage of far-right disinformation. He said:

“I was relying on the far right for a job. They were saying that when they got power they would be giving jobs to people like me”.

John now says:

“Now I know the posts were all fake, but the 15-year-old me didn’t bother to fact-check.”

For some people in the room, that might seem like a totally different world. Thankfully, for most of us, it is. However, if Members take the time to see some of that stuff online, it is extremely disturbing and alarming. It is a world that we do not understand, but we have to be aware that it exists. The truth, as we can see, is that such groups use popular online platforms to lure in young people and give them a sense of community. One white nationalist group actively targets younger recruits and recently started Call of Duty warcraft gaming tournaments for its supporters. Let us be clear: John was 15, but he could easily have been 18, 19 or indeed significantly older.

John was radicalised by the far right, but we know that similar methods are used by Islamist extremists. A 2020 report from New York University’s Centre for Global Affairs stated:

“The age of social media has allowed ISIS to connect with a large-scale global audience that it would not be able to reach without it...Through strategic targeting, ISIS selects those who are most vulnerable and susceptible to radicalization”.

That includes those who are

“searching for meaning or purpose in their life, feeling anger and…alienated from society”.

The ages that are most vulnerable are 15 to 25.

Social media platforms allow ISIS to present its propaganda as mainstream news at little to no cost. Preventing that harm and breaking those chains of radicalisation is, however, possible, and the Bill could go much further to put the responsibility not on the user, but on the platforms. I believe that those platforms need unique regulation, because social media interaction is fundamentally different from real-life social interaction.

Social media presents content to us as if it is the only voice and viewpoint. On social media, people are far more likely to say things that they never would in person. On social media, those views spread like wildfire in a way that they would not in real life. On social media, algorithms find such content and pump it towards us, in a way that can become overwhelming and that can provide validity and reassurance where doubt might otherwise set in.

Allowing that content to remain online without warnings, or allowing it to be visible to all users unless they go searching through their settings to turn it off—which is wholly unrealistic—is a dereliction of duty and a missed opportunity to clean up the platforms and break the chains of radicalisation. As I set out, the chain of events is not unique to one form of radicalisation or hateful content. The same online algorithms that present extremist content to users also promote negative body image, eating disorders, and self-harm and suicide content.

I hope the Committee realises why I am so impassioned about “legal but harmful” clauses, and why I am particularly upset that a few Conservative Members appear to believe that such content should remain unchecked online because of free speech, with full knowledge that it is exactly that content that serves as the gateway for people to self-harm and to be radicalised. That is not free speech.

--- Later in debate ---
That is not the way to go; we should be writing in the protections. We should be starting from the point of view that no one wants to see content on the promotion of suicide; if they do, they can tick a box to see it. We should start from that point of view: allowing people to opt in if they want to see free speech in an untrammelled way on whatever platform it is.
Kim Leadbeater

I will speak briefly in favour of amendments 102 and 103. As I mentioned a few moments ago, legal but harmful content can act as the gateway to dangerous radicalisation and extremism. Such content, hosted by mainstream social media platforms, should not be permitted unchecked online. I appreciate that, for children, the content will be banned, but I strongly believe that the default position should be for such content to be hidden by default from all adult users, as the amendments would ensure.

The chain of events that leads to radicalisation, as I spelt out, relies on groups and individuals reaching people unaware that they are being radicalised. The content is posted in otherwise innocent Facebook groups, forums or Twitter threads. Adding a toggle, hidden somewhere in users’ settings, which few people know about or use, will do nothing to stop that. It will do nothing to stop the harmful content from reaching vulnerable and susceptible users.

We, as legislators, have an obligation to prevent at root that harmful content reaching and drawing in those vulnerable and susceptible to the misinformation and conspiracy spouted by vile groups and individuals wishing to spread their harm. The only way that we can make meaningful progress is by putting the responsibility squarely on platforms, to ensure that by default users do not come across the content in the first place.

Paul Scully

In the previous debate, I talked about amendment 15, which brought in a lot of protections against content that encourages and promotes, or provides instruction for, self-harm, suicide or eating disorders, and against content that is abusive or incites hate on the basis of race, religion, disability, sex, gender reassignment or sexual orientation. We have also placed a duty on the largest platforms to offer adults the option to filter out unverified users if they so wish. That is a targeted approach that reflects areas where vulnerable users in particular could benefit from having greater choice and control. I come back to the fact that that is the third shield and an extra safety net. A lot of the extremes we have heard about, which have been used as debating points, as important as they are, should very much be wrapped up by the first two shields.

We have a targeted approach, but it is based on choice. It is right that adult users have a choice about what they see online and who they interact with. It is right that this choice lies in the hands of those adults. The Government mandating that these tools be on by default goes against the central aim of users being empowered to choose for themselves whether they want to reduce their engagement with some kinds of legal content.

We have been clear right from the beginning that it is not the Government’s role to say what legal content adults should or should not view online or to incentivise the removal of legal content. That is why we removed the adult legal but harmful duties in the first place. I believe we are striking the right balance between empowering adult users online and protecting freedom of expression. For that reason, I am not able to accept the amendments from the hon. Member for Pontypridd.

--- Later in debate ---
Alex Davies-Jones

As we know, this clause requires providers of relevant services to publish annual transparency reports and sets out Ofcom’s powers in relation to those reports. The information set out in transparency reports is intended to help users to understand the steps that providers are taking to help keep them safe and to provide Ofcom with the information required to hold them to account.

These duties on regulated services are very welcome indeed. Labour has long held the view that mandatory transparency reporting and reporting mechanisms are vital to hold platforms to account, and to understand the true nature of how online harm is driven and perpetuated on the internet.

I will reiterate the points that were made in previous Committee sittings about our concerns about the regularity of these transparency reports. I note that, sadly, those reports remain unchanged and therefore they will only have to be submitted to Ofcom annually. It is important that the Minister truly considers the rapid rate at which the online world can change and develop, so I urge him to reconsider this point and to make these reports a biannual occurrence. Labour firmly believes that increasing the frequency of the transparency reports will ensure that platforms and services keep their finger on the pulse, and are forced to be aware of and act on emergent risks. In turn, that would compel Ofcom to do the same in its role as an industry regulator.

I must also put on the record some of our concerns about subsections (12) and (13), which state that the Secretary of State of the day could amend by regulation the frequency of the transparency reporting, having consulted Ofcom first. I hope that the Minister can reassure us that this approach will not result in our ending up in a position where, perhaps because of Ofcom’s incredible workload, transparency reporting becomes even less frequent than an annual occurrence. We need to see more transparency, not less, so I really hope that he can reassure me on this particular point.

Kim Leadbeater

Does my hon. Friend agree that transparency should be at the heart of this Bill and that the Government have missed an opportunity to accelerate the inclusion of a provision in the Bill, namely the requirement to give researchers and academics access to platform data? Data access must be prioritised in the Bill and without such prioritisation the UK will fall behind the rest of Europe in safety, research and innovation. The accessibility and transparency of that data from a research perspective are really important.

Alex Davies-Jones

I completely agree with my hon. Friend. We both made the point at length in previous sittings of the Committee about the need to ensure transparency, access to the data, and access to reporting for academics, civil society and researchers.

That also goes to the point that it is not for this Committee or this Minister—it is not in his gift—to determine something that we have all discussed in this place at length, which is the potential requirement for a standalone Committee specifically to consider online harm. Such a Committee would look at whether this legislation is actively doing what we need it to do, whether it needs to be reviewed, whether it could look at the annual reports from Ofcom to determine the length and breadth of harm on the internet, and whether or not this legislation is actually having an impact. That all goes to the heart of transparency, openness and the review that we have been talking about.

I want to go further and raise concerns about how public the reports will be, as we have touched on. The Government claim that their so-called triple shield approach will give users of platforms and services more power and knowledge to understand the harms that they may discover online. That is in direct contradiction to the Bill’s current approach, which does not provide any clarity about exactly how the transparency reports will be made available to the public. In short, we feel that the Government are missing a significant opportunity. We have heard many warnings about what can happen when platforms are able to hide behind a veil of secrecy. I need only point to the revelations of whistleblowers, including Frances Haugen, to highlight the importance of that point.

As the Bill stands, once Ofcom has issued a notice, companies will have to produce a transparency report that

“must…be published in the manner and by the date specified in the notice”.

I want to press the Minister on that and ask him to clarify the wording. We are keen for the reports to be published publicly and in an accessible way, so that users, civil society, researchers and anyone else who wants to see them can make sense of them. The information contained in the transparency reports is critical to analysing trends and harms, so I hope that the Minister will clarify those points in his response.

Kim Leadbeater

Does my hon. Friend agree that if the Government are to achieve their objective—which we all share—for the Bill to be world-leading legislation, we cannot rely on whistleblowers to tell us what is really going on in the online space? That is why transparency is vital. This is the perfect opportunity to provide that transparency, so that we can do some proper research into what is going on out there. We cannot rely on whistleblowers to give us such information.

Alex Davies-Jones

My hon. Friend is absolutely right. We want the Bill to work. We have always wanted the Bill to work. We want it to achieve its aim of keeping children, adults and everyone who uses the internet safe from the harms that are perpetuated there. If there is no transparency, how will we know that the platforms are breaking the rules covertly, and whether they are hiding content and getting round the rules? That is what they do; we know it, because we have heard it from whistleblowers, but we cannot rely on whistleblowers alone to highlight exactly what happens behind the closed doors of the platforms.

We need the transparency and the reports to be made public, so that we can see whether the legislation is working. If that does not happen, although we have waited five years, we will need another piece of legislation to fix it. We know that the Bill is not perfect, and the Minister knows that—he has said so himself—but, ultimately, we need to know that it works. If it does not, we have a responsibility as legislators to put something in place that does. Transparency is the only way in which we will figure that out.

Kirsty Blackman

This may be the only place in the Bill where I do not necessarily agree wholeheartedly with the Labour Front Benchers. I agree with the vast majority of what was said, but I have some concerns about making mandatory the requirement for transparency reports to be public in all circumstances, because there are circumstances in which that would simply highlight loopholes, allowing people to exploit them in a way that we do not want them to do.

On the regularity of reporting and the level of transparency specifically: given that the Minister is keen on the commercial imperative and on ensuring that people are safe, we need a higher level of transparency than we currently see from the platforms. There is a very good case for making some of the transparency reporting public—in particular, for requiring the very largest platforms to publish it, or sections of it.

I want to talk about the speed of change to terms of service, and about proportionality. Imagine that Ofcom, able to request transparency reporting only annually, received transparency information three days before Elon Musk took over Twitter. Twitter would be a completely different place three days later, yet Ofcom would be unable to ask for more transparency information for a whole year, by which point a significant amount of damage could have been done. We have seen that terms of service can change quickly, and Ofcom would not have the flexibility to ask for an updated transparency report, even if drastic changes were made to a service.

Another thing slightly concerns me about doing this annually and not allowing a bit more flexibility. Let us say that a small platform that none of us had ever heard of, such as Mastodon, shoots to prominence overnight. Let us also say that Mastodon was previously regulated as a small platform, and that Ofcom had requested transparency information from it shortly before Elon Musk took over Twitter and people migrated to Mastodon. Mastodon would now be dealing with very different issues from those it had when it had a small number of users, compared with the significant number it has now. It would have changed dramatically, yet Ofcom would not have the flexibility to seek that information. We know that platforms in the online world can have sudden, stellar increases in popularity overnight; some have been bubbling along for ages with nobody using them, and not all of them are brand-new platforms that suddenly shoot to prominence. The lack of flexibility is a problem.

Lastly, I agree about researchers being able to access the transparency information provided. It is really important that we recognise that Ofcom is not the only expert. Ofcom has a huge amount of expertise, and it is massively increasing its staff numbers to cope with these issues, but the reality is that those staff are not academic researchers, and they are not necessarily the most prominent experts in fields such as child protection. That is not to take away from the expertise in Ofcom, but we could allow it to ask a regulated group of researchers to look at the information and point out any issues that Ofcom may not have spotted, particularly given the volume of transparency reports that there are likely to be.

Kim Leadbeater

The hon. Lady makes an important point. In terms of transparency, the question for me is, what are the Government worried about? Surely part of the Bill is about finding out what is really going on, and the only way that we will do that is by having access to the information. The more transparency, the better. The hon. Lady is right that having experts who can research what is going on is fundamental. If there is a concern around the workload for Ofcom, that is a separate issue that the Minister needs to address, but surely the more work that is done in terms of research and transparency, the better.

Kirsty Blackman

We have seen that just from the people in external organisations who have contacted us about the Bill. The expertise they have brought to the table—expertise that we do not have—has significantly improved the debate and, hopefully, the Bill; even the consultations that took place before this stage encouraged the Minister to make the Bill better. Surely that is why the pre-legislative scrutiny Committee looked at the Bill—to improve it and to get expert advice. I still think that specific access to expertise to analyse the transparency reports has not been covered adequately.

ONLINE SAFETY BILL (First sitting)

Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 13 December 2022
Kirsty Blackman

Thank you, Sir Roger. I absolutely agree with the hon. Member for Warrington North. The platform works by stitching things together, so a video could have a bit of somebody else’s video in it, and that content ends up being shared and disseminated more widely.

This is not an attack on every algorithm. I am delighted to see lots of videos of cats—it is wonderful, and it suits me down to the ground—but the amendment asks platforms to analyse how those processes contribute to the development of habit-forming behaviour and to mitigate the harm caused to children by habit-forming features in the service. It is not saying, “You can’t use algorithms” or “You can’t use anything that may encourage people to linger on your site.” The specific issue is addiction—the fact that people will get sucked in and stay on platforms for hours longer than is healthy.

There is a demographic divide here. There is a significant difference between the online experiences of children whose parents are engaged in these issues and spend time—and have the time to spend—helping them to use the internet, and the experiences of children whose families are generally not nearly as well off, whose parents may be working two or three jobs to try to keep their homes warm and keep food on the table, and whose level of supervision may therefore be far lower. We have a parental education gap, where parents are not able to instruct or teach their children a sensible way to use these things. A lot of parents have not used things such as TikTok and do not know how it works, so they are unable to teach their children.

Kim Leadbeater (Batley and Spen) (Lab)

Does the hon. Lady agree that this feeds into the problem we have with the lack of a digital media literacy strategy in the Bill, which we have, sadly, had to accept? However, that makes it even more important that we protect children wherever we have the opportunity to do so, and this amendment is a good example of where we can do that.

Kirsty Blackman

The hon. Lady makes an excellent point. This is not about mandating that platforms stop doing these things; it is about ensuring that they take this issue into account and that they agree—or that we as legislators agree—with the Royal College of Psychiatrists that we have a responsibility to tackle it. We have a responsibility to ask Ofcom to tackle it with platforms.

This comes back to the fact that we do not have a user advocacy panel, and groups representing children are not able to bring emerging issues forward adequately and effectively. Because of the many other inadequacies in the Bill, that is even more important than it was. I assume the Minister will not accept my amendment—that generally does not happen in Bill Committees—but if he does not, it would be helpful if he could give Ofcom some sort of direction of travel so that it knows it should take this issue into consideration when it deals with platforms. Ofcom should be talking to platforms about habit-forming features and considering the addictive nature of these things; it should be doing what it can to protect children. This threat has emerged only in recent years, and things will not get any better unless we take action.

--- Later in debate ---
Kim Leadbeater

Does the hon. Lady agree that people out there in the real world have absolutely no idea what a platform’s terms of service are, so we are being expected to make a judgment on something about which we have absolutely no knowledge?

Kirsty Blackman

Absolutely. The amendment I tabled regarding the accessibility of terms of service was designed to ensure that if the Government rely on terms of service, children can access those terms of service and are able to see what risks they are putting themselves at. We know that in reality children will not read these things. Adults do not read these things. I do not know what Twitter’s terms of service say, but I do know that Twitter managed to change its terms of service overnight, very easily and quickly. Companies could just say, “I’m a bit fed up with Ofcom breathing down my neck on this. I’m just going to change my terms of service, so that Ofcom will not take action on some of the egregious harm that has been done. If we just change our terms of service, we don’t need to bother. If we say that we are not going to ban transphobia on our platform—if we take that out of the terms of service—we do not need to worry about transphobia on our platform. We can just let it happen, because it is not in our terms of service.”

--- Later in debate ---
Alex Davies-Jones

Clause 12 is extremely important because it outlines the platforms’ duties in relation to keeping adults safe online. The Government’s attempts to remove the clause through an amendment that thankfully has not been selected are absolutely shocking. In addressing Government amendments 18, 23, 24, 25, 32, 33 and 39, I must ask the Minister: exactly how will this Bill do anything to keep adults safe online?

In the original clause 12, companies had to assess the risk of harm to adults and the original clause 13 outlined the means by which providers had to report these assessments back to Ofcom. This block of Government amendments will make it impossible for any of us—whether that is users of a platform or service, researchers or civil society experts—to understand the problems that arise on these platforms. Labour has repeatedly warned the Government that this Bill does not go far enough to consider the business models and product design of platforms and service providers that contribute to harm online. By tabling this group of amendments, the Government are once again making it incredibly difficult to fully understand the role of product design in perpetuating harm online.

We are not alone in our concerns. Colleagues from Carnegie UK Trust, who are a source of expertise to hon. Members across the House when it comes to internet regulation, have raised their concerns over this grouping of amendments too. They have raised specific concerns about the removal of the transparency obligation, which Labour has heavily pushed for in previous Bill Committees.

Previously, service providers had been required to inform customers of the harms their risk assessment had detected; the removal of that risk assessment means that users and consumers will not have the information they need to assess the nature of the risks on a platform. The Minister may point to the Government’s approach in relation to the new content duties in platforms’ and providers’ terms of service, but we know that there are risks arising from the fact that no minimum content is specified for the terms of service for adults, although of course all providers will have to comply with the illegal content duties.

This approach, like the entire Bill, is already overly complex—that is widely recognised by colleagues across the House and is the view of many stakeholders too. In tabling this group of amendments, the Minister is showing his ignorance. Does he really think that all vulnerabilities to harm online simply disappear at the age of 18? By pushing these amendments, which seek to remove these protections from harmful but legal content to adults, the Minister is, in effect, suggesting that adults are not susceptible to harm and therefore risk assessments are simply not required. That is an extremely narrow-minded view to take, so I must push the Minister further. Does he recognise that many young, and older, adults are still highly likely to be impacted by suicide and self-harm messaging, eating disorder content, disinformation and abuse, which will all be untouched by these amendments?

Labour has been clear throughout the passage of the Bill that we need to see more, not less, transparency and protection from online harm for all of us—whether adults or children. These risk assessments are absolutely critical to the success of the Online Safety Bill and I cannot think of a good reason why the Minister would not support users in being able to make an assessment about their own safety online.

We have supported the passage of the Bill, as we know that keeping people safe online is a priority for us all and we know that the perfect cannot be the enemy of the good. The Government have made some progress towards keeping children safe, but they clearly do not consider it their responsibility to do the same for adults. Ultimately, platforms should be required to protect everyone: it does not matter whether they are a 17-year-old who falls short of being legally deemed an adult in this country, an 18-year-old or even an 80-year-old. Ultimately, we should all have the same protections and these risk assessments are critical to the online safety regime as a whole. That is why we cannot support these amendments. The Government have got this very wrong and we have genuine concerns that this wholesale approach will undermine how far the Bill will go to truly tackling harm online.

I will also make some comments on clause 55 and the other associated amendments. I will keep them brief, as the Minister is already aware of my significant concerns about his Department’s intention to remove the adult safety duties more widely. In the previous Bill Committee, Labour made it clear that it supports—and thinks it most important—that the Bill should specify the content that is deemed harmful to adults. We have repeatedly raised concerns about missing harms, including health misinformation and disinformation, and this group of amendments once again touches on widespread concerns that the Government’s new approach will leave adults worse off online. The Government’s removal of the “legal but harmful” sections of the Online Safety Bill is a major weakening—not a strengthening—of the Bill. Does the Minister recognise that the only people celebrating these decisions will be the executives of big tech firms, and online abusers? Does he agree that this delay shows that the Government have bowed to vested interests over keeping users and consumers safe?

Labour is not alone in having these concerns. We are all pleased that the child safety duties are still present in the Bill, but the NSPCC, among others, is concerned about the knock-on implications, which may introduce new risks to children. Without adult safety duties in place, children will be at greater risk of harm if platforms do not identify and protect them as children. In effect, these plans will place a significantly greater burden on platforms to protect children than adults. As the Bill stands, there is a significant risk of splintering user protections in ways that expose children to adult-only spaces and harmful content, while forming grooming pathways for offenders too.

The reality is that these proposals to deal with harms online for adults rely on the regulator ensuring that social media companies enforce their own terms and conditions. We already know and have heard that that can have an extremely damaging impact for online safety more widely, and we have only to consider the very obvious and well-reported case study involving Elon Musk’s takeover of Twitter to really get a sense of how damaging that approach is likely to be.

In late November, Twitter stopped taking action against tweets in violation of coronavirus rules. The company had suspended at least 11,000 accounts under that policy, which was designed to remove accounts posting demonstrably false or misleading content relating to covid-19 that could lead to harm. The company operated a five-strike policy, and the impact on public health around the world of removing that policy will likely be tangible. The situation also raises questions about the platform’s other misinformation policies. As of December 2022, they remain active, but for how long remains unclear.

Does the Minister recognise that as soon as they are inconvenient, platforms will simply change their terms and conditions, and terms of service? We know that simply holding platforms to account for their terms and conditions will not constitute robust enough regulation to deal with the threat that these platforms present, and I must press the Minister further on this point.

Kim Leadbeater

My hon. Friend is making an excellent speech. I share her deep concerns about the removal of these clauses. The Government have taken the tricky issue of “legal but harmful” content—we can all acknowledge that it is tricky—and removed it from the Bill altogether. I do not think that is the answer. My hon. Friend makes an excellent point about children becoming 18; the day after their 18th birthday, they are suddenly open to much more harmful and dangerous content. Does she also share my concern about the risks of people being drawn towards extremism, as well as disinformation and misinformation?

Alex Davies-Jones

My hon. Friend makes a valid point. This is not just about misinformation and disinformation; it is about leading people to really extreme, vile content on the internet. As we all know, that is a rabbit warren. The situation does not change the moment a 17-year-old turns 18—they do not suddenly become immune to this horrendous content on their 18th birthday. The rules need to be there to protect all of us.

As we have heard, terms and conditions can change overnight. Stakeholders have raised the concern that, faced with a clearer focus on their terms of service, platforms and providers may choose to make those terms shorter, cutting out mention of harmful material for which, if it were left undealt with, they might be held liable.

In addition, the fact that there is no minimum requirement in the regime means that companies have complete freedom to set terms of service for adults, which may not reflect the risks to adults on that service. At present, service providers do not even have to include terms of service in relation to the list of harmful content proposed by the Government for the user empowerment duties—an area we will come on to in more detail shortly as we address clause 14. The Government’s approach and overreliance on terms of service, which as we know can be so susceptible to rapid change, is the wrong approach. For that reason, we cannot support these amendments.

I would just say, finally, that none of us was happy with the term “legal but harmful”. It was a phrase we all disliked, and it did not encapsulate exactly what the content is or includes. Throwing the baby out with the bathwater is not the way to tackle that situation. My hon. Friend the Member for Batley and Spen is right that this is a tricky area, and it is difficult to get it right. We need to protect free speech, which is sacrosanct, but we also need to recognise that there are so many users on the internet who do not have access to free speech as a result of being piled on or shouted down. Their free speech needs to be protected too. We believe that the clauses as they stand in the Bill go some way to making the Bill a meaningful piece of legislation. I urge the Minister not to strip them out, to do the right thing and to keep them in the Bill.

--- Later in debate ---
Kirsty Blackman

Why do the Government now think that there is a risk to free speech? If Ofcom never had that power, and it was never an issue, then clearly it was not a risk to free speech—so I am even more confused as to why the Government have decided that they must strip this measure out of the Bill on free speech grounds. This is some of the most important stuff in the Bill for the protection of adults, and the Government are keen to remove it.

Kim Leadbeater

The hon. Member is making an excellent and very passionate speech, and I commend her for it. Does she agree with one of my concerns, which is about the message that this sends to the public? It is almost as if the Government were acknowledging that there is a problem with legal but harmful content—we can all, hopefully, acknowledge that it is a problem, even though we know it is a tricky one to tackle—but, by removing these clauses from the Bill, are now sending the message, “We were trying to clean up the wild west of the internet but, actually, we are not that bothered any more.”

Kirsty Blackman

The hon. Lady is absolutely right. We have all heard from organisations and individuals who have had their lives destroyed as a result of “legal but harmful”—I don’t have a better phrase for it—content online and of being radicalised by being driven deeper and deeper into blacker and blacker Discord servers, for example, that are getting further and further right wing.

A number of the people who are radicalised—who are committing terror attacks, or being referred to the Prevent programme because they are at risk of committing them—are no longer driven mainly by far-right extremism or by extreme religious ideology; they have mixed-up or unclear ideological drivers. It is not the same situation as before, because people are being radicalised by what they find online. They are being radicalised into believing that they “must do something”—that they “must take some action”—because of the culture change in society.

Online Safety Bill

Madam Deputy Speaker (Dame Rosie Winterton)

Order. Just a quick reminder: I know it is extremely difficult, and I do not want to interrupt hon. Members when they are making their speeches, but it is important that we try to address the amendments that are before us today. There will be a separate debate on whether to recommit the Bill and on the other ideas, so they can be addressed at that point. As I say, it is important to relate remarks to the amendments that are before us.

Kim Leadbeater (Batley and Spen) (Lab)

I apologise for having left the debate for a short time; I had committed to speaking to a room full of young people about the importance of political education, which felt like the right thing to do, given the nature of the debate and the impact that the Bill will have on our young people.

I am extremely relieved that we are continuing to debate the Bill, despite the considerable delays that we have seen; as I mentioned in this House previously, it is long overdue. I acknowledge that it is still groundbreaking in its scope and extremely important, but we must now ensure that it works, particularly for children and vulnerable adults, and that it goes some way to cleaning up the internet for everyone by putting users first and holding platforms to account.

On new clause 53, I put on record my thanks to the Government for following through with their commitments to me in Committee to write Zach’s law in full into the Bill. My constituent Zach Eagling and his mum Clare came into Parliament a few weeks ago, and I know that hon. Members from both sides of the House were pleased to meet him to thank him for his incredible campaign to make the vile practice of epilepsy trolling completely illegal, with a maximum penalty of a five-year prison sentence. The inspirational Zach, his mum and the Epilepsy Society deserve enormous praise and credit for their incredible campaign, which will now protect the 600,000 people living with epilepsy in the UK. I am delighted to report that Zach and his mum have texted me to thank all hon. Members for their work on that.

I will raise three areas of particular concern with the parts of the Bill that we are focusing on. First, on director liability, the Bill includes stiff financial penalties for platforms that I hope will force them to comply with these regulations, but until the directors of these companies are liable and accountable for ensuring that their platforms comply and treat the subject with the seriousness it requires, I do not believe that we will see the action needed to protect children and all internet users.

Ultimately, if platforms enforce their own terms and conditions, remove illegal content and comply with the legal but harmful regulations—as they consistently tell us that they will—they have nothing to worry about. When we hear the stories of harm committed online, however, and when we hear from the victims and their families about the devastation that it causes, we must be absolutely watertight in ensuring that those who manage and operate the platforms take every possible step to protect every user on their platform.

We must ensure that, to the directors of those companies, this is a personal commitment as part of their role and responsibility. As we saw with health and safety regulations, direct liability is the most effective way to ensure that companies implement such measures and are scrupulous in reviewing them. That is why I support new clause 17 and thank my right hon. Friend the Member for Barking (Dame Margaret Hodge) for her tireless and invaluable work on this subject.

Let me turn to media literacy—a subject that I raised repeatedly in Committee. I am deeply disappointed that the Government have removed the media literacy duty that they previously committed to introducing. Platforms can boast of all the safety tools they have to protect users, talk about them in meetings, publicise them in press releases and defend them during Committee hearings, but unless users know that they are there and know exactly how to use them, and unless they are being used, their existence is pointless.

Rugby League World Cup 2022

Tuesday 25th October 2022

Westminster Hall


Judith Cummins (Bradford South) (Lab)

I beg to move,

That this House has considered the contribution of the Rugby League World Cup 2022 to culture and sport in the UK.

It is a pleasure to serve under your chairmanship, Mr Davies—a fellow Bradford MP who recognises the power of rugby league. I am thrilled to be here to debate the rugby league world cup. I have the honour of speaking about the sport, the tournament and the importance of its legacy.

I know many Members share my passion for and belief in rugby league. The power and potential of rugby league is phenomenal, because of what it means to our communities and what it can achieve in those communities. That passion and belief is shared by Members right across the political divide. Rugby league is a unifying force indeed.

The world cup is always a special moment in the sporting calendar, but this year it promises to make a huge impact, setting a new bar for the sport and recognition of all that it offers to the country and on the international stage. The road to this world cup has not been easy. Preparations for the tournament began in November 2015. The lifting of the trophies will mark a seven-year journey that has spanned a global pandemic and multiple crises. That we have got here is a testament to the organisers, whose determination parallels the sport itself. Teams from around the globe have gathered in the birthplace of the sport. For the English heartlands of working-class communities in our northern towns and cities, rugby league has come home.

There is so much to celebrate and marvel at, both on and off the pitch. Since its foundation in 1895, rugby league has always been groundbreaking, and the world cup is no exception. For the first time, the men’s, women’s and wheelchair tournaments will be staged simultaneously. It will be the biggest, best and most inclusive rugby league event in history.

Kim Leadbeater (Batley and Spen) (Lab)

I thank my hon. Friend for securing this important debate. I am very proud of the excellent rugby league clubs in Batley and Spen, including the brilliant Batley Bulldogs, Birkenshaw Blue Dogs and Birstall Victoria, along with Batley Boys and Batley Girls. They provide opportunities for boys and girls, men and women of all ages and from all different backgrounds to play this fantastic sport, and are at the heart of our communities. Does my hon. Friend agree that the world cup—men’s, women’s and wheelchair—is a fantastic springboard to get more people into grassroots rugby league, who will hopefully rise up to be the world cup stars of the future?

Judith Cummins

I thank my hon. Friend for her important intervention and I welcome her statement about the inclusivity of rugby league. The competition does offer a springboard for grassroots rugby league to re-emerge much stronger post pandemic.

Online Safety Bill

Mr Deputy Speaker (Mr Nigel Evans)

I am grateful to the right hon. Member for Kingston upon Hull North (Dame Diana Johnson) for keeping her powder dry and deferring her speech until the next group of amendments, so Members now have five minutes each.

Kim Leadbeater (Batley and Spen) (Lab)

I rise to speak in favour of amendments 15 to 19 in the names of my hon. Friends and, later, amendments 11 and 12 in the name of the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright).

As we discussed at great length in Committee—my first Bill Committee; a nice simple one to get me started—the Bill has a number of critical clauses to address the atrocious incidence of child sexual exploitation online. Amendments 15 to 19 are aimed at strengthening those protections and helping to ensure that the internet is a safer place for every young person. Amendments 15 and 16 will bring into scope tens of millions of interactions with accounts that actively enable the discovery and sharing of child abuse material. Amendments 17 to 19 will tackle the issue of cross-platform abuse, where abuse starts on one platform and continues on another. These are urgent measures that children’s charities and advocacy groups have long called for, and I seriously hope this House will support them.

Last week, along with the shadow Minister and the then Minister, I attended an extremely moving reception hosted by one of those organisations, the NSPCC. It included a speech by Rachel, a mother of a victim of online grooming and child sexual exploitation. She outlined in a very powerful way how her son Ben was forced from the age of 13 to take and share photos of himself that he did not want to, and to enter Skype chats with multiple men. He was then blackmailed with those images and subjected to threats of violence to his family. Rachel said to us:

“We blamed ourselves and I thought we had failed…I felt like I hadn’t done enough to protect our children”.

I want to say to you, Rachel, that you did not fail Ben. Responsibility for what happened to Ben lies firmly with the perpetrators of these heinous crimes, but what did fail Ben and has failed our young people for far too long is the lack of urgency and political will to regulate the wild west of the internet. No one is pretending that this is an easy task, and we are dealing with a highly complex piece of legislation, but if we are to protect future Bens we have to strengthen this Bill as much as possible.

Another young woman, Danielle, spoke during the NSPCC event. She had been a victim of online CSE that had escalated into horrific real-world physical and sexual abuse. She told us how she has to live with the fear that her photos may appear online and be shared without her knowledge or control. She is a strong young woman who is moving on with her life with huge resilience, but her trauma is very real. Amendment 19 would ensure that proportionate measures are in place to prevent the encountering or dissemination of child abuse content—for example, through intelligence sharing of new and emerging threats. This will protect Danielle and people like her, giving them some comfort that measures are in place to stop the spread of these images and to place far more onus on the platforms to get on top of this horrific practice.

Amendments 11 and 12, in the name of the right hon. and learned Member for Kenilworth and Southam, will raise the threshold for non-broadcast media outlets to benefit from the recognised news publisher exemption by requiring that such publishers are subject to complaints procedures that are both suitable and sufficient. I support those amendments, which, while not perfect, are a step forward in ensuring that this exception is protected from abuse.

I am also pleased that the Government have listened to some of my and other Members’ concerns and have now agreed to bring forward amendments at a later stage to exclude sanctioned publishers such as Russia Today from accessing this exemption. However, there are hundreds if not thousands of so-called news publishers across the internet that pose a serious threat, from the far right and also from Islamist, antisemitic and dangerous conspiratorial extremism. We must act to ensure that journalistic protections are not abused by those wishing to spread harm. Let us be clear that this is as much about protecting journalism as it is about protecting users from harm.

We cannot overstate the seriousness of getting this right. Carving out protections within the Bill creates a risk that if we do not get the criteria for this exemption right, harmful and extremist websites based internationally will simply establish offices in the UK, just so that they too can access this powerful new protection. Amendments 11 and 12 will go some way towards ensuring that news publishers are genuine, but I recognise that the amendments are not the perfect solution and that more work is needed as the Bill progresses in the other place.

In closing, I hope that we can find consensus today around the importance of protecting children online and restricting harmful content. It is not always easy, but I know we can find common ground in this place, as we saw during the Committee stage of the Bill when I was delighted to gain cross-party support to secure the introduction of Zach’s law, inspired by my young constituent Zach Eagling, which will outlaw the dreadful practice of epilepsy trolling online.

Online Safety Bill (Fifteenth sitting)

Kim Leadbeater Excerpts
Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

New clause 3 would make provision for a statutory user advocacy body representing the interests of children. It would also allow the Secretary of State to appoint a new or existing body as the statutory user advocate. A strong, authoritative and well-resourced voice that can speak for children in regulatory debates would ensure that complex safeguarding issues are well understood, and would also actively inform the regulator’s decisions.

Charities have highlighted that the complaints and reporting mechanisms in the Bill may not always be appropriate for children. Ofcom’s own evidence shows that only 14% of 12 to 15-year-old children have ever reported content. Children who are most at risk of online harms may find it incredibly challenging to complete a multi-stage reporting and complaints process. Dame Rachel de Souza told the Committee:

“I worry that the Bill does not do enough to respond to individual cases of abuse and that it needs to do more to understand issues and concerns directly from children. Children should not have to exhaust the platforms’ ineffective complaints routes, which can take days, weeks or even months. I have just conducted a survey of 2,000 children and asked them about their experiences in the past month. Of those 2,000 children, 50% had seen harmful content and 40% had tried to get content about themselves removed and had not succeeded. For me, there is something really important about listening to children and taking their complaints into account.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 16, Q22.]

A children’s advocacy body would be able to support children in navigating redress mechanisms that are fundamentally targeted at adults. Given how many children now use the internet, that is an essential element that is missing from the Bill. That is why the super-complaints mechanism needs to be strengthened with specific arrangements for children, as advocated by the National Society for the Prevention of Cruelty to Children and other children’s organisations. A statutory user advocacy body could support the regulator, as well as supporting child users. It would actively promote the interests of children in regulatory decision making and offer support by ensuring that an understanding of children’s behaviour and safeguarding is front and centre in its approach.

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - -

My hon. Friend is making a really valid point. As I look around the room—I mean this with no disrespect to anybody—I see that we are all of an age at which we do not understand the internet in the same way that children and young people do. Surely, one of the key purposes of the Bill is to make sure that children and young people are protected from harms online, and as the Children’s Commissioner said in her evidence, their voices have to be heard. I am sure that, like me, many Members present attend schools as part of their weekly constituency visits, and the conversations we have with young people are some of the most empowering and important parts of this job. We have to make sure that the voices of the young people who we all represent are heard in this important piece of legislation, and it is really important that we have an advocacy body to ensure that.

Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

I very much agree with my hon. Friend. She is quite right: we have to remember that we do not see these things as children and young people do.

The user advocacy body that my hon. Friend has just spoken in support of could also shine a light on the practices that are most harmful to children by using data, evidence and specialist expertise to point to new and emerging areas of harm. That would enable the regulator to ensure its risk profiles and regulatory approach remain valid and up to date. In his evidence, Andy Burrows of the NSPCC highlighted the importance of an advocacy body acting as an early warning system:

“Given the very welcome systemic approach of the regime, that early warning function is particularly important, because there is the potential that if harms cannot be identified quickly, we will see a lag where whole regulatory cycles are missed. User advocacy can help to plug that gap, meaning that harms are identified at an earlier stage, and then the positive design of the process, with the risk profiles and company risk assessments, means that those harms can be built into that particular cycle.”––[Official Report, Online Safety Public Bill Committee, 24 May 2022; c. 16, Q22.]

The provision in the new clause is comparable to those that already exist in many other sectors. For example, Citizens Advice is the statutory user advocate for consumers of energy and the postal services, and there are similar arrangements representing users of public transport. Establishing a children’s user advocacy body would ensure that the most vulnerable online users of all—children at risk of online sexual abuse—receive equivalent protections to customers of post offices or passengers on a bus.

--- Later in debate ---
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - -

The hon. Lady is making some excellent points. I wholeheartedly agree with her about funding for bodies that might be able to support the advocacy body or act as part of it. She makes a really important point, which we have not focused on enough during the debate, about the positive aspects of the internet. It is very easy to get bogged down in all the negative stuff, which a lot of the Bill focuses on, but she is right that the internet provides a safe space, particularly for young people, to seek out their own identity. Does she agree that the new clause is important because it specifically refers to protected characteristics and to the Equality Act 2010? I am not sure where else that appears in the Bill, but it is important that it should be there. We are thinking not just about age, but about gender, disability and sexual orientation, which is why this new clause could be really important.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I absolutely agree. I had not thought about it in those terms, but the hon. Member is right that the new clause gives greater importance to those protected characteristics and lays that out in the Bill.

I appreciate that, under the risk assessment duties set out in the Bill, organisations have to look at protected characteristics in groups and at individuals with those protected characteristics, which I welcome, but I also welcome the inclusion of protected characteristics in the new clause in relation to the duties of the advocacy body. I think that is really important, especially, as the hon. Member for Batley and Spen just said, in relation to the positive aspects of the internet. It is about protecting free speech for children and young people and enabling them to find community and enjoy life online and offline.

Will the Minister give serious consideration to the possibility of a user advocacy body? Third sector organisations are calling for that, and I do not think Ofcom could possibly have the expertise to match such a body.

--- Later in debate ---
I hope those comments make it clear that we already have a statutory advocate: the Children’s Commissioner. Clause 140 contains facilities for other organisations besides our existing statutory advocate to formally and legally raise with Ofcom issues that may arise. Ofcom is bound to reply—it is not optional. Ofcom has to listen to complaints and it has to respond.
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - -

I agree wholeheartedly about the importance of the role of the Children’s Commissioner and she does a fantastic job, but is it not testament to the fact that there is a need for this advocacy body that she is advocating for it and thinks it is a really good idea? The Children Act 2004 is a fantastic Act, but that was nearly 20 years ago and the world has changed significantly since then. The Bill shows that. The fact that she is advocating for it may suggest that she sees the need for a separate entity.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

There is a danger if we over-create statutory bodies with overlapping responsibilities. I just read out the current statutory functions of the Children’s Commissioner under the 2004 Act. If we were to agree to the new clause, we would basically be creating a second statutory advocate or body with duties that are the same as some of those that the Children’s Commissioner already exercises. I read from section 2 of the Act, where those duties are set out. I do not think that having two people with conflicting or competing duties would be particularly helpful.

Online Safety Bill (Thirteenth sitting)

Kim Leadbeater Excerpts
Committee stage & Committee Debate - 13th sitting
Tuesday 21st June 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 21 June 2022
Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - -

It is fantastic to hear that those other things are happening—that is all well and good—but surely we should explicitly call out disinformation and misinformation in the Online Safety Bill. The package of other measures that the Minister mentions is fantastic, but I think they have to be in the Bill.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The hon. Lady says that those measures should be in the Bill—more than they already are—but as I have pointed out, the way in which the legal architecture of the Bill works means that the mechanisms to do that would be adding a criminal offence to schedule 7 as a priority offence, for example, or using a statutory instrument to designate the relevant kind of harm as a priority harm, which we plan to do in due course for a number of harms. The Bill can cover disinformation with the use of those mechanisms.

We have not put the harmful to adults content in the Bill; it will be set out in statutory instruments. The National Security Bill is still progressing through Parliament, and we cannot have in schedule 7 of this Bill an offence that has not yet been passed by Parliament. I hope that that explains the legal architecture and mechanisms that could be used under the Bill to give force to those matters.

On amendment 57, the Government feel that six months is a very short time within which to reach clear conclusions, and that 18 months is a more appropriate timeframe in which to understand how the Bill is bedding in and operating. Amendment 58 would require Ofcom to produce a code of practice on system-level disinformation. To be clear, the Bill already requires Ofcom to produce codes of practice that set out the steps that providers will take to tackle illegal content— I mentioned the new National Security Bill, which is going through Parliament—and harmful content, which may, in some circumstances, include disinformation.

Disinformation that is illegal or harmful to individuals is in scope of the duties set out in the Bill. Ofcom’s codes of practice will, as part of those duties, have to set out the steps that providers should take to reduce harm to users that arises from such disinformation. Those steps could include content-neutral design choices or interventions of other kinds. We would like Ofcom to have a certain amount of flexibility in how it develops those codes of practice, including by being able to combine or disaggregate those codes in ways that are most helpful to the general public and the services that have to pay regard to them. That is why we have constructed them in the way we have. I hope that provides clarity about the way that disinformation can be brought into the scope of the Bill and how that measure then flows through to the codes of practice. I gently resist amendments 57 and 58 while supporting the clause standing part of the Bill.

Question put, That the amendment be made.

Online Safety Bill (Ninth sitting)

Kim Leadbeater Excerpts
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

There is a difference between random individuals posting stuff on Facebook, as opposed to content generated by what we have defined as a “recognised news publisher”. We will debate that in a moment. We recognise that is different in the Bill. Although the Opposition are looking to make amendments to clause 50, they appear to accept that the press deserve special protection. Article 10 case law deriving from the European convention on human rights also recognises that the press have a special status. In our political discourse we often refer generally to the importance of the freedom of the press. We recognise that the press are different, and the press have made the case—both newspapers and broadcasters, all of which now have websites—that their reader engagement is an integral part of that free speech. There is a difference between that and individuals chucking stuff on Facebook outside of the context of a news article.

There is then a question about whether, despite that, those comments are still sufficiently dangerous that they merit regulation by the Bill—a point that the shadow Minister, the hon. Member for Pontypridd, raised. There is a functional difference between comments made on platforms such as Facebook, Twitter, TikTok, Snapchat or Instagram, and comments made below the line on a news website, whether it is The Guardian, the Daily Mail, the BBC—even The National. The difference is that on social media platforms, which are the principal topic of the Bill, there is an in-built concept of virality—things going viral by sharing and propagating content widely. The whole thing can spiral rapidly out of control.

Virality is an inherent design feature in social media sites. It is not an inherent design feature of the comments we get under the news website of the BBC, The Guardian or the Daily Mail. There is no way of generating virality in the same way as there is on Facebook and Twitter. Facebook and Twitter are designed to generate massive virality in a way that comments below a news website are not. The reach, and the ability for them to grow exponentially, is orders of magnitude lower on a news website comment section than on Facebook. That is an important difference, from a risk point of view.

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - -

This issue comes down to a fundamental point—are we looking at volume or risk? There is no difference between an individual—a young person in this instance—seeing something about suicide or self-harm on a Facebook post or in the comments section of a newspaper article. The volume—whether it goes viral or not—does not matter if that individual has seen that content and it has directed them to somewhere that will create serious harm and lead them towards dangerous behaviour. The volume is not the point.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

The hon. Lady raises an important philosophical question that underpins much of the Bill’s architecture. All the measures are intended to strike a balance. Where there are things that are at risk of leading to illegal activity, and things that are harmful to children, we are clamping down hard, but in other areas we are being more proportionate. For example, the legal but harmful to adult duties only apply to category 1 companies, and we are looking at whether that can be extended to other high-risk companies, as we debated earlier. In the earlier provisions that we debated, about “have regard to free speech”, there is a balancing exercise between the safety duties and free speech. A lot of the provisions in the Bill have a sense of balance and proportionality. In some areas, such as child sexual exploitation and abuse, there is no balance. We just want to stop that—end of story. In other areas, such as matters that are legal but harmful and touch on free speech, there is more of a balancing exercise.

In this area of news publisher content, we are again striking a balance. We are saying that the inherent harmfulness of those sites, owing to their functionality—they do not go viral in the same way—is much lower. There is also an interaction with freedom of the press, as I said earlier. Thus, we draw the balance in a slightly different way. To take the example of suicide promotion or self-harm content, there is a big difference between stumbling across something in comment No. 74 below a BBC article, versus the tragic case of Molly Russell—the 14-year-old girl whose Instagram account was actively flooded, many times a day, with awful content promoting suicide. That led her to take her own life.

I think the hon. Member for Batley and Spen would probably accept that there is a functional difference between a comment that someone has to scroll down a long way to find and probably sees only once, and being actively flooded with awful content. In having regard to those different arguments—the risk and the freedom of the press—we try to strike a balance. I accept that they are not easy balances to strike, and that there is a legitimate debate to be had on them. However, that is the reason that we have adopted this approach.

--- Later in debate ---
“Recognised news publisher”
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - -

I beg to move amendment 107, in clause 50, page 46, line 46, leave out from end to end of clause and insert

“is a member of an approved regulator (as defined in section 42 of the Crime and Courts Act 2013).”

This amendment expands the definition of a recognised news publisher to incorporate any entity that is a member of an approved regulator.

The primary purpose of the Bill is to protect social media users from harm, and it will have failed if it does not achieve that. Alongside that objective, the Bill must protect freedom of expression and, in particular, the freedom of the press, which I know we are all committed to upholding and defending. However, in evaluating the balance between freedom of the press and the freedom to enjoy the digital world without encountering harm, the Bill as drafted has far too many loopholes and risks granting legal protection to those who wish to spread harmful content and disinformation in the name of journalism.

Amendment 107 will address that imbalance and protect the press and us all from harm. The media exemption in the Bill is a complete exemption, which would take content posted by news publishers entirely out of the scope of platforms’ legal duties to protect their users. Such a powerful exemption must be drafted with care to ensure it is not open to abuse. However, the criteria that organisations must meet to qualify for the exemption, which are set out in clause 50, are loose and, in some cases, almost meaningless. They are open to abuse, they are ambiguous and they confer responsibility on the platforms themselves to decide which publishers meet the Bill’s criteria and which do not.

In evidence that we heard recently, it was clear that the major platforms do not believe it is a responsibility they should be expected to bear, nor do they have the confidence or feel qualified to do so. Furthermore, David Wolfe, chairman of the Press Recognition Panel, has advised that the measure represents a threat to press freedom. I agree.

Opening the gates for any organisation to declare themselves a news publisher by obtaining a UK address, jotting down a standards code on the back of an envelope and inviting readers to send an email if they have any complaints is not defending the press; it is opening the profession up to abuse and, in the long term, risks weakening its rights and protections.

Let us discuss those who may wish to exploit that loophole and receive legal protection to publish harmful content. A number of far-right websites have made white supremacist claims and praised Holocaust deniers. Those websites already meet several of the criteria for exemption and could meet the remaining criteria overnight. The internet is full of groups that describe themselves as news publishers but distribute profoundly damaging and dangerous material designed to promote extremist ideologies and stir up hatred.

We can all think of high-profile individuals who use the internet to propagate disinformation, dangerous conspiracy theories and antisemitic, Islamophobic, homophobic or other forms of abuse. They might consider themselves journalists, but the genuine professionals whose rights we want to protect beg to differ. None of those individuals should be free to publish harmful material as a result of exemptions that are designed for quite a different purpose. Is it really the Government’s intention that any organisation that meets their loose criteria, as defined in the Bill, should be afforded the sacrosanct rights and freedoms of the press that we all seek to defend?

I turn to disinformation, and to hostile state actors who wish to sow the seeds of doubt and division in our politics and our civic life. The Committee has already heard that Russia Today is among those expected to benefit from the exemption. I have a legal opinion from Tamsin Allen, a senior media lawyer at Bindmans LLP, which notes that,

“were the bill to become law in its present form, Russia Today would benefit from the media exemption. The exemption for print and online news publications is so wide that it would encompass virtually all publishers with multiple contributors, an editor and some form of complaints procedure and standards code, no matter how inadequate. I understand that RT is subject to a standards code in Russia and operates a complaints procedure. Moreover, this exemption could also apply to a publisher promoting hate or violence, providing it met the (minimal) standards set out in the bill and constituted itself as a ‘news’ or ‘gossip’ publication. The only such publications which would not be exempt are those published by organisations proscribed under the Terrorism Act.”

If hostile foreign states can exploit this loophole in the Bill to spread disinformation to social media users in the UK, that is a matter of national security and a threat to our freedom and open democracy. The requirement to have a UK address offers little by way of protection. International publishers spreading hate, disinformation or other forms of online harm could easily set up offices in the UK to qualify for this exemption and instantly make the UK the harm capital of the world. For those reasons, the criteria must change.

We heard from several individuals in evidence that the exemption should be removed entirely from the Bill, but we are committed to freedom of the press as well as providing proper protections from harm. Instead of removing the exemption, I propose a change to the qualifying criteria to ensure that credible publishers can access it while extremist and harmful publishers cannot.

My amendment would replace the convoluted list of requirements with a single and simple requirement for the platforms to follow and adhere to: that all print and online media that seeks to benefit from the exemption should be independently regulated under the royal charter provisions that this House has already legislated for. If, as the Bill already says, broadcast media should be defined in this way, why not print media too? Unlike the Government’s criteria, the likes of Russia Today, white supremacist blogs and other deeply disturbing extremist publications simply could not satisfy this requirement. If they were ever to succeed in signing up to such a regulator, they would swiftly be expelled for repeated standards breaches.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I thank the hon. Member for Batley and Spen for her speech. There is agreement across the House, in this Committee and in the Joint Committee that the commitment to having a free press in this country is extremely important. That is why recognised news publishers are exempted from the provisions of the Bill, as the hon. Lady said.

The clause, as drafted, has been looked at in some detail over a number of years and debated with news publishers and others. It is the best attempt that we have so far collectively been able to come up with to provide a definition of a news publisher that does not infringe on press freedom. The Government are concerned that if the amendment were adopted, it would effectively require news publishers to register with a regulator in order to benefit from the exemption. That would constitute the imposition of a mandatory press regulator by the back door. I put on record that this Government do not support any kind of mandatory or statutory press regulation, in any form, for reasons of freedom of the press. Despite what has been said in previous debates, we think to do that would unreasonably restrict the freedom of the press in this country.

While I understand its intention, the amendment would drive news media organisations, both print and broadcast, into the arms of a regulator, because they would have to join one in order to get the exemption. We do not think it is right to create that obligation. We have reached the philosophical position that statutory or mandatory regulation of the press is incompatible with press freedom. We have been clear about that general principle and cannot accept the amendment, which would violate that principle.

In relation to hostile states, such as Russia, I do not think anyone in the UK press would have the slightest objection to us finding ways to tighten up on such matters. As I have flagged previously, thought is being given to that issue, but in terms of the freedom of the domestic press, we feel very strongly that pushing people towards a regulator is inappropriate in the context of a free press.

The characterisation of these provisions is a little unfair, because some of the requirements are not trivial. The requirement in clause 50(2)(f) is that there must be a person—I think it includes a legal person as well as an actual person—who has legal responsibility for the material published, which means that, unlike with pretty much everything that appears on the internet, there is an identified person who has legal responsibility. That is a very important requirement. Some of the other requirements, such as having a registered address and a standards code, are relatively easy to meet, but the point about legal responsibility is very important. For that reason, I respectfully resist the amendment.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - -

I will not push the amendment to a vote, but it is important to continue this conversation, and I encourage the Minister to consider the matter as the Bill proceeds. I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

John Nicolson Portrait John Nicolson
- Hansard - - - Excerpts

I beg to move amendment 86, in clause 50, page 47, line 3, after “material” insert—

“or special interest news material”.