Debates between Kirsty Blackman and Kim Leadbeater during the 2019 Parliament


ONLINE SAFETY BILL (Second sitting)

Debate between Kirsty Blackman and Kim Leadbeater
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees
Kirsty Blackman

This may be the only place in the Bill where I do not necessarily agree wholeheartedly with the Labour Front Benchers. I agree with the vast majority of what was said, but I have some concerns about making mandatory the requirement for transparency reports to be public in all circumstances, because there are circumstances in which that would simply highlight loopholes, allowing people to exploit them in a way that we do not want them to do.

Specifically on the regularity of reporting and some level of transparency, given that the Minister is keen on the commercial imperative and ensuring that people are safe, we need a higher level of transparency than we currently see among the platforms. There is a very good case to be made for some of the transparency reporting to be made public, particularly for the very largest platforms to be required to make it public, or to make sections of it public.

I want to talk about the speed of change to the terms of service and about proportionality. If Ofcom could request transparency reporting only annually, imagine that it received transparency information three days before Elon Musk took over Twitter. Twitter would be a completely different place three days later, and Ofcom would be unable to ask for more transparency information for a whole year, by which point a significant amount of damage could have been done. We have seen that the terms of service can change quickly. Ofcom would not have the flexibility to ask for an updated transparency report, even if drastic changes were made to the services.

Another thing slightly concerns me about doing this annually and not allowing a bit more flexibility. Let us say that a small platform that none of us has ever heard of, such as Mastodon, shoots to prominence overnight. Let us also say that, as a small platform, Mastodon was previously regulated, and Ofcom had made a request for transparency information shortly before Elon Musk took over Twitter and people had migrated to Mastodon. Mastodon would now be suffering from very different issues than those it had when it had a small number of users, compared with the significant number that it has now. It would have changed dramatically, yet Ofcom would not have the flexibility to seek that information. We know that platforms in the online world have sudden stellar increases in popularity overnight. Some have been bubbling along for ages with nobody using them. Not all of them are brand-new platforms that suddenly shoot to prominence. The lack of flexibility is a problem.

Lastly, I agree about researchers being able to access the transparency information provided. It is really important that we recognise that Ofcom is not the only expert. Ofcom has a huge amount of expertise, and it is massively increasing its staff numbers to cope with these issues, but the reality is that those staff are not academic researchers. They are unable to look at the issues and are not necessarily the most prominent experts in the field of child protection, for example. That is not to take away from the expertise in Ofcom, but we could allow it to ask a regulated group of researchers to look at the information and point out any issues that may not have been spotted, particularly given the volume of transparency reports that there are likely to be.

Kim Leadbeater

The hon. Lady makes an important point. In terms of transparency, the question for me is, what are the Government worried about? Surely part of the Bill is about finding out what is really going on, and the only way that we will do that is by having access to the information. The more transparency, the better. The hon. Lady is right that having experts who can research what is going on is fundamental. If there is a concern around the workload for Ofcom, that is a separate issue that the Minister needs to address, but surely the more work that is done in terms of research and transparency, the better.

Kirsty Blackman

We have seen that just from the people from external organisations who have contacted us about the Bill. The amount of expertise that we do not have that they have brought to the table has significantly improved the debate and hopefully the Bill. Even prior to that, the consultations that have happened encouraged the Minister to make the Bill better. Surely that is why the pre-legislative scrutiny Committee looked at the Bill—in order to improve it and to get expert advice. I still think that having specific access to expertise in order to analyse the transparency report has not been covered adequately.

ONLINE SAFETY BILL (First sitting)

Debate between Kirsty Blackman and Kim Leadbeater
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees
Kirsty Blackman

Thank you, Sir Roger. I absolutely agree with the hon. Member for Warrington North. The platform works by stitching things together, so a video could have a bit of somebody else’s video in it, and that content ends up being shared and disseminated more widely.

This is not an attack on every algorithm. I am delighted to see lots of videos of cats—it is wonderful, and it suits me down to the ground—but the amendment asks platforms to analyse how those processes contribute to the development of habit-forming behaviour and to mitigate the harm caused to children by habit-forming features in the service. It is not saying, “You can’t use algorithms” or “You can’t use anything that may encourage people to linger on your site.” The specific issue is addiction—the fact that people will get sucked in and stay on platforms for hours longer than is healthy.

There is a demographic divide here. There is a significant issue when we compare children whose parents are engaged in these issues and spend time—and have the time to spend—assisting them to use the internet. There is a divide between the experiences of those children online and the experiences of children who are generally not nearly as well off, whose parents may be working two or three jobs to try to keep their homes warm and keep food on the table, so the level of supervision those children have may be far lower. We have a parental education gap, where parents are not able to instruct or teach their children a sensible way to use these things. A lot of parents have not used things such as TikTok and do not know how it works, so they are unable to teach their children.

Kim Leadbeater (Batley and Spen) (Lab)

Does the hon. Lady agree that this feeds into the problem we have with the lack of a digital media literacy strategy in the Bill, which we have, sadly, had to accept? However, that makes it even more important that we protect children wherever we have the opportunity to do so, and this amendment is a good example of where we can do that.

Kirsty Blackman

The hon. Lady makes an excellent point. This is not about mandating that platforms stop doing these things; it is about ensuring that they take this issue into account and that they agree—or that we as legislators agree—with the Royal College of Psychiatrists that we have a responsibility to tackle it. We have a responsibility to ask Ofcom to tackle it with platforms.

This comes back to the fact that we do not have a user advocacy panel, and groups representing children are not able to bring emerging issues forward adequately and effectively. Because of the many other inadequacies in the Bill, that is even more important than it was. I assume the Minister will not accept my amendment—that generally does not happen in Bill Committees—but if he does not, it would be helpful if he could give Ofcom some sort of direction of travel so that it knows it should take this issue into consideration when it deals with platforms. Ofcom should be talking to platforms about habit-forming features and considering the addictive nature of these things; it should be doing what it can to protect children. This threat has emerged only in recent years, and things will not get any better unless we take action.

--- Later in debate ---
Kim Leadbeater

Does the hon. Lady agree that people out there in the real world have absolutely no idea what a platform’s terms of service are, so we are being expected to make a judgment on something about which we have absolutely no knowledge?

Kirsty Blackman

Absolutely. The amendment I tabled regarding the accessibility of terms of service was designed to ensure that if the Government rely on terms of service, children can access those terms of service and are able to see what risks they are putting themselves at. We know that in reality children will not read these things. Adults do not read these things. I do not know what Twitter’s terms of service say, but I do know that Twitter managed to change its terms of service overnight, very easily and quickly. Companies could just say, “I’m a bit fed up with Ofcom breathing down my neck on this. I’m just going to change my terms of service, so that Ofcom will not take action on some of the egregious harm that has been done. If we just change our terms of service, we don’t need to bother. If we say that we are not going to ban transphobia on our platform—if we take that out of the terms of service—we do not need to worry about transphobia on our platform. We can just let it happen, because it is not in our terms of service.”

--- Later in debate ---
Kirsty Blackman

Why do the Government now think that there is a risk to free speech? If Ofcom never had that power, if it was never an issue, why are the Government bothered about that risk—it clearly was not a risk—to free speech? If that was never a consideration, it obviously was not a risk to free speech, so I am now even more confused as to why the Government have decided that they will have to strip this measure out of the Bill because of the risk to free speech, because clearly it was not a risk in this situation. This is some of the most important stuff in the Bill for the protection of adults, and the Government are keen to remove it.

Kim Leadbeater

The hon. Member is making an excellent and very passionate speech, and I commend her for that. Would she agree with one of my concerns, which is about the message that this sends to the public? It is almost that the Government were acknowledging that there was a problem with legal but harmful content—we can all, hopefully, acknowledge that that is a problem, even though we know it is a tricky one to tackle—but, by removing these clauses from the Bill, are now sending the message that, “We were trying to clean up the wild west of the internet, but, actually, we are not that bothered anymore.”

Kirsty Blackman

The hon. Lady is absolutely right. We have all heard from organisations and individuals who have had their lives destroyed as a result of “legal but harmful”—I don’t have a better phrase for it—content online and of being radicalised by being driven deeper and deeper into blacker and blacker Discord servers, for example, that are getting further and further right wing.

A number of the people who are radicalised—who are committing terror attacks, or being referred to the Prevent programme because they are at risk of committing terror attacks—are not so much on the far-right levels of extremism any more, or those with incredible levels of religious extremism, but are in a situation where they have got mixed up or unclear ideological drivers. It is not the same situation as it was before, because people are being radicalised by the stuff that they find online. They are being radicalised into situations where they “must do something”—they “must take some action”—because of the culture change in society.

Online Safety Bill

Debate between Kirsty Blackman and Kim Leadbeater
Kim Leadbeater (Batley and Spen) (Lab)

I rise to speak in favour of amendments 15 to 19 in the names of my hon. Friends and, later, amendments 11 and 12 in the name of the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright).

As we discussed at great length in Committee—my first Bill Committee; a nice simple one to get me started—the Bill has a number of critical clauses to address the atrocious incidence of child sexual exploitation online. Amendments 15 to 19 are aimed at strengthening those protections and helping to ensure that the internet is a safer place for every young person. Amendments 15 and 16 will bring into scope tens of millions of interactions with accounts that actively enable the discovery and sharing of child abuse material. Amendments 17 to 19 will tackle the issue of cross-platform abuse, where abuse starts on one platform and continues on another. These are urgent measures that children’s charities and advocacy groups have long called for, and I seriously hope this House will support them.

Last week, along with the shadow Minister and the then Minister, I attended an extremely moving reception hosted by one of those organisations, the NSPCC. It included a speech by Rachel, a mother of a victim of online grooming and child sexual exploitation. She outlined in a very powerful way how her son Ben was forced from the age of 13 to take and share photos of himself that he did not want to, and to enter Skype chats with multiple men. He was then blackmailed with those images and subjected to threats of violence to his family. Rachel said to us:

“We blamed ourselves and I thought we had failed…I felt like I hadn’t done enough to protect our children”.

I want to say to you, Rachel, that you did not fail Ben. Responsibility for what happened to Ben lies firmly with the perpetrators of these heinous crimes, but what did fail Ben and has failed our young people for far too long is the lack of urgency and political will to regulate the wild west of the internet. No one is pretending that this is an easy task, and we are dealing with a highly complex piece of legislation, but if we are to protect future Bens we have to strengthen this Bill as much as possible.

Another young woman, Danielle, spoke during the NSPCC event. She had been a victim of online CSE that had escalated into horrific real-world physical and sexual abuse. She told us how she has to live with the fear that her photos may appear online and be shared without her knowledge or control. She is a strong young woman who is moving on with her life with huge resilience, but her trauma is very real. Amendment 19 would ensure that proportionate measures are in place to prevent the encountering or dissemination of child abuse content—for example, through intelligence sharing of new and emerging threats. This will protect Danielle and people like her, giving them some comfort that measures are in place to stop the spread of these images and to place far more onus on the platforms to get on top of this horrific practice.

Amendments 11 and 12, in the name of the right hon. and learned Member for Kenilworth and Southam, will raise the threshold for non-broadcast media outlets to benefit from the recognised news publisher exemption by requiring that such publishers are subject to complaints procedures that are both suitable and sufficient. I support those amendments, which, while not perfect, are a step forward in ensuring that this exception is protected from abuse.

I am also pleased that the Government have listened to some of my and other Members’ concerns and have now agreed to bring forward amendments at a later stage to exclude sanctioned publishers such as Russia Today from accessing this exemption. However, there are hundreds if not thousands of so-called news publishers across the internet that pose a serious threat, from the far right and also from Islamist, antisemitic and dangerous conspiratorial extremism. We must act to ensure that journalistic protections are not abused by those wishing to spread harm. Let us be clear that this is as much about protecting journalism as it is about protecting users from harm.

We cannot overstate the seriousness of getting this right. Carving out protections within the Bill creates a risk that if we do not get the criteria for this exemption right, harmful and extremist websites based internationally will simply establish offices in the UK, just so that they too can access this powerful new protection. Amendments 11 and 12 will go some way towards ensuring that news publishers are genuine, but I recognise that the amendments are not the perfect solution and that more work is needed as the Bill progresses in the other place.

In closing, I hope that we can find consensus today around the importance of protecting children online and restricting harmful content. It is not always easy, but I know we can find common ground in this place, as we saw during the Committee stage of the Bill when I was delighted to gain cross-party support to secure the introduction of Zach’s law, inspired by my young constituent Zach Eagling, which will outlaw the dreadful practice of epilepsy trolling online.

Online Safety Bill (Fifteenth sitting)

Debate between Kirsty Blackman and Kim Leadbeater
Kim Leadbeater

The hon. Lady is making some excellent points. I wholeheartedly agree with her about funding for bodies that might be able to support the advocacy body or act as part of it. She makes a really important point, which we have not focused on enough during the debate, about the positive aspects of the internet. It is very easy to get bogged down in all the negative stuff, which a lot of the Bill focuses on, but she is right that the internet provides a safe space, particularly for young people, to seek out their own identity. Does she agree that the new clause is important because it specifically refers to protected characteristics and to the Equality Act 2010? I am not sure where else that appears in the Bill, but it is important that it should be there. We are thinking not just about age, but about gender, disability and sexual orientation, which is why this new clause could be really important.

Kirsty Blackman

I absolutely agree. I had not thought about it in those terms, but the hon. Member is right that the new clause gives greater importance to those protected characteristics and lays that out in the Bill.

I appreciate that, under the risk assessment duties set out in the Bill, organisations have to look at protected characteristics in groups and at individuals with those protected characteristics, which I welcome, but I also welcome the inclusion of protected characteristics in the new clause in relation to the duties of the advocacy body. I think that is really important, especially, as the hon. Member for Batley and Spen just said, in relation to the positive aspects of the internet. It is about protecting free speech for children and young people and enabling them to find community and enjoy life online and offline.

Will the Minister give serious consideration to the possibility of a user advocacy body? Third sector organisations are calling for that, and I do not think Ofcom could possibly have the expertise to match such a body.

Online Safety Bill (Eighth sitting)

Debate between Kirsty Blackman and Kim Leadbeater
Committee stage
Thursday 9th June 2022

Public Bill Committees
Kirsty Blackman

In terms of the secondary processes that kick in after the AI has scanned the data, I assume it will be up to Ofcom and the provider to discuss what happens then. Once the AI identifies something, does it automatically get sent to the National Crime Agency, or does it go through a process of checking to ensure the AI has correctly identified something? I agree with what the Minister has reiterated on a number of occasions; if it is child sexual abuse material then I have no problem with somebody’s privacy being invaded in order for that to be taken to the relevant authorities and acted on.

I want to make one last point. The wording of new clause 20 is about a report on those proactive technologies. It is about requiring Ofcom to come up with and justify the use of those proactive technologies. To give the hon. Member for Wolverhampton North East some reassurance, it is not saying, “This will definitely happen.” I assume that Ofcom will be able to make the case—I am certain it will be able to—but it will have to justify it in order to be able to require those companies to undertake that use.

My key point is about the future-proofing of this, ensuring that it is not just a one-off, and that, if Ofcom makes a designation about the use of proactive technologies, it is able to make a re-designation or future designation, should new proactive technologies come through, so that we can require those new proactive technologies to be used to identify things that we cannot identify with the current proactive technologies.

Kim Leadbeater (Batley and Spen) (Lab)

I want to associate myself with the comments of the right hon. Member for Basingstoke and the hon. Member for Aberdeen North, and to explore the intersection between the work we are doing to protect children and the violence against women and girls strategy. There is one group, girls, who apply to both. We know that they are sadly one of the most vulnerable groups for online harm and abuse, and we must do everything we can to protect them. Having a belt and braces approach, with a code of conduct requirement for the violence against women and girls strategy, plus implementing new clause 20 on this technology that can protect girls in particular, although not exclusively, is a positive thing. Surely, the more thorough we are in the preventive approach, the better, rather than taking action after it is too late?

Kirsty Blackman

I agree 100%. The case that the shadow Minister, the hon. Member for Pontypridd, made and the stories she highlighted about the shame that is felt show that we are not just talking about a one-off impact on people’s lives, but potentially years of going through those awful situations and then many years to recover, if they ever do, from the situations they have been through.

I do not think there is too much that we could do, too many codes of practice we could require or too many compliances we should have in place. I also agree that girls are the most vulnerable group when considering this issue, and we need to ensure that this Bill is as fit for purpose as it can be and meets the Government’s aim of trying to make the internet a safe place for children and young people. Because of the additional risks that there are for girls in particular, we need additional protections in place for girls. That is why a number of us in this room are making that case.

Online Safety Bill (Seventh sitting)

Debate between Kirsty Blackman and Kim Leadbeater
Kirsty Blackman

I would have been quite happy to move the amendment, but I do not think the Opposition would have been terribly pleased with me if I had stolen it. I have got my name on it, and I am keen to support it.

As I have said, I met the NSPCC yesterday, and we discussed how clause 31(3) might work, should the Minister decide to keep it in the Bill and not accept the amendment. There are a number of issues with the clause, which states that the child user condition is met if

“a significant number of children”

are users of the service, or if the service is

“likely to attract a significant number of users who are children”.

I do not understand how that could work. For example, a significant number of people who play Fortnite are adults, but a chunk of people who play it are kids. If some sort of invisible percentage threshold is applied in such circumstances, I do not know whether that threshold will be met. If only 20% of Fortnite users are kids, and that amounts only to half a million children, will that count as enough people to meet the child access assessment threshold?

Fortnite is huge, but an appropriate definition is even more necessary for very small platforms and services. With the very far-right sites that we have mentioned, it may be that only 0.5% of their users are children, and that may amount only to 2,000 children—a very small number. Surely, because of the risk of harm if children access these incredibly damaging and dangerous sites that groom people for terrorism, they should have a duty to meet the child access requirement threshold, if only so that we can tell them that they must have an age verification process—they must be able to say, “We know that none of our users are children because we have gone through an age verification process.” I am keen for children to be able to access the internet and meet their friends online, but I am keen for them to be excluded from these most damaging sites. I appreciate the action that the Government have taken in relation to pornographic content, but I do not think that this clause allows us to go far enough in stopping children accessing the most damaging content that is outwith pornographic content.

The other thing that I want to raise is about how the number of users will be calculated. The Minister made it very clear earlier on, and I thank him for doing so, that an individual does not have to be a registered user to be counted as a user of a site. People can be members of TikTok, for example, only if they are over 13. TikTok has some hoops in place—although they are not perfect—to ensure that its users are over 13, and to be fair, it does proactively remove users that it suspects are under 13, particularly if they are reported. That is a good move.

My child is sent links to TikTok videos through WhatsApp, however. He clicks on the links and is able to watch the videos, which will pop up in the WhatsApp mini-browser thing or in the Safari browser. He can watch the videos without signing up as a registered user of TikTok and without using the platform itself—the videos come through Safari, for example, rather than through the app. Does the Minister expect that platforms will count those people as users? I suggest that the majority of people who watch TikTok by those means are doing so because they do not have a TikTok account. Some will not have accounts because they are under 13 and are not allowed to by TikTok or by the parental controls on their phones.

My concern is that, if the Minister does not provide clarity on this point, platforms will count just the number of registered users, and will say, “It’s too difficult for us to look at the number of unregistered users, so in working out whether we meet the criteria, we are not even going to consider people who do not access our specific app or who are not registered users in some way, shape or form.” I have concerns about the operation of the provisions and about companies using that “get out of jail free” card. I genuinely believe that the majority of those who access TikTok other than through its platform are children and would meet the criteria. If the Minister is determined to keep subsection (3) and not accept the amendment, I feel that he should make it clear that those users must be included in the counting by any provider assessing whether it needs to fulfil the child safety duties.

Kim Leadbeater

I agree with the hon. Lady’s important point, which feeds into the broader question of volume versus risk—no matter how many children see something that causes harm and damage, one is one too many—and the categorisation of service providers into category 1, category 2A and category 2B. The depth of the risk is the problem, rather than the number of people who might be affected. The hon. Lady also alluded to age verification—I am sure we will come to that at some point—which is another can of worms. The important point, which she made well, is about volume versus risk. The point is not how many children see something; even if only a small number of children see something, the damage has been done.

Kirsty Blackman

I absolutely agree. In fact, I have tabled an amendment to widen category 1 to include sites with the highest risk of harm. The Minister has not said that he agrees with my amendment specifically, but he seems fairly amenable to increasing and widening some duties to include the sites of highest risk. I have also tabled another new clause on similar issues.

I am glad that these clauses are in the Bill—a specific duty in relation to children is important and should happen—but as the shadow Minister said, clause 31(3) is causing difficulty. It is causing difficulty for me and for organisations such as the NSPCC, which is unsure how the provisions will operate and whether they will do so in the way that the Government would like.

I hope the Minister will answer some of our questions when he responds. If he is not willing to accept the amendment, will he give consideration to how the subsection could be amended in the future—we have more stages, including Report and scrutiny in the other place—to ensure that there is clarity and that the intended purpose is followed through, rather than being an intention that is not actually translated into law?

Online Safety Bill (Sixth sitting)

Debate between Kirsty Blackman and Kim Leadbeater
Kim Leadbeater

I rise to speak to amendments 105 and 106, in my name, on protecting democracy and democratic debate.

Within the Bill, there are significant clauses intended to prevent the spread of harm online, to protect women and girls against violence and to help prevent child sexual exploitation, while at the same time protecting the right of journalists to do their jobs. Although those clauses are not perfect, I welcome them.

The Bill is wide-ranging. The Minister talked on Second Reading about the power in clause 150 to protect another group—those with epilepsy—from being trolled with flashing images. That subject is close to my heart due to the campaign for Zach’s law—Zach is a young boy in my constituency. I know we will return to that important issue later in the Committee, and I thank the Minister for his work on it.

In protecting against online harm while preserving fundamental rights and values, we must also address the threats posed to those involved in the democratic process. Let me be clear: this is not self-serving. It is about not just MPs but all political candidates locally and nationally and those whose jobs facilitate the execution of our democratic process and political life: the people working on elections or for those elected to public office at all levels across the UK. These people must be defended from harm not only for their own protection, but to protect our democracy itself and, with it, the right of all our citizens to a political system capable of delivering on their priorities free from threats and intimidation.

Many other groups in society are also subjected to a disproportionate amount of targeted abuse, but those working in and around politics sadly receive more than almost any other people in this country, with an associated specific set of risks and harms. That does not mean messages gently, or even firmly, requesting us to vote one way or another—a staple of democratic debate—but messages of hate, abuse and threats intended to scare people in public office, grind them down, unfairly influence their voting intentions or do them physical and psychological harm. That simply cannot be an acceptable part of political life.

As I say, we are not looking for sympathy, but we have a duty to our democracy to try to stamp that out from our political discourse. Amendment 105 would not deny anybody the right to tell us firmly where we are going wrong—quite right, too—but it is an opportunity to draw the essential distinction between legitimately holding people in public life to account and illegitimate intimidation and harm.

The statistics regarding the scale of online abuse that MPs receive are shocking. In 2020, a University of Salford study found that MPs received over 7,000 abusive or hate-filled tweets a month. Seven thousand separate messages of harm a month on Twitter alone directed at MPs is far too many, but who in this room does not believe that the figure is almost certainly much higher today? Amnesty conducted a separate study in 2017 looking at the disproportionate amount of abuse that women and BAME MPs faced online, finding that my right hon. Friend the Member for Hackney North and Stoke Newington (Ms Abbott) was the recipient of almost a third of all the abusive tweets analysed, as alluded to already by the hon. Member for Edinburgh—

Kirsty Blackman

Aberdeen North.

Kim Leadbeater

I knew that. [Laughter.]

Five years later, we continue to see significant volumes of racist, sexist and homophobic hate-filled abuse and threats online to politicians of all parties. That is unacceptable in itself, but we must ask whether this toxic environment helps to keep decent people in politics or, indeed, attracts good people into politics, so that our democracy can prosper into the future across the political spectrum. The reality we face is that our democracy is under attack online each and every day, and every day we delay acting is another day on which abuse becomes increasingly normalised or is just seen as part of the job for those who have put themselves forward for public service. This form of abuse harms society as a whole, so it deserves specific consideration in the Bill.

While elected Members and officials are not a special group of people deserving of more legal protections than anyone else, we must be honest that the abuse they face is distinct and specific to those roles and directly affects our democracy itself. It can lead to the most serious physical harm, with two Members of Parliament having been murdered in the last six years, and many others face death threats or threats of sexual or other violence on a daily basis. However, this is not just about harm to elected representatives; online threats are often seen first, and sometimes only, by their members of staff. They may not be the intended target, but they are often the people harmed most. I am sure we all agree that that is unacceptable and cannot continue.

All of us have probably reported messages and threats to social media platforms and the police, with varying degrees of success in terms of having them removed or the individuals prosecuted. Indeed, we sadly heard examples of that from my hon. Friend the shadow Minister. Often we are told that nothing can be done. Currently, the platforms look at their own rules to determine what constitutes freedom of speech or expression and what is hateful speech or harm. That fine line moves. There is no consistency across platforms, and we therefore urgently need more clarity and a legal duty in place to remove that content quickly.

Amendment 105 would explicitly include in the Bill protection and consideration for those involved in UK elections, whether candidates or staff. Amendment 106 would go further and place an obligation on Ofcom to produce a code of practice, to be issued to the platforms. It would define what steps platforms must take to protect those involved in elections and set out what content is acceptable or unacceptable to be directed at them.

--- Later in debate ---
Kirsty Blackman

I want to make a few comments on the amendment. As a younger female parliamentarian, I find that I am often asked to speak to young people about becoming an MP or getting involved in politics. I find it difficult to say to young women, “Yes, you should do this,” and most of the reason for that is what people are faced with online. It is because a female MP cannot have a Twitter account without facing abuse. I am sure male MPs do as well, but it tends to be worse for women.

We cannot engage democratically and with constituents on social media platforms without receiving abuse and sometimes threats as well. It is not just an abusive place to be—that does not necessarily meet the threshold for illegality—but it is pretty foul and toxic. There have been times when I have deleted Twitter from my phone because I just need to get away from the vile abuse that is being directed towards me. I want, in good conscience, to be able to make an argument to people that this is a brilliant job, and it is brilliant to represent constituents and to make a difference on their behalf at whatever level of elected politics, but right now I do not feel that I am able to do that.

When my footballing colleague, the hon. Member for Batley and Spen, mentions “UK elections” in the amendment, I assume she means that in the widest possible way—elections at all levels.

Kim Leadbeater

indicated assent.

Kirsty Blackman

Sometimes we miss out the fact that although MPs face abuse, we have a level of protection as currently elected Members. Even if there were an election coming up, we have a level of security protection and access that is much higher than for anybody else challenging a candidate or standing in a council or a Scottish Parliament election. As sitting MPs, we already have an additional level of protection because of the security services we have in place. We need to remember, and I assume this is why the amendment is drawn in a pretty broad way, that everybody standing for any sort of elected office faces significant risk of harm—again, whether or not that meets the threshold for illegality.

There are specific things that have been mentioned. As has been said, epilepsy is specifically mentioned as a place where specific harm occurs. Given the importance of democracy, which is absolutely vital, we need to have a democratic system where people are able to stand in elections and make their case. That is why we have election addresses and a system where the election address gets delivered through every single person’s door. There is an understanding and acceptance by people involved in designing democratic processes that the message of all candidates needs to get out there. If the message of all candidates cannot get out there because some people are facing significant levels of abuse online, then democracy is not acting in the way that it should be. These amendments are fair and make a huge amount of sense. They are protecting the most important tenets of democracy and democratic engagement.

I want to say something about my own specific experiences. We have reported people to the police and have had people in court over the messages they have sent, largely by email, which would not be included in the Bill, but there have also been some pretty creepy ones on social media that have not necessarily met the threshold. As has been said, it is my staff who have had to go to court and stand in the witness box to explain the shock and terror they have felt on seeing the email or the communication that has come in, so I think any provision should include that.

Finally, we have seen situations where people working in elections—this is not an airy-fairy notion, but something that genuinely happened—have been photographed and those pictures have been shared on social media, and they have then been abused as a result. They are just doing their job, handing out ballot papers or standing up and announcing the results on the stage, and they have to abide by the processes that are in place now. In order for us to have free and fair elections that are run properly and that people want to work at and support, we need to have that additional level of protection. The hon. Member for Batley and Spen made a very reasonable argument and I hope the Minister listened to it carefully.

Online Safety Bill (Fifth sitting)

Debate between Kirsty Blackman and Kim Leadbeater
Kim Leadbeater (Batley and Spen) (Lab)

Will the hon. Lady give way?

Kirsty Blackman

Absolutely.

Kim Leadbeater

You are making some really important points about the world of the internet and online gaming for children and young people. That is where we need some serious consideration about obligations on providers about media literacy for both children and grown-ups. Many people with children know that this is a really dangerous space for young people, but we are not quite sure we have enough information to understand what the threats, risks and harms are. That point about media literacy, particularly in regard to the gaming world, is really important.

The Chair

Order. Before we proceed, the same rules apply in Committee as on the Floor of the House to this extent: the Chair is “you”, and you speak through the Chair, so it is “the hon. Lady”. [Interruption.] One moment.

While I am on my feet, I should perhaps have said earlier, and will now say for clarification, that interventions are permitted in exactly the same way as they are on the Floor of the House. In exactly the same way, it is up to the Member who has the Floor to decide whether to give way or not. The difference between these debates and those on the Floor of the House is of course that on the Floor of the House a Member can speak only once, whereas in Committee you have the opportunity to come back and speak again if you choose to do so. Once the Minister is winding up, that is the end of the debate. The Chair would not normally admit, except under exceptional circumstances, any further speech, as opposed to an intervention.

Online Safety Bill (Fourth sitting)

Debate between Kirsty Blackman and Kim Leadbeater
Committee stage: 4th sitting
Thursday 26th May 2022

Public Bill Committees
Kirsty Blackman

Q In terms of online gaming, and predators moving children from more mainstream to less regulated platforms, do you think there are improvements in the Bill that relate to that, or do you think more can be done?

Lynn Perry: Grooming does happen within gaming, and we know that online video games offer some user-to-user interaction. Users sometimes have the ability to create content within platforms, which is in scope for the Bill. The important thing will be enforcement and compliance in relation to those provisions. We work with lots of children and young people who have been sexually exploited and abused, and who have had contact through gaming sites. It is crucial that this area is in focus from the perspective of building in, by design, safety measures that stop perpetrators being able to communicate directly with children.

Private messaging is another area for focus. We also consider it important for Ofcom to have regulatory powers to compel firms to use technology that could identify child abuse and grooming.

Kim Leadbeater

Q If I could address one question to each witness, that would be fantastic. I do a lot of work with women in sport, including football. Obviously, we have the Women’s Euros coming up, and I have my Panini sticker album at the ready. Do you think the Bill could do more to address the pervasive issue of online threats of violence and abuse against women and girls, including those directed at women in sport, be they players, officials or journalists?

Sanjay Bhandari: I can see that there is something specific in the communications offences and that first limb around threatening communications, which will cover a lot of the things we see directed at female football pundits, like rape threats. It looks as though it would come under that. With our colleagues in other civil society organisations, particularly Carnegie UK Trust, we are looking at whether more should be done specifically about tackling misogyny and violence against women and girls. It is something that we are looking at, and we will also work with our colleagues in other organisations.

Online Safety Bill (Second sitting)

Debate between Kirsty Blackman and Kim Leadbeater
Kirsty Blackman

Q If someone has consented to take part in pornography and they later change their mind and would like it to be taken down, do you think they should have the right to ask a porn website, for example, to take it down?

Professor Clare McGlynn: That is quite challenging not only for pornography platforms but for sex workers, in that if you could participate in pornography but at any time thereafter withdraw your consent, it is difficult to understand how a pornography company and the sex worker would be able to make a significant amount of money. The company would be reluctant to invest because it might have to withdraw the material at any time. In my view, that is quite a challenge. I would not go down that route, because what it highlights is that the industry can be exploitative and that is where the concern comes from. I think there are other ways to deal with an exploitative porn industry and other ways to ensure that the material online has the full consent of participants. You could put some of those provisions into the Bill—for example, making the porn companies verify the age and consent of those who are participating in the videos for them to be uploaded. I think that is a better way to deal with that, and it would ensure that sex workers themselves can still contract to perform in porn and sustain their way of life.

Kim Leadbeater

Q Thank you very much—this is extremely interesting and helpful. You have covered a lot of ground already, but I wonder whether there is anything specific you think the Bill should be doing more about, to protect girls—under-18s or under-16s—in particular?

Janaya Walker: A lot of what we have discussed in terms of naming violence against women and girls on the face of the Bill includes children. We know that four in five offences of sexual communications with a child involved girls, and a lot of child abuse material is targeted at girls specifically. The Bill as a whole takes a very gender-neutral approach, which we do not think is helpful; in fact, we think it is quite harmful to trying to reduce the harm that girls face online.

This goes against the approach taken in the Home Office violence against women and girls strategy and its domestic abuse plan, as well as the gold-standard treaties the UK has signed up to, such as the Istanbul convention, which we signed and have recently committed to ratifying. The convention states explicitly that domestic laws, including on violence against women and girls online, need to take a very gendered approach. Currently, it is almost implied, with references to specific characteristics. We think that in addressing the abuse that girls, specifically, experience, we need to name girls. To clarify, the words “women”, “girls”, “gender” and “sex” do not appear in the Bill, and that is a problem.

Jessica Eagelton: May I add a point that is slightly broader than your question? Another thing that the Bill does not do at the moment is provide for specialist victim support for girls who are experiencing online abuse. There has been some discussion about taking a “polluter pays” approach; where platforms are not compliant with the duties, for example, a percentage of the funds that go to the regulator could go towards victim support services, such as the revenge porn helpline and Refuge’s tech abuse team, that provide support to victims of abuse later on.

Professor Clare McGlynn: I can speak to pornography. Do you want to cover that separately, or shall I do that now?

--- Later in debate ---
Kirsty Blackman

Q Finally, do you think it would be desirable for Ofcom to consider a system with more consistency in parental controls, so that parents can always ensure that their children cannot talk to anybody outside their circle? Would that be helpful?

Dr Rachel O'Connell: There is a history of parental controls, and only 36% of parents use them. Ofcom research consistently says that it is 70%, but in reality, it is lower. When using age verification, the parents are removing the ability to watch everything. It is a platform; they are providing the digital playground. In the same way, when you go on swings and slides, there is bouncy tarmac because you know the kids are going to use them. It is like creating that health and safety environment in a digital playground.

When parents receive a notification that their child wants to access something, there could be a colour-coded nutrition-style thing for social media, livestreaming and so on, and the parents could make an informed choice. It is then up to the platform to maintain that digital playground and run those kinds of detection systems to see if there are any bad actors in there. That is better than parental controls because the parent is consenting and it is the responsibility of the platform to create the safer environment. It is not the responsibility of the parent to look over the child’s shoulder 24/7 when they are online.

Kim Leadbeater

Q The age verification stuff is really interesting, so thank you to our witnesses. On violence against women and girls, clauses 150 to 155 set out three new communications offences. Do you think those offences will protect women from receiving offensive comments, trolling and threats online? What will the Bill mean for changing the way you manage those risks on your platforms?

Jared Sine: I do not know the specific provisions but I am familiar with the general concept of them. Any time you put something in law, it can either be criminalised or have enforcement behind it, and I think that helps. Ultimately, it will be up to the platforms to come up with innovative technologies or systems such as “Are You Sure?” and “Does This Bother You?” which say that although the law says x, we are going to go beyond that to find tools and systems that make it happen on our platform. Although I think it is clearly a benefit to have those types of provisions in law, it will really come down to the platforms taking those extra steps in the future. We work with our own advisory council, which includes the founder of the #MeToo movement, REIGN and others, who advise us on how to make platforms safer for those things. That is where the real bread gets buttered, so to speak.

Online Safety Bill (First sitting)

Debate between Kirsty Blackman and Kim Leadbeater
Kirsty Blackman

Q My last question is about future-proofing the Bill. Obviously, an awful lot of things will happen in the online world that do not currently happen there, and some of those we cannot foresee. Do you think the Bill is wide enough and flexible enough to allow changes to be made so that new and emerging platforms can be regulated?

Kevin Bakhurst: Overall, we feel that it is. By and large, the balance between certainty and flexibility in the Bill is probably about right and will allow some flexibility in future, but it is very hard to predict what other harms may emerge. We will remain as flexible as possible.

Richard Wronka: There are some really important updating tools in the Bill. The ability for the Secretary of State to introduce new priority harms or offences—with the approval of Parliament, of course—is really important.

Kim Leadbeater (Batley and Spen) (Lab)

Q Ofcom is required to produce certain codes, for example on terrorism, but others that were floated in the Green Paper are no longer in the Bill. Are you working on such codes, for example on hate crime and wider harm, and if not, what happens in the meantime? I guess that links to my concerns about the democratic importance and journalistic content provisions in the Bill, to which you have alluded. They are very vague protections and I am concerned that they could be exploited by extremists who suddenly want to identify as a journalist or a political candidate. Could you say a little about the codes and about those two particular clauses and what more you think we could do to help you with those?

Richard Wronka: I will cover the codes first. You are absolutely right that the Bill requires Ofcom to publish codes of practice, particularly on CSEA and on terror, as well as on fraudulent advertising and other areas. We are doing the work right now so that we are ready to progress with that process as soon as we get powers and duties, because it is really important that we are ready to move as quickly as possible. We will set out further detail on exactly how we plan to do that in a roadmap document that we are looking to publish before the summer break, so that will provide some of the detail.

A really important point here is that the Bill quite rightly covers a wide set of harms. We are mindful of the fact that the temptation of having a code that covers every single harm could be counterproductive and confusing for platforms, even for those that want to comply and do the right thing. One of the balancing acts for us as we produce that code framework will be to get the right coverage for all the issues that everyone is rightly concerned about, but doing that in a way that is streamlined and efficient, so that services can apply the provisions of those codes.

Kevin Bakhurst: Shall I pick up on the second bit very quickly? I think you are right; this is one of our central concerns about the definitions. As far as possible, this should be a matter for Parliament. It is really important to know that Parliament has a view on this. Ultimately, the regulator will take a view based on what Parliament says. We have some experience in this area, but as Richard said, we recognise the challenge—it is extremely complex. We can see the policy intent of doing it, quite rightly, and the importance of enshrining freedom of expression as far as possible, but Parliament can help to add clarity and, as you rightly say, be aware of some of the potential loopholes. At the moment, someone could describe themselves as a citizen journalist; where does that leave us? I am not quite sure. Parliament could help to clarify that, and we would be grateful.

--- Later in debate ---
Kirsty Blackman

Q Just one more question. We know that women and minorities face more abuse online than men do. Is that something that you have found in your experience, particularly Twitter? What are you doing to ensure that the intersectionality of harms is considered in the work that you are doing to either remove or downgrade content?

Katy Minshall: That is absolutely the case and it has been documented by numerous organisations and research. Social media mirrors society and society has the problems you have just described. In terms of how we ensure intersectionality in our policies and approaches, we are guided by our trust and safety council, which is a network of dozens of organisations around the world, 10 of which are here in the UK, and which represents different communities and different online harms issues. Alongside our research and engagement, the council ensures that when it comes to specific policies, we are constantly considering a range of viewpoints as we develop our safety solutions.

Kim Leadbeater

Q Thank you, Chair, and thank you to the witnesses. I share your concerns about the lack of clarity regarding the journalistic content and democratic content exemptions. Do you think those exemptions should be removed entirely, or can you suggest what we might do to make them clearer in the Bill?

Katy Minshall: At the very least, there must be tighter definitions. I am especially concerned when it comes to the news publisher exemption. The Secretary of State has indicated an amendment that would mean that services like Twitter would have to leave such content up while an appeals process is ongoing. There is no timeline given. The definition in the Bill of a news publisher is, again, fairly vague. If Ben and I were to set up a news website, nominally have some standards and an email address where people could send complaints, that would enable it to be considered a news publisher under the Bill. If we think about some of the accounts that have been suspended from social media over the years, you can absolutely see them creating a news website and saying, “I have a case to come back on,” to Twitter or TikTok or wherever it may be.

Ben Bradley: We share those concerns. There are already duties to protect freedom of expression in clause 19. Those are welcome. It is the breadth of the definition of journalistic and democratic content that is a concern for us, particularly when it comes to things like the expedited and dedicated appeals mechanism, which those people would be able to claim if their content was removed. We have already seen people like Tommy Robinson on the far right present themselves as journalists or citizen journalists. Giving them access to a dedicated and expedited appeals mechanism is an area of concern.

There are different ways you could address that, such as greater clarity in those definitions and removing subjective elements. At the minute, it is whether or not a user considers their content to be journalistic; it is not an objective criterion but about their belief about their content.

Also, if you look at something like the dedicated and expedited appeals mechanism, could you hold that in reserve so that if a platform were found to be failing in its duties to journalistic content or in its freedom of expression duties, Ofcom could say, like it can in other areas of the Bill, “Okay, we believe that you need to create this dedicated mechanism, because you have failed to protect those duties”? That would, I think, minimise the risk of exploitation of that mechanism.