All 3 Lucy Powell contributions to the Online Safety Act 2023

Tue 19th Apr 2022: Online Safety Bill, 2nd reading (Commons Chamber)
Mon 5th Dec 2022: Online Safety Bill (Programme) (No. 4)
Tue 17th Jan 2023: Online Safety Bill, 3rd reading

Online Safety Bill

2nd reading
Tuesday 19th April 2022
Commons Chamber

Lucy Powell (Manchester Central) (Lab/Co-op):

Thank you, Madam Deputy Speaker. It has been a busy day, and I will try to keep my remarks short. It is a real shame that the discussion of an important landmark Bill, with so many Members wanting to contribute, has been squeezed into such a tiny amount of time.

Labour supports the principles of the Online Safety Bill. There has been a wild west online for too long. Huge platforms such as Facebook and Google began as start-ups but now wield enormous influence over almost every aspect of our lives: how we socialise and shop, where we get our news and views, and even the outcomes of elections and propaganda wars. There have been undoubted benefits, but the lack of regulation has let harms and abuses proliferate. From record reports of child abuse to soaring fraud and scams, from racist tweets to Russia’s disinformation campaigns, there are too many harms that, as a society, we have been unable or unwilling to address.

There is currently no regulator. However, neither the Government nor Silicon Valley should have control over what we can say and do online. We need strong, independent regulation.

Dan Carden (Liverpool, Walton) (Lab):

Will my hon. Friend give way?

Lucy Powell:

I will give way once on this point.

Dan Carden:

I am grateful. The Secretary of State talked about getting the tech giants to follow their own rules, but we know from Frances Haugen, the Facebook whistleblower, that companies were driving children and adults to harmful content, because it increased engagement. Does that not show that we must go even further than asking them to follow their own rules?

Lucy Powell:

I very much agree with my hon. Friend, and I will come on to talk about that shortly.

The Online Safety Bill is an important step towards strong, independent regulation. We welcome the Bill’s overall aim: the duty of care framework based on the work of the Carnegie Trust. I agree with the Secretary of State that the safety of children should be at the heart of this regulation. The Government have rightly now included fraud, online pornography and cyber-flashing in the new draft of the Bill, although they should have been in scope all along.

Wera Hobhouse (Bath) (LD):

Will the hon. Lady give way?

Lucy Powell:

I am not going to give way, sorry.

Before I get onto the specifics, I will address the main area of contention: the balance between free speech and regulation, most notably expressed via the “legal but harmful” clauses.

Christian Wakeford (Bury South) (Lab):

Will my hon. Friend give way?

Lucy Powell:

I will give way one last time.

Christian Wakeford:

I thank my hon. Friend. The Government have set out the priority offences in schedule 7 to the Bill, but legal harms have clearly not been specified. Given the torrent of racist, antisemitic and misogynistic abuse that grows every single day, does my hon. Friend know why the Bill has not been made more cohesive with a list of core legal harms, allowing for emerging threats to be dealt with in secondary legislation?

--- Later in debate ---
Lucy Powell:

I will come on to some of those issues. My hon. Friend makes a valid point.

I fear the Government’s current solution to the balance between free speech and regulation will please no one and takes us down an unhelpful rabbit hole. Some believe the Bill will stifle free speech, with platforms over-zealously taking down legitimate political and other views. In response, the Government have put in what they consider to be protections for freedom of speech and have committed to setting out an exhaustive list of “legal but harmful” content, thus relying almost entirely on a “take down content” approach, which many will still see as Government overreach.

On the other hand, those who want harmful outcomes addressed through stronger regulation are left arguing over a yet-to-be-published list of Government-determined harmful content. This content-driven approach moves us in the wrong direction away from the “duty of care” principles the Bill is supposed to enshrine. The real solution is a systems approach based on outcomes, which would not only solve the free speech question, but make the Bill overall much stronger.

What does that mean in practice? Essentially, rather than going after individual content, go after the business models, systems and policies that drive the impact of such harms—[Interruption.] The Minister for Security and Borders, the right hon. Member for East Hampshire (Damian Hinds), says from a sedentary position that that is what the Bill does, but none of the leading experts in the field think the same. He should talk to some of them before shouting at me.

The business models of most social media companies are currently based on engagement, as my hon. Friend the Member for Liverpool, Walton (Dan Carden) outlined. The more engagement, the more money they make, which rewards controversy, sensationalism and fake news. A post containing a racist slur or anti-vax comment that nobody notices, shares or reads is significantly less harmful than a post that is quickly able to go viral. A collective pile-on can have a profoundly harmful effect on the young person on the receiving end, even though most of the individual posts would not meet the threshold of harmful.

Matt Rodda (Reading East) (Lab):

Will my hon. Friend give way on that point?

Lucy Powell:

I will not, sorry. Facebook whistleblower Frances Haugen, whom I had the privilege of meeting, gave the Joint Committee on the draft Online Safety Bill many examples of Facebook’s models and algorithms making things much worse. Had the Government chosen to follow the Joint Committee’s recommendations for a systems-based approach rather than a content-driven one, the Bill would be stronger and concerns about free speech would be reduced.

Lucy Powell:

I am sorry, but too many people want to speak. Members should talk to their business managers, who have cut—[Interruption.] I know the hon. Gentleman was Chair of the Committee—[Interruption.]

Madam Deputy Speaker (Dame Eleanor Laing):

Order. The hon. Lady is not giving way. Let us get on with the debate.

Lucy Powell:

The business managers have failed everybody on both sides given the time available.

A systems-based approach also has the benefit of tackling the things that platforms can control, such as how content spreads, rather than what they cannot control, such as what people post. We would avoid the cul-de-sac of arguing over the definitions of what content is or is not harmful, and instead go straight to the impact. I urge the Government to adopt the recommendations that have been made consistently to focus the Bill on systems and models, not simply on content.

Let me turn to other aspects of the Bill, where key issues with its effectiveness remain. The first relates to protecting children. As any parent will know, children face significant risks online, from poor body image, bullying and sexist trolling to the most extreme grooming and child abuse, which is, tragically, on the rise. This Bill is an important opportunity to make the internet a safe place for children. It sets out duties on platforms to prevent children from encountering illegal, harmful or pornographic content. That is all very welcome.

However, despite some of the Government’s ambitious claims, the Bill still falls short of fully protecting children. As the National Society for the Prevention of Cruelty to Children argues, the Government have failed to grasp the dynamics of online child abuse and grooming—[Interruption.] Again, I am being heckled from the Front Bench, but if Ministers engage with the children’s charities they will find a different response. For example—[Interruption.] Yes, but they are not coming out in support of the Bill, are they? For example, it is well evidenced that abusers will often first interact with children on open sites and then move to more encrypted platforms. The Government should require platforms to collaborate to reduce harm to children, prevent abuse from being displaced and close loopholes that let abusers advertise to each other in plain sight.

The second issue is illegal activity. We can all agree that what is illegal offline should be illegal online, and all platforms will be required to remove illegal content such as terrorism, child sex abuse and a range of other serious offences. It is welcome that the Government have set out an expanded list, but they can and must go further. Fraud was the single biggest crime in the UK last year, yet the Business Secretary dismissed it as not affecting people’s everyday lives.

The approach to fraud in this Bill has been a bit like the hokey-cokey: the White Paper said it was out, then it was in, then it was out again in the draft Bill and finally it is in again, but not for the smaller sites or the search services. The Government should be using every opportunity to make it harder for scammers to exploit people online, backed up by tough laws and enforcement. What is more, the scope of this Bill still leaves out too many of the Law Commission’s recommendations of online crimes.

The third issue is disinformation. The war in Ukraine has unleashed Putin’s propaganda machine once again. That comes after the co-ordinated campaign by Russia to discredit the truth about the Sergei Skripal poisonings. Many other groups have watched and learned: from covid anti-vaxxers to climate change deniers, the internet is rife with dangerous disinformation. The Government have set up a number of units to tackle disinformation and claim to be working with social media companies to take it down. However, that is opaque and far from optimal. The only mention of disinformation in the Bill is that a committee should publish a report. That is far from enough.

Returning to my earlier point, it is the business models and systems of social media companies that create a powerful tool for disinformation and false propaganda to flourish. Being a covid vaccine sceptic is one thing, but being able to quickly share false evidence dressed up as science to millions of people within hours is a completely different thing. It is the power of the platform that facilitates that, and it is the business models that encourage it. This Bill hardly begins to tackle those societal and democratic harms.

The fourth issue is online abuse. From racism to incels, social media has become a hotbed for hate. I agree with the Secretary of State that that has poisoned public life. I welcome steps to tackle anonymous abuse. However, we still do not know what the Government will designate as legal but harmful, which makes it very difficult to assess whether the Bill goes far enough, or indeed too far. I worry that those definitions are left entirely to the Secretary of State to determine. A particularly prevalent and pernicious form of online hate is misogyny, but violence against women and girls is not mentioned at all in the Bill—a serious oversight.

The decision on which platforms will be regulated by the Bill is also arbitrary and flawed. Only the largest platforms will be required to tackle harmful content, yet smaller platforms, which can still have a significant, highly motivated, well-organised and particularly harmful user base, will not. Ofcom should regulate based on risk, not just on size.

The fifth issue is that the regulator and the public need the teeth to take on the big tech companies, with all the lawyers they can afford. It is a David and Goliath situation. The Bill gives Ofcom powers to investigate companies and fine them up to 10% of their turnover, and there are some measures to help individual users. However, if bosses in Silicon Valley are to sit up and take notice of this Bill, it must go further. It should include stronger criminal liability, protections for whistleblowers, a meaningful ombudsman for individuals, and a route to sue companies through the courts.

The final issue is future-proofing, which we have heard something about already. This Bill is a step forward in dealing with the likes of Twitter, Facebook and Instagram—although it must be said that many companies have already begun to get their house in order ahead of any legislation—but it will have taken nearly six years for the Bill to appear on the statute book.

Since the Bill was first announced, TikTok has emerged on the scene, and Facebook has renamed itself Meta. The metaverse is already posing dangers to children, with virtual reality chat rooms allowing them to mix freely with predatory adults. Social media platforms are also adapting their business models to avoid regulation; Twitter, for example, says that it will decentralise and outsource moderation. There is a real danger that when the Bill finally comes into effect, it will already be out of date. A duty of care approach, focused on outcomes rather than content, would create a much more dynamic system of regulation, able to adapt to new technologies and platforms.

In conclusion, social media companies are now so powerful and pervasive that regulating them is long overdue. Everyone agrees that the Bill should reduce harm to children and prevent illegal activity online, yet there are serious loopholes, as I have laid out. Most of all, the focus on individual content rather than business models, outcomes and algorithms will leave too many grey areas and black spots, and will not satisfy either side in the free speech debate.

Despite full prelegislative scrutiny, the Government have been disappointingly reluctant to accept those bigger recommendations. In fact, they are going further in the wrong direction. As the Bill progresses through the House, we will work closely with Ministers to improve and strengthen it, to ensure that it truly becomes a piece of world-leading legislation.

Several hon. Members rose—

Online Safety Bill (Programme) (No. 4)

Monday 5th December 2022

Lucy Powell (Manchester Central) (Lab/Co-op):

There has been long-standing consensus since the Bill was first mooted more than four years ago—before anyone had even heard of TikTok—that online and social media needed regulating. Despite our concerns about both the previous drafting and the new amendments, we support the principle of the Online Safety Bill, but I take issue with the Secretary of State’s arguments today. [Interruption.] I think the hon. Member for Peterborough (Paul Bristow) is trying to correct my language from a sedentary position. Perhaps he wants to listen to the argument instead, because what he and the Secretary of State are doing today will take the Bill a massive step backwards, not forwards.

The consensus has not just been about protecting children online, although of course that is a vital part of the Bill; it is also about the need to tackle the harms that these powerful platforms present when they go unmitigated. As we have heard this evening, there is a cross-party desire to strengthen and broaden the Bill, not water it down, as we are now hearing. Alas, we are not there.

This is not a perfect Bill and was never going to be, but even since the last delay before the summer, we have had the coroner’s inquest into the tragic Molly Russell case, Russian disinformation campaigns and the takeover and ongoing implosion of Twitter. Yet the Government are now putting the entire Bill at risk. It has already been carried over once, so if we do not complete its passage before the end of this parliamentary Session, it will fall completely. The latest hold-up is to enable the Government to remove “legal but harmful” clauses. This goes against the very essence of the Bill, which was created to address the particular power of social media to share, to spread and to broadcast around the world very quickly.

Mr Peter Bone (Wellingborough) (Con):

I understand the shadow Minister’s concern about what the Government are trying to do, but I do not understand why she is speaking against a programme motion that gives the Opposition more time to scrutinise the Bill. It must be the first time I have heard a member of the Opposition demand less time in which to scrutinise a Bill.

Lucy Powell:

I shall come on to that. It is we, on the Opposition side of the House, who are so determined to get the Bill on to the statute book that I find myself arguing against the Government’s further delay. Let us not forget that six months have passed between the first day on Report and the second, today—the longest ever gap between two days of Report in the history of the House—so it is delay after delay.

Disinformation, abuse, incel gangs, body shaming, covid denial, holocaust denial, scammers—the list goes on, all of it actively encouraged by unregulated engagement algorithms and business models that reward sensational, extreme, controversial and abusive behaviour. It is these powers and models that need regulating, for individuals on the receiving end of harm but also to deal with harms to society, democracy and our economy. The enormous number of amendments that have been tabled in the last week should be scrutinised, but we now face a real trade-off between the Bill not passing through the other place in time and the provision of more scrutiny. As I told the Secretary of State a couple of weeks ago in private, our judgment is this: get the Bill to the other place as soon as possible, and we will scrutinise it there.

Sara Britcliffe (Hyndburn) (Con):

Does the hon. Lady agree that what the Labour party did was initiate a vote of no confidence in the Prime Minister rather than making progress with the Bill—which she says is so important—at the time when it was needed?

Lucy Powell:

The hon. Lady remembers incorrectly. It was members of her own party who tabled the motion of no confidence. Oh, I have just remembered: they did not have confidence in the Prime Minister at the time, did they? We have had two Prime Ministers since then, so I am not sure that they have much confidence—[Interruption.]

Lucy Powell:

I will move on now, thank you.

We would not have been here at all if the Secretary of State had stuck to the guns of her predecessor, who, to be fair to her—I know she is not here today—saw off a raft of vested interests to enable the Bill to progress. The right hon. Member for Mid Bedfordshire (Ms Dorries) understood that this is not about thwarting the right to hold views that most of us find abhorrent, but about not allowing those views to be widely shared on a powerful platform that, in the offline world, just does not exist. She understood that the Online Safety Bill came from a fundamental recognition of the power of platforms and their algorithms to push people towards content that, although it may not be illegal on its own, cumulatively causes significant harm. Replacing the prevention of harm with an emphasis on free speech lets the platforms off the hook, and the absence of duties to prevent harm and dangerous outcomes will allow them to focus on weak user controls.

Simply holding platforms to account for their own terms and conditions—the Secretary of State referred to that earlier—which, as we saw just this week at Twitter, can be rewritten or changed at whim, will not constitute robust enough regulation to deal with the threat that these platforms present. To protect children, the Government are relying on age verification, but as those with teenage children are well aware—including many of us in the House—most of them pass themselves off as older than they are, and verification is easy to get around. The proposed three shields for adults are just not workable and do not hold up to scrutiny. Let us be clear that the raft of new amendments tabled by the Government this week is nothing more than a major weakening and narrowing of this long-awaited legislation.

This is not what Labour would do. We would tackle at root the power of the platforms to negatively shape all our lives. But we are where we are, and it is better to have the regulator in place with some powers than to have nothing at all. I fear that adding more weeks in Committee in the Commons, having already spent years and years debating this Bill, will not make it any better anyway. Going back into Committee is an unprecedented step, and where might that end? What is to prevent another new Minister or Secretary of State from changing their mind again in the new year, or to prevent there being another reshuffle or even another Prime Minister? That might happen! This is a complex and important Bill, but it is also long, long overdue. We therefore support the original programme motion to get the Bill into the other place immediately, and we will not be voting to put the Bill back into Committee.

Online Safety Bill

3rd reading
Tuesday 17th January 2023

Lucy Powell (Manchester Central) (Lab/Co-op):

I am relieved to finally speak on Third Reading of this important Bill. We have had a few false dawns along the way, but we are almost there. The Bill has seen parliamentary dramas, arcane procedures and a revolving door of Ministers. Every passing week throws up another example of why stronger online regulation is urgently needed, from the vile Andrew Tate and the damning Molly Russell inquest to threats to democracy and, most recently, Elon Musk’s takeover of Twitter and ripping up of its rules.

The power of the broadcast media in the past was that it reached into everybody’s living rooms. Today, in the digital age, social media is in every room in our home, in every workplace, in every school, at every event and, with the rise of virtual reality, also in our heads. It is hard to escape. What began as ideas on student campuses to join up networks of old friends are now multibillion-pound businesses that attract global advertising budgets and hold hugely valuable data and information on every aspect of our lives.

In the digital age, social media is a central influence on what we buy, often on what we think, how we interact and how we behave. The power and the money at stake are enormous, yet the responsibilities are minimal and the accountability non-existent. The need to constantly drive engagement and growth has brought with it real and actual harms to individuals, democracy, our economy, society and public health, with abusers and predators finding a new profitable home online. These harms are driven by business models and engagement algorithms that actively promote harmful content. The impact on children and young people can be particularly acute, even life-threatening.

It is for those reasons and others that, as a country and on a cross-party basis, we embarked many years ago on bringing communications from the analogue era into the digital age. Since the Bill was first mooted, we have had multiple Select Committee reports, a Joint Committee and even two Public Bill Committees. During that time, the pace of change has continued. Nobody had even heard of TikTok when we first discussed the Bill. Today, it is one of the main ways that young people get their news. It is a stark reminder of just how slow-moving Government legislation is and how we will probably need to return to these issues once again very soon—I am sorry to break that to everybody—but we have got there for now. We will at least establish a regulator with some tough powers, albeit with a much narrower scope than was originally conceived.

Sir George Howarth (Knowsley) (Lab):

I warmly endorse what my hon. Friend is saying. Does she agree with the right hon. Member for Chelmsford (Vicky Ford), who intervened on the Secretary of State, that further work is needed to prevent platforms from promoting different forms of eating disorders?

Lucy Powell:

I absolutely endorse those comments and I will come on to that briefly.

We never thought that the Online Safety Bill was perfect and we have been trying to work with the Government to improve it at every stage. Some of that has paid off and I put on record my thanks to my hon. Friend the Member for Pontypridd (Alex Davies-Jones) for her truly brilliant work, which has been ably supported by my hon. Friend the Member for Worsley and Eccles South (Barbara Keeley). I thank the various Ministers for listening to our proposals on scam ads, epilepsy trolling and dealing with small but high harm platforms, and I thank the various Secretaries of State for their constructive approaches. Most of all, I, too, thank the campaigners, charities and families who have been most affected by the Bill.

I welcome today’s last-minute concessions. We have been calling for criminal liability from the start as a means to drive culture change, and we look forward to seeing the detail of the measure when it is tabled in the other place. I also welcome that the Bill will finally outlaw conversion practices, including for trans people, and will take tougher action on people traffickers who advertise online.

On major aspects, however, the Government have moved in the wrong direction. They seem to have lost their mettle and watered down the Bill significantly by dumping whole swathes of it, including many of the harms that it was originally designed to deal with. There are still protections for children, albeit that age verification is difficult and many children pass themselves off as older online, but all the previous work on tackling wider harms has been dropped.

In failing to reconcile harms that are not individually illegal with the nature of powerful platforms that promote engagement and outcomes that are harmful, the Government have let the big tech companies off the hook and left us all more at risk. Online hate, disinformation, sensationalism, abuse, terrorism, racism, self-harm, eating disorders, incels, misogyny, antisemitism, and many other things, are now completely out of scope of the Bill and will continue to proliferate. That is a major loophole that massively falls short of the Bill’s original intention.

I hope that the other place will return to some of the core principles of the duty of care, giving the regulator wider powers to direct terms and conditions, and getting transparency and accountability for the engagement algorithms and economic business models that monetise misery, as Ian Russell described it. I am confident that the other place will consider those issues carefully, sensitively and intelligently. As I have said, if the Bill is not strengthened, it will fall to the next Labour Government to bring in further legislation. For now, I am pleased to finally be able to support the Online Safety Bill to pass its Third Reading.