The Lord Bishop of Oxford

My Lords, it is a pleasure to follow other noble Lords who have spoken. I too support this key first amendment. Clarity of purpose is essential in any endeavour. The amendment overall sets out the Bill’s aims and enhances what will be vital legislation for the world, I hope, as well as for the United Kingdom. The Government have the very welcome ambition of making Britain the safest country in the world to go online. The OSB is a giant step in that direction.

As has been said, there has been remarkable consensus across the Committee on what further measures may still be needed to improve the Bill and on this first amendment, setting out these seven key purposes. Noble Lords may be aware that in the Christian tradition the number seven is significant: in the medieval period the Church taught the dangers of the seven deadly sins, the merits of the seven virtues and the seven acts of mercy. Please speak to me later if a refresher course is needed.

Amendment 1 identifies seven deadly dangers—I think they are really deadly. They are key risks which we all acknowledge are unwelcome and destructive companions of the new technologies which bring so many benefits: risks to public health or national security; the risk of serious harm to children; the risk of new developments and technologies not currently in scope; the disproportionate risk to those who manifest one or more protected characteristics; risks that occur through poor design; risks to freedom of expression and privacy; and risks that come with low transparency and low accountability. Safety and security are surely among the primary duties of government, especially the safety and security of children and the vulnerable. There is much that is good and helpful in new technology but much that can be oppressive and destructive. These seven risks are real and present dangers. The Bill is needed because of the actual and devastating harm caused to people and communities.

As we have heard, we are living through a period of rapid acceleration in the development of AI. Two days ago, CBS broadcast a remarkable documentary on the latest breakthroughs by Google and Microsoft. The legislation we craft in these weeks needs future-proofing. That can happen only through a clear articulation of purpose so that the framework provided by the Bill continues to evolve under the stewardship of the Secretary of State and of Ofcom.

I have been in dialogue over the past five years with tech companies in a variety of contexts and I have seen a variety of approaches, from the highly responsible in some companies to the frankly cavalier. Good practice, especially in design, needs stronger regulation to become uniform. I really enjoyed the analogy from the noble Lord, Lord Allan, a few minutes ago. We would not tolerate for a moment design and safety standards in aeroplanes, cars or washing machines which had the capacity to cause harm to people, least of all to children. We should not tolerate lesser standards in our algorithms and technologies.

There is no map for the future of technology and its use, even over the rest of this decade, but this amendment provides a compass—a fixed point for navigation in the future, for which future generations will thank this Government and this House. These seven deadly dangers need to be stated clearly in the Bill and, as the noble Baroness, Lady Kidron, said, to be a North Star for both the Secretary of State and Ofcom. I support the amendment.

Baroness Harding of Winscombe (Con)

My Lords, I too support this amendment. I was at a dinner last night in the City for a group of tech founders and investors—about 500 people in a big hotel ballroom, all focused on driving the sort of positive technology growth in this country that I think everyone wants to see. The guest speaker runs a large UK tech business. He commented in his speech, as though it were a revelation, that tech companies need to engage with government because Governments turn out not to speak with one voice, and that understanding what Governments require of tech companies is not always easy. Business needs clarity. Anyone who has run a business, large or small, knows that it is not really clarity in the detail that matters but clarity of purpose, because that is what enables you to lead change: if your people understand why they need to change, they can adjust each of the micro-decisions they take every day to fit the intent behind that purpose. That is why this amendment is so important.

I have worked in this space of online safety for more than a decade, both as a technology leader and in this House. I genuinely do not believe that business is wicked and evil, but it lacks clear direction. The Bill is so important in setting those guardrails that, if we do not make its purpose clear, we should not be surprised when the very businesses which really do want Governments to be clear do not know what we intend.

I suspect that my noble friend the Minister might object to this amendment and say that it is already in the Bill. As others have already said, I actually hope it is; if it is not, we have a different problem. The point of an upfront summary of purpose is to do precisely that: to summarise what is in a Bill that a number of noble Lords have already said is very complicated. The easier and clearer we can make it for every stakeholder to engage with the Bill, the better. If, alternatively, my noble friend the Minister objects to the detailed wording of this amendment, I argue that that simply makes getting this amendment right even more important. If the four noble Lords, who know far more about this subject than I will ever do in a lifetime, and the joint scrutiny committee, which has done such an outstanding job of working through this, have got the purposes of the Bill wrong, what hope is there for the rest of us, let alone those business leaders trying to interpret what the Government want?

That is why it is so important that we put the purposes of the Bill absolutely at the front of the Bill, as in this amendment. If we have misunderstood that in the wording, I urge my noble friend the Minister to come back with wording on Report that truly encapsulates what the Government want.

Baroness Fox of Buckley (Non-Afl)

My Lords, I welcome this opportunity to clarify the purposes of the Bill, but I am not sure that the amendment helps as my North Star. Like the Bill, it throws up as many questions as answers, and I found myself reading it and thinking “What does that word mean?”, so I am not sure that clarity was where I ended up.

It is not a matter of semantics, but in some ways you could say—and certainly this is how it is publicly understood—that the name of the Bill, the Online Safety Bill, gives it its chief purpose. Yet, however well-intentioned, and whatever the press releases say or the headlines print, even a word such as “safety” is slippery, because safety as an end can be problematic in a free society. My worry about the Bill is unintended consequences, and that is not rectified by the amendment. As the Bill assumes safety as the ultimate goal, we as legislators face a dilemma. We have the responsibility of weighing the balance between safety and freedom, but the scales in the Bill are well and truly weighted towards safety at the expense of freedom before we start, and I am not convinced that the amendment weights them back.

Of course, freedom is a risky business, and I always like the opportunity to quote Karl Marx, who said:

“You cannot pluck the rose without its thorns!”

However, it is important to recognise that “freedom” is not a dirty word, and we should avoid saying that risk-free safety is more important than freedom. How would that conversation go with the Ukrainian people, who risk their safety daily for freedom? Moreover, even the language of safety, and indeed what constitutes the harms from which the Bill and the amendments promise to keep the public safe, needs to be considered in the cultural and social context of the norms of 2023. A new therapeutic ethos now posits safety in ever-expanding pseudo-psychological and subjective terms, and this can be a serious threat to free speech. We know that some activists exploit that concept of safety to claim harm when they merely encounter views they disagree with. The language of safety and harm is regularly used to cancel and censor opponents—and the Government know that, so much so that they considered it necessary to introduce the Higher Education (Freedom of Speech) Bill to secure academic freedom against an escalating grievance culture that feigns harm.

Part of the triple shield is a safety duty to remove illegal content, and the amendment talks about speech within the law. That sounds unobjectionable—to my mind it is far better than “legal but harmful”, which has gone—but, while illegality might sound clear and obvious, in some circumstances it is not. That is especially true of any legal limitations on speech. We all know about the debates around hate speech, for example. These things are contentious offline, and even the police, in particular the College of Policing, seem to find that concept of illegality confusing and are, at the moment, in a dispute with the Home Secretary over just that.

Is it really appropriate that this Bill enlists and mandates private social media companies to judge criminality using the incredibly low bar of “reasonable grounds to infer”? It gets even murkier when the legal standard for permissible speech online will be set partly by compelling platforms to remove content that contravenes their terms and conditions, even if those terms of service restrict speech far more than domestic UK law does. Big tech is being incited to censor whatever content it wishes, as long as that fits with its Ts & Cs. Between this and determining, for example, what goes into filters—a whole different issue—one huge irony here, which challenges one of the purposes of the Bill, is that, despite the Government and many of us thinking that this legislation will de-fang and regulate big tech’s powers, it could inadvertently give those same corporates more control over what UK citizens read and view.

Another related irony is that the Bill was, no doubt, designed with Facebook, YouTube, Twitter, Google, TikTok and WhatsApp in mind. However, as the Bill’s own impact assessment notes, 80% of the entities affected have fewer than 10 employees. Many sites, from Wikipedia to Mumsnet, are non-profit or empower their own users to make moderation or policy decisions. These sites, and tens of thousands of British businesses of varying sizes, now face, perhaps unintentionally, an extraordinary amount of regulatory red tape. These onerous duties and requirements might be actionable, if not desirable, for larger platforms, but for smaller ones with limited compliance budgets they could prove a significant, if not fatal, burden. I do not think that is the purpose of the Bill, but it could be an unintended outcome. It also means that regulation could inadvertently act as a barrier to entry for new SMEs, creating an ever more monopolistic stronghold for big tech at the expense of trialling innovations or allowing start-ups to emerge.

I want to finish with the thorny issue of child protection. I have said from the beginning—I mean over the many years since the Bill’s inception—that I would have been much happier if it had been more narrowly titled the Children’s Online Safety Bill, to indicate that protecting children was its sole purpose. That in itself would have been very challenging. Of course, I totally agree with Amendment 1’s intention

“to provide a higher level of protection for children than for adults”.

That is how we treat children and adults offline.