All 11 Viscount Colville of Culross contributions to the Online Safety Act 2023

Wed 1st Feb 2023
Tue 2nd May 2023 - Online Safety Bill, Lords Chamber, Committee stage: Part 1
Tue 9th May 2023 - Online Safety Bill, Lords Chamber, Committee stage: Part 1
Tue 9th May 2023 - Online Safety Bill, Lords Chamber, Committee stage: Part 2
Thu 11th May 2023
Tue 23rd May 2023 - Online Safety Bill, Lords Chamber, Committee stage: Part 1
Thu 25th May 2023 - Online Safety Bill, Lords Chamber, Committee stage: Part 1
Mon 10th Jul 2023 - Online Safety Bill, Lords Chamber, Report stage: Part 1
Wed 12th Jul 2023
Mon 17th Jul 2023
Wed 6th Sep 2023

Online Safety Bill

Viscount Colville of Culross Excerpts
Viscount Colville of Culross (CB)

My Lords, I declare an interest as a series producer of online and linear content. I, like many noble Lords, can hardly believe that this Bill has finally come before your Lordships’ House. It was in 2017, when I first joined the Communications and Digital Committee, that we started to look at online advertising. We went on to look at regulating the internet in three separate inquiries. I am pleased to see some of those recommendations in the Bill.

It is not surprising that I support the words of the present chair of the committee, the noble Baroness, Lady Stowell, when she said that the Secretary of State still has far too many powers over the regulator. Draft codes of practice, in which Ofcom can give the parameters and direction for the tech companies, and the review of their implementation, are going to be central in shaping its terms of service. Generally, in democracies, we are seeing regulators of the media given increasing independence, with Governments limiting themselves to setting up their framework and then allowing them to get on with the task at hand. I fear the Bill is not doing that. I understand that the codes will be laid before Parliament, but I would support Parliament having a much stronger power over the shaping of those regulations.

I know that Labour supports a Select Committee having the power to scrutinise this work, but having served on the Communications and Digital Committee, I fear that the examination of consultations from Ofcom would monopolise its entire work. I support the pre-legislative committee’s suggestion of a Joint Committee of Parliament, whose sole job would be to examine regulations and give input. I will support amendments to this effect.

I am also worried about Clauses 156 and 157. I listened to the Minister when he said that amendments to the Secretary of State’s powers of guidance will be brought before the House and that they will be used only in exceptional circumstances. However, the list of subjects on which I understand the Minister will then be able to intervene is still substantial, ranging from public safety through economic policy to burdens on business. Are the Government prepared to consider further limiting these powers to intervene?

I will also look at risk assessments in the Bill. They need to go further than illegal content and child safety. The empowerment lists in Clause 12 are not risk assessed and do not seem to have enough flexibility for what noble Lords know is an ever-changing world of harms. The volume of online content means that moderation is carried out by algorithms. During the inquiries in which I was involved, we were told repeatedly that algorithms are very bad at distinguishing humour and context when deciding on harmful content. Ensuring that the platforms’ systems moderate correctly is difficult. There was a recent case of that: the farcical blocking by Twitter of the astronomer Dr Mary McIntyre, whose account was suspended because her six-second video of a meteor shower was mistaken by the Twitter algorithms for a porn video. For weeks, she was unable to get any response from Twitter. Such mistakes happen only too frequently. Dr McIntyre’s complaint is only one of millions made every year against the tech companies, for being either too keen or not keen enough to take down content and, in some cases, to block accounts. So the Bill needs to include a risk assessment which looks at the threat to free speech from any changes in those systems. Ofcom needs to be able to create those risk assessments and to produce annual reports which can then be laid before a Joint Committee for Parliament’s consideration. That should be supported by an ombudsman.

I would also like to see the definition of safety duties on platforms to take down illegal content changed from “reasonable grounds” to the platform being aware that the content is “manifestly illegal”—and, if possible, for third parties, such as the NCA, to be involved in the process. That will reduce the chance of chilling free speech online as much as possible.

I am also aware that there has been concern over the duties to protect news publishers and journalistic content. Like other noble Lords, I am worried that the scope in respect of the latter is drawn too widely in the Bill, and that it covers all content. I would support amendments which concentrate on protecting journalism in the public interest. The term “in the public interest” is well known to the courts, is present in Section 4 of the Defamation Act, and is used to great effect to protect journalism which is judged to be in the public interest.

I welcome the Bill after its long journey to this House. I am sure that the hard work of fellow Peers and collaboration with the Minister will ensure that it leaves this House in a clearer, more comprehensive and safer state. The well-being of future generations of internet users in this country depends on us getting it right.

Online Safety Bill

Viscount Colville of Culross Excerpts
Viscount Colville of Culross (CB)

I too wish my noble friend Lady Kidron a happy birthday.

I will speak to Amendment 261. Having sat through the Communications Committee’s inquiries on regulating the internet, it seemed to me that the real problem was the algorithms and the way they operated. We have heard that again and again throughout the course of the Bill. It is no good worrying just about the content, because we do not know what new services will be created by technology. This morning we heard on the radio from the Google AI expert, who said that we have no idea where AI will go or whether it will become cleverer than us; what we need to do is to keep an eye on it. In the Bill, we need to make sure that we are looking at the way technology is being developed and the possible harms it might create. I ask the Minister to include that in his future-proofing of the Bill, because, in the end, this is a very fast-moving world and ecosystem. We all know that what is present now in the digital world might well be completely changed within a few years, and we need to remain cognisant of that.

Lord Clement-Jones (LD)

My Lords, we have already had some very significant birthdays during the course of the Bill, and I suspect that, over many more Committee days, there will be many more happy birthdays to celebrate.

This has been a fascinating debate and the Committee has thrown up some important questions. On the second day, we had a very useful discussion of risk which, as the noble Lord, Lord Russell, mentioned, was prompted by my noble friend Lord Allan. In many ways, we have returned to that theme this afternoon. The noble Baroness, Lady Fox, who I do not always agree with, asked a fair question. As the noble Baroness, Lady Kidron, said, it is important to know what harms we are trying to prevent—that is how we are trying to define risk in the Bill—so that is an absolutely fair question.

The Minister has shown flexibility. Sadly, I was not able to be here for the previous debate, and it is probably because I was not that he conceded the point and agreed to put children’s harms in the Bill. That takes us a long way further, and I hope he will demonstrate that kind of flexibility as we carry on through the Bill.

The noble Lord, Lord Moylan, and I have totally different views about what risk it is appropriate for children to face. I am afraid that I absolutely cannot share his view that there is this level of risk. I do not believe it is about eliminating risk—I do not see how you can—but the Bill should be about preventing online risk to children; it is the absolute core of the Bill.

As the noble Lord, Lord Russell, said, the Joint Committee heard evidence from Frances Haugen about the business model of the social media platforms. We listened to Ian Russell, the father of Molly, talk about the impact of an unguarded internet on his daughter. It is within the power of the social media companies to do something about that; this is not unreasonable.

I was very interested in what the noble Viscount, Lord Colville, said. He is right that this is about algorithms, which, in essence, are what we are trying to get to in all the amendments in this really important group. It is quite possible to tackle algorithms if we have a requirement in the Bill to do so, and that is why I support Amendment 261, which tries to address that.

However, a lot of the rest of the amendments are trying to do exactly the same thing. There is a focus not just on moderating harmful content but on the harmful systems that make digital services systematically unsafe for children. I listened with great interest to what the noble Lord, Lord Russell, said about the 5Rights research which he unpacked. We tend to think that media platforms such as Reddit are relatively harmless but that is clearly not the case. It is very interesting that the use of avatars is becoming quite common in the advertising industry to track where advertisements are ending up—sometimes, on pornography sites. It is really heartening that an organisation such as 5Rights has been doing that and coming up with its conclusions. It is extremely useful for us as policymakers to see the kinds of risks that our children are undertaking.

We were reminded about the origins—way back, it now seems—of the Carnegie duty of care. In a sense, we are trying to make sure that that duty of care covers the systems. We have talked about the functionality and harms in terms of risk assessment, about the child safety duties and about the codes of practice. All those need to be included within this discussion and this framework today to make sure that that duty of care really sticks.

I am not going to go through all the amendments. I support all of them: ensuring functionalities for both types of regulated service, and the duty to consider all harms and not just harmful content. It is absolutely not just about the content but making sure that regulated services have a duty to mitigate the impact of harm in general, not just harms stemming from content.

The noble Baroness, Lady Harding, made a terrific case, which I absolutely support, for making sure that the codes of practice are binding and principle based. At the end of the day, that could be the most important amendment in this group. I must admit that I was quite taken with her description of the Government’s response, which was internally contradictory. It was a very weak response to what I, as a member of the Joint Committee, thought was a very strong and clear recommendation about minimum standards.

This is a really important group of amendments and it would not be a difficult concession for the Government to make. They may wish to phrase things in a different way but we must get to the business case and the operation of the algorithms; otherwise, I do not believe this Bill is going to be effective.

I very much take on board what the noble Viscount said about looking to the future. We do not know very much about some of these new generative AI systems. We certainly do not know a great deal about how algorithms within social media companies operate. We will come, no doubt, to later amendments on the ability of researchers and others to find out more, but transparency was one of the things our Joint Committee was extremely keen on, and this is a start.

Online Safety Bill

Viscount Colville of Culross Excerpts
I very much hope that my noble friend will say what I want to say, which is that, yes, there is an issue and we would like to do something. We understand the motivation here, but this is very much the wrong way of going about it. It is inimical to free speech and it leads to absurd conclusions.
Viscount Colville of Culross (CB)

I support Amendment 44. I am pleased that, as part of the new triple shield, the Government have introduced Clause 12 on “User empowerment duties”, which allow users to protect themselves, not just from abusive posts from other users but from whole areas of content. In the Communications and Digital Committee’s inquiry, we had plenty of evidence from organisations representing minorities and people with special characteristics who are unable adequately to protect themselves from the hate they receive online. I am glad that subsections (10) to (12) recognise specific content and users with special characteristics who are targets of abuse and need to be able to protect themselves, but subsection (3) requests that these features should be

“designed to effectively … reduce the likelihood of the user encountering content”

they want to avoid. I am concerned that “effectively” will be interpreted subjectively by platforms in scope and that each will interpret it differently.

At the moment, it will not be possible for Ofcom to assess how thoroughly the platforms have been providing these empowerment tools of protection for users. If the features are to work, there must be an overview of how effective they are being and how well they are working. When the former Secretary of State, Michelle Donelan, was asked about this, she said that there was nothing in this clause to pin an assessment on. It seems to me that the lists in Clause 12 create plenty of criteria on which to hang an assessment.

The new duties in Clause 12 provide for control tools for users against very specific content that is abusive or incites hatred on the basis of race, ethnicity, religion, disability, sex, gender reassignment or sexual orientation. However, this list is not exhaustive. There will inevitably be areas of content for which users have not been given blocking tools, including pornography, violent material and other material that is subject to control in the offline world.

Not only will the present list for such tools need to be assessed for its thoroughness in allowing users to protect themselves from specific harms, but surely the types of harm from which they need to protect themselves will change over time. Ofcom will need regularly to assess where these harms are and make sure that service providers regularly update their content-blocking tools. Without such an assessment, it will be hard for Ofcom and civil society to understand what the upcoming concerns are with the tools.

The amendment would provide a transparency obligation, which would demand that service providers inform users of the risks present on the platform. Surely this is crucial when users are deciding what to protect themselves from.

The assessment should also look for unintended restrictions on freedom of expression created by the new tools. If the tools are overprotective, they could surely create a bubble and limit users’ access to information that they might find useful. For example, the user might want to block material about eating disorders, but the algorithm might interpret that to mean limiting the user’s access to content on healthy lifestyles or nutrition content. We are also told that the algorithms do not understand irony and humour. When the filters are used to stop content that is abusive or incites hatred on the basis of users’ particular characteristics, they might also remove artistic, humorous or satirical content.

Repeatedly, we are told that the internet creates echo chambers, where users read only like-minded opinions. These bubbles can create an atmosphere where freedom of expression is severely limited and democracy suffers. A freedom of expression element to the assessment would also, in these circumstances, be critical. We are told that the tech platforms often do not know what their algorithms do and, not surprisingly, they often evolve beyond their original intentions. Assessments on the tools demanded by Clause 12 need to be carefully investigated to ensure that they are keeping up to date with the trends of abuse on the internet but also for the unintended consequences they might create, curbing freedom of expression.

Throughout the Bill, there is a balancing act between freedom of expression and protection from abuse. The user empowerment tools are potentially very powerful, and neither the service providers, the regulators nor the Government know what their effects will be. It is beholden upon the Government to introduce an assessment to check regularly how the user empowerment duties are working; otherwise, how can they be updated, and how can Ofcom discover what content is being unintentionally controlled? I urge the Minister, in the name of common sense, to ensure that these powerful tools unleashed by the Bill will not be misused or become outdated in a fast-changing digital world.

Baroness Kidron (CB)

My Lords, I thank the noble Lord, Lord Moylan, for his words—I thought I was experiencing time travel there—and am sympathetic to many of the issues that he has raised, although I think that some of the other amendments in the group tackle those issues in a slightly different way.

I support Amendments 44 and 158 in the name of the right reverend Prelate the Bishop of Oxford. Requiring a post-rollout assessment to ensure that the triple shield acts as we are told it will seems to be a classic part of any regulatory regime that is fit for purpose: it needs to assess whether the system is indeed working. The triple shield is an entirely new concept, and none of the burgeoning regulatory systems around the world is taking this approach, so I hope that both the Government and Ofcom welcome this very targeted and important addition to the Bill.

I will also say a few words about Amendments 154 and 218. It seems to me that, in moving away from legal but harmful—which as a member of the pre-legislative committee I supported, under certain conditionality that has not been met, but none the less I did support it—not enough time and thought have been given to the implications of that. I do not understand, and would be grateful to the Minister if he could help me understand, how Ofcom is to determine whether a company has met its own terms and conditions—and by any means, not only by the means of a risk assessment.

I want to make a point that the noble Baroness, Lady Healy, made the other day—but I want to make it again. Taking legal but harmful out and having no assessment of whether a company has met its general safety duties leaves the child safety duties as an island. They used to be something that was added on to a general system of safety; now they are the first and only port of call. Again, because of the way that legal but harmful fell out of the Bill, I am not sure whether we have totally understood how the child risk assessments sit without a generally cleaned up or risk-assessed digital environment.

Finally, I will speak in support of Amendment 160, which would have Ofcom say what “adequate and appropriate” terms are. To a large degree, that is my approach to the problem that the noble Lord, Lord Moylan, spoke about: let Parliament and the regulator determine what we want to see—as was said on the data protection system, that is how it is—and let us have minimum standards that we can rightly expect, based on UK law, as the noble Lord suggested.

I am not against the triple shield per se, but it radically replaced an entire regime of assessment, enforcement and review. I think that some of the provisions in this group really beg the Government’s attention, in order to make sure that there are no gaping holes in the regime.

Online Safety Bill

Viscount Colville of Culross Excerpts
Baroness Bull (CB)

My Lords, I will speak to the amendments in the name of the noble Baroness, Lady Stowell, to which I have added my name. As we heard, the amendments originally sat in a different group, on the treatment of legal content accessed by adults. Noble Lords will be aware from my previous comments that my primary focus for the Bill has been on the absence of adequate provisions for the protection of adults, particularly those who are most vulnerable. These concerns underpin the brief remarks I will make.

The fundamental challenge at the heart of the Bill is the need to balance protection with the right to freedom of expression. The challenge, of course, is how. The noble Baroness’s amendments seek to find that balance. They go beyond the requirements on transparency reporting in Clause 68 in several ways. Amendment 46 would provide a duty for category 1 services to maintain an up-to-date document for users of the service, ensuring that users understand the risks they face and how, for instance, user empowerment tools can be used to help mitigate these risks. It also provides a duty for category 1 services to update their risk assessments before making any “significant change” to the design or operation of their service. This would force category 1 services to consider the impact of changes on users’ safety and make users aware of changes before they happen, so that they can take any steps necessary to protect themselves and prepare for them. Amendment 47 provides additional transparency by providing a duty for category 1 services to release a public statement of the findings of the most recent risk assessment, which includes any impact on freedom of expression.

The grouping of these amendments is an indication, if any of us were in doubt, of the complexity of balancing the rights of one group against the rights of another. Regardless of the groupings, I hope that the Minister takes note of the breadth and depth of concerns, as well as the willingness across all sides of the Committee to work together on a solution to this important issue.

Viscount Colville of Culross (CB)

My Lords, I put my name to Amendment 51, which is also in the name of the noble Lords, Lord Stevenson and Lord McNally. I have done so because I think Clause 15 is too broad and too vague. I declare an interest, having been a journalist for my entire career. I am currently a series producer of a series of programmes on Ukraine.

This clause allows journalism on the internet to be defined simply as the dissemination of information, which surely covers all posts on the internet. Anyone can claim that they are a journalist if that is the definition. My concern is that it will make a nonsense of the Bill if all content is covered as journalism.

I support the aims behind the clause to protect journalism in line with Article 10. However, I am also aware of the second part of Article 10, which warns that freedom of speech must be balanced by duties and responsibilities in a democratic society. This amendment aims to hone the definition of journalism to that which is in the public interest. In doing so, I hope it will respond to the demands of the second part of Article 10.

It has never been more important to create this definition of journalism in the public interest. We are seeing legacy journalism of newspapers and linear television being supplanted by digital journalism. Both legacy and new journalism need to be protected. This can be a single citizen journalist, or an organisation like Bellingcat, which draws on millions of digital datapoints to create astonishing digital journalism to prove things such as that Russian separatist fighters shot down flight MH17 over Ukraine.

The Government’s view is that the definition of “in the public interest” is too vague to be useful to tech platforms when they are systematically filtering through possible journalistic content that needs to be protected. I do not agree. The term “public interest” is well known to the courts from the Defamation Act 2013. The law covers the motivation of a journalist, but does not go on to define the content of journalism to prove that it is in the public interest.

--- Later in debate ---
Amendment 51 in the name of the noble Lord, Lord Stevenson of Balmacara, seeks to change the duty of category 1 services to protect journalistic content so it applies only to journalism which they have judged to be in the public interest. This would delegate an inappropriate amount of power to platforms. Category 1 platforms are not in a position to decide what information is in the interests of the British public. Requiring them to do so would undermine why we introduced the Clause 15 duties—
Viscount Colville of Culross (CB)

Why would it not be possible for us to try to define what the public interest might be, and not leave it to the platforms to do so?

Lord Parkinson of Whitley Bay (Con)

I ask the noble Viscount to bear with me. I will come on to this a bit later. I do not think it is for category 1 platforms to do so.

We have introduced Clause 15 to reduce the powers that the major technology companies have over what journalism is made available to UK users. Accordingly, Clause 15 requires category 1 providers to set clear terms of service which explain how they take the importance of journalistic content into account when making their moderation decisions. These duties will not stop platforms removing journalistic content. Platforms have the flexibility to set their own journalism policies, but they must enforce them consistently. They will not be able to remove journalistic content arbitrarily. This will ensure that platforms give all users of journalism due process when making content moderation decisions. Amendment 51 would mean that, where platforms subjectively reached a decision that journalism was not conducive to the public good, they would not have to give it due process. Platforms could continue to treat important journalistic content arbitrarily where they decided that this content was not in the public interest of the UK.

In his first remarks on this group the noble Lord, Lord Stevenson, engaged with the question of how companies will identify content of democratic importance, which is content that seeks to contribute to democratic political debate in the UK at a national and local level. It will be broad enough to cover all political debates, including grass-roots campaigns and smaller parties. While platforms will have some discretion about what their policies in this area are, the policies will need to ensure that platforms are balancing the importance of protecting democratic content with their safety duties. For example, platforms will need to consider whether the public interest in seeing some types of content outweighs the potential harm it could cause. This will require companies to set out in their terms of service how they will treat different types of content and the systems and processes they have in place to protect such content.

Amendments 57 and 62, in the name of my noble friend Lord Kamall, seek to impose new duties on companies to protect a broader range of users’ rights, as well as to pay particular attention to the freedom of expression of users with protected characteristics. As previously set out, services will have duties to safeguard the freedom of expression of all users, regardless of their characteristics. Moreover, UK providers have existing duties under the Equality Act 2010 not to discriminate against people with characteristics which are protected in that Act. Given the range of rights included in Amendment 57, it is not clear what this would require from service providers in practice, and their relevance to service providers would likely vary between different rights.

Amendment 60, in the name of the noble Lord, Lord Clement-Jones, and Amendment 88, in the name of the noble Lord, Lord Stevenson, probe whether references to privacy law in Clauses 18 and 28 include Article 8 of the European Convention on Human Rights. That convention applies to member states which are signatories. Article 8(1) requires signatories to ensure the right to respect for private and family life, home and correspondence, subject to limited derogations that must be in accordance with the law and necessary in a democratic society. The obligations flowing from Article 8 do not apply to individuals or to private companies and it would not make sense for these obligations to be applied in this way, given that states which are signatories will need to decide under Article 8(2) which restrictions on the Article 8(1) right they need to impose. It would not be appropriate or possible for private companies to make decisions on such restrictions.

Providers will, however, need to comply with all UK statutory and common-law provisions relating to privacy, and must therefore implement safeguards for user privacy when meeting their safety duties. More broadly, Ofcom is bound by the Human Rights Act 1998 and must therefore uphold Article 8 of the European Convention on Human Rights when implementing the Bill’s regime.

Online Safety Bill

Viscount Colville of Culross Excerpts
Lord Lucas (Con)

My Lords, I also have a pair of amendments in this group. I am patron of a charity called JobsAware, which specialises in dealing with fraudulent job advertisements. It is an excellent example of collaboration between government and industry in dealing with a problem such as this. Going forward, though, they will be much more effective if there is a decent flow of information and if this Bill provides the mechanism for that. I would be very grateful if my noble friend would agree to a meeting, between Committee and Report, to discuss how that might best be achieved within the construct of this Bill.

It is not just the authorities who are able to deter these sorts of things from happening. If there is knowledge spread through reputable networks about who is doing these things, it becomes much easier for other people to stop them happening. At the moment, the experience of using the internet must bear some similarity to walking down a Victorian street in London with your purse open. It really is all our responsibility to try to do something about this, since we now live so much of our life online. I very much look forward to my noble friend’s response.

Viscount Colville of Culross (CB)

My Lords, I had the great privilege of serving as a member of this House’s Fraud Act 2006 and Digital Fraud Committee under the excellent chairing of the noble Baroness, Lady Morgan. She has already told us of the ghastly effects that fraud has on individuals and indeed its adverse effects on businesses. We heard really dramatic statistics, such as when Action Fraud told us that 80% of fraud is cyber enabled.

Many of us here will have been victims of fraud—I have been a victim—or know people who have been victims of fraud. I was therefore very pleased when the Government introduced the fraudulent advertising provisions into the Bill, which will go some way to reducing the prevalence of online fraud. It seems to me that it requires special attention, which is what these amendments should do.

We heard in our inquiry about the problems that category 1 companies had in taking down fraudulent advertisements quickly. Philip Milton, the public policy manager at Meta, told us that it takes between 24 and 48 hours to review possibly harmful content after it has been flagged to the company. He recognised that, due to the deceptive nature of fraudulent advertising, Meta’s systems do not always recognise that advertising is fraudulent and, therefore, take-down rates would be variable. That is one of the most sophisticated tech platforms—if it has difficulties, just imagine the difficulty that other companies have in both recognising and taking down fraudulent advertising.

Again and again, the Bill recognises the difficulties that platforms have in systematising the protections it provides. Fraud is ever changing in nature and massively increasing—particularly fraudulent advertising. It is absolutely essential that the highest possible standards of transparency are placed upon the tech companies in reporting their response to fraudulent advertising. Both Ofcom and users need to be assured not only that the companies have the most effective reporting systems but, just as importantly, that they are transparent enough for us to check how well they are performing.

To do this, the obligations on platforms must go beyond the transparency reporting requirements in the Bill. These amendments would ensure that those obligations include providing information on the incidence of fraudulent advertising, in line with other types of priority illegal content. These increased obligations are part of checking the effectiveness of the Bill once it is implemented.

The noble Baroness, Lady Stowell, told us on the fifth day of Committee, when talking about the risk-assessment amendments she had tabled:

“They are about ensuring transparency to give all users confidence”.—[Official Report, 9/5/23; col. 1755.]


Across the Bill, noble Lords have repeatedly stated that there needs to be a range of ways to judge how effectively the protections provided are working. I suggest to noble Lords that these amendments are important attempts to help make the Bill more accountable and provide the data to future-proof the harms it is trying to deal with. As we said in the committee report:

“Without sufficient futureproofing, technology will most likely continue to create new opportunities for fraudsters to target victims”.


I ask the Minister to at least look at some of these amendments favourably.

Baroness Kidron (CB)

My Lords, I shall say very briefly in support of these amendments that in 2017, the 5Rights Foundation, of which I am the chair, published the Digital Childhood report, which in a way was the thing that put the organisation on the map. The report looked at the evolving capacity of children through childhood, what technology they were using, what happened to them and what the impact was. We are about to release the report again, in an updated version, and one of the things that is most striking is the introduction of fraud into children’s lives. At the point at which they are evolving into autonomous people, when they want to buy presents for their friends and parents on their own, they are experiencing what the noble Baroness, Lady Morgan, expressed as embarrassment, loss of trust and a sense of deserting confidence—I think that is probably the phrase. So I just want to put on the record that this is a problem for children also.

Online Safety Bill

I also hope that the Government support Parliament in enhancing its oversight of the regulators in which so much power is being vested. However expert, independent and professional they may be—I note that my noble friend Lord Grade is not in the Chamber today, as I believe he is overseas this week, but no one respects and admires my noble friend more than I do, and I am not concerned in any way about the expertise and professionalism of Ofcom—none the less we are in a situation where they are being vested with a huge amount of power and we need to make sure that the oversight of them is right. Even if I do not support that which is specifically put forward by the noble Lord, Lord Stevenson, this is an area where we need to move forward but we need the Government to support us in doing so if we are going to make it happen. I look forward to what my noble friend has to say in response to this group.
Viscount Colville of Culross (CB)

My Lords, I have put my name to Amendments 113, 114, 117, 118, 120 and 257. As the noble Baroness, Lady Stowell, has said, it is crucial that Ofcom both has and is seen to have complete independence from political interference when exercising its duty as a regulator.

On Ofcom’s website there is an article titled “Why Independence Matters in Regulating TV and Radio”—for the purposes of the Bill, I suggest that we add “Online”. It states:

“We investigate following our published procedures which contain clear, transparent and fair processes. It’s vital that our decisions are always reached independently and impartially”.


I am sure there are few Members of the Committee who would disagree with that statement. That sentiment is supported by a recent UNESCO conference to create global guidance for online safety regulation, whose concluding statement said that

“an independent authority is better placed to act impartially in the public interest and to avoid undue influence from political or industry interests”.

As the noble Baroness, Lady Stowell, has said, that is what successive Governments have striven to do with Ofcom’s regulation of broadcast and radio. Now the Government and Parliament must succeed in doing the same by setting up this Bill to ensure absolute independence for Ofcom in regulating the digital space.

The codes of practice drawn up by Ofcom will be central to the guidance and parameters set by the media regulator for the tech companies, so it is essential that the regulator, when drawing them up, can act independently of political interference. In my view, and that of many noble Lords, Clause 39 does not provide that level of independence. No impartial observer could think that the clause as drafted allows Ofcom the independence it needs to shape the limits of the tech platforms’ content. By giving the Secretary of State permission to interfere continually and persistently in Ofcom’s work, it poses a danger to freedom of expression in our country.

Amendments 114 and 115 would ensure a badly needed reinforcement of the regulator’s independence. I see why the Minister would want a Secretary of State to have the right to direct the regulator, but I ask him to bear in mind that it will not always be a Minister he supports who is doing the directing. In those circumstances, surely he would prefer a Secretary of State to observe or have regard to the views on the draft codes of practice. Likewise, the endless ping-pong envisaged by Clause 39(7) and (9) allows huge political pressure and interference to be placed on the regulator. This would not be allowed in broadcast regulation, so why is it allowed for online regulation, which is already the dominant medium and can get only more dominant and more important?

Amendment 114 is crucial. Clause 39(1)(a), allowing the Minister’s direction to cover public policy, covers almost everything and is impossibly broad and vague. If the Government want an independent regulator, can the Minister explain how this power would facilitate that goal? I am unsure of how the Government will approach this issue, but I am told that they want to recognise the concerns about an overmighty Secretary of State by bringing forward their own amendment, limiting the powers of direction to specific policy areas. Can the Minister confirm that he is looking at using the same areas as in the Communications Act 2003, which are

“national security … relations with the government of a country … compliance with international obligations of the United Kingdom … the safety of the public or of public health”?

I worry about any government amendment which might go further and cover economic policy and burden to business. I understand that the Government would want to respond to the concerns that this Bill might create a burden on business and therefore could direct Ofcom to ease regulations in these areas. However, if this area is to be included, surely it will create a lobbyists’ charter. We all know how effective the big tech companies have been at lobbying the Government and slowing down the process of shaping this Bill. The Minister has only to talk to some of the Members who have helped to shape the Bill to know the determination and influence of those lobbying companies.

To allow the DCMS Secretary of State to direct Ofcom continuously to modify the codes of practice until they are no longer a burden to business would dramatically dilute the power and independence of the UK’s world-respected media regulator. Surely this is not what the people of Britain would want; the Minister should not want it either. The words “vague” and “broad” are used repeatedly by freedom of speech campaigners when looking at the powers of political interference in the Bill.

When the draft Bill came out, I was appalled by the extraordinary powers that it gave the Secretary of State to modify the content covered by “legal but harmful”, and I am grateful to the Government for responding to the Joint Committee and many other people’s concerns about this potentially authoritarian power. Clause 39 is not in the same league, but for all of us who want to ensure that Ministers do not have the power to interfere in the independence of Ofcom, I ask the Minister to accept the well-thought-through solutions represented by these amendments and supported by all Benches. I also support the request made by the noble Baroness, Lady Stowell, that Parliament should be involved in the oversight of Ofcom. I ask the Minister to respond to these widely supported amendments, either by accepting them or by tabling amendments of his own which guarantee the independence of the regulator.

Online Safety Bill

Baroness Berridge (Con)

I, too, thank my noble friend the Government Whip. I apologise too if I have spoken out of discourtesy in the Committee: I was not sure whose name was on which amendment, so I will continue.

Physically, I am, of course, working in my home. If that behaviour had happened in the office, it would be an offence, an assault: “intentional or reckless application of unlawful force to another person”. It will not be an offence in the metaverse and it is probably not harassment because it is not a course of conduct.

Although the basic definition of user-to-user content covers the metaverse, as does encountering, as has been mentioned in relation to content under Clause 207, which is broad enough to cover the haptic suits, the restriction to illegal content could be problematic, as the metaverse is a complex of live interactions that mimics real life and such behaviours, including criminal ones. Also, the avatar of an adult could sexually assault the avatar of a child in the metaverse, and with haptic technologies this would not be just a virtual experience. Potentially even more fundamentally than Amendment 125, the Bill is premised on the internet being a solely virtual environment when it comes to content that can harm. But what I am seeking to outline is that conduct can also harm.

I recognise that we cannot catch everything in this Bill at this moment. This research is literally hot off the press; it is only a few weeks old. At the very least, it highlights the need for future-proofing. I am aware that some of the issues I have highlighted about the fundamental difference between conduct and content refer to clauses noble Lords may already have debated. However, I believe that these points are significant. It is just happenstance that the research came out and is hot off the press. I would be grateful if the Minister would meet the Dawes Centre urgently to consider whether there are further changes the Government need to make to the Bill to ensure that it covers the harms I have outlined.

Viscount Colville of Culross (CB)

My Lords, I have put my name to Amendments 195, 239 and 263. I also strongly support Amendment 125 in the name of my noble friend Lady Kidron.

During this Committee there have been many claims that a group of amendments is the most significant, but I believe that this group truly is. This debate comes after the Prime Minister and the Secretary of State for Science, Innovation and Technology met the heads of leading AI research companies in Downing Street. The joint statement said:

“They discussed safety measures … to manage risks”


and called for

“international collaboration on AI safety and regulation”.

Surely this Bill is the obvious place to start responding to those concerns. If we do not future-proof this Bill against the changes in digital technology, which arrive at an ever-faster rate, it will be obsolete even before it is implemented.

My greatest concern is the arrival of AI. The noble Baroness, Lady Harding, has reminded us of the warnings from the godfather of AI, Geoffrey Hinton. If he is not listened to, who on earth should we be listening to? I wholeheartedly support Amendment 125. Machine-generated content is present in so much of what we see on the internet, and its presence is increasing daily. It is the future, and it must be within scope of this Bill. I am appalled by the examples that the noble Baroness, Lady Harding, has brought before us.

In the Communications and Digital Committee inquiry on regulating the internet, we decided that horizon scanning was so important that we called for a digital authority to be created which would look for harms developing in the digital world, assess how serious a threat they posed to users and develop a regulated response. The Government did not take up these suggestions. Instead, Ofcom has been given the onerous task of enforcing the triple shield which under this Bill will protect users to different degrees into the future.

Amendment 195 in the name of the right reverend Prelate the Bishop of Oxford will ensure that Ofcom has knowledge of how well the triple shield is working, which must be essential. Surveys of thousands of users undertaken by companies such as Kantar give an invaluable snapshot of what is concerning users now. These must be fed into research by Ofcom to ensure that future developments across the digital space are monitored, updated and brought to the attention of the Secretary of State and Parliament on a regular basis.

Amendment 195 will reveal trends in harms which might not be picked up by Ofcom under the present regime. It will look at the risk arising for individuals from the operation of Part 3 services. Clause 12 on user empowerment duties has a list of content and characteristics from which users can protect themselves. However, the characteristics for which or content with which users can be abused will change over time and these changes need to be researched, anticipated and implemented.

This Bill has proved in its long years of gestation that it takes time to change legislation, while changes on the internet take just minutes or are already here. The regime set up by these future-proofing amendments will at least go some way to protecting users from these fast-evolving harms. I stress to your Lordships’ Committee that this is very much precautionary work. It should be used to inform the Secretary of State of harms which are coming down the line. I do not think it will give power automatically to expand the scope of harms covered by the regime.

Amendment 239 inserts a new clause for an Ofcom future management of risks review. This will help feed into the Secretary of State’s review regime set out in Clause 159. Clause 159(3)(a) currently looks at ensuring that regulated services are operating using systems and processes which, so far as relevant, are minimising the risk of harms to individuals. The wording appears to mean that the Secretary of State will be viewing all harms to individuals. I would be grateful if the Minister could explain to the Committee the scope of the harms set out in Clause 159(3)(a)(i). Are they meant to cover only the harms of illegality and harms to children, or are they part of a wider examination of the harms regime to see whether it needs to be contracted or expanded? I would welcome an explanation of the scope of the Secretary of State’s review.

The real aim of Amendment 263 is to ensure that the Secretary of State looks at research work carried out by Ofcom. I am not sure how politicians will come to any conclusions in the Clause 159 review unless they are required to look at all the research published by Ofcom on future risk. I would like the Minister to explain what research the Secretary of State would rely on for this review unless this amendment is accepted. I hope Amendment 263 will also encourage the Secretary of State to look at possible harms not only from content, but also from the means of delivering this content.

This aim was the whole point of Amendment 261, which has already been debated. However, it needs to be borne in mind when considering that harms come not just from content, but also from the machine technology which delivers it. Every day we read about new developments and threats posed by a fast-evolving internet. Today it is concerns about ChatGPT and the race for the most sophisticated artificial intelligence. The amendments in this group will provide much-needed reinforcement to ensure that the Online Safety Bill remains a beacon for continuing safety online.

The Lord Bishop of Chelmsford

My Lords, I shall speak in favour of Amendments 195, 239 and 263, tabled in the names of my right reverend friend the Bishop of Oxford, the noble Lord, Lord Clement-Jones, and the noble Viscount, Lord Colville of Culross, who I thank for his comments.

My right reverend friend the Bishop of Oxford regrets that he is unable to attend today’s debate. I know he would have liked to be here. My right reverend friend tells me that the Government’s Centre for Data Ethics and Innovation, of which he was a founding member, devoted considerable resource to horizon scanning in its early years, looking for the ways in which AI and tech would develop across the world. The centre’s analysis reflected a single common thread: new technologies are developing faster than we can track them and they bring with them the risk of significant harms.

This Bill has also changed over time. It now sets out two main duties: the illegal content duty and the children duty. These duties have been examined and debated for years, including by the joint scrutiny committee. They are refined and comprehensive. Risk assessments are required to be “suitable and sufficient”, which is traditional language from 20 years of risk-based regulation. It ensures that the duties are fit for purpose and proportionate. The duties must be kept up to date and in line with any service changes. Recent government amendments now helpfully require companies to report to Ofcom and publish summaries of their findings.

However, in respect of harms to adults, in November last year the Government suddenly took a different tack. They introduced two new groups of duties as part of a novel triple shield framework, supplementing the duty to remove illegal harms with a duty to comply with their own terms of service and a duty to provide user empowerment tools. These new duties are quite different in style to the illegal content and children duties. They have not benefited from the prior years of consultation.

As this Committee’s debates have frequently noted, there is no clear requirement on companies to assess in the round how effective their implementation of these new duties is or to keep track of their developments. The Government have changed this Bill’s system for protecting adults online late in the day, but the need for risk assessments, in whatever system the Bill is designed around, has been repeated again and again across Committee days. Even at the close of day eight on Tuesday, the noble Lords, Lord Allan of Hallam and Lord Clement-Jones, referred explicitly to the role of risk assessment in validating the Bill’s systems of press reforms. Surely this persistence across days and groups of debate reflects the systemically pivotal role of risk assessments in what is, after all, meant to be a systems and processes rather than a content-orientated Bill.

But it seems that many people on many sides of this Committee believe that an important gap in risk assessment for harms to adults has been introduced by these late changes to the Bill. My colleague the right reverend Prelate is keen that I thank Carnegie UK for its work across the Bill, including these amendments. It notes:

“Harms to adults which might trickle down to become harms to children are not assessed in the current Bill”.


The forward-looking parts of its regime need to be strengthened to ensure that Parliament and the Secretary of State review new ways in which harms manifesting as technology race along, and to ensure that they then have the right advice for deciding what to do about them. To improve that advice, Ofcom needs to risk assess the future and then to report its findings.

Online Safety Bill

As the children’s charity Barnardo’s said—and I declare an interest as vice president—children do not have a voice. I feel that we have a responsibility to protect them, and we must expect the Government to take children into consideration and show that they have a holistic view about protecting them from harm. I hope that the Government will embrace these amendments by continuing to listen to common sense and will support them.
Viscount Colville of Culross (CB)

My Lords, it is a great pleasure to follow the veteran campaigner on this issue, the noble Baroness, Lady Benjamin. I, too, rise briefly to support Amendments 35 to 37A, 85 and 240 in the name of my noble friend Lady Kidron.

In Committee, I put my name to amendments that aimed to produce risk assessments on harms to future-proof the Bill. Sadly, they were thought unnecessary by the Government. Now the Minister has another chance to make sure that Ofcom will be able to assess and respond to potential harms from one of the fastest-changing sectors in the world in order to protect our children. I praise the Minister for having come so far but, if this Bill is to stand the test of time, we will have to be prepared for the ever-changing mechanisms that deliver harmful content to children. Noble Lords have already told the House about the fast-changing algorithms and the potential of AI to create harms. Many tech companies do not even understand how their algorithms work; a risk assessment of their functions would ensure that they found out soon enough.

In the Communications and Digital Select Committee inquiry into regulating the internet, we recommended that, because the changes in digital delivery and technology were happening so fast, a specific body needed to be set up to horizon scan. In these amendments, we would build these technological changes into this Bill’s regulatory mechanism to safeguard our children in future. I hope that noble Lords will support the amendment.

Lord Bethell (Con)

My Lords, I also support the amendments from the noble Baroness, Lady Kidron. It is relatively easy to stand here and make the case for age verification for porn: it is such a black and white subject and it is disgusting pornography, so of course children should be protected from it. Making the case for the design of the attention economy is more subtle and complex—but it is incredibly important, because it is the attention economy that is driving our children to extreme behaviours.

I know this from my own personal life; I enjoy incredibly lovely online content about wild-water swimming, and I have been taken down a death spiral towards ice swimming and have become a compulsive swimmer in extreme temperatures, partly because of the addiction generated by online algorithms. This is a lovely and heart-warming anecdote to give noble Lords a sense of the impact of algorithms on my own imagination, but my children are prone to much more dangerous experiences. The plasticity of their brains is so much more subtle and malleable; they are, like other children, open to all sorts of addiction, depression, sleeplessness and danger from predators. That is the economy that we are looking at.

I point noble Lords to the intervention from the surgeon general in America, Admiral Vivek Murthy—an incredibly impressive individual whom I came across during the pandemic. His 25-page report on the impact of social media on the young of America is incredibly eye-opening reading. Some 95% of American children have come across social media, and one-third of them see it almost constantly, he says. He attributes to the impact of social media depression, anxiety, compulsive behaviours and sleeplessness, as well as what he calls the severe impact on the neurological development of a generation. He calls for a complete bar on all social media for the under-13s and says that his own children will never get anywhere near a mobile phone until they are 16. That is the state of the attention economy that the noble Baroness, Lady Kidron, talks about, and that is the state of the design of our online applications. It is not the content itself but the way in which it is presented to our children, and it traps their imagination in the kind of destructive content that can lead them into all kinds of harms.

Admiral Murthy calls on legislators to act today—and that was followed on the same day by a commitment from the White House to look into this and table legislation to address the kind of design features that the noble Baroness, Lady Kidron, is looking at. I think that we should listen to the surgeon general in America and step up to the challenge that he has given to American legislators. I am enormously grateful to my noble friend the Minister for the incredible amount of work that he has already done to try to bridge the gap in this matter, but there is a way to go. Like my noble friend Lady Harding, I hope very much indeed that he will be able to tell us that he has been able to find a way across the gap, or else I shall be supporting the noble Baroness, Lady Kidron, in her amendment.

Online Safety Bill

Overall, we need accountable decision-makers, not unaccountable regulators, and we need them to be subject to parliamentary scrutiny. That is the burden of my argument and the effect of my amendments. I hope that they will command the support of the House.
Viscount Colville of Culross (CB)

My Lords, the codes of practice are among the most important documents that Ofcom will produce as a result of the Bill—in effect, deciding what content we, the users of the internet, will see. The Government’s right to modify these drafts affects us all, so it is absolutely essential that the codes are trusted.

I, too, welcome the Government’s Amendments 134 to 138, which are a huge improvement on the Clause 39 that was presented in Committee. I am especially grateful that the Government have not proceeded with including economic conditions as a reason for the Secretary of State to modify draft codes, which the noble Baroness, Lady Harding, pointed out in Committee would be very damaging. But I would like the Minister to go further, which is why I put my name to Amendments 139, 140, 144 and 145.

Amendment 139 is particularly important at the moment. My fear concerns the opt-out from publishing the Secretary of State’s directions to Ofcom to modify the draft codes, which would allow them to be agreed behind closed doors between the Government and the regulator. This should not be allowed to happen, and it would come at a time when trust in the Government is low and there is a feeling that too many decisions affecting us all are taken without our knowledge. Surely it is right that there should be as much transparency as possible in exposing the pressure that the Minister is placing on the regulator. I hope that, if this amendment is adopted, it will allow Parliament to shine the bright light of transparency on the entire process, which is in danger of becoming opaque.

I am sure that no one wants a repeat of what happened under Section 94 of the Telecommunications Act 1984, which gave the Secretary of State power to give directions of a “general character” to anyone, in the “interests of national security” or international relations, as long as they did not disclose important information to Parliament. The Minister’s power to operate in total secrecy, without any accountability to Parliament, was seen by many as wrong and undemocratic. It was subsequently repealed. Amendments 139 and 140 will prevent the creation of a similar problem.

Likewise, I support Amendment 144, which builds on the previous amendments, as another brake on the control of the Secretary of State over this important area of regulations. Noble Lords in this House know how much the Government dislike legislative ping-pong—which we will see later this evening, I suspect. I ask the Minister to transfer this dislike to limiting ping-pong between the Government and the regulator over the drafting of codes of practice. It would also prevent the Secretary of State or civil servants expanding their control of the draft codes of practice from initial parameters to slightly wider sets of parameters each time that they are returned to the Minister for consideration. It will force the civil servants and the Secretary of State to make a judgment on the limitation of content and ensure that they stick to it. As it is, the Secretary of State has two bites of the cherry. They are involved in the original shaping of the draft codes of practice and then they can respond to Ofcom’s formulation. I hope the Minister would agree that it is sensible to stop this process from carrying on indefinitely. I want the users of the digital world to have full faith that the control of online content they see is above board—and not the result of secretive government overreach.

Baroness Harding of Winscombe (Con)

My Lords, not for the first time I find myself in quite a different place from my noble friend Lord Moylan. Before I go through some detailed comments on the amendments, I want to reflect that at the root of our disagreement is a fundamental view about how serious online safety is. The logical corollary of my noble friend’s argument is that all decisions should be taken by Secretaries of State and scrutinised in Parliament. We do not do that in other technical areas of health and safety in the physical world and we should not do that in the digital world, which is why I take such a different view—

--- Later in debate ---
Lord McNally (LD)

My Lords, my name is also to this amendment. I am moved by a phrase used by the noble Lord, Lord Stevenson, on Monday; he said the passage of this Bill has been a “series of conversations”. So it has been. The way the Minister has engaged with the House on many of the concerns that the Bill tries to cover has been greatly to his credit.

It is somewhat unknown how much the new technologies will impact on our democracy, our privacy and the safety of our children, although they have all been discussed with great thoroughness. That is why the opt-out for recognised news publishers is something of a puzzle, unless you assume that the Government have caved in to pressure from that sector. Why should it be given this opt-out? It is partly because, if you ask the press to take responsibility in any way, it becomes like Violet Elizabeth Bott in the Just William stories; it “thkweems and thkweems”—usually led by the noble Lord, Lord Black, whom I am glad to see in his place—and talks about press freedom.

My skin in this game is that I was the Minister in the Lords when the Leveson inquiry was under way and when we took action to try to implement its findings. It is interesting that at that point there was cross-party agreement in both Houses on how to implement them. I advise anybody intending to go into coalitions in future not to take the Conservative Party’s assurances on such matters totally at face value, as that cross-party agreement to implement Leveson was reneged on by the Conservative Party under pressure from the main newspaper publishers.

It was a tragedy, because the “series of conversations” that the noble Lord, Lord Stevenson, referred to will be ongoing. We will not let the press off the hook, no matter how much it wields its power. It is just over 90 years since Stanley Baldwin’s famous accusation of

“power without responsibility—the prerogative of the harlot throughout the ages”.

It is just over 30 years since David Mellor warned the press that it was in the “last chance saloon” and just over 10 years since Rupert Murdoch said that appearing before the Leveson inquiry, with a curious choice of language, was

“the most humble day of my life”.

Of course, like water off a duck’s back, once the pressure was off and the deal had been done with the Conservative Party, the press could carry on its own merry way.

It was a tragedy too because the Leveson settlement—as I think the PRP and Impress have proved—works perfectly well. It is neither state controlled nor an imposition on a free press. Like the noble Lord, Lord Lipsey, I greatly resent the idea that this is somehow an attempt to impose on a free press. It is an attempt to get the press to help the whole of our democracy and make things work properly, just as this Bill attempts to do.

Someone mentioned Rupert Murdoch’s recent summer party. The Prime Minister was not the only one who went—so did the Leader of the Opposition. I like to think that Mr Attlee would not have gone. I am not sure that my old boss, Jim Callaghan, would have gone. I do not think that either would have flown halfway around the world, as Tony Blair did, to treat with him. The truth is that, over the last decade or so, in some ways the situation has got worse. Politicians are more cowed by the press. When I was a Minister and we proposed some reasonably modest piece of radical change, I was told by my Conservative colleague, “We’ll not get that through; the Daily Mail won’t tolerate it”. That pressure on politics means we need politicians with the guts to resist it.

Those who want a genuinely free press would not leave this festering wound. I will not join in the attack on the noble Lord, Lord Faulks, because we worked together very well in coalition. I would prefer to see IPSO reform itself to become Leveson-compliant. That would not bring any of the dangers that we will hear about from the noble Lord, Lord Black, but it would give us a system of press regulation that we could all agree with.

On Section 40, I remember well the discussions about how we would give some incentive to join. A number of my colleagues feel uncomfortable about Section 40 making even the winners pay, but the winner pays only if they are not within a Leveson-compliant system. That was, perhaps innocently, thought of as a carrot to bring the press in, though, of course, it does not read easily. Frankly, if Section 40 were to go but IPSO became Leveson-compliant, that would be a fair deal.

This Bill leaves us with some very dangerous loopholes. Some of the comments underneath press articles and, as the Minister referred to, the news clips that can be added can be extremely dangerous if children are exposed to them.

There are many other loopholes that this genuflection to press power is going to leave in the Bill and which will lead to problems in the future. Rather than launch another attack—because you can be sure another case will come along or another outrage will happen, and perhaps this time, Parliament will have the guts to deal with it—it would be far better if the media itself saw Leveson for what it was: a masterful, genuine attempt to put a free press within the context of a free society and protect the individuals and institutions in that society in a way that is in all our interests. As the noble Lord, Lord Lipsey, said, we are not pushing this tonight, but we are not going to go away.

Viscount Colville of Culross (CB)

My Lords, I have been a journalist my whole career and I have great respect for the noble Lords who put their names to Amendments 159 and 160. However, I cannot support another attempt to lever Section 40 of the Crime and Courts Act into the Bill. In Committee I put my name to Amendment 51, which aims to protect journalism in the public interest. It is crucial to support our news outlets, in the interests of democracy and openness. We are in a world where only a few newspapers, such as the New York Times, manage to make a profit from their digital subscribers. I welcome the protection provided by Clause 50; it is much needed.

In the past decade, the declining state of local journalism has meant there is little coverage of magistrates’ courts and council proceedings, with the result that local public servants are no longer held to account. At a national level, newspapers are more and more reluctant to put money into investigations unless they are certain of an outcome, which is rarely the case. Meanwhile, the tech platforms are using newspapers’ content for free, or paying them little money, while disaggregating news content on their websites so that readers do not even know its provenance. I fear that the digital era is putting our legacy media, which has long been a proud centrepiece of our democracy, in great danger. The inclusion of these amendments would mean that all national newspapers and most local media would be excluded from the protections of the clause. The Bill, which is about regulating the digital world, should not be used to limit the number of newspapers and news websites covered by the protections of Clause 50; doing so would threaten democracy at local and national level.

Lord Black of Brentwood (Con)

My Lords, I am very pleased to say a few words, because I do not want to disappoint my good friend the noble Lord, Lord McNally, who has obviously read the text of my speech before I have even delivered it. I declare my interests as deputy chairman of the Telegraph Media Group and a director of the Regulatory Funding Company, and note my other interests as set out in the register.

It will not come as a surprise that I oppose Amendments 159 and 160. I am not going to detain your Lordships for long; there are other more important things to talk about this evening than this seemingly never-ending issue, about which we had a good discussion in Committee. I am sorry that the two noble Lords were indisposed at that time, and I am glad to see they are back on fighting form. I am dispirited that these amendments surfaced in the first place as I do not think they really have anything to do with online safety and the protection of children. This is a Bill about the platforms, not the press. I will not repeat all the points we discussed at earlier stages. Suffice it to say that, in my view, this is not the time and the place to seek to impose what would be statutory controls on the press, for the first time since that great liberal, John Locke, led the charge for press freedom in 1695 when the Licensing Acts were abolished. Let us be clear: despite what the two noble Lords said, that is what these amendments would do, and I will briefly explain why.

These amendments seek to remove the exemption for news publishers from an onerous statutory regime overseen by Ofcom, which is, as the noble Lord, Lord Lipsey, said, a state regulator, unless they are part of an approved regulator. Yet no serious publisher, by which I mean the whole of the national and regional press, as the noble Viscount, Lord Colville, said—including at least 95% of the industry, from the Manchester Evening News to Cosmopolitan magazine—is ever going to join a regulator which is approved by the state. Even that patron saint of press controls, Sir Brian Leveson, conceded that this was a “principled position” for the industry to take. The net effect of these amendments would be, at a stroke, to subject virtually the entire press to state regulation—a momentous act wholly inimical to any definition of press freedom and free speech—and with very little discussion and absolutely no consultation.

Online Safety Bill

Lord Allan of Hallam (LD)

My Lords, we are coming to some critical amendments on a very important issue relatively late in the Bill, having had relatively little discussion on it. It is not often that committees of this House sit around and say, “We need more lawyers”, but this is one of those areas where that was true.

Notwithstanding the blushes of my noble friend on the Front Bench here, interestingly we have not had in our debate significant input from people who understand the law of freedom of expression and wish to contribute to our discussions on how online platforms should deal with questions of the legality of content. These questions are crucial to the Bill, which, if it does nothing else, tells online platforms that they have to be really robust in taking action against content that is deemed to be illegal under a broad swathe of law in the United Kingdom that criminalises certain forms of speech.

We are bearing down heavily on providers, saying to them, “If you fail at this, you’re in big trouble”. The pressure to deal with illegal content will be huge, yet illegality itself covers a broad spectrum. At one end is child sexual exploitation and abuse material, where in many cases it is obvious from the material that it is illegal and there is strict liability—there is never any excuse for distributing that material—and pretty much everyone everywhere in the world would agree that it should be criminalised and removed from the internet. At the other end are things that we discussed in Committee, such as public order offences, where, under some interpretations of Section 5 of the Public Order Act, swearing at somebody or looking at them in a funny way in the street could be deemed alarming and harassing. There are people who interpret public order offences in this very broad sense, and there would be a lot less agreement about whether a specific action is or is not illegal and whether the law is correctly calibrated or being used oppressively. So we have this broad spectrum of illegality.

The question we need to consider is where we want providers to draw the line. They will be making judgments on a daily basis. I said previously that I had to make those judgments in my job. I would write to lawyers and they would send back an expensive piece of paper that said, “This is likely to be illegal”, or, “This is likely not to be illegal”. It never said that it was definitely illegal or definitely not illegal, apart from the content I have described, such as child sexual abuse. You would not need to send that, but you would send the bulk of the issues that we are dealing with to a lawyer. If you sent it to a second lawyer, you would get another “likely” or “not likely”, and you would have to come to some kind of consensus view as to the level of risk you wished to take on that particular form of speech or piece of content.

This is really challenging in areas such as hate speech, where exactly the same language has a completely different meaning in different contexts, and may or may not be illegal. Again, to give a concrete example, we would often deal with anti-Semitic content being shared by anti-anti-Semitic groups—people trying to raise awareness of anti-Semitic speech. Our reviewers would quite commonly remove the speech: they would see it and it would look like grossly violating anti-Semitic speech. Only later would they realise that the person was sharing it for awareness. The N-word is a gross term of racial abuse, but if you are an online platform you permit it a lot of the time, because if people use it self-referentially they expect to be able to use it. If you start removing it they would naturally get very upset. People expect to use it if it is in song lyrics and they are sharing music. I could give thousands of examples of speech that may or may not be illegal depending entirely on the context in which it is being used.

We will be asking platforms to make those judgments on our behalf. They will have to take it seriously, because if they let something through that is illegal they will be in serious trouble. If they misjudged it and thought the anti-Semitic hate speech was being circulated by Jewish groups to promote awareness but it turned out it was being circulated by a Nazi group to attack people and that fell foul of UK law, they would be in trouble. These judgments are critical.

We have the test in Clause 173, which says that platforms should decide whether they have “reasonable grounds to infer” that something is illegal. In Committee, we debated changing that to a higher bar, and said that we wanted a stronger evidential basis. That did not find favour with the Government. We hoped they might raise the bar themselves unilaterally, but they have not. However, we come back again in a different way to try to be helpful, because I do not think that the Government want excessive censorship. They have said throughout the Bill’s passage that they are not looking for platforms to be overly censorious. We looked at the wording again and thought about how we could ensure that the bar is not operated in a way that I do not think that the Government intend. We certainly would not want that to happen.

We look at the current wording in Clause 173 and see that the test there has two elements. One is: “Do you have reasonable grounds to infer?” and then a clause in brackets after that says, “If you do have reasonable grounds to infer, you must treat the content as illegal”. In this amendment we seek to remove the second part of that phrasing because it seems problematic. If we say to the platform, “Reasonable grounds to infer, not certainty”—and it is weird to put “inference”, which is by definition mushy, with “must”, which is very certain, into the same clause—we are saying, “If you have this mushy inference, you must treat it as illegal”, which seems quite problematic. Certainly, if I were working at a platform, the way I would interpret that is: “If in doubt, take it out”. That is the only way you can interpret that “must”, and that is really problematic. Again, I know that that is not the Government’s intention, and if it were child sexual exploitation material, of course you “must”. However, if it is the kind of abusive content that you have reasonable grounds to infer may be an offence under the Public Order Act, “must” you always treat that as illegal? As I read the rest of the Bill, if you are treating it as illegal, the sense is that you should remove it.

That is what we are trying to get at. There is a clear understanding from the Government that their intention is “must” when it comes to that hard end of very bad, very clearly bad content. However, we need something else—a different kind of behaviour where we are dealing with content where it is much more marginal. Otherwise, the price we will pay will be in freedom of expression.

People in the United Kingdom publish quite robust, sweary language. I sometimes think that some of the rules we apply penalise the vernacular. People who use sweary, robust language may be doing so entirely legally—the United Kingdom does not generally restrict people from using that kind of language. However, we risk heading towards a scenario where people post such content in future, and they will find that the platform takes it down. They will complain to the platform, saying, “Why the hell did you take my content down?”—in fact, they will probably use stronger words than that to register their complaint. When they do, the platform will say, “We had reasonable grounds to infer that that was in breach of the Public Order Act, for example, because somebody might feel alarmed, harassed or distressed by it. Oh, and look—in this clause, it says we ‘must’ treat it as illegal. Sorry—there is nothing else we can do. We would have loved to have been able to exercise the benefit of the doubt and to allow you to carry on using that kind of language, because we think there is some margin where you have not behaved in an illegal way. But unfortunately, because of the way that Clause 173 has been drafted, our lawyers tell us we cannot afford to take the risk”.

In the amendment we are trying to—I think—help the Government to get out of a situation which, as I say, I do not think they want. However, I fear that the totality of the wording of Clause 173, this low bar for the test and the “must treat as” language, will lead to that outcome where platforms will take the attitude: “Safety first; if in doubt, take it out”, and I do not think that that is the regime we want. I beg to move.

Viscount Colville of Culross (CB)

My Lords, I regret I was unable to be present in Committee to deliver my speech about the chilling effect that the present definition of illegality in the Bill will have on free speech on the internet.

I am still concerned about Clause 173, which directs platforms how to come to the judgment on what is illegal. My concern is that the criterion for illegality, “reasonable grounds to infer” that elements of the content are illegal, will encourage the tech companies to take down content which is not necessarily illegal but which they infer could be. Indeed, the noble Lord, Lord Allan, gave us a whole list of examples of where that might happen. Unfortunately, in Committee there was little support for a higher bar when asking the platforms to judge what illegal content is. However, I have added my name to Amendment 228, put forward by the noble Lord, Lord Allan, because, as he has just said, it is a much less radical way of enhancing free speech when platforms are not certain whether to take down content which they infer is illegal.

The deletion of part of Clause 173(5) is a moderate proposal. It still leaves intact the definition for the platforms of how they are to make the judgment on the illegality of content, but it takes out the compulsory element in that judgment. I believe that it will have the biggest impact on moderation systems. Some of those systems are run by machines, but many moderation processes, such as those at Meta’s Facebook, involve thousands of human beings. The deletion of the second part of Clause 173(5), which demands that they take down content that they infer is illegal, will give them more leeway to err on the side of freedom of speech. I hope that this extra leeway to encourage free speech will also be reflected in the way that algorithms moderate our content.

Online Safety Bill

3rd reading
Wednesday 6th September 2023


Lords Chamber
Amendment Paper: HL Bill 164-I Marshalled list for Third Reading (5 Sep 2023)
Baroness Kennedy of The Shaws (Lab)

My Lords, I, too, join noble Lords in thanking the Minister for the way in which he has addressed my concerns about aspects of the Bill and has wanted to enhance particularly the protection of women and girls from the kind of threats that they experience online. I really feel that the Minister has been exemplary in the way in which he has interacted with everyone in this House who has wanted to improve the Bill and has come to him with good will. He has listened and his team have been absolutely outstanding in the work that they have done. I express my gratitude to him.

Viscount Colville of Culross (CB)

My Lords, I, too, thank the Minister for the great improvements that the Government have made to the Secretary of State’s powers in the Bill during its passage through this House. I rise to speak briefly today to praise the Government’s new Amendments 1 and 2 to Clause 44. As a journalist, I was worried by the lack of transparency around these powers in the clause; I am glad that the lessons of Section 94 of the Telecommunications Act 1984, which had to be rescinded, have been learned. In a world of conspiracy theories that can be damaging to public trust and governmental and regulatory process, it has never been more important that Parliament and the public are informed about the actions of government when giving directions to Ofcom about the draft codes of practice. So I am glad that these new amendments resolve those concerns.

Baroness Morgan of Cotes (Con)

My Lords, I welcome Amendments 5 and 6, as well as the amendments that reflect the work done and comments made in earlier stages of this debate by the noble Baroness, Lady Kennedy. Of course, we are not quite there yet with this Bill, but we are well on the way as this is the Bill’s last formal stage in this Chamber before it goes back to the House of Commons.

Amendments 5 and 6 relate to the categorisation of platforms. I do not want to steal my noble friend’s thunder, but I echo the comments made about the engagement both from my noble friend the Minister and from the Secretary of State. I am delighted that the indications I have received are that they will accept the amendment to Schedule 11, which this House voted on just before the Recess; that is a significant and extremely welcome change.

When commentators outside talk about the work of a revising Chamber, I hope that this Bill will be used as a model for cross-party, non-partisan engagement in how we make a Bill as good as it possibly can be—particularly when it is as ground-breaking and novel as this one is. My noble friend the Minister said in a letter to all of us that this Bill had been strengthened in this Chamber, and I think that is absolutely right.

I also want to echo the thanks to the Bill team, some of whom I was working with four years ago when we first talked about this Bill. They have stuck with it through thick and thin. I thank noble Lords across the House for their support for the amendments, and all those outside this House who have committed such time, effort, support and expertise to making this Bill as good as possible. I wish it well in its final stages. We all look forward to Royal Assent and to the next big challenge, which is implementation.