All 23 Baroness Fox of Buckley contributions to the Online Safety Act 2023


Online Safety Bill, Lords Chamber:

Wed 1st Feb 2023
Wed 19th Apr 2023 - Committee stage
Tue 25th Apr 2023 - Committee stage: Part 1
Thu 27th Apr 2023 - Committee stage: Part 1
Thu 27th Apr 2023 - Committee stage: Part 2
Tue 2nd May 2023 - Committee stage: Part 1
Tue 2nd May 2023 - Committee stage: Part 2
Tue 9th May 2023 - Committee stage: Part 1
Tue 9th May 2023 - Committee stage: Part 2
Thu 11th May 2023
Tue 16th May 2023 - Committee stage: Part 1
Tue 23rd May 2023 - Committee stage: Part 1
Tue 23rd May 2023 - Committee stage: Part 2
Thu 25th May 2023 - Committee stage: Part 1
Thu 25th May 2023 - Committee stage: Part 2
Thu 22nd Jun 2023 - Committee stage: Part 2
Thu 6th Jul 2023 - Report stage: Part 1 & Report stage: Minutes of Proceedings
Thu 6th Jul 2023 - Report stage: Part 2
Thu 6th Jul 2023 - Report stage: Part 3
Mon 10th Jul 2023 - Report stage: Part 1
Wed 12th Jul 2023
Mon 17th Jul 2023
Wed 19th Jul 2023

Online Safety Bill

Baroness Fox of Buckley Excerpts
Baroness Fox of Buckley (Non-Afl)

My Lords, the Secretary of State, Michelle Donelan, has acknowledged that protecting children is the very reason that this Bill exists. If only the Government had confined themselves to that crucial task. Instead, I worry that the Bill has ballooned and could still be a major threat to the free expression of adults. I agreed with much of what the noble Baroness, Lady D’Souza, just spoke about.

Like some other noble Lords here, I am delighted that the Government have dropped the censorious “legal but harmful” clauses. It was disappointing to hear Labour MPs in the other place keen to see them restored. In this place, I have admired opposition resistance to assaults on civil liberties in, for example, the Public Order Bill. Perhaps I can appeal for consistency to be just as zealous on free speech as a foundational civil liberty. I urge those pushing versions of censoring “legal but harmful” for adults to think again.

The Government’s counter to many freedom of expression concerns is that free speech is protected in various clauses, but stating that service providers must have regard to the importance of protecting users’ rights of freedom of speech is incredibly weak and woolly, giving it a second-class status when contrasted with the operational safety duties that compel companies to remove material. Instead, we need a single comprehensive and robust statutory duty in favour of freedom of expression that requires providers to ensure that free speech is not infringed on by measures taken to comply with other duties. Also, free speech should be listed as a relevant duty for which Ofcom has to develop a code of practice.

The Bill requires providers to include safety provisions for content in their terms of service. However, no similar requirement for free speech exists. It seems ironic that a Bill that claims to be clipping the power of big tech could actually empower companies to police and censor legal material in the name of safety, via the commercial route of terms and conditions.

The Government brush off worries that big tech is being encouraged to limit what UK citizens say or read online by glibly asserting that these are private companies and that they must be free to develop their own terms of service. Surely that is disingenuous. The whole purpose of the legislation is to interfere in private companies, compelling them to adhere to duties or face huge penalties. If the Government do not trust big tech with users’ safety, why do they trust them with UK citizens’ free speech rights? Similarly, consider the user empowerment duties. If users ask that certain specified types of legal content are blocked or filtered out, such as hate or abuse, it is big tech that has the power to decide what is categorised under those headings.

Only last year, amendments put forward in this House on placing convicted sex-offending trans prisoners on the female estate were labelled online as hate-fuelled, transphobic abuse. However, with the ability to hear all sides of the debate online, and especially in the light of recent events in Scotland around the Gender Recognition Act, more and more people realise that such views are not hate but driven by concerns about safeguarding women’s rights. Would such a debate be filtered out online by overcautious labelling by big tech and the safety duties in its Ts and Cs?

Finally, like others, I am worried that the Secretary of State is given too much power—for example, to shape Ofcom’s codes of practice, which is a potential route for political interference. My concerns are fuelled by recent revelations. In the US, Elon Musk’s leaked Twitter files prove that, in the run-up to the 2020 election, Joe Biden’s presidential campaign routinely flagged up tweets and accounts that it wanted removed, influencing the suppression of the New York Post’s Hunter Biden laptop exposé. Here in the UK, only this week, a shocking Big Brother Watch report reveals that military operatives reported online dissenting views on official Covid lockdown policies to No. 10 and the DCMS’s counter-disinformation unit, with Whitehall then using its hotlines to giant media companies to suppress this legal content. Even the word “illegal” in the Bill can be politically weaponised, such as with the proposal to censor content allegedly promoting small boat crossings.

Free speech matters to democracy, and huge swathes of this Bill could threaten both unless we amend it appropriately.

Online Safety Bill

Baroness Harding of Winscombe (Con)

My Lords, I too support this amendment. I was at a dinner last night in the City for a group of tech founders and investors—about 500 people in a big hotel ballroom, all focused on driving the sort of positive technology growth in this country that I think everyone wants to see. The guest speaker runs a large UK tech business. He commented in his speech that tech companies need to engage with government because—he said this as if it was a revelation—all Governments turned out not to speak with one voice and that understanding what was required of tech companies by Governments is not always easy. Business needs clarity, and anyone who has run a large or small business knows that it is not really the clarity in the detail that matters but the clarity of purpose that enables you to lead change, because then your people understand why they need to change, and if they understand why, then in each of the micro-decisions they take each day they can adjust those decisions to fit with the intent behind your purpose. That is why this amendment is so important.

I have worked in this space of online safety for more than a decade, both as a technology leader and in this House. I genuinely do not believe that business is wicked and evil, but what it lacks is clear direction. The Bill is so important in setting those guardrails that if we do not make its purpose clear, we should not be surprised if the very businesses which really do want Governments to be clear do not know what we intend.

I suspect that my noble friend the Minister might object to this amendment and say that it is already in the Bill. As others have already said, I actually hope it is. If it is not, we have a different problem. The point of an upfront summary of purpose is to do precisely that: to summarise what is in what a number of noble Lords have already said is a very complicated Bill. The easier and clearer we can make it for every stakeholder to engage in the Bill, the better. If alternatively my noble friend the Minister objects to the detailed wording of this amendment, I argue that that simply makes getting this amendment right even more important. If the four noble Lords, who know far more about this subject than I will ever do in a lifetime, and the joint scrutiny committee, which has done such an outstanding job at working through this, have got the purposes of the Bill wrong, then what hope for the rest of us, let alone those business leaders trying to interpret what the Government want?

That is why it is so important that we put the purposes of the Bill absolutely at the front of the Bill, as in this amendment. If we have misunderstood that in the wording, I urge my noble friend the Minister to come back with wording on Report that truly encapsulates what the Government want.

Baroness Fox of Buckley (Non-Afl)

My Lords, I welcome this opportunity to clarify the purposes of the Bill, but I am not sure that the amendment helps as my North Star. Like the Bill, it throws up as many questions as answers, and I found myself reading it and thinking “What does that word mean?”, so I am not sure that clarity was where I ended up.

It is not a matter of semantics, but in some ways you could say—and certainly this is how it is publicly understood—that the name of the Bill, the Online Safety Bill, gives it its chief purpose. Yet however well-intentioned, and whatever the press releases say or the headlines print, even a word such as “safety” is slippery, because safety as an end can be problematic in a free society. My worry about the Bill is unintended consequences, and that is not rectified by the amendment. As the Bill assumes safety as the ultimate goal, we as legislators face a dilemma. We have the responsibility of weighing up the balance between safety and freedom, but the scales in the Bill are well and truly weighted towards safety at the expense of freedom before we start, and I am not convinced that the amendment rebalances them.

Of course, freedom is a risky business, and I always like the opportunity to quote Karl Marx, who said:

“You cannot pluck the rose without its thorns!”


However, it is important to recognise that “freedom” is not a dirty word, and we should avoid saying that risk-free safety is more important than freedom. How would that conversation go with the Ukrainian people, who risk their safety daily for freedom? Also, even the language of safety, or indeed what constitutes the harms that the Bill and the amendments promise to keep the public safe from, needs to be considered in the cultural and social context of the norms of 2023. A new therapeutic ethos now posits safety in ever-expanding pseudo-psychological and subjective terms, and this can be a serious threat to free speech. We know that some activists exploit that concept of safety to claim harm when they merely encounter views they disagree with. The language of safety and harm is regularly used to cancel and censor opponents—and the Government know that, so much so that they considered it necessary to introduce the Higher Education (Freedom of Speech) Bill to secure academic freedom against an escalating grievance culture that feigns harm.

Part of the triple shield is a safety duty to remove illegal content, and the amendment talks about speech within the law. That sounds unobjectionable—in my mind it is far better than “legal but harmful”, which has gone—but, while illegality might sound clear and obvious, in some circumstances it is not always clear. That is especially true in any legal limitations of speech. We all know about the debates around hate speech, for example. These things are contentious offline and even the police, in particular the College of Policing, seem to find the concept of that kind of illegality confusing and, at the moment, are in a dispute with the Home Secretary over just that.

Is it really appropriate that this Bill enlists and mandates private social media companies to judge criminality using the incredibly low bar of “reasonable grounds to infer”? It gets even murkier when the legal standard for permissible speech online will be set partly by compelling platforms to remove content that contravenes their terms and conditions, even if those terms of service restrict speech far more than domestic UK law does. Big tech is being incited to censor whatever content it wishes as long as it fits in with its Ts and Cs. Between this and determining, for example, what is in filters—a whole different issue—one huge irony here, which challenges one of the purposes of the Bill, is that despite the Government and many of us thinking that this legislation will de-fang and regulate big tech’s powers, the legislation could inadvertently give those same corporates more control of what UK citizens read and view.

Another related irony is that the Bill was, no doubt, designed with Facebook, YouTube, Twitter, Google, TikTok and WhatsApp in mind. However, as the Bill’s own impact assessment notes, 80% of impacted entities have fewer than 10 employees. Many sites, from Wikipedia to Mumsnet, are non-profit or empower their own users to make moderation or policy decisions. These sites, and tens of thousands of British businesses of varying sizes, perhaps unintentionally, now face an extraordinary amount of regulatory red tape. These onerous duties and requirements might be achievable, if not desirable, for larger platforms, but for smaller ones with limited compliance budgets they could prove a significant if not fatal burden. I do not think that is the purpose of the Bill, but it could be an unintended outcome. This also means that regulation could, inadvertently, act as a barrier to entry for new SMEs, creating an ever more monopolistic stronghold for big tech, at the expense of trialling innovations or allowing start-ups to emerge.

I want to finish with the thorny issue of child protection. I have said from the beginning—I mean over the many years since the Bill’s inception—that I would have been much happier if it was more narrowly titled as the Children’s Online Safety Bill, to indicate that protecting children was its sole purpose. That in itself would have been very challenging. Of course, I totally agree with Amendment 1’s intention

“to provide a higher level of protection for children than for adults”.

That is how we treat children and adults offline.

--- Later in debate ---
Lord Cormack (Con)

My Lords, I am one of those who found the Bill extremely complicated, but I do not find this amendment extremely complicated. It is precise, simple, articulate and to the point, and I think it gives us a good beginning for debating what is an extremely complex Bill.

I support this amendment because I believe, and have done so for a very long time, that social media has done a great deal more harm than good, even though it is capable of doing great good. Whether advertently or inadvertently, the worst of all things it has done is to destroy childhood innocence. We are often reminded in this House that the prime duty of any Government is to protect the realm, and of course it is. But that is a very broad statement. We can protect the realm only if we protect those within it. Our greatest obligation is to protect children—to allow them to grow up, so far as possible, uncorrupted by the wicked ways of a wicked world and with standards and beliefs that they can measure actions against. Complex as it is, the Bill is a good beginning, and its prime purpose must be the protection and safeguarding of childhood innocence.

The noble Lord, Lord Griffiths of Burry Port, spoke a few moments ago about the instructions he was given as a young preacher. I remember when I was training to be a lay reader in the Church of England, 60 or more years ago, being told that if you had been speaking for eight minutes and had not struck oil, stop boring. I think that too is a good maxim.

We have got to try to make the Bill comprehensible to those around the country whom it will affect. The worst thing we do, and I have mentioned this in connection with other Bills, is to produce laws that are unintelligible to the people in the country; that is why I was very sympathetic to the remarks of my noble friend Lord Inglewood. This amendment is a very good beginning. It is clear and precise. I think nearly all of us who have spoken so far would like to see it in the Bill. I see the noble Baroness, Lady Fox, rising—does she wish to intervene?

Baroness Fox of Buckley (Non-Afl)

I want to explain more broadly that I am all for clarifying what the law is about and for simplicity, but that ship has sailed. We have all read the Bill. It is not simple. I do not want this amendment to somehow console us, so that we can say to the public, “This is what the Bill is about”, because it is not what the Bill is about. It is about a range of things that are not contained within the amendment—I would wish them to be removed from the Bill. I am concerned that we think this amendment will resolve a far deeper and greater problem of a complicated Bill that very few of us can grasp in its entirety. We should not con the public that it is a simple Bill; it is not.

Lord Cormack (Con)

Of course we should not. What I am saying is that this amendment is simple. If it is in the Bill, it should then be what we are aiming to create as the Bill goes through this House, with our hours of scrutiny. I shall not take part in many parts of this Bill, as I am not equipped to do so, but there are many in this House who are. Having been set the benchmark of this amendment, they can seek to make the Bill comprehensible to those of us—and that seems to include the noble Baroness, Lady Fox—who at the moment find it incomprehensible.

In a way, we are dealing with the most important subject of all: the protection of childhood innocence. We have got to err in that direction. Although I yield to no one in my passionate belief in the freedom of speech, it must have respect for the decencies of life and not be a propagator of the profanities of life.

Online Safety Bill

Moved by
4: Clause 3, page 3, line 17, leave out paragraphs (a) and (b) and insert “the service has at least one million monthly United Kingdom users.”
Member’s explanatory statement
This amendment replaces the two tests currently set out in subsection (5) of clause 3, relating to a service’s links with the United Kingdom, with a requirement that the service have at least a million monthly United Kingdom users.
Baroness Fox of Buckley (Non-Afl)

My Lords, in moving Amendment 4, I will also speak to Amendments 6 to 8 and 12 and consequential Amendments 288 and 305, largely grouped under the heading “exemptions”. In this group I am also particularly sympathetic to Amendment 9 in the names of the noble Lords, Lord Moylan and Lord Vaizey, and I will leave them to motivate that. I look forward to hearing from the noble Lord, Lord Knight, an explanation of his Amendment 9A.

Last Wednesday we discussed the purposes of the Bill, and there was much agreement across the Chamber on one issue at least: that we need to stay focused and make sure that an already highly complex piece of legislation does not become even more unwieldy. My concern in general is that the Bill already suffers throughout from being overly broad in its aims, resulting in restricting the online experience and expressions of everyone. This series of amendments is about trying to rein in the scope, allowing us to focus on clear targets rather than a one-size-fits-all Bill that sweeps all in its wake with perhaps unintended and damaging consequences.

The Bill creates an extraordinary set of regulatory burdens on tens of thousands of British businesses, micro-communities and tech platforms, no matter the size. The impact assessment claims that 25,000 businesses are in scope, and that is considered a conservative estimate. This implies that an extraordinary range of platforms, from Mumsnet and Wikipedia to whisky-tasting forums and Reddit, will be caught up in this Bill. Can we find a way of removing the smaller platforms from scope? It will destroy too many of them if they have to comply with the regulatory burden created with huge Silicon Valley behemoths in mind.

Let us consider some of the regulatory duties that these entities are expected to comply with. They will need to undertake extensive assessments that must be repeated whenever a product changes. They will need to proactively remove certain types of content, involving assessing the risk of users encountering each type of illegal content, the speed of dissemination and functionality, the design of the platform and the nature and severity of the risk of harms presented to individual users. This will mean assessing their user base and implementing what are effectively surveillance systems to monitor all activity on their platforms.

Let us consider what a phrase such as “prevent from encountering” would mean to a web host such as Wikipedia. It would mean that it would need to scan and proactively analyse millions of edits across 250 languages for illegality under UK-specific law and then block content in defiance of the wishes of its own user community. There is much more, of course. Rest assured, Ofcom’s guidance and risk assessment will, over time, increase the regulatory complexity and the burdens involved.

Those technological challenges do not even consider the mountain of paperwork and administrative obligations that will be hugely costly and time consuming. All that might be achievable, if onerous, for larger platforms. But for smaller ones it could prove a significant problem, with SMEs and organisations working with a public benefit remit particularly vulnerable. Platforms with the largest profits and the most staff dedicated to compliance will, as a consequence, dominate at the expense of start-ups, small companies and community-run platforms.

No doubt the Government and the Minister will assure us that the duties are not so onerous and that they are manageable and proportionate. The impact assessment estimates that implementing the Bill will cost businesses £2.5 billion over the first 10 years, but all the commentators I have read think this is likely to be a substantial underestimate, especially when we are told in the same impact assessment that the legal advice is estimated to cost £39.23 per hour. I do not know what lawyers the Government hang out with, but they appear not to have a clue about the going rate for specialist law firms.

Also, what about the internal staff time? Again, the impact assessment assumes that staff will require only 30 minutes to familiarise themselves with the requirements of the legislation and 90 minutes to read, assess and change the terms and conditions in response to the requirements. Is this remotely serious? Even working through the groups of amendments has taken me hours. It has been like doing one of those 1,000-piece jigsaws, but at least at the end of those you get to see the complete picture. Instead, I felt as though somebody had come in and thrown all the pieces into the air again. I was as confused as ever.

If dealing with groups of amendments to this Bill is complex, that is nothing on the Bill itself, which is dense and often impenetrable. Last week, the Minister helpfully kept telling us to read the Explanatory Notes. I have done that several times and I am still in a muddle, yet somehow the staff of small tech companies will conquer all this and the associated regulatory changes in an hour and a half.

Many fear that this will replicate the worst horrors of GDPR, which, according to some estimates, led to an 8% reduction in the profits of smaller firms while it had little or no effect on the profits of large tech companies. That does not even take into account the cost of the near nervous breakdowns that GDPR caused small organisations, as I know from my colleagues at the Academy of Ideas.

These amendments try to tackle this disproportionate burden on smaller platforms—those companies that are, ironically, often useful challenges and antidotes to big tech’s dominance. The amendments would exempt them unless there is a good reason for specific platforms to be in scope. Of course, cutting out those in scope may not appeal to everyone here. From looking at the ever-increasing amendments list, it seems that some noble Lords have an appetite for expanding the number of services the legislation will apply to; we have already heard the discussion about app stores and online gaming. But we should note that the Government have carved out other exemptions for certain services that are excluded from the new regulatory system. They have not included emails, SMS messages, one-to-one oral communications and so on. I am suggesting some extra exemptions and that we remove services with fewer than 1 million monthly UK users. Ofcom would have the power to issue the provider with a notice bringing them into scope, but only based on reasonable grounds, having identified a safety risk and with 30 days’ notice.

If we do not tackle this, I fear that there is a substantial, serious and meaningful risk that smaller platforms based outside and inside the UK will become inaccessible to British users. It is notable that over 1,000 US news websites blocked European users during the EU’s introduction of GDPR, if noble Lords remember. Will there be a similar response to this law? What, for example, will the US search engine DuckDuckGo conclude? The search engine emphasises privacy and refuses to gather information on its users, meaning that it will be unable to fulfil the duties contained in the Bill of identifying or tailoring search results to users based on their age. Are we happy for it to go?

I fear that this Bill will reduce the number of tech platforms operating in the UK. This is anti-competitive. I do not say that because I have a particular commitment to competition and the free market, by the way. I do so because competition is essential and important for users’ choice and empowerment, and for free speech—something I fear the Bill is threatening. Indeed, the Lords’ Communications and Digital Committee’s extensive inquiry into the implications of giving large tech companies what is effectively a monopoly on defining which speech is free concluded:

“Increasing competition is crucial to promoting freedom of expression online. In a more competitive market, platforms would have to be more responsive to users’ concerns about freedom of expression and other rights”.


That is right. If users are concerned that a platform is failing to uphold their freedom of expression, they can join a different platform with greater ease if there is a wide choice. Conversely, users who are concerned that they do not want to view certain types of material would be more easily able to choose another platform that proscribes said material in its terms and conditions.

I beg to move the amendment as a way of defending diversity, choice and innovation—and as a feeble attempt to make the Bill proportionate.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

The Bill creates a substantial new role for Ofcom, but it has already substantially recruited and prepared for the effective carrying out of that new duty. I do not know whether my noble friend was in some of the briefings with officials from Ofcom, but it is very happy to set out directly the ways in which it is already discharging, or preparing to discharge, those duties. The Government have provided it with further resource to enable it to do so. It may be helpful for my noble friend to have some of those discussions directly with the regulator, but we are confident that it is ready to discharge its duties, as set out in the Bill.

I was about to say that we have already had a bit of discussion on Wikipedia. I am conscious that we are going to touch on it again in the debate on the next group of amendments so, at the risk of being marked down for repetition, which is a black mark on that platform, I shall not pre-empt what I will say shortly. But I emphasise that the Bill does not impose prescriptive, one-size-fits-all duties on services. The codes of practice from Ofcom will set out a range of measures that are appropriate for different types of services in scope. Companies can follow their own routes to compliance, so long as they are confident that they are effectively managing risks associated with legal content and, where relevant, harm to children. That will ensure that services that already use community moderation effectively can continue to do so—such as Wikipedia, which successfully uses that to moderate content. As I say, we will touch on that more in the debate on the next group.

Amendment 9, in the name of my noble friend Lord Moylan, is designed to exempt small and medium-sized enterprises working to benefit the public from the scope of the Bill. Again, I am sympathetic to the objective of ensuring that the Bill does not impose undue burdens on small businesses, and particularly that it should not inhibit services from providing valuable content of public benefit, but I do not think it would be feasible to exempt service providers deemed to be

“working to benefit the public”.

I appreciate that this is a probing amendment, but the wording that my noble friend has alighted on highlights the difficulties of finding something suitably precise and not contestable. It would be challenging to identify which services should qualify for such an exemption.

Taking small services out of scope would significantly undermine the framework established by the Bill, as we know that many smaller services host illegal content and pose a threat to children. Again, let me reassure noble Lords that the Bill has been designed to avoid disproportionate or unnecessary regulatory burdens on small and low-risk services. It will not impose a disproportionate burden on services or impede users’ access to valuable content on smaller services.

Amendment 9A in the name of the noble Lord, Lord Knight of Weymouth, is designed to exempt “sector specific search services” from the scope of the Bill, as the noble Baroness, Lady Merron, explained. Again, I am sympathetic to the intention here of ensuring that the Bill does not impose a disproportionate burden on services, but this is another amendment that is not needed as it would exempt search services that may pose a significant risk of harm to children, or because of illegal content on them. The amendment aims to exempt specialised search services—that is, those that allow users to

“search for … products or services … in a particular sector”.

It would exempt specialised search services that could cause harm to children or host illegal content—for example, pornographic search services or commercial search services that could facilitate online fraud. I know the noble Lord would not want to see that.

The regulatory duties apply only where there is a significant risk of harm and the scope has been designed to exclude low-risk search services. The duties therefore do not apply to search engines that search a single database or website, for example those of many retailers or other commercial websites. Even where a search service is in scope, the duties on services are proportionate to the risk of harm that they pose to users, as well as to a company’s size and capacity. Low-risk services, for example, will have minimal duties. Ofcom will ensure that these services can quickly and easily comply by publishing risk profiles for low-risk services, enabling them easily to understand their risk levels and, if necessary, take steps to mitigate them.

The noble Lord, Lord McCrea, asked some questions about the 200 most popular pornographic websites. If I may, I will respond to the questions he posed, along with others that I am sure will come in the debate on the fifth group, when we debate the amendments in the names of the noble Lord, Lord Morrow, and the noble Baroness, Lady Ritchie of Downpatrick, because that will take us on to the same territory.

I hope that provides some assurance to my noble friend Lord Moylan, the noble Baroness, Lady Fox, and others, and that they will be willing not to press their amendments in this group.

Baroness Fox of Buckley (Non-Afl)

My Lords, I thank people for such a wide-ranging and interesting set of contributions. I take comfort from the fact that so many people understood what the amendments were trying to do, even if they did not fully succeed in that. I thought it was quite interesting that in the first debate the noble Lord, Lord Allan of Hallam, said that he might be a bit isolated on the apps, but I actually agreed with him—which might not do his reputation any good. However, when he said that, I thought, “Welcome to my world”, so I am quite pleased that this has not all been shot down in flames before we started. My amendment really was a serious attempt to tackle something that is a real problem.

The Minister says that the Bill is designed to avoid disproportionate burdens on services. All I can say is, “Sack the designer”. It is absolutely going to impose a disproportionate burden on a wide range of small services, which will not be able to cope, and that is why so many of them are worried about it. Some 80% of the companies that will be caught up in this red tape are small and micro-businesses. I will come to the small business point in a moment.

The noble Baroness, Lady Harding, warned us that small tech businesses become big tech businesses. As far as I am concerned, that is a success story—it is what I want; is it not what we all want? Personally, I think economic development and growth is a positive thing—I do not want them to fail. However, I do not think it will ever happen; I do not think that small tech businesses will ever grow into big tech businesses if they face a disproportionate burden in the regulatory sense, as I have tried to describe. That is what I am worried about, and it is not a positive thing to be celebrated.

I stress that it is not just small tech and big tech. There are also community sites, based on collective moderation; Wikipedia has featured heavily in this discussion. For a Bill that stresses that it wants to empower users, we should think about what it means when these user-moderated community sites tell us that they will not be able to carry on. That is what they are saying. It was interesting that the noble Lord, Lord Clement-Jones, said that he relies on Wikipedia—many of us do, although please do not believe what it says about me. But then there was a feeling that, well, Reddit is a bit dodgy. The Bill is not meant to be deciding which sites to trust in quite that way, or adjudicating people’s tastes.

I was struck that the noble Baroness, Lady Kidron, said that small is not safe, and used the incel example. I am not emphasising that small is safe; I am saying that the small entities will not survive this process. That is my fear. I do not mean that the big ones are nasty and dangerous and the small ones are cosy, lovely and Wikipedia-like. I am suggesting that smaller entities will not be able to survive the regulatory onslaught. That is the main reason I raised this.

The noble Baroness, Lady Merron, said that these entities can cause great harm. I am worried about a culture of fear, in which we demonise tens of thousands of innocent tech businesses and communities and end up destroying them when we do not intend to. I tried to build into the amendment an ability for Ofcom to deal with problematic sites that are risky. As the Minister kept saying, low-risk search engines have been exempted. I am suggesting that low-risk small and micro-businesses should be exempted too, which is the majority of them, rather than that we assume they are all guilty and they then have to seek exemption.

Interestingly, the noble Lord, Lord McCrea, asked how many pornography sites are in scope and which pornographic websites have a million or fewer users. I am glad I do not know the answer to that, otherwise people might wonder why I did. The point is that there are always going to be sites that are threatening or a risk to children, as we are discussing. But we must always ensure—this was the important point that the noble Lord, Lord Moylan, made—that in our absolute determination to protect children via this Bill we do not unintentionally damage society as a whole. Adult access to free speech, for example, is one of my concerns, as are businesses and so on. We should not have that as an outcome.

--- Later in debate ---
Baroness Benjamin (LD)

My Lords, as might be expected, I will speak against Amendment 26 and will explain why.

The children’s charity Barnardo’s—here I declare an interest as vice-president—has said, as has been said several times before, that children are coming across pornographic content from as young as seven. Often they stumble across the content accidentally, having searched for terms such as “sex” or “porn” without knowing what they mean. The impact that this is having on children is huge. It is harming their mental health and distorting their perception of healthy sexual relationships and consent. That will go with them into adulthood.

Age verification for pornography and age assurance to protect children from other harms are crucial to protect children from this content. In the offline world, children are rightly not allowed to buy pornographic DVDs in sex shops but online they can access this content at the click of a button. This is why I will be supporting the amendments from the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, and am fully supportive of their age assurance and age verification schedule.

Baroness Fox of Buckley (Non-Afl)

My Lords, to go back not just to the age question, the noble Lord, Lord Allan of Hallam, reminded us that community-led moderation is not just Wikipedia. What I tried to hint at earlier is that that is one of the most interesting, democratic aspects of the online world, which we should protect.

We often boast that we are a self-regulating House and that this makes us somehow superior to the place up the road—we are all so mature because we self-regulate; people do behave badly, but we decide. It is a lesson in democracy that you have a self-regulating House, and there are parts of the online world that self-regulate. Unless we think that the citizens of the UK are less civilised than Members of the House of Lords, which I would dispute, we should say that it is positive that there are self-moderating, self-regulating online sites. If you can say something, people can object and have a discussion about it, and things can be taken down, that to me is the way we should deal with speech that is inappropriate or wrong. The bulk of these amendments—I cannot remember how many there are now—are right.

I was glad that the noble Lord, Lord Moylan, said he could not understand why this grouping had happened, which is what I said earlier. I had gone through a number of groupings thinking: “What is that doing there? Am I missing something? Why is that in that place?” I think we will come back to the age verification debate and discussion.

One thing to note is that one of the reasons organisations such as Wikipedia are concerned about age verification—and they are—is anonymity. It is something we have to consider. What is going to happen to anonymity? It is so important for journalists, civil liberty activists and whistleblowers. Many Wikipedia editors are anonymous, perhaps because they edit pages on politically controversial issues. Imagine being a Wikipedia editor from Russia at the moment—you would not want to have to say who you are. We will come back to this, but it is important to understand that those behind Amendment 26, and those saying that we should look at the question of age verification, are not doing so because they do not care about children or are not interested in protecting them. However, the dilemmas that any age-gating or age verification poses for adult civil liberties have to be considered. We have to worry that, because of an emphasis on checking age, some websites will decide to sanitise what they allow to be published to make it suitable for children, just in case children come across it. Again, that will have a detrimental impact on adult access to knowledge.

These will be controversial issues, and we will come back to them, but it is good to have started the discussion.

Lord Clement-Jones (LD)

My Lords, this has been a very strange debate. It has been the tail end of the last session and a trailer for a much bigger debate coming down the track. It was very odd.

We do not want to see everything behind an age-gating barrier, so I agree with my noble friend. However, as the noble Baroness, Lady Kidron, reminded us, it is all about the risk profile, and that then leads to the kind of risk assessment that a platform is going to be required to carry out. There is a logic to the way that the Bill is going to operate.

When you look at Clause 11(3), you see that it is not disproportionate. It deals with “primary priority content”. This is not specified in the Bill but it is self-harm and pornography—major content that needs age-gating. Of course we need to have the principles for age assurance inserted into the Bill as well, and of course it will be subject to debate as we go forward.

There is technology to carry out age verification which is far more sophisticated than it ever was, so I very much look forward to that debate. We started that process in Part 3 of the Digital Economy Act. I was described as an internet villain for believing in age verification. I have not changed my view, but the debate will be very interesting. As regards the tail-end of the previous debate, of course we are sympathetic on these Benches to the Wikipedia case. As we said on the last group, I very much hope that we will find a way, whether it is in Schedule 1 or in another way, of making sure that Wikipedia is not affected overly by this—maybe the risk profile that is drawn up by Ofcom will make sure that Wikipedia is not unduly impacted.

Online Safety Bill

Lord Moylan (Con)

My Lords, I rise to speak to Amendment 205 in my name, but like other noble Lords I will speak about the group as a whole. After the contributions so far, not least from the noble Lord, Lord Allan of Hallam, and the noble Baroness, Lady Bennett of Manor Castle, there is not a great deal left for me to add. However, I will say that we have to understand that privacy is contextual. At one extreme, I know the remarks I make in your Lordships’ House are going to be carefully preserved and cherished; for several centuries, if not millennia, people will be able to see what I said today. If I am in my sitting room, having a private conversation, I expect that not to be heard by somebody, although at the same time I am dimly aware that there might be somebody on the other side of the wall who can hear what I am saying. Similarly, I am aware that if I use the telephone, it is possible that somebody is listening to the call. Somebody may have been duly authorised to do so by reference to a tribunal, having taken all the lawful steps necessary in order to listen to that call, because there are reasons that have persuaded a competent authority that the police service, or whatever, listening to my telephone call has a reason to do so, to avoid public harm or meet some other justified objective agreed on through legislation.

Here, we are going into a sphere of encryption where one assumes privacy and feels one is entitled to some privacy. However, it is possible that the regulator could at any moment step in and demand records from the past—records up to that point—without the intervention of a tribunal, as far as I can see, or without any reference to a warrant or having to explain to anybody their basis for doing so. They would be able to step in and do it. This is the point made by the noble Baroness, Lady Bennett of Manor Castle: unlike the telephone conversation, where it does not have to be everyone, everywhere, all the time—they are listening to just me and the person to whom I am talking—the provider has to have the capacity to go back, get all those records and be able to show Ofcom what it is that Ofcom is looking for. To do that requires them to change their encryption model fundamentally. It is not really possible to get away from everyone, everywhere, all the time, because the model has to be changed in order to do it.

That is why this is such an astonishing thing for the Government to insert in this Bill. I can understand why the security services and so forth want this power, and this is a vehicle to achieve something they have been trying to achieve for a long time. But there is very strong public resistance to it, and it is entirely understood, and to do it in this space is completely at odds with the way in which we felt it appropriate to authorise listening in on private conversations in the past—specific conversations, with the authority of a tribunal. To do it this way is a very radical change and one that needs to be considered almost apart from the Bill, not slipped in as a mere clause and administrative adjunct to it.

Baroness Fox of Buckley (Non-Afl)

My Lords, there have been some excellent speeches so far. The noble Lord, Lord Allan of Hallam, brilliantly laid out why these amendments matter, and the noble Lord, Lord Moylan, explained why this has gained popular interest outside of the House. Not everything that goes on in this House is of interest and people do not study all of the speeches made by the noble Lord, Lord Moylan, even though they are always in the public sphere, but this particular group of amendments has elicited a huge amount of discussion.

We should remember that encrypted chat has become an indispensable part of the way that we live in this country and around the world. According to the Open Rights Group it has replaced the old-fashioned wired telephone—a rather quaint phrase. The fact that the citizens of the United Kingdom think that chat services matter so much that they are used by 60% of the total population should make us think about what we are doing regarding these services.

End-to-end encryption—the most secure form of encryption available—means that your messages are stored on your phone; people feel that they are in control because their messages are not sitting on some server somewhere. Even WhatsApp cannot read your WhatsApp messages; that is the point of encryption. That is why people use it: the messages are secured with a lock to which only you and the recipient hold the key.

Obviously, there are certain problems. Certain Government Ministers wanted to voluntarily share all of their WhatsApp messages with a journalist who would then share them with the rest of us. If your Lordships were in that group you might have thought that was a rude thing to do. People have their WhatsApp messages leaked all the time, and when it happens we all think, “Oh my God, I’m glad I wasn’t in that WhatsApp group”, because you assume a level of privacy, even though as a grown-up you need to remember that somebody might leak them. But the main point is that they are a secure form of conversation that is widely used.

Everyone has a right to private conversations. I was thinking about how, when society closed down during the lockdown period, we needed technology in order to communicate with each other. We understood that we needed to WhatsApp message or Zoom call our friends and family, and the idea that this would involve the state listening in would have appalled us—we considered that our private life.

We want to be able to chat in confidence and be confident that only ourselves and the recipients can see what we are sharing and hear what we are saying. That is true of everyday life, but there are very good specific cases to be made for its importance, ranging through everything from Iranian women resisting the regime and communicating with each other, to all the civil liberties organisations around the world that use WhatsApp. The security of knowing that you can speak without Putin listening in or that President Xi will not be sent your WhatsApp messages is important.

The Government keep assuring us that we do not need to worry, but the Bill gives Ofcom the power to require services to install tools that would require the surveillance of encrypted communications regarding child exploitation and terrorism content, for example. Advocates and people on my side argue that this is not possible without undermining encryption because, just as you cannot be half pregnant, you cannot be half encrypted once you install tools for scanning for certain content. There is a danger that we say, “We’re only doing it for those things”, but actually it would be an attack on encryption itself.

Unlike the noble Baroness, Lady Bennett of Manor Castle, I know nothing about the technical aspects of this, as noble Lords can hear from the way I am speaking about it. But I can see from a common-sense point of view what encryption is: you cannot say, “We’re only going to use it a little bit”. That is my point.

I want to tackle the issue of child abuse, because I know that it lurks around here. It is what really motivates the people who say, “It’s ok as long as we can deal with that”. Scanning is put forward as a solution to the problem of encrypted chat services being used to share material of that nature, and to the question of what we can do about it. Of course I stress that images of child abuse and exploitation are abhorrent—that is a very important background to this conversation—but I want to draw attention to the question of what we are prepared to do about child abuse, because I think it was referred to in an earlier group. I am nervous that this Bill is being promised as a silver bullet that will solve it all through some of these measures.

--- Later in debate ---
Baroness Kidron (CB)

No one in the Committee or anyone standing behind us who speaks up for children thinks that this is going to be a silver bullet. It is unacceptable to suggest that we take that position. Much child abuse takes place offline and is then put online, but the exponential way in which it is consumed, created, and spread is entirely new because of the services we are talking about. Later in Committee I will explain some of the new ways in which it is creating child abuse—new forms, new technologies, new abuse.

I am sorry to interrupt the noble Baroness. I have made my feelings clear that I am not an end-to-end encryption “breaker”. There are amendments covering this; I believe some of them will come up later in the name of the noble Lord, Lord Russell, on safety by design and so on. I also agree with the noble Baroness that we need more resources in this area for the police, teachers, social workers and so on. However, I do not want child sexual abuse to be a football in this conversation.

Baroness Fox of Buckley (Non-Afl)

I agree with the noble Baroness, which is precisely why I am suggesting that we need to consider whether privacy should be sacrificed totally in relation to the argument around encryption. It is difficult, and I feel awkward saying it. When I mentioned a silver bullet I was not talking about the noble Baroness or any other noble Lords present, but I have heard people say that we need this Bill because it will deal with child abuse. In this group of amendments, I am raising the fact that when I have talked about encryption with people outside of the House they have said that we need to do something to tackle the fact that these messages are being sent around. It is not just child abuse; it is also terrorism. There is a range of difficult situations.

Things can go wrong with this, and that is what I was trying to raise. For example, we have a situation where some companies are considering using, or are being asked to use, machine learning to detect nudity. Just last year, a father lost his Google account and was reported to the police for sending a naked photo of his child to the doctor for medical reasons. I am raising these as examples of the problems that we have to consider.

Child abuse is so abhorrent that we will do anything to protect children, but let me say this to the Committee, as it is where the point on privacy lies: children are largely abused in their homes, but as far as I understand it we are not as yet arguing that the state should put CCTV cameras in every home for 24/7 surveillance to stop child abuse. That does not mean that we are glib or that we do not understand the importance of child abuse; it means that we understand the privacy of your home. There are specialist services that can intervene when they think there is a problem. I am worried about the possibility of putting a CCTV camera in everyone’s phone, which is the danger of going down this route.

My final point is that these services, such as WhatsApp, will potentially leave the UK. It is important to note that. I agree with the noble Lord, Lord Allan: this is not like threatening to storm off. It is not done in any kind of pique in that way. In putting enormous pressure on these platforms to scan communications, we must remember that they are global platforms. They have a system that works for billions of people all around the world. A relatively small market such as the UK is not something for which they would compromise their billions of users around the world. As I have explained, they would not put up with it if the Chinese state said, “We have to see people’s messages”. They would just say, “We are encrypted services”. They would walk out of China and we would all say, “Well done”. There is a real, strong possibility of these services leaving the UK so we must be very careful.

Lord Allan of Hallam (LD)

I just want to add to the exchange between the noble Baronesses, Lady Kidron and Lady Fox. The noble Baroness, Lady Fox, referred to WhatsApp’s position. Again, it is important for the public out there also to understand that if someone sends them illegal material—in particular child sexual abuse material; I agree with the noble Baroness, Lady Kidron, that this is a real problem—and they report it to WhatsApp, which has a reporting system, that material is no longer encrypted. It is sent in clear text and WhatsApp will give it to the police. One of the things I am suggesting is that, rather than driving WhatsApp out of the country, because it is at the more responsible end of the spectrum, we should work with it to improve these kinds of reporting systems and put the fear of God into people so that they know that this issue is not cost-free.

As a coda to that, if you ever receive something like that, you should report it to the police straightaway because, once it is on your phone, you are liable and you have a problem. The message from here should be: if you receive it, report it and, if it is reported, make sure that it gets to the police. We should be encouraging services to put those systems in place.

Baroness Fox of Buckley (Non-Afl)

The noble Lord has concluded with my conclusion, which was to say that those services will be driven out, but not because they are irresponsible around horrible, dangerous messages. They do not read our messages because they are private. However, if we ever receive anything that makes us feel uncomfortable, they should be put under pressure to act. Many of them already do and are actually very responsible, but that is different from demanding that they scan our messages and we breach that privacy.

Baroness Stowell of Beeston (Con)

My Lords, that last exchange was incredibly helpful. I am grateful to the noble Lord, Lord Allan, for what he just said and the way in which he introduced this group. I want to make only a few brief remarks.

I have put my name to two amendments in this group: Amendment 202 in the name of the noble Lord, Lord Stevenson, which seeks to ensure that Ofcom will be subject to the same kind of requirements and controls as exist under the Regulation of Investigatory Powers Act before issuing a technology notice

“to a regulated service which offers private messaging with end-to-end encryption”;

and Amendment 285, also in the name of the noble Lord, Lord Stevenson, and that of the noble Lord, Lord Clement-Jones. This amendment would make sure that no social media platforms or private end-to-end messaging services have an obligation generally to monitor what is going on across their platforms. When I looked at this group and the various amendments in it, those were the two issues that I thought were critical. These two amendments seemed to approach them in the most simple and straightforward manner.

Like other noble Lords, my main concern is that I do not want search and social media platforms to have an obligation to become what we might describe as thought police. I do not want private messaging firms to start collecting and storing the content of our messages so that they have what we say ready to hand over in case they are required to do so. What the noble Lord, Lord Allan, just said is an important point to emphasise. Some of us heard from senior representatives of WhatsApp a few weeks ago. I was quite surprised to learn how much they are doing in this area to co-operate with the authorities; I felt very reassured to learn about that. I in no way want to discourage that, because they are doing an awful lot of good work.

Basically, this is such a sensitive matter, as has been said, that it is important for the Government to be clear what their policy intentions are by being clear in the Bill. If they do not intend to require general monitoring, that needs to be made explicit. It is also important that, if Ofcom is to be given new investigatory powers or powers to insist on things through these technology notices, it is clear that its powers do not go beyond those that are already set out in law. As we have heard from noble Lords, there is widespread concern about this matter not just from the social media platforms and search engines themselves but from news organisations, journalists and those lobby groups that often speak out on liberty-type matters. These topics go across a wide range of interest groups, so I very much hope that my noble friend the Minister will be able to respond constructively and open-mindedly on them.

Lord Knight of Weymouth (Lab)

My Lords, I was not intending to intervene on this group because my noble friend Lord Stevenson will address these amendments in their entirety, but listening in to this public conversation about this group of amendments has stimulated a question that I want both to put on the record and to give the Minister time to reflect on.

If we get the issues of privacy and encrypted messaging wrong, it will push more people into using VPN—virtual private network—services. I went into the app store on my phone to search for VPN software. There is nothing wrong with such software—our parliamentary devices have it to do general monitoring and make sure that we do not use services such as TikTok—but it is used to circumnavigate much of the regulatory regime that we are seeking to put together through this Bill. When I search for VPNs in the app store, the first one that comes up that is not a sponsored, promoted advertisement has an advisory age limit of four years old. Several of them are the same; some are 17-plus but most are four-plus. Clearly, the app promotes itself very much on the basis that it offers privacy and anonymity, which are the key features of a VPN. However, a review of it says, “I wouldn’t recommend people use this because it turns out that this company sends all its users’ data to China so that it can do general monitoring”.

I am not sure how VPNs are being addressed by the Bill, even though they seem really pertinent to the issues of privacy and encryption. I would be interested to hear whether—and if we are, how—we are bringing the regulation and misuse of VPNs into scope for regulation by Ofcom.

Baroness Fox of Buckley (Non-Afl)

My Lords, I would like to say something very quickly on VPN. I had a discussion with some teenagers recently, who were all prepared for this Bill—I was quite surprised that they knew a lot about it. They said, “Don’t worry, we’ve worked out how to get around it. Have you heard of VPN?” It reminded me of a visit to China, where I asked a group of students how they dealt with censorship and not being able to google. They said, “Don’t worry about it”, and showed me VPN. It is right that we draw attention to that. There is a danger of inadvertently forcing people on to the unregulated dark web and into areas that we might not imagine. That is why we have to be careful and proportionate in our response.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

I did, and I am happy to say it again: yes.

Baroness Fox of Buckley (Non-Afl)

Perhaps I might go back to an earlier point. When the Minister said the Government want to make sure, I think he was implying that certain companies would try to avoid obligations to keep their users safe by threatening to leave or whatever. I want it to be clear that the obligations to the users of the service are, in the instance of encrypted services, to protect their privacy, and users see that as keeping them safe. It would be wrong to present those as polar opposites. I think that companies that run encrypted services believe that to be their duty—so that in a way is a clash.

Secondly, I am delighted by the clarity of the Minister’s “yes” answer, but maybe there needs to be clearer communication with people outside this Chamber. People are worried that the duties placed on Ofcom to enact certain things could lead to some breach of encryption. No one thinks that the Government intend to do this or want to spy on anyone, but the unintended consequences of the duty on Ofcom might have that effect. If that is not going to be the case, and the Government can guarantee it and make that clear, it would reassure not just the companies but the users of messaging services, which would be helpful.

Lord Parkinson of Whitley Bay (Con)

The points the noble Baroness has just made bring me neatly to what I was about to say in relation to the question raised earlier by the noble Lord, Lord Knight of Weymouth. But first, I would say that Ofcom as a public body is subject to public law principles already, so those apply in this case.

The noble Lord, Lord Knight, asked about virtual private networks and the risk of displacing people on to VPNs or other similar alternatives. That is a point worth noting, not just in this group but as we consider all these amendments, particularly when we talk later on about age verification, pornography and so on. Services will need to think about how safety measures could be circumvented and take steps to prevent that, because they need to mitigate risk effectively. There may be a role for enforcement action, too; Ofcom will be able to apply to the courts to require these services, where appropriate, to apply business disruption measures. We should certainly be mindful of the incentives for people to do that, and the example the noble Lord, Lord Knight, gave earlier is a useful lesson in the old adage “caveat emptor” when looking at some of these providers.

I want to say a little bit about Amendments 205A and 290H in my name. Given the scale of child sexual abuse and exploitation that takes place online, and the reprehensible nature of these crimes, it is important that Ofcom has effective powers to require companies to tackle it. This brings me to these government amendments, which make small changes to the powers in Clause 110 to ensure that they are effective. I will focus particularly, in the first instance, on Amendment 290H, which ensures that Ofcom considers whether a service has features that allow content to be shared widely via another service when deciding whether content has been communicated publicly or privately, including for the purposes of issuing a notice. This addresses an issue highlighted by the Independent Reviewer of Terrorism Legislation, Jonathan Hall, and Professor Stuart Macdonald in a recent paper. The separate, technical amendment, Amendment 205A, clarifies that Clause 110(7) refers only to a notice on a user-to-user service.

Amendment 190 in the name of the noble Lord, Lord Clement-Jones, seeks to introduce a new privacy duty on Ofcom when considering whether to use any of its powers. The extensive privacy safeguards that I have already set out, along with Ofcom’s human rights obligations, would make this amendment unnecessary. Ofcom must also explicitly consult persons whom it considers to have expertise in the enforcement of the criminal law and the protection of national security, so far as relevant to online safety matters, in the course of preparing its draft codes. This may include the integrity and security of internet services where relevant.

Amendments 202 and 206, in the name of the noble Lord, Lord Stevenson of Balmacara, and Amendments 207, 208, 244, 246, 247, 248, 249 and 250 in the name of the noble Lord, Lord Clement-Jones, all seek to deliver privacy safeguards to notices issued under Clause 110 through additional review and appeals processes. There are already strong safeguards concerning this power. As part of the warning notice process, companies will be able to make representations to Ofcom which it is bound to consider before issuing a notice. Ofcom must also review any notice before the end of the period for which it has effect.

Amendment 202 proposes mirroring the safeguards of the Investigatory Powers Act when issuing notices to encrypted messaging services under this power. First, this would be inappropriate, because the powers in the Investigatory Powers Act serve different purposes from those in this Bill. The different legal safeguards in the Investigatory Powers Act reflect the potential intrusion by the state into an individual’s private communications; that is not the case with this Bill, which does not grant investigatory powers to state bodies, such as the ability to intercept private communications. Secondly, making a reference to encryption would be—

Online Safety Bill

Baroness Kidron (CB)

I beg the pardon of the Committee. I asked about it and was misinformed; I will do as the noble Baroness says.

The noble Viscount, Lord Colville, is unable to be with us. He put his name to Amendments 273, 275, 277 and 280. His concern is that the Bill sets the threshold for illegality too low and that in spite of the direction provided by Clause 170, the standards for determining illegality are too vague.

I will make a couple of points on that thought. Clause 170(6) directs that a provider must have

“reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied”,

but that does not mean that the platform has to be certain that the content is illegal before it takes it down. This is concerning when you take it in combination with what or who will make judgments on illegality.

If a human moderator makes the decision, it will depend on the resources and time available to them as to how much information they gather in order to make that judgment. Unlike in a court case, when a wide range of information and context can be gathered, when it comes to decisions about content online, these resources are very rarely available to human moderators, who have a vast amount of content to get through.

If an automated system makes the judgment, it is very well established that algorithms are not good at context—the Communications and Digital Committee took evidence on this repeatedly when I was on it. AI simply uses the information available in the content itself to make a decision, which can lead to significant missteps. Clause 170(3) provides the requirement for the decision-makers to judge whether there is a defence for the content. In the context of algorithms, it is very unclear how they will come to such a judgment from the content itself.

I understand that these are probing amendments, but I think the concern is that the vagueness of the definition will lead to too much content being taken down. This concern was supported by Parliament’s Joint Committee on Human Rights, which wrote to the former Culture Secretary, Nadine Dorries, on that matter. I apologise again.

Baroness Fox of Buckley (Non-Afl)

My Lords, I support the amendments in this group that probe how removing illegal material is understood and will be used under the Bill. The noble Lord, Lord Moylan, explained a lot of my concerns, as indeed did the noble Viscount, Lord Colville, via his avatar. We have heard a range of very interesting contributions that need to be taken seriously by the Government. I have put my name to a number of amendments.

The identification of illegal material might be clear and obvious in some cases—even many cases. It sounds so black and white: “Don’t publish illegal material”. But defining communications of this nature can be highly complex, so much so that it is traditionally reserved for law enforcement bodies and the judicial system. We have already heard from the noble Lord, Lord Moylan, that, despite Home Secretaries, this House, regulations and all sorts of laws having indicated that non-crime hate incidents, for example, should not be pursued by the police, they continue to pursue them as though they are criminal acts. That is exactly the kind of issue we have.

--- Later in debate ---
Lord Allan of Hallam (LD)

In talking about individuals and investigations, the noble Baroness reminded me of one class of content where we do have clarity, and that is contempt of court. That is a frequent request. We know that it is illegal in that case because a judge writes to the company and says, “You must not allow this to be said because it is in contempt of court”, but that really is the exception. In most other cases, someone is saying, “I think it is illegal”. In live proceedings, in most cases it is absolutely clear because a judge has told you.

Baroness Fox of Buckley (Non-Afl)

That is very helpful.

I am concerned that removing so-called illegal content for the purpose of complying with the regulatory system covers not only that which reaches conviction in a criminal court but possibly anything that a platform determines could be illegal, and therefore it undermines our own legal system. As I have said, that marks a significant departure from the rule of law. It seems that the state is asking or mandating private companies to make determinations about what constitutes illegality.

The obligations on a platform to determine what constitutes illegality could obviously become a real problem, particularly in relation to limitations on free expression. As we have already heard, the Public Order Act 1986 criminalises, for example, those who stir up hatred through the use of words, behaviour or written material. That is contentious in the law offline. By “contentious”, I mean that it is a matter of difficulty that requires the full rigour of the criminal justice system and an understanding of the whole history of established case law. That is all necessary to secure a conviction under that law for offences of this nature.

Now we appear to be saying that, without any of that, social media companies should make the decision, which is a nerve-racking situation to be in. We have already heard the slippery phrase “reasonable grounds to infer”. If that was the basis on which you were sent to prison—if they did not have to prove that you were guilty but they had reasonable grounds to infer that you might be, without any evidence—I would be worried, yet reasonable grounds to infer that the content could be illegal is the basis on which we are asking for those decisions to be made. That is significantly below the ordinary burden of proof required to determine that an illegal act has been committed. Under this definition, I fear that platforms will be forced to over-remove and censor what ultimately will be entirely lawful speech.

Can the Minister consider what competency social media companies have to determine what is lawful? We have heard some of the dilemmas from somebody who was in that position—let alone the international complications, as was indicated. Will all these big tech companies have to employ lots of ex-policemen and criminal lawyers? How will it work? It seems to me that there is a real lack of qualifications in that sphere—that is not a criticism, because those people decided to work in big tech, not in criminal law, and yet we are asking them to pursue this. That is a concern.

I will also make reference to what I think are the controversies around government Amendments 136A and 136B to indicate the difficulties of these provisions. They concern illegal activity—such as “assisting unlawful immigration”, illegal entry, human trafficking and similar offences—but I am unsure as to how this would operate. While it is the case that certain entrances to the UK are illegal, I suddenly envisage a situation where a perfectly legitimate political debate—for example, about the small boats controversy—would be taken down, and that people advocating for a position against the Government’s new Illegal Migration Bill could be accused of supporting illegality. What exactly will be made illegal in those amendments to the Online Safety Bill?

The noble Baroness, Lady Buscombe, made a fascinating speech about an interesting group of amendments. Because of the way the amendments are grouped, I feel that we have moved to a completely different debate, so I will not go into any detail on this subject. Anonymous trolling, Twitter storms and spreading false information are incredibly unpleasant. I am often the recipient of them—at least once a week—so I know personally that you feel frustrated that people tell lies and your reputation is sullied. However, I do not think that these amendments offer the basis on which that activity should be censored, and I will definitely argue against removing anonymity clauses—but that will be in another group. It is a real problem, but I do not think that the solution is contained in these amendments.

Baroness Stowell of Beeston (Con)

My Lords, my contribution will be less officious than my intervention earlier in this group. In the last couple of years since I returned to the House—as I describe it—having spent time at the Charity Commission, I have noticed a new practice emerging of noble Lords reading out other people’s speeches. Every time I had seen it happen before, I had not said anything, but today I thought, “I can’t sit here and not say anything again”. I apologise for my intervention.

I am grateful to my noble friend Lord Moylan for bringing forward his amendments and for introducing them in the incredibly clear way he did; they cover some very complex and diverse issues. I know that there are other amendments in the group which might be described as similar to his.

There are a couple of things I want to highlight. One interesting thing about the debate on this group is the absence of some of our legal friends—I apologise to my noble friend Lady Buscombe, who is of course a very distinguished lawyer. The point I am making is that we are so often enriched by a lot of legal advice and contributions on some of the more challenging legal issues that we grapple with, but we do not have that today, and this is a very difficult legal issue.

It is worth highlighting again, as has been touched on a little in some of the contributions, the concern, as I understand it, with how the Bill is drafted in relation to illegal content and the potential chilling effect of these clauses on social media platforms. As has already been said, there is a concern that it might lead them to take a safety-first approach in order to avoid breaking the law and incurring the sanctions and fines that come with the Bill, which Ofcom will have the power to apply. That is the point we are concerned with here. It is the way in which this is laid out, and people who are much better equipped than I am have already explained the difference between evidence versus reasonable grounds to infer.

What the noble Lord, Lord Allan, hit on in his contribution is also worth taking into account, and that is the role of Ofcom in this situation. One of the things I fear, as we move into an implementation phase and the consequences of the Bill start to impact on the social media firms, is the potential for the regulator to be weaponised in a battle on the cultural issues that people are becoming increasingly exercised about. I do not have an answer to this, but I think it is important to understand the danger of where we might get to in the expectations of the regulator if we create a situation where the social media platforms are acting in a way that means people are looking for recourse or a place to generate further an argument and a battle that will not be helpful at all.

I am not entirely sure, given my lack of legal expertise —this is why I would have been very grateful for some legal expertise on this group—whether what my noble friend is proposing in his amendments is the solution, but I think we need to be very clear that this is a genuine problem. I am not sure, as things stand in the Bill, that we should be comfortable that it is not going to create problems. We need to find a way to be satisfied that this has been dealt with properly.

--- Later in debate ---
His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services—
Baroness Fox of Buckley (Non-Afl)

I want to clarify one point. I have had a slightly different experience, which is that for many people—women, at least—whom I have talked to recently, there is an over-enthusiasm and an over-zealous attitude to policing the speech of particular women and, as we have already heard, gender-critical women. It is often under the auspices of hate speech and there is all sorts of discussion about whether the police are spending too long trawling through social media. By contrast, if you want to get a policeman or policewoman involved in a physical crime in your area, you cannot get them to come out. So I am not entirely convinced. I think policing online speech at least is taking up far too much of the authorities’ time, not too little time, and distracting them from solving real social and criminal activity.

Lord Bethell (Con)

I defer to the noble Baroness, Lady Fox, on speech crime. That is not the area of my expertise, and it is not the purpose of my points. My points were to do with the kinds of crime that affect children in particular. His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services is very specific about that point. It says that “unacceptable delays are commonplace” and it gives a very large number of case studies. I will not go through them now because it is Thursday afternoon, but I think noble Lords can probably imagine the kinds of things we are talking about. They include years of delay, cases not taken seriously or overlooked, evidence lost, and so forth. The report found that too often children were put at risk because of this, and offenders were allowed to escape justice, and it gave 17 recommendations for how the police force should adapt in order to meet this challenge.

So my questions to the Minister are these. When we talk about things such as age verification for hardcore porn, we are quite often told that we do not need to worry about some of this because it is covered by illegal content provisions, and we should just leave it to the police to sort out. His Majesty’s Inspectorate gives clear evidence—this is a recent report from last month—that this is simply not happening in the way it should be. I therefore wondered what, if anything, is in the Bill to try to close down this particular gap. That would be very helpful indeed.

If it is really not for the purposes of this Bill at all—if this is actually to do with other laws and procedures, other departments and the way in which the resources for the police are allocated, as the noble Baroness, Lady Fox, alluded to—what can the Government do outside the boundaries of this legislation to mobilise the police and the prosecution services to address what I might term “digital crimes”: that is, crimes that would be followed up with energy if they occurred in the real world but, because they are in the digital world, are sometimes overlooked or forgotten?

--- Later in debate ---
As the noble Baroness, Lady Kidron, and others have argued, these harms are often cumulative and interrelated. The social media companies are the only ones not looking through a keyhole but monitoring social media in the round and able to assess what is happening, but evidence suggests that they will not do so until compelled by legislation. These amendments are a vital step forward in fulfilling the Bill’s purpose of providing additional protection from harm for children. I urge the Government to adopt them.
Baroness Fox of Buckley (Non-Afl)

My Lords, I really appreciated the contribution from the noble Baroness, Lady Ritchie of Downpatrick, because she asked a lot of questions about this group of amendments. Although I might be motivated by different reasons, I found it difficult to fully understand the impact of the amendments, so I too want to ask a set of questions.

Harm is defined in the Bill as “physical or psychological harm”, and there is no further explanation. I can understand the frustration with that and the attempts therefore to use what are described as the

“widely understood and used 4 Cs of online risk to children”.

They are not widely understood by me, and I have ploughed my way through it. I might well have misunderstood lots in it, but I want to look at and perhaps challenge some of the contents.

I was glad that Amendment 20 recognises the level of risk of harm to different age groups. That concerns me all the time when we talk about children and young people and then end up treating four year-olds, 14 year-olds and 18 year-olds as if they were the same. I am glad that that is there, and I hope that we will look at it again in future.

I want to concentrate on Amendment 93 and reflect and comment more generally on the problem of a definition, or lack of definition, of harm in the Bill. For the last several years that we have been considering bringing this Bill to this House and to Parliament, I have been worried about the definition of psychological harm. That is largely because this category has become ever more expansive and quite subjective in our therapeutic age. It is a matter of some discussion and quite detailed work by psychologists and professionals, who worry that there is an ever-expanding concept of what is considered harmful and what psychological harm really means.

As an illustration, I was invited recently to speak to a group of sixth-formers and was discussing things such as trigger warnings and so on. They said, “Well, you know, you’ve got to understand what it’s like”—they were 16 year-olds. “When we encounter certain material, it makes us have PTSD”. I was thinking, “No, it doesn’t really, does it?” Post-traumatic stress disorder is something that you might well gain if you have been in the middle of a war zone. The whole concept of triggering came from psychological and medical insights from the First World War, which you can understand. If you hear a car backfiring, you think it is somebody shooting at you. But the idea here is that we should have trigger warnings on great works of literature and that if we do not it will lead to PTSD.

I am not being glib, because an expanded, elastic and pathologised view of harm is being used quite cavalierly and casually in relation to young people and protecting them, often by the young people themselves. It is routinely used to close down speech as part of the cancel culture wars, which, as noble Lords know, I am interested in. Is there not a danger that this concept of harm is not as obvious as we think, and that the psychological harm issue makes it even more complicated?

The other thing is that Amendment 93 says:

“The harms in this Schedule are a non-exhaustive list of categories and other categories may be relevant”.


As with the discussion on whose judgment decides the threshold for removing illegal material, I think that judging what is harmful is even more tricky for the young in relation to psychological harm. I was reminded of that when the noble Baroness, Lady Kidron, complained that what she considered to be obviously and self-evidently harmful, Meta did not. I wondered whether that is just the case with Meta, or whether views will differ when it comes to—

Baroness Kidron (CB)

The report found—I will not give a direct quotation—that social media contributed to the death of Molly Russell, so it was the court’s judgment, not mine, that Meta’s position was indefensible.

Baroness Fox of Buckley (Non-Afl)

I completely understand that; I was making the point that there will be disagreements in judgments. In that instance, it was resolved by a court, but we are talking about a situation where I am not sure how the judgment is made.

In these amendments, there are lists of particular harms—a variety are named, including self-harm—and I wanted to provide some counterexamples of what I consider to be harms. I have been inundated by algorithmic adverts for “Naked Education” on Channel 4, maybe because of the algorithms I am on. I think that the programme is irresponsible; I say that having watched it, rather than just having read a headline. Channel 4 presents this programme, featuring naked adults and children, as educational, saying that it introduces children to the naked body. I think it is harmful for children and that it should not be on the television, but it is advertised on social media—I have seen quite a lot of it.

The greatest example of self-harm we encounter at present is when gender dysphoric teenagers—as well as some younger than teenagers; they are predominately young women—are affirmed by adults, as a kind of social contagion, into taking body-changing and body-damaging hormones and performing self-mutilation, whether by breast binding or double mastectomies, which is advertised and praised by adults. That is incredibly harmful for young people, and it is reflected online a lot, because much of this is discussed, advertised or promoted online.

This is related to the earlier contributions, because I am asking: should those be added to the list of obvious harms? Although not many noble Lords are in the House now, if there were many more here, they would object to what I am saying by stating, “That is not harmful at all. What is harmful is what you’re saying, Baroness Fox, because you’re causing psychological harm to all those young people by being transphobic”. I am raising these matters because we think we all agree that there is a consensus on what is harmful material online for young people, but it is not that straightforward.

The amendment states that the Bill should target any platform that posts

“links to, or … encourages child users to seek”

out “dangerous or illegal activity”. I understand “illegal activity”, but on “dangerous” activities, I assume that we do not mean extreme sports, mountain climbing and so on, which are dangerous—that comes to mind probably because I have spent too much time with young people who spend their whole time looking at those things. I worry about the unintended consequences of things being banned or misinterpreted in that way.

Lord Russell of Liverpool (CB)

To respond briefly to the noble Baroness, I shall give a specific example of how Amendment 93 would help. Let us go back to the coroner’s courtroom where the parents of Molly Russell were trying to get the coroner to understand what had happened to their daughter. The legal team from Meta was there, with combined salaries probably in seven figures, and the argument was about the detail of the content. At one point, I recall Ian Russell saying that one of the Meta lawyers said, “We are topic agnostic”. I put it to the noble Baroness that, had the provisions in Amendment 93 been in place, first, under “Content harms” in proposed new paragraph 3(c) and (d), Meta would have been at fault; under “Contact harms” in proposed new paragraph 4(b), Meta would have been at fault; under “Conduct harms” in proposed new paragraph 5(b), Meta would have been at fault; and under “Commercial harms” in proposed new paragraph 6(a) and (b), Meta would have been at fault. That would have made things a great deal simpler.

Baroness Fox of Buckley (Non-Afl)

I appreciate that this is the case we all have in the back of our minds. I am asking whether, when Meta says it is content agnostic, the Bill is the appropriate place for us to list the topics that we consider harmful. If we are to do that, I was giving examples of contentious, harmful topics. I might have got this wrong—

Baroness Kidron (CB)

I will answer the noble Baroness more completely when I wind up, but I just want to say that she is missing the point of the schedule a little. Like her, I am concerned about the way we concentrate on content harms, but she is bringing it back to content harms. If she looks at it carefully, a lot of the provisions are about contact and conduct: it is about how the system is pushing children to do certain things and pushing them to certain places. It is about how things come together, and I think she is missing the point by keeping going back to individual pieces of content. I do not want to take the place of the Minister, but this is a systems and processes Bill; it is not going to deal with individual pieces of content in that way. It asks, “Are you creating these toxic environments for children? Are you delivering this at scale?” and that is the way we must look at this amendment.

Baroness Fox of Buckley (Non-Afl)

I will finish here, because we have to get on, but I did not introduce content; it is in the four Cs. One of the four Cs is “content” and I am reacting to amendments tabled by the noble Baroness. I do not think I am harping on about content; I was responding to amendments in which content was one of the key elements.

Baroness Kidron (CB)

Let us leave it there.

Online Safety Bill

Again, I say to my noble friend Lord Moylan—who I encourage to keep going with his scepticism about the Bill; it is important—that it is a bit of a dead end at any point in his argument to compare us with China. That is genuinely comparing apples with oranges. When people were resisting regulation in this sphere, they would always say, “That’s what the Chinese want”. We have broadcasting regulation and other forms of health and safety regulation. It is not the mark of an autocratic or totalitarian state to have regulation; platforms need to be held to account. I simply ask the proponents of the amendments to make it clear as they proceed how this fits in with existing regulations, such as the age-appropriate design code.
Baroness Fox of Buckley (Non-Afl)

My Lords, I want, apart from anything else, to speak in defence of philosophical ruminations. The only way we can scrutinise the amendments in Committee is to do a bit of philosophical rumination. We are trying to work out what the amendments might mean in terms of changing the Bill.

I read these amendments, noted their use of “eliminate” —we have to “eliminate” all risks—and wondered what that would mean. I do not want to feel that I cannot ask these kinds of difficult questions for fear that I will offend a particular group or that it would be insensitive to a particular group of parents. It is difficult but we are required as legislators to try to understand what each other are trying to change, or how we are going to try to change the law.

I say to those who have put “eliminate” prominently in a number of these amendments that it is impossible to eliminate all risks to children—is it not?—if they are to have access to the online world, unless you ban them from the platforms completely. Is “eliminate” really helpful here?

Previously in Committee, I talked a lot about the potential dangers, psychologically and with respect to development, of overcoddling young people, of cotton wool kids, and so on. I noted an article over the weekend by the science journalist Tom Chivers, which included arguments from the Oxford Internet Institute and various psychologists that the evidence on whether social media is harmful, particularly for teenagers, is ambiguous.

I am very convinced by the examples brought forward by the noble Baroness, Lady Kidron—and I too wish her a happy birthday. We all know about the targeting of young people and so forth, but I am also aware of the positives. I always try to balance these things out and make sure that we do not deny young people access to the positives. In fact, I found myself cheering at the next group of amendments, which is unusual. First, they depend on whether you are four or 14—in other words, you have to be age-specific—and, secondly, they recognise that we do not want to pass anything in the Bill that actually denies children access to either their own privacy or the capacity to know more.

I also wanted to explore a little the idea of expanding the debate away from content to systems, because this is something that I think I am not quite understanding. My problem is that moving away from the discussion on whether content is removed or accessible, and focusing on systems, does not mean that content is not in scope. My worry is that the systems will have an impact on what content is available.

Let me give some examples of things that can become difficult if we think that we do not want young people to encounter violence and nudity—which makes it seem as though we know what we are talking about when we talk about “harmful”. We will all recall that, in 2018, Facebook removed content from the Anne Frank Centre posted by civil rights organisations because it included photographs of the Holocaust featuring undressed children among the victims. Facebook apologised afterwards. None the less, my worry is about these kinds of things happening. Another example, in 2016, was the removal of the Pulitzer Prize-winning photograph “The Terror of War”, featuring fleeing Vietnamese napalm victims in the 1970s, because the system thought it was something dodgy, given that the photo was of a naked child fleeing.

I need to understand how system changes will not deprive young people of important educational information such as that. That is what I am trying to distinguish. The point made by the noble Lord, Lord Moylan, about “harmful” not being defined—I have endlessly gone on about this, and will talk more about it later—is difficult because we think that we know what we mean by “harmful” content.

Finally, on the amendments requiring compliance with Ofcom codes of practice, that would give an extraordinary amount of power to the regulator and the Secretary of State. Since I have been in this place, people have rightly drawn my attention to the dangers of delegating power to the Executive or away from any kind of oversight—there has been fantastic debate and discussion about that. It seems to me that these amendments advocate delegated powers being given to the Secretary of State and Ofcom, an unelected body—the Secretary of State could amend for reasons of public policy in order to protect children—and this is to be put through the negative procedure. In any other instance, I would have expected outcry from the usual suspects, but, because it involves children, we are not supposed to object. I worry that we need to have more scrutiny of such amendments and not less, because in the name of protecting children unintended consequences can occur.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

I want to answer the point by saying that these amendments cannot be seen in isolation. Noble Lords will remember that we had a long and good debate about what constituted harms to children. There was a big argument and the Minister made some warm noises in relation to putting harms to children in the Bill. There is some alignment among many people in the Chamber whereby we and Parliament would like to determine what harm is, and I very much share the noble Baroness’s concern about pointing out what that is.

On the issue of the system versus the content, I am not sure that this is the exact moment, but the idea of unintended consequences keeps getting thrown up when we talk about trying to point the finger at what creates harm. There are unintended consequences now, except that neither Ofcom, the Secretary of State nor Parliament but only the tech sector has a say in what the unintended consequences are. As someone who has been bungee jumping, I am deeply grateful that there are very strict rules under which that is allowed to happen.

--- Later in debate ---
Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Parliament Live - Hansard - -

My Lords, I welcome many of these amendments. I found reading them slightly more refreshing than the more dystopian images we have had previously. It is quite exciting, actually, because the noble Baroness, Lady Harding, sounded quite upbeat, which is in contrast to previous contributions on what the online world is like.

I want to defend the noble Baroness, Lady Bennett of Manor Castle, from the intervention that suggested that she was going off topic, because the truth is that these amendments are calling for children’s rights to be introduced into legislation via this Bill. I disagree with that, but we should at least talk about it if it is in the amendments.

While I like the spirit of the amendments, it seems to me that children’s rights, which I consider to have huge constitutional implications, require a proper Bill to bring them in and should not be latched on to this one. My concern is that children’s rights can be used to undermine adult authority and are regularly cited as a way of undermining parents’ rights, and that children under 18 cannot enact political rights. Whether they have agency or capacity, they are not legally able to exercise their political rights, and therefore someone has to act on their behalf as an intermediary—as a third party—which is why it can become such a difficult, politicised area.

I say that because it would be a fascinating discussion to have. I do not think this is the Bill to have it on, but the spirit of the amendments raises issues that we should bear in mind for the rest of our discussion. During lockdown, we as a society stopped young people having any social interaction at all. They were isolated, and a lot of new reports suggest that young people’s mental health has suffered because they were on their own. They went online and, in many instances, it kept them sane. That is probably true not just of young people but of the rest of us, by the way, but I am making the point that it was not all bad.

Over recent years, as we have been concerned about children’s safety and protecting them, we have discouraged them from roaming far from home. They do not go out on their bikes or run around all the time; they are told, “Come back home, you’ll be safe”. Of course, they have gone into their room and gone online, and now we say, “That’s not safe either”.

I want to acknowledge that the online world has helped young people overcome the problems of isolation and lack of community that the adult world has sometimes denied them developing. That is important: it can be a source of support and solidarity. Children need spaces to talk, engage and interact with friends, mates, colleagues and so on where they can push boundaries, and all sorts of things, without grown-ups interfering. That is what we have always understood from child development. It is why you do not have spies wandering around all the time following them.

The main thing is that we know the difference between a four year-old and a 14 year-old. In the Bill, we call a child anyone under 18, but I was glad that the amendments acknowledge that that distinction is important in terms of appropriateness. When young people are online, or if they are involved in encrypted messaging, such as WhatsApp, that does not mean they are all planning to join county lines or are being groomed—it is not all dodgy. Appropriateness in terms of a child’s age, and not always imagining that the worst is happening, are an important counter that these amendments bring to some of the pessimism that we have heard until now.

The noble Lord, Lord Russell, said that children’s rights are not mentioned in the Bill but freedom of expression has been mentioned 49 times. First, it is not a Bill about children’s rights, but when he says that freedom of expression has been mentioned 49 times, I assure him that quantity is not quality and the mention of it means nothing.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - - - Excerpts

I want to challenge the noble Baroness’s assertion that the Bill is not about children’s rights. Anyone who has a teenage child knows that their right to access the internet is keenly held and fought out in every household in the country.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - -

The quip works, but political rights are not quips. Political rights have responsibilities, and so on. If we gave children rights, they would not be dependent on adults and adult society. Therefore, it is a debate; it is a row about what our rights are. Guess what. It is a philosophical row that has been going on all around the world. I am just suggesting that this is not the place—

Baroness Bennett of Manor Castle Portrait Baroness Bennett of Manor Castle (GP)
- Hansard - - - Excerpts

I am sorry, but I must point out that 16 and 17 year-olds in Scotland and Wales have the vote. That is a political right.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - -

And it has been highly contentious whether the right to vote gives them independence. For example, you would still be accused of child exploitation if you did anything to a person under 18 in Scotland or Wales. In fact, if you were to tap someone and it was seen as slapping in Scotland and they were 17, you would be in trouble. Anyway, it should not be in this Bill. That is my point.

Baroness Finlay of Llandaff Portrait Baroness Finlay of Llandaff (CB)
- Parliament Live - Hansard - - - Excerpts

My Lords, perhaps I may intervene briefly, because Scotland and Wales have already been mentioned. My perception of the Bill is that we are trying to build something fit for the future, and therefore we need some broad underlying principles. I remind the Committee that the Well-being of Future Generations (Wales) Act set a tone, and that tone has run through all aspects of society even more extensively than people imagined in protecting the next generation. As I have read them, these amendments set a tone to which I find it difficult to understand why anyone would object, given that that is a core principle, as I understood it, behind building in future-proofing that will protect children, among others.

Online Safety Bill

Baroness Fox of Buckley Excerpts
Lord Curry of Kirkharle Portrait Lord Curry of Kirkharle (CB)
- Parliament Live - Hansard - - - Excerpts

My Lords, in view of the hour, I will be brief, and I have no interests to declare other than that I have grandchildren. I rise to speak to a number of amendments tabled in my name in this group: Amendments 216A to 216C, 218ZZA to 218ZD and 218BA to 218BC. I do not think I have ever achieved such a comprehensive view of the alphabet in a number of amendments.

These amendments carry a simple message: Ofcom must act decisively and quickly. I have tabled them out of a deep concern that the Bill does not specify timescales or obligations within which Ofcom is required to act. It leaves Ofcom, as the regulator, with huge flexibility and discretion as to when it must take action; some action, indeed, could go on for years.

Phrases such as

“OFCOM may vary a confirmation decision”

or it

“may apply to the court for an order”

are not strong enough, in my view. If unsuitable or harmful material is populating social media sites, the regulator must take action. There is no sense of urgency within the drafting of the Bill. If contravention is taking place, action needs to be taken very quickly. If Ofcom delays taking an action, the harmful influence will continue. If the providers of services know that the regulator will clamp down quickly and severely on those who contravene, they are more likely to comply in the first place.

I was very taken by the earlier comments of the noble Baroness, Lady Harding, about putting additional burdens on Ofcom. These amendments are not designed to put additional burdens on Ofcom; indeed, the noble Lord, Lord Knight, referred to the fact that, for six years, I chaired the Better Regulation Executive. It was my experience that regulators that had a reputation for acting quickly and decisively, and being tough, had a much more compliant base as a consequence.

Noble Lords will be pleased to hear that I do not intend to go through each individual amendment. They all have a single purpose: to require the regulator—in this case, Ofcom—to act when necessary, as quickly as possible within specified timescales; and to toughen up the Bill to reduce the risk of continuous harmful content being promoted on social media.

I hope that the Minister will take these comments in the spirit in which they are intended. They are designed to help Ofcom and to help reduce the continuous adverse influence that many of these companies will propagate if they do not think they will be regulated severely.

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Parliament Live - Hansard - -

My Lords, I understand that, for legislation to have any meaning, it has to have some teeth and you have to be able to enforce it; otherwise, it is a waste of time, especially with something as important as the legislation that we are discussing here.

I am a bit troubled by a number of the themes in these amendments and I therefore want to ask some questions. I saw that the Government had tabled these amendments on senior manager liability, then I read amendments from both the noble Lord, Lord Bethell, and the Labour Party, the Opposition. It seemed to me that even more people would be held liable and responsible as a result. I suppose I have a dread that—even with the supply chain amendment—this means that lots of people are going to be sacked. It seems to me that this might spiral dangerously out of control and everybody could get caught up in a kind of blame game.

I appreciate that I might not have understood, so this is a genuine attempt to do so. I am concerned that these new amendments will force senior managers and, indeed, officers and staff to take an extremely risk-averse approach to content moderation. They now have not only to cover their own backs but to avoid jail. One of my concerns has always been that this will lead to the over-removal of legal speech, and more censorship, so that is a question I would like to ask.

I also want to know how noble Lords think this will lie in relation to the UK being a science and technology superpower. Understandably, some people have argued that these amendments are making the UK a hostile environment for digital investment, and there is something to be balanced up there. Is there a risk that this will lead to the withdrawal of services from the UK? Will it make working for these companies unattractive to British staff? We have already heard that Jimmy Wales has vowed that the Wikimedia foundation will not scrutinise posts in the way demanded by the Bill. Is he going to be thrown in prison, or will Wikipedia pull out? How do we get the balance right?

What is the criminal offence that has a threat of a prison sentence? I might have misunderstood, but a technology company manager could fail to prevent a child or young person encountering legal but none the less allegedly harmful speech, be considered in breach of these amendments and get sent to prison. We have to be very careful that we understand what this harmful speech is, as we discussed previously. The threshold for harm, which encompasses physical and psychological harm, is vast and could mean people going to prison without the precise criminal offence being clear. We talked previously about VPNs. If a tech savvy 17-year-old uses a VPN and accesses some of this harmful material, will someone potentially be criminally liable for that young person getting around the law, find themselves accused of dereliction of duty and become a criminal?

My final question is on penalties. When I was looking at this Bill originally and heard about the eye-watering fines that some Silicon Valley companies might face, I thought, “That will destroy them”. Of course, to them it is the mere blink of an eye, and I do get that. This indicates to me, given the endless conversations we have had on whether size matters, that in this instance size does matter. The same kind of liabilities will be imposed not just on the big Silicon Valley monsters that can bear these fines, but on Mumsnet—or am I missing something? Mumsnet might not be the correct example, but could not smaller platforms face similar liabilities if a young person inadvertently encounters harmful material? It is not all malign people trying to do this; my unintended consequence argument is that I do not want to create criminals when a crime is not really being committed. It is a moral dilemma, and I do understand the issue of enforcement.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Parliament Live - Hansard - - - Excerpts

I rise very much to support the comments of my noble friend Lord Bethell and, like him, to thank the Minister for bringing forward the government amendments. I will try to address some of the comments the noble Baroness, Lady Fox, has just made.

One must view this as an exercise in working out how one drives culture change in some of the biggest and most powerful organisations in the world. Culture change is really hard. It is hard enough in a company of 10 people, let alone in a company with hundreds of thousands of employees across the world that has more money than a single country. That is what this Bill requires these enormous companies to do: to change the way they operate when they are looking at an inevitably congested, contested technology pipeline, by which I mean—to translate that out of tech speak—they have more work to do than even they can cope with. Every technology company, big or small, always has this problem: more good ideas than their technologists can cope with. They have to prioritise what to fix and what to implement. For the last 15 years, digital companies have prioritised things that drive income, but not the safety of our children. That requires a culture change from the top of the company.

Online Safety Bill

Baroness Fox of Buckley Excerpts
Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Parliament Live - Hansard - -

My Lords, this group of amendments looks at the treatment of legal content accessed by adults. The very fact that Parliament feels that legislation has a place in policing access to legal material is itself worrying. This door was opened by the Government in the initial draft Bill, but, as we have already heard, after a widespread civil liberties backlash against the legal but harmful clauses, we are left with Clause 65. As has been mentioned, I am worried that this clause, and some of the amendments, might well bring back legal but harmful for adults by the back door. One of the weasel words here is “harmful”. As I have indicated before, it is difficult to work out from the groupings when to raise which bit, so I will save that for your Lordships until later and just note now that I am rather nervous about that weasel word.

Like many of us, I cheered at the removal of the legal but harmful provisions, but I have serious reservations about their replacement with further duties via terms of service, which imposes a duty on category 1 services to have systems and processes in place to take down or restrict access to content, and to ban or suspend users in accordance with terms of service, as the noble Lord, Lord Moylan, explained. It is one of the reasons I support his amendment. It seems to me to be the state outsourcing the grubby job of censorship to private multinational companies with little regard for UK law.

I put my name to Amendment 155 in the name of the noble Lord, Lord Moylan, because I wanted to probe the Government’s attitude to companies’ terms of service. Platforms have no obligation to align their terms of service with freedom of expression under UK law. It is up to them. I am not trying to impose on them what they do with their service users. If a particular platform wishes to say, “We don’t want these types of views on our platform”, fine, that is its choice. But when major platforms’ terms of service, which are extensive, become the basis on which UK law enforces speech, I get nervous. State regulators are to be given the role of ensuring that all types of lawful speech are suppressed online, because the duty applies to all terms of service, whatever they are, regarding the platforms’ policies on speech suppression, censorship, user suspension, bans and so on. This duty is not restricted to so-called harmful content; it is whatever content the platform wishes to censor.

What is more, Clause 65 asks Ofcom to ensure that individuals who express lawful speech are suspended or banned from platforms if in breach of the platforms’ Ts & Cs, and that means limiting those individuals from expressing themselves more widely, beyond the specific speech in question. That is a huge green light to interfere in UK citizens’ freedom of expression, in my opinion.

I stress that I am not interested in interfering in the terms and conditions of private companies, although your Lordships will see later that I have an amendment demanding that they introduce free-speech clauses. That is because of the way we seem to be enacting the law via the terms of service of private companies. They should of course be free to dictate their own terms of service, and it is reasonable that members of the public should know what they are and expect them to be upheld. But that does not justify the transformation of these private agreements into statutory duties—that is my concern.

So, why are we allowing this Bill to ask companies to enforce censorship policies in the virtual public square that do not exist in UK law? When companies’ terms of service permit the suppression of speech, that is up to them, but when they suppress speech far beyond the limitations of speech in UK law and are forced to do so by a government regulator such as Ofcom, are we not in trouble? It means that corporate terms of service, which are designed to protect platforms’ business interests, are trumping case law on free speech that has evolved over many years.

Those terms of service are also frequently in flux, according to fashion or ownership; one only has to look at the endless arguments, which I have yet to understand, about Twitter’s changing terms of service after the Elon Musk takeover. Is it Ofcom’s job to follow Elon Musk’s ever-changing terms of service and enforce them on the British public as if they are law?

The terms and conditions are therefore no longer simply a contract between a company and the user; their being brought under statute means that big tech will be exercising public law functions, with Ofcom as the enforcer, ensuring that lawful speech is suppressed constantly, in line with private companies’ terms of service. This is an utter mess and not in any way adequate to protect free speech. It is a fudge by the Government: they were unpopular on “lawful but harmful”, so they have outsourced it to someone else to do the dirty work.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Parliament Live - Hansard - - - Excerpts

My Lords, it has been interesting to hear so many noble Lords singing from the same hymn sheet—especially after this weekend. My noble friend Lord McNally opened this group by giving us his wise perspective on the regulation of new technology. Back in 2003, as he mentioned, the internet was not even mentioned in the Communications Act. He explained how regulation struggles to keep up and how quantum leaps come with a potential social cost; all that describes the importance of risk assessment of these novel technologies.

As we have heard from many noble Lords today, on Report in the Commons the Government decided to remove the adult safety duties—the so-called “legal but harmful” aspect of the Bill. I agree with the many noble Lords who have said that this has significantly weakened the protection for adults under the Bill, and I share the scepticism many expressed about the triple shield.

Right across the board, this group of amendments, with one or two exceptions, rightly aims to strengthen the terms of service and user empowerment duties in the Bill in order to provide a greater baseline of protection for adults, without impinging on others’ freedom of speech, and to reintroduce some risk-assessment requirement on companies. The new duties will clearly make the largest and riskiest companies expend more effort on enforcing their terms of service for UK users. However, the Government have not yet presented any modelling on what effect this will have on companies’ terms of service. I have some sympathy with what the noble Lord, Lord Moylan, said: the new duties could mean that terms of service become much longer and lawyered. This might have an adverse effect on freedom of expression, leading to the use of excessive takedown measures rather than looking at other more systemic interventions to control content such as service design. We heard much the same argument from the noble Baroness, Lady Fox. They both made a very good case for some of the amendments I will be speaking to this afternoon.

On the other hand, companies that choose to do nothing will have an easier life under this regime. Faced with stringent application of the duties, companies might make their terms of service shorter, cutting out harms that are hard to deal with because of the risk of being hit with enforcement measures if they do not. Therefore, far from strengthening protections via this component of the triple shield, the Bill risks weakening them, with particular risks for vulnerable adults. As a result, I strongly support Amendments 33B and 43ZA, which my noble friend Lord McNally spoke to last week at the beginning of the debate on this group.

Like the noble Baroness, Lady Kidron, I strongly support Amendments 154, 218 and 160, tabled by the noble Lord, Lord Stevenson, which would require regulated services to maintain “adequate and appropriate” terms of service, including provisions covering the matters listed in Clause 12. Amendment 44, tabled by the right reverend Prelate the Bishop of Oxford and me, inserts a requirement that services to which the user empowerment duties apply

“must make a suitable and sufficient assessment of the extent to which they have carried out the duties in this section including in each assessment material changes from the previous assessment such as new or removed user empowerment features”.

The noble Viscount, Lord Colville, spoke very well to that amendment, as did the noble Baronesses, Lady Fraser and Lady Kidron.

Amendment 158, also tabled by me and the right reverend Prelate, inserts a requirement that services

“must carry out a suitable and sufficient assessment of the extent to which they have carried out the duties under sections 64 and 65 ensuring that assessment reflects any material changes to terms of service”.

That is a very good way of meeting some of the objections that we have heard to Clause 65 today.

These two amendments focus on risk assessment because the new duties do not have an assessment regime to work out whether they work, unlike the illegal content and children’s duties, as we have heard. Risk assessments are vital to understanding the environment in which the services are operating. A risk assessment can reduce bureaucracy by allowing companies to rule out risks which are not relevant to them, and it can increase user safety by revealing new risks and future-proofing a regime.

The Government have not yet provided, in the Commons or in meetings with Ministers, any proper explanation of why risk assessment duties have been removed along with the previous adult safety duties, and they have not explained in detail why undertaking a risk assessment is in any way a threat to free speech. They are currently expecting adults to manage their own risks, without giving them the information they need to do so. Depriving users of basic information about the nature of harms on a service prevents them taking informed decisions as to whether they want to be on it at all.

Without these amendments, the Bill cannot be said to be a complete risk management regime. There will be no requirement to explain to Ofcom or to users of a company’s service the true nature of the harms that occur on its service, nor the rationale behind the decisions made in these two fundamental parts of the service. This is a real weakness in the Bill, and I very much hope that the Minister will listen to the arguments being made this afternoon.

--- Later in debate ---
Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

I will come on to say a bit more about how Ofcom goes about that work.

The Bill will ensure that providers have the information they need to understand whether they are in compliance with their duties under the Bill. Ofcom will set out how providers can comply in codes of practice and guidance that it publishes. That information will help providers to comply, although they can take alternative action if they wish to do so.

The right reverend Prelate’s amendments also seek to provide greater transparency to Ofcom. The Bill’s existing duties already account for this. Indeed, the transparency reporting duties set out in Schedule 8 already enable Ofcom to require category 1, 2A and 2B services to publish annual transparency reports with relevant information, including about the effectiveness of the user empowerment tools, as well as detailed information about any content that platforms prohibit or restrict, and the application of their terms of service.

Amendments 159, 160 and 218, tabled by the noble Lord, Lord Stevenson, seek to require user-to-user services to create and abide by minimum terms of service recommended by Ofcom. The Bill already sets detailed and binding requirements on companies to achieve certain outcomes. Ofcom will set out more detail in codes of practice about the steps providers can take to comply with their safety duties. Platforms’ terms of service will need to provide information to users about how they are protecting users from illegal content, and children from harmful content.

These duties, and Ofcom’s codes of practice, ensure that providers take action to protect users from illegal content and content that is harmful to children. As such, an additional duty to have adequate and appropriate terms of service, as envisaged in the amendments, is not necessary and may undermine the illegal and child safety duties.

I have previously set out why we do not agree with requiring platforms to set terms of service for legal content. In addition, it would be inappropriate to delegate this much power to Ofcom, which would in effect be able to decide what legal content adult users can and cannot see.

Amendment 155, tabled by my noble friend Lord Moylan, seeks to clarify whether and how the Bill makes the terms of service of foreign-run platforms enforceable by Ofcom. Platforms’ duties under Clause 65 apply only to the design, operation and use of the service in the United Kingdom and to UK users, as set out in Clause 65(11). Parts or versions of the service which are used in foreign jurisdictions—

Baroness Fox of Buckley Portrait Baroness Fox of Buckley (Non-Afl)
- Hansard - -

On that, in an earlier reply the Minister explained that platforms already remove harmful content because it is harmful and because advertisers and users do not like it, but could he tell me what definition of “harmful” he thinks he is using? Different companies will presumably have a different interpretation of “harmful”. How will that work? It would mean that UK law will require the removal of legal speech based on a definition of harmful speech designed by who—will it be Silicon Valley executives? This is the problem: UK law is being used to implement the removal of content based on decisions that are not part of UK law but with implications for UK citizens who are doing nothing unlawful.

Lord Parkinson of Whitley Bay Portrait Lord Parkinson of Whitley Bay (Con)
- Hansard - - - Excerpts

The noble Baroness’s point gets to the heart of the debate that we have had. I talked earlier about the commercial incentive that there is for companies to take action against harmful content that is legal which users do not want to see or advertisers do not want their products to be advertised alongside, but there is also a commercial incentive to ensure that they are upholding free speech and that there are platforms on which people can interact in a less popular manner, where advertisers that want to advertise products legally alongside that are able to do so. As with anything that involves the market, the majority has a louder voice, but there is room for innovation for companies to provide products that cater to minority tastes within the law.

--- Later in debate ---
Lord Griffiths of Burry Port (Lab)

My Lords, it is a pleasure to be collaborating with the noble Baroness, Lady Morgan. We seem to have been briefed by the same people, been to the same meetings and drawn the same conclusions. However, there are some things that are worth saying twice and, although I will try to avoid a carbon copy of what the noble Baroness said, I hope the central points will make themselves.

The internet simply must be made to work for its users above all else—that is the thrust of the two amendments that stand in our names. Through education and communication, the internet can be a powerful means of improving our lives, but it must always be a safe platform on which to enjoy a basic right. It cannot be said often enough that to protect users online is to protect them offline. To create a strict division between the virtual and the public realms is to risk ignoring how actions online can have life and death repercussions, and that is at the heart of what these amendments seek to bring to our attention.

I was first made aware of these amendments at a briefing from the Samaritans, where we got to know each other. There I heard the tragic accounts of those whose loved ones had taken their own lives due to exposure to harmful content online. I will not repeat their accounts—this is not the place to do that—but understanding only a modicum of their grief made it obvious to me that the principle of “safest option by default” must underpin all our decision-making on this.

I applaud the work already done by Members of this House to ensure the safety of young people online. Yet it is vital, as the noble Baroness has said, that we do not create a drop-off point for future users—one in which turning 18 means sudden exposure to the most harmful content online, which is always lurking there. Those most at risk of suicide due to exposure to harmful content are aged between their late teens and early 20s. In fact, a 2017 inquiry into the suicides of young people found that harmful content had been accessed online in 26% of the deaths of under-20s and 13% of the deaths of 20 to 24 year-olds. It is vital for us to empower users from their earliest years.

In the Select Committee—I see fellow members sitting here today—we have been looking at digital exclusion and the need for education at all levels for those using the internet. Establishing good habits in the earliest years is the right way to start, but it goes on after that, because the world that young people go on to inhabit in adulthood is one where, if they had that education earlier, they are already in control of the internet. Adulthood comes with the freedom to choose how one expresses oneself online—of course it does—but this must not come at the cost of their continuing freedom from the most insidious content that puts their mental health at risk. Much mention has been made of the triple shield and I need not go there again; its origins and perhaps its deficiencies have been mentioned already.

The Center for Countering Digital Hate recently conducted an experiment, creating new social media accounts that showed interest in body image and mental health. This study found that TikTok served suicide-related content to new accounts within 2.6 minutes, with eating disorder content being recommended within 8 minutes. At the very least, these disturbing statistics tell us that users should have the option to opt in to such content, and not have to suffer this harm before later opting out. While the option to filter out certain categories of content is essential, it must be toggled on by default if safety is to be our primary concern.

The principle of safest by default creates not only a less harmful environment, but one in which users are in a position to define their own online experience. The space in which we carry out our public life is increasingly located on a small number of social media platforms—those category 1 platforms already mentioned several times—which everyone, from children to pensioners, uses to communicate and share their experiences.

We must then ensure that the protections we benefit from offline continue online: namely, protection from the harm and hate that pose a threat to our physical and mental well-being. When a child steps into school or a parent into their place of work, they must be confident that those with the power to do so have created the safest possible environment for them to carry out their interactions. This basic confidence must be maintained when we log in to Twitter, Instagram, TikTok or any other social media giant.

Baroness Fox of Buckley (Non-Afl)

My Lords, my Amendment 43 tackles Clause 12(1), which expressly says that the duties in Clause 12 are to “empower” users. My concern is to ensure that, first, users are empowered and, secondly, legitimate criticism around the characteristics listed in Clause 12(11) and (12), for example, is not automatically treated as abusive or inciting hatred, as I fear it could be. My Amendment 283ZA specifies that, in judging content that is to be filtered out after a user has chosen to switch on various filters, the providers act reasonably and pause to consider whether they have “reasonable grounds” to believe that the content is of the kind in question—namely, abusive or problematic.

Anything under the title “empower adult users” sounds appealing—how can I oppose that? After all, I am a fan of the “taking back control” form of politics, and here is surely a way for users to be in control. On paper, replacing the “legal but harmful” clause with giving adults the opportunity to engage with controversial content if they wish, through enhanced empowerment tools, sounds positive. In an earlier discussion of the Bill, the noble Baroness, Lady Featherstone, said that we should treat adults as adults, allowing them to confront ideas with the

“better ethics, reason and evidence”—[Official Report, 1/2/23; col. 735.]

that has been the most effective way to deal with ideas from Socrates onwards. I say, “Hear, hear” to that. However, I worry that, rather than users being in control, there is a danger that the filter system might infantilise adult users and disempower them by hard-wiring into the Bill a duty and tendency to hide content from users.

There is a general weakness in the Bill. I have noted that some platforms are based on users moderating their own sites, which I am quite keen on, but this will be detrimentally affected by the Bill. It would leave users in charge of their own moderation, with no powers to decide what is in, for example, Wikipedia or other Wikimedia projects, which are added to, organised and edited by a decentralised community of users. So I will certainly not take the phrase “user empowerment” at face value.

I am slightly concerned about linguistic double-speak, or at least confusion. The whole Bill is being brought forward in a climate in which language is weaponised in a toxic minefield—a climate of, “You can’t say that”. More nerve-rackingly, words and ideas are seen as dangerous and interchangeable with violent acts, in a way that needs to be unpicked before we pass this legislation. Speakers can be cancelled for words deemed to threaten listeners’ safety—but not physical safety; the opinions are said to be unsafe. Opinions are treated as though they cause damage or harm as viscerally as physical aggression. So lawmakers have to recognise the cultural context and realise that the law will be understood and applied in it, not in the abstract.

I am afraid that the language in Clause 12(1) and (2) shows no awareness of this wider backdrop—it is worryingly woolly and vague. The noble Baroness, Lady Morgan, talked about dangerous content, and all the time we have to ask, “Who will interpret what is dangerous? What do we mean by ‘dangerous’ or ‘harmful’?”. Surely a term such as “abusive”, which is used in the legislation, is open to wide interpretation. Dictionary definitions of “abusive” include words such as “rude”, “insulting” and “offensive”, and it is certainly subjective. We have to query what we mean by the terms when some commentators complain that they have been victims of online abuse, but when you check their timelines you notice that, actually, they have been subject just to angry, and sometimes justified, criticism.

I recently saw a whole thread arguing that the Labour Party’s recent attack ads against the Prime Minister were an example of abusive hate speech. I am not making a point about this; I am asking who gets to decide. If this is the threshold for filtering content, there is a danger of institutionalising safe space echo chambers. It can also be a confusing word for users, because if someone applies a user empowerment tool to protect themselves from abuse, the threshold at which the filter operates could be much lower than they intend or envisage but, by definition, the user would not know what had been filtered out in their name, and they have no control over the filtering because they never see the filtered content.

--- Later in debate ---
Baroness Bull (CB)

My Lords, I support the noble Baroness, Lady Buscombe, on the built-in obsolescence of any list. It would very soon be out of date.

I support the amendments tabled by the noble Lord, Lord Clement-Jones, and by the noble Baroness, Lady Morgan of Cotes. They effectively seek a similar aim. Like the noble Baroness, Lady Fraser, I tend towards those tabled by the noble Lord, Lord Clement-Jones, because they seem clearer and more inclusive, but I understand that they are trying for the same thing. I also register the support for this aim of my noble friend Lady Campbell of Surbiton, who cannot be here but who, I suspect, is listening in. She was very keen that her support for this aim was recorded.

The issue of “on by default” inevitably came up at Second Reading. Then and in subsequent discussions, the Minister reiterated that a “default on” approach to user empowerment tools would negatively impact people’s use of these services. Speaking at your Lordships’ Communications and Digital Committee, on which I sat at the time, Minister Scully went further, saying that the strongest option, of having the settings off in the first instance,

“would be an automatic shield against people’s ability to explore what they want to explore on the internet”.

According to the Government’s own list, this was arguing for the ability to explore content that abuses, targets or incites hatred against people with protected characteristics, including race and disability. I struggle to understand why protecting this right takes precedence over ensuring that groups of people with protected characteristics are, well, protected. That is our responsibility. And it is a question of precedence, because switching controls one way is not exactly the same as switching them the other way. It is easy to think so, but the noble Baroness, Lady Parminter, explained very clearly that it is not. It is undoubtedly easier for someone in good health and without mental or physical disabilities to switch controls off than it is for those with disabilities or vulnerabilities to switch them on. That is self-evident.

It cannot be right that those most at risk of being targeted online, including some disabled people—not all, as we have heard—and those with other protected characteristics, will have the onus on them to switch on the tools to prevent them seeing and experiencing harm. There is a real risk that those who are meant to benefit from user empowerment tools, those groups at higher risk of online harm, including people with a learning disability, will not be able to access the tools because the duties allow category 1 services to design their own user empowerment tools. This means that we are likely to see as many versions of user empowerment tools as there are category 1 services to which this duty applies.

Given what we know about the nature of addiction and self-harm, which has already been very eloquently explained, it surely cannot be the intention of the Bill that those people who are in crisis and vulnerable to eating disorders or self-harm, for example, will be required to seek and activate a set of tools to turn off the very material that feeds their addiction or encourages their appetite for self-harm.

The approach in the Bill does little to prevent people spiralling down this rabbit hole towards ever more harmful content. Indeed, instead it requires people to know that they are approaching a crisis point, and to have sufficient levels of resilience and rationality to locate the switch and turn on the tools that will protect them. That is not how the irrational or distressed mind works.

So all the evidence that we have about the existence of harm arising from mental states, which has been so eloquently set out in introducing the amendments—I refer again to my noble friend Lady Parminter, because hers is such powerful evidence—tips the balance, I believe, in favour of setting the tools to be on by default. I very much hope the Minister will listen and heed the arguments set out by noble Lords across the Committee, and come back with some amendments of his own on Report.

Baroness Fox of Buckley (Non-Afl)

Before the noble Baroness sits down, I wanted to ask for clarification, because I am genuinely confused. When it comes to political rights for adults in terms of their agency, they are rights which we assume are able to be implemented by everyone. But we recognise that in the adult community —this is offline now; I mean in terms of how we understand political rights—there may well be people who lack capacity or are vulnerable, and we take that into account. But we do not generally organise political rights and access to, for example, voting or free speech around the most vulnerable in society. That is not because we are insensitive or inhumane, or do not understand. The moving testimonies we have heard about people with eating disorders and so on are absolutely spot-on accurate. But are we suggesting that the world online should be organised around vulnerable adults, rather than adults and their political rights?

Baroness Bull (CB)

I do not have all the answers, but I do think we heard a very powerful point from the right reverend Prelate. In doing the same for everybody, we do not ensure equality. We need to have varying approaches, in order that everybody has equality of access. As the Bill stands, it says nothing about vulnerable adults. It simply assumes that all adults have full capacity, and I think what these amendments seek to do is find a way to recognise that simply thinking about children, and then that everybody aged 18 is absolutely able to take care of themselves and, if I may say, “suck it up”, is not the world we live in. We can surely do better than that.

--- Later in debate ---
Baroness Fox of Buckley (Non-Afl)

Just to clarify, in a way we have reduced this debate to whether the default position should be on or off, although in fact that is only one aspect of this. My concern, and what I maybe spent too long talking about, is what happens if we turn the toggles to “on”. The assumption we keep making is that once they are on, we are safe. The difficulty is that the categories of what is filtered out after turning them on are not necessarily what the user thinks they are. I am simply asking how you get around that; otherwise, we think it is too easy—turn it on or off; press the button. Is it not problematic for us all if, in thinking you are going to stop seeing hate, hate turns out actually to be legitimate and interesting political ideas?

Lord Knight of Weymouth (Lab)

As ever, the noble Baroness is an important voice in bursting our bubble in the Chamber. I continue to respect her for that. It will not be perfect; there is no perfect answer to all this. I am siding with safety and caution rather than a bit of a free-for-all. Sometimes there might be overcaution and aspects of debate where the platforms, the regulator, the media, and discussion and debate in this Chamber would say, “The toggles have got it wrong”, but we just have to make a judgment about which side we are on. That is what I am looking forward to hearing from the Minister.

These amendments are supported on all sides and by a long list of organisations, as listed by the noble Baroness, Lady Morgan, and the noble Lord, Lord Clement-Jones. The Minister has not conceded very much at all so far to this Committee. We have heard compelling speeches, such as those from the noble Baroness, Lady Parminter, that have reinforced my sense that he needs to give in on this when we come to Report.

I will also speak to my Amendment 38A. I pay tribute to John Penrose MP, who was mentioned by the noble Baroness, Lady Harding, and his work in raising concerns about misinformation and in stimulating discussion outside the Chambers among parliamentarians and others. Following discussions with him and others in the other place, I propose that users of social media should have the option to filter out content the provenance of which cannot be authenticated.

As we know, social media platforms are often awash with content that is unverified, misleading or downright false. This can be particularly problematic when it comes to sensitive or controversial topics such as elections, health or public safety. In these instances, it can be difficult for users to know whether the information presented to them is accurate. Many noble Lords will be familiar with the deep-fake photograph of the Pope in a white puffa jacket that recently went viral, or the use of imagery for propaganda purposes following the Russian invasion of Ukraine.

The Content Authenticity Initiative has created an open industry standard for content authenticity and provenance. Right now, tools such as Adobe Photoshop allow users to turn on content credentials to securely attach provenance data to images and any edits then made to those images. That technology has now been adopted by camera manufacturers such as Leica and Nikon, so the technology is there to do some of this to help give us some reassurance.

Amendment 38A would allow users to filter out unverified content and is designed to flag posts or articles that do not come from a reliable source or have not been independently verified by a reputable third party. Users could then choose to ignore or filter out such content, ensuring that they are exposed only to information that has been vetted and verified. This would not only help users to make more informed decisions but help to combat the spread of false information on social media platforms. By giving users the power to filter out unverified content, we can help to ensure that social media platforms are not used to spread harmful disinformation or misinformation.

Amendments 42 and 45, in the name of my noble friend Lord Stevenson, are good and straightforward improvements to the Bill, suggesting that Clause 12(11) should include all the protected characteristics in the Equality Act 2010. I listened closely to the argument of the noble Baroness, Lady Buscombe, about listing the various characteristics, but I think it makes much more sense to do what my noble friend is suggesting and just reference the Equality Act. If we chose separately to amend the Equality Act to change protected characteristics, that change would then flow through into this Bill. That seems to be a much more sensible and pragmatic way to proceed. Similarly, in Amendment 45 my noble friend is suggesting that consideration should be given by the Secretary of State as to whether any of these tools should also be available to child users, and we have heard good arguments in favour of that when it comes to dealing with the cliff-edge effect, as described by the noble Baroness, Lady Kidron.

Lastly—and, perhaps the Committee will forgive me, slightly bizarrely—Amendment 283ZA, in the name of the noble Baroness, Lady Fox, to Clause 170(7) in Part 11 invites us to consider, among other things, whether a bot can act reasonably. Given that self-coding bots are now technically possible, I would be fascinated to hear the Minister’s view on whether an autonomous bot can have reason. I asked ChatGPT this question. For the help of the Committee, it replied:

“As an artificial intelligence language model, I do not have consciousness or emotions, and therefore, I do not have a capacity for ‘reason’ in the way that humans understand it. However, I am programmed to use advanced algorithms that allow me to process and analyze large amounts of data, recognize patterns, and provide reasoned responses based on the information available to me. This allows me to simulate a form of reasoning, but it is important to note that it is not the same as human reasoning, as I do not have subjective experiences or personal biases. Ultimately, my abilities are limited to the algorithms and data that have been programmed into my system, and I cannot generate my own subjective experiences or judgments.”

That is the view of the algorithm as to whether or not bots can have reason. I look forward to the Minister’s response.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

The content which we have added to Clause 12 is a targeted approach. It reflects input from a wide range of interested parties, with whom we have discussed this, on the areas of content that users are most concerned about. The other protected characteristics that do not appear are, for instance, somebody’s marriage or civil partnership status or whether they are pregnant. We have focused on the areas where there is the greatest need for users to be offered the choice about reducing their exposure to types of content because of the abuse they may get from it. This recognises the importance of clear, enforceable and technically feasible duties. As I said a moment ago in relation to the point made by my noble friend Lady Buscombe, we will keep it under review but it is right that these provisions be debated at length—greater length than I think the Equality Bill was, but that was long before my time in your Lordships’ House, so I defer to the noble Lord’s experience and I am grateful that we are debating them thoroughly today.

I will move now, if I may, to discuss Amendments 43 and 283ZA, tabled by the noble Baroness, Lady Fox of Buckley. Amendment 43 aims to ensure that the user empowerment content features do not capture legitimate debate and discussion, specifically relating to the characteristics set out in subsections (11) and (12). Similarly, her Amendment 283ZA aims to ensure that category 1 services apply the features to content only when they have reasonable grounds to infer that it is user empowerment content.

With regard to both amendments, I can reassure the noble Baroness that upholding users’ rights to free expression is an integral principle of the Bill and has been accounted for in drafting these duties. We have taken steps to ensure that legitimate online discussion or criticism will not be affected, and that companies make an appropriate judgment on the nature of the content in question. We have done this by setting high thresholds for inclusion in the content categories and through further clarification in the Bill’s Explanatory Notes, which I know she has consulted as well. The definition here deliberately sets a high threshold: by targeting only abuse and incitement to hatred, it will avoid capturing content which is merely challenging, or robust discussion on controversial topics. Further clarity on definitions will be provided by Ofcom through regulatory guidance, on which it will be required to consult. That will sit alongside Ofcom’s code of practice, which will set out the steps companies can take to fulfil their duties.

Baroness Fox of Buckley (Non-Afl)

I appreciate the Minister’s comments but, as I have tried to indicate, “incitement to hatred” and “abuse”, despite people thinking they know what those words mean, are causing huge difficulty legally and in institutions throughout the land. Ofcom will have its work cut out, but it was entirely for that reason that I tabled this amendment. There needs to be an even higher threshold, and this needs to be carefully thought through.

Lord Parkinson of Whitley Bay (Con)

But as I think the noble Baroness understands from that reference, this is a definition already in statute, and with which Parliament and the courts are already engaged.

The Bill’s overarching freedom of expression duties also apply to Clause 12. Subsections (4) to (7) of Clause 18 stipulate that category 1 service providers are required to assess the impact on free expression from their safety policies, including the user empowerment features. This is in addition to the duties in Clause 18(2), which requires all user-to-user services to have particular regard to the importance of protecting freedom of expression when complying with their duties. The noble Baroness’s Amendment 283ZA would require category 1 providers to make judgments on user empowerment content to a similar standard required for illegal content. That would be disproportionate. Clause 170 already specifies how providers must make judgments about whether content is of a particular kind, and therefore in scope of the user empowerment duties. This includes making their judgment based on “all relevant information”. As such, the Bill already ensures that the user empowerment content features will be applied in a proportionate way that will not undermine free speech or hinder legitimate debate online.

Amendment 45, tabled by the noble Lord, Lord Stevenson of Balmacara, would require the Secretary of State to lay a Statement before Parliament outlining whether any of the user empowerment duties should be applied to children. I recognise the significant interest that noble Lords have in applying the Clause 12 duties to children. The Bill already places comprehensive requirements on Part 3 services which children are likely to access. This includes undertaking regular risk assessments of such services, protecting children from harmful content and activity, and putting in place age-appropriate protections. If there is a risk that children will encounter harm, such as self-harm content or through unknown or unverified users contacting them, service providers will need to put in place age-appropriate safety measures. Applying the user empowerment duties for child users runs counter to the Bill’s child safety objectives and may weaken the protections for children—for instance, by giving children an option to see content which is harmful to them or to engage with unknown, unverified users. While we recognise the concerns in this area, for the reasons I have set out, the Government do not agree with the need for this amendment.

I will resist the challenge of the noble Lord, Lord Knight, to talk about bots because I look forward to returning to that in discussing the amendments on future-proofing. With that, I invite noble Lords—

Online Safety Bill

Baroness Bull (CB)

My Lords, I rise to speak to Amendment 141 in the names of the noble Lords, Lord Stevenson and Lord Clement-Jones. Once again, I register the support of my noble friend Lady Campbell of Surbiton, who feels very strongly about this issue.

Of course, there is value in transparency online, but anonymity can be vital for certain groups of people, such as those suffering domestic abuse, those seeking help or advice on matters they wish to remain confidential, or those who face significant levels of hatred or prejudice because of who they are, how they live or what they believe in. Striking the right balance is essential, but it is equally important that everyone who wishes to verify their identity and access the additional protections that this affords can do so easily and effectively, and that this opportunity is open to all.

Clause 57 requires providers of category 1 services to offer users the option to verify their identity, but it is up to providers to decide what form of verification to offer. Under subsection (2) it can be “of any kind”, and it need not require any documentation. Under subsection (3), the terms of service must include a “clear and accessible” explanation of how the process works and what form of verification is available. However, this phrase in itself is open to interpretation: clear and accessible for one group may be unclear and inaccessible to another. Charities including Mencap are concerned that groups, such as people with a learning disability, could be locked out of using these tools.

It is also relevant that people with a learning disability are less likely to own forms of photographic ID such as passports or driving licences. Should a platform require this type of ID, large numbers of people with a learning disability would be denied access. In addition, providing an email or phone number and verifying this through an authentication process could be extremely challenging for those people who do not have the support in place to help them navigate this process. This further disadvantages groups of people who already suffer some of the most extensive restrictions in living their everyday lives.

Clause 58 places a duty on Ofcom to provide guidance to help providers comply with their duty, but this guidance is optional. Amendment 141 aims to strengthen Clause 58 by requiring Ofcom to set baseline principles and standards for the guidance. It would ensure, for example, that the guidance considers accessibility for disabled as well as vulnerable adults and aligns with relevant guidance on related matters such as age verification; it would ensure that verification processes are effective; and it would ensure that the interests of disabled users are covered in Ofcom’s pre-guidance consultation.

Online can be a lifeline for disabled and vulnerable adults, providing access to support, advice and communities of interest, and this is particularly important as services in the real world are diminishing, so we need to ensure that user-verification processes do not act as a further barrier to inclusion for people with protected characteristics, especially those with learning disabilities.

Baroness Fox of Buckley (Non-Afl)

My Lords, the speech of the noble Baroness, Lady Buscombe, raised so many of the challenges that people face online, and I am sure that the masses who are watching parliamentlive as we speak, even if they are not in here, will recognise what she was talking about. Certainly, some of the animal rights activists can be a scourge, but I would not want to confine this to them, because I think trashing reputations online and false allegations have become the activists’ chosen weapon these days. One way that I describe cancel culture, as distinct from no-platforming, is that it takes the form of some terrible things being said about people online, a lot of trolling, things going viral and using the online world to lobby employers to get people sacked, and so on. It is a familiar story, and it can be incredibly unpleasant. The noble Baroness and those she described have my sympathy, but I disagree with her remedy.

An interesting thing is that a lot of those activities are not carried out by those who are anonymous. It is striking that a huge number of people with large accounts, well-known public figures with hundreds of thousands of followers—sometimes with more than a million—are prepared to do exactly what I described in plain sight, often to me. I have thought long and hard about this, because I really wanted to use this opportunity to read out a list and name and shame them, but I have decided that, when they go low, I will try to go at least a little higher. But subtweeting and twitch-hunts are an issue, and one reason why we think we need an online harms Bill. As I said, I know that sometimes it can feel that if people are anonymous, they will say things that they would not say to your face or if you knew who they were, but I think it is more the distance of being online: even when you know who they are, they will say it to you or about you online, and then when you see them at the drinks reception, they scuttle away.

My main objection, however, to the amendment of the noble Baroness, Lady Buscombe, and the whole question of anonymity in general is that it treats anonymity as though it is inherently unsafe. There is a worry, more broadly on verification, about creating two tiers of users: those who are willing to be verified and those who are not, and those who are not somehow having a cloud of suspicion over them. There is a danger that undermining online anonymity in the UK could set a terrible precedent, likely to be emulated by authoritarian Governments in other jurisdictions, and that is something we must bear in mind.

On evidence, I was interested in Big Brother Watch’s report on some analysis by the New Statesman, which showed that there is little evidence to suggest that anonymity itself makes online discourse more febrile. It did an assessment involving tweets sent to parliamentarians since January 2021, and said there was

“little discernible difference in the nature or tone of the tweets that MPs received from anonymous or non-anonymous accounts. While 32 per cent of tweets from anonymous accounts were classed as angry according to the metric used by the New Statesman, so too were 30 per cent of tweets from accounts with full names attached. Similarly, 5.6 per cent of tweets from anonymous accounts included swear words, only slightly higher than the figure of 5.3 per cent for named accounts.”

It went through various metrics, but it said, “slightly higher, not much of a difference”. That is to be borne in mind: the evidence is not there.

In this whole debate, I have wanted to emphasise freedom as at least equal to, if not of greater value than, the safetyism of this Bill, but in this instance, I will say that, as the noble Baroness, Lady Bull, said, for some people anonymity is an important safety mechanism. It is a tool in the armoury of those who want to fight the powerful. It can be anyone: for young people experimenting with their sexuality and not out, it gives them the freedom to explore that. It can be, as was mentioned, survivors of sexual violence or domestic abuse. It is certainly crucial to the work of journalists, civil liberties activists and whistleblowers in the UK and around the world. Many of the Iranian women’s accounts are anonymous: they are not using their correct names. The same is true of Hong Kong activists; I could go on.

Anyway, in our concerns about the Bill, compulsory identity verification means being forced to share personal data, so there is a privacy issue for everyone, not just the heroic civil liberties people. In a way, it is your own business why you are anonymous—that is the point I am trying to make.

There are so many toxic issues at the moment that a lot of people cannot just come out. I know I often mention the gender-critical issue, but it is true that in many professions, you cannot give your real name or you will not just be socially ostracised but potentially jeopardise your career. I wrote an article during the 2016-17 days called Meet the Secret Brexiteers. It was true that many teachers and professors I knew who voted to leave had to be anonymous online or they would not have survived the cull.

Finally, I do not think that online anonymity or pseudonymity is a barrier to tracking down and prosecuting those who commit the kind of criminal activity on the internet described, creating some of the issues we are facing. Police reports show that in 2017-18, 96% of attempts by public authorities to identify the users behind anonymous social media accounts, email addresses and telephone numbers resulted in successful identification of the suspect in the investigation. In other words, the police already have a range of intrusive powers to track down individuals, should there be a criminal problem, and the Investigatory Powers Act 2016 allows the police to acquire communications data—for example, email addresses or the location of a device—from which alleged illegal anonymous activity is conducted and use it as evidence in court.

If it is not illegal but just unpleasant, I am afraid that is the world we live in. I would argue that what we require in febrile times such as these is not bans or setting the police on people but to set the example of civil discourse, have more speech and show that free speech is a way of conducting disagreement and argument without trashing reputations.

Lord Clement-Jones (LD)

My Lords, what an unusually reticent group we have here for this group of amendments. I had never thought of the noble Baroness, Lady Fox, as being like Don Quixote, but she certainly seems to be tilting at windmills tonight.

I go back to the Joint Committee report, because what we said there is relevant. We said:

“Anonymous abuse online is a serious area of concern that the Bill needs to do more to address. The core safety objectives apply to anonymous accounts as much as identifiable ones. At the same time, anonymity and pseudonymity are crucial to online safety for marginalised groups, for whistleblowers, and for victims of domestic abuse and other forms of offline violence. Anonymity and pseudonymity themselves are not the problem and ending them would not be a proportionate response”.


We were very clear; the Government’s response on this was pretty clear too.

We said:

“The problems are a lack of traceability by law enforcement, the frictionless creation and disposal of accounts at scale, a lack of user control over the types of accounts they engage with and a failure of online platforms to deal comprehensively with abuse on their platforms”.


We said there should be:

“A requirement for the largest and highest risk platforms to offer the choice of verified or unverified status and user options on how they interact with accounts in either category”.


Crucially for these amendments, we said:

“We recommend that the Code of Practice also sets out clear minimum standards to ensure identification processes used for verification protect people’s privacy—including from repressive regimes or those that outlaw homosexuality”.


We were very clear about the difference between stripping away anonymity and ensuring that verification was available where the user wanted to engage only with those who had verified themselves. Requiring platforms to allow users—

--- Later in debate ---
Lord Kamall (Con)

My Lords, at the beginning of Committee, I promised that I would speak only twice, and this is the second time. I hope that noble Lords will forgive me if I stray from the group sometimes, but I will be as disciplined as I can. I will speak to Amendments 57 and 62, which the noble Baroness, Lady Featherstone, and I tabled. As others have said, the noble Baroness sends her apologies; sadly, she has fractured her spine, and I am sure we all wish her a speedy recovery. The noble Baroness, Lady Fox, has kindly added her name to these amendments.

As I have said, in a previous role, as a research director of a think tank—I refer noble Lords to my registered interests—I became interested in the phenomenon of unintended consequences. As an aside, it is sometimes known as the cobra effect, after an incident during the colonial rule of India, when a British administrator of Delhi devised a cunning plan to rid the city of dangerous snakes. It was simple: he would pay local residents a bounty for each cobra skin delivered. What could possibly go wrong? Never slow to exploit an opportunity, enterprising locals started to farm cobras as a way of earning extra cash. Eventually, the authorities grew wise to this, and the payments stopped. As a result, the locals realised that the snakes were now worthless and released them into the wild, leading to an increase, rather than a decrease, in the population of cobras.

As with the cobra effect, there have been many similar incidents of well-intentioned acts that have unintentionally made things worse. So, as we try to create a safer online space for our citizens, especially children and vulnerable adults, we should try to be as alert as we can to unintended consequences. An example is encrypted messages, which I discussed in a previous group. When we seek access to encrypted messages in the name of protecting children in this country, we should be aware that such technology could lead to dissidents living under totalitarian regimes in other countries being compromised or even murdered, with a devastating impact on their children.

We should also make sure that we do not unintentionally erode the fundamental rights and freedoms that underpin our democracy, and that so many people have struggled for over the centuries. I recognise that some noble Lords may say that that is applicable to other Bills, but I want to focus specifically on the implications for this Bill. In our haste to protect, we may create a digital environment and marketplace that stifles investment and freedom of expression, disproportionately impacting marginalised communities and cultivating an atmosphere of surveillance. The amendments the noble Baroness and I have tabled are designed to prevent such outcomes. They seek to strike a balance between regulating for a safer internet and preserving our democratic values. As many noble Lords have rightly said, all these issues will involve trade-offs; we may disagree, but I hope we will have had an informed debate, regardless of which side of the argument we are on.

We should explicitly outline the duties that service providers and regulators have with respect to these rights and freedoms. Amendment 57 focuses on safeguarding specific fundamental rights and freedoms for users of regulated user-to-user services, including the protection of our most basic human rights. We believe that, by explicitly stating these duties, rather than hoping that they are somehow implied, we will create a more comprehensive framework for service providers to follow, ensuring that their safety policies and procedures do not undermine the essential rights of users, with specific reference to

“users with protected characteristics under the Equality Act 2010”.

Amendment 62 focuses on the role of Ofcom in mitigating risks to freedom of expression. I recognise that there are other amendments in this group on that issue. It is our responsibility to ensure that the providers of regulated user-to-user services are held accountable for their content moderation and recommender systems, to ensure they do not violate our freedoms.

I want this Bill to be a workable Bill. As I have previously said, I support the intention behind it to protect children and vulnerable adults, but as I have said many times, we should also be open about the trade-off between security and protection on the one hand, and freedom of expression on the other. My fear is that, without these amendments, we risk leaving our citizens vulnerable to the unintended consequences of overzealous content moderation, biased algorithms and opaque decision-making processes. We should shine a light on and bring transparency to our new processes, and perhaps help guide them by being explicit about those elements of freedom of speech we wish to preserve.

It is our duty to ensure that the Online Safety Bill not only protects our citizens from harm but safeguards the principles that form the foundation of a free and open society. With these amendments, we hope to transcend partisan divides and to fortify the essence of our democracy. I hope that we can work together to create an online environment that is safe, inclusive and respectful of the rights and freedoms that the people of this country cherish. I hope that other noble Lords will support these amendments, and, ever the optimist, that my noble friend the Minister will consider adopting them.

Baroness Fox of Buckley (Non-Afl)

My Lords, it is a great pleasure to follow the noble Lord, Lord Kamall, who explained well why I put my name to the amendments. I extend my regards to the noble Baroness, Lady Featherstone; I was looking forward to hearing her remarks, and I hope that she is well.

I am interested in free speech; it is sort of my thing. I am interested in how we can achieve a balance and enhance the free speech rights of the citizens of this country through the Bill—it is what I have tried to do with the amendments I have supported—which I fear might be undermined by it.

I have a number of amendments in this group. Amendment 49 and the consequential Amendments 50 and 156 would require providers to include in their terms of service

“by what method content present on the service is to be identified as content of democratic importance”,

and bring Clause 13 in line with Clauses 14 and 15 by ensuring an enhanced focus on the democratic issue.

Amendment 53A would provide that notification is given

“to any user whose content has been removed or restricted”.

It is especially important that the nature of the restriction in place be made clear, evidenced and justified in the name of transparency and—a key point—that the user be informed of how to appeal such decisions.

Amendment 61 in my name calls for services to have

“proportionate systems, processes and policies designed to ensure that as great a weight is given to users’ right to freedom of expression ... as to safety when making decisions”

about whether to take down or restrict users’ access to the online world, and

“whether to take action against a user generating, uploading or sharing content”.

In other words, it is all about applying a more robust duty to category 1 service providers and emphasising the importance of protecting

“a wide diversity of political, social, religious and philosophical opinion”

online.

I give credit to the Government, in that Clause 18 constitutes an attempt by them in some way to balance the damage to individual rights to freedom of expression and privacy as a result of the Bill, but I worry that it is a weak duty. Unlike operational safety duties, which compel companies proactively to prevent or minimise so-called harm in the way we have discussed, there is no such attempt to insist that freedom of speech be given the same regard or importance. In fact, there are worries that the text of the Bill has downgraded speech and privacy rights, which the Open Rights Group says

“are considered little more than a contractual matter”.

There has certainly been a lot of mention of free speech in the debates we have had so far in Committee, yet I am not convinced that the Bill gives it enough credit, which is why I support the explicit reference to it by the noble Lord, Lord Kamall.

I have a lot of sympathy with the amendments of the noble Lord, Lord Stevenson, seeking to replace Clauses 13, 14, 15 and 18 with a single comprehensive duty, because in some ways we are scratching around. That made some sense to me and I would be very interested to hear more about how that might work. Clauses 13, 14, 15 and 18 state that service providers must have regard to the importance of protecting users’ rights to freedom of expression in relation to

“content of democratic importance ... publisher content ... journalistic content”.

The very existence of those clauses, and the fact that we even need those amendments, is an admission by the Government that elsewhere, free speech is a downgraded virtue. We need these carve-outs to protect these things, because the rest of the Bill threatens free speech, which has been my worry from the start.

My Amendment 49 is a response to the Bill’s focus on protecting “content of democratic importance”. I was delighted that this was included, and the noble Lord, Lord Stevenson of Balmacara, has raised a lot of the questions I was asking. I am concerned that it is rather vaguely drawn, and too narrow and technocratic—politics with a big “P”, rather than in the broader sense. There is a lot that I would consider democratically important that other people might see, especially given today’s discussion, as harmful or dangerous. Certainly, the definition should be as broad as possible, so my amendment seeks to write that down, saying that it should include

“political, social, religious and philosophical opinion”.

That is my attempt to broaden it out. It is not perfect, I am sure, but that is the intention.

I am also keen to understand why Clauses 14 and 15, which give special protection to news publisher and journalistic content, have enhanced provisions, including an expedited appeals process for the reinstatement of removed materials, but those duties are much weaker—they do not exist—in Clause 13, which deals with content of democratic importance. In my amendment, I have suggested that they are levelled up.

Online Safety Bill

Lord Stevenson of Balmacara (Lab)

My Lords, I thank the Minister for his very clear and precise introduction of these amendments. As the noble Lord, Lord Clement-Jones, said, we will return to some of the underlying issues in future debates. It may be that this is just an aperitif to give us a chance to get our minds around these things, as the noble Baroness, Lady Stowell, said.

It is sometimes a bit difficult to understand exactly what issue is being addressed by some of these amendments. Even trying to say them got us into a bit of trouble. I think I follow the logic of where we are in the amendments that deal with the difference between adult material and children’s material, but it would benefit us all if the Minister could repeat it, perhaps a little slower this time, and we will see if we can agree that that is the way forward.

Broadly speaking, we accept the arrangements. They clarify the issues under which the takedown and appeal mechanisms will work. They are interfacing with the question of how the Bill deals with legal but harmful material, particularly for those persons who might wish not to see material and will not be warned about it under any process currently in the Bill but will have a toggle to turn to. It safeguards children who would not otherwise be covered by that. That is a fair balance to be struck.

Having said that, we will be returning to this. The noble Lord, Lord Clement-Jones, made the good point that we have a rather ironic situation where a press regulation structure set up and agreed by Parliament is not in operation across the whole of the press, but we do not seem to make any accommodation for that. This is perhaps something we should return to at a later date.

Baroness Fox of Buckley (Non-Afl)

My Lords, I want very briefly to probe something. I may have got the wrong end of the stick, but I want to just ask about the recognised news publishers. The Minister’s explanation about what these amendments are trying to do was very clear, but I have some concerns.

I want to know how this will affect how we understand what a recognised news publisher is in a world in which we have many citizen journalists, blogs and online publications. One of the democratising effects of the internet has been in opening up spaces for marginalised voices, campaign journalism and so on. I am worried that we may inadvertently put them into a category of being not recognised; maybe the Minister can just explain that.

I am also concerned that, because this is an area of some contention, this could be a recipe for all sorts of litigious disputes with platforms about content removal, what constitutes those carve-outs and what is a recognised news, journalism or publishing outlet.

I know we will come on to this, but for now I am opposed to Amendment 127 in this group—or certainly concerned that it is an attempt to coerce publishers into a post-Leveson regulatory structure by denying them the protections that the Bill will give news publishers, unless they sign up in certain ways. I see that as blackmail and bullying, which I am concerned about. Much of the national press and many publishers have refused to join that kind of regulatory regime post Leveson, as is their right; I support them in the name of press freedom. Any comments or clarifications would be helpful.

--- Later in debate ---
Those are the two things I am trying to achieve, which in many ways speak for themselves. I hope my noble friend will feel able to support them.
Baroness Fox of Buckley (Non-Afl)

My Lords, I have given notice in this group that I believe Clause 139 should not stand part of the Bill. I want to remove the idea of Ofcom having any kind of advisory committee on misinformation and disinformation, at least as it has been understood. I welcome the fact that the Government have in general steered clear of putting disinformation and misinformation into the Bill, because the whole narrative around it has become politicised and even weaponised, often to delegitimise opinions that do not fit into a narrow set of official opinions or simply to shout abuse at opponents. We all want the truth—if only it was as simple as hiring fact-checkers or setting up a committee.

I am particularly opposed to Amendment 52 from the noble Baroness, Lady Merron, and the noble Lord, Lord Bethell. They have both spoken very eloquently of their concerns, focusing on harmful health misinformation and disinformation. I oppose it because it precisely illustrates my point about the danger of these terms being used as propaganda.

There was an interesting and important investigative report brought out in January this year by Big Brother Watch entitled Inside Whitehall’s Ministry of Truth—How Secretive “Anti-Misinformation” Teams Conducted Mass Political Monitoring. It was rather a dramatic title. We now know that the DCMS had a counter-disinformation unit with a special relationship with social media companies, and it used to recommend that content was removed. Interestingly, in relation to other groups we have discussed, it used third-party contractors to trawl through Twitter looking for perceived terms of service violations as a reason for content to be removed. This information warfare tactic, as we might call it, was used to target politicians and high-profile journalists who raised doubts or asked awkward questions about the official pandemic response. Dissenting views were reported to No 10 and then often denounced as misinformation, with Ministers pushing social media platforms to remove posts and promote Government-sponsored lines.

It has been revealed that a similar fake news unit was in the Cabinet Office. It got Whitehall departments to attack newspapers for publishing articles that analysed Covid-19 modelling, not because the analysis was inaccurate—the modelling itself was not accurate in many instances—but because it feared that any scepticism would affect compliance with the rules. David Davis MP appeared in an internal report on vaccine hesitancy; his crime was arguing against vaccine passports as discriminatory, which was a valid civil liberties objection but was characterised as health misinformation. A similar approach was taken to vaccine mandates, which led to tens of thousands of front-line care workers being sacked even though, by the time this happened, the facts were known: the vaccine was absolutely invaluable in protecting individual health, but it did not stop transmission, so there was no need for vaccine mandates to be implemented. The fact that this was not discussed is a real example of misinformation, but we did not have it in the public sphere.

Professor Carl Heneghan’s Spectator article that questioned whether the rule of six was an arbitrary number was also flagged across Whitehall as misinformation, but we now know that the rule of six was arbitrary. Anyone who has read the former Health Secretary Matt Hancock’s WhatsApp messages, which were leaked to the Telegraph and which many of us read with interest, will know that many things posed as “the science” and factual were driven by politics more than anything else. Covid policies were not all based on fact, yet it was others who were accused of misinformation.

Beyond health, the Twitter files leaked by Elon Musk, when he became its new owner, show the dangers of using the terms misinformation and disinformation to pressure big tech platforms into becoming tools of political censorship. In the run-up to the 2020 election, Joe Biden’s presidential campaign team routinely flagged tweets and accounts it wanted to be censored, and we have all seen the screengrab of email exchanges between executives as evidence of that. Twitter suppressed the New York Post’s infamous Hunter Biden laptop exposé on the spurious grounds that it was “planted Russian misinformation”. The Post was even locked out of its own account. It took 18 months for the Washington Post and the New York Times to get hold of, and investigate, Hunter Biden’s emails, and both determined that the New York Post’s original report was indeed legitimate and factually accurate, but it was suppressed as misinformation when it might have made some political difference in an election.

We might say that all is fair in love and war and elections but, to make us think about what we mean by “misinformation” and why it is not so simple, was the Labour Party attack ad that claimed Rishi Sunak did not believe that paedophiles should go to jail fair comment or disinformation, and who decides? I know that Tobias Ellwood MP called for a cross-party inquiry on the issue, calling on social media platforms to do more to combat “malicious political campaigns”. I am not saying that I have a view one way or another on this, but my question is: in that instance, who gets to label information as “malicious” or “fake” or “misinformation”? Who gets the final say? Is it a black and white issue? How can we avoid it becoming partisan?

Yesterday, at the Second Reading of the Illegal Migration Bill, I listened very carefully to the many contributions. Huge numbers of noble Lords continually claimed that all those in the small boats crossing the channel were fleeing war and persecution—fleeing for their lives. Factually that was inaccurate, according to detailed statistics and evidence, yet no one called those contributors “peddlers of misinformation”, because those speaking are considered to be compassionate and on the righteous side of the angels—at least in the case of the most reverend Primate the Archbishop of Canterbury—and, as defined by this House, they were seen to be saying the truth, regardless of the evidence. My point is that it was a political argument, yet here we are focusing on this notion that the public are being duped by misinformation.

What about those who tell children that there are 140 genders to choose from, or that biological sex is mutable? I would say that is dangerous misinformation or disinformation; others would say that me saying that is bigoted. There is at least an argument to be had, but it illustrates that the labelling process will always be contentious, and therefore I have to ask: who is qualified to decide?

A number of amendments in this group put forward a variety of “experts” who should be, for example, on the advisory committee—those who should decide and those who should not—and I want to look at this notion of expertise in truth. For example, in the report by the Communications and Digital Committee in relation to an incident where Facebook marked as “false” a post on Covid by a professor of evidence-based medicine at Oxford University, the committee asked Facebook about the qualifications of those who made that judgment—of the fact-checkers. It was told that they were

“certified by the International Fact-Checking Network”.

Now, you know, who are they? The professor of evidence-based medicine at Oxford University might have a bit more expertise here, and I do not want a Gradgrind version of truth in relation to facts, and so on.

If it were easy to determine the truth, we would be able to wipe out centuries of philosophy, but if we are going to have a committee determining the truth, could we also have some experts in civil liberties—maybe the Free Speech Union, Big Brother Watch, and the Index on Censorship—on a committee to ensure that we do not take down accurate information under the auspices of “misinformation”? Are private tech companies, or professional fact-checkers, or specially selected experts, best placed to judge the reliability of all sorts of information and of the truth, which I would say requires judgement, analysis and competing perspectives?

Too promiscuous a use of the terms “misinformation” and “disinformation” can also cause problems, and often whole swathes of opinion are lumped together. Those who raised civil liberties objections to lockdown were denounced as “Covidiots”, conspiracy theorists peddling misinformation and Covid deniers, on a par with those who suggested that the virus was linked to everything from 5G masts to a conscious “plandemic”.

Those who now raise queries about suppressing any reference to vaccine harms, or who are concerned that people who have suffered proven vaccine-related harms are not being shown due support, are often lumped in with those who claim the vaccine was a crime against humanity. All are accused of misinformation, with no nuance and no attempt at distinguishing very different perspectives. Therefore, with such wide-ranging views labelled as “misinformation” as a means of censorship, those good intentions can backfire—and I do believe that there are good intentions behind many of these amendments.

--- Later in debate ---
When harmful narratives are identified, the unit works with departments across Whitehall to deploy the appropriate response, which could involve a direct rebuttal on social media or awareness-raising campaigns to promote the facts. Therefore, the primary purpose is not to monitor for harmful content to flag to social media companies—the noble Baroness raised this point—but the department may notify the relevant platform if, in the course of its work, it identifies content that potentially violates platforms’ terms of service, including co-ordinated, inauthentic or manipulative behaviour. It is then up to the platform to decide whether to take action against the content, based on its own assessment and terms of service.
Baroness Fox of Buckley (Non-Afl)

The Minister mentioned “acute” examples of misinformation and used the example of the pandemic. I tried to illustrate that perhaps, with hindsight, what were seen as acute examples of misinformation turned out to be rather more accurate than we were led to believe at the time. So my concern is that there is already an atmosphere of scepticism about official opinion, which is not the same as misinformation, as it is sometimes presented. I used the American example of the Hunter Biden laptop so we could take a step away.

Lord Moylan (Con)

This might be an appropriate moment for me to say—on the back of that—that, although my noble friend explained current government practice, he has not addressed my point on why there should not be an annual report to Parliament that describes what government has done on these various fronts. If the Government regularly meet newspaper publishers to discuss the quality of information in their newspapers, I for one would have entire confidence that the Government were doing so in the public interest, but I would still quite like—I think the Government would agree on this—a report on what was happening, making an exception for national security. That would still be a good thing to do. Will my noble friend explain why we cannot be told?

--- Later in debate ---
I agree but, matching like with like, I seek to amplify. More than tools, we need media literacy to be nothing short of the sword and the shield for young people in the online world—the sword and the shield for all people.
Baroness Fox of Buckley (Non-Afl)

My Lords, for once, I am not entirely hostile to all these amendments—hurrah. In fact, I would rather have media literacy and education than regulation; that seems to me the solution to so much of what we have been discussing. But guess what? I have a few anxieties and I shall just raise them so that those who have put forward the arguments can come back to me.

We usually associate media literacy with schools and young people in education. Noble Lords will be delighted to know that I once taught media literacy: that might explain where we are now. It was not a particularly enlightening course for anybody, but it was part of the communications A-level at the time. I am worried about mandating how schools teach media literacy. As the noble Lord, Lord Knight, will know, I worry about adding yet more to their already overcrowded curriculum, but I note that the amendments actually expand the teaching of media literacy beyond children to adults. I suppose I just have some anxiety about Ofcom becoming the nation’s teacher, presenting users of digital services as though they are hapless and helpless. In other words, I am concerned about an overly paternalistic approach—that we should not be patronising.

The noble Baroness, Lady Kidron, keeps reminding us that content should not be our focus, and that it should be systems. In fact, in practically every discussion we have had, content has been the focus, because that is what will be removed, or not, by how we deal with the systems. That is one of the things that we are struggling with.

Literacy in the systems would certainly be very helpful for everybody. I have an idea—it is not an amendment—that we should send the noble Lord, Lord Allan of Hallam, on a UK tour so that he can explain it to us all; he is not here for this compliment, but every time he spoke in the first week of Committee, I think those of us who were struggling understood what he meant, as he explained complicated and technical matters in a way that was very clear. That is my constructive idea.

Amendment 52A from the noble Lord, Lord Knight of Weymouth, focuses on content, with its

“duty to make available information to allow users to establish the reliability and accuracy of content”.

That takes us back to the difficulties we were struggling with on how misinformation and disinformation will be settled and whether it is even feasible. I do not know whether any noble Lords have been following the “mask wars” that are going on. There are bodies of scientists on both sides on the efficacy of mask wearing—wielding scientific papers at dawn, as it were. These are well-informed, proper scientists who completely disagree on whether it was effective during lockdown. I say that because establishing reliability and accuracy is not that straightforward.

I like the idea of making available

“to users such information that may be necessary to allow users to establish the reliability and accuracy of content encountered on the service”.

I keep thinking that we need adults and young people to say that there is not one truth, such as “the science”, and to be equipped and given the tools to search around and compare and contrast different versions. I am involved in Debating Matters for 16 to 18 year-olds, which has topic guides that say, “Here is an argument, with four really good articles for it and four really good articles against, and here’s a load of background”. Then 16 to 18 year-olds will at least think that there is not just one answer. I feel that is the way forward.

The noble Lord, Lord Clement-Jones, said that I was preaching a counsel of despair; I like to think of myself as a person who has faith in the capacity and potential of people to overcome problems. I had a slight concern when reading the literature associated with online and digital literacy—not so much with the amendments—that it always says that we must teach people about the harms of the online world. I worry that this will reinforce a disempowering idea of feeling vulnerable and everything being negative. One of the amendments talks about a duty to promote users’ “safe use” of the service. I encourage a more positive outlook, incorporating into this literacy an approach that makes people aware that they can overcome and transcend insults and be robust and savvy enough to deal with algorithms—that they are not always victims but can take control over the choices they make. I would give them lessons on resilience, and possibly just get them all to read John Locke on toleration.

Baroness Prashar (CB)

My Lords, I will speak to Amendments 236, 237 and 238 in my name. I thank the noble Lord, Lord Storey, and the noble Baroness, Lady Bennett of Manor Castle, for supporting me. Like others, I thank Full Fact for its excellent briefings. I also thank the noble Lord, Lord Knight, for introducing this group of amendments, as it saves me having to make the case for why media literacy is a very important aspect of this work. It is the other side of regulation; they very much go hand in hand. If we do not take steps to promote media literacy, we could fall into a downward spiral of further and further regulation, so it is extremely important.

It is a sad fact that levels of media literacy are very low. Research from Ofcom has found that one-third of internet users are unaware of the potential for inaccurate and biased information. Further, 40% of UK adult internet users do not have the skills to critically assess information they see online, and only 2% of children have skills to tell fact from fiction online. It will not be paternalistic, but a regulator should be proactively involved in developing media literacy programmes. Through the complaints it receives and from the work that it does, the regulator can identify and monitor where the gaps are in media literacy.

To date, the response to this problem has been for social media platforms to remove content deemed harmful. This is often done using technology that picks up on certain words and phrases. The result has been content being removed that should not have been. Examples of this include organisations such as Mumsnet having social media posts on sexual health issues taken down because the posts use certain words or phrases. At one stage, Facebook’s policy was to delete or censor posts expressing opinions that deviated from the norm, without defining what “norm” actually meant. The unintended consequences of the Bill could undermine free speech. Rather than censoring free speech through removing harmful content, we should give a lot more attention to media literacy.

During the Bill’s pre-legislative scrutiny, the Joint Committee recommended that the Government include provisions to ensure media literacy initiatives are of a high standard. The draft version of the Bill included Clause 103, which strengthened the media literacy provisions in the Communications Act 2003, as has already been mentioned. Regrettably, the Government later withdrew the enhanced media literacy clause, so the aim of my amendments is to reintroduce strong media literacy provisions. Doing so will both clarify and strengthen media literacy obligations on online media providers and Ofcom.

Amendment 236 would place a duty on Ofcom to take steps to improve the media literacy of the public in relation to regulated services. As part of this duty, Ofcom must try to reach audiences who are less engaged and harder to reach through traditional media literacy services. It must also address gaps in the current availability of media literacy provisions for vulnerable users. Many of the existing media literacy services are targeted at children but we need to include vulnerable adults too. The amendment would place a duty on Ofcom to promote availability and increase the effectiveness of media literacy initiatives in relation to regulated services. It seeks to ensure that providers of regulated services take appropriate measures to improve users’ media literacy through Ofcom’s online safety function. This proposed new clause makes provision for Ofcom to prepare guidance about media literacy matters, and such guidance must be published and kept under review.

Amendment 237 would place a duty on Ofcom to prepare a strategy on how it intends to undertake the duty to promote media literacy. This strategy should set out the steps Ofcom proposes to take to achieve its media literacy duties and identify organisations, or types of organisations, that Ofcom will work with to undertake these duties. It must also explain why Ofcom believes the proposed steps will be effective and how it will assess progress. This amendment would also place a duty on Ofcom to have regard to the need to allocate adequate resources for implementing this strategy. It would require Ofcom’s media literacy strategy to be published within six months of this provision coming into force, and to be revised within three years; in both cases this should be subject to consultation.

Amendment 238 would place a duty on Ofcom to report annually on the delivery of its media literacy strategy. This reporting must include steps taken in accordance with the strategy and assess the extent to which those steps have had an effect. This amendment goes further than the existing provisions in the Communications Act 2003, which do not include duties on Ofcom to produce a strategy or to measure progress; nor do they place a duty on Ofcom to reach hard-to-reach audiences who are the most vulnerable in our society to disinformation and misinformation.

Online Safety Bill

Baroness Fox of Buckley (Non-Afl)

My Lords, I see my amendments as being probing. I am very keen on having a robust complaints system, including for individuals, and am open to the argument about an ombudsman. I am listening very carefully to the way that that has been framed. I tabled these amendments because while I know we need a robust complaints system—and I think that Ofcom might have a role in that—I would want that complaints system to be simple and as straightforward as possible. We certainly need somewhere that you can complain.

Ofcom will arguably be the most powerful regulator in the UK, effectively in charge of policing a key element of democracy: the online public square. Of course, one question one might ask is: how do you complain about Ofcom in the middle of it all? Ironically, an ombudsman might end up having to field more than just complaints about the tech companies.

I have suggested completely removing Clauses 150 to 152 from the Bill because of my reservations, beyond this Bill and in general, about a super-complaints system within the regulatory framework, which could be very unhelpful. I might be wrong, and I am open to correction if I have misunderstood, but under the Bill an eligible entity allowed to make such a complaint to Ofcom seems, at the moment, to be appointed on criteria set only by the Secretary of State. That is a serious problem. There is a danger that the Secretary of State could be accused of partiality or politicisation. We therefore have to think carefully about that.

I also object to the idea that certain organisations are anointed with extra legitimacy as super-complaints bodies. We have seen this more broadly. You will often hear Ministers say, in relation to consultations, “We’ve consulted stakeholders and civil society organisations”, when they are actually often referring to lobbying organisations with interests. There is a free-for-all for NGOs and interest groups. We think of a lot of charities as very positive but they are not necessarily neutral. I just wanted to query that.

There is also a danger that organisations will end up speaking on behalf of all women, all children or all Muslims. That is something we need to be careful about in a time of identity politics. We have seen it happen offline with self-appointed community leaders, but say, for example, there is a situation where there is a demand by a super-complainant to remove a particular piece of content that is considered to be harmful, such as an image of the Prophet Muhammad. These are areas where we have to admit that if people then say, “We speak on behalf of”, they will cause problems.

Although charities historically have had huge credibility, as I said, we know from some of the scandals that have affected charities recently that they are not always the saviours. They are certainly not immune from corruption, political bias, political disputes and so on.

I suppose my biggest concern is that the function in the Bill is not open to all members of the public. That seems to be a problem. Therefore, we are saying that certain groups and individuals will have a greater degree of influence over the permissibility of speech than others. There are some people who have understood these clauses to mean that it would act like a class action—that if enough people are complaining, it must be a problem. But, as noble Lords will know from their inboxes, sometimes one is inundated with emails and it does not necessarily show a righteous cause. I do not know about anyone else who has been involved in this Bill, but I have had exactly the same cut-and-paste email about violence against women and girls hundreds of times. That usually means a well-organised, sometimes well-funded, mass mobilisation. I have no objection, but just because you get lots of emails it does not mean that it is a good complaint. If you get only one important email complaint that is written by an individual, surely you should respect that minority view.

Is it not interesting that the assumption of speakers so far has been that the complaints will always be that harms have not been removed or taken notice of? I was grateful when the noble Baroness, Lady Kidron, mentioned the Free Speech Union and recognised, as I envisage, that many of the complaints will be about content having been removed—they will be free speech complaints. Often, in that instance, it will be an individual whose content has been removed. I cannot see how the phrasing of the Bill helps us in that. Although I am a great supporter of the Free Speech Union, I do not want it to represent or act on behalf of, say, Index on Censorship or even an individual who simply thinks that their content should not be removed—and who is no less valid than an official organisation, however much I admire it.

I certainly do not want individual voices to be marginalised, which I fear the Bill presently does in relation to complaints. I am not sure about an ombudsman; I am always wary of creating yet another more powerful body in the land because of the danger of over-bureaucratisation.

--- Later in debate ---
Viscount Camrose (Con)

I am happy to meet and discuss this. We are expanding what they are able to receive today under the existing arrangements. I am happy to meet any noble Lords who wish to take this forward to help them understand this—that is probably best.

Amendments 287 and 289 from the noble Baroness, Lady Fox of Buckley, seek to remove the provision for super-complaints from the Bill. The super-complaints mechanism is an important part of the Bill’s overall redress mechanisms. It will enable entities to raise concerns with Ofcom about systemic issues in relation to regulated services, which Ofcom will be required to respond to. This includes concerns about the features of services or the conduct of providers creating a risk of significant harm to users or the public, as well as concerns about significant adverse impacts on the right to freedom of expression.

On who can make super-complaints, any organisation that meets the eligibility criteria set out in secondary legislation will be able to submit a super-complaint to Ofcom. Organisations will be required to submit evidence to Ofcom, setting out how they meet these criteria. Using this evidence, Ofcom will assess organisations against the criteria to ensure that they meet them. The assessment of evidence will be fair and objective, and the criteria will be intentionally strict to ensure that super-complaints focus on systemic issues and that the regulator is not overwhelmed by the number it receives.

Baroness Fox of Buckley (Non-Afl)

To clarify and link up the two parts of this discussion, can the Minister perhaps reflect, when the meeting is being organised, on the fact that the organisations and the basis on which they can complain will be decided by secondary legislation? So we do not know which organisations or what the remit is, and we cannot assess how effective that will be. We know that the super-complainants will not want to overwhelm Ofcom, so things will be bundled into that. Individuals could be excluded from the super-complaints system in the way that I indicated, because super-complaints will not represent everyone, or even minority views; in other words, there is a gap here now. I want that bit gone, but that does not mean that we do not need a robust complaints system. Before Report at least—in the meetings in between—the Government need to advise on how you complain if something goes wrong. At the moment, the British public have no way to complain at all, unless someone sneaks it through in secondary legislation. This is not helpful.

Viscount Camrose (Con)

As I said, we are happy to consider individual complaints and super-complaints further.

--- Later in debate ---
The Bill is an opportunity to improve all of this. There are pieces of very good practice and, clearly, areas where not enough is being done and too much very harmful content—particularly content that is posted with the express intent of causing harm—is being allowed to circulate. I hope that, through the legislation and by getting these protocols right, we can get to the point where we are both preventing lower-risk people moving into a higher-risk category and enabling people already in a high-risk category to get the help, support and advice that they need. Nowadays, online is often the primary tool that could benefit them.
Baroness Fox of Buckley (Non-Afl)

My Lords, as usual, the noble Lord, Lord Allan of Hallam, has explained with some nuance the trickiness of this area, which at first sight appears obvious—black and white—but is not quite that. I want to explore some of these things.

Many decades ago, I ran a drop-in centre for the recovering mentally ill. I remember my shock the first time that I came across a group of young women who had completely cut themselves up. It was a world that I did not know but, at that time, a very small world—a very minor but serious problem in society. Decades later, going around doing lots of talks, particularly in girls’ schools where I am invited to speak, I suddenly discovered that whole swathes of young women were doing something that had been considered a mental health problem, often hidden away. Suddenly, people were talking about a social contagion of self-harm happening in the school. Similarly, there were discussions about eating disorders being not just an individual mental health problem but something that kind of grew within a group.

Now we have the situation with suicide sites, which are phenomenal at exploiting those vulnerabilities. This is undoubtedly a social problem of some magnitude. I do not in any way want to minimise it, but I am not sure exactly how legislation can resolve it or whether it will, even though I agree that it could do certain things.

Some of the problems that we have, which have already been alluded to, really came home to me when I read about Instagram bringing in some rules on self-harm material, which ended up with the removal of posts by survivors of self-harm discussing their illness. I read a story in the Coventry Evening Telegraph—I was always interested because I ran the drop-in centre for the recovering mentally ill in Coventry—where a young woman had some photographs taken down from Instagram because they contained self-harm images of her scars. The young woman who had suffered these problems, Marie from Tile Hill, wanted to share pictures of her scars with other young people because she felt that they would help others recover. She had got over it and was basically saying that the scars were healing. In other words, it was a kind of self-help group for users online, yet it was taken down.

It is my usual problem: this looks to be a clear-cut case, yet the complexities can lead to problems of censorship of a sort. I was really pleased that the noble Baroness, Lady Finlay, stressed the point about definitions. Search engines such as Google have certainly raised the problem of a concern, or worry, that people looking for help—or even looking to write an essay on suicide, assisted suicide or whatever—will end up not being able to find appropriate material.

I also want to ask a slightly different question. Who decides which self-harms are in our definitions and what is the contagion? When I visit schools now, there is a new social contagion in town, I am afraid to say, which is that of gender dysphoria. In the polling for Jo-Anne Nadler’s newly published report, Show, Tell and Leave Nothing to the Imagination, half of the young people interviewed said that they knew someone at their school who wanted to change gender or had already, while one in 10 said that they wanted to change their gender.

That is just an observation; your Lordships might ask what it has to do with the Bill. But these are actually problem areas that are being affirmed by educational organisations and charities, by which I mean that organisations that have often worked with government have been consulted as stakeholders. They have recommended to young women where online to purchase chest binders, which will stop them developing, or where and how to use puberty blockers. Eventually, they are affirming double mastectomies or castration. By the way, this is of course all over social media, because once you start to search on it, TikTok is rife with it. Algorithmically, we are reminded all the time to think about systems: once you have had a look at it, it is everywhere. My point is that this is affirmed socially.

Imagine a situation whereby, in society offline, some young woman who has an eating disorder and weighs 4 stone comes to you and says “Look, I’m so fat”. If you said, “Yes, of course you’re fat—I’ll help you slim”, we would think it was terrible. When that happens online, crudely and sometimes cruelly, we want to tackle it here. If some young woman came to you and said, “I want to self-harm; I feel so miserable that I want to cut myself”, and you started recommending blades, we would think it was atrocious behaviour. In some ways, that is what is happening online and that is where I have every sympathy with these amendments. Yet when it comes to gender dysphoria, which actually means encouraging self-harm, because it is a cultural phenomenon that is popular it does not count.

In some ways, I could be arguing that we should future-proof this legislation by including those self-harms in the definition put forward by the amendments in this group. However, I raise it more to indicate that, as with all definitions, it is not quite as easy as one would think. I appreciate that a number of noble Lords, and others engaged in this discussion, might think that I am merely exhibiting prejudice rather than any genuine compassion or concern for those young people. I would note that if noble Lords want to see a rising group of people who are suicidal and feel that their life is really not worth living, search out the work being done on detransitioners who realise too late that that affirmation by adults has been a disaster for them.

On the amendments suggesting another advisory committee with experts to advise Ofcom on how we regulate such harms, I ask that we are at least cautious about which experts. To mention one of the expert bodies, Mermaids has become controversial and has actually been advocating some of those self-harms, in my opinion. It is now subject to a Charity Commission investigation but has been on bodies such as this advising about young people. I would not think that appropriate, so I just ask that some consideration is given to which experts would be on such bodies.

--- Later in debate ---
Baroness Stowell of Beeston (Con)

My Lords, I start by commending my noble friend Lady Morgan on her clear introduction to this group of amendments. I also commend the noble Baroness, Lady Kidron, on her powerful speech.

From those who have spoken so far, we have a clear picture of the widespread nature of some of the abuse and offences that women experience when they go online. I note from what my noble friend Lady Morgan said that there is widespread support from a range of organisations outside the Committee for this group of amendments. She also made an important and powerful point about the potential chilling effect of this kind of activity on women, including women in public life, being able to exercise their right to freedom of expression.

I feel it is important for me to make it clear that—this is an obvious thing—I very much support tough legal and criminal sanctions against any perpetrator of violence or sexual abuse against women. I really do understand and support this, and hear the scale of the problem that is being outlined in this group of amendments.

Mine is a dissenting voice, in that I am not persuaded by the proposed solution to the problem that has been described. I will not take up a lot of the Committee’s time, but any noble Lords who were in the House when we were discussing a group of amendments on another piece of legislation earlier this year may remember that I spoke against making misogyny a hate crime. The reason why I did that then is similar, in that I feel somewhat nervous about introducing a code of conduct which is directly relevant to women. I do not like the idea of trying to address some of these serious problems by separating women from men. Although I know it is not the intention of a code such as this or any such measures, I feel that it perpetuates a sense of division between men and women. I just do not like the idea that we live in a society where we try to address problems by isolating or categorising ourselves into different groups of people, emphasising the sense of weakness and being victims of any kind of attack or offence from another group, and assuming that everybody who is in the other group will be a perpetrator of some kind of attack, criticism or violence against us.

My view is that, in a world where we see some of this serious activity happening, we should do more to support young men and boys to understand the proper expectations of them. When we get to the groups of amendments on pornography and what more we can do to prevent children’s access to it, I will be much more sympathetic. Forgive me if this sounds like motherhood and apple pie, but I want us to try to generate a society where basic standards of behaviour and social norms are shared between men and women, young and old. I lament how so much of this has broken down, and a lot of the problems we see in society are the fault of political and—dare I say it?—religious leaders not doing more to promote some of those social norms in the past. As I said, I do not want us to respond to the situation we are in by perpetuating more divisions.

I look forward to hearing what my noble friend the Minister has to say, but I am nervous about the solution proposed in the amendments.

Baroness Fox of Buckley (Non-Afl)

My Lords, it gives me great pleasure to follow the noble Baroness, Lady Stowell of Beeston, not least because she became a dissenting voice, and I was dreading that I might be the only one.

First, I think it important that we establish that those of us who have spent decades fighting violence against women and girls are not complacent about it. The question is whether the physical violence we describe in the Bill is the same as the abuse being described in the amendments. I worry about conflating online incivility, abuse and vile things said with physical violence, as is sometimes done.

I note that Refuge, an organisation I have a great deal of respect for, suggested that the user empowerment duties, by asking women users to filter their own online experience, placed the same burden on women as asking them to take charge of their own safety and protect themselves from violence offline. I thought that was unfair, because user empowerment duties, and deciding what you filter out, can be women using their agency.

Online Safety Bill

Baroness Fox of Buckley (Non-Afl)

My Lords, I broadly support all these amendments in spirit, since, as we have heard, they tackle excessive levels of influence that any Secretary of State is awarding themselves to shape the strategic priorities and codes of conduct of Ofcom. I will speak to Amendments 254 and 260, tabled by the noble Lord, Lord Moylan, who I am glad to see in his place. He will see in Hansard that he was about to be much missed. I cannot do him credit, but I will carry on regardless because I support his amendments.

The main difference between Amendment 254 and other similar amendments is that it requires that any guidance issued to Ofcom—under Clause 157, for example—is

“approved by resolution of each House of Parliament”

rather than by committees. However, the spirit of it, which is to remove the Secretary of State’s power to give wide-ranging guidance or instructions about Ofcom’s functions, and the concerns that we have about that, is broadly in line with everything else we have heard.

It is important to ask whether it is appropriate for our right to freedom of expression to be curtailed by secondary legislation, which cannot be amended, which has little parliamentary oversight and on which challenges are very often reduced to nothing more than rhetorical whinges in this House. That would mean that the power exercised by the Secretary of State would bypass the full democratic process.

In fact, it also weakens our capacity to hold Ofcom to account. One thing that has become apparent throughout our Committee deliberations is that—as I think the noble Baroness, Lady Stowell, indicated—Ofcom will be an uber-regulator. It is itself very powerful, as I have raised, with respect to potentially policing and controlling what UK citizens see, read and have access to online, what they are allowed to search, private messaging and so on. In some ways, I want Ofcom to have much more scrutiny and be accountable to Parliament, elected politicians and the public realm. But that is not the same as saying that Ofcom should be accountable and answerable to the Secretary of State; that would be a whole different ball game. It could be said that the Secretary of State will be elected, but we know that that is a sleight of hand.

I want more accountability and scrutiny of Ofcom by individual users of online services; we even talked about that, the other day, in relation to complaints. I want more democratic scrutiny, but the Bill does the opposite of that by allowing the Government a huge amount of executive power to shape the proposed system of online speech moderation and even influence political discourse in the public square.

I want to move on to that issue. Under the Bill, the Secretary of State will have the power to set Ofcom’s strategic priorities, direct Ofcom to modify its code of practice through secondary legislation, set criteria for platform categorisation and designate priority illegal offences. They will be able to change codes of practice for “reasons of public policy”, which is as vague a phrase as you will ever get. I fear that, frankly, that level of discretion is likely to lead to a highly politicised and—my dread—censorship-heavy approach to regulation.

The Secretary of State could come under extreme pressure to respond to each individual concerning case of digital content—whatever happens to be in the news this week—with an ever-expanding list of areas to be dealt with. I dread that this will inevitably be exploited by highly political lobbyists and organisations, who will say, “You must act on this. This is hate speech. You’ve got to do something about this”. That is a completely arbitrary way to behave.

According to the Bill, Ofcom has no choice but to comply, and that obviously leads to the dangers of politicisation. I do not think it is scaremongering to say that this is politicisation and could compromise the independence of Ofcom. The Secretary of State’s power of direction could mean that the Government are given the ability to shape the permissibility of categories of online content, based on the political mood of the day, and the political whims of a specific Secretary of State to satisfy a short-term moral panic on a particular issue.

One question for the Minister, and the Government, is: should you ever create powers that you would not want to see your political opponents exercising? The Secretary of State today will not always be the Secretary of State tomorrow; they will not always be in the same image. Awarding such overwhelming powers, and the potential politicising of policing speech, might feel comfortable for the Government today; it might be less comfortable when you look at the way that some people view, for example, the tenets of Conservatism.

In recent weeks, since a “National Conservatism” conference was held up the road, I have heard members of opposition parties describe the contents of Conservatism as “Trumpist”, “far-right” and “fascist” hate speech. I am worried—on behalf of the Government—that some of those people might end up as a Secretary of State and it could all blow up in their face, as it were, metaphorically.

In all seriousness, because I am really not interested in the fate of either the Opposition or the Government in terms of their parties, I am trying to say that it is too arbitrary. In a situation where we have such weak commitments to freedom of conscience, thought or speech in this Bill, I really do not want to give the Secretary of State the power to broaden out the targets that might be victim to it.

Finally—and I apologise to the noble Lord, Lord Moylan, who would have been much more professional, specific and hard-hitting on his amendment—from what I have heard, I hope that all the tablers of the amendments from all parties might well have got together by Report and come up with satisfactory amendments that will deal with this. I think we all agree, for once, that something needs to be done to curtail power. I look forward to supporting that later in the process.

Online Safety Bill

Baroness Grey-Thompson (CB)

My Lords, I speak in favour of Amendments 124, 126 and 227 to which my name is attached. I will reserve my comments mostly to the Bill’s loophole on newspaper comment sections.

These forums would qualify as social media platforms under the Bill’s definition were it not for a special exemption in Clause 49. They have been found to host some of the most appalling and despicable content online. I will paraphrase some examples so as not to subject the Committee to the specific language used, but they include anti-Semitic slurs in comments appearing under articles covering a violent attack on a synagogue; Holocaust denial; and speculation that Covid was created and spread by a secretive global cabal of powerful individuals who control the world’s leaders like puppets.

Some of the worst abuse is reserved for women in public life, which I and others in your Lordships’ House have personally experienced. In an article about a female leader, comments included that she should be struck down or executed by the SAS. Others commented graphically on her appearance and made disturbing sexual remarks. Another woman, Professor Fowler—who the noble Lord, Lord Clement-Jones, has already discussed—was described as having a sick mind and a mental disorder; one comment implied that a noose should be prepared for her. There are many more examples.

Comment sections are in too many cases badly regulated and dangerous places for members of the public. The exemption for them is unwarranted. Specifically, it protects any social media platform where users make comments in response to what the Bill describes as “provider content”. In this case, that means comments posted in response to articles published by the newspaper. This is materially no different from user exchanges of any other kind and should be covered just the same.

The Government have previously argued that there should be a distinction between newspaper comment sections and other platforms, in that other platforms allow for virality because posts that are liked and retweeted do better than the others. But this is exactly the same for many modern comment sections. Lots of these include functionality to upvote certain comments, which can then rise to the top of the comment section on that article.

There are estimated to be around 15 million people on Twitter in the UK—I am one of them—but more than twice that number read newspaper websites every month. These comment sections are social media platforms with the same power, reach and capacity to cause harm as the US giants. We should not treat them any differently on account of the fact that they are based out of Fleet Street rather than Silicon Valley.

There are some concerns that the Bill’s requirements would put an undue burden on small organisations running comment sections, so this amendment would apply only to organisations with an annual turnover in excess of £100 million. This would ensure that only the largest titles, which can surely afford it, are required to regulate their comment sections. Amendment 124 would close the comment section loophole, and I urge the Government to act on it.

It is a great shame that, due to the lateness of the hour, my noble friend Lady Hollins is unable to be here. She would strongly support Amendment 126 on several points but specifically wanted to talk about how the exemption creates double standards between how the public and news publishers are treated, and puts platforms and Ofcom in an impossible situation over whether newspapers meet vague criteria to access exemptions.

I also support Amendments 126 and 227, which would help protect the public from extremist and other dangerous websites by preventing them accessing the separate media exemption. In all these matters, we must not let overbroad exemptions and loopholes undermine what good work this Bill could do.

Baroness Fox of Buckley (Non-Afl)

My Lords, while considering this group of amendments, a comment by Index on Censorship came to mind. Critical of aspects of the Bill, it worried out loud about whether this legislation

“will reverse the famous maxim ‘publish and be damned’, to become, ‘consider the consequences of all speech, or be damned’”.

In that context, I am very grateful—relieved at least— that the freedom of the press is given due regard and protections in the Bill. Freedom of the press is one practical form in which freedom of expression exists and is invaluable in a democracy. It is so crucial that it has been at the centre of democratic struggles in this very Parliament for more than five centuries—ever since the first printing press meant that the masses could gain access to the written word. It fuelled the pamphleteers of the English Civil War. It made a hero of MP John Wilkes in the 18th century, his victory giving the press freedom to report on the goings-on of the great and the good, to muckrake and to dig the dirt; long may that continue.

So I welcome that news publishers’ content on their own websites is not in scope of the legislation; that if platforms take down or restrict access to trusted news sources, they will face significant sanctions; that platforms must notify news publishers if they want to take down their content and, if the publisher disputes that, the platform must not remove it until the dispute is resolved; and that Ofcom must also review the efficacy of how well the platforms are protecting news.

I say “Hurrah!” to all that. If only the Bill treated all content with such a liberal and proportionate approach, I would not be standing up and speaking quite so much. But on the press specifically, I strongly oppose Amendments 124 and 126—as well as Amendment 127, now that it has been explained and I understand it; I did not quite before. Amendment 124 would mean that the comment section of the largest newspaper websites were subject to the regulation in the Bill.

It is important to note—as has been explained—that user comments are already regulated by IPSO, the Independent Press Standards Organisation, that individual publishers have strong content moderation policies, and that the editor is ultimately liable for comments. That is the key issue here. This is about protecting editorial independence from state interference. Amendment 124 does the opposite. That amendment would also restrict the ability of UK citizens to discuss and engage with publishers’ content.

It is part of a lively and vital public square to be free to debate and discuss articles in newspapers. We have heard some pretty graphic and grim descriptions from the noble Baroness, Lady Grey-Thompson, and the noble Lord, Lord Clement-Jones, about those comments; but for me, ironically, the comment section in newspapers is a form of accountability of the press to readers and the audience. Although the descriptions were grim, much of that section is intelligent, well-informed and interesting feedback. I will talk a little about hate afterwards.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

I am mindful of the examples the noble Lord gave in his speech. Looking at some of the provisions set out in subsection (2) about a body being

“subject to a standards code”

or having

“policies and procedures for handling and resolving complaints”,

I think on first response that those examples he gave would be covered. But I will certainly take on board the comments he made and those the noble Baroness, Lady Gohir, made as well and reflect on them. I hope—

Baroness Fox of Buckley (Non-Afl)

On a final point of clarification, in contrast, I think the exemption may be too narrow, not too broad. With the emergence of blogs and different kinds of news organisations—I think the noble Lord, Lord Allan, described well the complexity of what we have—and some of the grimmer, grosser examples of people who might play the system, does the Minister acknowledge that that might be dealt with by the kind of exemptions that have been used for RT? When somebody is really an extremist representative of, I do not know, ISIS, pretending to be a media organisation, the sensible thing to do would be to exempt them, rather than to overtighten the exemptions, so that new, burgeoning, widely read online publications can have press freedom protection.

Lord Parkinson of Whitley Bay (Con)

I will certainly take on board the points the noble Baroness raises. Hearing representations in both directions on the point would, on first consideration, reassure me that we have it right, but I will certainly take on board the points which the noble Baroness, the noble Lord and others have raised in our debate on this. As the noble Lord, Lord Allan, suggests, I will take the opportunity to discuss it with Ofcom, as we will do on many of the issues which we are discussing in this Committee, to make sure that its views are taken on board before we return to these and other issues on Report.

Online Safety Bill

Amendment 263 would complete this systemic implementation of risk assessment by ensuring that future reviews of the regime by the Secretary of State include a broad assessment of the harms arising from regulated services, not just regulated content. This amendment would ensure ongoing consideration of risk management, including whether the regime needs expanding or contracting. I urge the Minister to support Amendments 195, 239 and 263.
Baroness Fox of Buckley (Non-Afl)

My Lords, like others, I thank the Whips for intervening to protect children from hearing details that are not appropriate for the young. I have to say that I was quite relieved because I was rather squirming myself. Over the last two days of Committee, I have been exposed to more violent pornographic imagery than any adult, never mind a child, should be exposed to. I think we can recognise that this is certainly a challenging time for us.

I do not want any of the comments I will now make to be seen as minimising understanding of augmented reality, AI, the metaverse and so on, as detailed so vividly by the noble Baronesses, Lady Harding and Lady Finlay, in relation to child safety. However, I have some concerns about this group, in terms of proportionality and unintended outcomes.

Amendment 239, in the names of the right reverend Prelate the Bishop of Oxford, the noble Lord, Lord Clement-Jones, and the noble Viscount, Lord Colville of Culross, sums up some of my concerns about a focus on future-proofing. This amendment would require Ofcom to produce reports about future risks, which sounds like a common-sense demand. But my question is about us overly focusing on risk and never on opportunities. There is a danger that the Bill will end up recommending that we see these new technologies only in a negative way, and that we in fact give more powers to expand the scope for harmful content, in a way that stifles speech.

Beyond the Bill, I am more generally worried about what seems to be becoming a moral panic about AI. The precautionary principle is being adopted, which could mean stifling innovation at source and preventing the development of great technologies that could be of huge benefit to humanity. The over-focus on the dangers of AI and augmented reality could mean that we ignore the potential large benefits. For example, if we have AI, everyone could have an immediately responsive GP in their pocket—goodness knows that, for those trying to get an appointment, that could be of great use and benefit. It could mean that students have an expert tutor in every subject, just one message away. The noble Baroness, Lady Finlay, spoke about the fantastic medical breakthroughs that augmented reality can bring to handling neurological damage. Last night, I cheered when I saw how someone who has never been able to walk now can, through those kinds of technologies. I thought, “Isn’t this a brilliant thing?” So all I am suggesting is that we have to be careful that we do not see these new technologies only as tools for the most perverted form of activity among a small minority of individuals.

I note, with some irony, that fewer qualms were expressed by noble Lords about the use of AI when it was proposed to scan and detect speech or images in encrypted messages. As I argued at the time, this would be a threat to WhatsApp, Signal and so on. Clauses 110 and 124 have us using AI as a blunt proactive technology of surveillance, despite the high risks of inaccuracy, error and false flags. But there was great enthusiasm for AI then, when it was having an impact on individuals’ freedom of expression—yet, here, all we hear are the negatives. So we need to be balanced.

I am also concerned about Amendment 125, which illustrates the problem of seeing innovation only as a threat to safety and a potential problem. For example, if the Bill considers AI-generated content to be user-generated content, only large technology companies will have the resources—lawyers and engineers—necessary to proceed while avoiding crippling liability.

In practice, UK users risk being blocked out from new technologies if we are not careful about how we regulate here. For example, users in the European Union currently cannot access Google Bard AI assistant because of GDPR regulations. That would be a great loss because Google Bard AI is potentially a great gain. Despite the challenges of the likes of ChatGPT and Bard AI that we keep reading about, with people panicking that this will lead to wide-scale cheating in education and so on, this has huge potential as a beneficial technology, as I said.

I have mentioned that one of the unintended consequences—it would be unintended—of the whole Bill could be that the UK becomes a hostile environment for digital investment and innovation. So start-ups that have been invested in—like DeepMind, a Google-owned and UK-based AI company—could be forced to leave the UK, doing huge damage to the UK’s digital sector. How can the UK be a science and technology superpower if we end up endorsing anti-innovation, anti-progress and anti-business measures by being overly risk averse?

I have the same concerns about Amendment 286, which requires periodic reviews of new technology content environments such as the metaverse and other virtual augmented reality settings. I worry that it will not be attractive for technology companies to confidently invest in new technologies if there is this constant threat of new regulations and new problems on the horizon.

I have a query that mainly relates to Amendment 125 but that is also more general. If virtual augmented reality actually involves user-to-user interaction, like in the metaverse, is it not already covered in the Bill? Why do we need to add it in? The noble Baroness, Lady Harding, said that it has got to the point where we are not able to distinguish fake from real, and augmented reality from reality. But she concludes that that means that we should treat fake as real, which seems to me to rather muddy the waters and make it a fait accompli. I personally—

Baroness Harding of Winscombe (Con)

I am sorry to interrupt, but I will make a clarification; the noble Baroness is misinterpreting what I said. I was actually quoting the godfather of AI and his concerns that we are fast approaching a space where it will be impossible—I did not say that it currently is—to distinguish between a real child being abused and a machine learning-generated image of a child being abused. So, first, I was quoting the words of the godfather of AI, rather than my own, and, secondly, he was looking forward—only months, not decades—to a very real and perceived threat.

Baroness Fox of Buckley (Non-Afl)

I personally think that it is a pessimistic view of the future to suggest that humanity cannot rise to the task of distinguishing between deepfakes and real images. Organising all our lives, laws and liberties around the deviant predilections of a minority of sexual offenders, on the basis that none of us will be able to tell the difference in the future when it comes to that kind of activity, is rather dangerous for freedom and innovation.

Lord Russell of Liverpool (CB)

My Lords, I will speak very briefly. I could disagree with much of what the noble Baroness just said, but I do not need to go there.

What particularly resonates with me today is that, since I first entered your Lordships’ House at the tender age of 28 in 1981, this is the first time I can ever remember us having to rein back what we are discussing because of the presence of young people in the Public Gallery. I reflect on that, because it brings home the gravity of what we are talking about and its prevalence; we cannot run away or hide from it.

I will ask the Minister about the International Regulatory Cooperation for a Global Britain: Government Response to the OECD Review of International Regulatory Cooperation of the UK, published 2 September 2020. He will not thank me for that, because I am sure that he is already familiar and word-perfect with this particular document, which was pulled together by his noble friend, the noble Lord, Lord Callanan. I raise this because, to think that we can in any way, shape or form, with this piece of legislation, stem the tide of what is happening in the online world—which is happening internationally on a global basis and at a global level—by trying to create regulatory and legal borders around our benighted island, is just for the fairies. It is not going to happen.

Can the Minister tell us about the degree to which, at an international level, we are proactively talking to, and learning from, other regulators in different jurisdictions, which are battling exactly the same things that we are? To concentrate the Minister’s mind, I will point out what the noble Lord, Lord Callanan, committed the Government to doing nearly three years ago. First, in relation to international regulatory co-operation, the Government committed to

“developing a whole-of-government IRC strategy, which sets out the policies, tools and respective roles of different departments and regulators in facilitating this; … developing specific tools and guidance to policy makers and regulators on how to conduct IRC; and … establishing networks to convene international policy professionals from across government and regulators to share experience and best practice on IRC”.

I am sure that, between now and when he responds, he will be given a detailed answer by the Bill team, so that he can tell us exactly where the Government, his department and Ofcom are in carrying out the commitments of the noble Lord, Lord Callanan.

Online Safety Bill

Lord Allan of Hallam (LD)

My Lords, I am pleased that the noble Lord, Lord Knight of Weymouth, has given us an opportunity to talk about transparency reports with these amendments, which are potentially a helpful addition to the Bill. Transparency is one of the huge benefits that the legislation may bring. One of the concerns that the public have and that politicians have always had with online platforms is that they appear to be a black box—you cannot see what is going on in them.

In the entire edifice that we are constructing in the Online Safety Bill, there are huge opportunities to change that. The platforms will have to do risk assessments —there are measures in the Bill to make sure that information about these is put out—and they will have to take active steps to mitigate any risks they find. Again, we may get directions and guidance from Ofcom that will explain to the public exactly what is expected of them. The final piece of the jigsaw is the transparency reports that show the outcomes—how a platform has performed and what it has done to meet its obligations in dealing with content and behaviour on its services.

For the record, I previously worked for one of the platforms, and I would have said that I was on the pro-transparency wing of the transparency party inside the company. I believed that it was in the platform’s interest: if you do not tell people what you are doing, they will make things up about you, and what they make up will generally be worse than what you are actually doing. So there are huge advantages to the platforms from being transparent.

The noble Lord, Lord Knight, has picked up on some important points in his Amendment 160B, which talks about making sure that the transparency report is not counterproductive by giving the bad guys information that they could use to ill effect. That is a valid point; it is often debated inside the platforms. Sometimes, I argued furiously with my colleagues in the platforms about why we should disclose information. They would ask, “What about the bad guys?” Sometimes I challenged that, but other times it would have been a genuine and accurate concern. The noble Lord mentioned things such as child sexual abuse material, and we have to recognise that the bad guys are incredibly devious and creative, and if you show them anything that they can use against you to get around your systems, they will try to do that. That is a genuine and valid concern.

The sort of thing that you might put into a transparency report is, for example, whether you have banned particular organisations. I would be in favour of indicating to the public that an organisation is banned, but you can see that the potential impact of that is that all the people you are concerned about would create another organisation with a different name and then get back on to your platform. We need to be alive to those kinds of concerns.

It is also relevant to Amendment 165 and the terms of service that the more granular and detailed your terms of service are, the better they are for public information, but there are opportunities to get around them. Again, we would have that argument internally. I would say, “If we are prohibiting specific hate speech terms, tell people that, and then they won’t use them”. For me, that would be a success, as they are not using those hate speech terms anymore, but, of course, they may then find alternative hate speech terms that they can use instead. You are facing that battle all the time. That is a genuine concern that I hope we will be able to debate. I hope that Ofcom will be able to mitigate that risk by discussing with platforms what these transparency reports should look like. In a sense, we are doing a risk assessment of the transparency report process.

Amendment 229 on effectiveness is really interesting. My experience was that if you did not have a transparency report, you were under huge pressure to produce one and that once you produced one, nobody was interested. For fear of embarrassing anyone in the Committee, I would be curious to know how many noble Lords participating in this debate have read the transparency reports already produced by Meta Platforms, Google and others. If they have not read them, they should not be embarrassed, because my experience was that I would talk to regulators and politicians about something they had asked me to come in to talk about, such as hate speech or child sexual abuse material, and I learned to print off the transparency report. I would go in and say, “Well, you know what we are doing; it’s in our transparency report”. They would ask, “What transparency report?”, and I would have to show them. So, having produced a transparency report, every time we published it, we would expect there to be public interest, but little use was made of it. That is not a reason not to do them—as I said, I am very much in favour of doing them—but, on their own, they may not be effective, and Amendment 229 touches on that.

I was trying to think of a collective noun for transparency reports and, seeing as they shed light, I think it may be a “chandelier”. Where we may get the real benefit is if Ofcom can produce a chandelier of transparency reports, taking all the information it gets from the different platforms, processing it and selecting the most relevant information—the reports are often too long for people to work their way through—so that it can enable comparisons. That is really good and it is quite good for the industry that people know that platform A did this, platform B did that, and platform C did something else. They will take note of that, compare with each other and want to get into the best category. It is also critical that Ofcom puts this into user-friendly language, and Ofcom has quite a good record of producing intelligible reports. In the context of Amendment 229, a review process is good. One of the things that might come out of that, thinking ahead, would be Ofcom’s role in producing meta transparency reports, the chandelier that will shed light on what the whole sector is doing.

Baroness Fox of Buckley (Non-Afl)

My Lords, for once I want to be really positive. I am actually very positive about this whole group of amendments because more transparency is essential in what we are discussing. I especially like Amendment 165 from the noble Lord, Lord Stevenson of Balmacara, because it is around terms of service for user-to-user services and ensures that information can be sought on the scope as well as the application. This is important because so much has been put on user-to-user services as well as on terms of service. You need to know what is going on.

I want particularly to compliment Amendment 229 that says that transparency reports should be

“of sufficient quality to enable service users and researchers to make informed judgements”,

et cetera. That is a very elegant way in which to say that they should not be gobbledegook. If we are going to have them, they should be clear and of a quality that we can read. Obviously, we do not want them to be unreadable and full of jargon and legalistic language. I am hoping that that is the requirement.

--- Later in debate ---
I hope that is a helpful counterargument to the idea that platforms should automatically report material. However, I recognise that it leaves an open question. When people engage in that kind of behaviour online and it has serious real-world consequences, how do we make sure that they do not feel that it is consequence-free—that they understand that there are consequences? If they have broken the law, they should be prosecuted. There may be something in streamlining the process where a complainant goes to the police and the police are able to access the information they need, having first assessed that it is worth prosecuting and illegal, so that we make that loop work first before we head in the direction of having platforms report content en masse because they believe it may have violated laws where we are not at that most serious end of the spectrum.
Baroness Fox of Buckley (Non-Afl)

My Lords, so few of us are involved in this discussion that we are now able to write each other’s speeches. I thank the noble Lord, Lord Allan of Hallam, for articulating some of my concerns, probably more elegantly than I will myself. I will focus on two amendments in this group; in fact, there are lots of interesting things, but I will focus on both the amendments from the noble Lord, Lord Bassam of Brighton.

On the issue of proactive steps to remove listings of knives for young people, I am so sympathetic to this because in a different area of my life I am pretty preoccupied with the problem of knife crime among young people. It really bothers me and I worry about how we tackle it. My concern, of course, is that the police should be working harder to solve that problem and that we cannot expect the Bill to solve all social problems. There is a danger of shifting the focus away from law enforcement on a real-world problem, as though the way you buy the knife were the issue. I am not convinced that that helps us.

I wanted to reflect on the kind of dilemmas I am having around this in relation to the story of Mizzy that is doing the rounds. He is the 18-year-old who has been posting his prank videos on TikTok and has caused quite a stir. People have seen him wandering into strangers’ homes uninvited, asking random people in the street if they want to die, running off with an elderly lady’s dog and making fun of Orthodox Jews—generally speaking, this 18-year-old is obnoxious. His TikTok videos have gone viral; everybody is discussing them.

This cruelty for kicks genre of filming yourself, showing your face full to the camera and so on, is certainly abhorrent but, as with the discussion about knife crime, I have noticed that some people outside this House are attempting to blame the technology for the problem, saying that the videos should have been removed earlier and that it is TikTok’s fault that we have this anti-social behaviour, whereas I think it is a much deeper, broader social problem to do with the erosion of adult authority and the reluctance of grown-ups to intervene clearly when people are behaving badly—that is my thesis. It is undoubtedly a police matter. The police seem to have taken ages to locate Mizzy. They eventually got him and charged him with very low offences, so he was on TV being interviewed the other evening, laughing at how weak the law was. Under the laws he was laughing at, he could freely walk into somebody’s house or be obnoxious and get away with it. He said, “We can do what we want”. That mockery throws up problems, but I do not necessarily think that the Bill is the way to solve it.

That leads me to my concerns about Amendment 268AA, because Mizzy was quoted in the Independent newspaper as saying:

“I’m a Black male doing these things and that’s why there’s such an uproar”.

I then went on a social media thread in which any criticism of Mizzy’s behaviour was described as racist harassment. That shows the complexity of what is being called for in Amendment 268AA, which wants platforms to take additional steps

“to combat incidents of online racially aggravated harassment”.

My worry is that we end up with not only Mizzy’s TikTok videos being removed but his critics being removed for racially harassing him, so we have to be very careful here.

Amendment 268AA goes further, because it wants tech companies to push for prosecution. I really think it is a dangerous step to encourage private companies to get tangled up in deciding what is criminal and so on. The noble Lord, Lord Allan, has exactly described my concerns, so I will not repeat them. Maybe I can probe this probing amendment. It also broadens the issue to all forms of harassment.

By the way, the amendment’s explanatory statement mentions the appalling racist abuse aimed at footballers and public figures, but one of the fascinating things was that when we number-crunched and went granular, we found that the majority of that racist abuse seemed to have been generated by bots, which takes us to the position of the noble Lord, Lord Knight, earlier: who would you prosecute in that instance? Bots not even based in the UK were generating what was assumed to be an outbreak of racist abuse among football fans in the UK, but the numbers did not equate to that. There were some people being racist and vile and some things that were generated in these bot farms.

To go back to the amendment, it goes on to broaden the issue out to

“other forms of harassment and threatening or abusive behaviour”.

Again, this is much more complicated in today’s climate, because those kinds of accusation can be deployed for bad faith reasons, particularly against public figures.

We have an example close to this House. I hope that Members have been following and will show solidarity over what has been happening to the noble Baroness, Lady Falkner of Margravine, who is chair of the Equality and Human Rights Commission and tasked with upholding the equality law but is at the centre of a vicious internal row after her officials filed a dossier of complaints about her. They have alleged that she is guilty of harassment. A KC is being brought in, there are 40 complaints and the whole thing is costing a fortune for both taxpayers and the noble Baroness herself.

It coincided with the noble Baroness, Lady Falkner, advising Ministers to update the definition of sex in the Equality Act 2010 to make clear that it refers to biological sex and producing official advice clarifying that trans women can be lawfully excluded from female-only spaces. We know how toxic that whole debate is.

Many of us feel that a lot of the accusations against the noble Baroness are ideologically and politically motivated vexatious complaints. I am distressed to read newspaper reports that say that she has been close to tears and has asked why anyone would go into public service. All this is for the crime of being a regulator upholding and clarifying the law. I hope it does not happen to the person who ends up regulating Ofcom—ending up close to tears as he stands accused of harassment, abusive behaviour and so on.

The point is that she is the one being accused of harassment. I have seen the vile abuse that she has received online. It is completely defamatory, vicious abuse and yet somehow it ends up being that, because she does not provide psychological safety at work and because of her views, she is accused of harassment and is the one in the firing line. I do not want us to introduce that kind of complexity—this is what I have been worried about throughout—into what is banned, removed or sent to the police as examples of harassment or hate crime.

I know that is not the intention of these amendments; it is the unintended consequences that I dread.

Baroness Bennett of Manor Castle (GP)

My Lords, I will speak chiefly to Amendment 262 in my name, although in speaking after the noble Baroness, Lady Fox, who suggested that the grown-ups should control anti-social behaviour by young people online, I note that there is a great deal of anti-social behaviour online from people of all ages. This is relevant to my Amendment 262.

It is a very simple amendment and would require the Secretary of State to consult with young people by means of an advisory board consisting of people aged 25 and under when reviewing the effectiveness and proportionality of this legislation. This amendment is a practical delivery of some of the discussion we had earlier in this Committee when we were talking about including the Convention on the Rights of the Child in the Bill. There is a commonly repeated phrase, “Nothing about us without us”. It was popularised by disability activists in the 1990s, although in doing a little research for this I found that it originates in Latin in Poland in the 15th century. So it is an idea that has been around for a long while and is seen as a democratic standard. It is perhaps a variation of the old “No taxation without representation”.

This suggestion of an advisory board for the Secretary of State is because we know from the discussion earlier on the children’s rights amendments that globally one in three people online is a child under the age of 18. This brings me to the composition of your Lordships’ House. Most of us are a very long way removed in experience and age—some of us further than others. The people in this Committee thinking about a 12-year-old online now are parents, grandparents and great-grandparents. I venture to say that it is very likely that the Secretary of State is at least a generation older than many of the people who will be affected by the Bill’s provisions.

This reflects something that I also did on the Health and Care Bill. To introduce an advisory panel of young people reporting directly to the Secretary of State would ensure a direct voice for legislation that particularly affects young people. We know that under-18s across the UK do not have any role in elections to the other place, although 16 and 17 year-olds have a role in other elections in Wales and Scotland now. This is really a simple, clear, democratic step. I suspect the Minister might be inclined to say, “We are going to talk to charities and adults who represent children”. I suggest that what we really need here is a direct voice being fed in.

I want to reflect on a recent comment piece in the Guardian that made a very interesting argument: that there cannot be, now or in the future, any such thing as a digital native. Think of the experience of someone 15 or 20 years ago; yes, they already had the internet but it was a very different beast to what we have now. If we refer back to some of the earlier groups, we were starting to ask what an internet with widespread so-called generative artificial intelligence would look like. That is an internet which is very different from even the one that a 20 year-old is experiencing now.

It is absolutely crucial that we have that direct voice coming in from young people with experience of what it is like. They are an expert on what it is like to be a 12 year-old, a 15 year-old or a 20 year-old now, in a way that no one else can possibly be, so that is my amendment.

--- Later in debate ---
Baroness Fox of Buckley (Non-Afl)

I shall speak very briefly at this hour, just to clarify as much as anything. It seems important to me that there is a distinction between small platforms and large platforms, but my view has never been that if you are small, you have no potential harms, any more than that if you are large, you are harmful. Exceptions should be treated as exceptions, not made the rule. We have to be careful of an arbitrary categorisation of “small”. We have to decide who is going to be treated as though they are a large category 1 platform. I keep saying, but stress again: do not assume that everybody agrees on what a significant risk of harm or hateful content is. It is such highly disputed political territory outside the online world and this House that we must recognise that it is not so straightforward.

I am very sympathetic, by the way, to the speeches made about eating disorders and other issues. I see that very clearly, but other categories of speech are disputed and argued over—I have given loads of examples. We end up where it is assumed that the manifestoes of mass shooters appear on these sites, but if you read any of those manifestoes of mass shooters, they will often be quoting from mainstream journalists in mainstream newspapers, the Bible and a whole range of things. Just because they are on 4Chan, or wherever, is not necessarily the problem; it is much more complicated.

I ask the Minister, and the proposers of the amendment, to some extent: would it not be straightforwardly the case that if there is a worry about a particular small platform, it might be treated differently—

Lord Allan of Hallam (LD)

I just want to react to the manifestos of mass shooters. While source material such as the Bible is not in scope, I think the manifesto of a shooter is clear incitement to terrorism, and any platform that is comfortable carrying that is problematic in my view, and I hope it would be in the noble Baroness’s view as well.

Baroness Fox of Buckley (Non-Afl)

I was suggesting that we have a bigger problem than it appearing on a small site. It quotes from mainstream media, but it ends up being broadly disseminated and not because it is on a small site. I am not advocating that we all go round carrying the manifestos of mass shooters and legitimising them. I was more making the point that it can be complicated. Would not the solution be that you can make appeals that a small site is treated differently? That is the way we deal with harmful material in general and the way we have dealt with, for example, RT as press without compromising on press freedom. That is the kind of point I am trying to make.

I understand lots of concerns but I do not want us to get into a situation where we destroy the potential of all smaller platforms—many of them doing huge amounts of social good, part of civil society and all the rest of it—by treating them as though they are large platforms. They just will not have the resources to survive, that is all my point is.

Lord Clement-Jones (LD)

My Lords, I am going to be extremely brief given the extremely compelling way that these amendments have been introduced by the noble Baroness, Lady Morgan, and the noble Lord, Lord Griffiths, and contributed to by the noble Baroness, Lady Bull. I thank her for her comments about my noble friend Lady Parminter. I am sure she would have wanted to be here and would have made a very valuable contribution as she did the other day on exactly this subject.

As the noble Baroness, Lady Fox, has illustrated, we have a very different view of risk across this Committee and we are back, in a sense, into that whole area of risk. I just wanted to say that I think we are again being brought back to the very wise words of the Joint Committee. It may sound like special pleading. We keep coming back to this, and the noble Lord, Lord Stevenson, and I are the last people standing on a Thursday afternoon.

We took a lot of evidence in this particular area. We took the trouble to go to Brussels and had a very useful discussion with the Centre on Regulation in Europe and Dr Sally Broughton Micova. We heard a lot about interconnectedness between some of these smaller services and the impact in terms of amplification across other social media sites.

We heard in the UK from some of the larger services about their concerns about the activities of smaller services. You might say “They would say that, wouldn’t they?” but they were pretty convincing. We heard from HOPE not Hate, the Antisemitism Policy Trust and Stonewall, stressing the role of alternative services.

Of course, we know that these amendments today—some of them sponsored by the Mental Health Foundation, as the noble Lord, Lord Griffiths, said, and Samaritans—have a very important provenance. They recognise that these are big problems. I hope that the Minister will think strongly about this. The injunction from the noble Lord, Lord Allan, to consider how all this is going to work in practice is very important. I very much hope that, when we come to consider how this works in practical terms, the Minister will think very seriously about the way in which risk is to the fore—the more nuanced approach that we suggested—and the whole way that profiling by Ofcom will apply. I think that is going to be extremely important as well. I do not think we have yet got to the right place in how the Bill deals with these risky sites. I very much hope that the Minister will consider this in the quite long period between now and when we next get together.

Online Safety Bill

This is an important set of amendments which we are coming to quite late in the day. They touch on some issues that are being dealt with elsewhere, and I hope this is one example where we will feel comfortable learning from the EU, which is a little bit ahead in terms of trying to deal with some of these questions, working within a framework which is still, from a data protection law point of view at least, a pretty consistent framework between us and them.
Baroness Fox of Buckley (Non-Afl)

My Lords, Amendments 233 and 234 from the noble Lord, Lord Knight of Weymouth, were well motivated, so I will be brief. I just have a couple of queries.

First, we need to consider what the criteria are for who is considered worthy of the privileged status of receiving Ofcom approval as a researcher. We are discussing researchers as though they are totally reliable and trustworthy. We might even think that if they are academic researchers, they are bound to be. However, there was an interesting example earlier this week of confirmation bias leading to mistakes when King’s College had to issue a correction to its survey data that was used in the BBC’s “Marianna in Conspiracyland”. King’s College admitted that it had wildly overestimated the number of those reading the conspiracy newspaper The Light, and wildly overestimated the number of those attending what it dubbed conspiracy demonstrations. By the way, BBC Verify has so far failed to verify the mistake it repeated. I give this example not as a glib point but because we cannot just say that, because researchers are accredited elsewhere, they should simply be allowed in. I also think that the requirement to give the researchers

“all such assistance as they may reasonably require to carry out their research”

sounds like a potentially very time-consuming and expensive effort.

The noble Lord, Lord Allan of Hallam, raised points around “can’t” or “won’t”, and whether this means researchers “must” or “should”, and who decides whether it is ethical that they “should” in all instances. There are ethical questions here that have been raised. Questions of privacy are not trivial. Studying individuals as specimens of “badthink” or “wrongthink” might appear in this Committee to be in the public interest but without the consent of people it can be quite damaging. We have to decide which questions fulfil the public interest so sufficiently that consent could be overridden in that way.

I do not think this is a slam-dunk, though it looks like a sensible point. I do not doubt that all of us want more research, and good research, and data we can use in arguments, whatever side we are on, but it does not mean we should just nod something through without at least pausing.

Lord Bethell (Con)

My Lords, I declare an interest as a trustee of the International Centre for the Study of Radicalisation at the War Studies department of King’s College London. That is somewhere that conducts research using data of the kind addressed in this group, so I have a particular interest in it.

We know from the kind of debates that the noble Lord, Lord Knight, referred to that it is widely accepted that independent researchers benefit hugely from access to relevant information from service providers to research online safety matters. That is why my Amendment 234, supported by the noble Lords, Lord Clement-Jones and Lord Knight, aims to introduce an unavoidable mandatory duty for regulated platforms to give access to that data to approved researchers.

As the noble Lord, Lord Knight, said, there are three ways in which this would be done. First, the timeframe for Ofcom’s report would be accelerated; secondly, proposed new Clause 147 would allow Ofcom to appoint the researchers; and, thirdly, proposed new Clause 148 would require Ofcom to write a code of practice on data access, setting up the fundamental principles for data access—a code which, by the way, should answer some of the concerns quite reasonably voiced by the noble Baroness, Lady Fox.

The internet is absolutely the most influential environment in our society today, but it is a complete black box, and we have practically no idea what is going on in some of the most important parts of it. That has a terrible impact on our ability to devise sensible policies and mitigate harm. Instead, we have a situation where the internet companies decide who accesses data, how much of it and for what purposes.

In answer to his point, I can tell the noble Lord, Lord Allan, who they give the data to—they give it to advertisers. I do not know if anyone has bought advertising on the internet, but it is quite a chilling experience. You can find out a hell of a lot about quite small groups of people if you are prepared to pay for the privilege of trying to reach them with one of your adverts: you can find out what they are doing in their bedrooms, what their mode of transport is to get to work, how old they are, how many children they have and so on. There is almost no limit to what you can find out about people if you are an advertiser and you are prepared to pay.

In fact, only the companies themselves can see the full picture of what goes on on the internet. That puts society and government at a massive disadvantage and makes policy-making virtually impossible. Noble Lords should be in no doubt that these companies deliberately withhold valuable information to protect their commercial interests. They obfuscate and confuse policymakers, and they protect their reputations from criticism about the harms they cause by withholding data. One notable outcome of that strategy is that it has taken years for us to be here today debating the Online Safety Bill, precisely because policy-making around the internet has been so difficult and challenging.

A few years ago, we were making some progress on this issue. I used to work with the Institute for Strategic Dialogue using CrowdTangle, a Facebook product. It made a big impact. We were working on a project on extremism, and having access to CrowdTangle revolutionised our understanding of how the networks of extremists that were emerging in British politics were coming together. However, since then, platforms have gone backwards a long way and narrowed their data-sharing. The noble Lord, Lord Knight, mentioned that CrowdTangle has essentially been closed down, and Twitter has basically stopped providing its free API for researchers—it charges for some access but even that is quite heavily restricted. These retrograde steps have severely hampered our ability to gather the most basic data from otherwise respectable and generally law-abiding companies. It has left us totally blind to what is happening on the rest of the internet—the bit beyond the nice bit; the Wild West bit.

Civil society plays a critical role in identifying harmful content and bad behaviour. Organisations such as the NSPCC, the CCDH, the ISD—which I mentioned—the Antisemitism Policy Trust and King’s College London, with which I have a connection, prove that their work can make a really big difference.

It is not as though other parts of our economy or society have the same approach. In fact, in most parts of our world there is a mixture of public, regulator and expert access to what is going on. Retailers, for instance, publish what is sold in our shops. Mobile phones, hospitals, banks, financial markets, the broadcast media—they all give access, both to the public and to their regulators, to a huge amount of data about what is going on. Once again, internet companies are claiming exceptional treatment—that has been a theme of debates on the Online Safety Bill—as if what happens online should, for some reason, be different from what happens in the rest of the world. That attitude is damaging the interests of our country, and it needs to be reversed. Does anyone think that the FSA, the Bank of England or the MHRA would accept this state of affairs in their regulated market? They absolutely would not.

Greater access to and availability of data and information about systems and processes would hugely improve our understanding of the online environment and thereby protect the innovation, progress and prosperity of the sector. We should not have to wait for Ofcom to be able to identify new issues and then appoint experts to look at them closely; there should be a broader effort to be in touch with what is going on with the internet. It is the nature of regulation that Ofcom will heavily rely on researchers and civil society to help enforce the Online Safety Bill, but this can be achieved only if researchers have sufficient access to data.

As the noble Lord, Lord Allan, pointed out, legislators elsewhere are making progress. The EU’s Digital Services Act gives a broad range of researchers access to data, including civil society and non-profit organisations dedicated to public interest research. The DSA sets out a framework for vetting and access procedures in detail, as the noble Baroness, Lady Fox, rightly pointed out, creating an explicit role for new independent supervisory authorities and digital services co-ordinators to manage that process.

Under Clause 146, Ofcom must produce a report exploring such access within two years of that section of the Bill coming into effect. That is too long. There is no obligation on the part of the regulator or service providers to take this further. No arguments have been put forward for this extended timeframe or relative uncertainty. In contrast, the arguments to speed up the process are extremely persuasive, and I invite my noble friend the Minister to address those.

--- Later in debate ---
Baroness Finlay of Llandaff (CB)

My Lords, I will address my remarks to government Amendment 268AZA and its consequential amendments. I rather hope that we will get some reassurance from the Minister on these amendments, about which I wrote to him just before the debate. I hope that that was helpful; it was meant to be constructive. I also had a helpful discussion with the noble Lord, Lord Allan.

As has already been said, the real question relates to the threshold and the point at which this measure will kick in. I am glad that the Government have recognised the importance of the dangers of encouraging or assisting serious self-harm. I am also grateful for the way in which they have defined it in the amendment, relating it to grievous bodily harm and severe injury. The amendment says that this also

“includes successive acts of self-harm which cumulatively reach that threshold”.

That is important; it means, rather than just one act, a series of them.

However, I have a question about subsection (10), which states that:

“A provider of an internet service by means of which a communication is sent, transmitted or published is not to be regarded as a person who sends, transmits or publishes it”.

We know from bereaved parents that algorithms have been set up which relay this ghastly, horrible and inciteful material that encourages and instructs. That is completely different from those organisations that are trying to provide support.

I am grateful to Samaritans for all its help with my Private Member’s Bill, and for the briefing that it provided in relation to this amendment. As it points out, over 5,500 people in England and Wales took their own lives in 2021 and self-harm is

“a strong risk factor for future suicide”.

Interestingly, two-thirds of those taking part in a Samaritans research project said that

“online forums and advice were helpful to them”.

It is important that there is clarity around providing support and not encouraging and goading people into activity which makes their self-harming worse and drags them down to eventually ending their own lives. Three-quarters of people who took part in that Samaritans research said that they had

“harmed themselves more severely after viewing self-harm content online”.

It is difficult to know exactly where this offence sits and whether it is sufficiently narrowly drawn.

I am grateful to the Minister for arranging for me to meet the Bill team to discuss this amendment. When I asked how it was going to work, I was somewhat concerned because, as far as I understand it, the mechanism is based on the Suicide Act, as amended, which talks about the offence of encouraging or assisting suicide. The problem as I see it is that, as far as I am aware, there has not been a string of prosecutions following the suicide of many young people. We have met their families and they have been absolutely clear about how their dead child or sibling—whether a child or a young adult—was goaded, pushed and prompted. I have recently had experience, outside this House, of a similar situation, which fortunately did not result in a death.

The noble Lord, Lord Allan, has already addressed some of the issues around this, and I would not want the amendment not to be there because we must address this problem. However, if we are to have an offence here, with a threshold that the Government have tried to define, we must understand why, if assisting and encouraging suicide on the internet is already a criminal offence, nothing has happened and there have been no prosecutions.

Why is subsection (10) in there? It seems to negate the whole problem of forwarding on through dangerous algorithms content which is harmful. We know that a lot of the people who are mounting this are not in the UK, and therefore will be difficult to catch. It is the onward forwarding through algorithms that increases the volume of messaging to the vulnerable person and drives them further into the downward spiral that they find themselves in—which is perhaps why they originally went to the internet.

I look forward to hearing the Government’s response, and to hearing how this will work.

Baroness Fox of Buckley (Non-Afl)

My Lords, this group relates to communications offences. I will speak in support of Amendment 265, tabled by the noble Lord, Lord Moylan, and in support of his opposition to Clause 160 standing part of the Bill. I also have concerns about Amendments 267AA and 267AB, in the name of the noble Baroness, Lady Kennedy. Having heard her explanation, perhaps she can come back and give clarification regarding some of my concerns.

On Clause 160 and the false communications offence, unlike the noble Lord, Lord Moylan, I want to focus on psychological harm and the challenge this poses for freedom of expression. I know we have debated it before but, in the context of the criminal law, it matters in a different way. It is worth us dwelling on at least some aspects of this.

The offence refers to what is described as causing

“non-trivial psychological or physical harm to a likely audience”.

As I understand it—maybe I want some clarity here—it is not necessary for the person sending the message to have intended to cause harm, yet there is a maximum sentence of 51 weeks in prison, a fine, or both. We need to have the context of a huge cultural shift when we consider the nature of the harm we are talking about.

J.S. Mill’s harm principle has now been expanded, as previously discussed, to include traumatic harm caused by words. Speakers are regularly no-platformed for ideas that we are told cause psychological harm, at universities and more broadly as part of the whole cancel culture discussion. Over the last decade, harm and safety have come to refer no longer just to physical safety; the physical and the psychological have been conflated. Historically, we understood physical threats and violence as distinct from speech, however aggressive or incendiary that speech was; we did not say that speech was the same as or interchangeable with bullets or knives or violence—and now we do. I want us to at least pause here.

What counts as psychological harm is not a settled question. The worry is that we cannot objectively ascertain what psychological harm has occurred. This will inevitably lead to endless interpretation controversies and/or subjective claims-making, at least some of which could be in bad faith. There is no median with respect to how humans view or experience controversial content. There are wildly divergent sensibilities about what is psychologically harmful. The social media lawyer Graham Smith made a really good point when he said that speech is not a physical risk,

“a tripping hazard … a projecting nail … that will foreseeably cause injury … Speech is nuanced, subjectively perceived and capable of being reacted to in as many different ways as there are people.”

That is true.

We have seen an example of the potential disputes over what creates psychological harm in a case in the public realm over the past week. The former Culture Secretary, Nadine Dorries, who indeed oversaw much of this Bill in the other place, had her bullying claims against the SNP’s John Nicolson MP overturned by the standards watchdog. Her complaints had previously been upheld by the standards commissioner. John Nicolson tweeted, liked and retweeted offensive and disparaging material about Ms Dorries 168 times over 24 hours—which, as they say, is a bit OTT. He “liked” tweets describing Ms Dorries as grotesque, a “vacuous goon” and much worse. It was no doubt very unpleasant for her and certainly a personalised pile-on—the kind of thing the noble Baroness, Lady Kennedy, just talked about—and Ms Dorries would say it was psychologically harmful. But new evidence led to the bullying claim being turned down. What was this evidence? Ms Dorries herself was a frequent and aggressive tweeter. So, somebody is the recipient of something they say causes them psychological harm, and it has now been said that it does not matter because they are the kind of person who causes psychological harm to other people. My concern about turning this into a criminal offence is that the courts will be full of those kinds of arguments, which I do not think we want.

Online Safety Bill

Women start changing the opportunities in their lives and stop doing things that they might want to do: they stop deciding to be Members of Parliament or to stand for election in any capacity, or, if they are lawyers, to take cases that will be inflammatory. They start inhibiting and limiting their own potential because of this kind of threat coming from men who resent the idea that women should be aspiring to hold positions and be equal to men. Some of it is of a very unpleasant and nasty nature, and law has its place in sending out clear messages of what is acceptable and unacceptable. It is then up to us—in schools, other educational settings and everywhere else—to spread the word among our young men and young women about what is acceptable and what they must not accept, and about the right way to behave decently towards other human beings.
Baroness Fox of Buckley (Non-Afl)

My Lords, first, I welcome the amendment from the noble Lord, Lord Allan, and his motivation, because I am concerned that, throughout the Bill, the wrong targets are being caught up. I was grateful to hear his recognition that people who talk about their problems with self-harm could end up being targeted, which nobody would ever intend. These things need to be taken seriously.

In that sense, I was slightly concerned about the motivation of the noble Baroness, Lady Burt of Solihull, in the “reckless” amendment. The argument was that the recklessness standard is easier to prove. I am always worried about things that make it easier to prosecute someone, rather than there being a just reason for that prosecution. As we know, those involved in sending these images are often immature and very foolish young men. I am concerned about lowering the threshold at which we criminalise them—potentially destroying their lives, by the way, because if you have a criminal record it is not good—even though I in no way tolerate what they are doing and it is obviously important that we take that on.

There is a danger that this law will become a mechanism through which people try to resolve a whole range of social problems—which brings me on to responding to the speech just made by the noble Baroness, Lady Kennedy of The Shaws. I continue to be concerned about the question of trying to criminalise indirect threats. The point about somebody who sends a direct threat is that we can at least see the connection between that direct threat and the possibility of action. It is the same sort of thing that we have historically considered in relation to incitement. I understand that, where your physical being is threatened by words, a practical physical act can follow, and that is to be taken very seriously. The problem I have is with the indirect threat from somebody who says, for example, “That smile should be taken off your face. It can be arranged”, or other indirect but incredibly unpleasant comments. There is clearly no link between that and a specific action. It might use violent language but it is indirect: “It could be arranged”, or “I wish it would happen”.

Anyone on social media—I am sure your Lordships all are—will know that I follow very carefully what people from different political parties say about each other. I do not know if you have ever followed the kind of things that are said about the Government and their Ministers, but the threats are not indirect and are often named. In that instance, it is nothing to do with women, but it is pretty violent and vile. By the way, I have also followed what is said about the Opposition Benches, and that can be pretty violent and vile, including language that implies that they wish those people were the subject of quite intense violence—without going into detail. That happens, and I do not approve of it—obviously. I also do not think that pile-ons are pleasant to be on the receiving end of, and I understand how they happen. However, if we criminalise pile-ons on social media, we are openly imposing censorship.

What is worse in my mind is that we are allowing the conflation of words and actions, where what people say or think is the same as acting on it, as the criminal law would see it. We have seen a very dangerous trend recently, which is particularly popular in the endless arguments and disputes over identity politics, where people will say that speech is violence. This has happened to a number of gender-critical feminists, in this instance women, who have gone in good faith to speak at universities, having been invited. They have been told that their speech was indistinguishable from violence and that it made students at the university feel under threat and unsafe and that it was the equivalent of being attacked. But guess what? Once you remove that distinction, the response to that speech can be to use violence, because you cannot tell the difference between them. That has happened around a number of university actions, where speakers and their supporters were physically assaulted by people who said that they were using self-defence against speech that was violent. I get nervous that this is a slippery slope, and we certainly should not go anywhere near it in legislation.

Finally, I agree that we should tackle the culture of people piling on and using this kind of language, but it is a cultural and social question. What we require is moral leadership and courage in the face of it—calling it out, arguing against it and so on. It is wrong to use the law to send messages; it is an abdication of moral leadership and a cop-out, let alone dangerous in what is criminalised. I urge your Lordships to reject those amendments.

Baroness Morgan of Cotes (Con)

My Lords, I will speak briefly to Amendments 5C and 7A in this group. I welcome the Government’s moves to criminalise cyberflashing. It is something that many have campaigned for in both Houses and outside for many years. I will not repeat the issues so nobly introduced by the noble Baroness, Lady Burt, and I say yet again that I suspect that the noble Baroness, Lady Featherstone, is watching, frustrated that she is still not able to take part in these proceedings.

It is worth making the point that, if actions are deemed to be serious enough to require criminalisation and for people potentially to be prosecuted for them, I very much hope that my noble friend the Minister will be able to say in his remarks that this whole area of the law will be kept under review. There is no doubt that women and girls’ faith in the criminal justice system, both law enforcement and the Crown Prosecution Service, is already very low. If we trumpet the fact that this offence has been introduced, and then there are no prosecutions because the hurdles have not been cleared, that is even worse than not introducing the offence in the first place. So I hope very much that this will be kept under review, and no doubt there will be opportunities to return to it in the future.

I do not want to get into the broader debate that we have just heard, because we could be here for a very long time, but I would just say to the noble Baronesses, Lady Kennedy and Lady Fox, that we will debate this in future days on Report and there will be specific protection and mention of women and girls on the face of the Bill—assuming, of course, that Amendment 152 is approved by this House. The guidance might not use the words that have been talked about, but the point is that that is the place to have the debate—led by the regulator with appropriate public consultation—about the gendered nature of abuse that the noble Baroness, Lady Kennedy, has so eloquently set out. I hope that will also be a big step forward in these matters.

I look forward to hearing from the Minister about how this area of law will be kept under review.

Online Safety Bill

I am bound to say that, although the noble Lord, Lord Allan, might prefer his amendment and the noble Lord, Lord Moylan, might prefer his, I prefer Amendment 245 in the name of the noble Baroness, Lady Morgan, which says that all services should be judged according to risk. This would stop this endless game of taking things out and putting things in, in case they behave badly, or taking things out for companies that we recognise now although we do not know what the companies of the future will be. We all have to remember that, even when we had the pre-legislative committee, we were not talking about large language models and when we started this Bill we were not talking about TikTok. Making laws for individual services is not a grand idea, but saying that it is not the size but the risk that should determine the category of a regulated service, and therefore its duties, seems a comprehensive way of getting to the same place.
Baroness Fox of Buckley (Non-Afl)

My Lords, there is a danger of unanimity breaking out. The noble Lord, Lord Moylan, and I are not always on the same page as others, but this is just straightforward. I hope the Government listen to the fact that, even though we might be coming at this in different ways, there is concern on all sides.

I also note that this is a shift from what happened in Committee, when I tabled an amendment to try to pose the same dilemmas by talking about the size of organisations. Many a noble Lord said that size did not matter and that that did not work—but it was trying to get at the same thing. I do feel rather guilty that, to move the core philosophy forward, I have dumped the small and micro start-ups and SMEs that I also wanted to protect from overregulation—that is what has happened in this amendment—but now it seems an absolute no-brainer that we should find a way to exempt public interest organisations. This is where I would go slightly further. We should have a general exemption for public interest organisations, but with the ability for Ofcom to come down hard if they look as though they have moved from being low risk to being a threat.

As the noble Lord, Lord Moylan, noted, public interest exemptions happen throughout the world. Although I do not want to waste time reading things out, it is important to look at the wording of Amendment 29. As it says, we are talking about:

“historical, academic, artistic, educational, encyclopaedic, journalistic, or statistical content”.

We are talking about the kind of online communities that benefit the public interest. We are talking about charities, user-curated scientific publications and encyclopaedias. These are surely not what this Bill was designed to thwart. However, there is a serious danger that, if we put on them the number of regulatory demands in the Bill, they will not survive. That is not what the Government intend but it is what will happen.

Dealing with the Bill’s complexity will take much time and money for organisations that do not have it. I run a small free-speech organisation called the Academy of Ideas and declare my interest in it. I am also on the board of the Free Speech Union. When you have to spend so much time on regulatory issues, it costs money and you will go under. That is important. This could also waste Ofcom’s time, as the noble Lord, Lord Allan of Hallam, has explained. It would prevent Ofcom concentrating on the genuinely nasty material that we want it to tackle; it would instead be wasting its time dealing with organisations that are not the problem.

I should mention a couple of other things. It is important to note that there is sometimes controversy over the definition of a public interest organisation. It is not beyond our ken to sort it out. I Googled it—it is still allowed—and came up with a Wikipedia page that still exists. That is always good. If one looks, the term “public interest” is used across a range of laws. The Government know what kind of organisations they are talking about. The term has not just been made up for the purpose of an exemption.

It is also worth noting that no one is suggesting that public interest projects and organisations should not be regulated at all; this is about an exemption from this particular regulation. They still have to deal with UK defamation, data protection, charity, counterterrorism and pornography laws, and the common law. Those organisations’ missions and founding articles require that they do some good in the world. That is what they are all about. The Government should take this matter seriously.

Finally, on the rescue clauses, it is important to note the reference to the Gambling Act: the amendment states that, if there is a problem, Ofcom should intervene. That was taken from what happens under the Gambling Act, which allows UK authorities to strip one or more gambling businesses of their licensing exemptions when they step out of line. No one is saying that these exemptions should never be looked at, but these organisations obviously should not be in the scope of the Bill. I hope that when we get to the next stage, the Government will, on this matter at least, accept the amendment.

--- Later in debate ---
Baroness Fox of Buckley (Non-Afl)

To follow on from that, we are talking about the obligation to bring exemptions to Parliament. Well, we are in Parliament and we are bringing exemptions. The noble Lord is recommending that we bring very specific exemptions while those that the noble Lord, Lord Moylan, and I have been recommending may be rather broad—but I thought we were bringing exemptions to Parliament. I am not being facetious. The point I am making is, “Why can’t we do it now?” We are here now, doing this. We are saying, as Parliament, “Look at these exemptions”. Can the Minister not look at them now instead of saying that we will look at them some other time?

Baroness Kidron (CB)

I may as well intervene now as well, so that the Minister can get a good run at this. I too am concerned at the answer that has been given. I can see the headline now, “Online Safety Bill Age-Gates Wikipedia”. I cannot see how it does not, by virtue of some of the material that can be found on Wikipedia. We are trying to say that there are some services that are inherently in a child’s best interests—or that are in their best interests according to their evolving capacity, if we had been allowed to put children’s rights into the Bill. I am concerned that that is the outcome of the answer to the noble Lord, Lord Allan.

--- Later in debate ---
On the question of religion, if I, A, treat B adversely because they adhere to a particular religion, I fall foul of the Equality Act. But this appears to cover religion as a phenomenon. So, if I say that I am going to treat somebody badly because they are Jewish, of course I fall foul of the Equality Act. But this appears to say that if I say something adverse and abusive about the Jewish religion without reference to any particular individual, I will fall foul of this clause. I know that sounds like a minor point of detail, but it is actually very significant. I want to hear my noble friend explain in detail how this is going to operate. If I say something adverse or abusive about gender reassignment and disability, that would not necessarily fall foul of the Equality Act, but it would fall foul of the Bill, as far as I can see. Are we, in effect, creating a new blasphemy offence here in relation to religion, as opposed to what the Equality Act does? I would like my noble friend to be able to expand on this. I know this is a Committee stage-type query, but this is our first opportunity to ask these questions.
Baroness Fox of Buckley (Non-Afl)

My Lords, interestingly, because I have not discussed this at all with the noble Lord, Lord Moylan, I have some similar concerns to his. I have always wanted this to be a children’s online safety Bill. My concerns generally have been about threats to adults’ free speech and privacy and the threat to the UK as the home of technological innovation. I have been happy to keep shtum on things about protecting children, but I got quite a shock when I saw the series of government amendments.

I thought what most people in the public think: the Bill will tackle things such as suicide sites and pornography. We have heard some of that very grim description, and I have been completely convinced by people saying, “It’s the systems”. I get all that. But here we have a series of amendments all about content—endless amounts of content and highly politicised, contentious content at that—and an ever-expanding list of harms that we now have to deal with. That makes me very nervous.

On the misinformation and disinformation point, the Minister is right. Whether for children or adults, those terms have been weaponised. They are often used to delegitimise perfectly legitimate if contrary or minority views. I say to the noble Baroness, Lady Kidron, that the studies that say that youth are the fastest-growing far-right group are often misinformation themselves. I was recently reading a report about this phenomenon, and things such as being gender critical or opposing the small boats arriving were considered to be evidence of far-right views. That was not to do with youth, but at least you can see that this is quite a difficult area. I am sure that many people even in here would fit in the far right as defined by groups such as HOPE not hate, whose definition is so broad.

My main concerns are around the Minister’s Amendment 172. There is a problem: because it is about protected characteristics—or apes the protected characteristics of the Equality Act—we might get into difficulty. Can we at least recognise that, even in relation to the protected characteristics as noted in the Equality Act, there are raging political rows? I do not know how appropriate it is that the Minister has tabled an amendment dragging young people into this mire. Maya Forstater has just won a case: she was sacked after being accused of opposing somebody’s protected characteristics, but because her philosophical views are themselves a protected characteristic, she has won the case and a substantial amount of money.

I worry when I see this kind of list. It is not just inciting hatred—in any case, what that would mean is ambiguous. It refers to abuse based on race, religion, sex, sexual orientation, disability and so on. This is a minefield for the Government to have wandered into. Whether you like it or not, it will have a chilling effect on young people’s ability to debate and discuss. If you worry that some abuse might be aimed at religion, does that mean that you will not be able to discuss Charlie Hebdo? What if you wanted to show or share the Charlie Hebdo cartoons? Will that count? Some people would say that is abusive or inciteful. This is not where the Bill ought to be going. At the very least, it should not be going there at this late stage. Under race, it says that “nationality” is one of the indicators that we should be looking out for. Maybe it is because I live in Wales, but there is a fair amount of abuse aimed at the English. A lot of Scottish friends dole it out as well. Will this count for young people who do that? I do not get it.

My final question is in relation to proposed subsection (11). This is about protecting children, yet it lists a person who

“has the characteristic of gender reassignment if the person is proposing to undergo, is undergoing or has undergone a process (or part of a process) for the purpose of reassigning the person’s sex by changing physiological or other attributes of sex”.

Are the Government seriously accepting that children have not just proposed to reassign but have been reassigned? That is a breach of the law. That is not meant to be happening. Your Lordships will know how bad this is. Has the Department for Education seen this? As we speak, it is trying to untangle the freedom for people not to have to go along with people’s pronouns and so on.

This late in the day, on something as genuinely important as protecting children, I just want to know whether there is a serious danger that this has wandered into the most contentious areas of political life. I think it is very dangerous for a government amendment to affirm gender reassignment to and about children. It is genuinely irresponsible and goes against the guidance the Government are bringing out at the moment for us to avoid. Please can the Minister clarify what is happening with Amendment 172?

Baroness Harding of Winscombe (Con)

My Lords, I am not entirely sure how to begin, but I will try to make the points I was going to make. First, I would like to respond to a couple of the things said by the noble Baroness, Lady Fox. With the greatest respect, I worry that the noble Baroness has not read the beginning of the proposed new clause in Amendment 172, subsection (2), which talks about “Content which is abusive”, as opposed to content just about race, religion or the other protected characteristics.

One of the basic principles of the Bill is that we want to protect our children in the digital world in the same way that we protect them in the physical world. We do not let our children go to the cinema to watch content as listed in the primary priority and priority content lists in my noble friend the Minister’s amendments. We should not let them watch it in the digital world, yet the reality is that they do, day in and day out.

I thank my noble friend the Minister, not just for the amendments that he has tabled but for the countless hours that he and his team have devoted to discussing this with many of us. I have not put my name to the amendments either because I have some concerns but, given the way the debate has turned, I start by thanking him and expressing my broad support for having the harms in the Bill, the importance of which this debate has demonstrated. We do not want this legislation to take people by surprise. The important thing is that we are discussing some fundamental protections for the most vulnerable in our society, so I thank him for putting those harms in the Bill and for allowing us to have this debate. I fear that it will be a theme not just of today but of the next couple of days on Report.

I started with the positives; I would now like to bring some challenges as well. Amendments 171 and 172 set out priority content and primary priority content. It is clear that they do not cover the other elements of harm: contact harms, conduct harms and commercial harms. In fact, it is explicit that they do not cover the commercial harms, because proposed new subsection (4) in Amendment 237 explicitly says that no amendment can be made to the list of harms that is commercial. Why do we have a perfect crystal ball that means we think that no future commercial harms could be done to our children through user-to-user and search services, such that we are going to expressly make it impossible to add those harms to the Bill? It seems to me that we have completely ignored the commercial piece.

I move on to Amendment 174, which I have put my name to. I am absolutely aghast that the Government really think that age-inappropriate sexualised content does not count as priority content. We are not necessarily talking here about a savvy 17 year-old. We are talking about four, five and six year-olds who are doomscrolling on various social media platforms. That is the real world. To suggest that somehow the digital world is different from the old-fashioned cinema, and a place where we do not want to protect younger children from age-inappropriate sexualised material, just seems plain wrong. I really ask my noble friend the Minister to reconsider that element.

I am also depressed about the discussion that we had about misinformation. As I said in Committee several times, I have two teenage girls. The reality is that we are asking today’s teenagers to try to work out what is truth and what is misinformation. My younger daughter will regularly say, “Is this just something silly on the internet?” She does not use the term “misinformation”; she says, “Is that just unreal, Mum?” She cannot tell about what appears in her social media feeds because of the degree of misinformation. Failing to recognise that misinformation is a harm for young people who do not yet know how to validate sources, which was so much easier for us when we were growing up than it is for today’s generations, is a big glaring gap, even in the content element of the harms.

I support the principle behind these amendments, and I am pleased to see the content harms named. We will come back next week to the conduct and contact harms—the functionality—but I ask my noble friend the Minister to reconsider on both misinformation and inappropriate sexualised material, because we are making a huge mistake by failing to protect our children from them.

--- Later in debate ---
Lord Allan of Hallam (LD)

My Lords, Amendment 172 is exceptionally helpful in putting the priority harms for children on the face of the Bill. It is something that we asked for, as did the pre-legislative scrutiny committee, and it is good to see it there. I want to comment to make sure that we all—and people out there—have a shared understanding of what this means.

My understanding is that “primary priority” is, in effect, a red light—platforms must not expose children to that content if they are under 18—while “priority” is rather an amber light and, on further review, for some children it will be a red light and for other children it will be a green light, and they can see stuff in there. I am commenting partly having had the experience of explaining all this to my domestic focus group of teenagers and they said, “Really? Are you going to get rid of all this stuff for us?” I said, “No, actually, it is quite different”. It is important in our debate to do that because otherwise there is a risk that the Bill comes into disrepute. I look at something like showing the harms to fictional characters. If one has seen the “Twilight” movies, the werewolves do not come off too well, and “Lord of the Rings” is like an orc kill fest.

As regards the point made by the noble Baroness, Lady Harding, about going to the cinema, we allow older teenagers to go to the cinema and see that kind of thing. Post the Online Safety Bill, they will still be able to access it. When we look at something like fictional characters, the Bill is to deal with the harm that is there and is acknowledged regarding people pushing quite vile stuff, whereby characters have been taken out of fiction and a gory image has been created, twisted and pushed to a younger child. That is what we want online providers to do—to prevent an 11 year-old seeing that—not to stop a 16 year-old enjoying the slaughter of werewolves. We need to be clear that that is what we are doing with the priority harms; we are not going further than people think we are.

There are also some interesting challenges around humour and evolving trends. This area will be hard for platforms to deal with. I raised the issue of the Tide pod challenge in Committee. If noble Lords are not familiar, it is the idea that one eats the tablets, the detergent things, that one puts into washing machines. It happened some time ago. It was a real harm and that is reflected here in the “do not ingest” provisions. That makes sense but, again talking to my focus group, the Tide pod challenge has evolved and for older teenagers it is a joke about someone being stupid. It has become a meme. One could genuinely say that it is not the harmful thing that it was. Quite often one sees something on the internet that starts as harmful—because kids are eating Tide pods and getting sick—and then over time it becomes a humorous meme. At that point, it has ceased to be harmful. I read it as that filter always being applied. We are not saying, “Always remove every reference to Tide pods” but “At a time when there is evidence that it is causing harm, remove it”. If at a later stage it ceases to be harmful, it may well move into a category where platforms can permit it. It is a genuine concern.

To our freedom of expression colleagues, I say that we do not want mainstream platforms to be so repressive of ordinary banter by teenagers that they leave those regulated mainstream platforms because they cannot speak any more, even when the speech is not harmful, and go somewhere else that is unregulated—one of those platforms that took Ofcom’s letter, screwed it up and threw it in the bin. We do not want that to be an effect of the Bill. Implementation has to be very sensitive to common trends and, importantly, as I know the noble Baroness, Lady Kidron, agrees, has to treat 15, 16 and 17 year-olds very differently from 10, 11 or 12 year-olds. That will be hard.

The other area that jumped out was about encouraging harm through challenges and stunts. That immediately brought “Jackass” to mind, or the Welsh version, “Dirty Sanchez”, which I am sure is a show that everyone in the House watched avidly. It is available on TV. Talking about equality, one can go online and watch it. It is people doing ridiculous, dangerous things, is enjoyed by teenagers and is legal and acceptable. My working assumption has to be that we are expecting platforms to distinguish a new dangerous stunt such as the choking game—such things really exist—from a ridiculous “Jackass” or “Dirty Sanchez” stunt, which has existed for years and is accessible elsewhere.

The point that I am making in the round is that it is great to have these priority harms in the Bill, but it is going to be very difficult to implement them in a meaningful way whereby we are catching the genuinely harmful stuff but not overrestricting. But that is the task that we have set Ofcom and the platforms. The more that we can make it clear to people out there what we are expecting to happen, the better. We are not expecting a blanket ban on all ridiculous teenage humour or activity. We are expecting a nuanced response. That is really helpful as we go through the debate.

Baroness Fox of Buckley (Non-Afl)

I just have a question for the noble Lord. He has given an excellent exposé of the other things that I was worried about but, even when he talks about listing the harms, I wonder how helpful it is. Like him, I read them out to a focus group. Is it helpful to write these things, for example emojis, down? Will that not encourage the platforms to over-panic? That is my concern.

Lord Allan of Hallam (LD)

On the noble Baroness’s point, that is why I intervened in the debate: so that we are all clear that, for priority content, it is an amber light, not a red light. We are not saying, “Just remove all this stuff”; it would be a wrong response to the Bill to say, “It’s a fictional character being slaughtered so remove it”, because then we would have removed “Twilight”, “Watership Down” and whatever else. We are saying, “Think very carefully”. If it is one of those circumstances where this is causing harm—they exist; we cannot pretend that they do not—it should be removed. However, the default should not be to remove everything on this list; that is the point I am really trying to make.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

I certainly will think about it, but the difficulty is the scale of the material and the speed with which we want these assessments to be made and that light to be lit, in order to make sure that people are properly protected.

My noble friend Lord Moylan asked about differing international terminology. In order for companies to operate in the United Kingdom they must have an understanding of the United Kingdom, including the English-language terms used in our legislation. He made a point about the Equality Act 2010. While the Bill uses the same language, it does not extend the Equality Act to this part of the Bill. In particular, it does not create a new offence.

The noble Baroness, Lady Fox, also mentioned the Equality Act when she asked about the phraseology relating to gender reassignment. We included this wording to ensure that the language used in the Bill matches Section 7(1) of the Equality Act 2010 and that gender reassignment has the same meaning in the Bill as it does in that legislation. As has been said by other noble Lords—

Baroness Fox of Buckley (Non-Afl)

I clarify that what I said was aimed at protecting children. Somebody corrected me and asked, “Do you know that this says ‘abusive’?”—of course I do. What I suggested was that this is an area that is very contentious when we talk about introducing it to children. I am thinking about safeguarding children in this instance, not just copying and pasting a bit of an Act.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

As was pointed out by others in the debate, the key provision in Amendment 172 is subsection (2) of the proposed new clause, which relates to:

“Content which is abusive and which targets any of the following characteristics”.


It must both be abusive and target the listed characteristics. It does not preclude legitimate debate about those things, but if it were abusive on the basis of those characteristics—rather akin to the debate we had in the previous group and the points raised by the noble Baroness, Lady Kennedy of The Shaws, about people making oblique threats, rather than targeting a particular person, by saying, “People of your characteristic should be abused in the following way”—it would be captured.

Baroness Fox of Buckley (Non-Afl)

I will keep this short, because I know that everyone wants to get on. It would be said that it is abusive to misgender someone; in the context of what is going on in sixth forms and schools, I suggest that this is a problem. It has been suggested that showing pictures of the Prophet Muhammad in an RE lesson—these are real-life events that happen offline—is abusive. I am suggesting that it is not as simple as saying the word “abusive” a lot. This is a highly contentious and politicised arena, which I want to see an end to, but I think that this will exacerbate it, not help.

Online Safety Bill

Baroness Fox of Buckley Excerpts
I hope we keep all that in mind as we do this. We are going to build the user empowerment tools. It is a logical response once we decided to take legal but harmful out, but I think we should approach it with a note of caution and not assume it is necessarily going to be a fix everywhere and in the same way on all platforms. For some platforms, it might be quite meaningless; for others, potentially, it is something people will want to use.
Baroness Fox of Buckley (Non-Afl)

My Lords, I am happy to acknowledge and recognise what the Government did when they created user empowerment duties to replace legal but harmful. I think they were trying to counter the dangers of the over-paternalism and illiberalism of obliging providers to protect adult users from content that would allegedly cause them harm.

At least the new provisions brought into the Bill have a different philosophy completely. They enhance users’ freedom as individuals and allow them to apply voluntary content filters and freedom of choice, on the principle that adults can make decisions for themselves.

In case anyone panics, I am not making a philosophical speech. I am reminding the Government that that is what they said to us—to everybody—“We are getting rid of legal but harmful because we believe in this principle”. I am worried that some of the amendments seem to be trying to backtrack from that different basis of the Bill—and that more liberal philosophy—to go back to the old legal but harmful. I say to the noble Lord, Lord Allan of Hallam, that the cat is distinctly not dead.

The purpose of Amendment 56 is to try to ensure that providers also cannot thwart the purpose of Clause 12 and make it more censorious and paternalistic. I am not convinced that the Government needed to compromise on this as I think Amendment 60 just muddies the waters and fudges the important principle that the Government themselves originally established.

Amendment 56 says that the default must be no filtering at all. Then users have to make an active decision to switch on the filtering. The default is that you should be exposed to a full flow of ideas and, if you do not want that, you have to actively decide not to and say that you want a bowdlerised or sanitised version.

Amendment 56 takes it a bit further, in paragraph (b), and applies different levels of filtering in terms of content of democratic importance and journalistic content. In the Bill itself, the Government accept the exceptional nature of those categories of content, and this just allows users to be able to do the same and say, “No; I might want to filter some things out but bear in mind the exceptional importance of democratic and journalistic content”. I worry that the government amendments signal to users that certain ideas are dangerous and must be hidden. That is my big concern. In other words, they might be legal but they are harmful: that is what I think these amendments try to counter.

One of the things that worries me about the Bill is the danger of echo chambers. I know we are concentrating on harms, but I think echo chambers are harmful. I started today quite early at Blue Orchid at 55 Broadway with a big crowd of sixth formers involved in debating matters. I complimented Keir Starmer on his speech on the importance of oracy and encouraging young people to speak. I stressed to all the year 12 and year 13 young people that the important thing was that they spoke out but also that they listened to contrary opinions and got out of their safe spaces and echo chambers. They were debating very difficult topics such as commercial surrogacy, cancel culture and the risks of contact sports. I am saying all that to them and then I am thinking, “We have now got a piece of legislation that says you can filter out all the stuff you do not want to hear and create your own safe space”. So I just get anxious that we do not inadvertently encourage in the young—I know this is for all adults—that antidemocratic tendency to not want to hear what you do not want to hear, even when it would be good to hear as many opinions as possible.

I also want to press the Minister on the problem of filtering material that targets race, religion, sex, sexual orientation, disability and gender reassignment. I keep trying to raise the problem that it could lead to diverse philosophical views around those subjects also being removed by overzealous filtering. You might think that you know what you are asking to be filtered out. If you say you want to filter out material that is anti-religion, you might not mean that you do not want any debates on religious tolerance. For example, there was that major controversy over the film “The Lady of Heaven”. I know the Minister was interested, as I was, in the dangers of censorship in relation to that. You would not want to find that, because you said, “Don’t target me for my religion”, you could not access that debate.

I think there is a danger that we are handing a lot of power to filterers to make filtering decisions based on their values when we are not clear about what those values are. Look at what has happened with the banks in the last few days: they have closed down people’s bank accounts because they disagree with those people’s values. Again, we say “Don’t target on race”, but I have been having lots of arguments with people recently who have accused the Government, through their Illegal Migration Bill, of being racist. I think we just need to know that we are not accepting an ideological filtering of what we see.

Amendment 63 is key because it requires providers’ terms of service to include provisions about how content to which Clause 12(2) applies is identified, precisely to try to counter these problems. It imposes a duty on providers to apply those provisions consistently, as the noble Lord, Lord Moylan, explained. The point that providers have to set out how they identify content that is allegedly hostile, for example, to religion, or racially abusive, is important because this is about empowering users. Users need to know whether this will be done by machine learning or by a human. Do they look for red flags and, if so, what are the red flags? How are these things decided? That means that providers have to state clearly and be accountable for their definition of any criteria that could justify them filtering out and disturbing the flow of democratic information. It is all about transparency and accountability in that sense.

Finally, in relation to Amendment 183, I am worried about the notion of filtering out content from unverified users, for a range of reasons. It indicates somehow that there is a direct link between being unverified or anonymous and being harmful or dodgy, which I think is illegitimate. It has already been explained that there will be a detrimental impact on certain organisations—we have talked about Reddit, but I like to remember Mumsnet. There are quite a lot of organisations with community-centred models, where the structure is that influencers broadcast to their followers and where there are pseudonymous users. Is the requirement to filter out those contributors likely to lead to those models collapsing? I need to be reassured on this because I am not convinced at all. As has been pointed out, there will be a two-tier internet because those who are unable or unwilling to disclose their identity online or to be verified by someone could be shut out from public discussions. That is a very dangerous place to have ended up, even though I am sure it is not what the Government intend.

Lord Parkinson of Whitley Bay (Con)

My Lords, I am grateful for the broad, if not universal, support for the amendments that we have brought forward following the points raised in Committee. I apologise for anticipating noble Lords’ arguments, but I am happy to expand on my remarks in light of what they have said.

My noble friend Lord Moylan raised the question of non-verified user duties and crowdsourced platforms. The Government recognise concerns about how the non-verified user duties will work with different functionalities and platforms, and we have engaged extensively on this issue. These duties are applicable only to category 1 platforms, those with the largest reach and influence over public discourse. It is therefore right that such platforms have additional duties to empower their adult users. We anticipate that these features will be used in circumstances where vulnerable adults wish to shield themselves from anonymous abuse. If users decide that these features are restricting their experience on a particular platform, they can simply choose not to use them. In addition, before these duties come into force, Ofcom will be required to consult affected providers regarding the codes of practice, at which point it will consider how these duties might interact with various functionalities.

My noble friend and the noble Lord, Lord Allan of Hallam, raised the potential for being bombarded with pop-ups because of the forced-choice approach that we have taken. These amendments have been carefully drafted to minimise unnecessary prompts or pop-ups. That is why we have specified that the requirement to proactively ask users how they want these tools to be applied is applicable only to registered users. This approach ensures that users will be prompted to make a decision only once, unless they choose to ignore it. After a decision has been made, the provider should save this preference and the user should not be prompted to make the choice again.

The noble Lord, Lord Clement-Jones, talked further about his amendments on the cost of user empowerment tools as a core safety duty in the Bill. Category 1 providers will not be able to put the user empowerment tools in Clause 12 behind a pay wall and still be compliant with their duties. That is because they will need to offer them to users at the first possible opportunity, which they will be unable to do if they are behind a pay wall. The wording of Clause 12(2) makes it clear that providers have a duty to include user empowerment features that an adult user may use or apply.

Online Safety Bill

Baroness Fox of Buckley Excerpts
Lord Farmer (Con)

My Lords, I thank the Minister for engaging with the amendment in my name and that of the noble Baroness, Lady Benjamin, in Committee, to ensure parity between the regulation of online and offline pornography. We did not table it for Report because of the welcome news of the Government’s review. At this point, I would like to give my backing to all that my noble friend Lord Bethell said and would like to thank him for his great encouragement and enthusiasm on our long journey, as well as the noble Baroness, Lady Kidron. I would particularly like to mention the noble Baroness, Lady Benjamin, who, as my noble friend Lord Bethell mentioned, must be very frustrated today at not being able to stand up and benefit us with her passion on this subject, which has kept a lot of us going.

I have some questions and comments about the review, but first I want to stand back and state why this review is so necessary. Our society must ask how pornography was able to proliferate so freely, despite all the warnings of the danger and consequences of this happening when the internet was in its infancy. Human appetites, the profit motive and the ideology of cyberlibertarianism flourished freely in a zeitgeist where notions of right and wrong had become deeply unfashionable. Pre-internet, pornography was mainly on top shelves, in poky and rather sordid sex shops, or in specialist cinemas. There was recognition that exposure to intimate sex acts should never be accidental but always the result of very deliberate decisions made by adults—hence the travesty of leaving children exposed to the danger of stumbling across graphic, violent and frequently misogynistic pornography by not bringing Part 3 of the Digital Economy Act 2017 into force.

I have talked previously in this House about sociology professor Christie Davies’ demoralisation of society thesis: what happens when religiously reinforced moralism, with its totemic notion of free will, is ditched along with God. Notions of right and wrong become subjective, individually determined, and a kind of blindness sets in; how else can we explain why legislators ignored the all-too-predictable effects of unrestrained access to pornography on societal well-being, including but not limited to harms to children? For this Bill to be an inflection point in history, this review, birthed out of it, must unashamedly call out the immorality of what has gone before. How should we define morality? Well, society simply does not work if it is governed by self-gratification and expressive individualism. Relationships—the soil of society—including intimate sexual relationships, are only healthy if they are self-giving, rather than self-gratifying. These values did not emerge from the Enlightenment but from the much deeper seam of our Judeo-Christian foundations. Pornography is antithetical to these values.

I turn to the review’s terms of reference. Can the Minister confirm that the lack of parity between online and offline regulation will be included in the legal gaps it will address? Can he also confirm that the review will address gaps in evidence? As I said in Committee, a deep seam of academic research already exists on the harmful effects of the ubiquity of pornography. The associations with greater mental ill health, especially among teenagers, are completely unsurprising; developing brains are being saturated with dark depictions of child sexual abuse, incest, trafficking, torture, rape, violence and coercion. As I mentioned earlier, research shows that adults whose sexual arousal is utterly dependent on pornography can be catastrophically impaired in their ability to form relationships with flesh-and-blood human beings, let alone engage in intimate physical sex.

Will the review also plug gaps in areas that remain under-researched and controversial and where vested interests abound? On that point, whoever chairs this review will have to be ready, willing and able to take on powerful, ideologically motivated and profit-driven lobbies.

Inter alia, we need to establish through research the extent to which some young women are driven to change their gender because of hyper-sexualised, porn-depicted female stereotypes. Anecdotally, some individuals have described their complete inability to relate to their natal sex. It can be dangerous and distasteful to be a woman in a world of pornified relationships which expects them to embrace strangulation, degradation and sexual violence. One girl who transitioned described finding such porn as a child: “I am ashamed that I was fascinated by it and would seek it out. Despite this interest in watching it, I hated the idea of myself actually being in the position of the women. For a while, I even thought I was asexual. Sex is still scary to me, complicated”.

Finally, the Government’s announcement mentions several government departments but does not make it clear that the review will also draw in the work of the DfE and the DHSC—the departments responsible for children’s and adult mental health—for reasons I have already touched on. Can the Minister confirm that the remit will include whatever areas of government responsibility are needed so that the review is genuinely broad enough to look across society at how to protect not just children but adults?

Baroness Fox of Buckley (Non-Afl)

My Lords, I rise to speak to Amendment 184 in my name—

Lord Harlech (Con)

My Lords, the guidance in the Companion states that Peers who were not present for the opening of this debate last week should not speak in the debate today, so I will have to ask the noble Baroness to reserve her remarks on this occasion.

--- Later in debate ---
Moved by
77: Clause 18, page 21, line 30, after “implementing,” insert “terms of service,”
Member’s explanatory statement
This amendment, and others in the name of Baroness Fox of Buckley, ensure free speech is not just considered at an abstract policy level but is included in providers’ terms of service.
--- Later in debate ---
Baroness Fox of Buckley (Non-Afl)

My Lords, I am rather disappointed that, while this is a large group on freedom of expression, it is dominated by amendments by myself and the noble Lord, Lord Moylan. I welcome the noble Baroness, Lady Fraser of Craigmaddie, and the noble Lord, Lord Stevenson of Balmacara, dipping their toes in the free-expression water here and I am glad that the Minister has added his name to their amendment, although it is a shame that he did not add his name to one of mine.

Earlier today we heard a lot of congratulations to the Government for listening. I have to say, it depends who you are, because the Government have not listened to all of us. It is notable that, of the hundreds of new government concessions that have taken the form of amendments on Report, none relates to free speech. Before I go through my amendments, I want to note that, when the noble Lord, Lord Moylan, and I raise concerns about free speech, we can be treated as slightly eccentric. There has been a generally supportive and generous mood from the regulars in this House, and I understand that, but I worry that free speech is being seen as peripheral.

This country, our country, that we legislate for and in, has a long history of boasting that it is the home of liberty and adopts the liberal approach that being free is the default position: that free speech and the plurality and diversity of views it engenders are the cornerstone of democracy in a free society and that any deviation from that approach must require extraordinary and special justification. A comprehensive piece of law, such as the one we are dealing with, that challenges many of those norms, deserves thorough scrutiny through the prism of free speech.

When I approached this Bill, which I had been following long before I arrived in this House, I assumed that there would be packed Benches—as there are on the Illegal Migration Bill—and that everybody, including all these Law Lords, would be in, quoting the European Court of Human Rights on Article 8 and Article 10. I assumed there would be complaints about Executive power grabs and so on. But it has been a bit sparse.

That is okay; I can live with that, even if it is a bit dispiriting. But I am concerned when the Government cite that the mood of the Committee has been reflected in their amendments, because it has not been a very large Committee. Many of the amendments that I, the noble Lord, Lord Moylan, and others tabled about free expression represent the concerns of a wide range of policy analysts, civil rights groups, academics, lawyers, free speech campaigners and industry representatives. They have been put forward in good faith—I continue to do that—to suggest ways of mitigating some of the grave threats to free speech in this Bill, with constructive ideas about how to tackle flaws and raising some of the problems of unintended consequences. I have, at times, felt that those concerns were batted away with a certain indifference. Despite the Minister being very affable and charming, it can none the less be a bit disappointing.

Anyway, I am here to bat again. I hope that the Government now will listen very closely and consider how to avoid the UK ending up with the most restrictive internet speech laws of any western democracy at the end of this. I have a lot of different amendments in my name in this group. I wholeheartedly support the amendments in the name of the noble Lord, Lord Moylan, requiring Ofcom to assess the impact of its codes on free speech, but I will not speak to them.

I will talk about my amendments, starting with Amendments 77, 78, 79, 80 and 81. These require platforms to have particular regard to freedom of expression, not just when implementing safety measures and policies but when writing their terms of service. This is to ensure that freedom of expression is not reduced to an abstract “have regard to” secondary notion but is visible in the drafting of terms of service. This would mean that users know their rights in clear and concrete terms. For example, a platform should be expected to justify how a particular term of service, on something such as religious hatred, will be balanced with consideration of freedom of expression and conscience, in order to allow discussions over different beliefs to take place. Users need to be able to point to specific provisions in the terms of service setting out their free speech protections.

This is all about parity between free speech and safety. Although the Government—and I welcome this—have attempted some balance, via Clause 18, to mitigate the damage to individual rights of free expression from the Bill, it is a rather weak, poor cousin. We need to recognise that, if companies are compelled to prevent and minimise so-called harmful content via operational safety duties, these amendments are saying that there should be parity with free expression. They should be compelled to do the same on freedom of expression, with a clear and positive duty, rather than Clause 64, which is framed rather negatively.

Amendment 188 takes on the issue of terms of service from a different direction, attempting to ensure that duties with regard to safety must not be allowed to restrict lawful expression or that protected by Article 10 of the European Convention on Human Rights. That states that interference in free speech rights is not lawful unless it is a last resort. I note, in case anyone is reading the amendment carefully, and for Hansard, that the amendment cites Article 8—a rather Freudian slip on my part that was not corrected by the Table Office. That is probably because privacy rights are also threatened by the Bill, but I meant Article 10 of course.

Amendment 188 addresses a genuine dilemma in terms of Ofcom enforcing safety duties via terms and conditions. These will transform private agreements between companies and users into statutory duties under Clause 65. This could mean that big tech companies would be exercising public law functions by state-backed enforcement of the suppression of lawful speech. One worry is that platforms’ terms of service are not neutral; they can change due to external political or commercial pressures. We have all been following with great interest what is happening at Twitter. They are driven by values which can be at odds with UK laws. So I hope the Minister will answer the query that this amendment poses: how is the UK able to uphold its Article 10 obligations if state regulators are legally instructed to enforce terms of service attitudes to free speech, even when they censor far more than UK domestic law requires?

Amendment 162 has a different focus and removes offences under Section 5 of the Public Order Act from the priority offences to be regulated as priority illegal content, as set out in Schedule 7. This amendment is prompted by a concern that the legislation enlists social media companies to act as a private online police force and to adjudicate on the legality of online content. This is especially fraught in terms of the legal limits on speech, where illegality is often contested and contentious—offline as well as online.

The inclusion of Section 5 would place a duty on service providers to take measures to prevent individuals ever encountering content that includes

“threatening or abusive words or behaviour, or disorderly behaviour”

that is likely to cause “harassment, alarm or distress”. It would also require service providers to minimise the length of time such content is present on the service.

I am not sure whether noble Lords have been following the dispute that broke out over the weekend. There is a film doing the rounds on social media of a trans speaker, Sarah Jane Baker, at the Trans Pride event screaming pretty hysterically, “If you see a TERF, punch them in the effing face”—and I am being polite. You would think that that misogynistic threat would be the crime people might be concerned about, yet some apologists for Trans Pride claim that those women—TERFs such as myself—who are outraged and have treated the speech as exactly that threat are the ones who are stirring up hate.

Now, that is a bit of a mess, but asking service providers, or indeed algorithms, to untangle such disputes can surely lead only to the over-removal of online expression, or even more of a muddle. As the rule of law charity Justice points out, this could also catch content that depicts conflict or atrocities, such as those taking place in the Russia-Ukraine war. Justice asks whether the inclusion of Section 5 of the POA could lead to the removal of posts by individuals sharing stories of their own abuse or mistreatment on internet support forums.

Additionally, under Schedule 7 to the Bill, versions of Section 5 could also be regulated as priority illegal conduct, meaning that providers would have to remove or restrict content that, for instance, encourages what is called disorderly behaviour that is likely to cause alarm. Various organisations are concerned that this could mean that content portraying protest activity that might be considered disorderly by some was removed unless you condemned it, or even that content which encouraged people to attend protests would be in scope.

I am not a fan of Section 5 of the Public Order Act, which criminalises threatening or abusive words or behaviour likely to cause harassment, alarm or distress, at the best of times, but at least those offences have been and are subject to the full rigour of the criminal justice system and case law. Of course, the courts, the CPS and the police are also bound, for example by Article 10, to protect free speech. But that is very different from compelling social media companies, their staff or automated algorithms to make such complex assessments of the Section 5 threshold of illegality. Through no fault of their own, those companies are just not qualified to make such determinations, and it is obvious that this could mean that legitimate speech will end up being restricted. Dangerously, it also marks a significant departure from the UK’s rule of law in deciding what is legal or illegal speech. It has the potential to limit UK users’ ability to engage in important aspects of public life, and to prevent victims of abuse from sharing their stories, as I have described.

I turn finally to the last amendment, Amendment 275—I will keep this short, for time’s sake. I will not go into detail, but I hope that the Minister will take a look at it, see that there is a loophole, and discuss it with the department. In skeleton form, the Free Speech Union has discovered that the British Board of Film Classification runs a mobile classification network: an agreement with mobile network providers under which it advises them on what content should be filtered because it is considered suitable for adults only. This arrangement is private, not governed by statute, and as such means that even the weak free speech safeguards in this Bill can be sidestepped. This affects not only under-18s but anyone with factory settings on their phone. It led to a particularly bizarre outcome last year, when readers of the online magazine “The Conservative Woman” reported that the website was inaccessible. This small online magazine was apparently blacklisted by the BBFC because of comments below the line on its articles. The potential for such arbitrary censorship is a real concern, and the magazine cannot even appeal to the BBFC, so I ask the Minister to take this amendment back to the DCMS, which helped set up this mobile classification network, and find out what is going on.

That peculiar tale illustrates my concerns about what happens when free speech is not front and centre, even when you are concerned about safety and harm. I worry that when free speech is casually disregarded, censorship and bans can become the default, and a thoughtless option. That is why I urge the Minister before Third Reading to at least make sure that some of the issues and amendments in this group are responded to positively.

Lord Moylan (Con)

My Lords, my noble friend on the Front Bench said at various points when we were in Committee that the Bill struck an appropriate balance between protecting the rights of children and the rights of those wishing to exercise their freedom of expression. I have always found it very difficult indeed to discern that point of balance in the Bill as originally drafted, but I will say that if there were such a point, it has been swamped by the hundreds of amendments tabled to the Bill by my noble friend since Committee which push the Bill entirely in the opposite direction.

Among those amendments, I cannot find—it may be my fault, because I am just looking by myself; I have no help to find these things—a single one which seeks to redress the balance back in favour of freedom of expression. My Amendments 123, 128, 130, 141, 148 and 244 seek to do that to some extent, and I am grateful to the noble Baroness, Lady Fox of Buckley, for the support she has expressed for them.

--- Later in debate ---
Lord Allan of Hallam (LD)

On that point specifically, having worked inside one of the companies, they fear legal action under all sorts of laws, but not under the European Convention on Human Rights. As the Minister explained, it is for public bodies; if people are going to take a case on Article 10 grounds, they will be taking it against a public body. There are lots of other grounds to go after a private company but not ECHR compliance.

Baroness Fox of Buckley (Non-Afl)

My Lords, I genuinely appreciate this debate. The noble Lord, Lord Clement-Jones, made what I thought was a very important point: in going through the weeds of the Bill—and some people have been involved in it for many years, looking at the detail—it can be easy to forget the free speech point. It is important that it has been raised, but it also constantly needs to be raised. That is the point: it is, as the noble Lord, Lord Allan of Hallam, admitted, a speech-restricting Bill where we are working out the balance.

I apologise to the noble and learned Lord, Lord Hope of Craighead, for not acknowledging that he has constantly emphasised the distinction between free speech and free expression. He and I will not agree on this; it is that we do not have time for this argument now, rather than me not understanding. But he has been diligent in his persistence in trying to at least raise the issues, and that is important.

I was a bit surprised by the Minister’s response because, for the first time ever, since I have been here, there has been some enthusiasm across the House for one of my amendments—it really is unprecedented—Amendment 162 on the public order offences. I thought that the Minister might have noted that, because he has noted it every other time there has been a consensus across the House. I think he ought to look again at Amendment 162.

To indicate the muddle one gets into in terms of public order offences and illegality, the police force in Cheshire, where I am from, has put out a film online today saying that misgendering is a crime. That is the police who have said that. It is not a crime, and the point about these things, and the difficulty we are concerned with, is asking people to remove and censor material based on illegality or public order offences when they should not be removing it. That is my concern: censorship.

To conclude, I absolutely agree with the noble Lord, Lord Allan of Hallam, that of course free speech does not mean saying whatever you want wherever you want. That is not free speech, and I am a free speech absolutist. Even subreddits—if people know what they are—think they are policing each other’s speech. There are norms that are set in place. That is fine with me—that multitude.

My concern is that a state body such as Ofcom is going to set norms of acceptable free speech that are lower than free speech laws, by demanding, on pain of breach of the law, with fines and so on, that these private companies impose their own terms of service. That can itself set a norm, leading them to be risk-averse and to set levels of speech that are very dangerous. For example, when you go into work, you cannot just say anything, but there are people such as Maya Forstater, who said something at work, was disciplined and lost her job, and has just won more than £100,000, because she was expressing her views and opinions. The Equality Act ran to her aid and she has now won and been shown to be right. You cannot do that if your words have disappeared and are censored.

I could talk about this for a long time, as noble Lords know. I hope that at least, as the Bill progresses, even when it becomes an Act, the Government could just stamp on its head, “Don’t forget free speech”—but before then, as we end this process, they could come back with some concessions to some of the amendments that have been raised here today. That would be more than just words. I beg leave to withdraw the amendment.

Amendment 77 withdrawn.

Online Safety Bill

Baroness Fox of Buckley (Non-Afl)

My Lords, a lot of positive and interesting things have been said that I am sympathetic to, but this group of amendments raises concerns about a democratic deficit: if too much of the Bill is either delegated to the Secretary of State or open to interference by the Secretary of State in relation to Ofcom, who decides what those priorities are? I will ask for a couple of points of clarification.

I am glad to see that the term “public policy” has been replaced, because what did that mean? Everything. But I am not convinced that saying that the Secretary of State can decide not just on national security but on public safety and public health is reassuring in the present circumstances. The noble Lord, Lord Allan, has just pointed out what it feels like to be leaned on. We had a very recent example internationally of Governments leaning on big tech companies in relation to Covid policies, lockdowns and so on, and removing material that was seen to contradict official public health advice—often public health advice that turned out not to be accurate at all. There should at least have been a lot more debate about what were political responses to a terrible virus. Noble Lords will know that censorship became a matter of course during that time, and Governments interfering in or leaning on big tech directly was problematic. I am not reassured that the Government hold to themselves the ability to lean on Ofcom around those issues.

It is also worth remembering that the Secretary of State already has a huge amount of power to designate, as we have discussed previously. They can designate what constitute priority illegal offences and priority content harmful to children, and that can all change beyond what we have discussed here. We have already seen that there is a constant expansion of what those harms can be, and having those decisions made using only secondary legislation, unaccountable to Parliament or to public scrutiny, really worries me. It is likely to give a green light to every identity group and special interest NGO to demand additions to the list of priority harms and so on. That is likely to make the job of the Secretary of State in responding to “something must be done” moral panics all the more difficult. If that is going to happen, we should have parliamentary scrutiny of it; it cannot just be allowed to happen elsewhere.

It is ironic that the Secretary of State is more democratic, because they are elected, than an unelected regulator. I just feel that there is a danger in so much smoke and mirrors. When the Minister very kindly agreed to see the noble Lord, Lord Moylan, and me, I asked in a rather exasperated way why Ofcom could not make freedom of expression a priority, with codes of practice so that it would have to check on freedom of speech. The Minister said, “It’s not up to me to tell Ofcom what to do”, and I thought, “The whole Bill is telling Ofcom what to do”. That did not seem to make any sense.

I had another exchange with the present Secretary of State—again, noble Lords will not be surprised to hear that it was not a sophisticated intervention on my part—in which I said, “Why can’t the Government force the big tech companies to put freedom of expression in their terms and conditions or terms of service?” The Minister said, “They are private companies; we’re not interfering in what they do”. So you just end up thinking, “The whole Bill is telling companies that they’re going to be compelled to act in relation to harm and safety, but not on freedom of expression”. What that means is that you feel all the time as though the Government are saying that they are outsourcing this to third parties, which means that you cannot hold anyone to account.

Civil liberties campaigner Guy Herbert compared this to what is happening with the banks at the moment; they are being blamed by the Government and held to account for things such as politically exposed people and Ts and Cs that overconcentrate on values such as EDI and ESG that may be leading to citizens of this country having their bank accounts closed down. The Government say that they will tell the regulator that it has to act and say that the banks cannot behave in this way, but this all came from legislation—it is not as though the regulator was doing it off its own bat. Maybe it overinterpreted the legislation and the banks then overinterpreted it again and overremoved.

The obvious analogy for me is that there is a danger here that we will not be able to hold anyone to account for overremoval of legitimate democratic discussion from the online world, because everyone is pointing the finger at everyone else. At the very least, the amendments are trying to say that any changes beyond what we have discussed so far on this Bill must come before Parliament. That is very important for any kind of democratic credibility to be attached to this legislation.

Baroness Kidron (CB)

My Lords, I too express my admiration to the noble Baroness, Lady Stowell, for her work on this group with the Minister and support the amendments in her name. To pick up on what the noble Baroness, Lady Harding, said about infinite ping-pong, it can be used not only to avoid making a decision but as a form of power and of default decision-making—if you cannot get the information back, you are where you are. That is a particularly important point and I add my voice to those who have supported it.

I have a slight concern that I want to raise in public, so that I have said it once, and get some reassurance from the Minister. New subsection (B1)(d) in Amendment 134 concerns the Secretary of State directing Ofcom to change codes that may affect

“relations with the government of a country outside the United Kingdom”.

Many of the companies that will be regulated sit in America, which has been very forceful about protecting its sector. Without expanding on this too much, when it was suggested that senior managers would face some sort of liability in international fora, various parts of the American Government and state apparatus certainly made their feelings clearly known.

I am sure that the channels between our Government and the US are much more straightforward than any that I have witnessed, but it is absolutely definite that more than one Member of your Lordships’ House was approached about the senior management and said, “This is a worry to us”. I believe that where we have landed is very good, but I would like the Minister to say what the limits of that power are and acknowledge that it could get in a bit of a muddle with the economic outcomes that we were talking about, celebrating that they had been taken off the list, and government relations. That was the thing that slightly worried me in the government amendments, which, in all other ways, I welcome.

--- Later in debate ---
Baroness Morgan of Cotes (Con)

My Lords, I will speak very briefly to this amendment; I know that the House is keen to get on to other business today. I very much welcome the amendment that the Government have tabled. My noble friend the Minister has always said that they want to keep women and girls safe online. As has been referred to elsewhere, the importance of making our digital streets safer cannot be overestimated.

As my noble friend said, women and girls experience a disproportionate level of abuse online. That is now recognised in this amendment, although this is only the start, not the end, of the matter. I thank my noble friend and the Secretary of State for their engagement on this issue. I thank the chief executive and the chair of Ofcom. I also thank the noble Baroness, Lady Kidron, the right reverend Prelate the Bishop of Gloucester, who I know cannot be here today, and the noble Lord, Lord Knight, who signed the original amendment that we discussed in Committee.

My noble friend has already talked about the campaigners outside the Chamber who wanted there to be specific mention of women and girls in the Bill. I thank Refuge, the 100,000 people who signed the End Violence Against Women coalition’s petition, BT, Glitch, Carnegie UK, Professor Lorna Woods, the NSPCC and many others who made the case for this amendment.

As my noble friend said, this is Ofcom guidance. It is not necessarily a code of practice, but it is still very welcome because it is broader than just the specific offences that the Government have legislated on, which I also welcome. As he said, this puts all the things that companies, platforms and search engines should be doing to protect women and girls online in one specific place. My noble friend mentioned holistic protection, which is very important.

There is no offline/online distinction these days. Women and girls should feel safe everywhere. I also want to say, because I know that my noble friend has had a letter, that this is not about saying that men and boys should not be safe online; it is about recognising the disproportionate levels of abuse that women and girls suffer.

I welcome the fact that, in producing this guidance, Ofcom will have to consult with the Domestic Abuse Commissioner and the Victims’ Commissioner and more widely. I look forward, as I am sure do all the organisations I just mentioned, to working with Ofcom on the first set of guidance that it will produce. It gives me great pleasure to have signed the amendment and to support its introduction.

Baroness Fox of Buckley (Non-Afl)

My Lords, I know that we do not have long and I do not want to be churlish. I am not that keen on this amendment, but I want to ask a question in relation to it.

I am concerned that there should be no conflation in the best practice guidance between abuse and the actual, practical problems of, for example, victims of domestic abuse being stalked online, which is a threat to their safety, or being threatened with physical violence—I understand that. Abuse is horrible to be on the receiving end of, but it is important for freedom of thought and freedom of speech that we maintain the distinction between words and action. It is important not to overreact or frighten young women by saying that being shouted at is the same as being physically abused.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

I am very grateful to everyone for the support they have expressed for this amendment both in the debate now and by adding their names to it. As I said, I am particularly grateful to my noble friend Lady Morgan, with whom we have worked closely on it. I am also grateful for her recognition that men and boys also face harm online, as she rightly points out. As we discussed in Committee, this Bill seeks to address harms for all users but we recognise that women and girls disproportionately face harm online. As we have discussed with the noble Baroness, Lady Merron, women and girls with other characteristics such as women of colour, disabled women, Jewish women and many others face further disproportionate harm and abuse. I hope that Amendment 152 demonstrates our commitment to giving them the protection they need, making it easy and clear for platforms to implement protections for them across all the wide-ranging duties they have.

The noble Baroness, Lady Burt of Solihull, asked why it was guidance and not a code of practice. Ofcom’s codes of practice will set out how companies can comply with the duties and will cover how companies should tackle the systemic risks facing women and girls online. Stipulating that Ofcom must produce specific codes for multiple different issues could, as we discussed in Committee, create duplication between the codes, causing confusion for companies and for Ofcom.

As Ofcom said in its letter to your Lordships ahead of Report, it has already started the preparatory work on the draft illegal content and child sexual abuse and exploitation codes. If it were required to create a separate code relating to violence against women and girls, this preparatory work would need to be revised, so there would be the unintended—and, I think, across the House, undesired—consequence of slowing down the implementation of these vital protections. I am grateful for the recognition that we and Ofcom have had on that point.

Instead, government Amendment 152 will consolidate all the relevant measures across codes of practice, such as on illegal content, child safety and user empowerment, in one place, assisting platforms to reduce the risk of harm that women and girls disproportionately face.

On timing, at present Ofcom expects that this guidance will be published in phase 3 of the implementation of the Bill, which was set out in Ofcom’s implementation plan of 15 June. This is when the duties in Part 4 of the Bill, relating to terms of service and so on, will be implemented. The guidance covers the duties in Part 4, so for guidance to be comprehensive and have the most impact in protecting women and girls, it is appropriate for it to be published during phase 3 of the Bill’s implementation.

The noble Baroness, Lady Fox, mentioned the rights of trans people and the rights of people to express their views. As she knows, gender reassignment and religious or philosophical belief are both protected characteristics under the Equality Act 2010. Sometimes those are in tension, but they are both protected in the law.

With gratitude to all the noble Lords who have expressed their support for it, I commend the amendment to the House.

Baroness Fox of Buckley (Non-Afl)

The Minister did not quite grasp what I said but I will not keep the House. Would he be prepared to accept recommendations for a broader consultation—or who do I address them to? It is important that groups such as the Women’s Rights Network and others, which suffer abuse because they say “I know what a woman is”, are talked to in a discussion on women and abuse, because that would be appropriate.

Lord Parkinson of Whitley Bay (Con)

I am sorry—yes, the noble Baroness made a further point on consultation. I want to reassure her and other noble Lords that Ofcom has the discretion to consult whatever body it considers appropriate, alongside the Victims’ Commissioner, the Domestic Abuse Commissioner and others who I mentioned. Those consultees may not all agree. It is important that Ofcom takes a range of views but is able to consult whomever. As I mentioned previously, Ofcom and its officers can be scrutinised in Parliament through Select Committees and in other ways. The noble Baroness could take it up directly with them but could avail herself of those routes for parliamentary scrutiny if she felt that her pleas were falling on deaf ears.

--- Later in debate ---
At some point, someone has to say, “You’ve got it right: you shouldn’t be able to classify that as a recognised news publisher”, or, “You’ve got it wrong: actually, the British Government, in all their glory, stand behind the fact that Infowars should be recognised and given these special privileges”. Those are really important questions we have to ask about how this clause will work in practice. Amendments 158 and 161, because they allow explicitly for short-form video made especially for social media, will come to be seen as quite instrumental and not at all minor.
Baroness Fox of Buckley (Non-Afl)

My Lords, I am completely opposed to Amendments 159 and 160, but the noble Lords, Lord Faulks and Lord Black, and the noble Viscount, Lord Colville, have explained the issues perfectly. I am fully in agreement with what they said. I spoke at length in Committee on that very topic. This is a debate we will undoubtedly come back to in the media Bill. I, for one, am extremely disappointed that the Labour Party has said that it will not repeal Section 40. I am sure that these issues will get an airing elsewhere. As this is a speech-limiting piece of legislation, as was admitted earlier this week, I do not want any more speech limiting. I certainly do not want it to be a media freedom-limiting piece of legislation on top of that.

I want to talk mainly about the other amendments, Amendments 158 and 161, but approach them from a completely different angle from the noble Lord, Lord Allan of Hallam. What is the thinking behind saying that the only people who can clip content from recognised news publishers are the news publishers? The Minister mentioned in passing that there might be a problem of editing them, but it has become common practice these days for members of the public to clip from recognised news publishers and make comments. Is that not going to be allowed? That was the bit that completely confused me. It is too prescriptive; I can see all sorts of people getting caught by that.

The point that the noble Lord, Lord Allan of Hallam, made about what constitutes a recognised news publisher is where the issue gets quite difficult. The point was made about the “wrong” organisations, but I want to know who decides what is right and wrong. We might all nod along when it comes to Infowars and RT, but there are lots of organisations that would potentially fail that test. My concern is that they would not be able to appeal when they are legitimate news organisations, even if not to everybody’s taste. Because I think that we already have too much speech limiting in the Bill, I do not want any more. This is important.

When it comes to talking about the “wrong” organisations, I noticed that the noble Lord, Lord McNally, referred to people who went to Rupert Murdoch’s parties. I declare my interests here: I have never been invited or been to a Rupert Murdoch party—although do feel free, I say, if he is watching—but I have read about them in newspapers. For some people in this Chamber, the “wrong” kind of news organisation is, for example, the Times or one with the wrong kind of owner. The idea that we will all agree or know which news publishers are the “wrong” kind is not clear, and I do not think that the test is going to sort it out.

Will the Minister explain what organisations can do if they fail the recognised news publisher test to appeal and say, “We are legitimate and should be allowed”? Why is there this idea that a member of the public cannot clip a recognised news publisher’s content without falling foul? Why would they not be given some exemption? I genuinely do not understand that.

Baroness Stowell of Beeston (Con)

My Lords, I shall speak very briefly. I feel a responsibility to speak, having spoken in Committee on a similar group of amendments when the noble Lords, Lord Lipsey and Lord McNally, were not available. I spoke against their amendments then and would do so again. I align myself with the comments of my noble friend Lord Black, the noble Lord, Lord Faulks, and the noble Viscount, Lord Colville. As the noble Baroness, Lady Fox, just said, they gave a comprehensive justification for that position. I have no intention of repeating it, or indeed repeating my arguments in Committee, but I think it is worth stating my position.

Online Safety Bill

The Minister has a clear idea of the kind of reassurances that we are looking for. He teased out some of them in his opening comments, and I hope that he can make them even more strongly in his closing remarks.
Baroness Fox of Buckley (Non-Afl)

My Lords, the noble Lord, Lord Allan of Hallam, hinted at the fact that there has been a plethora of government amendments on Report and, to be honest, it has been quite hard fully to digest most of them, let alone scrutinise them. I appreciate that the vast majority have been drawn up with opposition Lords, who might have found it a bit easier. But some have snuck in and, in that context, I want to raise some problems with the amendments in this group, which are important. I, too, am especially worried about the government amendment on facilitating remote access to services and to equipment used by services. I am really grateful to the noble Lords, Lord Allan of Hallam and Lord Clement-Jones, for tabling Amendment 247B, because I did not know what to do—and they did it. At least it raises the issue to the level of it needing to be taken seriously.

The biggest problem that I had when I originally read this provision was that facilitating remote access to services, and as yet undefined equipment used by a service, seems like a very big decision, and potentially disproportionate. It certainly has a potential to have regulatory overreach, and it creates real risks around privacy. It feels as though it has not even been flagged up strongly enough by the Government with regard to what it could mean.

I listened to what the Minister said, but I still do not fully understand why this is necessary. Have the Government considered the privacy and security implications that have already been discussed? Through Amendment 252A, the Government now have the power to enter premises for inspection—it rather feels as if there is the potential for raids, but I will put that to one side. They can go in, order an audit and so on. Remote access as a preliminary way to gather information seems heavy-handed. Why not leave it as the very last thing to do in a dialogue between Ofcom and a given platform? We have yet to hear a proper justification of why Ofcom would need this as a first-order thing to do.

The Bill does not define exactly what

“equipment used by the service”

means. Does it extend to employees’ laptops and phones? If it extends to external data centres, have the Government assessed the practicalities and security impact of that and the substantial security implications, as have been explained, for the services, the data centre providers and those of us whose data they hold?

I am also concerned that this will necessitate companies strongly considering their internal privacy and security controls to deal with this possibility, and that it will place a disproportionate burden on smaller and mid-sized businesses that do not have the resources available to the biggest technology companies. I keep raising this because in other parts of government there is a constant attempt to say that the UK will be the centre of technological innovation and a powerhouse in new technologies, yet I am concerned that so much of the Bill could damage that innovation. That is worth considering.

It seems to me that Amendment 252A on the power to observe at the premises ignores decentralised projects and services—the very kind of services that can revolutionise social media in a positive way. Not every service is like Facebook, but this amendment misses that point. For example, you will not be able to walk into the premises of the UK-based Matrix, the provider of the encrypted chat service Element that allows users to host their own service. Similarly, the non-profit Mastodon claims to be the largest decentralised social network on the internet and to be built on open-web standards precisely because it does not want to be bought or owned by a billionaire. So many of these amendments seem not to take those issues into account.

I also have a slight concern on researcher access to data. When we discussed this in Committee, the tone was very much—as it is in these amendments now—that these special researchers need to be able to find out what is going on in these big, bad tech companies that are trying to hide away dangerous information from us. Although we are trying to ensure that there is protection from harms, we do not want to demonise the companies so much that, every time they raise privacy issues or say, “We will provide data but you can’t access it remotely” or “We want to be the ones deciding which researchers are allowed to look at our data”, we assume that they are always up to no good. That sends the wrong message if we are to be a tech-innovative country or if there is to be any working together.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

I will write on Schedule 12 as well.

Baroness Fox of Buckley (Non-Afl)

Before the Minister sits down, to quote the way the Minister has operated throughout Report, there is consensus across the House that there are some concerns. The reason why there are concerns outside and inside the House on this particular amendment is that it is not entirely clear that those protections exist, and there are worries. I ask the Minister whether, rather than just writing, it would be possible to take this back to the department, table a late amendment and say, “Look again”. That has been done before. It is certainly not too late: if it was not too late to have this amendment then it is certainly not too late to take it away again and to adopt another amendment that gives some safeguarding. Seriously, it is worth looking again.

Lord Parkinson of Whitley Bay (Con)

I had not quite finished; the noble Baroness was quick to catch me before I sat down. I still have some way to go, but I will certainly take on board all the points that have been made on this group.

The noble Lord, Lord Knight, asked about Schedule 12. I will happily write with further information on that, but Schedule 12 is about UK premises, so it is probably not the appropriate place to deal with this, as we need to be able to access services in other countries. If there is a serious security risk then it would not necessarily be proportionate. I will write to him with further details.

--- Later in debate ---
Baroness Fox of Buckley (Non-Afl)

My Lords, I want to look at how the Government’s expansion of Ofcom’s duties to prioritise media literacy has become linked to this group, and at the way in which Amendment 274B does this. It is very much linked with misinformation and disinformation. According to the amendment, there has to be an attempt to establish

“accuracy and authenticity of content”

and to

“understand the nature and impact of disinformation and misinformation, and reduce their and others’ exposure to it”.

I was wondering about reducing users’ exposure to misinformation and disinformation. That gives me pause, because I worry that reducing exposure will obviously mean the removal or censorship of material. I just want to probe some assumptions. Is it the presumption that incorrect or seemingly untrue or erroneous information is the predominant cause of real harm if it is not suppressed? Is there not a risk of harm in suppressing ideas too? Apart from the fact that heretical scientific and political theories were historically seen as misinformation and now are conventional wisdom, is there a danger that suppression in the contemporary period would create mistrust and encourage conspiratorial thinking—people saying, “What have you got to hide?”—and so on?

I want to push this by probing Amendment 269AA in the name of the noble Lord, Lord Clement-Jones, which itself is a probing amendment as to why Ofcom’s misinformation and disinformation committee is not required to consider the provenance of information to help empower users to understand whether content is real or true and so on, rather than the wording at the moment, “accuracy and authenticity”. When I saw the word “provenance”, I stopped for a moment. In all the debates going on in society about misinformation and disinformation, excellent provenance cannot necessarily guarantee truth.

I was shocked to discover that the then Wellcome Trust director, Jeremy Farrar, who is now the chief scientist at the World Health Organization, claimed that the Wuhan lab leak and the manmade theories around Covid were highly improbable. We now know that there were emails from Jeremy Farrar—I was shocked because I am a great fan of the Wellcome Trust and Jeremy Farrar’s work in general—in which there was a conscious bending of the truth that led to the editing of a scientific paper and a letter in the Lancet that proved to have been spun in a way to give wrong information. When issues such as the Wuhan lab leak were raised by Matt Ridley, recently of this parish—I do not know whether his provenance would count—they were dismissed as some kind of racist conspiracy theory. I am just not sure that it is that clear that you can get provenance right. We know from the Twitter files that the Biden Administration leaned on social media companies to suppress the Hunter Biden laptop story that was in the New York Post, which was described as Russian disinformation. We now know that it was true.

Therefore, I am concerned that, in attempting to be well-meaning, this amendment that says we should have better media information does not give in to these lazy labels of disinformation and misinformation, as if we all know what the truth is and all we need is fact-checkers, provenance and authenticity. Disinformation and misinformation have been weaponised, which can cause some serious problems.

Can the Minister clarify whether the clause on media literacy is a genuine, positive attempt at encouraging people to know more, or itself becomes part of an information war that is going on offline and which will not help users at all but only confuse things?

--- Later in debate ---
Lord Allan of Hallam (LD)

My Lords, I will add to my noble friend’s call for us to consider whether Clause 158 should be struck from the Bill as an unnecessary power for the Secretary of State to take. We have discussed powers for the Secretary of State throughout the Bill, with some helpful improvements led by the noble Baroness, Lady Stowell. This one jars in particular because it is about media literacy; some of the other powers related to whether the Secretary of State could intervene on the codes of practice that Ofcom would issue. The core question is whether we trust Ofcom’s discretion in delivering media literacy and whether we need the Secretary of State to have any kind of power to intervene.

I single out media literacy because the clue is in the name: literacy is a generic skill that you acquire about dealing with the online world; it is not about any specific text. Literacy is a broader set of skills, yet Clause 158 has a suggestion that, in response to specific forms of content or a specific crisis happening in the world, the Secretary of State would want to take this power to direct the media literacy efforts. To take something specific and immediate to direct something that is generic and long-term jars and seems inappropriate.

I have a series of questions for the Minister to elucidate why this power should exist at all. It would be helpful to have an example of what kind of “public statement notice”—to use the language in the clause—the Government might want to issue that Ofcom would not come up with on its own. Part of the argument we have been presented with is that, somehow, the Government might have additional information, but it seems quite a stretch that they could come up with that. In an area such as national security, my experience has been that companies often have a better idea of what is going on than anybody in government.

Thousands of people out there in the industry are familiar with APT 28 and APT 29 which, as I am sure all noble Lords know, are better known by their names Fancy Bear and Cozy Bear. These are agents of the Russian state that put out misinformation. There is nothing that UK agencies or the Secretary of State might know about them that is not already widely known. I remember talking about the famous troll factory run by Prigozhin, the Internet Research Agency, with people in government in the context of Russian interference—they would say “Who?” and have to go off and find out. In dealing with threats such as that, between the people in the companies and Ofcom you certainly want a media literacy campaign which tells you about these troll agencies and how they operate and gives warnings to the public, but I struggle to see why you need the Secretary of State to intervene as opposed to allowing Ofcom’s experts to work with company experts and come up with a strategy to deal with those kinds of threat.

The other example cited of an area where the Secretary of State might want to intervene is public health and safety. It would be helpful to be specific; had they had it, how would the Government have used this power during the pandemic in 2020 and 2021? Does the Minister have examples of what they were frustrated about and would have done with these powers that Ofcom would not do anyway in working with the companies directly? I do not see that they would have had secret information which would have meant that they had to intervene rather than trusting Ofcom and the companies to do it.

Perhaps there has been an interdepartmental workshop between DHSC, DCMS and others to cook up this provision. I assume that Clause 158 did not come from nowhere. Someone must have thought, “We need these powers in Clause 158 because we were missing them previously”. Are there specific examples of media literacy campaigns that could not be run, where people in government were frustrated and therefore wanted a power to offer it in future? It would be really helpful to hear about them so that we can understand exactly how the Clause 158 powers will be used before we allow this additional power on to the statute book.

In the view of most people in this Chamber, the Bill as a whole quite rightly grants the Government and Ofcom, the independent regulator, a wide range of powers. Here we are looking specifically at where the Government will, in a sense, overrule the independent regulator by giving it orders to do something it had not thought of doing itself. It is incumbent on the Government to flesh that out with some concrete examples so that we can understand why they need this power. At the moment, as noble Lords may be able to tell, these Benches are not convinced that they do.

Baroness Fox of Buckley (Non-Afl)

My Lords, I will be very brief. The danger with Clause 158 is that it discredits media literacy as something benign or anodyne; it will become a political plaything. I am already sceptical, but if ever there was anything to add to this debate then it is that.

--- Later in debate ---
Lord Allan of Hallam (LD)

We established in our last debate that the notion of a recognised news publisher will go much broader than a broadcaster. I put it to the Minister that we could end up in an interesting situation where one bit of the Bill says, “You have to protect content from these people because they are recognised news publishers”. Another bit, however, will be a direction to the Secretary of State saying that, to deal with this crisis, we are going to give a media literacy direction that says, “Please get rid of all the content from this same news publisher”. That is an anomaly that we risk setting up with these different provisions.

Baroness Fox of Buckley (Non-Afl)

On the previous group, I raised the issue of legal speech that was labelled as misinformation and removed in the extreme situation of a public health panic. This was seemingly because the Government were keen that particular public health information was made available. Subsequently, we discovered that those things were not necessarily untrue and should not have been removed. Is the Minister arguing that this power is necessary for the Government to direct that certain things are removed on the basis that they are misinformation—in which case, that is a direct attempt at censorship? After we have had a public health emergency in which “facts” have been contested and shown to not be as black and white or true as the Government claimed, saying that the power will be used only in extreme circumstances does not fill me with great confidence.

Lord Parkinson of Whitley Bay (Con)

I am happy to make it clear, as I did on the last group, that the power does not allow Ofcom to require platforms to remove content, only to set out what they are doing in response to misinformation and disinformation—to require platforms to make a public statement about what they are doing to tackle it. In relation to regulating news providers, we have brought the further amendments forward to ensure that those subject to sanctions cannot avail themselves of the special provisions in the Bill. Of course, the Secretary of State will be mindful of the law when issuing directions in the exceptional circumstances that these clauses set out.

--- Later in debate ---
I am happy to say that when this amendment was debated in Committee, it found widespread support from around the House. I hope to find that that support is still solid and strong, such that my noble friend, perhaps as a modest postprandial bonus, will be willing, for a change, to accept something proposed by a colleague from his own Benches, so that we can all rejoice as we go into a very long night. I beg to move.
Baroness Fox of Buckley (Non-Afl)

My Lords, I put my name to this very important amendment—all the more important because of the previous discussions we have had about the difficulties around misinformation or potential government interference in decisions about what is online and what is not online. The noble Lord, Lord Moylan, is right to indicate that this is a very modest and moderate amendment; it addresses the problems of government influence or government moderation, or at least allows those of us who are concerned about it to keep our eye on it and make sure that the country and Parliament know what is going on.

The original idea of disinformation came from an absolutely rightful concern about foreign disinformation between states. People were rightly concerned about security; we all should be, and nobody wants to be taken in in that way. But there has been a worry, in a wide range of countries, about agencies designed to combat those threats increasingly turning inward against the public. Although that might not be exactly what has happened in the UK, we should note that Meta CEO Mark Zuckerberg recently admitted that the US Government asked Facebook to suppress true information. In a recent interview, he said that the scientific establishment

“asked for a bunch of things to be censored that, in retrospect, ended up being more debatable or true”.

We should all be concerned about this. It is not just a matter for those of us who are worried about free speech or raise the issue. If we are genuinely worried about misinformation or fake news, we have to make sure that we are equally concerned if it comes from other sources, not just from malign players.

The noble Lord, Lord Moylan, mentioned the American court case Missouri v Biden. In his 155-page ruling, Judge Doughty depicted quite a dystopian scene when he said that, during the pandemic, the US Government seem

“to have assumed a role similar to an Orwellian ‘Ministry of Truth’”.

I do not think we want to emulate the worst of what is happening in the US here.

The judge there outlined a huge complex of government agencies and officials connected with big tech and an army of bureaucrats hired to monitor websites and flag and remove problematic posts. It is not like that in the UK, but some of us were quite taken aback to discover that the Government ran a counter-disinformation policy forum during the lockdown, which brought tech giants together to discuss how to deal with Covid misinformation, as it was said. There was a worry about political interference then.

I do not think that this is just paranoia. Since then, Big Brother Watch and its investigative work have shown that the UK Government had a secret unit that worked with social media companies to monitor and prevent speech critical of Covid lockdown policies, in the shape of the Counter Disinformation Unit, which was set up by Ministers to deal with groups and individuals who criticised policies such as lockdowns, school closures, vaccine mandates or what have you.

Like the noble Lord, Lord Moylan, I do not want to get stuck on what happened during lockdown. That was an exceptional, extreme situation. None the less, the Counter Disinformation Unit—which works out of the Minister’s own department, the DCMS—is still operating. It seems to be able to get content fast-tracked for possible moderation by social media firms such as Facebook and Twitter. It used an AI firm to search social media posts—we need to know the details of that.

I think, therefore, that the transparency which the Government and the Minister have constantly stressed, and which is hugely important for the credibility of the Bill, must extend to the likes of the Counter Disinformation Unit and any government attempts at interfering in what we are allowed to see, read or have access to online.

--- Later in debate ---
Baroness Fox of Buckley (Non-Afl)

Before the Minister sits down, I think that it is entirely appropriate for him to say—I have heard it before—“Oh no, nothing was taken down. None of this is believable. No individuals were targeted”. However, that is not the evidence I have seen, and it might well be that I have been shown misinformation. But that is why the Minister has to acknowledge that one of the problems here is that indicated by Full Fact—which, as we know, is often endorsed by government Ministers as fact-checkers. It says that because the Government are avoiding any scrutiny for this unit, it cannot know. It becomes a “he said, she said” situation. I am afraid that, because of the broader context, it would make the Minister’s life easier, and be clearer to the public—who are, after all, worried about this—if he accepted the ideas in the amendment of the noble Lord, Lord Moylan. We would then be clear and it would be out in the open. If the FOIs and so on that have been constantly put forward were answered, would that not clear it up?

Lord Parkinson of Whitley Bay (Con)

I have addressed the points made by the noble Baroness and my noble friend already. She asks the same question again and I can give her the same answer. We are operating openly and transparently here, and the Bill sets out further provisions for transparency and accountability.

--- Later in debate ---
There has been much discussion, both in Committee and on Report, on protecting freedom of expression, but not much movement by the Government. I hope that the Minister will use this small amendment to push for draft codes of practice which allow the platforms, when they are not sure of the illegality of content, to use their discretion and consider freedom of expression.
Baroness Fox of Buckley (Non-Afl)

My Lords, it is all quite exciting now, is it not? I can say “hear, hear!” a lot; everyone is talking about freedom of expression. I cannot tell noble Lords how relieved and pleased I was both to hear the speeches and to see Amendment 228 from the noble Lord, Lord Allan of Hallam, and the noble Viscount, Lord Colville of Culross, who both explained well why this is so important. I am so glad that, even late in our discussions on Report, it has returned as an important issue.

We have already discussed how in many cases, especially when it comes to what is seen as illegal speech, decisions about illegality are very complicated. They are complicated in the law courts and offline, and that is when they have the full power of lawyers, the criminal justice system and so on trying to make decisions. Leaving it up to people who, through no fault of their own, are not qualified but who work in a social media company to try to make that decision in a climate of quite onerous obligations—and having phrases such as “reasonable grounds to infer”—will lead to lawful expression being overmoderated. Ultimately, online platforms will use an abundance of caution, which will lead to a lot of important speech—perfectly lawful if not worthy speech; the public’s speech and the ability to speak freely—being removed. That is not a trivial side issue; it will discredit the Bill, if it has not done so already.

Whenever noble Lords make contributions about why a wide range of amendments and changes are needed—particularly in relation to protecting children, harm and so on—they constantly tell us that the Bill should send an uncompromising message. The difficulty I have is with the danger that the Bill will send an uncompromising message that freedom of expression is not important. I urge the Minister to look carefully at the amendment, because the message should be that, while the Bill is trying to tackle online harm and to protect children in particular—which I have never argued against—huge swathes of it might inadvertently silence people and deprive them of the right to information that they should be able to have.

My Amendment 229—I am not sure why it is in this group, but that is nothing new in the way that the groupings have worked—is about lawful speech and about what content is filtered by users. I have already argued for the replacement of the old legal but harmful duty, but the new duty of user empowerment is welcome, and at face value it puts users in the driving seat and allows adults to judge for themselves what they want and do not want to see. But—and it is a large but—that will work only if users and providers agree about when content should be filtered and what content is filtered.

As with all decisions on speech, as I have just mentioned, in the context particularly of a heightened climate of confusion and sensitivity regarding identity politics and the cancel-culture issues that we are all familiar with, there are some problems with the way that things stand in the Bill. I hope I am using the term “reasonable grounds to infer” in a better way than it is used in terms of illegality. My amendment specifies that companies need to have reasonable grounds to infer that content is abusive or inciting hatred when filtering out content in those user empowerment tools. Where a user chooses to filter out hateful content based on race, on being a woman or whatever, it should catch only content that genuinely falls under those headings. There is a risk that, without this amendment, technologies or individuals working for companies could operate in a heavy-handed way in filtering out legitimate content.

I shall give a couple of examples. Say that someone chooses to filter out abusive content targeting the protected characteristic of race. I imagine that they would have a reasonable expectation that that filter would target aggressive, unpleasant content demeaning to a person because of their race, but does the provider agree with that? Will it interpret my filtering choice as a user in the most restrictive way possible in a bid to protect my safety or by seeing my sensibilities as having a low threshold for what it might consider to be abuse?

The race issue illustrates where we get into difficulties. Will the filterers take their cue from the document compiled by the Diocese of St Edmundsbury and Ipswich, which the anti-racist campaigning group Don’t Divide Us has just released, and which is being used in 87 schools? Under the heading of racism, we have ideas such as that “passive racism” includes agreeing that

“There are two sides to every story”,


or if you deny white privilege or if you start a sentence saying, “Not all white people”. “Veiled racism” in this document—which, as I say, is being used in schools for that particular reason by the Church of England—includes a “Euro-centric curriculum” or “cultural appropriation”. “Racist discrimination” includes “anti-immigration policies”, which, as I pointed out before, would indicate that some people would call the Government’s own Bill tonight racist.

The reason why I mention that is that you might think, “I am going to have racism filtered out”, but if there is too much caution then you will have filtered out very legitimate discussions on immigration and cultural appropriation. You will be protected, but if, for example, the filterer follows certain universities that have deemed the novels of Walter Scott, the plays of William Shakespeare or Enid Blyton’s writing as racist, then you can see that we have some real problems. When universities have said there is misogynistic bullying and sexual abuse in “The Great Gatsby” and Ovid’s “Metamorphoses”, I just want to make sure that we do not end up in a situation where there is oversensitivity by the filterers. Perhaps the filtering will take place by algorithm, machine learning and artificial intelligence, but the EHRC has noted that algorithms just cannot cope with the context, cultural difference and complexity of language within the billions of items of content produced every day.

Amendment 229 ensures that there is a common standard—a standard of objective reasonableness. It is not perfect at all; I understand that reasonableness itself is open to interpretation. However, it is an attempt to ensure that the Government’s concept of user empowerment is feasible by at least aspiring to a basic shared understanding between users and providers as to what will be filtered and what will not, and a check against providers’ filter mechanisms removing controversial or unpopular content in the name of protecting users. Just as I indicated in terms of sending a message, if the Government could indicate to the companies that rather than taking a risk-averse attitude, they had to bear in mind freedom of expression, not be oversensitive and not be too risk-averse or overcautious, we might begin to get some balance. Otherwise, an awful lot of lawful material will be removed that is not even harmful.

Baroness Kidron (CB)

My Lords, I support Amendment 228. I spoke on this issue to the longer amendment in Committee. To decide whether something is illegal at high volume and high speed, without the entire apparatus of the justice system, in which a great deal of care is taken to decide whether something is illegal, is very worrying. It strikes me as amusing because someone commented earlier that they like a “must” instead of a “maybe”. In this case, where the Bill says that a provider should “treat the content as content of the kind in question accordingly”, I caution that something a little softer is needed, not a cliff edge that ends up in horrors around illegality where someone who has acted in self-defence is accused of a crime of violence, as happens to many women, and so on and so forth. I do not want to labour the point. I just urge a gentle landing rather than, as it is written, a cliff edge.

Online Safety Bill
Baroness Stowell of Beeston (Con)

My Lords, as the noble Lords, Lord Stevenson and Lord Clement-Jones, have already said, the Communications and Digital Select Committee did indeed recommend a new Joint Committee of both Houses to look specifically at the various different aspects of Ofcom’s implementation of what will be the Online Safety Act and ongoing regulation of digital matters. It is something I still have a lot of sympathy for. However, there has not been much appetite for such a Joint Committee at the other end of the Corridor. I do not necessarily think we should give up on that, and I will come back to that in a moment, but in place of that, I am not keen on what is proposed in Amendment 239, because my fear about how that is laid out is that it introduces something that appears a bit too burdensome and probably introduces too much delay in implementation.

To return to the bigger question, I think that we as parliamentarians need to reflect on our oversight of regulators, to which we are delegating significant new powers and requiring them to adopt a much more principles-based approach to regulation to cope with the fast pace of change in the technological world. We have to reflect on whether our current set-up is adequate for the way in which that is changing. What I have in mind is very much a strategic level of oversight, rather than scrutinising operational decisions, although, notwithstanding what the noble Lord has said, something specific in terms of implementation of the Bill and other new legislation is an area I would certainly wish to explore further.

The other aspect of this is making sure that our regulators keep pace too, not just with technology, and apply the new powers we give them in a way which meets our original intentions, but with the new political dynamics. Earlier today in your Lordships’ Chamber, there was a Question about how banks are dealing with political issues, and that raises questions about how the FCA is regulating the banking community. We must not forget that the Bill is about regulating content, and that makes it ever more sensitive. We need to keep reminding ourselves about this; it is very new and very different.

As has been acknowledged, there will continue to be a role for the Communications and Digital Select Committee, which I have the great privilege of chairing, in overseeing Ofcom. My noble friend Lord Grade and Dame Melanie Dawes appeared before us only a week ago. There is a role for the SIT Committee in the Commons; there is also probably some kind of ongoing role for the DCMS Select Committee in the Commons too, I am not sure. In a way, the fractured nature of that oversight makes it all the more critical that we join up a bit more. So I will take it upon myself to give this more thought and speak to the respective chairs of those committees in the other place, but I think that at some point we will need to consider, in some other fora, the way in which we are overseeing the work of regulators.

At some point, I think we will need to address the specific recommendations in the pre-legislative committee’s report, which were very much in line with what my own committee thought was right for the future of digital regulatory oversight, but on this occasion, I will not be supporting the specifics of Amendment 239.

Baroness Fox of Buckley (Non-Afl)

My Lords, very briefly, I was pleased to see this, in whatever form it takes, because as we finish off the Bill, one thing that has come up consistently is that some of us have raised problems of potential unintended consequences, such as whether age gating will lead to a huge invasion of the privacy of adults rather than just narrowly protecting children, or whether the powers given to Ofcom will turn it into the most important and powerful regulator in the country, if not in Europe. In a highly complex Bill, is it possible for us to keep our eye on it a bit more than just by whingeing on the sidelines?

The noble Baroness, Lady Stowell, makes a very important point about the issue in relation to the FCA and banking. Nobody intended that to be the outcome of PEPs, for example, and nobody intended when they suggested encouraging banks to have values such as ESG or EDI—equality, diversity and inclusion—that that would lead to ordinary citizens of this country being threatened with having their banking turned off. It is too late to then retrospectively say, “That wasn’t what we ever intended”.

--- Later in debate ---
If, on the other hand, the ICO says it is problematic, we know then that we need to carry on discussing and debating whether that technology is appropriate and whether the safety/privacy balance has been got right. So, whether you support more scanning of content or are concerned about more scanning of content, to have the providers of the services that we all use every day, in the circumstances where they think there is a fundamental threat, being able to go to our privacy regulator, which we have set up precisely to guard our privacy rights, and ask it for an opinion, I do not think is an excessive ask. I hope that the Government will either accept the amendment or make a commitment that they will bring in something comparable at Third Reading. Absent that, we feel that this is important enough that we should test the opinion of the House in due course.
Baroness Fox of Buckley (Non-Afl)

My Lords, I have put my name to and support Amendment 255, laid by the noble Lord, Lord Moylan, which straightforwardly means that a notice may not impose a requirement relating to a service that would require that provider to weaken or remove end-to-end encryption. It is very clear. I understand that the other amendments introduce safeguards, which is better than nothing. It is not what I would like, but I will support them if they are pushed to a vote. I think that the Government should seriously consider not going anywhere near getting rid of encryption in this Bill and reconsider it by the time we get to Third Reading.

As the noble Lord, Lord Moylan, explained, this is becoming widely known about now, and it is causing some concern. If passed, this Bill, as it is at the moment, gives Ofcom far-reaching powers to force services, such as WhatsApp, to install software that would scan all our private messages to see whether there is evidence of terrorism, child sexual exploitation or abusive content and would automatically send a report to third parties, such as law enforcement, if it suspects wrongdoing—all without the consent or control of the sender or the intended recipient.

I would just like to state that encryption is a wonderful innovation. That is why more than 40 million people in the UK use it every day. It ensures that our private messages cannot be viewed, compromised or altered by anyone else, not even the providers of chat services. Short of somebody handing their phone over to a journalist and saying, “You can have my WhatsApp messages for anyone to read”, nobody else can read them.

One of the interesting things that we have discussed throughout the passage of the Bill is technologies, their design and functionality and making sure they are not harmful. Ironically, it is the design and function of encryption that actually helps to keep us safe online. That is why so many people talk about civil libertarians, journalists and brave dissenters using it. For the rest of us, it is a tool to protect our data and private communications in the digital realm. I want to pose here that it is an irony that the client-side scanning technologies being proposed are themselves the ones that are potentially harmful because, as people have noted, they are the equivalent of putting video cameras in our homes to listen in to every conversation and send reports to the police if we discuss illicit topics. As I have said before, while child sexual abuse is horrendous and vile, we know that it happens largely in the home and, as yet, the Government have not advocated that we film in everybody’s home in order to stop child sexual abuse. We should do almost anything and everything that we can, but I think this is the wrong answer.

Focusing on encryption just makes no sense. The Government have made exemptions, recognising the importance, in a democracy, of private correspondence: so exempted in the Bill are text messages, MSN, Zoom, oral commentaries and email. It seems perverse to demonise encryption in this way. I also note that there are exemptions for anything sent on message apps by law enforcement or public sector or emergency responders. I appreciate that some government communications are said to be done over apps such as WhatsApp. It seems then that the target of this part of the Bill is UK private citizens and residents and that the public are seen as the people who must be spied on.

In consequence, I do not think it surprising that more than 80 national or international civil society organisations have said that this would make the UK the first liberal democracy to require the routine scanning of people’s private chat messages. What does the Minister say to the legal opinion from the technology barrister Matthew Ryder KC, commissioned by Index on Censorship precisely on this part of the Bill? He compares this to law enforcement wiretapping without a warrant and says that the Bill will grant Ofcom a wider remit of surveillance powers over the public than GCHQ has.

Even if the Minister is not interested in lawyers or civil libertarians, surely we should be listening to the advice of science and technology experts in relation to complex technological solutions. Which experts have been consulted? I noted that Matthew Hodgson, the boss of the encrypted messaging app Element, has said wryly that

“the Government has not consulted with UK tech firms, only with huge multinational corporations and companies that want to sell software that scans messages, who are unsurprisingly telling lawmakers that it is possible to scan messages without breaking encryption”.

The problem is that it is not possible to scan those messages without breaking encryption. It is actually misinformation to say that it is. That is why whole swathes of leading scientists and technologists from across the globe have recently put out an open letter explaining why and how it is not true. They explained that it creates dangerous security side-effects, of the kind the noble Lord, Lord Moylan, described, and makes the online world less safe for many of us. Existing scanning technologies are flawed and ineffective, and scanning will nullify the purpose of encryption. I refer noble Lords to the work of the Internet Society and the academic paper Bugs in Our Pockets: The Risks of Client-Side Scanning for more details on all the peer-reviewed work.

I understand that, given the horrific nature of child sexual abuse—and, of course, terrorism, but I shall concentrate on child sexual abuse because the Bill is so concerned with it—it can be tempting for the Government to hope that there is a technological silver bullet to eradicate it. But the evidence suggests otherwise. One warning from scientists is that scanning billions of pieces of content could lead to millions of false positives and that could not only frame innocent users but could mean that the police become overwhelmed, diverting valuable resources away from real investigations into child sexual abuse.

A study by the Max Planck Institute for the Study of Crime of a similar German law, in force from 2008 to 2010, found that giving the German police access to huge amounts of data had no deterrent effect, did not assist in solving crimes or increasing convictions, but did waste a lot of police time. So it is important that this draconian invasion of privacy is not presented as necessary for protecting children. I share the exasperation of Signal’s president Meredith Whittaker, who challenged the Secretary of State and pointed out that there were some double standards here: for example, slashing early intervention programmes over the past decade did not help protect children, and chronically underfunding and underresourcing child social care does not help either.

My own bugbears are that when I, having talked to social workers and colleagues, raised the dangers to child protection when we closed down schools in lockdown, they were brushed to one side. When I and others raised the horrors of the young girls who had been systematically raped by grooming gangs while the authorities ignored them for many, many years, I was told to stop talking about it. There are real threats to children that we ignore. I do not want us in this instance to use that very emotive discussion to attack privacy.

I also want to stress that there is no complacency here. Law enforcement agencies in the UK already possess a wide range of powers to seize devices and compel passwords and even covertly to monitor and hack accounts to identify criminal activity. That is good. Crucially, private messaging services can and do—I am sure they could do more—work in a wide range of ways to tackle abuse and keep people safe without the need to scan or read people’s messages.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

To answer my noble friend Lady Stowell first, it depends on the type of service. It is difficult to give a short answer that covers the range of services that we want to ensure are covered here, but we are seeking to keep this and all other parts of the Bill technology neutral so that, as services develop, technology changes and criminals, unfortunately, seek to exploit that, technology companies can continue to innovate to keep children safe while protecting the privacy of their users. That is a long-winded answer to my noble friend’s short question, but necessarily so. Ofcom will need to make its assessments on a case-by-case basis and can require a company to use its best endeavours to innovate if no effective and accurate technology is currently available.

While I am directing my remarks towards my noble friend, I will also answer a question she raised earlier on general monitoring. General monitoring is not a legally defined concept in UK law; it is a term in European Union law that refers to the generalised monitoring of user activity online, although its parameters are not clearly defined. The use of automated technologies is already fundamental to how many companies protect their users from the most abhorrent harms, including child sexual abuse. It is therefore important that we empower Ofcom to require the use of such technology where it is necessary and proportionate and ensure that the use of these tools is transparent and properly regulated, with clear and appropriate safeguards in place for users’ rights. The UK’s existing intermediary liability regime remains in place.

Amendment 255 from my noble friend Lord Moylan seeks to prevent Ofcom imposing any requirement in a notice that would weaken or remove end-to-end encryption. He is right that end-to-end encryption should not be weakened or removed. The powers in the Bill will not do that. These powers are underpinned by proportionality and technical feasibility; if it is not proportionate or technically feasible for companies to identify child sexual exploitation abuse content on their platform while upholding users’ right to privacy, Ofcom cannot require it.

I agree with my noble friend and the noble Baroness, Lady Fox, that encryption is a very important and popular feature today. However, with technology evolving at a rapid rate, we cannot accept amendments that would risk this legislation quickly becoming out of date. Naming encryption in the Bill would risk that happening. We firmly believe that the best approach is to focus on strong safeguards for upholding users’ rights and ensuring that measures are proportionate to the specific situation, rather than on general features such as encryption.

The Bill already requires Ofcom to consider the risk that technology could result in a breach of any statutory provision or rule of law concerning privacy and whether any alternative measures would significantly reduce the amount of illegal content on a service. As I have said in previous debates, Ofcom is also bound by the Human Rights Act not to act inconsistently with users’ rights.

Baroness Fox of Buckley (Non-Afl)

Will the Minister write to noble Lords who have been here in Committee and on Report in response to the fact that it is not just encryption companies saying that the demands of this clause will lead to the breaching of encryption, even though the Minister and the Government keep saying that it will not? As I have indicated, a wide range of scientists and technologists are saying that, whatever is said, demanding that Ofcom insist that technology notices be used in this way will inadvertently lead to the breaking of encryption. It would be useful if the Government at least explained scientifically and technologically why those experts are wrong and they are right.

Lord Parkinson of Whitley Bay (Con)

I am very happy to put in writing what I have said from the Dispatch Box. The noble Baroness may find that it is the same, but I will happily set it out in further detail.

I should make it clear that the Bill does not permit law enforcement agencies to access information held on platforms, including access to private channels. The National Crime Agency will be responsible for receiving reports from in-scope services via secure transmission, processing these reports and, where appropriate, disseminating them to other UK law enforcement bodies and our international counterparts. The National Crime Agency will process only information provided to it by the company; where it determines that the content is child sexual abuse content and meets the threshold for criminality, it can request further information from the company using existing powers.

I am glad to hear that my noble friend Lord Moylan does not intend to divide on his amendment. The restrictions it sets out are not ones we should impose on the Bill.

Amendments 256, 257 and 259 in the name of the noble Lord, Lord Stevenson of Balmacara, require a notice to be approved by a judicial commissioner appointed under the Investigatory Powers Act 2016 and remove Ofcom’s power to require companies to make best endeavours to develop or source new technology to address child sexual exploitation and abuse content.

--- Later in debate ---
Lord Allan of Hallam (LD)

My Lords, I just want to make some brief comments in support of the principle of what the noble Lord, Lord Knight, is aiming at in this amendment.

The Bill is going to have a profound impact on children in the United Kingdom. We hope that the most profound impact will be that it will significantly advance their interests in terms of safety online. But it will also potentially have a significant impact on what they can access online and the functionality of different services. They are going to experience new forms of age assurance, about which they may have very strong views. For example, the use of their biometric data to estimate their age will be there to protect them, but they may still have strong views about that.

I have said many times that there may be some measures in the Bill that will encourage services to become 18-plus only. That is not adult in the sense of adult content. Ordinary user-to-user social media services may look at the obligations and say, “Frankly, we would much rather restrict ourselves to users from the UK who identify as being 18-plus, rather than have to take on board all the associated liabilities in respect of children”—not because they are irresponsible, but precisely because they are responsible, and they can see that there is a lot of work to do in order to be legally and safely available to those under 18. For all those reasons, it is really important that the child advocacy body looks at things such as the United Nations Convention on the Rights of the Child and the rights of children to access information, and that it is able to take a view on them.

The reason I think that is important—as will any politician who has been out and spoken in schools—is that very often children are surprising in terms of what they see as their priorities. We make assumptions about their priorities, which can often be entirely wrong. There has been some really good work done on this. There was a project called EU Kids Online, back in the days of the EU, which used to look at children right across the European Union and ask them what their experience of being online was like and what was important to them. There are groups such as Childnet International, which for years has been convening groups of children and taking them to places such as the Internet Governance Forum. That always generates a lot of information that we here would not have thought of, about what children feel is really important to them about their online experience.

For all those reasons, it really would be helpful to institutionalise this in the new regime as some kind of body that looks in the round at children’s interests—their interests to stay safe, but also their interests to be able to access a wide variety of online services and to use the internet as they want to use it. I hope that that strengthens the case the noble Lord, Lord Knight, has made for such a body to exist in some kind of coalition-like format.

Baroness Fox of Buckley (Non-Afl)

My Lords, I am afraid that I have some reservations about this amendment. I was trying not to, but I have. The way that the noble Lord, Lord Allan of Hallam, explained the importance of listening to young people is essential—in general, not being dictated to by them, but to understand the particular ways that they live their lives; the lived experience, to use the jargon. Particularly in relation to a Bill that spends its whole time saying it is designed to protect young people from harm, it might be worth having a word with them and seeing what they say. I mean in an ongoing way—I am not being glib. That seems very sensible.

I suppose my concern is that this becomes a quango. We have to ask who is on it and whether it becomes just another NGO of some kind. I am always concerned about these kinds of organisations when they speak “on behalf of” people. If you have an advocacy body for children that says, “We speak on behalf of children”, that makes me very anxious. You can see that that can be a politically very powerful role, because it seems to have the authority of representing the young, whereas actually it can be entirely fictitious and certainly not democratic or accountable.

The key thing we discussed in Committee, which the noble Lord, Lord Knight of Weymouth, is very keen on—and I am too—is that we do not inadvertently deny young people important access rights to the internet in our attempt to protect them. That is why some of these points are here. The noble Baroness, Lady Kidron, was very keen on that. She wants to protect them but does not want to end up with them being denied access to important parts of the internet. That is all good, but I just think this body is wrong.

The only other thing to draw noble Lords’ attention to—I am not trying to be controversial, but it is worth noting—is that child advocacy is currently in a very toxic state because of some of the issues around who represents children. As we speak, there is a debate about, for example, whether the NSPCC has been captured by Stonewall. I make no comment because I do not know; I am just noting it. We have had situations where a child advocacy group such as Mermaids is now discredited because it is seen to have been promoting chest binders for young people and to have gone down the gender ideology route, which some people would argue is child abuse of a sort, advocating that young women remove their breasts—have double mastectomies. This is all online, by the way.

I know that some people would say, “Oh, you’re always going on about that”, but I raise it because it is a very real and current discussion. I know a lot of people who work in education, with young people or in children’s rights organisations, and they keep telling me that they are tearing themselves apart. I just wondered whether the noble Lord, Lord Knight, might note that there is a danger of walking into a minefield here—which I know he does not mean to walk into—by setting up an organisation that could end up being the subject of major culture wars rows or, even worse, one of those dreaded quangos that pretends it is representing people but does not.