All 3 Lord Griffiths of Burry Port contributions to the Online Safety Act 2023

Wed 19th Apr 2023: Online Safety Bill, Lords Chamber, Committee stage
Tue 9th May 2023: Online Safety Bill, Lords Chamber, Committee stage: Part 1
Thu 25th May 2023: Online Safety Bill, Lords Chamber, Committee stage: Part 2

Online Safety Bill

Lord Griffiths of Burry Port Excerpts
Lord Inglewood (Non-Afl)

My Lords, the noble Baroness, Lady Harding, made possibly one of the truest statements ever uttered in this House when she told us that this is a very complicated Bill. It is complicated to the extent that I have no confidence that I fully understand it and all its ramifications, and a number of other speakers have said the same. For that reason—because I am aware of my own limitations, and I am pretty sure they are shared by others—it is important to have a statement of purpose at the outset to provide the co-ordinates for the discussion we are going to have; I concur with the approach of the noble Lord, Lord Allan, because there is then a framework within which we can hope to achieve an outcome that is both comprehensive and coherent. As a number of noble Lords have said, there are many completely different, or nearly different, aspects to what we are discussing, yet the whole lot have to link together. In the words of E. M. Forster, we have to

“connect the prose and the passion”.

The Minister may say, “We can’t do that at the outset”. I am not so sure. If necessary, we should actually draft this opening section, or any successor to it, as the last amendment to the Bill, because then we would be able to provide an overview. That overview will be important because, just as I am prepared to concede that I do not think I understand it all now, there is a very real chance that I will not understand it all then either. If we have this at the head of the Bill, I think that will be a great help not only to us but to all those who are subsequently going to have to make use of it.

Lord Griffiths of Burry Port (Lab)

My Lords, I want to say something simple in support of what has already been said. If it is true that the Bill’s purposes are already scattered in the course of the Bill and throughout its substance, I cannot see what possible objection there can be to having them extracted and put at the beginning. They are not contentious—they are there already—so let us have them at the beginning to set a direction of travel. It seems so obvious to me.

It is an important Bill. I thank the Minister and his colleagues because they have put an enormous amount of work into this, and of course the Joint Committee has done its work. We have all been sent I cannot say how many briefing papers from interested bodies and so on. It is vital that, as we try to hold as much of this together as we possibly can in taking this very important Bill forward, we should have a sense of purpose and criteria against which we can measure what we eventually go on to discuss, make decisions about and introduce into the body of the Bill. I cannot see that the logic of all that can possibly be faulted.

Of course, there will be words that are slippery, as has been said. I cannot think of a single word, and I have been a lexicographer in my life, that does not lend itself to slipperiness. I could use words that everybody thinks we have in common in a way that would befuddle noble Lords in two minutes. It seems to me self-evident that these purposes, as stated here at the outset of our consideration in Committee, are logical and sensible. I will be hoping, as the Bill proceeds, to contribute to and build on the astounding work that the noble Baroness, Lady Kidron, has laid before us, with prodigious energy, in alerting all kinds of people, not just in your Lordships’ House but across the country, to the issues at stake here. I hope that she will sense that the Committee is rallying behind her in the astute way that she is bringing this matter before us. But again, I will judge outcomes against the provisions in this opening statement, a criterion for judging even the things that I feel passionate about.

The noble Baroness, Lady Morgan, and I have been in our own discussions about different parts of the Bill, about things such as suicide and self-harm. That is content. There are amendments. We will discuss them. Again, we can hold our own decisions about those matters against what we are seeking to achieve as stated so clearly at the outset of the Bill.

I remember working with the noble Lord, Lord Stevenson. It is so fabulous to have him back; the place feels right when he is here. When I was a bit of a greenhorn—he was the organ grinder and I was the monkey—I remember him pleading, at the start of what was then the Data Protection Bill, for a statement like this at the beginning of that Bill. We were told, “Oh, but it is all in the Bill; all the words are there”. Then why not put them at the beginning, so that we can see them clearly and have something against which to measure our progress?

With all these things said, I hope we will not spend too much time on this. I hope we will nod it through, and then I hope we will remind ourselves of what it seeks to achieve as we go on in the interminable days that lie ahead of us. I have one last word as an old, old preacher remembering what I was told when I started preaching: “First, you tell ’em what you’re gonna tell ’em; then you tell ’em; and then you tell ’em what you’ve told ’em”. Let us take at least the first of those steps now.

Online Safety Bill

Lord Griffiths of Burry Port Excerpts
These two amendments would ensure that platforms’ design involves the safest options being on by default. They are two straightforward, common-sense amendments that, as the noble Viscount, Lord Colville—who is not here now—said, balance the understandable concerns about freedom of speech with safety. They do not stop the publication of this objectionable material, but they offer others, particularly the most vulnerable, a real choice about whether they see it. I would argue that it is our minimum duty to make sure these safety protections are on by default. I beg to move.
Lord Griffiths of Burry Port (Lab)

My Lords, it is a pleasure to be collaborating with the noble Baroness, Lady Morgan. We seem to have been briefed by the same people, been to the same meetings and drawn the same conclusions. However, there are some things that are worth saying twice and, although I will try to avoid a carbon copy of what the noble Baroness said, I hope the central points will make themselves.

The internet simply must be made to work for its users above all else—that is the thrust of the two amendments that stand in our names. Through education and communication, the internet can be a powerful means of improving our lives, but it must always be a safe platform on which to enjoy a basic right. It cannot be said often enough that to protect users online is to protect them offline. To create a strict division between the virtual and the public realms is to risk ignoring how actions online can have life and death repercussions, and that is at the heart of what these amendments seek to bring to our attention.

I was first made aware of these amendments at a briefing from the Samaritans, where we got to know each other. There I heard the tragic accounts of those whose loved ones had taken their own lives due to exposure to harmful content online. I will not repeat their accounts—this is not the place to do that—but understanding only a modicum of their grief made it obvious to me that the principle of “safest option by default” must underpin all our decision-making on this.

I applaud the work already done by Members of this House to ensure the safety of young people online. Yet it is vital, as the noble Baroness has said, that we do not create a drop-off point for future users—one at which turning 18 means sudden exposure to the most harmful content lurking online, for it is always there. Those most at risk of suicide due to exposure to harmful content are aged between their late teens and early 20s. In fact, a 2017 inquiry into the suicides of young people found that harmful content had been accessed online in 26% of the deaths of under-20s and 13% of the deaths of 20 to 24-year-olds. It is vital for us to empower users from their earliest years.

In the Select Committee—I see fellow members sitting here today—we have been looking at digital exclusion and the need for education at all levels for those using the internet. Establishing good habits in the earliest years is the right way to start, but it goes on after that, because the world that young people inhabit in adulthood is one in which they are already in control of the internet—if they had that education earlier. Adulthood comes with the freedom to choose how one expresses oneself online—of course it does—but this must not come at the cost of their continuing freedom from the most insidious content that puts their mental health at risk. Much mention has been made of the triple shield and I need not go there again; its origins, and perhaps its deficiencies, have been mentioned already.

The Center for Countering Digital Hate recently conducted an experiment, creating new social media accounts that showed interest in body image and mental health. The study found that TikTok served suicide-related content to new accounts within 2.6 minutes, with eating disorder content being recommended within 8 minutes. At the very least, these disturbing findings tell us that users should have the option to opt in to such content, rather than having to suffer the harm before later opting out. While the option to filter out certain categories of content is essential, it must be toggled on by default if safety is to be our primary concern.

The principle of safest by default creates not only a less harmful environment, but one in which users are in a position to define their own online experience. The space in which we carry out our public life is increasingly located on a small number of social media platforms—those category 1 platforms already mentioned several times—which everyone, from children to pensioners, uses to communicate and share their experiences.

We must then ensure that the protections we benefit from offline continue online: namely, protection from the harm and hate that pose a threat to our physical and mental well-being. When a child steps into school or a parent into their place of work, they must be confident that those with the power to do so have created the safest possible environment for them to carry out their interactions. This basic confidence must be maintained when we log in to Twitter, Instagram, TikTok or any other social media giant.

Baroness Fox of Buckley (Non-Afl)

My Lords, my Amendment 43 tackles Clause 12(1), which expressly says that the duties in Clause 12 are to “empower” users. My concern is to ensure that, first, users are empowered and, secondly, legitimate criticism around the characteristics listed in Clause 12(11) and (12), for example, is not automatically treated as abusive or inciting hatred, as I fear it could be. My Amendment 283ZA specifies that, in judging content that is to be filtered out after a user has chosen to switch on various filters, the providers act reasonably and pause to consider whether they have “reasonable grounds” to believe that the content is of the kind in question—namely, abusive or problematic.

Anything under the title “empower adult users” sounds appealing—how can I oppose that? After all, I am a fan of the “taking back control” form of politics, and here is surely a way for users to be in control. On paper, replacing the “legal but harmful” clause with giving adults the opportunity to engage with controversial content if they wish, through enhanced empowerment tools, sounds positive. In an earlier discussion of the Bill, the noble Baroness, Lady Featherstone, said that we should treat adults as adults, allowing them to confront ideas with the

“better ethics, reason and evidence”—[Official Report, 1/2/23; col. 735.]

that has been the most effective way to deal with ideas from Socrates onwards. I say, “Hear, hear” to that. However, I worry that, rather than users being in control, there is a danger that the filter system might infantilise adult users and disempower them by hard-wiring into the Bill a duty and tendency to hide content from users.

There is a general weakness in the Bill here. I have noted that some platforms are based on users moderating their own sites—an approach I am quite keen on—but this will be detrimentally affected by the Bill. It would leave users notionally in charge of their own moderation, yet with no powers to decide what is in, for example, Wikipedia or other Wikimedia projects, which are added to, organised and edited by a decentralised community of users. So I will certainly not take the phrase “user empowerment” at face value.

I am slightly concerned about linguistic double-speak, or at least confusion. The whole Bill is being brought forward in a climate in which language is weaponised in a toxic minefield—a climate of, “You can’t say that”. More nerve-rackingly, words and ideas are seen as dangerous and interchangeable with violent acts, in a way that needs to be unpicked before we pass this legislation. Speakers can be cancelled for words deemed to threaten listeners’ safety—but not physical safety; the opinions are said to be unsafe. Opinions are treated as though they cause damage or harm as viscerally as physical aggression. So lawmakers have to recognise the cultural context and realise that the law will be understood and applied in it, not in the abstract.

I am afraid that the language in Clause 12(1) and (2) shows no awareness of this wider backdrop—it is worryingly woolly and vague. The noble Baroness, Lady Morgan, talked about dangerous content, and all the time we have to ask, “Who will interpret what is dangerous? What do we mean by ‘dangerous’ or ‘harmful’?”. Surely a term such as “abusive”, which is used in the legislation, is open to wide interpretation. Dictionary definitions of “abusive” include words such as “rude”, “insulting” and “offensive”, and it is certainly subjective. We have to query what we mean by these terms when some commentators complain that they have been victims of online abuse but, when you check their timelines, you notice that they have actually been subjected merely to angry, and sometimes justified, criticism.

I recently saw a whole thread arguing that the Labour Party’s recent attack ads against the Prime Minister were an example of abusive hate speech. I am not making a point about this; I am asking who gets to decide. If this is the threshold for filtering content, there is a danger of institutionalising safe-space echo chambers. “Abuse” can also be a confusing word for users: if someone applies a user empowerment tool to protect themselves from abuse, the threshold at which the filter operates could be much lower than they intend or envisage. By definition, the user would not know what had been filtered out in their name, and they have no control over the filtering because they never see the filtered content.

Online Safety Bill

Lord Griffiths of Burry Port Excerpts
The Government accept that there is a problem. Internet users broadly accept that there is a problem. It must be sensible, in deciding on categorisation, to look at the risk of harm caused by the platforms. I beg to move.
Lord Griffiths of Burry Port (Lab)

My Lords, I will speak to Amendment 192A. There can be nothing more comfortable within the terms of parliamentary debate than to find oneself cosseted by the noble Baroness, Lady Morgan, on one side and my noble friend Lord Stevenson on the other. I make no apology for repeating the thrust of the noble Baroness’s argument, but I will narrow the focus to matters she hinted at which we need to think about in a particular way.

We have already debated suicide, self-harm and eating disorder content hosted by category 1 providers. There is a need for the Bill to do more here, particularly through strengthening the user empowerment duties in Clause 12 so that the safest option is the default. We have covered that ground. This amendment seeks to address the availability of such content on smaller services that will fall outside category 1, as the noble Baroness has said. The conditions under which services will be determined to fall within category 1 are still to be settled; we await further progress on that. However, there are medium-sized and small providers whose activities we need to look at. It is worth repeating—and I am aware that I am repeating—that these include suicide and eating disorder forums whose main business is the sharing and discussion of methods, and encouragement to engage in these practices. In other words, they are set up precisely to do that.

We know that there are smaller platforms where users share detailed information about methods of suicide. One of these in particular has been highlighted by families and coroners as playing a role in the suicides of individuals in the UK. Regulation 28 reports—that is, official requests for action—have been issued to DCMS and DHSC by coroners to prevent future comparable deaths.

A recent systematic review looking at the impact of suicide and self-harm-related videos and photographs showed that potentially harmful content is concentrated specifically on sites with low levels of moderation. Much of the material which promotes and glorifies this behaviour is unlikely to be criminalised through the Government’s proposed new offence of encouragement to serious self-harm. For example, we would not expect all material which provides explicit instructional information on how to take one’s life using novel and effective methods to be covered by it.

The content has real-world implications. There is clear evidence that when a particular suicide method becomes better known, the effect is not simply that suicidal people switch from one intended method to the novel one, but that suicides occur in people who would not otherwise have taken their own lives. There are, therefore, important public health reasons to minimise the discussion of dangerous and effective suicide methods.

The Bill’s pre-legislative scrutiny committee recommended that the legislation

“adopt a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model”.

This amendment is in line with that recommendation, seeking to extend category 1 regulation to services that carry a high level of risk.

The previous Secretary of State appeared to accept this argument—but we have had a lot of Secretaries of State since—and announced a deferred power that would have allowed for the most dangerous forums to be regulated; but the removal of the “legal but harmful” provisions from the legislation means that this power is no longer applicable, as its function related to the “adult risk assessment” duty, which is no longer in the Bill.

This amendment would not shut down dangerous services, but it would make them accountable to Ofcom. It would require them to warn their users of what they were about to see, and it would require them to give users control over the type of content that they see. That is, the Government’s proposed triple shield would apply to them. We would expect that this increased regulatory burden on small platforms would make them more challenging to operate and less appealing to potential users, and would diminish their size and reach over time.

This amendment is entirely in line with the Government’s own approach to dangerous content. It simply seeks to extend the regulatory position that they themselves have arrived at to the very places where much of the most dangerous content resides. Amendment 192A is supported by the Mental Health Foundation, the Samaritans and others that we have been able to consult. It is similar to Amendment 192, which we also support, but this one specifies that the harmful material that Ofcom must take account of relates to self-harm, suicide and eating disorders. I would now be more than happy to give way—eventually, when he chooses to do so—to my noble friend Lord Stevenson, who is not expected at this moment to use the true and full extent of his abilities at being cunning.

Baroness Bull (CB)

My Lords, I rise to offer support for all the amendments in this group, but I will speak principally to Amendment 192A, to which I have added my name and which the noble Lord, Lord Griffiths, has just explained so clearly. It is unfortunate that the noble Baroness, Lady Parminter, cannot be in her place today. She always adds value in any debate, but on this issue in particular I know she would have made a very compelling case for this amendment. I will speak principally about eating disorders, because the issues of self-harm have already been covered and the hour is already late.

The Bill as it stands presumes a direct relationship between the size of a platform and its potential to cause harm. This is simply not the case: a systematic review, which we have heard mentioned, confirmed what all users of the internet already know—that potentially harmful content is often and easily found on smaller, niche sites that will fall outside the scope of category 1. These sites are absolutely not hard to find—they come up on the first page of a Google search—and some hide in plain sight, masquerading, particularly in the case of eating disorder forums, as sources of support, solace or factual information when in fact they encourage and assist people towards dangerous practices. Without this amendment, those sites will continue spreading their harm, and eating disorders will continue to have the highest mortality rate of all mental illnesses in the UK.