Freedom of Expression (Communications and Digital Committee Report) Debate

Department: Department for Digital, Culture, Media & Sport

Thursday 27th October 2022

Lords Chamber
Moved by
Lord Gilbert of Panteg

To move that this House takes note of the report from the Communications and Digital Committee Free for All? Freedom of Expression in the Digital Age (1st Report, Session 2021-22, HL Paper 54).

Lord Gilbert of Panteg (Con)

My Lords, I am pleased to introduce this debate on the report of the Communications and Digital Committee, Free for All? Freedom of Expression in the Digital Age. I am very grateful to our outstanding committee staff. Our clerk was Alasdair Love and our policy analyst was Theo Demolder. Rita Cohen once again provided them and us with invaluable support and Dr Ella McPherson provided expert advice throughout the inquiry.

I am grateful too to noble Lords on the committee, many of whom are speaking today and all of whom brought great experience and expertise to this report. The committee is thriving under the fine leadership of my noble friend Lady Stowell of Beeston; I am very much looking forward to her contribution today.

This report was published under my chairmanship in July last year, since when there have been many significant developments and changes in digital regulation and more widely. I was privileged to sit on the Joint Scrutiny Committee for the Online Safety Bill, which reported at the end of last year.

Having heard the debate on the demonstrations in Iran, we have to reflect that free speech is still something to be cherished and something that brave people are dying for today. Freedom of expression is about not being prevented from speaking one’s own mind. It is the bedrock of free societies. Although it is subject to important legal limits, including prohibitions on incitement to violence and on defamation, we must remember what Lord Justice Warby referred to in one judgment as

“the well-established proposition that free speech encompasses the right to offend, and indeed to abuse another.”

It was evidence taken during our previous inquiry on the future of journalism that led us to turn to freedom of expression. We had heard about how the market power of Google and Facebook was threatening media freedom. I am very pleased that the committee is continuing to champion the media and pursue our recommendation for an Australian-style mandatory bargaining code to ensure that publishers receive fair compensation for the use of their content.

It was clear from the outset of this inquiry that there are two major problems online. The first is the dissemination by platforms of the worst kind of content: that which is either illegal or harmful to children. The other problem is the opposite: platforms removing legitimate content or treating some political viewpoints more favourably than others. Among many examples we heard about were Twitter banning Donald Trump while still allowing Ayatollah Khamenei to incite violence against Israel and praise jihadi groups; and Facebook choosing to treat a New York Post story as misinformation, with no evidence, at the same time as taking no action against Chinese state media when they spread lies about the genocide in Xinjiang.

At the core of these twin problems of aggressive promotion of harmful content on the one hand and overremoval of posts on the other is the dominance of the big platforms. Their monopoly power means that they do not have to respond to users’ concerns about safety or free speech. These companies have monopolised the digital public square, shutting out new entrants that might be able to provide better services.

Tough competition regulation would unleash the power of the market to raise standards. It is a central part of the approach that we recommend in our report and we concluded that it was urgent. The delay in bringing forward legislation on the Digital Markets Unit is disappointing. I hope the Minister will agree that swiftly fixing broken markets to increase competition is the right and indeed Conservative thing to do.

There are two other pillars to the holistic approach we recommend which have not received enough attention. One is digital citizenship initiatives. Schools and public information campaigns both have a role to play in improving online behaviour. One person’s abuse of their right to freedom of expression can have a chilling effect on others, leaving them less able to express themselves freely. There is now much evidence that it is most often women and girls who are being silenced by others online. However, regulation is not the only answer here. Alongside really joined-up, consistent citizenship initiatives, an improvement in our public discourse would be a good start. Lord Williams of Oystermouth told us that “abrasive and confrontational styles” of discussion

“do not come from nowhere.”

Indeed. Politicians and other public figures should be setting a better example, showing that we can disagree while respecting those we are arguing with and not condemning as extremists those who have different viewpoints from our own.

The other pillar is regulation of the design of the biggest platforms. Freedom of expression is the right to speak out, but there is no corresponding obligation on others to listen. We called for users to be empowered with tools to filter the type of content they are shown. Everyone has their own individual sensitivities and preferences and only they, if they are an adult, can really decide what they want to see. I am glad that the Government have gone some way in implementing this with new clauses in the Online Safety Bill, which I will come to in a moment.

It is not the existence of individual pieces of content, which in some circumstances and to some people can be harmful, that is the problem, but the way in which algorithms serve up that content in unrelenting barrages. The devastating impact of these business models was laid bare in the astonishing evidence at the inquest into the death of Molly Russell, which we would never have seen were it not for the persistence and courage of her father, Ian Russell. The horrendous material that was targeted, promoted and recommended to Molly changed her perception of herself and her options. Seeing the systemic nature of her abuse in the coroner’s court will help us to take action to save lives, and I hope that Ian and his family find some comfort in that.

Design regulation means ensuring that the largest platforms’ content-curation algorithms, choice architecture and reward mechanisms are not set up to encourage users’ worst instincts and to spread the most unpleasant content the most quickly. Such measures would get to the heart of those business models, which centre on keeping users logged in and viewing adverts for as long as possible, even if that means stoking outrage.

We should be taking different approaches to protect children and adults. For adults, we want a space where they do not find manifestly illegal material but can control their own online environment. That means insisting that platforms put power in their hands, as I have described: an approach that allows adults, in effect, to create their own algorithms through measures such as interoperability.

When it comes to children, we want to protect them from content that is not appropriate for their age, but surely we want more than that. We should be aspiring to an online environment that is positive and enriching, and which helps them to grow and learn safely: a space where their privacy is respected and where every stage of the design process puts these objectives ahead of the financial interests of the platform.

It is obvious, then, that platforms and other online services need to know the age of their users. The way in which they do this and the degree of certainty they would need will depend on the risk of children using the service and the risk of children encountering harmful material or design features if they do. That is why, while I will passionately champion free speech when we come to the Online Safety Bill, I will also support the call led by the noble Baroness, Lady Kidron, for a set of standards for age-assurance technology and approaches that preserve privacy. Well-designed and proportionately regulated age assurance is the friend, not the enemy, of free speech.

I have outlined the approach favoured by the committee in its report, and now I turn to the Bill’s approach. We have been told repeatedly by officials and Ministers that the Online Safety Bill is simply about platforms, systems and processes, rather than content. This is incorrect. These are systems and processes to remove content. Their compliance with the legislation will be judged according to the presence of content, even if a single piece of content would not be enough for a platform to be deemed non-compliant.

The “legal but harmful” duty has been the subject of so much debate. Its supporters are right that it is not straightforwardly a duty to remove content; it is about platforms choosing a policy on a given type of legal but harmful content and applying it consistently. However, this is not nearly as simple or as innocuous as it sounds. The vagueness of the concept of harm gives Ofcom significant definitional power. For example, a statutory instrument might designate misinformation which has an adverse physical or psychological impact as a priority category which platforms must include in their terms and conditions. A platform that said that it would not allow such misinformation could be penalised by Ofcom for not removing content which the regulator feels meets this standard but which the platform does not, because the platform either does not believe it is untrue or does not believe it is harmful.

When we asked why it would not be simpler to criminalise given harms, part of the response was that many of those legal harms are so vague as to be impossible to define in law. It is not clear why that would not also make them impossible to regulate. As a committee, we have always felt it a crucial point of principle to focus on the evidence in front of us, and when we did, on this issue, a consensus quickly emerged that the “legal but harmful” provisions are unworkable and would present a serious threat to freedom of expression. They should be removed from the Bill.

We also raised concern about the duty to remove illegal content as currently drafted. The problems with this duty have not received nearly as much attention as the “legal but harmful” duty, but might, I fear, be significantly more dangerous. Unlike with “legal but harmful”, this is straightforwardly a duty to remove content. Of course no one wants illegal content online. If it really is illegal, it should be removed, but we are asking platforms to make decisions which would otherwise be left to the courts. Prosecuting authorities have the time and resource to investigate and examine cases in great detail, understanding the intent, context and effect of posts. Platforms do not. Neither platforms’ content moderation algorithms nor their human moderators are remotely qualified to judge the legality of speech in ambiguous cases.

The new communications offences in Part 10 of the Bill, which have merit, show the problem most clearly. A platform will have to remove posts which it has reasonable grounds to believe are intended and likely to cause serious distress to a likely audience, having considered whether there might be a reasonable public interest defence. Even courts would struggle with this.

If we oblige platforms to remove posts that they have “reasonable grounds to believe” might be illegal, there is a real danger, surely, that they will simply remove swathes of content to be on the safe side, taking it down if there is the slightest chance it may be prohibited. There is no incentive for them to consider freedom of expression, other than some duties to “have regard for” its importance, which are currently much too weak. Legitimate speech will become collateral damage.

I do not pretend that we have all the answers to these concerns about how to ensure proportionality and accuracy in removing potentially illegal content, but I know that this is something the Government have been looking at. Can my noble friend tell us whether the Government acknowledge the concern about overremoval of legal content, and whether consideration has been given to solutions which could include a clear and specific duty on Ofcom to have regard for freedom of expression in designing codes and guidance and in using enforcement powers, or, more fundamentally, a change in the standard from “reasonable grounds to believe” to “manifestly illegal”?

The committee in its report found the drafting of the Bill to be vague in parts, perhaps because it is born of a desire to find some way of getting rid of all the bad things on the internet while avoiding unintended consequences. As Susie Alegre, a leading human rights lawyer at Doughty Street Chambers, put it, the Bill is so unclear that

“it is impossible to assess what it will mean in practice, its proportionality in relation to interference with human rights, or the potential role of the Online Safety Bill in the prevention of online harms.”

Ofcom will be left to try to make sense of and implement it. Ofcom is rightly a very well-respected regulator, but it is wrong to hand any regulator such sweeping powers over something so fundamental as what citizens are allowed to say online. There is no analogy in the offline world.

Think of how contested BBC impartiality is. Imagine how much more furious the debate about Ofcom’s impartiality will be when both sides of a highly contested issue claim that platforms are wrongly taking their posts down and leaving their opponents’ posts up, demanding that Ofcom take action to tackle what they see as harm.

The only winners from all this will be the likes of Facebook and Google. The Online Safety Bill will leave their business models fundamentally unscathed while creating obligations which only they can afford to meet. New entrants to the market will be crushed under the compliance burden.

Before I conclude, on enforcement, it is sometimes said that the internet is a Wild West. It is not. We are right to put in place regulatory regimes across the digital landscape and, for all its flaws, this Bill is an important step. However, the report identified 12 existing criminal offences and a number of civil law protections that are already in place, and which are especially relevant to the online world. These offences already cover many of the behaviours online that we most worry about. The problem is not a lack of laws but a failure to enforce existing legislation. We called on the Government to ensure that existing laws are enforced and to explore mechanisms for platforms to fund this, and to require platforms to preserve deleted posts for a fixed period.

It will soon be time for this House to turn its attention to detailed scrutiny of the Online Safety Bill. I hope that noble Lords will find the committee’s report and today’s debate a useful preparation. I firmly believe that the approach that we suggest would make the internet safer and freer than would the current proposal. I would like to see an Online Safety Bill that focuses on platform design and content which is manifestly illegal, and which goes much further to protect children. It must also contain strong incentives for platforms not to take down legal content, including a prohibition on removing content from legitimate news publishers.

Parliament must provide ongoing scrutiny of the online safety regime, competition and all areas of digital regulation, to help regulators do their jobs effectively and to ensure that their powers are never again so completely overtaken by changes in the digital world.

I look forward to hearing from my noble friend the Minister, and warmly congratulate him on his appointment. I am sure that he will approach this debate and the Online Safety Bill with characteristic depth of thought. I beg to move.

--- Later in debate ---
Lord Gilbert of Panteg (Con)

I will be very brief. The internet, let us be clear, has given voice to many marginalised people and in so many ways has transformed our lives for the better. What we have seen today is a really serious and constructive debate about what we need to do to deal with the societal issues that have come with the digitalisation of the world that we live in.

I thank all noble Lords who gave such insightful contributions today, in particular my noble friend the Minister for his response, and especially the noble Baroness, Lady Merron, and the noble Lord, Lord Clement-Jones. They demonstrated that the House really wants to come together to fix these issues, and I hope that my noble friend will seek a cross-party approach to this legislation and engage the whole House in coming up with the solutions that we need to resolve these problems. I hope, too, that he will pass on our thanks to his officials and to the succession of Ministers who came to see us; his officials were very generous with their time.

I will also take this opportunity, on behalf of the committee, to thank Ofcom for engaging with us. I am confident that its people are the right ones for this job; they will do excellent work, and we need to hand them a seriously workable piece of legislation, while not forgetting our role as Parliament in asserting societal priorities as Ofcom takes this task forward.

Motion agreed.