Online Harms Consultation

Wednesday 16th December 2020

Lords Chamber
Lord Stevenson of Balmacara (Lab) [V]

My Lords, we welcome moves to protect children and the vulnerable online. We have been calling on the Government to introduce legislation in this area for several years. Their recent record, particularly on age verification, has been—let us call it—patchy. The Statement says that the UK will lead the way with online harms legislation, and we agree that this is a once-in-a-generation chance to legislate for the kind of internet that we all want to see—one that allows access to information, entertainment and knowledge on an unparalleled scale but at the same time keeps children and vulnerable adult citizens safe, and allows people to control the kind of content that they and those for whom they are responsible see online. Social media platforms have failed for years to self-regulate and we must not miss the opportunity afforded by the forthcoming legislation.

We welcome the announcement that Ofcom will be the regulator in this area. The duties to be allocated to it play to its founding principles, which require it to have regard to users of the services that it regulates as both consumers and citizens. We endorse the duty of care approach to regulation, which, if properly legislated for, has the potential to transform the way in which companies relate to their users. The excellent work done on that approach by the Carnegie UK Trust—in particular, Professor Lorna Woods and William Perrin—should be recognised. We support the measures announced in the Statement that seek to protect and enhance freedom of expression. In general, in so far as we can judge the Government’s current legislative intentions, there appears to be a workable and effective scheme of regulations here—but they should get on with it.

As to our concerns, does the Minister agree that the essential principle in play is that what is illegal in the real world must be illegal in the virtual world? The corollary, however, is that we need to be clear that our existing laws are fit for purpose and up to date. What plans do the Government have in this complex area? The test for regulatory or criminal action is to be “reasonably foreseeable harm” to individuals, and criminal acts. But what happens to concerns about systems? If the focus falls on individuals rather than on the systems that social networks operate, harms to society arising from disinformation or other threats to the integrity of the electoral process, for example, may not be in scope. That simply does not make sense. Does she agree that limiting the regulator to cases where individual harm has to be proven seems unduly restrictive?

Only the largest and riskiest companies will fall into category 1. If they do, they will need to reduce the chance of harm to adults from material that, though not illegal, is harmful, which will presumably involve working with the regulator to reduce such harms as hate speech and self-harm. However, many of the most egregious examples of such activity have come from small companies. Why is size selected as a basis for this categorisation?

The financial and other penalties are welcome but there must be concerns about reach and scope, as many of the companies likely to be affected are based outwith the UK. Also, can the noble Baroness explain why the Government are not insisting on primary legislation to ensure that criminal liability will attach to senior executives for serious and repeated breaches of the law? Can she explain precisely what is meant by the move to the novel concept of “age assurance”? Age verification was the preferred option until recently. Has that now been dropped? Can we be assured that some means will be found to include fraud and financial scamming, possibly through joint action between regulators such as the FCA?

Finally, it is proposed that Ofcom will be empowered to accept “super-complaints”. That is welcome, but it recalls the department’s recent failure to review in time the need for a similar power in the Data Protection Act. Can the noble Baroness update me on progress in that situation and confirm that this legislation could be used to redress it?

Lord Clement-Jones (LD)

My Lords, over three years have elapsed and three Secretaries of State have come and gone since the Green Paper, in the face of a rising tide of online harms, not least during the Covid period, as Ofcom has charted. On these Benches, therefore, we welcome the set of concrete proposals we finally have to tackle online harms through a duty of care. We welcome the proposal for pre-legislative scrutiny, but I hope that there is a clear and early timetable for this to take place.

As regards the ambit of the duty of care, children are of course the first priority in prevention of harm, but it is clear that social media companies have failed to tackle the spread of fake news and misinformation on their platforms. I hope that the eventual definition in the secondary legislation includes a wide range of harmful content such as deep fakes, Holocaust denial and anti-Semitism, and misinformation such as anti-vax and QAnon conspiracy theories.

I am heartened too by the Government’s plans to consider criminalising the encouragement of self-harm. I welcome the commitment to keeping a balance with freedom of expression, but surely the below-the-line exemption proposed should depend on the news publisher being Leveson-compliant in how it is regulated. I think I welcome the way that the major impact of the duty of care will fall on big-tech platforms with the greatest reach, but we on these Benches will want to kick the tyres hard on the definition, threshold and duties of category 2 to make sure that this does not become a licence to propagate serious misinformation by some smaller platforms and networks.

I welcome the confirmation that Ofcom will be the regulator, but the key to success in preventing online harms will be whether Ofcom has teeth. Platforms will need to demonstrate how they have reduced the “reasonably foreseeable” risk of harm occurring from the design of their services. In mitigating the risk of “legal but harmful content”, this comes down to the way in which platforms facilitate and even encourage the sharing of extreme or sensationalist content designed to cause harm. As many excellent bodies such as Reset, Avaaz and Carnegie UK have pointed out—as the noble Lord, Lord Stevenson, said, the latter is the begetter of the duty of care proposal—this means having the power of compulsory audit. Inspection of the algorithms that drive traffic on social media is crucial.

Will Ofcom be able to make a direction to amend a recommender algorithm, how a “like” function operates and how content is promoted? Will it be able to inspect the data on which an algorithm is trained and operates? Will Ofcom be able to insist that platforms can establish the identity of a user and address the issue of fake accounts, or that paid content is labelled? Will it be able to require platforms to issue fact-checked corrections to scientifically inaccurate posts? Will Ofcom work hand in hand with the Internet Watch Foundation? International co-ordination will be vital.

If online scams, fraud, and fake and misleading online reviews are explicitly excluded from this legislation, Ofcom will also need to work closely with the CMA to protect their vulnerable victims. Ofcom will need to work with the ASA to regulate harmful online advertising as well. It will also need to work with the Gambling Commission on the harms of online black-market gambling, as was highlighted yesterday by my noble friend Lord Foster.

How will this new duty of care mesh with compliance with the age-appropriate design code, regulated by the ICO? As the noble Lord, Lord Stevenson, has mentioned, the one major fudge in the response is on age verification. The proposals do not meet the objectives of the original Part 3 of the Digital Economy Act. We were promised action when the response arrived, but we have a much watered-down proposal. Pornography is increasingly available and accessible to young people on more sites than just those with user-generated content. How do the Government propose to tackle this ever more pressing problem? There are many other areas that we will want to examine in the pre-legislative process and when the Bill comes to this House.

As my honourable friend Jamie Stone pointed out in the Commons yesterday, a crucial component of minimising risk online is education. Schools need to educate children about how to use social media responsibly. What commitment do the Government have to online media education? When will the strategy appear and what resources will be devoted to it?

These are some of the yet unanswered questions before the draft legislation arrives, but I hope that the Government commit to a full debate early in the new year so that some of these issues can be unpacked at the same time as the pre-legislative scrutiny process starts.

The Parliamentary Under-Secretary of State, Department for Digital, Culture, Media and Sport (Baroness Barran) (Con)

I thank both noble Lords for welcoming this full response to the consultation. I am happy to echo them both in their thanks, in particular to Carnegie UK and the important work it has done. We hope very much that the Bill will bring us into an age of accountability for big tech.

In response to the point made by the noble Lord, Lord Stevenson, what is illegal in the real world should indeed be illegal in the digital world. This Bill, when it comes, will help us move towards that. He raised the question of the focus on individuals. Obviously, the level of harm, in terms of how many individuals are affected, will be relevant to the sanctions that Ofcom can enforce. But he also raised a wider and very important point about trust in our institutions; clearly, social media and big tech platforms are institutions in which the level of trust has been tremendously eroded in recent years. We want to restore that trust, so that what the big tech platforms say they will do is actually what happens in practice.

Both noble Lords asked about the category 1 companies, how those are defined and whether we will miss important actors as a result of that definition. The definition of category 1 businesses will be based on audience size but also on the functionality that they offer. For example, the ability to share content widely or to contact users anonymously, which are obviously higher-risk characteristics, could put a platform with a smaller audience into category 1. Ofcom will publish the thresholds for these factors, assess companies against those thresholds and then publish a list of them. To be clear, all companies working in this area with user-generated content have to tackle all illegal content, and they have to protect children in relation to legal but harmful content. We are building safety by design into our approach from the get-go.

The noble Lord, Lord Stevenson, asked about criminal liability; we are not shying away from it. Indeed, the powers to introduce criminal liability for directors are, as he knows, being included in the Bill and can be introduced via secondary legislation. We would just rather give the technology companies a chance to get their house in order. The significant fines that can be levied—up to 10% of the turnover of the parent company or £18 million, whichever is higher—are obviously, for the larger tech companies, very substantial sums of money. We think that those fines will help to focus their minds.

The noble Lord, Lord Clement-Jones, talked about legal but harmful content. This is a very important and delicate area. We need to protect freedom of expression; we cannot dictate that legal content should automatically be taken down. That is why we agree with him that a duty of care is the right way forward. He questioned whether this would be sufficient to protect children. Our aim, and our number one priority, throughout this is clearly the protection of children.

The noble Lord, Lord Clement-Jones, asked a number of questions about Ofcom. I might not have time to answer them all now, but we believe that the Bill will give Ofcom the tools it needs to understand, through transparency reports, how to address the harms that need addressing, and to take action if needed. Ofcom will have extensive powers in order to achieve this. He also mentioned international co-ordination. We are clearly very open to working with other countries and regulators and are keen to do so.

Both noble Lords questioned whether the shift from age verification to age assurance is in some way a step backwards. We really do not believe that this is the case. We think that when the Bill comes, its scope will be very broad. We expect companies to use age-assurance or age-verification technologies to prevent children accessing services that pose the highest risk of harm to them, such as online pornography. The legislation will not mandate the use of specific technological approaches because we want it to be future-proofed. The emphasis will be on the duty of care and the undiluted responsibility of the tech companies to provide sufficient protection to children. We are therefore tech neutral in our approach, but we expect the regulator to be extremely robust towards those sites that pose the highest risk of harm to children.

The noble Lord, Lord Clement-Jones, also asked about our media literacy strategy, which we are working on at the moment.