Debates between Lord Parkinson of Whitley Bay and Lord Knight of Weymouth during the 2019 Parliament

Wed 6th Sep 2023
Wed 19th Jul 2023
Mon 17th Jul 2023
Wed 12th Jul 2023
Mon 10th Jul 2023 - Online Safety Bill, Lords Chamber (Report stage: Part 1)
Thu 22nd Jun 2023 - Online Safety Bill, Lords Chamber (Committee stage: Part 2)
Thu 22nd Jun 2023 - Online Safety Bill, Lords Chamber (Committee stage: Part 1)
Thu 25th May 2023 - Online Safety Bill, Lords Chamber (Committee stage: Part 1)
Tue 23rd May 2023 - Online Safety Bill, Lords Chamber (Committee stage: Part 2)
Tue 9th May 2023 - Online Safety Bill, Lords Chamber (Committee stage: Part 1)
Tue 2nd May 2023 - Online Safety Bill, Lords Chamber (Committee stage: Part 2)
Tue 2nd May 2023 - Online Safety Bill, Lords Chamber (Committee stage: Part 1)
Thu 27th Apr 2023 - Online Safety Bill, Lords Chamber (Committee stage: Part 2)
Thu 27th Apr 2023 - Online Safety Bill, Lords Chamber (Committee stage: Part 1)
Tue 25th Apr 2023 - Online Safety Bill, Lords Chamber (Committee stage: Part 1)
Mon 20th Jun 2022

Online Safety Bill

Debate between Lord Parkinson of Whitley Bay and Lord Knight of Weymouth
Lord Knight of Weymouth (Lab)

My Lords, I do not know how everyone has spent their summer, but this feels a bit like we have been working on a mammoth jigsaw puzzle and we are now putting in the final pieces. At times, through the course of this Bill, it has felt like doing a puzzle in the metaverse, where we have been trying to control an unreliable avatar that is actually assembling the jigsaw—but that would be an unfair description of the Minister. He has done really well in reflecting on what we have said, influencing his ministerial colleagues in a masterclass of managing upwards, and coming up with reasonable resolutions to previously intractable issues.

We are trusting that some of the outcome of that work will be attended to in the Commons, as the noble Baroness, Lady Morgan, has said, particularly the issues that she raised on risk, that the noble Baroness, Lady Kidron, raised on children’s safety by design, and that my noble friend Lady Merron raised on animal cruelty. We are delighted at where we think these issues have got to.

For today, I am pleased that the concerns of the noble Baroness, Lady Stowell, on Secretary of State powers, which we supported, have been addressed. I also associate myself with her comments on parliamentary scrutiny of the work of the regulator. Equally, we are delighted that the Minister has answered the concerns of my noble friend Lady Kennedy and that he has secured the legislative consent orders which he informed us of at the outset today. We would be grateful if the Minister could write to us answering the points of my noble friend Lord Rooker, which were well made by him and by the Delegated Powers Committee.

I am especially pleased to see that the issues which we raised at Report on remote access have been addressed. I feel smug, as I had to press quite hard for the Minister to leave the door open to come back at this stage on this. I am delighted that he is now walking through the door. Like the noble Lord, Lord Allan, I have just a few things that I would like clarification on—the proportional use of the powers, Ofcom taking into account user privacy, especially regarding live user data, and that the duration of the powers be time-limited.

Finally, I thank parliamentarians on all sides for an exemplary team effort. With so much seemingly falling apart around us, it is encouraging that, when we have common purpose, we can achieve a lot, as we have with this Bill.

Lord Parkinson of Whitley Bay (Con)

My Lords, let me first address the points made by the noble Lord, Lord Rooker. I am afraid that, like my noble friend Lady Stowell of Beeston, I was not aware of the report of your Lordships’ committee. Unlike her, I should have been. I have checked with my private office and we have not received a letter from the committee, but I will ask them to contact the clerk to the committee immediately and will respond to this today. I am very sorry that this was not brought to my attention, particularly since the members of the committee met during the Recess to look at this issue. I have corresponded with my noble friend Lord McLoughlin, who chairs the committee, on each of its previous reports. Where we have disagreed, we have done so explicitly and set out our reasons. We have agreed with most of its previous recommendations. I am very sorry that I was not aware of this report and have not had the opportunity to provide answers for your Lordships’ House ahead of the debate.

Online Safety Bill

Debate between Lord Parkinson of Whitley Bay and Lord Knight of Weymouth
Lord Knight of Weymouth (Lab)

My Lords, I have good news and bad news for the Minister. The good news is that we have no problem with his amendments. The bad news, for him, is that we strongly support Amendment 245 from the noble Baroness, Lady Morgan of Cotes, which, as others have said, we think is a no-brainer.

The beauty of the simple amendment has been demonstrated; it just changes the single word “and” to “or”. It is of course right to give Ofcom leeway—or flexibility, as the noble Baroness, Lady Finlay, described it—in the categorisation and to bring providers into the safety regime. What the noble Baroness, Lady Morgan, said about the smaller platforms, the breadcrumbing relating to the Jake Davison case and the functionality around bombarding Zoe Lyalle with those emails told the story that we needed to hear.

As it stands, the Bill requires Ofcom to always be mindful of size. We need to be more nuanced. From listening to the noble Lord, Lord Allan of Hallam—with his, as ever, more detailed analysis of how things work in practice—my concern is that in the end, if it is all about size, Ofcom will end up having to bring a much larger number of services into scope under the size-based categorisation in order to cover all the platforms that it is worried about. If we could give flexibility around size or functionality, that would make the job considerably easier.

We on this side think categorisation should happen with a proportionate, risk-based approach. We think the flexibility should be there, and the Minister is reasonable—come on, what’s not to like?

Lord Parkinson of Whitley Bay (Con)

My Lords, I shall explain why the simple change of one word is not as simple as it may at first seem. My noble friend’s Amendment 245 seeks to amend the rule that a service must meet both a number-of-users threshold and a functionality threshold to be designated as category 1 or 2B. It would instead allow the Secretary of State by regulation to require a service to have to meet only one or other of the two requirements. That would mean that smaller user-to-user services could be so categorised by meeting only a functionality threshold.

In practical terms, that would open up the possibility of a future Secretary of State setting only a threshold condition about the number of users, or alternatively about functionality, in isolation. That would create the risk that services with a high number of users but limited functionality would be caught in scope of category 1. That could be of particular concern to large websites that operate with limited functionality for public interest reasons, and I am sure my noble friend Lord Moylan can think of one that fits that bill. On the other hand, it could capture a vast array of low-risk smaller services merely because they have a specific functionality—for instance, local community fora that have livestreaming capabilities. So we share the concerns of the noble Lord, Lord Allan, but come at it from a different perspective from him.

My noble friend Lady Morgan mentioned the speed of designation. The Bill’s approach to the pace of designation for the category 1 watchlist and register is flexible—deliberately so, to allow Ofcom to act as quickly as is proportionate to each emerging service. Ofcom will have a duty proactively to identify, monitor and evaluate emerging services, which will afford it early visibility when a service is approaching the category 1 threshold. It will therefore be ready to act accordingly to add services to the register should the need arise.

The approach set out in my noble friend’s Amendment 245 would not allow the Secretary of State to designate individual services as category 1 if they met one of the threshold conditions. Services can be designated as category 1 only if they meet all the relevant threshold conditions set out in the regulations made by the Secretary of State. That is the case regardless of whether the regulations set out one condition or a combination of several conditions.

The noble Baroness, Lady Finlay, suggested that the amendment would assist Ofcom in its work. Ofcom itself has raised concerns that amendments such as this—to introduce greater flexibility—could increase the risk of legal challenges to categorisation. My noble friend Lady Morgan was part of the army of lawyers before she came to Parliament, and I am conscious that the noble Lord, Lord Clement-Jones, is one as well. I hope they will heed the words of the regulator; this is not a risk that noble Lords should take lightly.

I will say more clearly that small companies can pose significant harm to users—I have said it before and I am happy to say it again—which is why there is no exemption for small companies. The very sad examples that my noble friend Lady Morgan gave in her speech related to illegal activity. All services, regardless of size, will be required to take action against illegal content, and to protect children if they are likely to be accessed by children. This is a proportionate regime that seeks to protect small but excellent platforms from overbearing regulation. However, I want to be clear that a small platform that is a font of illegal content cannot use its size as an excuse for not dealing with it.

Category 1 services are those services that have a major influence over our public discourse online. Again, I want to be clear that designation as a category 1 service is not based only on size. The thresholds for category 1 services will be based on the functionalities of a service as well as the size of the user base. The thresholds can also incorporate other characteristics that the Secretary of State deems relevant, which could include factors such as a service’s business model or its governance. Crucially, Ofcom has been clear that it will prioritise engagement with high-risk or high-impact services, irrespective of their categorisation, to understand their existing safety systems and how they plan to improve them.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

My Lords, first, I reassure noble Lords that the Government are fully committed to making sure that the interests of children are both represented and protected. We believe, however, that this is already achieved through the provisions in the Bill.

Rather than creating a single advocacy body to research harms to children and advocate on their behalf, as the noble Lord’s amendment suggests, the Bill achieves the same effect through a combination of Ofcom’s research functions, the consultation requirements and the super-complaints provisions. Ofcom will be fully resourced with the capacity and technological ability to assess and understand emerging harms and will be required to research children’s experiences online on an ongoing basis.

For the first time, there will be a statutory body in place charged with protecting children from harm online. As well as its enforcement functions, Ofcom’s research will ensure that the framework remains up to date and that Ofcom itself has the latest, in-depth information to aid its decision-making. This will ensure that new harms are not just identified in retrospect when children are already affected by them and complaints are made; instead, the regulator will be looking out for new issues and working proactively to understand concerns as they develop.

Children’s perspectives will play a central role in the development of the framework, as Ofcom will build on its strong track record of qualitative research to ensure that children are directly engaged. For example, Ofcom’s ongoing programme, Children’s Media Lives, involves engaging closely with children and tracking their views and experiences year on year.

Alongside its own research functions, super-complaints will ensure that eligible bodies can make complaints on systemic issues, keeping the regulator up to date with issues as they emerge. This means that if Ofcom does not identify a systemic issue affecting children for any reason, it can be raised and then dealt with appropriately. Ofcom will be required to respond to the super-complaint, ensuring that its subsequent decisions are understood and can be scrutinised. Complaints by users will also play a vital role in Ofcom’s horizon scanning and information gathering, providing a key means by which new issues can be raised.

The extensive requirements for Ofcom to consult on codes of practice and guidance will further ensure that it consistently engages with groups focused on the interests of children as the codes and guidance are developed and revised. Children’s interests are embedded in the implementation and delivery of this framework.

The Children’s Commissioner will play a key and ongoing role. She will be consulted on codes of practice and any further changes to those codes. The Government are confident that she will use her statutory duties and powers effectively to understand children’s experiences of the digital world. Her primary function as Children’s Commissioner for England is to promote and protect the rights of children in England, and to promote and protect the rights of children across the United Kingdom where those rights are or may be affected by reserved matters. As the codes of practice and the wider Bill relate to a reserved area of law—namely, internet services—the Children’s Commissioner for England will be able to represent the interests of children from England, Scotland, Wales and Northern Ireland when she is consulted on the preparation of codes of practice. That will ensure that children’s voices are represented right across the UK. The Children’s Commissioner for England and her office also regularly speak to the other commissioners about ongoing work on devolved and reserved matters. Whether she does that in branches of Pret A Manger, I do not know, but she certainly works with her counterparts across the UK.

I am very happy to take back the idea that the noble Lord has raised and discuss it with the commissioner. There are many means by which she can carry out her duties, so I am very happy to take that forward. I cannot necessarily commit to putting it in legislation, but I shall certainly commit to discussing it with her. On the proposals in the noble Lord’s amendment, we are concerned that a separate child user advocacy body would duplicate the functions that she already has, so I hope with that commitment he will be happy to withdraw.

Lord Knight of Weymouth (Lab)

My Lords, I am grateful to those who have spoken in this quick debate and for the support from the noble Lord, Lord Allan of Hallam, and the noble Baroness, Lady Fox, about children’s voices being heard. I think that we are getting to the point when there will not be a quango or indeed a minefield, so that makes us all happy. The Minister almost derailed me, because so much of his speaking note was about the interests of children and I am more interested in the voice of children being heard directly rather than people acting on their behalf and representing their interests, but his final comments around being happy to take the idea forward mean that I am very happy to withdraw my amendment.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

This is the philosophical question on which we still disagree. Features and functionality can be harmful but, to manifest that harm, there must be some content which they are functionally, or through their feature, presenting to the user. We therefore keep talking about content, even when we are talking about features and functionality. A feature on its own which has no content is not what the noble Baroness, Lady Kidron, my noble friend Lady Harding and others are envisaging, but to follow the logic of the point they are making, it requires some content for the feature or functionality to cause its harm.

Lord Knight of Weymouth (Lab)

But the content may not be harmful.

Lord Parkinson of Whitley Bay (Con)

Yes, even if the content is not harmful. We keep saying “content” because it is the way the content is disseminated, as the Bill sets out, but the features and functionalities can increase the risks of harm as well. We have addressed this through looking at the cumulative effects and in other ways.

Online Safety Bill

Debate between Lord Parkinson of Whitley Bay and Lord Knight of Weymouth
Lord Parkinson of Whitley Bay (Con)

A provider would have a right to bring a legal challenge against Ofcom if it considered that a particular exercise of the remote access power was unlawful. Given the serious nature of the issues under consideration, I am sure that would be looked at swiftly, but I will write to the noble Lord on the anticipated timelines while such a judicial review was pending.

Lord Parkinson of Whitley Bay (Con)

I will write on Schedule 12 as well.

--- Later in debate ---
Lord Knight of Weymouth (Lab)

I am grateful to the Minister for giving way so quickly. I think the House is asking him to indicate now that he will go away and look at this issue, perhaps with some of us, and that, if necessary, he would be willing to look at coming back with something at Third Reading. From my understanding of the Companion, I think he needs to say words to that effect to allow him to do so, if that is what he subsequently wants to do at Third Reading.

Lord Parkinson of Whitley Bay (Con)

I am very happy to discuss this further with noble Lords, but I will reserve the right, pending that discussion, to decide whether we need to return to this at Third Reading.

Amendments 270 and 272, tabled by my noble friend Lady Fraser of Craigmaddie, to whom I am very grateful for her careful scrutiny of the devolved aspects of the Bill, seek to require Ofcom to include separate analyses of users’ online experiences in England, Wales, Scotland and Northern Ireland in the research about users’ experiences of regulated services and in Ofcom’s transparency reports. While I am sympathetic to her intention—we have corresponded on it, for which I am grateful—it is important that Ofcom has and retains the discretion to prioritise information requests that will best shed light on the experience of users across the UK.

My noble friend and other noble Lords should be reassured that Ofcom has a strong track record of using this discretion to produce data which are representative of people across the whole United Kingdom. Ofcom is committed to reflecting the online experiences of users across the UK and intends, wherever possible, to publish data at a national level. When conducting research, Ofcom seeks to gather views from a representative sample of the United Kingdom and seeks to set quotas that ensure an analysable sample within each of the home nations.

It is also worth noting the provisions in the Communications Act 2003 that require Ofcom to operate offices in each of the nations of the UK, to maintain advisory committees for each, and to ensure their representation on its various boards and panels—and, indeed, on the point raised by the noble Baroness, Lady Kidron, to capture the experiences of children and users of all ages. While we must give Ofcom the discretion it needs to ensure that the framework is flexible and remains future-proofed, I hope that I have reassured my noble friend that her point will indeed be captured, reported on and be able to be scrutinised, not just in this House but across the UK.

--- Later in debate ---
Lord Knight of Weymouth (Lab)

Before the Minister sits down—he has been extremely generous in taking interventions—I want to put on record my understanding of his slightly ambiguous response to Amendment 247A, so that he can correct it if I have got it wrong. My understanding is that he has agreed to go away and reflect on the amendment and that he will have discussions with us about it. Only if he then believes that it is helpful to bring forward an amendment at Third Reading will he do so.

Lord Parkinson of Whitley Bay (Con)

Yes, but I do not want to raise the hopes of the noble Lord or others, with whom I look forward to discussing this matter. I must manage their expectations about whether we will bring anything forward. With that, I beg to move.

--- Later in debate ---
Lord Knight of Weymouth (Lab)

My Lords, the Government have moved on this issue, and I very much welcome that. I am grateful to the Minister for listening and for the fact that we now have Section 11 of the Communications Act being brought into the digital age through the Government’s Amendments 274B and 274C. The public can now expect to be informed and educated about content-related harms, reliability and accuracy; technology companies will have to play their part; and Ofcom will have to regularly report on progress, and will commission and partner with others to fulfil those duties. That is great progress.

The importance of this was underscored at a meeting of the United Nations Human Rights Council just two weeks ago. Nada Al-Nashif, the UN Deputy High Commissioner for Human Rights, said in an opening statement that media and digital literacy empowered individuals and

“should be considered an integral part of education efforts”.

Tawfik Jelassi, the assistant director-general of UNESCO, in a statement attached to that meeting, said that

“media and information literacy was essential for individuals to exercise their right to freedom of opinion and expression”—

I put that in to please the noble Baroness, Lady Fox—and

“enabled access to diverse information, cultivated critical thinking, facilitated active engagement in public discourse, combatted misinformation, and safeguarded privacy and security, while respecting the rights of others”.

If only the noble Lord, Lord Moylan, were in his place to hear me use the word “privacy”. He continued:

“Together, the international community could ensure that media and information literacy became an integral part of everyone’s lives, empowering all to think critically, promote digital well-being, and foster a more inclusive and responsible global digital community”.


I thought those were great words, summarising why we needed to do this.

I am grateful to Members on all sides of the House for the work that they have done on media literacy. Part of repeating those remarks was that this is so much more about empowerment than it is about loading safety on to individuals, as the noble Baroness, Lady Kidron, rightly said in her comments.

Nevertheless, we want the Minister to reflect on a couple of tweaks. Amendment 269C in my name is around an advisory committee being set up within six months and in its first report assessing the need for a code on misinformation. I have a concern that, as the regime that we are putting in place with this Bill comes into place and causes some of the harmful content that people find engaging to be suppressed, the algorithms will go to something else that is engaging, and that something else is likely to be misinformation and disinformation. I have a fear that that will become a growing problem that the regulator will need to be able to address, which is why it should be looking at this early.

Incidentally, that is why the regulator should also look at provenance, as in Amendment 269AA from the noble Lord, Lord Clement-Jones. It was tempting in listening to him to see whether there was an AI tool that could trawl across all the comments that he has made during the deliberations on this Bill to see whether he has quoted the whole of the joint report—but that is a distraction.

My Amendment 269D goes to the need for media literacy on systems, processes and business models, not just on content. Time and again, we have emphasised the need for this Bill to be as much about systems as content. There are contexts where individual, relatively benign pieces of content can magnify if part of a torrent that then creates harm. The Mental Health Foundation has written to many of us to make this point. In the same way that the noble Baroness, Lady Bull, asked about ensuring that those with disability have their own authentic voice heard as these media literacy responsibilities are played out, so the Mental Health Foundation wanted the same kind of involvement from young people; I agree with both. Please can we have some reassurance that this will be very much part of the literacy duties on Ofcom and the obligations it places on service providers?

Lord Parkinson of Whitley Bay (Con)

My Lords, I am grateful to noble Lords for their comments, and for the recognition from the noble Lord, Lord Knight, of the changes that we have made. I am particularly grateful to him for having raised media literacy throughout our scrutiny of this Bill.

His Amendments 269C and 269D seek to set a date by which the establishment of the advisory committee on misinformation and disinformation must take place and to set requirements for its first report. Ofcom recognises the valuable role that the committee will play in providing advice in relation to its duties on misinformation and disinformation, and has assured us that it will aim to establish the committee as soon as is reasonably possible, in recognition of the threats posed by misinformation and disinformation online.

Given the valuable role of the advisory committee, Ofcom has stressed how crucial it will be to have appropriate time to appoint the best possible committee. Seeking to prescribe a timeframe for its implementation risks impeding Ofcom’s ability to run the thorough and transparent recruitment process that I am sure all noble Lords want and to appoint the most appropriate and expert members. It would also not be appropriate for the Bill to be overly prescriptive on the role of the committee, including with regard to its first report, in order for it to maintain the requisite independence and flexibility to give us the advice that we want.

Amendment 269AA from the noble Lord, Lord Clement-Jones, seeks to add advice on content provenance to the duties of the advisory committee. The new media literacy amendments, which update Ofcom’s media literacy duties, already include a requirement for Ofcom to take steps to help users establish the reliability, accuracy and authenticity of content found on regulated services. Ofcom will have duties and mechanisms to be able to advise platforms on how they can help users to understand whether content is authentic; for example, by promoting tools that assist them to establish the provenance of content, where appropriate. The new media literacy duties will require Ofcom to take tangible steps to prioritise the public’s awareness of and resilience to misinformation and disinformation online. That may include enabling users to establish the reliability, accuracy and authenticity of content, but the new duties will not remove content online; I am happy to reassure the noble Baroness, Lady Fox, on that.

The advisory committee is already required under Clause 141(4)(c) to advise Ofcom on its exercise of its media literacy functions, including its new duties relating to content authenticity. The Bill does not stipulate what tools service providers should use to fulfil their duties, but Ofcom will have the ability to recommend in its codes of practice that companies use tools such as provenance technologies to identify manipulated media which constitute illegal content or content that is harmful to children, where appropriate. Ofcom is also required to take steps to encourage the development and use of technologies that provide users with further context about content that they encounter online. That could include technologies that support users to establish content provenance. I am happy to reassure the noble Lord, Lord Clement-Jones, that the advisory committee will already be required to advise on the issues that he has raised in his amendment.

On media literacy more broadly, Ofcom retains its overall statutory duty to promote media literacy, which remains broad and non-prescriptive. The new duties in this Bill, however, are focused specifically on harm; that is because of the nature of the Bill, which seeks to make the UK the safest place in the world to be online and is necessarily focused on tackling harms. To ensure that Ofcom succeeds in the delivery of these new specific duties with regard to regulated services, it is necessary that the regulator has a clearly defined scope. Broadening the duties would risk overburdening Ofcom by making its priorities less clear.

The noble Baroness, Lady Bull—who has been translated to the Woolsack while we have been debating this group—raised media literacy for more vulnerable users. Under Ofcom’s existing media literacy programme, it is already delivering initiatives to support a range of users, including those who are more vulnerable online, such as people with special educational needs and people with disabilities. I am happy to reassure her that, in delivering this work, Ofcom is already working not just with expert groups including Mencap but with people with direct personal experiences of living with disabilities.

The noble Lord, Lord Clement-Jones, raised Ofsted. Effective regulatory co-ordination is essential for addressing the cross-cutting opportunities and challenges posed by digital technologies and services. Ofsted will continue to engage with Ofcom through its existing mechanisms, including engagement led by its independent policy team and those held with Ofcom’s online safety policy director. In addition to that, Ofsted is considering mechanisms through which it can work more closely with Ofcom where appropriate. These include sharing insights from inspections in an anonymised form, which could entail reviews of its inspection bases and focus groups with inspectors, on areas of particular concern to Ofcom. Ofsted is committed to working with Ofcom’s policy teams to work these plans up in more detail.

Online Safety Bill

Debate between Lord Parkinson of Whitley Bay and Lord Knight of Weymouth
Lord Knight of Weymouth (Lab)

My Lords, I am grateful to the noble Lord, Lord Clement-Jones, for raising this; it is important. Clause 49(3)(a)(i) mentions content

“generated directly on the service by a user”,

which, to me, implies that it would include the actions of another user in the metaverse. Sub-paragraph (ii) mentions content

“uploaded to or shared on the service by a user”,

which covers bots or other quasi-autonomous virtual characters in the metaverse. As we heard, a question remains about whether any characters or objects provided by the service itself are covered.

A scenario—in my imagination anyway—would be walking into an empty virtual bar at the start of a metaverse service. This would be unlikely to be engaging: the attractions of indulging in a lonely, morose drink at that virtual bar are limited. The provider may therefore reasonably configure the algorithm to generate characters and objects that are engaging until enough users then populate the service to make it interesting.

Of course, there is the much more straightforward question of gaming platforms. On Monday, I mentioned “Grand Theft Auto”, a game with an advisory age of 17—they are still children at that age—but one that is routinely accessed by younger children. Shockingly, an article that I read claimed that it can evolve into a pornographic experience, where the player becomes the character from a first-person angle and receives services from virtual sex workers, as part of the game design. So my question to the Minister is: does the Bill protect users from these virtual characters interacting with them in virtual worlds?

Lord Parkinson of Whitley Bay (Con)

I will begin with that. The metaverse is in scope of the Bill, which, as noble Lords know, has been designed to be technology neutral and future-proofed to ensure that it keeps pace with emerging technologies—we have indeed come a long way since the noble Lord, Lord Clement-Jones, the noble Lords opposite and many others sat on the pre-legislative scrutiny committee for the Bill. Even as we debate, we envisage future technologies that may come. But the metaverse is in scope.

The Bill will apply to companies that enable users to share content online or to interact with each other, as well as search services. That includes a broad range of services, such as websites, applications, social media services, video games and virtual reality spaces, including the metaverse.

Any service that enables users to interact, as the metaverse does, will need to conduct a child access test and, if it is likely to be accessed by children, will need to comply with the child safety duties. Content is broadly defined in the Bill as,

“anything communicated by means of an internet service”.

Where this is uploaded, shared or directly generated on a service by a user and able to be encountered by other users, it will be classed as user-generated content. In the metaverse, this could therefore include things like objects or avatars created by users. It would also include interactions between users in the metaverse such as chat—both text and audio—as well as images uploaded or created by a user.

--- Later in debate ---
Lord Knight of Weymouth (Lab)

My Lords, from this side we certainly welcome these government amendments. I felt it was probably churlish to ask why it had taken until this late stage to comply with international standards, but that point was made very well by the noble Lord, Lord Allan of Hallam, and I look forward to the Minister’s response.

Lord Parkinson of Whitley Bay (Con)

I am grateful to noble Lords for their support for these amendments and for their commitment, as expected, to ensuring that we have the strongest protections in the Bill for children.

The noble Lord, Lord Allan of Hallam, asked: why only now? It became apparent during the regular engagement that the Government have with the National Crime Agency on issues such as this, as he would expect, that these amendments would be necessary, so we are happy to bring them forward. They are vital amendments to enable law enforcement partners to prosecute offenders and keep children safe.

Reports received by the National Crime Agency are for intelligence only and so cannot be relied on as evidence. As a result, in some cases law enforcement agencies may be required to request that companies provide data in an evidential format. The submitted report will contain a limited amount of information from which law enforcement agencies will have to decide what action to take. Reporting companies may hold wider data that relate to the individuals featured in the report, which could allow law enforcement agencies to understand the full circumstances of the event or attribute identities to the users of the accounts.

The data retention period will provide law enforcement agencies with the necessary time to decide whether it is appropriate to request data in order to continue their investigations. I hope that explains the context of why we are doing this now and why these amendments are important ones to add to the Bill. I am very grateful for noble Lords’ support for them.

Online Safety Bill

Debate between Lord Parkinson of Whitley Bay and Lord Knight of Weymouth
Lord Knight of Weymouth (Lab)

My Lords, as we have heard, this is a small group of amendments concerned with preventing size and lack of capacity being used as a reasonable excuse for allowing children to be unsafe. Part of the problem is the complexity of the Bill and the way it has been put together.

For example, Clause 11, around user-to-user services, is the pertinent clause and it is headed “Safety duties protecting children”. Clause 11(2) is preceded in italics with the wording “All services” so anyone reading it would think that what follows applies to all user-to-user services regardless of size. Clause 11(3) imposes a duty on providers

“to operate a service using proportionate systems and processes”

to protect children from harm. That implies that there will be judgment around what different providers can be expected to do to protect children; for example, by not having to use a particular unaffordable technical solution on age assurance if they can show the right outcome by doing things differently. That starts to fudge things a little.

The noble Lord, Lord Bethell, who introduced this debate so well with Amendment 39, supported by my noble friend Lady Ritchie, wants to be really sure that the size of the provider can never be used to argue that preventing all children from accessing porn is disproportionate and that a few children slipping through the net might just be okay.

The clarity of Clause 11 unravels even further at the end of the clause, where in subsection (12)(b) it reads that

“the size and capacity of the provider of a service”

is relevant

“in determining what is proportionate”.

The clause starts to fall apart at that point quite thoroughly in terms of anyone reading it being clear about what is supposed to happen.

Amendment 43 seeks to take that paragraph out, as we have heard from the noble Lord, Lord Russell, and would do the same for search in Amendment 87. I have added my name to these amendments because I fear that the ambiguity in the wording of this clause will give small and niche platforms an easy get out from ensuring that children are safe by design.

I use the phrase “by design” deliberately. We need to make a choice with this Bill even at this late stage. Is the starting point in the Bill children’s safety by design? Or is the starting point one where we do not want to overly disrupt the way providers operate their business first—which is to an extent how the speech from the noble Lord, Lord Allan, may have been heard—and then overlay children’s safety on top of that?

Yesterday, I was reading about how children access inappropriate and pornographic content, not just on Twitter, Instagram, Snapchat, TikTok and Pinterest but on Spotify and “Grand Theft Auto”—the latter being a game with an age advisory of “over 17” but which is routinely played by teenaged children. Wherever we tolerate children being online, there are dangers which must be tackled. Listening to the noble Baroness, Lady Harding, took me to where a big chunk of my day job in education goes—children’s safeguarding. I regularly have to take training in safeguarding because of the governance responsibilities that I have. Individual childminders looking after one or two children have an assessment and an inspection around their safeguarding. In the real world we do not tolerate a lack of safety for children in this context. We should not tolerate it in the online world either.

The speech from the noble Lord, Lord Russell, reminded me of the breadcrumbing from big platforms into niche platforms that is part of that incel insight that he referenced. Content that is harmful to children can also be what some children are looking for, which keeps them engaged. Small, emergent services aggressively seeking growth could set algorithms accordingly. They must not be allowed to believe that engaging harmful content is okay until they get to the size that they need to be to afford the age-assurance technology which we might envisage in the Bill. I hope that the Minister shares our concerns and can help us with this problem.

Lord Parkinson of Whitley Bay (Con)

My Lords, short debates can be helpful and useful. I am grateful to noble Lords who have spoken on this group.

I will start with Amendment 39, tabled by my noble friend Lord Bethell. Under the new duty at Clause 11(3)(a), providers which allow pornography or other forms of primary priority content under their terms of service will need to use highly effective age verification or age estimation to prevent children encountering it where they identify such content on their service, regardless of their size or capacity. While the size and capacity of providers is included as part of a consideration of proportionality, this does not mean that smaller providers or those with less capacity can evade the strengthened new duty to protect children from online pornography. In response to the questions raised by the noble Baronesses, Lady Ritchie of Downpatrick and Lady Kidron, and others, no matter how much pornographic content is on a service, where providers do not prohibit this content they would still need to meet the strengthened duty to use age verification or age estimation.

Proportionality remains relevant for the purposes of providers in scope of the new duty at Clause 11(3)(a) only in terms of the age-verification or age-estimation measures that they choose to use. A smaller provider with less capacity may choose to go for a less costly but still highly effective measure. For instance, a smaller provider with less capacity might seek a third-party solution, whereas a larger provider with greater capacity might develop their own solution. Any measures that providers use will need to meet the new high bar of being “highly effective”. If a provider does not comply with the new duties and fails to use measures which are highly effective at correctly determining whether or not a particular user is a child, Ofcom can take tough enforcement action.

The other amendments in this group seek to remove references to the size and capacity of providers in provisions relating to proportionality. The principle of proportionate, risk-based regulation is fundamental to the Bill’s regulatory framework, and we consider that the Bill as drafted already strikes the correct balance. The Bill ultimately will regulate a large number of services, ranging from some of the biggest companies in the world to smaller, voluntary organisations, as we discussed in our earlier debate on exemptions for public interest services.

The provisions regarding size and capacity recognise that what it is proportionate to require of companies of various sizes and business models will be different. Removing this provision would risk setting a lowest common denominator standard which does not create incentives for larger technology companies to do more to protect their users than smaller organisations. For example, it would not be proportionate for a large multinational company which employs thousands of content moderators and which invests in significant safety technologies to argue that it is required to take only the same steps to comply as a smaller provider which might have only a handful of employees and a few thousand UK users.

While the size and capacity of providers is included as part of a consideration of proportionality, let me be clear that this does not mean that smaller providers or those with less capacity do not need to meet the child safety duties and other duties in the Bill, such as the illegal content safety duties. These duties set out clear requirements for providers. If providers do not meet these duties, they will face enforcement action.

I hope that is reassuring to my noble friend Lord Bethell and to the other noble Lords with amendments in this group. I urge my noble friend to withdraw his amendment.

Online Safety Bill

Debate between Lord Parkinson of Whitley Bay and Lord Knight of Weymouth
Lord Knight of Weymouth (Lab)

Before the Minister succeeds in disappointing us, can he clarify something for us? Once Ofcom has published the report, it has the power to issue guidance. What requirement is there for platforms to abide by that guidance? We want there to be some teeth at the end of all this. There is a concern that a report will be issued, followed by some guidance, but that nothing much else will happen.

Lord Parkinson of Whitley Bay (Con)

It is guidance rather than direction, but it will be done openly and transparently. Users will be able to see the guidance which Ofcom has issued, to see whether companies have responded to it as they see fit and, through the rest of the framework of the Bill, be empowered to make their decisions about their experiences online. This being done openly and transparently, and informed by Ofcom’s research, will mean that everyone is better informed.

We are sympathetic to the amendment. It is complex, and this has been a useful debate—

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

With that, if there are no further questions, I invite the noble Lord to withdraw his amendment.

Lord Knight of Weymouth (Lab)

My Lords, this was a short but important debate with some interesting exchanges at the end. The noble Baroness, Lady Harding, mentioned the rapidly changing environment generated by generative AI. That points to the need for wider, independent, ecosystem-level research than we fear we might get as things stand, and certainly wider than the skilled persons we are already legislating for. The noble Lord, Lord Bethell, referred to the access that advertisers already have to insight. It seems a shame that we run the risk, as the noble Baroness, Lady Kidron, pointed out, of researchers in other jurisdictions having more privileged access than researchers in this country, and of our therefore becoming dependent on those researchers and on whistleblowers to give us that wider view. We could proceed with a report and guidance as set out in the Bill but add in some reserved powers in order to take action if the report suggests that Ofcom might need and want that. The Minister may want to reflect on that, having listened to the debate. On that basis, I am happy to beg leave to withdraw the amendment.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

I am sure that the noble Lord, Lord Stevenson of Balmacara, is smiling over a sherry somewhere about the debate he has facilitated. His is a useful probing amendment and we have had a useful discussion.

The Government certainly recognise the potential challenges posed by artificial intelligence and digitally manipulated content such as deepfakes. As we have heard in previous debates, the Bill ensures that machine-generated content on user-to-user services created by automated tools or machine bots will be regulated where appropriate. Clause 49(4)(b) means that machine-generated content is regulated unless the bot or automated tool producing the content is controlled by the provider of the service.

The labelling of this content via draft legislation is not something to which I can commit today. The Government’s AI regulation White Paper sets out the principles for the responsible development of artificial intelligence in the UK. These principles, such as safety, transparency and accountability, are at the heart of our approach to ensuring the responsible development and use of AI. As set out in the White Paper, we are building an agile approach that is designed to be adaptable in response to emerging developments. We do not wish to introduce a rigid, inflexible form of legislation for what is a flexible and fast-moving technology.

The public consultation on these proposals closed yesterday so I cannot pre-empt our response to it. The Government’s response will provide an update. I am joined on the Front Bench by the Minister for Artificial Intelligence and Intellectual Property, who is happy to meet with the noble Baroness, Lady Kidron, and others before the next stage of the Bill if they wish.

Beyond labelling such content, I can say a bit to make it clear how the Bill will address the risks coming from machine-generated content. The Bill already deals with many of the most serious and illegal forms of manipulated media, including deepfakes, when they fall within scope of services’ safety duties regarding illegal content or content that is potentially harmful to children. Ofcom will recommend measures in its code of practice to tackle such content, which could include labelling where appropriate. In addition, the intimate image abuse amendments that the Government will bring forward will make it a criminal offence to send deepfake images.

In addition to ensuring that companies take action to keep users safe online, we are taking steps to empower users with the skills they need to make safer choices through our work on media literacy. Ofcom, for example, has an ambitious programme of work through which it is funding several initiatives to build people’s resilience to harm online, including initiatives designed to equip people with the skills to identify disinformation. We are keen to continue our discussions with noble Lords on media literacy and will keep an open mind on how it might be a tool for raising awareness of the threats of disinformation and inauthentic content.

With gratitude to the noble Lords, Lord Stevenson and Lord Knight, and everyone else, I hope that the noble Lord, Lord Knight, will be content to withdraw his noble friend’s amendment.

Lord Knight of Weymouth (Lab)

My Lords, I am grateful to everyone for that interesting and quick debate. It is occasionally one’s lot that somebody else tables an amendment but is unavoidably detained in Jerez, drinking sherry, and monitoring things in Hansard while I move the amendment. I am perhaps more persuaded than my noble friend might have been by the arguments that have been made.

We will return to this in other fora in response to the need to regulate AI. However, in the meantime, I enjoyed in particular the John Booth quote from the noble Baroness, Lady Bennett. In respect of this Bill and any of the potential harms around generative AI, if we have a Minister who is mindful of the need for safety by design when we have concluded this Bill, then we will have dealt with the bits that we needed to deal with as far as this Bill is concerned.

--- Later in debate ---
Lord Knight of Weymouth (Lab)

My Lords, what more can I say than that I wish to be associated with the comments made by the noble Baroness and then by the noble Lord, Lord Clement-Jones? I look forward to the Minister’s reply.

Lord Parkinson of Whitley Bay (Con)

I am very grateful to the noble Baroness for her amendment, which is a useful opportunity for us to state publicly and share with the Committee the progress we have been making in our helpful discussions on these issues in relation to these amendments. I am very grateful to her and to my noble friends Lord Bethell and Lady Harding for speaking as one on this, including, as is well illustrated, in this short debate this evening.

As the noble Baroness knows, discussions continue on the precise wording of these definitions. I share her optimism that we will be able to reach agreement on a suitable way forward, and I look forward to working with her, my noble friends and others as we do so.

The Bill already includes a definition of age assurance in Clause 207, which is

“measures designed to estimate or verify the age or age-range of users of a service”.

As we look at these issues, we want to avoid using words such as “checking”, which suggests that providers need to take a proactive approach to checking age, as that may inadvertently preclude the use of technologies which determine age through other means, such as profiling. It is also important that any definition of age assurance does not restrict the current and future use of innovative and accurate technologies. I agree that it is important that there should be robust definitions for terms which are not currently defined in the Bill, such as age verification, and recommit to the discussions we continue to have on what terms need to be defined and the best way to define them.

This has been a very helpful short debate with which to end our deliberations in Committee. I am very grateful to noble Lords for all the points that have been raised over the past 10 days, and I am very glad to be ending in this collaborative spirit. There is much for us still to do, and even more for the Office of the Parliamentary Counsel to do, before we return on Report, and I am grateful to it and to the officials working on the Bill. I urge the noble Baroness to withdraw her amendment.

Online Safety Bill

Debate between Lord Parkinson of Whitley Bay and Lord Knight of Weymouth
Lord Knight of Weymouth (Lab)

My Lords, I am all that is left between us and hearing from the Minister with his good news, so I will constrain my comments accordingly.

The noble Baroness, Lady Kidron, began by paying tribute to the parents of Olly, Breck, Molly, Frankie and Sophie. I very much join her in doing that; continually having to come to this place and share their trauma and experience comes at a great emotional cost. We are all very grateful to them for doing it and for continuing to inform and motivate us in trying to do the right thing. I am grateful to my noble friend Lady Healy and in particular to the noble Baroness, Lady Newlove, for amplifying that voice and talking about the lost opportunity, to an extent, of our failure to find a way of imposing a general duty of care on the platforms, as was the original intention when the noble Baroness, Lady Morgan, was the Secretary of State.

I also pay a big tribute to the noble Baroness, Lady Kidron. She has done the whole House, the country and the world a huge service in her campaigning around this and in her influence on Governments—not just this one—on these issues. We would not be here without her tireless efforts, and it is important that we acknowledge that.

We need to ensure that coroners can access the information they need to do their job, and to have proper sanctions available to them when they are frustrated in being able to do it. This issue is not without complication, and I very much welcome the Government’s engagement in trying to find a way through it. I too look forward to the good news that has been trailed; I hope that the Minister will be able to live up to his billing. Like the noble Baroness, Lady Harding, I would love to see him embrace, at the appropriate time, the “safety by design” amendments and some others that could complete this picture. I also look forward to his answers on issues such as data preservation, which the noble Lord, Lord Allan, covered among the many other things in his typically fine speech.

I very much agree that we should have a helpline and do more about that. Some years ago, when my brother-in-law sadly died in his 30s, it fell to me to try to sort out his social media accounts. I was perplexed that the only way I could do it was by fax to these technology companies in California. That was very odd, so to have proper support for bereaved families going through their own grief at that moment seems highly appropriate.

As we have discussed in the debates on the Bill, a digital footprint is an asset that is exploited by these companies. But it is an asset that should be regarded as part of one’s estate that can be bequeathed to one’s family; then some of these issues would perhaps be lessened. On that basis, and in welcoming a really strong and moving debate, I look forward to the Minister’s comments.

The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, this has been a strong and moving debate, and I am grateful to the noble Baroness, Lady Kidron, for bringing forward these amendments and for the way she began it. I also echo the thanks that the noble Baroness and others have given to the families of Breck Bednar, Sophie Parkinson, Molly Russell, Olly Stephens, Frankie Thomas and all the young people whose names she rightly held in remembrance at the beginning of this debate. There are too many others who find themselves in the same position. The noble Lord, Lord Knight, is right to pay tribute to their tirelessness in campaigning, given the emotional toll that we know it has on them. I know that they have followed the sometimes arcane processes of legislation and, as my noble friend Lady Morgan said, we all look forward to the Bill becoming an Act of Parliament so that it can make a difference to families who we wish to spare from the heartache they have had.

Every death is sorrowful, but the death of a child is especially heartbreaking. The Government take the issues of access to information relating to a deceased child very seriously. We have undertaken extensive work across government and beyond to understand the problems that parents, and coroners who are required to investigate such deaths, have faced in the past in order to bring forward appropriate solutions. I am pleased to say that, as a result of that work, and thanks to the tireless campaigning of the noble Baroness, Lady Kidron, and our discussions with those who, very sadly, have first-hand experience of these problems, we will bring forward a package of measures on Report to address the issues that parents and coroners have faced. Our amendments have been devised in close consultation with the noble Baroness and bereaved families. I hope the measures will rise to the expectations they rightly have and that they will receive their support.

The package of amendments will ensure that coroners have access to the expertise and information they need to conduct their investigations, including information held by technology companies, regardless of size, and overseas services such as Wattpad, mentioned by the noble Baroness, Lady Healy of Primrose Hill, in her contribution. This includes information about how a child interacted with specific content online as well as the role of wider systems and processes, such as algorithms, in promoting it. The amendments we bring forward will also help to ensure that the process for accessing data is more straightforward and humane. The largest companies must ensure that they are transparent with parents about their options for accessing data and respond swiftly to their requests. We must ensure that companies cannot stonewall parents who have lost a child and that those parents are treated with the humanity and compassion they deserve.

I take the point that the noble Baroness, Lady Kidron, rightly makes: small does not mean safe. All platforms will be required to comply with Ofcom’s requests for information about a deceased child’s online activity. That will be backed by Ofcom’s existing enforcement powers, so that where a company refuses to provide information without a valid excuse it may be subject to enforcement action, including sanctions on senior managers. Ofcom will also be able to produce reports for coroners following a Schedule 5 request on matters relevant to an investigation or inquest. This could include information about a company’s systems and processes, including how algorithms have promoted specific content to a child. This too applies to platforms of any size and will ensure that coroners are provided with information and expertise to assist them in understanding social media.

Where this Bill cannot solve an issue, we are exploring alternative avenues for improving outcomes as well. For example, the Chief Coroner has committed to consider issuing non-legislative guidance and training for coroners about social media, with the offer of consultation with experts.

Online Safety Bill

Debate between Lord Parkinson of Whitley Bay and Lord Knight of Weymouth
Lord Parkinson of Whitley Bay (Con)

I am happy to reassure my noble friend that the director of the Dawes Centre for Future Crime sits on the Home Office’s Science Advisory Council, whose work is very usefully fed into the work being done at the Home Office. Colleagues at the Ministry of Justice keep criminal law under constant review, in light of research by such bodies and what we see in the courts and society. I hope that reassures my noble friend that the points she raised, which are covered by organisations such as the Dawes Centre, are very much in the mind of government.

The noble Lord, Lord Allan of Hallam, explained very effectively the nuances of how behaviour translates to the virtual world. He is right that we will need to keep both offences and the framework under review. My noble friend Lady Berridge asked a good and clear question, to which I am afraid I do not have a similarly concise answer. I can reassure her that generated child sexual abuse and exploitation material is certainly illegal, but she asked about sexual harassment via a haptic suit; that would depend on the specific circumstances. I hope she will allow me to respond in writing, at greater length and more helpfully, to the very good question she asked.

Under Clause 56, Ofcom will also be required to undertake periodic reviews into the incidence and severity of content that is harmful to children on the in-scope services, and to recommend to the Secretary of State any appropriate changes to regulations based on its findings. Clause 141 also requires Ofcom to carry out research into users’ experiences of regulated services, which will likely include experiences of services such as the metaverse and other online spaces that allow user interaction. Under Clause 147, Ofcom may also publish reports on other online safety matters.

The questions posed by the noble Lord, Lord Russell of Liverpool, about international engagement are best addressed in a group covering regulatory co-operation, which I hope we will reach later today. I can tell him that we have introduced a new information-sharing gateway for the purpose of sharing information with overseas regulators, to ensure that Ofcom can collaborate effectively with its international counterparts. That builds on existing arrangements for sharing information that underpin Ofcom’s existing regulatory regimes.

The amendments tabled by the noble Lord, Lord Knight of Weymouth, relate to providers’ judgments about when content produced by bots is illegal content, or a fraudulent advertisement, under the Bill. Clause 170 sets out that providers will need to take into account all reasonably available relevant information about content when making a judgment about its illegality. As we discussed in the group about illegal content, providers will need to treat content as illegal when this information gives reasonable grounds for inferring that an offence was committed. Content produced by bots is in scope of providers’ duties under the Bill. This includes the illegal content duties, and the same principles for assessing illegal content will apply to bot-produced content. Rather than drawing inferences about the conduct and intent of the user who generated the content, the Bill specifies that providers should consider the conduct and the intent of the person who can be assumed to have controlled the bot at the point it created the content in question.

The noble Lord’s amendment would set out that providers could make judgments about whether bot-produced content is illegal, either by reference to the conduct or mental state of the person who owns the bot or, alternatively, by reference to the person who controls it. As he set out in his explanatory statement and outlined in his speech, I understand he has brought this forward because he is concerned that providers will sometimes not be able to identify the controller of a bot, and that this will impede providers’ duties to take action against illegal content produced by them. Even when the provider does not know the identity of the person controlling the bot, however, in many cases there will still be evidence from which providers can draw inferences about the conduct and intent of that person, so we are satisfied that the current drafting of the Bill ensures that providers will be able to make a judgment on illegality.

Lord Knight of Weymouth (Lab)

My concern is also whether or not the bot is out of control. Can the Minister clarify that issue?

Lord Parkinson of Whitley Bay (Con)

It depends on what the noble Lord means by “out of control” and what content the bot is producing. If he does not mind, this may be an issue which we should go through in technical detail and have a more free-flowing conversation with examples that we can work through.

Online Safety Bill

Debate between Lord Parkinson of Whitley Bay and Lord Knight of Weymouth
Lord Knight of Weymouth (Lab)

My Lords, I regret that my noble friend Lord Lipsey is unable to be here. I wish him and the noble Lord, Lord McNally, well. I also regret that my noble friend Lord Stevenson is not here to wind up this debate and introduce his Amendment 127. Our inability to future-proof these proceedings means that, rather than talking to the next group, I am talking to this one.

I want to make four principal points. First, the principle of press freedom, as discussed by the noble Lords, Lord Black and Lord Faulks, in particular, is an important one. We do not think that this is the right Bill to reopen those issues. We look forward to the media Bill as the opportunity to discuss these things more fully across the House.

Secondly, I have some concerns about the news publisher exemption. In essence, as the noble Lord, Lord Clement-Jones, set out, as long as you have a standards code, a complaints process, a UK address and a team of contributors, the exemption applies. That feels a bit loose to me, and it opens up the regime to some abuse. I hear what the noble Baronesses, Lady Gohir and Lady Grey-Thompson, said about how we already see pretty dodgy outfits allowing racist and abusive content to proliferate. I look forward to the Minister’s comments on whether the bar we have at the moment is too low and whether there is some reflection to be done on that.

The third point is on my noble friend Lord Stevenson’s Amendment 127, which essentially says that we should set a threshold around whether complaints are dealt with in a timely manner. In laying that amendment, my noble friend essentially wanted to probe. The noble Lord, Lord Faulks, is here, so this is a good chance to have him listen to me say that we think that complaints should be dealt with more swiftly and that the organisation that he chairs could do better at dealing with that.

My fourth comment is about comments, particularly after listening to the speech of the noble Baroness, Lady Grey-Thompson, about some of the hateful comment that is hidden away inside the comments that news publishers carry. I was very much struck by what she said in respect of some of the systems of virality that are now being adopted by those platforms. There, I think Amendment 227 is tempting. I heard what the noble Baroness, Lady Stowell, said, and I think I agree that this is better addressed by Parliament.

For me, that just reinforces the need for this Bill, more than any other that I have ever worked on in this place, to have post-legislative scrutiny by Parliament so that we, as a Parliament, can review whether the regime we are setting up is running appropriately. It is such a novel regime, in particular around regulating algorithms and artificial intelligence. It would be an opportunity to see whether, in this case, the systems of virality were creating an amplification of harm away from the editorial function that the news publishers are able to exercise over the comments.

On that basis, and given the hour, I am happy to listen with care to the wise words of the Minister.

The Parliamentary Under-Secretary of State, Department for Digital, Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I join noble Lords who have sent their best wishes to the noble Lords, Lord Lipsey and Lord McNally.

His Majesty’s Government are committed to defending the invaluable role of a free media. We are clear that our online safety legislation must protect the vital role of the press in providing people with reliable and accurate information.

We have included strong protections for news publishers’ and journalistic content in the Bill, which extends to the exemption from the Bill’s safety duties for users’ comments and reviews on news publishers’ sites. This reflects a wider exemption for comments and reviews on provider content more generally. For example, reviews of products on retailers’ sites are also exempt from regulation. This is designed to avoid disproportionate regulatory burden on low-risk services.

Amendment 124 intends to modify that exemption, so that the largest news websites no longer benefit and are subject to the Bill’s regulatory regime. Below-the-line comments are crucial for enabling reader engagement with the news and encouraging public debate, as well as for the sustainability—and, as the noble Baroness, Lady Fox, put it, the accountability—of the news media. We do not consider it proportionate, necessary or compatible with our commitment to press freedom to subject these comment sections to oversight by Ofcom.

We recognise that there can sometimes be unpleasant or abusive below-the-line comments. We have carefully considered the risks of this exemption against the need to protect freedom of speech and media freedoms on matters of public interest. Although comment functions will not be subject to online regulation, I reassure the Members of the Committee who raised concerns about some of the comments which have attracted particular attention that sites hosting such comments can, in some circumstances, be held liable for any illegal content appearing on them, where they have actual knowledge of the content in question and fail to remove it expeditiously.

The strong protections for recognised news publishers in the Bill include exempting their content from the Bill’s safety duties, requiring category 1 platforms to notify recognised news publishers and to offer a right of appeal before removing or moderating any of their content. Clause 50 stipulates the clear criteria that publishers will have to meet to be considered a “recognised news publisher” and to benefit from those protections. When drafting these criteria, the Government have been careful to ensure that established news publishers are captured, while limiting the opportunity for bad actors to qualify.

Amendment 126 seeks to restrict the criteria for recognised news publishers in the Bill, so that only members of an approved regulator within the meaning of Section 42 of the Crime and Courts Act 2013 benefit from the protections offered by the Bill. This would create strong incentives for publishers to join specific press regulators. We do not consider this to be compatible with our commitment to a free press. We will repeal existing legislation that could have that effect, specifically Section 40 of the Crime and Courts Act 2013, through the media Bill, as noble Lords have noted, which has recently been published. Without wanting to make a rod for my own back when we come to that Bill, I agree with my noble friend Lord Black of Brentwood that it would be the opportunity to have this debate, if your Lordships so wished.

The current effect of this amendment would be to force all news publishers to join a single press regulator—namely Impress, the only UK regulator which has sought approval by the Press Recognition Panel—if they were to benefit from the exclusion for recognised news publishers. Requiring a publisher to join specific regulators is, in the view of His Majesty’s Government, not only incompatible with protecting press freedom in the UK but unnecessary given the range of detailed criteria which a publisher must meet to qualify for the additional protections, as set out in Clause 50 of the Bill.

As part of our commitment to media freedom, we are committed to independent self-regulation of the press. As I have indicated, Clause 50 stipulates the clear criteria which publishers will have to meet to be considered a “recognised news publisher” and to benefit from the protections in the Bill. One of those criteria is for entities to have policies and procedures for handling and resolving complaints. Amendment 127 from the noble Lord, Lord Stevenson, adds a requirement that these policies and procedures must cover handling and resolving complaints “in a timely manner”. Including such a requirement would place the responsibility on Ofcom to decide what constitutes “timely” and, in effect, put it in the position of a press regulator. That is not something that we would like. We believe that the criteria set out in Clause 50 are already strong, and we have taken significant care to ensure that established news publishers are captured, while limiting the opportunity for bad actors to benefit.

I turn now to Amendment 227. We recognise that, as legislation comes into force, it will be necessary to ensure that the protections we have put in place for journalistic and news publisher content are effective. We need to ensure that the regulatory framework does not hinder access to such content, particularly in the light of the fact that, in the past, news content has sometimes been removed or made less visible by social media moderators or algorithms for unclear reasons, often at the height of news cycles. That is why we have required Ofcom to produce a specific report, under Clause 144, assessing the impact of the Bill on the availability and treatment of news publisher and journalistic content on category 1 services.

--- Later in debate ---
Lord Knight of Weymouth (Lab)

Before the Minister closes his folder and sits down, perhaps I could say that I listened carefully and would just like him to reflect a little more for us on my question of whether the bar is set too low and there is too much wriggle room in the exemption around news publishers. A tighter definition might benefit the Bill when we come back to it on Report.

Lord Parkinson of Whitley Bay (Con)

Looking at the length of Clause 50—and I note that the noble Lord, Lord Allan of Hallam, made much the same point in his speech—I think the definitions set out in Clause 50 are extensive. Clause 50(1) sets out a number of recognised news publishers, obviously including

“the British Broadcasting Corporation, Sianel Pedwar Cymru”—

self-evidently, as well as

“the holder of a licence under the Broadcasting Act 1990 or 1996”

or

“any other entity which … meets all of the conditions in subsection (2), and … is not an excluded entity”

as set out in subsection (3). Subsection (2) sets out a number of specific criteria which I think capture the recognised news publishers we want to see.

Noble Lords will be aware of the further provisions we have brought forward to make sure that entities that are subject to a sanction are not able to qualify, such as—

Online Safety Bill

Debate between Lord Parkinson of Whitley Bay and Lord Knight of Weymouth
Lord Knight of Weymouth (Lab)

As ever, the noble Baroness is an important voice in bursting our bubble in the Chamber. I continue to respect her for that. It will not be perfect; there is no perfect answer to all this. I am siding with safety and caution rather than a bit of a free-for-all. Sometimes there might be overcaution and aspects of debate where the platforms, the regulator, the media, and discussion and debate in this Chamber would say, “The toggles have got it wrong”, but we just have to make a judgment about which side we are on. That is what I am looking forward to hearing from the Minister.

These amendments are supported on all sides and by a long list of organisations, as listed by the noble Baroness, Lady Morgan, and the noble Lord, Lord Clement-Jones. The Minister has not conceded very much at all so far to this Committee. We have heard compelling speeches, such as those from the noble Baroness, Lady Parminter, that have reinforced my sense that he needs to give in on this when we come to Report.

I will also speak to my Amendment 38A. I pay tribute to John Penrose MP, who was mentioned by the noble Baroness, Lady Harding, and his work in raising concerns about misinformation and in stimulating discussion outside the Chambers among parliamentarians and others. Following discussions with him and others in the other place, I propose that users of social media should have the option to filter out content the provenance of which cannot be authenticated.

As we know, social media platforms are often awash with content that is unverified, misleading or downright false. This can be particularly problematic when it comes to sensitive or controversial topics such as elections, health or public safety. In these instances, it can be difficult for users to know whether the information presented to them is accurate. Many noble Lords will be familiar with the deep-fake photograph of the Pope in a white puffa jacket that recently went viral, or the use of imagery for propaganda purposes following the Russian invasion of Ukraine.

The Content Authenticity Initiative has created an open industry standard for content authenticity and provenance. Right now, tools such as Adobe Photoshop allow users to turn on content credentials to securely attach provenance data to images and any edits then made to those images. That technology has now been adopted by camera manufacturers such as Leica and Nikon, so the technology is there to do some of this to help give us some reassurance.

Amendment 38A would allow users to filter out unverified content and is designed to flag posts or articles that do not come from a reliable source or have not been independently verified by a reputable third party. Users could then choose to ignore or filter out such content, ensuring that they are exposed only to information that has been vetted and verified. This would not only help users to make more informed decisions but help to combat the spread of false information on social media platforms. By giving users the power to filter out unverified content, we can help to ensure that social media platforms are not used to spread harmful disinformation or misinformation.
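The mechanism the amendment envisages, provenance data cryptographically bound to a piece of content so that a platform can check it before surfacing the content to a user who has opted to filter, can be sketched in miniature as follows. This is an illustrative toy only, not the Content Authenticity Initiative's actual open standard (real C2PA content credentials use signed manifests and public-key certificates rather than the shared key assumed here), and every name in it is invented for illustration.

```python
import hashlib
import hmac

# Hypothetical publisher signing key. Real content credentials use
# public-key certificates, not a shared secret; this stands in for both.
PUBLISHER_KEY = b"example-publisher-key"


def attach_credentials(content: bytes) -> dict:
    """Bundle content with a signed provenance manifest."""
    digest = hashlib.sha256(content).hexdigest()
    signature = hmac.new(PUBLISHER_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return {"content": content, "manifest": {"sha256": digest, "signature": signature}}


def is_authenticated(item: dict) -> bool:
    """Check the manifest: content unmodified and signed with the known key."""
    manifest = item.get("manifest")
    if manifest is None:
        return False  # no provenance data attached at all
    digest = hashlib.sha256(item["content"]).hexdigest()
    if digest != manifest["sha256"]:
        return False  # content was altered after it was signed
    expected = hmac.new(PUBLISHER_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, manifest["signature"])


def filter_unverified(feed: list[dict]) -> list[dict]:
    """User-side filter: keep only items whose provenance checks out."""
    return [item for item in feed if is_authenticated(item)]
```

The point of the sketch is the division of labour: the publisher signs once at the point of creation or editing, and the check on the user's behalf is cheap, so a per-user "verified content only" toggle is technically plausible in the way the amendment assumes.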

Amendments 42 and 45, in the name of my noble friend Lord Stevenson, are good and straightforward improvements to the Bill, suggesting that Clause 12(11) should include all the protected characteristics in the Equality Act 2010. I listened closely to the argument of the noble Baroness, Lady Buscombe, about listing the various characteristics, but I think it makes much more sense to do what my noble friend is suggesting and just reference the Equality Act. If we chose separately to amend the Equality Act to change protected characteristics, that change would then flow through into this Bill. That seems to be a much more sensible and pragmatic way to proceed. Similarly, in Amendment 45 my noble friend is suggesting that consideration should be given by the Secretary of State as to whether any of these tools should also be available to child users, and we have heard good arguments in favour of that when it comes to dealing with the cliff-edge effect, as described by the noble Baroness, Lady Kidron.

Lastly—and, perhaps the Committee will forgive me, slightly bizarrely—Amendment 283ZA, in the name of the noble Baroness, Lady Fox, to Clause 170(7) in Part 11 invites us to consider, among other things, whether a bot can act reasonably. Given that self-coding bots are now technically possible, I would be fascinated to hear the Minister’s view on whether an autonomous bot can have reason. I asked ChatGPT this question. For the help of the Committee, it replied:

“As an artificial intelligence language model, I do not have consciousness or emotions, and therefore, I do not have a capacity for ‘reason’ in the way that humans understand it. However, I am programmed to use advanced algorithms that allow me to process and analyze large amounts of data, recognize patterns, and provide reasoned responses based on the information available to me. This allows me to simulate a form of reasoning, but it is important to note that it is not the same as human reasoning, as I do not have subjective experiences or personal biases. Ultimately, my abilities are limited to the algorithms and data that have been programmed into my system, and I cannot generate my own subjective experiences or judgments.”


That is the view of the algorithm as to whether or not bots can have reason. I look forward to the Minister’s response.

Lord Parkinson of Whitley Bay (Con)

My Lords, the Government recognise the objectives of the amendments in this group: to strengthen protections for adults online. I hope noble Lords will agree that the Bill will indeed significantly improve the safety of all adult users, particularly those who are more vulnerable.

The user empowerment content features will not be the only measures in the Bill that will protect adults. They will act as a final layer of protection, coming after the duties on illegal content and the requirement on category 1 providers to uphold their terms of service. However, as the Clause 12 duties apply to legal content, we need to tread carefully and not inadvertently restrict free expression.

Amendments 34 and 35 in the name of my noble friend Lady Morgan of Cotes and Amendments 36 and 37 in the name of the noble Lord, Lord Clement-Jones, seek to require category 1 services to have their user empowerment content features in operation by default for adult users. The Government share concerns about users who experience disproportionate levels of abuse online or those who are more susceptible to suicide, self-harm or eating disorder content, but these amendments encroach on users’ rights in two ways.

First, the amendments intend to make the decision on behalf of users about whether to have these features turned on. That is aimed especially at those who might not otherwise choose to use those features. The Government do not consider it appropriate to take that choice away from adults, who must be allowed to decide for themselves what legal content they see online. That debate was distilled in the exchange just now between the noble Lord, Lord Knight, and the noble Baroness, Lady Fox, when the noble Lord said he would err on the side of caution, even overcaution, while he characterised the other side as a free-for-all. I might say that it was erring on the side of freedom. That is the debate that we are having, and should have, when looking at these parts of the Bill.

Secondly, the amendments would amount to a government requirement to limit adults’ access to legal content. That presents real concerns about freedom of expression, which the Government cannot accept.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

We will come in a moment to the provisions that are in the Bill to make sure that decisions can be taken by adults, including vulnerable adults, easily and clearly. If the noble Lord will allow, I will cover that point.

I was in the middle of reminding noble Lords that there are a range of measures that providers can put in place under these duties, some of which might have an impact on a user’s experience if they were required to be switched on by default. That may include, for example, restricting a user’s news feed to content from connected users, adding to the echo chamber and silos of social media, which I know many noble Lords would join me in decrying. We think it is right that that decision is for individual users to make.

The Bill sets out that the user empowerment content tools must be offered to all adult users and must be easy to access—to go to the point raised just now as well as by my noble friend Lady Harding and the noble Baroness, Lady Burt, and, as noble Lords were right to remind us, pushed by the noble Baroness, Lady Campbell of Surbiton, with whom I am pleased to say I have been able to have discussions separately from this Committee.

Providers will also be required to have clear and accessible terms of service about what tools are offered on their service and how users might take advantage of them. Ofcom will be able to require category 1 services to report on user empowerment tools in use through transparency reports. Ofcom is also bound by the Communications Act 2003 and the public sector equality duty, so it will need to take into account the ways that people with certain characteristics, including people with disabilities, may be affected when performing its duties, such as writing the codes of practice for the user empowerment duties.

Lord Knight of Weymouth (Lab)

I think the Minister is trying to answer the point raised by my noble friend about vulnerable adults. I am interested in the extent to which he is relying on the Equality Act duty on Ofcom then to impact the behaviour of the platforms that it is regulating in respect of how they are protecting vulnerable adults. My understanding is that the Equality Act duty will apply not to the platforms but only to Ofcom in the way that it regulates them. I am unclear how that is going to provide the protection that we want.

Lord Parkinson of Whitley Bay (Con)

That is right. Platforms are not in the public sector, so the public sector equality duty does not apply to them. However, that duty applies to Ofcom, taking into account the ways in which people with certain characteristics can be affected through the codes of practice and the user empowerment duties that it is enforcing. So it suffuses the thinking there, but the duty is on Ofcom as a public sector body.

We talk later in Clause 12(11) of some of the characteristics that are similar in approach to the protected characteristics in the Equality Act 2010. I will come to that again shortly in response to points made by noble Lords.

I want to say a bit about the idea of there being a cliff edge at the age of 18. This was raised by a number of noble Lords, including the noble Lord, Lord Griffiths, my noble friends Lady Morgan and Lady Harding and the noble Baroness, Lady Kidron. The Bill’s protections recognise that, in law, people become adults when they turn 18—but it is not right to say that there are no protections for young adults. As noble Lords know, the Bill will provide a triple shield of protection, of which the user empowerment duties are the final element.

The Bill already protects young adults from illegal content and content that is prohibited in terms and conditions. As we discussed in the last group, platforms have strong commercial incentives to prohibit content that the majority of their users do not want to see. Our terms of service duties will make sure that they are transparent about and accountable for how they treat this type of content.

--- Later in debate ---
Lord Clement-Jones (LD)

There is an element of circularity to what the Minister is saying. This is precisely why we are arguing for the default option. It allows this vulnerability to be taken account of.

Lord Knight of Weymouth (Lab)

Perhaps it would help if the Minister wanted to just set out the difference for us. Clearly, this Committee has spent some time debating the protection for children, which has a higher bar than protection for adults. It is not possible to argue that there will be no difference at the age of 18, however effective the first two elements of the triple shield are. Maybe the Minister needs to think about coming at it from the point of view of a child becoming an adult, and talk us through what the difference will be.

Lord Parkinson of Whitley Bay (Con)

Once somebody becomes an adult in law at the age of 18, they are protected through the triple shield in the Bill. The user empowerment duties are one element of this, along with the illegal content duties and the protection against content prohibited in terms and conditions and the redress through Ofcom.

The legislation delivers protection for adults in a way that preserves their choice. That is important. At the age of 18, you can choose to go into a bookshop and to encounter this content online if you want. It is not right for the Government to make decisions on behalf of adults about the legal content that they see. The Bill does not set a definition of a vulnerable adult because this would risk treating particular adults differently, or unfairly restricting their access to legal content or their ability to express themselves. There is no established basis on which to do that in relation to vulnerability.

Finally, we remain committed to introducing a new criminal offence to capture communications that intentionally encourage or assist serious self-harm, including eating disorders. This will provide another layer of protection on top of the regulatory framework for both adults and children.

Lord Knight of Weymouth (Lab)

I understand all of that—I think—but that is not the regime being applied to children. It is really clear that children have a safer, better experience. The difference between those experiences suddenly happening on an 18th birthday is what we are concerned about.

Lord Clement-Jones (LD)

Before the Minister stands up—a new phrase—can he confirm that it is perfectly valid to have a choice to lift the user empowerment tool, just as it is to impose it? Choice would still be there if our amendments were accepted.

Lord Parkinson of Whitley Bay (Con)

It would be, but we fear the chilling effect of having the choice imposed on people. As the noble Baroness, Lady Fox, rightly put it, one does not know what one has not encountered until one has engaged with the idea. At the age of 18, people are given the choice to decide what they encounter online. They are given the tools to ensure that they do not encounter it if they do not wish to do so. As the noble Lord has heard me say many times, the strongest protections in the Bill are for children. We have been very clear that the Bill has extra protections for people under the age of 18, and it preserves choice and freedom of expression online for adult users—young and old adults.

My noble friend Lady Buscombe asked about the list in Clause 12(11). We will keep it under constant review and may consider updating it should compelling evidence emerge. As the list covers content that is legal and designed for adults, it is right that it should be updated by primary legislation after a period of parliamentary scrutiny.

Amendments 42 and 38A, tabled by the noble Lords, Lord Stevenson of Balmacara and Lord Knight of Weymouth, respectively, seek to change the scope of user empowerment content features. Amendment 38A seeks to expand the user empowerment content features to include the restriction of content the provenance of which cannot be authenticated. Amendment 42 would apply features to content that is abusive on the basis of characteristics protected under the Equality Act 2010.

The user empowerment content list reflects areas where there is the greatest need for users to be offered choice about reducing their exposure to types of content. While I am sympathetic to the intention behind the amendments, I fear they risk unintended consequences for users’ rights online. The Government’s approach recognises the importance of having clear, enforceable and technically feasible duties that do not infringe users’ rights to free expression. These amendments risk undermining this. For instance, Amendment 38A would require the authentication of the provenance of every piece of content present on a service. This could have severe implications for freedom of expression, given its all-encompassing scope. Faced with that burden, companies may choose not to host anything at all.

Lord Knight of Weymouth (Lab)

I will try to help the Minister. If the amendment has been poorly drafted, I apologise. It does not seek to require a platform to check the provenance of every piece of content, but content that is certified as having good provenance would have priority for me to be able to see it. In the Bill, I can see or not see verified users. In the same way, I could choose to see or not see verified content.

Lord Parkinson of Whitley Bay (Con)

Thank you. I may be reading the noble Lord’s Amendment 38A excessively critically. I will look at it again. To try to reassure the noble Lord, the Bill already ensures that all services take steps to remove illegal manufactured or manipulated content when they become aware of it. Harmful and illegal misinformation and disinformation is covered in that way.

Amendment 42 would require providers to try to establish on a large scale what is a genuinely held belief that is more than an opinion. In response, I fear that providers would excessively apply the user empowerment features to manage that burden.

A number of noble Lords referred to the discrepancy between the list—

Online Safety Bill

Debate between Lord Parkinson of Whitley Bay and Lord Knight of Weymouth
Lord Knight of Weymouth (Lab)

Yes; the noble Baroness is right. She has pointed out in other discussions I have been party to that, for example, gaming technology that looks at the movement of the player can quite accurately work out from their musculoskeletal behaviour, I assume, the age of the gamer. So there are alternative methods. Our challenge is to ensure that if they are to be used, we will get the equivalent of age verification or better. I now hand over to the Minister.

The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I think those last two comments were what are known in court as leading questions.

As the noble Baroness, Lady Ritchie of Downpatrick, said herself, some of the ground covered in this short debate was covered in previous groups, and I am conscious that we have a later grouping where we will cover it again, including some of the points that were made just now. I therefore hope that noble Lords will understand if I restrict myself at this point to Amendments 29, 83 and 103, tabled by the noble Baroness, Lady Ritchie.

These amendments seek to mandate age verification for pornographic content on a user-to-user or search service, regardless of the size and capacity of a service provider. The amendments also seek to remove the requirement on Ofcom to have regard to proportionality and technical feasibility when setting out measures for providers on pornographic content in codes of practice. While keeping children safe online is the top priority for the Online Safety Bill, the principle of proportionate, risk-based regulation is also fundamental to the Bill’s framework. It is the Government’s considered opinion that the Bill as drafted already strikes the correct balance between these two.

The provisions in the Bill on proportionality are important to ensure that the requirements in the child-safety duties are tailored to the size and capacity of providers. It is also essential that measures in codes of practice are technically feasible. This will ensure that the regulatory framework as a whole is workable for service providers and enforceable by Ofcom. I reassure your Lordships that the smaller providers or providers with less capacity are still required to meet the child safety duties where their services pose a risk to children. They will need to put in place sufficiently stringent systems and processes that reflect the level of risk on their services, and will need to make sure that these systems and processes achieve the required outcomes of the child safety duty. Wherever in the Bill they are regulated, companies will need to take steps to ensure that they cannot offer pornographic content online to those who should not see it. Ofcom will set out in its code of practice the steps that companies in the scope of Part 3 can take to comply with their duties under the Bill, and will take a robust approach to sites that pose the greatest risk of harm to children, including sites hosting online pornography.

The passage of the Bill should be taken as a clear message to providers that they need to begin preparing for regulation now—indeed, many are. Responsible providers should already be factoring in regulatory compliance as part of their business costs. Ofcom will continue to work with providers to ensure that the transition to the new regulatory framework will be as smooth as possible.

The Government expect companies to use age-verification technologies to prevent children accessing services that pose the highest risk of harm to children, such as online pornography. The Bill will not mandate that companies use specific technologies to comply with new duties because, as noble Lords have heard me say before, what is most effective in preventing children accessing pornography today might not be equally effective in future. In addition, age verification might not always be the most appropriate or effective approach for user-to-user companies to comply with their duties. For instance, if a user-to-user service, such as a social media service, does not allow pornography under its terms of service, measures such as strengthening content moderation and user reporting would be more appropriate and effective for protecting children than age verification. This would allow content to be better detected and taken down, instead of restricting children from seeing content which is not allowed on the service in the first place. Companies may also use another approach if it is proportionate to the findings of the child safety risk assessment and a provider’s size and capacity. This is an important element to ensure that the regulatory framework remains risk-based and proportionate.

In addition, the amendments in the name of the noble Baroness, Lady Ritchie, risk inadvertently shutting children out of large swathes of the internet that are entirely appropriate for them to access. This is because it is impossible totally to eliminate the risk that a single piece of pornography or pornographic material might momentarily appear on a site, even if that site prohibits it and has effective systems in place to prevent it appearing. Her amendments would have the effect of essentially requiring every service to block children through the use of age verification.

Those are the reasons why the amendments before us are not ones that we can accept. Mindful of the fact that we will return to these issues in a future group, I invite the noble Baroness to withdraw her amendment.

Online Safety Bill

Debate between Lord Parkinson of Whitley Bay and Lord Knight of Weymouth
Lord Knight of Weymouth (Lab)

I am grateful, as ever, to the noble Baroness, and I hope that has assisted the noble Lord, Lord Vaizey.

Finally—just about—I will speak to Amendment 32A, tabled in my name, about VPNs. I was grateful to the noble Baroness for her comments. In many ways, I wanted to give the Minister the opportunity to put something on the record. My understanding, which he can confirm, is that the duties on platforms to be safe apply regardless of whether a VPN has been used to access their systems and content. One would suppose that the platforms, the publishers of content that are user-to-user businesses, will have to detect whether a VPN is being used in order to ensure that children are being protected and that a user is genuinely a child. Is that a correct interpretation of how the Bill works? If so, is it technically realistic for those platforms to detect whether someone is landing on their site via a VPN or otherwise? To my mind, the anecdote that the noble Baroness, Lady Harding, related about Apple’s App Store algorithm pushing VPNs to those searching for porn reinforces the need for app stores to come within scope, so that we can get age filtering at that distribution point rather than relying only on the platforms.

Substantially, this group is about platforms anticipating harms, not reviewing them and then fixing them despite their business model. If we can get the platforms themselves designing for children’s safety and then working out how to make the business models work, rather than the other way around, we will have a much better place for children.

The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I join in the chorus of good wishes to the bungee-jumping birthday Baroness, Lady Kidron. I know she will not have thought twice about joining us today in Committee for scrutiny of the Bill, which is testament to her dedication to the cause of the Bill and, more broadly, to protecting children online. The noble Lord, Lord Clement-Jones, is right to note that we have already had a few birthdays along the way; I hope that we get only one birthday each before the Bill is finished.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

Very good—only one each, and hopefully fewer. I thank noble Lords for the points they raised in the debate on these amendments. I understand the concerns raised about how the design and operation of services can contribute to risk and harm online.

The noble Lord, Lord Russell, was right, when opening this debate, that companies are very successful indeed at devising and designing products and services that people want to use repeatedly, and I hope to reassure all noble Lords that the illegal and child safety duties in the Bill extend to how regulated services design and operate their services. Providers with services that are likely to be accessed by children will need to provide age-appropriate protections for children using their service. That includes protecting children from harmful content and activity on their service. It also includes reviewing children’s use of higher-risk features, such as live streaming or private messaging. Service providers are also specifically required to consider the design of functionalities, algorithms and other features when delivering the child safety duties imposed by the Bill.

I turn first to Amendments 23 and 76 in the name of the noble Lord, Lord Russell. These would require providers to eliminate the risk of harm to children identified in the service’s most recent children’s risk assessment, in addition to mitigating and managing those risks. The Bill will deliver robust and effective protections for children, but requiring providers to eliminate the risk of harm to children would place an unworkable duty on providers. As the noble Baroness, Lady Fox, my noble friend Lord Moylan and others have noted, it is not possible to eliminate all risk of harm to children online, just as it is not possible entirely to eliminate risk from, say, car travel, bungee jumping or playing sports. Such a duty could lead to service providers taking disproportionate measures to comply; for instance, as noble Lords raised, restricting children’s access to content that is entirely appropriate for them to see.

Lord Knight of Weymouth (Lab)

Does the Minister accept that that is not exactly what we were saying? We were not saying that they would have to eliminate all risk: they would have to design to eliminate risks, but we accept that other risks will apply.

Lord Parkinson of Whitley Bay (Con)

This is part of the philosophical rumination we have had, but the point is that elimination is not possible, whether through design or through any drafting of legislation. I will come on to say a bit more about how we seek to minimise, mitigate and manage risk, which is the focus.

Amendments 24, 31, 32, 77, 84, 85 and 295, from the noble Lord, Lord Russell, seek to ensure that providers do not focus just on content when fulfilling their duties to mitigate the impact of harm to children. The Bill already delivers on those objectives. As the noble Baroness, Lady Kidron, noted, it defines “content” very broadly in Clause 207 as

“anything communicated by means of an internet service”.

Under this definition, in essence, all communication and activity is facilitated by content.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

I cannot give a firm timescale today but I will seek what further information I can provide in writing. I have not seen it yet, but I know that the work continues.

Amendments 28 and 82, in the name of the noble Lord, Lord Russell, seek to remove the size and capacity of a service provider as a relevant factor when determining what is proportionate for services in meeting their child safety duties. This provision is important to ensure that the requirements in the child safety duties are appropriately tailored to the size of the provider. The Bill regulates a large number of service providers, which range from some of the biggest companies in the world to small voluntary organisations. This provision recognises that what it is proportionate to require of providers at either end of that scale will be different.

Removing this provision would risk setting a lowest common denominator. For instance, a large multinational company could argue that it is required only to take the same steps to comply as a smaller provider.

Amendment 32A from the noble Lord, Lord Knight of Weymouth, would require services to have regard to the potential use of virtual private networks and similar tools to circumvent age-restriction measures. He raised the use of VPNs earlier in this Committee when we considered privacy and encryption. As outlined then, service providers are already required to think about how safety measures could be circumvented and take steps to prevent that. This is set out clearly in the children’s risk assessment and safety duties. Under the duty at Clause 10(6)(f), all services must consider the different ways in which the service is used and the impact of such use on the level of risk. The use of VPNs is one factor that could affect risk levels. Service providers must ensure that they are effectively mitigating and managing risks that they identify, as set out in Clause 11(2). The noble Lord is correct in his interpretation of the Bill vis-à-vis VPNs.

Lord Knight of Weymouth (Lab)

Is this technically possible?

Lord Parkinson of Whitley Bay (Con)

Technical possibility is a matter for the sector—

Lord Knight of Weymouth (Lab)

I am grateful to the noble Lord for engaging in dialogue while I am in a sedentary position, but I had better stand up. It is relevant to this Committee whether it is technically possible for providers to fulfil the duties we are setting out for them in statute, in respect of people’s ability to use workarounds and evade the regulatory system. At some point, could he give us the department’s view on whether there are currently systems that could be used—we would not expect them to be prescribed—by platforms to fulfil the duties if people are using their services via a VPN?

Lord Parkinson of Whitley Bay (Con)

This is the trouble with looking at legislation that is technologically neutral and future-proofed and has to envisage risks and solutions changing in years to come. We want to impose duties that can technically be met, of course, but this is primarily a point for companies in the sector. We are happy to engage and provide further information, but it is inherently part of the challenge of identifying evolving risks.

The provision in Clause 11(16) addresses the noble Lord’s concerns about the use of VPNs in circumventing age-assurance or age-verification measures. For it to apply, providers would need to ensure that the measures they put in place are effective and that children cannot normally access their services. They would need to consider things such as how the use of VPNs affects the efficacy of age-assurance and age-verification measures. If children were routinely using VPNs to access their service, they would not be able to conclude that Clause 11(16) applies. I hope that sets out how this is covered in the Bill.

Amendments 65, 65ZA, 65AA, 89, 90, 90B, 96A, 106A, 106B, 107A, 114A, 122, 122ZA, 122ZB and 122ZC from the noble Lord, Lord Russell of Liverpool, seek to make the measures Ofcom sets out in codes of practice mandatory for all services. I should make it clear at the outset that companies must comply with the duties in the Bill. They are not optional and it is not a non-statutory regime; the duties are robust and binding. It is important that the binding legal duties on companies are decided by Parliament and set out in legislation, rather than delegated to a regulator.

Codes of practice provide clarity on how to comply with statutory duties, but should not supersede or replace them. This is true of codes in other areas, including the age-appropriate design code, which is not directly enforceable. Following up on the point from my noble friend Lady Harding of Winscombe, neither the age-appropriate design code nor the SEND code is directly enforceable. The Information Commissioner’s Office or bodies listed in the Children and Families Act must take the respective codes into account when considering whether a service has complied with its obligations as set out in law.

As with these codes, what will be directly enforceable in this Bill are the statutory duties by which all sites in scope of the legislation will need to abide. We have made it clear in the Bill that compliance with the codes will be taken as compliance with the duties. This will help small companies in particular. We must also recognise the diversity and innovative nature of this sector. Requiring compliance with prescriptive steps rather than outcomes may mean that companies do not use the most effective or efficient methods to protect children.

I reassure noble Lords that, if companies decide to take a different route to compliance, they will be required to document what their own measures are and how they amount to compliance. This will ensure that Ofcom has oversight of how companies comply with their duties. If the alternative steps that providers have taken are insufficient, they could face enforcement action. We expect Ofcom to take a particularly robust approach to companies which fail to protect their child users.

My noble friend Lord Vaizey touched on the age-appropriate design code in his remarks—

--- Later in debate ---
Lord Knight of Weymouth (Lab)

My Lords, we too support the spirit of these amendments very much and pay tribute to the noble Lord, Lord Russell, for tabling them.

In many ways, I do not need to say very much. I think the noble Baroness, Lady Kidron, made a really powerful case, alongside the way the group was introduced, on the importance of these issues. We do want the positivity that the noble Baroness, Lady Harding, talked about in respect of the potential and opportunity of technology for young people. We want them to have the right to freedom of expression, privacy and reliable information, and to be protected from exploitation by the media. Those happen to be direct quotes from the UN Convention on the Rights of the Child, as some of the rights they would enjoy. Amendments 30 and 105, which the noble Lord, Lord Clement-Jones, tabled—I attached my name to Amendment 30—are very much in that spirit of trying to promote well-being and to say that there is something positive that we want to see here.

In particular, I would like to see that in respect of Ofcom. Amendment 187 is, in some ways, the more significant amendment and the one I most want the Minister to reflect on. That is the one that applies to Ofcom: that it should have reference to the UN Convention on the Rights of the Child. I think even the noble Lord, Lord Weir, could possibly agree. I understand his thoughtful comments around whether or not it is right to encumber business with adherence to the UN convention, but Ofcom is a public body in how it carries out its duties as a regulator. There are choices for regulation. Regulation can just be about minimum standards, but it can also be about promoting something better. What we are seeking here in trying to have reference to the UN convention is for Ofcom to regulate for something more positive and better, as well as police minimum standards. On that basis, we support the amendments.

Lord Parkinson of Whitley Bay (Con)

My Lords, I will start in the optimistic spirit of the debate we have just had. There are many benefits to young people from the internet: social, educational and many other ways that noble Lords have mentioned today. That is why the Government’s top priority for this legislation has always been to protect children and to ensure that they can enjoy those benefits by going online safely.

Once again, I find myself sympathetic to these amendments, but in a position of seeking to reassure your Lordships that the Bill already delivers on their objectives. Amendments 25, 78, 187 and 196 seek to add references to the United Nations Convention on the Rights of the Child and general comment 25 on children’s rights in relation to the digital environment to the duties on providers and Ofcom in the Bill.

As I have said many times before, children’s rights are at the heart of this legislation, even if the phrase itself is not mentioned in terms. The Bill already reflects the principles of the UN convention and the general comment. Clause 207, for instance, is clear that a “child” means a person under the age of 18, which is in line with the convention. All providers in scope of the Bill need to take robust steps to protect users, including children, from illegal content or activity on their services and to protect children from content which is harmful to them. They will need to ensure that children have a safe, age-appropriate experience on services designed for them.

Both Ofcom and service providers will also have duties in relation to users’ rights to freedom of expression and privacy. The safety objectives will require Ofcom to ensure that services protect children to a higher standard than adults, while also making sure that these services account for the different needs of children at different ages, among other things. Ofcom must also consult bodies with expertise in equality and human rights, including those representing the interests of children, for instance the Children’s Commissioner. While the Government fully support the UN convention and its continued implementation in the UK, it would not be appropriate to place obligations on regulated services to uphold an international treaty between state parties. We agree with the reservations expressed by the noble Lord, Lord Weir of Ballyholme, in his speech, and by his noble friend Lady Foster.

The convention’s implementation is a matter for the Government, not for private businesses or voluntary organisations. Similarly, the general comment acts as guidance for state parties and it would not be appropriate to refer to that in relation to private entities. The general comment is not binding and it is for individual states to determine how to implement the convention. I hope that the noble Lord, Lord Russell, will feel reassured that children’s rights are baked into the Bill in more ways than a first glance may suggest, and that he will be content to withdraw his amendment.

The noble Lord, Lord Clement-Jones, in his Amendments 30 and 105, seeks to require platforms and Ofcom to consider a service’s benefits to children’s rights and well-being when considering what is proportionate to fulfil the child safety duties of the Bill. They also add children’s rights and well-being to the online safety objectives for user-to-user services. The Bill as drafted is focused on reducing the risk of harm to children precisely so that they can better enjoy the many benefits of being online. It already requires companies to take a risk-based and proportionate approach to delivering the child safety duties. Providers will need to address only content that poses a risk of harm to children, not that which is beneficial or neutral. The Bill does not require providers to exclude children or restrict access to content or services that may be beneficial for them.

Children’s rights and well-being are already a central feature of the existing safety objectives for user-to-user services in Schedule 4 to the Bill. These require Ofcom to ensure that services protect children to a higher standard than adults, while making sure that these services account for the different needs of children at different ages, among other things. On this basis, while I am sympathetic to the aims of the amendments the noble Lord has brought forward, I respectfully say that I do not think they are needed.

More pertinently, Amendment 30 could have unintended consequences. By introducing a broad balancing exercise between the harms and benefits that children may experience online, it would make it more difficult for Ofcom to follow up instances of non-compliance. For example, service providers could take less effective safety measures to protect children, arguing that, as their service is broadly beneficial to children’s well-being or rights, the extent to which they need to protect children from harm is reduced. This could mean that children are more exposed to more harmful content, which would reduce the benefits of going online. I hope that this reassures the noble Lord, Lord Russell, of the work the Bill does in the areas he has highlighted, and that it explains why I cannot accept his amendments. I invite him to withdraw Amendment 25.

Online Safety Bill

Debate between Lord Parkinson of Whitley Bay and Lord Knight of Weymouth
Baroness Harding of Winscombe (Con)

Where are the commercial harms? I cannot totally get my head around my noble friend’s definition of content. I can sort of understand how it extends to conduct and contact, but it does not sound as though it could extend to the algorithm itself that is driving the addictive behaviour that most of us are most worried about.

Lord Knight of Weymouth (Lab)

In that vein, will the noble Lord clarify whether that definition of content includes paid-for content?

Lord Parkinson of Whitley Bay (Con)

I was about to list the four Cs briefly in order, which will bring me on to commercial or contract risk. Perhaps I may do that and return to those points.

I know that there have been concerns about whether the specific risks highlighted in the new schedule will be addressed by the Bill. In terms of the four Cs category of content risks, there are specific duties for providers to protect children from illegal content, such as content that intentionally assists suicide, as well as content that is harmful to children, such as pornography. Regarding conduct risks, the child safety duties cover harmful conduct or activity such as online bullying or abuse and, under the illegal content safety duties, offences relating to harassment, stalking and inciting violence.

With regard to commercial or contract risks, providers specifically have to assess the risks to children from the design and operation of their service, including their business model and governance under the illegal content and child safety duties. In relation to contact risks, as part of the child safety risk assessment, providers will need specifically to assess contact risks of functionalities that enable adults to search for and contact other users, including children, in a way that was set out by my noble friend Lord Bethell. This will protect children from harms such as harassment and abuse, and, under the illegal content safety duties, all forms of child sexual exploitation and abuse, including grooming.

Online Safety Bill

Debate between Lord Parkinson of Whitley Bay and Lord Knight of Weymouth
Lord Parkinson of Whitley Bay (Con)

My Lords, this has indeed been a very good debate on a large group of amendments. We have benefited from two former Ministers, the noble Lord, Lord McNally, and my noble friend Lord Kamall. I hope it is some solace to my noble friend that, such a hard act is he to follow, his role has been taken on by two of us on the Front Bench—myself at DCMS and my noble friend Lord Camrose at the new Department for Science, Innovation and Technology.

The amendments in this group are concerned with the protection of user privacy under the Bill and the maintenance of end-to-end encryption. As noble Lords have noted, there has been some recent coverage of this policy in the media. That reporting has not always been accurate, and I take this opportunity to set the record straight in a number of areas and seek to provide the clarity which the noble Lord, Lord Stevenson of Balmacara, asked for just now.

Encryption plays a crucial role in the digital realm, and the UK supports its responsible use. The Bill does not ban any service design, nor will it require services materially to weaken any design. The Bill contains strong safeguards for privacy. Broadly, its safety duties require platforms to use proportionate systems and processes to mitigate the risks to users resulting from illegal content and content that is harmful to children. In doing so, platforms must consider and implement safeguards for privacy, including ensuring that they are complying with their legal responsibilities under data protection law.

With regard to private messaging, Ofcom will set out how companies can comply with their duties in a way that recognises the importance of protecting users’ privacy. Importantly, the Bill is clear that Ofcom cannot require companies to use proactive technology, such as automated scanning, on private communications in order to comply with their safety duties.

In addition to these cross-cutting protections, there are further safeguards concerning Ofcom’s ability to require the use of proactive technology, such as content identification technology on public channels. That is in Clause 124(6) of the Bill. Ofcom must consider a number of matters, including the impact on privacy and whether less intrusive measures would have the equivalent effect, before it can require a proactive technology.

The implementation of end-to-end encryption in a way that intentionally blinds companies to criminal activity on their services, however, has a disastrous effect on child safety. The National Center for Missing & Exploited Children in the United States of America estimates that more than half its reports could be lost if end-to-end encryption were implemented without preserving the ability to tackle child sexual abuse—a conundrum with which noble Lords grappled today. That is why our new regulatory framework must encourage technology companies to ensure that their safety measures keep pace with this evolving and pernicious threat, including minimising the risk that criminals are able to use end-to-end encrypted services to facilitate child sexual abuse and exploitation.

Given the serious risk of harm to children, the regulator must have appropriate powers to compel companies to take the most effective action to tackle such illegal and reprehensible content and activity on their services, including in private communications, subject to stringent legal safeguards. Under Clause 110, Ofcom will have a stand-alone power to require a provider to use, or make best endeavours to develop, accredited technology to tackle child sexual exploitation and abuse, whether communicated publicly or privately, by issuing a notice. Ofcom will use this power as a last resort only when all other measures have proven insufficient adequately to address the risk. The only other type of harm for which Ofcom can use this power is terrorist content, and only on public communications.

The use of the power in Clause 110 is subject to additional robust safeguards to ensure appropriate protection of users’ rights online. Ofcom will be able to require the use of technology accredited as being highly accurate only in specifically detecting illegal child sexual exploitation and abuse content, ensuring a minimal risk that legal content is wrongly identified. In addition, under Clause 112, Ofcom must consider a number of matters, including privacy and whether less intrusive means would have the same effect, before deciding whether it is necessary and proportionate to issue a notice.

The Bill also includes vital procedural safeguards in relation to Ofcom’s use of the power. If Ofcom concludes that issuing a notice is necessary and proportionate, it will need to publish a warning notice to give the company an opportunity to make representations as to why the notice should not be issued or why the detail contained in it should be amended. In addition, the final notice must set out details of the rights of appeal under Clause 149. Users will also be able to complain to and seek action from a provider if the use of a specific technology results in their content incorrectly being removed, or if they consider that a technology is being used in a way that is not envisaged in the terms of service. Some of the examples given by the noble Baroness, Lady Fox of Buckley, pertain in this instance.

The Bill also recognises that in some cases there will be no available technology compatible with the particular service design. As I set out, this power cannot be used by Ofcom to require a company to take any action that is not proportionate, including removing or materially weakening encryption. That is why the Bill now includes an additional provision for this scenario, to allow Ofcom to require technology companies to use their best endeavours to develop or find new solutions that work on their services while meeting the same high standards of accuracy and privacy protection. Given the ingenuity and resourcefulness of the sector, it is reasonable to ask it to do everything possible to protect children from abuse and exploitation. I echo the comments made by the noble Lord, Lord Allan, about the work being done across the sector to do that.

More broadly, the regulator must uphold the right to privacy under its Human Rights Act obligations when implementing the new regime. It must ensure that its actions interfere with privacy only where it is lawful, necessary and proportionate to do so. I hope that addresses the question posed by the noble Lord, Lord Stevenson. In addition, Ofcom will be required to consult the Information Commissioner’s Office when developing codes of practice and relevant pieces of guidance.

I turn now to Amendments 14—

Lord Knight of Weymouth (Lab)

Before the Minister does so, can he give a sense of what he means by “best endeavours” for those technology companies? If it is not going to be general monitoring of what is happening as the message moves from point to point—we have had some discussions about the impracticality and issues attached to monitoring at one end or the other—what, theoretically, could “best endeavours” possibly look like?

Online Safety Bill

Debate between Lord Parkinson of Whitley Bay and Lord Knight of Weymouth
Lord Parkinson of Whitley Bay (Con)

If I can finish my point, this will bring into scope services of the kind set out in the amendments, such as those designed or intended for use by children, or where children form a substantial and identifiable user group. The current condition also considers the nature and content of the service and whether it has a particular appeal for children. Ofcom will be required to consult the Information Commissioner’s Office on its guidance to providers on fulfilling this test, which will further support alignment between the Bill and the age-appropriate design code.

On the meaning of “significant”, a significant number of children means a significant number in itself or a significant proportion of the total number of UK-based users on the service. In the Bill, “significant” has its ordinary meaning, and there are many precedents for it in legislation. Ofcom will be required to produce and publish guidance for providers on how to make the children’s access assessment. Crucially, the test in the Bill provides more legal certainty and clarity for providers than the test outlined in the code. “Substantial” and “identifiable”, as suggested in this amendment, do not have such a clear legal meaning, so this amendment would give rise to the risk that the condition is more open to challenge from providers and more difficult to enforce. On the other hand, as I said, “significant” has an established precedent in legislation, making it easier for Ofcom, providers and the courts to interpret.

The noble Lord, Lord Knight, talked about the importance of future-proofing the Bill and emerging technologies. As he knows, the Bill has been designed to be technology neutral and future-proofed, to ensure that it keeps pace with emerging technologies. It will apply to companies which enable users to share content online or to interact with each other, as well as to search services. Search services using AI-powered features will be in scope of the search duties. The Bill is also clear that content generated by AI bots is in scope where it interacts with user-generated content, such as bots on Twitter. The metaverse is also in scope of the Bill. Any service which enables users to interact as the metaverse does will have to conduct a child access test and comply with the child safety duties if it is likely to be accessed by children.

Lord Knight of Weymouth (Lab)

I know it has been said that the large language models, such as that used by ChatGPT, will be in scope when they are embedded in search, but are they in scope generally?

Lord Parkinson of Whitley Bay (Con)

They are in scope where they enable users to share content online or to interact with each other, or where they form part of a search service. They are covered in the context of the other duties set out in the Bill.

Amendments 19, 22, 298 and 299, tabled by my noble friend Lady Harding of Winscombe, seek to impose child safety duties on application stores. I am grateful to my noble friend and others for the collaborative approach that they have shown and for the time that they have dedicated to discussing this issue since Second Reading. I appreciate that she has tabled these amendments in the spirit of facilitating a conversation, which I am willing to continue to have as the Bill progresses.

As my noble friend knows from our discussions, there are challenges with bringing application stores—or “app stores” as they are popularly called—into the scope of the Bill. Introducing new duties on such stores at this stage risks slowing the implementation of the existing child safety duties, in the way that I have just outlined. App stores operate differently from user-to-user and search services; they pose different levels of risk and play a different role in users’ experiences online. Ofcom would therefore need to recruit different people, or bring in new expertise, to supervise effectively a substantially different regime. That would take time and resources away from its existing priorities.

We do not think that that would be a worthwhile new route for Ofcom, given that placing child safety duties on app stores is unlikely to deliver any additional protections for children using services that are already in the scope of the Bill. Those services must already comply with their duties to keep children safe or will face enforcement action if they do not. If companies do not comply, Ofcom can rely on its existing enforcement powers to require app stores to remove applications that are harmful to children. I am happy to continue to discuss this matter with my noble friend and the noble Lord, Lord Knight, in the context of the differing implementation timelines, as he has asked.

Online Safety Bill

Debate between Lord Parkinson of Whitley Bay and Lord Knight of Weymouth
Monday 7th November 2022

Lords Chamber
Lord Parkinson of Whitley Bay (Con)

My noble friend is right to point to the noble Lord, Lord Grade of Yarmouth, as one of many voices in your Lordships’ House who will help us in the important scrutiny of this Bill. We are very keen for that to take place. Of course, the other place has to finish its scrutiny before this happens. Once it has done that, we can debate it here.

Lord Knight of Weymouth (Lab)

My Lords, business managers will be listening. I hope they will make sure that we are given sufficient time in this House to give proper scrutiny to a highly complex Bill.

If part of the compromise that may have been made in the department is to remove aspects of the Bill, particularly around “legal but harmful”, could the Minister also consider, and have conversations across government about, finding time in a subsequent legislative Session for us to finish the job if the Bill that he brings to this House does not do a proper job?

Lord Parkinson of Whitley Bay (Con)

Regarding future legislative Sessions, I will restrict myself to the debate on the current one. The noble Lord is right: the business managers will have heard how anxious your Lordships’ House is to see the Bill and begin its scrutiny. The decision will be communicated in the usual way.

Media Literacy

Debate between Lord Parkinson of Whitley Bay and Lord Knight of Weymouth
Monday 20th June 2022

Lords Chamber
Lord Parkinson of Whitley Bay (Con)

Digital literacy is a key priority in the computing national curriculum in England, which equips people with knowledge, understanding and skills to use the internet creatively and purposefully. Through citizenship education and other subjects, as I mentioned, we are making sure that schoolchildren are equipped with the skills that they need, and of course the companies themselves have a role to play in delivering and funding media literacy education. We welcome the steps that platforms have already taken, but we believe that they can go further to empower and educate their users.

Lord Knight of Weymouth (Lab)

My Lords, the 2003 media literacy duty on Ofcom that the Minister referred to predates social media and urgently needs updating. Carole Cadwalladr’s work has shown how online misinformation has potentially perverted our democracy. The Ofcom strategy is insufficient. Will the Minister agree to meet me and other members of the All-Party Parliamentary Group on Media Literacy in advance of the Online Safety Bill being introduced in this House to try to resolve this problem?

Lord Parkinson of Whitley Bay (Con)

I would be very happy to meet the noble Lord and other members ahead of the Online Safety Bill, during the passage of which I know we will debate this important area in greater detail. He is right that much has happened since the Communications Act 2003 was passed, but Ofcom’s own strategy, published in December last year, shows its up-to-date thinking and work in this important and evolving area.