Debates between Baroness Harding of Winscombe and Baroness Kidron during the 2019 Parliament

Wed 20th Mar 2024
Data Protection and Digital Information Bill
Grand Committee

Committee stage & Committee stage: Minutes of Proceedings
Mon 10th Jul 2023
Online Safety Bill
Lords Chamber

Report stage: Part 1
Thu 6th Jul 2023
Online Safety Bill
Lords Chamber

Report stage: Part 1 & Report stage: Minutes of Proceedings
Thu 22nd Jun 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 2
Tue 23rd May 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 1
Tue 2nd May 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 1
Thu 27th Apr 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 1
Tue 25th Apr 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 1

Data Protection and Digital Information Bill

Debate between Baroness Harding of Winscombe and Baroness Kidron
Baroness Kidron (CB)

My Lords, I will speak to Amendments 142, 143 and 150 in my name, and I thank other noble Lords for their support.

We have spent considerable time across the digital Bills—the online safety, digital markets and data Bills—talking about the speed at which industry moves and the corresponding need for a more agile regulatory system. Sadly, we have not really got to the root of what that might look like. In the meantime, we have to make sure that regulators and Governments are asked to fulfil their duties in a timely manner.

Amendment 142 puts a timeframe on the creation of codes under the Act at 18 months. Data protection is a mature area of regulatory oversight, and 18 months is a long time for people to wait for the benefits that accrue to them under legislation. Similarly, Amendment 143 ensures that the transition period from the code being set to it being implemented is no more than 12 months. Together, that creates a minimum of two and a half years. In future legislation on digital matters, I would like to see a very different approach that starts with the outcome and gives companies 12 months to comply, in any way they like, to ensure that outcome. But while we remain in the world of statutory code creation, it must be bound by a timeframe.

I have seen time and again that, after the passage of a Bill, Parliament and civil society move on, including Ministers and key officials—as well as those who work at the regulator—and codes lose their champions. It would be wonderful to imagine that matters progress as intended, but they do not. In the absence of champions, and without ongoing parliamentary scrutiny, codes can languish in the inboxes of people who have many calls on their time. Amendments 142 and 143 simply mirror what the Government agreed to in the OSA—it is a piece of good housekeeping to ensure continuity of attention.

I am conscious that I have spent most of my time highlighting areas where the Bill falls short, so I will take a moment to welcome the reporting provisions that the Government have put forward. Transparency is a critical aspect of effective oversight, and the introduction of an annual report on regulatory action would be a valuable source of information for all stakeholders with an interest in understanding the work of the ICO and its impact.

Amendment 150 proposes that those reporting obligations also include a requirement to provide details of all activities carried out by the Information Commissioner to support, strengthen and uphold the age-appropriate design code. It also proposes that, when meeting its general reporting obligations, it should provide the information separately for children. The ICO published an evaluation of the AADC as a one-off in March 2023 and its code strategy on 3 April this year. I recognise the effort that the commissioner has made towards transparency, and the timing of his report indicates that having reporting on children specifically is something that the ICO sees as relevant and useful. However, neither of those is sufficient in terms of the level of detail provided, the reporting cadence or the focus on impact rather than the efforts that the ICO has made.

There are many frustrations for those of us who spend our time advocating for children’s privacy and safety. Among them is having to try to extrapolate child-specific data from generalised reporting. When it is not reported separately, it is usually to hide inadequacies in the level of protection afforded to children. For example, none of the community guidelines enforcement reports published for Instagram, YouTube, TikTok or Snap provides a breakdown of the violation rate data by age group, even though this would provide valuable information for academics, Governments, legislators and NGOs. Amendment 150 would go some way to addressing this gap by ensuring that the ICO is required to break down its reporting for children.

Having been momentarily positive, I would like to put on the record my concerns about the following extract from the email that accompanied the ICO’s children’s code strategy of 2 April. Having set out the very major changes to companies that the code has ushered in and explained how the Information Commissioner would spend the next few months looking at default settings, geolocation, profiling, targeting children and protecting under-13s, the email goes on to say:

“With the ongoing passage of the bill, our strategy deliberately focusses in the near term on compliance with the current code. However, once we have more clarity on the final version of the bill we will of course look to publicly signal intentions about our work on implementation and children’s privacy into the rest of the year and beyond”.


The use of the phrase “current code”, and the fact that the ICO has decided it is necessary to put its long-term enforcement strategy on hold, contradict government assurances that standards will remain the same.

The email from the ICO arrived in my inbox on the same day as a report from the US Institute of Digital Media and Child Development, which was accompanied by an impact assessment on the UK’s age-appropriate design code. It stated:

“The Institute’s review identifies an unprecedented wave of … changes made across leading social media and digital platforms, including YouTube, TikTok, Snapchat, Instagram, Amazon Marketplace, and Google Search. The changes, aimed at fostering a safer, more secure, and age-appropriate online environment, underscore the crucial role of regulation in improving the digital landscape for children and teens”.


In June, the Digital Futures Commission will be publishing a similar report written by the ex-Deputy Information Commissioner, Steve Wood, which has similarly positive but much more detailed findings. Meanwhile, we hear the steady drumbeat of adoption of the code in South America, Australia and Asia, and in additional US states following California’s lead. Experts in both the US and here in the UK evidence that this is a regulation that works to make digital services safer and better for children.

I therefore have to ask the Minister once again why the Government are downgrading child protection. If he, or those in the Box advising him, are even slightly tempted to say that they are not, I ask that they reread the debates from the last two days in Committee, in which the Government removed the balancing test to automated decision-making and the Secretary of State’s powers were changed to have regard to children rather than to mandate child protections. The data impact assessment provisions have also been downgraded, among the other sleights of hand that diminish the AADC.

The ICO has gone on record to say that it has put its medium to long-term enforcement strategy on hold, and the Minister’s letter sent on the last day before recess says that the AADC will be updated to reflect the Bill. I would like nothing more than a proposal from the Government to put the AADC back on a firm footing. I echo the words said earlier by the noble Baroness, Lady Jones, that it is time to start talking and stop writing. I am afraid that, otherwise, I will be tabling amendments on Report that will test the appetite of the House for protecting children online. In the meantime, I hope the Minister will welcome and accept the very modest proposals in this group.

Baroness Harding of Winscombe (Con)

My Lords, as is so often the case on this subject, I support the noble Baroness, Lady Kidron, and the three amendments that I have added my name to: Amendments 142, 143 and 150. I will speak first to Amendments 142 and 143, and highlight a couple of issues that the noble Baroness, Lady Kidron, has already covered.

Data Protection and Digital Information Bill

Debate between Baroness Harding of Winscombe and Baroness Kidron
Baroness Kidron (CB)

My Lords, I speak to Amendments 8, 21, 23 and 145 in my name and thank the other noble Lords who have added their names to them. In the interests of brevity, and as the noble Lord, Lord Clement-Jones, has done some of the heavy lifting on this, I will talk first to Amendment 8.

The definition of scientific research has been expanded to include commercial and non-commercial activity, so far as it

“can reasonably be described as scientific”,

but “scientific” is not defined. As the noble Lord said, there is no public interest requirement, so a commercial company can, in reality, develop almost any kind of product on the basis that it may have a scientific purpose, even—or maybe especially—if it measures your propensity to impulse buy or other commercial things. The spectrum of scientific inquiry is almost infinite. Amendment 8 would exclude children simply by adding proposed new paragraph (e), which says that

“the data subject is not a child or could or should be known to be a child”,

so that their personal data cannot be used for scientific research purposes to which they have not given their consent.

I want to be clear that I am pro-research and understand the critical role that data plays in enabling us to understand societal challenges and innovate towards solutions. Indeed, I have signed the amendment in the name of the noble Lord, Lord Bethell, which would guarantee access to data for academic researchers working on matters of public interest. Some noble Lords may have been here last night, when the US Surgeon-General, Vice Admiral Dr Murthy, who gave the Lord Speaker’s lecture, made a fierce argument in favour of independent public interest research, not knowing that such a proposal had been laid. I hope that, when we come to group 17, the Government heed his wise words.

In the meantime, Clause 3 simply embeds the inequality of arms between academics and corporates and extends it, making it much easier for commercial companies to use personal data for research while academics continue to be held to much higher ethical and professional standards. They continue to require express consent, DBS checks and complex ethical requirements; for them, simply using personal data for research without such safeguards would be unethical. Commercial players, by contrast, can rely on Clause 3 to process data without consent, in pursuit of profit. Like the noble Lord, Lord Clement-Jones, I would prefer an overall solution to this but, in its absence, this amendment would protect data from being commoditised in this way.

Amendments 21 and 23 would specifically protect children from changes to Clause 6. I have spoken on this a little already, but I would like it on the record that I am absolutely in favour of a safeguarding exemption. The additional purposes, which are compatible with but go beyond the original purpose, are not a safeguarding measure. Amendment 21 would amend the list of factors that a data controller must take into account to include the fact that children are entitled to a higher standard of protection.

Amendment 23 would not be necessary if Amendment 22 were agreed. It would commit the Secretary of State to ensuring that, when exercising their power under new Article 8A, as inserted by Clause 6(5), to add, vary or omit provisions of Annex 2, they take the 2018 Act and children’s data protection into account.

Finally, Amendment 145 proposes a code of practice on the use of children’s data in scientific research. This code would, in contrast, ensure that all researchers, commercial or in the public interest, are held to the same high standards by developing detailed guidance on the use of children’s data for research purposes. A burning question for researchers is how to properly research children’s experience, particularly regarding the harms defined by the Online Safety Act.

Proposed new subsection (1) sets out the broad headings that the ICO must cover to promote good practice. Proposed new subsection (2) confirms that the ICO must have regard to children’s rights under the UNCRC, and that they are entitled to a higher standard of protection. It would also ensure that the ICO consulted with academics, those who represent the interests of children and data scientists. There is something of a theme here: if the changes to UK GDPR did not diminish data subjects’ privacy and rights, there would be no need for amendments in this group. If there were a code for independent public research, as is so sorely needed, the substance of Amendment 145 could usefully form a part of it. If commercial companies can extend scientific research that has no definition, and if the Bill expands the right to further processing and the Secretary of State can unilaterally change the basis for onward processing, can the Minister explain, when he responds, how he can claim that the Bill maintains protections for children?

Baroness Harding of Winscombe (Con)

My Lords, I will be brief because I associate myself with everything that the noble Baroness, Lady Kidron, just said. This is where the rubber hits the road from our previous group. If we all believe that it is important to maintain children’s protection, I hope that my noble friend the Minister will be able to accept if not the exact wording of the children-specific amendments in this group then the direction of travel—and I hope that he will commit to coming back and working with us to make sure that we can get wording into the Bill.

I am hugely in favour of research in the private sector as well as in universities and the public sector; we should not close our minds to that at all. We need to be realistic that all the meaningful research in AI is currently happening in the private sector, so I do not want to close that door at all, but I am extremely uncomfortable with a Secretary of State having the ability to amend access to personal data for children in this context. It is entirely sensible to have a defined code of conduct for the use of children’s data in research. We have real evidence that a code of conduct setting out how to protect children’s rights and data in this space works, so I do not understand why it would not be a good idea here, where we want the research to happen but we also want children’s rights to be protected to a much higher standard.

It seems to me that this group is self-evidently sensible, in particular Amendments 8, 22, 23 and 145. I put my name to all of them except Amendment 22 but, the more I look at the Bill, the more uncomfortable I get with it; I wish I had put my name to Amendment 22. We have discussed Secretary of State powers in each of the digital Bills that we have looked at and we know about the power that big tech has to lobby. It is not fair on Secretaries of State in future to have this ability to amend—it is extremely dangerous. I express my support for Amendment 22.

Digital Markets, Competition and Consumers Bill

Debate between Baroness Harding of Winscombe and Baroness Kidron
Baroness Kidron (CB)

The noble Lord, Lord Knight, has said so much of my speech that I will be very rapid. There are two points to make here. One is that regulatory co-operation is a theme in every digital Bill. We spent a long time on it during the passage of the Online Safety Act, we will do it again in the Data Protection and Digital Information Bill, and here it is again. As the noble Lord, Lord Knight, said, if the wording or the approach is not right, that does not matter, but any move to bring regulators together is a good thing.

The second point, which may come up again in amendments in a later group that looks at citizens, is that it is increasingly hard to understand what a user, a worker or a citizen is in this complicated digital system. As digital companies have both responsibilities and powers across these different themes, it is important, as I argued last week, to ensure that workers are not forgotten in this picture.

Baroness Harding of Winscombe (Con)

My Lords, it is with great trepidation that I rise to speak to these amendments because, I think for the first time in my brief parliamentary career, I am not completely ad idem with the noble Lord, Lord Knight, and the noble Baroness, Lady Kidron, on digital issues, where normally we work together. I hope they will forgive me for not having shared some of my concerns with them in advance.

I kicked myself for not saying this last week, so I am extremely grateful that they have brought the issue back this week for a second run round. My slight concern is that history is littered with countries trying to stop innovation, whether we go back to the Elizabethans trying to stop looms for hand knitters or to German boatmen sinking the first steamboat as it went down the Rhine. We must be very careful that in the Bill we do not encourage the CMA to act in such a way that it stops the rude competition that will drive the innovation that will lead to growth and technology. I do not for a moment think that the noble Lord or the noble Baroness think that, but we have to be very cautious about it.

We also learn from history that innovation does not affect or benefit everybody equally. As we go through this enormous technology transformation, it is important that as a society we support people who do not necessarily immediately benefit or who might be considerably worse off, but I do not think that responsibility should lie with the CMA. Last week, the noble Lord, Lord Knight, challenged with, “If not in this Bill, where?” and I feel similarly about this amendment. It is right that we want regulators to co-operate more, but it is important that our regulators have very clear accountabilities. Having been a member of the Court of the Bank of England for eight years in my past life, I hate the fact that there are so many things that the Bank of England must take note of in its responsibilities. We have to be very careful that we do not create a regime for the CMA whereby it has to take note of a whole set of issues that are really the broad responsibility of government. Where I come back into alignment with the noble Lord, Lord Knight, is that I think it is important that the Government address those issues, just probably not in this Bill.

--- Later in debate ---
Baroness Harding of Winscombe (Con)

My Lords, I promise I will speak briefly to associate myself with the remarks of my noble friend Lady Stowell and support her Amendment 77 and Amendment 76 in the name of the noble Viscount, Lord Colville.

Despite the fact that there are fewer of us here than there have been in the debates on some of the other quite contentious issues, this is an extremely important amendment and a really important principle that we need to change in the Bill. To be honest, I thought that the power granted to the Secretary of State here was so egregious that it had to have been inserted as part of a cunning concession strategy to distract us from some of the other more subtle increases in powers that were included in the other place. It is extremely dangerous, both politically and technocratically, to put an individual Secretary of State in this position. I challenge any serious parliamentarian or politician to want to put themselves in that place, as my noble friend Lady Stowell said.

On its own, granting the Secretary of State this power will expose them to an enormous amount of lobbying; it is absolutely a lobbyist’s charter. This is about transparency, as the noble Baroness, Lady Kidron, said, and parliamentary scrutiny, which we will come to properly in our debate on the next group of amendments. However, it is also about reducing the risk of lobbying from the world’s most powerful institutions that are not Governments.

For those reasons, I have a slight concern. In supporting Amendment 77, I do not want the Government or my noble friend the Minister to think that establishing parliamentary scrutiny while maintaining the Secretary of State’s powers would be a happy compromise. It would be absolutely the wrong place for us to be. We need to remove the Secretary of State’s powers over guidance and establish better parliamentary scrutiny.

--- Later in debate ---
Baroness Harding of Winscombe (Con)

My Lords, I will be brief. It is an honour to follow the noble Lord, Lord Fox, and his passionate exposé about the importance of interoperability while reminding us that we should be thinking globally, not just nationally. I did not come expecting to support his amendment but, as a result of that passion, I do.

I rise to support my noble friend Lady Stowell. She set out extremely clearly why stronger parliamentary oversight of the digital regulators is so important. I speak having observed this from all possible angles. I have been the chief executive of a regulated company, I have chaired a regulator, in the form of NHS Improvement, I have been on the board of a regulator, in the form of the Bank of England, and I am a member of my noble friend’s committee. I have genuinely seen this from all angles, and it is clear that we need to form a different approach in Parliament to recognise the enormous amounts of power we are passing to the different regulators. Almost all of us in Committee today talked about this when the Online Safety Bill was passing through our House, and it was clear then that we needed to look at this. We have given enormous power to Ofcom in the Online Safety Act; this Bill looks at the CMA and very soon, in this same Room, we will be looking at changing and increasing the powers of the ICO, and if we think that that is it, we have not even begun on what AI is going to do to touch a whole host of regulators. I add my voice to my noble friend’s and congratulate her on the process that she seems to be well advanced in, in gathering support not just in this House but in the other place.

I also express some support for Amendment 83. I am concerned that if we are not careful, the easiest way to ensure that the CMA is not bold enough is to not resource it properly. Unlike the passage of the Online Safety Act, where we got to see how far advanced Ofcom was in bringing in genuine experts from the technology and digital sector, it has not yet been so obvious as this Bill has progressed. That may be just because of the stage we are at, but I suspect it is also because the resourcing is not yet done in the CMA. Therefore, I ask the Minister for not so much an annual update as a current update on where the CMA is in resourcing and what support the Government are giving it to ensure it is able to meet a timetable that still looks painfully slow for this Bill.

Baroness Kidron (CB)

My Lords, I rise mainly to correct the record that I called the amendment in the name of the noble Baroness modest and also to celebrate the fact that I am once again back on the side of the noble Baroness, Lady Harding; it was very uncomfortable there for a moment.

I was on both committees that the noble Baroness, Lady Stowell, referred to. We took evidence, and it was clear from all sorts of stakeholders that they would like to see more parliamentary engagement in the new powers we are giving to regulators. They are very broad and sometimes novel powers. However, the point I want to make at this moment is about the sheer volume of what is coming out of regulators. I spent a great deal of my Christmas holiday reading the 1,500 pages of consultation material on illegal harms for the Online Safety Act, and that was only one of three open consultations. We need to understand that we cannot have sufficient oversight unless someone is properly given that job. I challenge the department and Secretary of State to have that level of oversight and interest in things that are already passed. So, the points that the noble Baroness made about resource and capacity are essential.

My other, very particular, point is on the DRCF. I went to a meeting—it was a private meeting, so I do not want to say too much, but fundamentally people were getting together and those attending were very happy with their arrangements. They were going to publish all sorts of things to let the world know how they, in their combination, saw various matters. I asked, “Is there an inbox?” They looked a little quizzical and said, “What do you mean?” I said, “Well, are you taking information in, as a group, as well as giving it out?” The answer was no, of course, because it is not a described space or something that has rules but is a collection of very right-minded people. But here in Committee, we make the point that we need good processes, not good people. So I passionately support this group of amendments.

I briefly turn to the amendment tabled by the noble Lord, Lord Fox, in which I have an unexpected interest in that I work with the IEEE, America’s largest standards organisation, and with CEN-CENELEC, which does standards for the European Union. I also have a seat on the Broadband Commission, which is the ITU’s institute that looks after the SDGs. Creating standards is, as a representative of Google once said to me, soft power. It is truly global, and as we create and move towards standards, there are often people in their pyjamas on the other side of the world contributing because people literally work in all time zones to the same effect. It is a truly consensual, global and important way forward. Everyone who has used the wifi today has used an IEEE standard.

Digital Markets, Competition and Consumers Bill

Debate between Baroness Harding of Winscombe and Baroness Kidron
Baroness Harding of Winscombe (Con)

My Lords, I shall also discuss the leveraging or whack-a-mole provisions. Perhaps Conservative Peers today are London buses: this is the fourth London bus to make the same point. I too would have added my name to my noble friend Lord Vaizey’s amendment had I been organised enough.

I shall make a couple of points. The noble Lord, Lord Tyrie, said earlier that we are all here on the Bill because harm has already been done. If noble Lords will forgive me, I will tell a little story. In 2012, I went on a customer trip to Mountain View, Google’s headquarters in California, as the chief executive of TalkTalk. We were in the early days of digital advertising and TalkTalk was one of its biggest customers. A whole group of customers went on what people now call a digital safari to visit California and see these tech companies in action.

I will never forget that the sales director left us for a bit for a demo from some engineers from Google’s head office in Mountain View, who demoed a new functionality they were working on to enable you to easily access price comparisons for flights. It was an interesting demo because some of the other big customers of Google search at the time were independent flight search websites, whose chief executives had been flown out by Google to see all the new innovation. The blood drained from their faces as this very well-meaning engineer described and demoed the new functionality and explained how, because Google controlled the page, it would be able to promote its flight search functionality to the top of the page and demote the companies represented in the room. When the sales director returned, it was, shall we say, quite interesting.

I tell that tale because there are many examples of these platforms leveraging the power of their platform to enter adjacent markets. As my noble friend has said, that gets to the core of the Bill and how important it is that the CMA is able to impose conduct requirements without needing to go through the whole SMS designation process all over again.

I know that the tech firms’ counterargument to this is that it is important that they have the freedom to innovate, and that for a number of them this would somehow create “a regulatory requirement to seek permission to innovate”. I want to counter that: we want all companies in this space to have the freedom to innovate, but they should not have the freedom to prioritise their innovation on their monopoly platform over other people’s innovation. That is why we have to get a definition of the leveraging principle, or the whack-a-mole principle, right. As with almost all the amendments we have discussed today, I am not particularly wedded to the specific wording, but I do not think that the Bill as it is currently drafted captures this clearly enough, and Amendments 25, 26, and 27 get us much closer to where we need to be.

I, too, add my voice in support of my noble friend Lord Lansley’s amendments. I must apologise for not having studied them properly in advance of today, but my noble friend introduced them so eloquently that it is very clear that we need to put data clearly in the Bill.

Finally, as a member of my noble friend’s Communications and Digital Committee, I, too, listened very carefully to the comments made by the noble Lord, Lord Clement-Jones, about copyright. I feel this is a very big issue. Whether this is the right place to address it, I do not know, but I am sure he is right that we need to address it somehow.

Baroness Kidron (CB)

My Lords, I am sorry to break the Conservative bus pattern but I, too, will speak to Amendments 26 and 27, to which I have added my name, and to Amendment 30. Before I do, I was very taken by the amendments spoken to by the noble Lord, Lord Lansley, and I support them. I feel somewhat sheepish that I had not seen the relationship between data and the Bill, having spent most of the past few months with my head in the data Bill. That connection is hugely important, and I am very grateful to the noble Lord for making such a clear case. In supporting Amendments 26 and 27, I recognise the value of Amendment 25, tabled by the noble Lord, Lord Vaizey, and put on record my support for the noble Lord, Lord Holmes, on Amendment 24. So much has been said that we have managed to change the name of the leveraging principle to the whack-a-mole principle, and everything that has been said has been said very well.

The only point I want to make on these two amendments, apart from echoing the profound importance of which other noble Lords have already spoken, is that the ingenuity of the sector has always struck me as being equally divided between its incredible creativity in creating new products, things for us to do and services that it can provide, and an equal ingenuity in avoiding regulation of all kinds in all parts of the world. Unless we capture not only the designated activity but the activities the sector controls that are adjacent to it, we lose the core purpose of the Bill. At one point I thought it might help the Minister to see that the argument he made in relation to Clause 6(2) and (3), which was in defence of some flexibility for the Secretary of State, might equally be made on behalf of the regulator in this case.

Turning briefly to Amendment 30 in the name of the noble Lord, Lord Clement-Jones, I first have to make a slightly unusual declaration in that my husband was one of the Hollywood writers who went on strike and won a historic settlement to be a human being in charge of their AI rather than at the behest of the AI. Not only in the creative industries but in academia, I have seen first-hand the impact of scraping information. Not only is the life’s work of an academic taken without permission, but regurgitating it as a mere inaccurate guess then undermines the very purpose of academic distinction. There is clearly a copyright issue that requires an ability both to opt out and correct, and to share in the upside, as the noble Lord pointed out.

I suggest that the LLMs and general AI firms have taken the axiom “it’s better to ask forgiveness than permission” to unbelievable new heights. Our role during the passage of this Bill may be to turn that around and say that it is better to ask permission than forgiveness.

Online Safety Bill

Debate between Baroness Harding of Winscombe and Baroness Kidron
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I rise to speak to all the amendments in this group. It is a cause of great regret that, despite many private meetings with officials, government lawyers and Ministers, we have not yet come to an agreement that would explicitly include in the Bill harm that does not derive from content. I will be listening very carefully to the Minister, if he should change his mind during the debate.

The amendments in this group fall into three categories. First, there is a series of amendments in my name and those of the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford: Amendments 35, 36, 37A and 85. I hope the Government will accept them as consequential because, in meetings last week, they accepted that harm to children can arise from the functionality and design of services and not just from the content. Each of these amendments simply makes it clear that harm can arise absent content: nothing more, nothing less. If the Minister agrees that harm may derive from the design of products and services, can he please explain, when he responds, why these amendments are not acceptable? Simply put, it is imperative that the features, functionalities or behaviours that are harmful to children, including those enabled or created by the design or operation of the service, are in scope of the Bill. This would make it utterly clear that a regulated company has a duty to design its service in a manner that does not harm children.

The Government have primary priority harmful content, priority content or non-designated harmful content, the latter being a category that is yet to be defined, but not the harm that emerges from how the regulated company designs its service. For example, there are the many hundreds of small reward loops that make up a doomscroll or make a game addictive; commercial decisions, such as the one Pokémon famously made for a time, to end every game in a McDonald’s car park; or, more sinister still, the content-neutral friend recommendations that introduce a child to other children like them, while pushing children into siloed groups. For example, they deliberately push 13 year-old boys towards Andrew Tate—not for any content reason, but simply on the basis that 13 year-old boys are like each other and one of them has already been on that site.

The impact of a content-neutral friend recommendation has rocked our schools as female teachers and girls struggle with the attitudes and actions of young boys, and has torn through families, who no longer recognise their sons and brothers. To push hundreds of thousands of children towards Andrew Tate for no reason other than to benefit commercially from the network effect is a travesty for children and it undermines parents.

The focus on content is old-fashioned and looks backwards. The Bill is drafted as if it has particular situations and companies in mind but does not think about how fast the business moves. When we started the Bill, none of us thought about the impact of TikTok; last week, we saw a new service, Threads, go from zero to 70 million users in a single day. It is an act of stunning hubris to be so certain of the form of harm. To be unprepared to admit that some harm is simply design means that, despite repeated denials, this is just a content Bill. The promise of systems and processes being at the heart of the Bill has been broken.

The second set of amendments in this group are in the name of my noble friend Lord Russell. Amendments 46 and 90 further reveal the attitude of the Government, in that they are protecting the companies rather than putting them four-square in the middle of their regime. The Government specifically exempt the manner of dissemination from the safety duties. My noble friend Lord Russell’s amendment would leave that out and ensure that the manner of dissemination, which is fundamental to the harm that children experience, is included. Similarly, Amendment 240 would take out “presented by content” so that harm that is the result of the design decisions is included in the Bill.

The third set are government Amendments 281C to 281E, and Amendment 281F, in my name. For the avoidance of doubt, I am totally supportive of government Amendments 281C to 281E, which acknowledge cumulative harms; for example, those that Molly Russell experienced as she was sent more and more undermining and harmful content. In as far as they are a response to my entreaties, and those of other noble Lords, that we ensure that cumulative harmful content is the focus of our concerns, I am grateful to the Government for tabling them. However, I note that the Government have conceded only the role of cumulative harm for content. Amendments 281D and 281E once again talk about content as the only harm to children.

The noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford have added their names to Amendment 281F, and I believe I am right in saying that, were there not a limit of four names, a great many other Peers would have added theirs too. For the benefit of the House, I will quote directly from the amendment:

“When in relation to children, references to harm include the potential impact of the design and operation of a regulated service separately and additionally from harms arising from content, including the following considerations … the potential cumulative impact of exposure to harm or a combination of harms … the potential for harm to result from features, functionalities or behaviours enabled or created by the design and operation of services … the potential for some features and functionalities within a service to be higher risk than other aspects of the service … that a service may, when used in conjunction with other services, facilitate harm to a child on a different service … the potential for design strategies that exploit a child’s developmental vulnerabilities to create harm, including validation metrics and compulsive reward loops … the potential for real time services, features and functionalities such as geolocation, livestream broadcasts or events, augmented and virtual environments to put children at immediate risk … the potential for content neutral systems that curate or generate environments, content feeds or contacts to create harm to children … that new and emerging harms may arise from artificial intelligence, machine generated and immersive environments”.


Before I continue, I ask noble Lords to consider which of those things they would not like for their children, grandchildren or, indeed, other people’s children. I have accepted that the Government will not add the schedule of harms as I first laid it: the four Cs of content, conduct, contact and commercial harms. I have also accepted that the same schedule, written in the less comfortable language of primary priority, priority and non-designated harms, has been rejected. However, the list that I just set out, and the amendment to the duties that reflect those risks, would finally put the design of the system at the heart of the Bill. I am afraid that, in spite of all our conversations, I cannot accept the Government’s argument that all harm comes from content.

Even if we are wrong today—which we are most definitely not—in a world of AI, immersive tech and augmented reality, is it not dangerous and, indeed, foolish to exclude harm that might come from a source other than content? I imagine that the Minister will make the argument that the features are covered in the risk assessment duties and that, unlike content, features may be good or bad so they cannot be characterised as harmful. To that I say: if the risk assessment is the only process that matters, why do the Government feel it necessary to define the child safety duties and the interpretation of harm? The truth is, they have meaning. In setting out the duty of a company to a child, why would the Government not put the company’s design decisions right at the centre of that duty?

As for the second part of the argument, a geolocation feature may of course be great for a map service but less great if it shows the real-time location of a child to a predator, and livestreaming from a school concert is very different from livestreaming from your bedroom. Just as the noble Lord, Lord Allan, explained on the first day on Report, there are things that are red lines and things that are amber; in other words, they have to be age-appropriate. This amendment does not seek—nor would it mean—that individual features or functionalities would be prevented, banned or stopped. It would mean that a company had a duty to make sure that their features and functionalities were age-appropriate and did not harm children—full stop. There would be no reducing this to content.

Finally, I want to repeat what I have said before. Sitting in the court at Molly Russell’s inquest, I watched the Meta representative contest content that included blood cascading down the legs of a young woman, messages that said, “You are worthless”, and snippets of film of people jumping off buildings. She said that none of those things met the bar of harmful content according to Meta’s terms and conditions.

Like others, I believe that the Online Safety Bill could usher in a new duty of care towards children, but it is a category error not to see harm in the round. Views on content can always differ but the outcome on a child is definitive. It is harm, not harmful content, that the Bill should measure. If the Minister does not have the power to accede, I will, with great regret, be testing the opinion of the House. I beg to move.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - -

My Lords, as so often in the course of the Bill, I associate myself wholeheartedly with the comments that the noble Baroness, Lady Kidron, just made. I, too, thank my noble friend the Minister and the Secretary of State for listening to our debates in Committee on the need to be explicit about the impact of cumulative harmful content. So I support Amendments 281C, 281D and 281E, and I thank them for tabling them.

--- Later in debate ---
Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - -

My Lords, as somebody who is only five feet and two inches, I have felt that size does not matter for pretty much all my life and have long wanted to say that in a speech. This group of amendments is really about how size does not matter; risk does. I will briefly build on the speech just given by the noble Lord, Lord Allan, very eloquently as usual, to describe why risk matters more than size.

First, there are laws for which size does matter—small companies do not need to comply with certain systems and processes—but not those concerned with safety. I have in my mind’s eye the small village fête, where we expect a risk assessment if we are to let children ride on rides. That was not the case 100 years ago, but is today because we recognise those dangers. One of the reasons why we stumbled into thinking that size should matter in this Bill is that we are not being honest about the scale of the risk for our children. If the risk is large enough, we should not be worrying about size; we should be worrying about that risk. That is the first reason why we have to focus on risk and not size.

The second reason follows from what I have just said: the principles of the physical world should apply to the online world. That is one of the core tenets of this Bill. That means that if you recognise the real and present risks of the digital world you have to say that it does not matter whether a small number of people are affected. If it is a small business, it still has an obligation not to put people in harm’s way.

Thirdly, small becomes big very quickly—unfortunately, that has not been true for me, but it is true in the digital world, as Threads has just shown us. Fourthly, we know that in the digital world re-engineering something once it has got very big is really difficult. There is a practical reason, too, why you want engineers to think about the risks before they launch services rather than after the event.

We keep being told, rightly, that this is a Bill about systems and processes. It is a Bill where we want not just the outcomes that the noble Lord, Lord Allan, has referred to in terms of services in the UK genuinely being safer; we are trying to effect a culture change. I would argue one of the most important culture changes is that any bright, young tech entrepreneur has to start by thinking about the risks and therefore the safety procedures they need to put in place as they build their tech business from the ground up and not once they have reached some artificial size threshold.

Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I have to admit that it was incompetence rather than lack of will that meant I did not add my name to Amendment 39 in the name of the noble Lord, Lord Bethell, and I would very much like the Government to accept his argument.

In the meantime, I wonder whether the Minister would be prepared to make it utterly clear that proportionality does not mean a little bit of porn to a large group of children or a lot of porn to a small group of children; rather, it means that high-risk situations require effective measures and low-risk situations should be proportionate to that. On that theme, I say to the noble Lord, Lord Allan, whose points I broadly agree with, that while we would all wish to see companies brought into the fold rather than being out of the fold, it rather depends on their risk.

This brings me neatly to Amendments 43 and 87 from the noble Lord, Lord Russell, to which I managed to add my name. They make a very similar point to Amendment 39 but across safety duties. Amendment 242 in my name, to which the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford have added their names, makes the same point—yet again—in relation to Ofcom’s powers.

All these things point in the same direction as Amendment 245 in the name of the noble Baroness, Lady Morgan, which I keep trumpeting from these Benches and which offers an elegant solution. I urge the Minister to consider Amendment 245 before day four of Report. If the Government were to accept it, it would focus company resources, focus Ofcom resources and, as we discussed on the first day of Report, permit companies which do not fit the risk profile of the regime, and which are unable to comply with something that does not fit their model yet leaves them vulnerable to enforcement, also to be treated in an appropriate way.

Collectively, the ambition is to make sure that we are treating things in proportion to the risk and that proportionate does not start meaning something else.

Online Safety Bill

Debate between Baroness Harding of Winscombe and Baroness Kidron
Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I will speak to Amendment 1, to which I was happy to add my name alongside that of the Minister. I too thank the noble Lord, Lord Stevenson, for tabling the original amendment, and my noble and learned friend Lord Neuberger for providing his very helpful opinion on the matter.

I am especially pleased to see that ensuring that services are safe by design and offer a higher standard of protection for children is foundational to the Bill. I want to say a little word about the specificity, as I support the noble Baroness, Lady Merron, in trying to get to the core issue here. Those of your Lordships who travel to Westminster by Tube may have seen TikTok posters saying that

“we’re committed to the safety of teens on TikTok. That’s why we provide an age appropriate experience for teens under 16. Accounts are set to private by default, and their videos don’t appear in public feeds or search results. Direct messaging is also disabled”.

It might appear to the casual reader that TikTok has suddenly decided unilaterally to be more responsible, but each of those things is a direct response to the age-appropriate design code passed in this House in 2018. So regulation does work and, on this first day on Report, I want to say that I am very grateful to the Government for the amendments that they have tabled, and “Please do continue to listen to these very detailed matters”.

With that, I welcome the amendment. Can the Minister confirm that having safety by design in this clause means that all subsequent provisions must be interpreted through that lens, and that it will inform all the decisions on Report, those of Ofcom, and the Secretary of State’s approach to setting and enforcing standards?

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - -

My Lords, I too thank my noble friend the Minister for tabling Amendment 1, to which I add my support.

Very briefly, I want to highlight one word in it, to add to what the noble Baroness, Lady Kidron, has just said. The word is “activity”. It is extremely important that in Clause 1 we are setting out that the purpose is to

“require providers of services regulated by this Act to identify, mitigate and manage”

not just illegal or harmful content but “activity”.

I very much hope that, as we go through the few days on Report, we will come back to this and make sure that in the detailed amendments that have been tabled we genuinely live up to the objective set out in this new clause.

Online Safety Bill

Debate between Baroness Harding of Winscombe and Baroness Kidron
Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I too want to support this group of amendments, particularly Amendment 234, and will make just a couple of brief points.

First, one of the important qualities of the online safety regime is transparency, and this really speaks to that point. It is beyond clear that we are going to need all hands on deck, and again, this speaks to that need. I passionately agree with the noble Baroness, Lady Fox, on this issue and ask, when does an independent researcher stop being independent? I have met quite a lot on my journey who suddenly find ways of contributing to the digital world other than their independent research. However, the route described here offers all the opportunities to put those balancing pieces in place.

Secondly, I am very much aware of the fear of the academics in our universities. I know that a number of them wrote to the Secretary of State last week saying that they were concerned that they would be left behind their colleagues in Europe. We do not want to put up barriers for academics in the UK. We want the UK to be at the forefront of governance of the digital world, this amendment speaks to that, and I see no reason for the Government to reject it.

Finally, I want to emphasise the importance of research. Revealing Reality did research for 5Rights called Pathways, in which it built avatars for real children and revealed the recommendation loops in action. We could see how children were being offered self-harm, suicide, extreme diets and livestream porn within moments of them arriving online. Frances Haugen has already been mentioned. She categorically proved what we have been asserting for years, namely that Instagram impacts negatively on teenage girls. As we put this regime in place, it is not adequate to rely on organisations that are willing to work in the grey areas of legality to get their research or on whistleblowers—on individual acts of courage—to make the world aware.

One of the conversations I remember happened nearly five years ago, when the then Secretary of State asked me what the most important thing about the Bill was. I said, “To bring a radical idea of transparency to the sector”. This amendment goes some way to doing just that.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - -

My Lords, I, too, support Amendments 233 and 234, and Amendment 233A, from the noble Lord, Lord Allan. As the noble Baroness, Lady Kidron, said, it has been made clear in the past 10 days of Committee that there is a role for every part of society to play to make sure that we see the benefits of the digital world but also mitigate the potential harms. The role that researchers and academics can play in helping us understand how the digital world operates is critical—and that is going to get ever more so as we enter a world of large language models and AI. Access to data in order to understand how digital systems and processes work will become even more important—next week, not just in 10 years’ time.

My noble friend Lord Bethell quite rightly pointed out the parallels with other regulators, such as the MHRA and the Bank of England. A number of people are now comparing the way in which the MHRA and other medical regulators regulate the development of drugs with how we ought to think about the emergence of regulation for AI. This is a very good read-across: we need to set the rules of the road for researchers and ensure, as the noble Baroness, Lady Kidron, said—nailing it, as usual—that we have the most transparent system possible, enabling people to conduct their research in the light, not in the grey zone.

Online Safety Bill

Debate between Baroness Harding of Winscombe and Baroness Kidron
Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I thank everyone for their contributions this evening. As the noble Lord, Lord Stevenson, said, it is very compelling when your Lordships’ House gets itself together on a particular subject and really agrees, so I thank noble Lords very much for that.

I am going to do two things. One is to pick up on a couple of questions and, as has been said by a number of noble Lords, concentrate on outcomes rather than contributions. On a couple of issues that came up, I feel that the principle of pornography being treated in the same way in Parts 3 and 5 is absolute. We believe we have done it. After Committee we will discuss that with noble Lords who feel that is not clear in the amendment to make sure they are comfortable that it is so. I did not quite understand from the Minister’s reply whether pornography was being treated in exactly the same way in Parts 3 and 5. When I say “exactly the same way”, like the noble Lord, Lord Allan, I mean not necessarily by the same technology but to the same level of outcome. That is one thing I want to emphasise because a number of noble Lords, including the noble Baroness, Lady Ritchie, the noble Lord, Lord Farmer, and others, are rightly concerned that we should have an outcome on pornography, not concentrate on how to get there.

The second thing I want to pick up very briefly, because it was received so warmly, is the question of devices and on-device age assurance. I believe that is one method, and I know that at least one manufacturer is thinking about it as we speak. However, it is an old battle in which companies that do not want to take responsibility for their services say that people over here should do something different. It is very important that devices, app stores or any of the supposed gatekeepers are not given an overly large responsibility. It is the responsibility of everyone to make sure that age assurance is adequate.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - -

I hope that what the noble Baroness is alluding to is that we need to include gatekeepers, app stores, device level and sideloading in another part of the Bill.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

But of course—would I dare otherwise? What I am saying is that these are not silver bullets and we must have a mixed economy, not only for what we know already but for what we do not know. We must have a mixed economy, and we must not allow any one platform of age assurance to become overly powerful. That is incredibly important, so I wanted to pick up on that.

I also want to pick up on user behaviour and unintended consequences. I think there was a slight reference to an American law, which is called COPPA and is the reason that every website says 13. That is a very unhelpful entry point. It would be much better if children had an age-appropriate experience from five all the way to 18, rather than on and off at 13. I understand that issue, but that is why age assurance has to be more than one thing. It is not only a preventive thing but an enabling thing. I tried to make that very clear so I will not detain the Committee on that.

On the outcome, I say to the Minister, who has indeed given a great deal of time to this, that more time is needed because we want a bar of assurance. I speak not only for all noble Lords who have made clear their rightful anxiety about pornography but also on behalf of the bereaved parents and other noble Lords who raised issues about self-harming of different varieties. We must have a measurable bar for the things that the Bill says that children will not encounter—the primary priority harms. In the negotiation, that is non-negotiable.

On the time factor, I am sorry to say that we are all witness to what happened to Part 3. It was pushed and pushed for years, and then it did not happen—and then it was whipped out of the Bill last week. This is not acceptable. I am happy, as I believe other noble Lords are, to negotiate a suitable time that gives Ofcom comfort, but it must be possible, with this Bill, for a regulator to bring something in within a given period of time. I am afraid that history is our enemy on this one.

The third thing is that I accept the idea that there has to be more than principles, which is what I believe Ofcom will provide. But the principles have to be 360 degrees, and the questions that I raised about security, privacy and accessibility should be in the Bill so that Ofcom can go away and make some difficult judgments. That is its job; ours is to say what the principle is.

I will tell one last tiny story. About 10 years ago, I met in secret with one of the highest-ranking safety officers in one of the companies that we always talk about. They said to me, “We call it the ‘lost generation’. We know that regulation is coming, but we know that it is not soon enough for this generation”. On behalf of all noble Lords who spoke, I ask the Government to save the next generation. With that, I withdraw the amendment.

Online Safety Bill

Debate between Baroness Harding of Winscombe and Baroness Kidron
Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

It is a great pleasure to follow my noble friend Lord Russell and to thank him for his good wishes. I assure the Committee that there is nowhere I would rather spend my birthday, in spite of some competitive offers. I remind noble Lords of my interests in the register, particularly as the chair of 5Rights Foundation.

As my noble friend has set out, these amendments fall in three places: the risk assessments, the safety duties and the codes of practice. However, together they work on the overarching theme of safety by design. I will restrict my detailed remarks to a number of amendments in the first two categories. This is perhaps a good moment to recall the initial work of Carnegie, which provided the conceptual approach of the Bill several years ago in arguing for a duty of care. The Bill has gone many rounds since then, but I think the principle remains that a regulated service should consider its impact on users before it causes them harm. Safety by design, to which all the amendments in this group refer, is an embodiment of a duty of care. In thinking about these amendments as a group, I remind the Committee that both the proportionality provisions and the fact that this is a systems and processes Bill mean that no company can, should or will be penalised for a single piece of content, a single piece of design or, indeed, low-level infringements.

Amendments 24, 31, 77 and 84 would delete “content” from the Government’s description of what is harmful to children, meaning that the duty is to consider harm in the round rather than just harmful content. The definition of “content” is drawn broadly in Clause 207 as

“anything communicated by means of an internet service”,

but the examples in the Bill, including

“written material … music and data of any description”,

once again fail to include design features that are so often the key drivers of harm to children.

On day three of Committee, the Minister said:

“The Bill will address cumulative risk where it is the result of a combination of high-risk functionality, such as live streaming, or rewards in service … This will initially be identified through Ofcom’s sector risk assessments, and Ofcom’s risk profiles and risk assessment guidance will reflect where a combination of risk in functionalities such as these can drive up the risk of harm to children. Service providers will have to take Ofcom’s risk profiles into account in their own risk assessments for content which is illegal or harmful to children”.—[Official Report, 27/4/23; col. 1385.]


However, in looking at the child safety duties, Clause 11(5) says:

“The duties … in subsections (2) and (3) apply across all areas of a service, including the way it is designed, operated and used”,


but subsection (14) says:

“The duties set out in subsections (3) and (6)”—


which are the duties to operate proportionate systems and processes to prevent and protect children from encountering harmful content and to include them in terms of service—

“are to be taken to extend only to content that is harmful to children where the risk of harm is presented by the nature of the content (rather than the fact of its dissemination)”.

I hesitate to say whether that is contradictory. I am not actually sure, but it is confusing. I am concerned that while we are reassured that “content” means content and activity and that the risk assessment considers functionality, “harm” is then repeatedly expressed only in the form of content.

Over the weekend, I had an email exchange with the renowned psychoanalyst and author, Norman Doidge, whose work on the plasticity of the brain profoundly changed how we think about addiction and compulsion. In the exchange, he said that

“children’s exposures to super doses, of supernormal images and scenes, leaves an imprint that can hijack development”.

Then, he said that

“the direction seems to be that AI would be working out the irresistible image or scenario, and target people with these images, as they target advertising”.

His argument is that it is not just the image but the dissemination and tailoring of that image that maximises the impact. The volume and frequency of those images create habits in children that take a lifetime to change—if they change at all. Amendments 32 and 85 would remove this language to ensure that content that is harmful by virtue of its dissemination is accounted for.

I turn now to Amendments 28 and 82, which cut the reference to the

“size and capacity of the provider of the service”

in deeming what measures are proportionate. We have already discussed that small is not safe. Such platforms such as Yubo, Clapper and Discord have all been found to harm children and, as both the noble Baroness, Lady Harding, and the noble Lord, Lord Clement-Jones, told us, small can become big very quickly. It is far easier to build to a set of rules than it is to retrofit them after the event. Again, I point out that Ofcom already has duties of proportionality; adding size and capacity is unnecessary and may tip the scale to creating loopholes for smaller services.

Amendment 138 seeks to reverse the exemption in Clause 54 of financial harms. More than half of the 100 top-grossing mobile phone apps contain loot boxes, which are well established as unfair and unhealthy, priming young children to gamble and leading to immediate hardship for parents landed with extraordinary bills.

By rights, Amendments 291 and 292 could fit in the future-proof set of amendments. The way that the Bill in Clause 204 separates out functionalities in terms of search and user-to-user is in direct opposition to the direction of travel in the tech sector. TikTok does shopping, Instagram does video, Amazon does search; autocomplete is an issue across the full gamut of services, and so on and so forth. This amendment simply combines the list of functionalities that must be risk-assessed and makes them apply on any regulated service. I cannot see a single argument against this amendment: it cannot be the Government’s intention that a child can be protected, on search services such as Google, from predictive search or autocomplete, but not on TikTok.

Finally, Amendment 295 will embed the understanding that most harm is cumulative. If the Bereaved Parents for Online Safety were in the Chamber, or any child caught up in self-harm, depression sites, gambling, gaming, bullying, fear of exposure, or the inexorable feeling of losing their childhood to an endless scroll, they would say at the top of their voices that it is not any individual piece of content, or any one moment or incident, but the way in which they are nudged, pushed, enticed and goaded into a toxic, harmful or dangerous place. Adding the simple words

“the volume of the content and the frequency with which the content is accessed”

to the interpretation of what can constitute harm in Clause 205 is one of the most important things that we can do in this Chamber. This Bill comes too late for a whole generation of parents and children but, if these safety by design amendments can protect the next generation of children, I will certainly be very glad.

Baroness Harding of Winscombe (Con)

My Lords, it is an honour, once again, to follow the noble Baroness, Lady Kidron, and the noble Lord, Lord Russell, in this Committee. I am going to speak in detail to the amendments that seek to change the way the codes of practice are implemented. Before I do, however, I will very briefly add my voice to the general comments that the noble Baroness, Lady Kidron, and the noble Lord, Lord Russell, have just taken us through. Every parent in the country knows that both the benefit and the harm that online platforms can bring our children are not just about the content. It is about the functionality: the way these platforms work; the way they suck us in. They do give us joy but they also drive addiction. It is hugely important that this Bill reflects the functionality that online platforms bring, and not just content in the normal sense of the word “content”.

I will now speak in a bit more detail about the following amendments: Amendments 65, 65ZA, 65AA, 89, 90, 90B, 96A, 106A, 106B, 107A, 114A—I will finish soon, I promise—112, 122ZA, 122ZB and 122ZC.

Online Safety Bill

Debate between Baroness Harding of Winscombe and Baroness Kidron
Baroness Kidron (CB)

My Lords, I agree in part with the noble Lord, Lord Moylan. I was the person who said that small was not safe, and I still feel that. I certainly do not think that anything in the Bill will make the world online 100% safe, and I think that very few noble Lords do, so it is important to say that. When we talk about creating a high bar or having zero tolerance, we are talking about ensuring that there is a ladder within the Bill so that the most extreme cases have the greatest force of law trying to attack them. I agree with the noble Lord on that.

I also absolutely agree with the noble Lord about implementation: if it is too complex and difficult, it will be unused and exploited in certain ways, and it will have a bad reputation. The only part of his amendment that I do not agree with is that we should look at size. Through the process of Committee, if we can look at risk rather than size, we will get somewhere. I share his impatience—or his inquiry—about what categories 2A and 2B mean. If category 2A means the most risky and category 2B means those that are less risky, I am with him all the way. We need to look into the definition of what they mean.

Finally, I mentioned several times on Tuesday that we need to look carefully at Ofcom’s risk profiles. Is this the answer to dealing with where risk gets determined, rather than size?

Baroness Harding of Winscombe (Con)

My Lords, I rise to speak along similar lines to the noble Baroness, Lady Kidron. I will address my noble friend Lord Moylan’s comments. I share his concern that we must not make the perfect the enemy of the good but, like the noble Baroness, I do not think that size is the key issue here, because of how tech businesses grow. Tech businesses are rather like building a skyscraper: if you get the foundations wrong, it is almost impossible to change how safe the building is as it goes up and up. As I said earlier this week, small tech businesses can become big very quickly, and, if you design your small tech business with the risks to children in mind at the very beginning, there is a much greater chance that your skyscraper will not wobble as it gets taller. On the other hand, if your small business begins by not taking children into account at all, it is almost impossible to address the problem once it is huge. I fear that this is the problem we face with today’s social media companies.

The noble Baroness, Lady Kidron, hit the nail on the head, as she so often does, in saying that we need to think about risk, rather than size, as the means of differentiating the proportionate response. In Clause 23, which my noble friend seeks to amend, the important phrase is “use proportionate measures” in subsection (2). Provided that we start with a risk assessment and companies are then under the obligation to make proportionate adjustments, that is how you build safe technology companies—it is just like how you build safe buildings.

Online Safety Bill

Debate between Baroness Harding of Winscombe and Baroness Kidron
Baroness Kidron (CB)

My Lords, I refer the Committee to my interests as put in the register and declared in full at Second Reading. I will speak to Amendment 2 in my name and those of the right reverend Prelate the Bishop of Oxford and the noble Baroness, Lady Harding, to Amendments 3 and 5 in my name, and briefly to Amendments 19, 22, 298 and 299 in the name of the noble Baroness, Lady Harding.

The digital world does not have boundaries in the way that the Bill does. It is an ecosystem of services and products that are interdependent. A user journey is made up of incremental signals, nudges and enticements that mean that, when we use our devices, very often we do not end up where we intended to start. The current scope covers user-to-user, search and commercial porn services, but a blog or website that valorises self-harm and depression or suggests starving yourself to death is still exempt because it has limited functionality. So too are games without a user-to-user function, in spite of the known harm associated with game addiction highlighted recently by Professor Henrietta Bowden-Jones, national expert adviser on gambling harms, and the World Health Organization in 2019 when it designated gaming disorder as a behavioural addiction.

There is also an open question about immersive technologies, whose protocols are still very much in flux. I am concerned that the Government are willing to assert that these environments will meet the bar of user-to-user when those that are still building immersive environments make quite clear that that is not a given. Indeed, later in Committee I will be able to demonstrate that already the very worst harms are happening in environments that are not clearly covered by the Bill.

Another unintended consequence of the current drafting is that the task of working out whether you are on a regulated or unregulated service is left entirely to children. That is not what we had been promised. In December the Secretary of State wrote in a public letter to parents,

“I want to reassure every person reading this letter that the onus for keeping young people safe online will sit squarely on the tech companies’ shoulders”.

It is likely that the Minister will suggest that the limited-functionality services will be caught by the gatekeepers. But, as in the case of immersive technology, it is dangerous to suggest that, just because search and user-to-user are the primary access points in 2023, that will remain the case. We must be more forward thinking and ensure that services likely to be accessed that promote harm are in scope by default.

Amendments 3 and 5 are consequential, so I will not debate them now. I have listened to the Government and come back with a reasonable and implementable amendment that applies only to services that are likely to be accessed by children and that enable harm. I now ask the Government to listen and do likewise.

Amendments 92 and 193 cover the child user condition. The phrase “likely to be accessed”, introduced in this House into what became the Data Protection Act 2018, is one of the most unlikely successful British exports. Both the phrase and its definition, set out by the ICO, have been embedded in regulations in countries the world over—yet the Bill replaces this established language while significantly watering down the definition.

The Bill requires

“a significant number of children”

to use the service, or for the service to be

“likely to attract a significant number of users who are children”.

“Significant” in the Bill is defined relative to the overall UK user base, which means that extremely large platforms could deem a few thousand child users not significant compared with the several million-strong user base. Since only services that cross this threshold need comply with the child safety duties, thousands of children will not benefit from the safety duties that the Minister told us last week were at the heart of the Bill.

Amendment 92 would put the ICO’s existing and much-copied definition into the Bill. It says a service is

“likely to be accessed by children”

if

“the service is designed or intended for use by children … children form a substantive and identifiable user group … the possibility of a child accessing the service is more probable than not, taking into consideration … the nature and content of the service and whether that has particular appeal for children … the way in which the service is accessed and any measures in place to prevent children gaining access … market research, current evidence on user behaviour, the user base of similar or existing services”

that are likely to be accessed.

Having two phrases and definitions is bad for business and even worse for regulators. The ICO has first-mover advantage and a more robust test. It is my contention that parents, media and perhaps even our own colleagues would be very shocked to know that the definition in the Bill has the potential for many thousands, and possibly tens of thousands, of children to be left without the protections that the Bill brings forward. Perhaps the Minister could explain why the Government have not chosen regulatory alignment, which is good practice.

Finally, I will speak briefly in support of Amendments 19, 22, 298 and 299. I am certain that the noble Baroness, Lady Harding, will spell out how the app stores of Google and Apple are simply a subset of “search”, in that they are gatekeepers to accessing more than 5 million apps worldwide and the first page of each is indeed a search function. Their inclusion should be obvious, but I will add a specific issue about which I have spoken directly with both companies and about which the 5Rights Foundation, of which I am chair, has written to the ICO.

When we looked at the age ratings of apps across Google Play Store and Apple, four things emerged. First, apps are routinely rated much lower than their terms and conditions: for example, Amazon Shopping says 18 but has an age rating of 4 on Apple. This pattern goes across both platforms, covering social sites, gaming, shopping, et cetera.

Secondly, the same apps and services did not have the same age rating across both services, which, between them, are gatekeepers for more than 95% of the app market. In one extreme case, an app rated 4 on one of them was rated 16 on the other, with other significant anomalies being extremely frequent.

Thirdly, almost none of the apps considered their data protection duties in coming to a decision on their age rating, which is a problem, since privacy and safety are inextricably linked.

Finally, in the case of Apple, using a device registered to a 15-year-old, we were able to download age-restricted apps including a dozen or more 18-plus dating sites. In fairness, I give a shoutout to Google, which, because of the age-appropriate design code, chose more than a year ago not to show 18-plus content to children in its Play Store. So this is indeed a political and business choice and not a question of technology. Millions of services are accessed via the App Store. Given the Government’s position—that gatekeepers have specific responsibilities in relation to harmful content and activity—surely the amendments in the name of the noble Baroness, Lady Harding, are necessary.

My preference was for a less complicated Bill based on principles and judged on outcomes. I understand that that ship has sailed, but it is not acceptable for the Government now to use the length and complexity of the Bill as a reason not to accept amendments that would fill loopholes where harm has been proven. It is time to deliver on the promises made to parents and children, and to put the onus for keeping young people safe online squarely on tech companies’ shoulders. I beg to move.

Baroness Harding of Winscombe (Con)

My Lords, I rise to speak to Amendments 19, 22, 298 and 299 in my name and those of the noble Baroness, Lady Stowell, and the noble Lords, Lord Knight and Lord Clement-Jones. I will also briefly add at the end of my speech my support for the amendments in the name of my friend, the noble Baroness, Lady Kidron. It has been a huge privilege to be her support act all the way from the beginnings of the age-appropriate design code; it feels comfortable to speak after her.

I want briefly to set out what my amendments would do. Their purpose is to bring app stores into the child protection elements of the Bill. Amendment 19 would require app stores to prepare

“risk assessments equal to user-to-user services due to their role in distributing online content through apps to children and as a primary facilitator of user-to-user”

services reaching children. Amendment 22 would mandate app stores

“to use proportionate and proactive measures, such as age assurance, to prevent children”

coming into contact with

“primary priority content that is harmful to children”.

Amendments 298 and 299 would simply define “app” and “app stores”.

Let us be clear what app stores do. They enable customers to buy apps and user-to-user services. They enable customers to download free apps. They offer up curated content in the app store itself and decide what apps someone would like to see. They enable customers to search for apps for user-to-user content. They provide age ratings; as the noble Baroness, Lady Kidron, said, there may be different age ratings in different app stores for the same app. They sometimes block the download of apps based on the age rating and their assessment of someone’s age, but not always, and it is different for different app stores.

Why should they be included in this Bill—if it is not obvious from what I have already said? First, two companies are profiting from selling user-to-user products to children. Two app stores account for some 98%-plus of all downloads of user-to-user services, with no requirements to assess the risk of selling those products to children or to mitigate those risks. We do not allow that in the physical world so we should not allow it in the digital world.

Secondly, parents and teenagers tell us that this measure would help. A number of different studies have been done; I will reference just two. One was by FOSI, the Family Online Safety Institute, which conducted an international research project in which parents consistently said that having age assurance at the app store level would make things simpler and more effective for them; ironically, the FOSI research was conducted with Google.