Debates between Baroness Bennett of Manor Castle and Baroness Kidron during the 2019 Parliament

Data Protection and Digital Information Bill

Debate between Baroness Bennett of Manor Castle and Baroness Kidron
Baroness Bennett of Manor Castle (GP)

I will add to that the issue of overseas bank accounts. I cannot see how the British Government can apply this measure to them. Will this not push people to move to overseas bank accounts? Or will the Government try to pursue them through challenger banks—including multiple accounts held by one person who may have only one original, ordinary current account here?

Baroness Kidron (CB)

How many accounts flagged through “signalling” already sit in the current backlog in the business-as-usual version? What kind of investment will it take when you supercharge these powers and get many more tens of thousands of signals?

Data Protection and Digital Information Bill

Debate between Baroness Bennett of Manor Castle and Baroness Kidron
Baroness Kidron (CB)

My Lords, I speak to Amendment 144 in my name, which is supported by the noble Baronesses, Lady Harding and Lady Jones, and the noble Lord, Lord Clement-Jones. The amendment would introduce a code of practice on children and AI. Before I speak to it, I declare an interest: I am working with academic and NGO colleagues in the UK, EU and US on such a code, and I am part of the UN Secretary-General’s AI advisory body’s expert group, which is currently working on sections on both AI and children, and AI and education.

AI drives the recommender systems that determine all aspects of a child’s digital experience, including the videos they watch, their learning opportunities, people they follow and products they buy. But it no longer concerns simply the elective parts of life where, arguably, a child—or a parent on their behalf—can choose to avoid certain products and services. AI is invisibly and ubiquitously present in all areas of their lives, and its advances and impact are particularly evident in the education and health sectors—the first of which is compulsory and the second of which is necessary.

The proposed code has three parts. The first requires the ICO to create the code and sets out expectations of its scope. The second considers who and what should be consulted and considered, including experts, children and the frameworks that codify children’s existing rights. The third defines elements of the process, including risk assessment, defines key terms and puts into the Bill the principles to which the code must adhere.

I am going to get my defence in early. I anticipate that the Minister will say that the ICO has published guidance, that we do not want to exclude children from the benefits of AI and that we are in a time of “wait and see”. He might even ask why children need something different or why the AADC, which I mention so frequently, is not sufficient. Let me take each of those in turn.

On the sufficiency of the current guidance, the ICO’s non-binding Guidance on AI and Data Protection, which was last updated on 15 March 2023, has a single mention of a child in its 140 pages, in a case study about child benefits. The accompanying AI and data protection toolkit makes no mention of children, nor does the ICO’s advice to developers on generative AI, issued on 3 April 2023. There are hundreds of pages of guidance, but it fails entirely to consider the specific needs of children, their rights, their developmental vulnerabilities or that their lives will be entirely dominated by AI systems in a way that is still unimaginable to those in this Room. Similarly, there is little mention of children in the Government’s own White Paper on AI. The only such references are limited to AI-generated child sexual abuse material; we will come to that later when we discuss Amendment 291. Even the AI summit had no main-stage event relating to children.

Of course we do not want to exclude children from the benefits of AI. A code on the use of children’s data in the development and deployment of AI technology increases their prospects of enjoying the benefits of AI while ensuring that they are protected from the pitfalls. Last week’s debate in the name of the noble Lord, Lord Holmes, showed a broad welcome for the benefits while speaking urgently to the need for certain principles and fundamental protections to be mandatory.

As for saying, “We are in a time of ‘wait and see’”, that is not good enough. In the course of this Committee, we will explore edtech that has only advertising and no learning content, children being left out of classrooms because their parents will not accept the data leaks of Google Classroom, social media being scraped to create AI-generated CSAM and how rapid advances in generative AI capabilities mark a new stage in its evolution. Some of the consequences of that include ready access to models that create illegal and abusive material at scale and chatbots that offer illegal or dangerous advice. Long before we get on to the existential threat, we have “here and now” issues. Childhood is a very short period of life. The impacts of AI are here and now in our homes, our classrooms, our universities and our hospitals. We cannot afford to wait and see.

Children are different for three reasons. First, as has been established over decades, there are ages and stages at which children are developmentally able to do certain things, such as walk, talk, understand risk and irony, and learn different social skills. This means that, equally, there are ages and stages at which they cannot do that. The long-established consensus is that family, social groups and society more broadly—including government—step in to support that journey.

Secondly, children have less voice and less choice about how and where they spend their time, so the places and spaces that they inhabit have to be fit for childhood.

Thirdly, we have a responsibility towards children that extends even beyond our responsibilities to each other; this means that it is not okay for us to legitimise profit at their expense, whether it is allowing an unregulated edtech market that exploits their data and teaches them nothing or the untrammelled use of their pictures to create child sexual abuse material.

Finally, what about the AADC? I hope that, in the course of our deliberations, we will put that on a more secure footing. The AADC addresses recommender systems in standard 12. However, the code published in August 2020 does not address generative AI, which, as we have repeatedly heard, is a game-changer. Moreover, the AADC is currently restricted to information society services, which leaves a gaping hole. This amendment would address that gap.

There is an argument that the proposed code could be combined with the AADC as an update to its provisions. However, unless and until we sort out the status of the AADC in relation to the Bill, an AI kids code would be better formed as a stand-alone code. A UK code of practice on children and AI would ensure that data processors consider the fundamental rights and freedoms of children, including their safety, as they develop their products and perhaps even give innovators the appetite to innovate with children in mind.

As I pointed out at the beginning, there are many people globally working on this agenda. I hope that, as we are the birthplace of the AADC and the Online Safety Act, the Government will adopt this suggestion and again be a forerunner in child privacy and safety. If, however, the Minister once again says that protections for children are not necessary, let me assure him that they will be put in place by others, and we will be a rule taker, not a rule maker.

Baroness Bennett of Manor Castle (GP)

My Lords, I rise with the advantage over the noble Lord, Lord Clement-Jones, in that I will speak to only one amendment in this group; I therefore have the right page in front of me and can note that I will speak to Amendment 252, tabled by the noble Lord, Lord Clement-Jones, and signed by me and the noble Lords, Lord Watson of Wyre Forest and Lord Maude of Horsham.

I apologise that I was not with the Committee earlier today, but I was chairing a meeting about the microbiome, which was curiously relevant to this Committee. One issue that came up in that meeting was data and data management and the great uncertainties that remain. For example, if a part of your microbiome is sampled and the data is put into a database, who owns that data about your microbiome? In fact, there is no legal framework at the moment to cover this. There is a legal framework covering your genome, but not your microbiome. That is a useful illustration of how fast this whole area is moving and how fast technology, science and society are changing. I should say that I do not blame the Government for this gaping hole, as it is an international hole. It demonstrates how we as legislators and regulators need to race to catch up to deal with the problem.

This relates to Amendment 252 in the sense that it is perhaps an issue that has arisen over time, almost accidentally. However, I want to credit a number of campaigners, among them James O’Malley, who was the man who drew my attention to this issue, as well as Peter Wells, Anna Powell-Smith and Hadley Beeman. They are people who have seen a really simple and basic problem in the way that regulation is working and are reaching out, including, I am sure, to many noble Lords on this Committee. This is a great demonstration of how campaigning has at least gone part of the way to working. I very much hope that, if not today, then some time soon, we can see this working.

What we are talking about here, as the noble Lord, Lord Clement-Jones, said, is the postal address file. It is held as a piece of private property by Royal Mail. It is important to stress that this is not people’s private information or who lives at what address; it is about where the address is. As the noble Lord, Lord Clement-Jones, set out, all kinds of companies have to pay Royal Mail to have access to this basic information about society, basic information that is assembled by society, for society.

The noble Lord mentioned Amazon having to pay for the file. I must admit that I feel absolutely no sympathy there. I am no fan of the great parasite. It is an interesting contrast to think of Amazon paying, but also to think of an innovative new start-up company, which wants to be able to access and reach people to deliver things to their homes. For this company, the cost of acquiring this file could be prohibitive. It could stop it getting started and competing against Amazon.

Data Protection and Digital Information Bill

Debate between Baroness Bennett of Manor Castle and Baroness Kidron
Baroness Kidron (CB)

The Minister mentioned a presumption that the ICO will update its guidance. Is there a timeframe for that? Will the guidance be updated before this comes into effect? How does the age of 14 relate to the AADC, which sets the age of adulthood at 18?

Baroness Bennett of Manor Castle (GP)

Before the Minister replies, we may as well do the full round. I agree with him, in that I very much believe in votes at 16 and possibly younger. I have been on many a climate demonstration with young people aged 14 and under, so they can be involved, but the issue here is bigger than age. The main issue is not age but whether anybody should be subjected to a potential barrage of material in which they have not in any way expressed an interest. I am keen to make sure that this debate is not diverted to the age question and that we do not lose the bigger issue. I wanted to say that I sort of agree with the Minister on one element.

--- Later in debate ---
Baroness Bennett of Manor Castle (GP)

My Lords, it is a pleasure to follow the noble Lord, Lord Bassam, who has already set out very clearly what the group is about. I will chiefly confine myself to speaking to my Amendment 38A, which seeks to put in the Bill a clear idea of what having a human in the loop actually means. We need a human in the loop to ensure that a human has interpreted and assessed the decision and, perhaps most crucially, is able to intervene in it and in any information on which it is based.

Noble Lords will be aware of many situations that have already arisen in which artificial intelligence is used—though I would say that what we currently describe as artificial intelligence is, in real terms, not truly that at all. What we have is a very large use of big data and, as the noble Lord, Lord Bassam, said, big data can be a very useful and powerful tool for many positive purposes. However, we know that the quality of decision-making often depends on the quality of the data going in. A human is able to see whether something looks astray or wrong; there is a kind of intelligence that humans apply here which machines simply do not have the capacity for.

I pay tribute to Justice, the law reform and human rights organisation, which produced an excellent briefing on the issues around Clause 14. It asserts that the clause, as currently written, inadequately protects individuals from automated harm.

The noble Lord, Lord Bassam, referred to the Horizon case in the UK; that is the obvious example but, while we may think of some of the most vulnerable people in the UK, the Robodebt case in Australia is another where crunching big data, and then crunching down on individuals, had truly awful outcomes. We know that there is a real risk of unfairness and discrimination in the use of these kinds of tools. I note that the UK has signed the Bletchley declaration, which says that

“AI should be designed, developed, deployed, and used, in a manner that is … human-centric, trustworthy and responsible”.

I focus particularly on “human-centric”: human beings can sympathise with and understand other human beings in a way that big data simply does not.

I draw a parallel with something covered by a special Select Committee of your Lordships’ House last year: lethal autonomous weapons systems, or so-called killer robots. This is an obvious example of where there is a very strong argument for having a human in the loop, as the terminology goes. From what I last heard and understood about this, I am afraid that the UK Government are not fully committed to a human in the loop in the case of killer robots, but I hope that we get to that point.

When we talk about how humans’ data is used and managed, we are also talking about situations that are—almost equally—life and death: whether people get a benefit, whether they are treated fairly and whether they suddenly disappear off the system. Only this morning, I was reading a case study of a woman aged over 80, highlighting how she had been through multiple government departments but could not get her national insurance number. Without a national insurance number, she could not get the pension to which she was entitled. If there is no human in the loop to cut through those kinds of situations, there is a real risk that people will find themselves just going around and around machines—a circumstance with which we are all personally all too familiar, I am sure. My amendment is an attempt to put into the Bill a real explanation of what having that human in the loop means.

Baroness Kidron (CB)

My Lords, the number of amendments proposed to Clause 14 reflects the Committee’s very real concern about the impact of automated decision-making on the privacy, safety and prospects of UK data subjects. I have specific amendments in groups 7 and 8, so I will speak to the impact of Clause 14 on children later. I will again be making arguments about the vulnerability of these systems in relation to the Government’s proposals on the DWP.

Without repeating the arguments made, I associate myself with most of the proposals and the intention behind them—the need to safeguard the prospects of a fair outcome when algorithms hold sway over a person’s future. It seems entirely logical that, if the definition of solely automated decision-making requires “no meaningful human involvement”, we should be clear, as Amendment 40 proposes, about what is considered “meaningful”, so that the system cannot be gamed by human involvement that provides an ineffective safeguard and is therefore not meaningful.

I have sympathy with many of these amendments—Amendments 38A, 39, 47, 62, 64 and 109—and ultimately believe, as was suggested by the noble Lord, Lord Bassam, that it is a matter of trust. I refer briefly to the parliamentary briefing from the BMA, which boldly states:

“Clause 14 risks eroding trust in AI”.

That would be a very sad outcome.

Online Safety Bill

Debate between Baroness Bennett of Manor Castle and Baroness Kidron
Baroness Bennett of Manor Castle (GP)

I thank the noble Lord for his intervention. He has made me think of the fact that a particular area where this may be of grave concern is cosmetic procedures, which I think we debated during the passage of the Health and Care Act. These things are all interrelated, and it is important that we see them in an interrelated way as part of what is now the health system.

Baroness Kidron (CB)

My Lords, I will speak to a number of amendments in this group. I want to make the point that misinformation and disinformation was probably the issue we struggled with the most in the pre-legislative committee. We recognised the extraordinary harm it did, but also—as the noble Baroness, Lady Fox, said—that there is no one great truth. However, algorithmic spread and the drip, drip, drip of material that is not based on any search criteria or expression of an opinion but simply gives you more of the same, particularly the most shocking, moves very marginal views into the mainstream.

I am concerned that our debates over the last five days have concentrated so much on content, and that the freedom we seek does not take enough account of the way in which companies currently exercise control over the information we see. Correlations such as “Men who like barbecues are also susceptible to conspiracy theories” are then exploited to spread toxic theories that end in real-world harm or political tricks that show, for example, the Democrats as a paedophile group. Only last week I saw a series of pictures, presented as “evidence”, of President Biden caught in a compromising situation that gave truth to that lie. As Maria Ressa, the Nobel Peace Prize winner for her contribution to the freedom of expression, said in her acceptance speech:

“Tech sucked up our personal experiences and data, organized it with artificial intelligence, manipulated us with it, and created behavior at a scale that brought out the worst in humanity”.

That is the background to this set of amendments, which we must take seriously.

As the noble Lord, Lord Bethell, said, Amendment 52 would ensure that platforms undertake a health misinformation risk assessment and provide a clear policy on dealing with harmful, false and misleading information. I put it to the Committee that, without this requirement, we will keep the status quo in which clicks are king, not health information.

It is a particular pleasure to support the noble Lord, Lord Moylan, on his Amendments 59 and 107. Like him, I am instinctively against taking material down. There are content-neutral ways of marking or questioning material, offering alternatives and signposting to diverse sources—not only true but diverse. These can break this toxic drip feed for long enough for people to think before they share, post and make personal decisions about the health information that they are receiving.

I am not incredibly thrilled by a committee for every occasion but, since the Bill is silent on the issue of misinformation and disinformation—which will clearly be supercharged by the rise of large language models—it would be good to give a formal role to this advisory committee, so that it can make a meaningful and formal contribution to Ofcom as it develops not only this code of conduct but all codes of conduct.

Likewise, I am very supportive of Amendment 222, which seeks independence for the chair of the advisory body. I have seen at first hand how a combination of regulatory capture and a very litigious sector with deep pockets slows down progress and transparency. While the independence of the chair should be a given, our collective lived experience would suggest otherwise. This amendment would make that requirement clear.

Finally, and in a way most importantly, Amendment 224 would allow Ofcom to consider after the fact whether the code of conduct is necessary. This strikes a balance between adding to its current workload, which we are trying not to do, and tying one hand behind its back in the future. I would be grateful to hear from the Minister why we would not give Ofcom this option as a reasonable piece of future-proofing, given that this issue will become ever more important as AI creates layers of misinformation and disinformation at scale.

--- Later in debate ---
Baroness Bennett of Manor Castle (GP)

My Lords, it is a pleasure to follow the noble Baroness, Lady Prashar, and I join her in thanking the noble Lord, Lord Knight, for introducing this group very clearly.

In taking part in this debate, I declare a joint interest with the noble Baroness, Lady Fox, in that I was for a number of years a judge in the Debating Matters events to which she referred. Indeed, the noble Baroness was responsible for my ending up in Birmingham jail, when one such debate was conducted with its inmates. We have a common interest there.

I want to pick up a couple of additional points. Before I joined your Lordships’ Committee today, I was involved in the final stages of the Committee debate on the economic crime Bill, where the noble Lord, Lord Sharpe of Epsom, provided a powerful argument—probably unintentionally—for the amendments we are debating here now. We were talking, as we have at great length in the economic crime Bill, about the issue of fraud. As the noble Lord, Lord Holmes of Richmond, highlighted, in the context of online harms, fraud is a huge aspect of people’s lives today and one that has been under-covered in this Committee, although it has very much been picked up in the economic crime Bill Committee. As we were talking about online fraud, the noble Lord, Lord Sharpe of Epsom, said that consumers have to be “appropriately savvy”. I think that is a description of the need for education and critical thinking online: equipping people with the tools to be, as he said, appropriately savvy when facing the risks of fraud and scams, and all the other risks that people face online.

I have attached my name to two amendments here: Amendment 91, which concerns the providers of category 1 and 2A services having a duty, and Amendment 236, which concerns an Ofcom duty. This joins together two aspects. The providers are making money out of the services they provide, which gives them a duty to make some contribution to combatting the potential harms that their services present to people. Ofcom as a regulator obviously has a role. I think it was the noble Lord, Lord Knight, who said that the education system also has a role, and there is some reference in here to Ofsted having a role.

What we need is a cross-society, cross-system approach. This is where I also make the point that we need to think beyond the scope of the Bill—it is part of the whole package—about how the education system works, because media literacy is not a stand-alone thing that can be separated from critical thinking more broadly. We need to think about our education system, which, in schools in particular, far too often gets pupils to learn and regurgitate a whole set of facts and then rewards them for that. We need to think about how our education system prepares children for the modern online world.

There is a great deal we can learn from the example—often cited but worth referring to—of Finland, which by various tests has been ranked as the country most resistant to fake news. A clearly built-in culture of questioning, scrutiny and challenge is encouraged among pupils from the age of seven. That is something we need to transform our education system to achieve. However, many people using the internet now are, of course, not part of our education system, so this needs to happen across our society. A focus on the responsibilities of Ofcom and the providers has to be in the Bill.

Baroness Kidron (CB)

My Lords, over the last decade, I have been in scores of schools, run dozens of workshops and spoken to literally thousands of children and young people. A lot of what I pass off as my own wisdom in this Chamber is, indeed, their wisdom. I have a couple of points, and I speak really from the perspective of children under 18 with regard to these amendments, which I fully support.

Media literacy—or digital literacy, as it is sometimes called—is not the same as e-safety. E-safety regimes concentrate on the behaviour of users. Very often, children say that what they learn in those lessons is focused on adult anxieties about predators and bullies, and when something goes wrong, they feel that they are to blame. It puts the responsibility on children. This response, which I have heard hundreds of times, normally comes up after a workshop in which we have discussed reward loops, privacy, algorithmic bias, profiling or—my own favourite—a game which reveals what is buried in terms and conditions; for example, that a company has a right to record the sound of a device or share their data with more than a thousand other companies. When young people understand the pressures that they are under and which are designed into the system, they feel much better about themselves and rather less enamoured of the services they are using. It is my experience that they then go on to make better choices for themselves.

Secondly, we have outsourced much of digital literacy to companies such as Google and Meta. They too concentrate on user behaviour, rather than looking at their own extractive policies focused on engagement and time spent. With many schools strapped for cash and expertise, this teaching is widespread. However, when I went to a Google-run assembly, children aged nine were being taught about features available only on services for those aged over 13—and nowhere was there a mention of age limits and why they are important. It cannot be right that the companies are grooming children towards their services without taking full responsibility for literacy, if that is the literacy that children are being given in school.

Thirdly, as the Government’s own 2021 media literacy strategy set out, good media literacy is one line of defence against harm. It could make a crucial difference in enabling people to make informed and safe decisions online and to engage in more positive online debate, while understanding that online actions have consequences offline.

However, while digital literacy and, in particular, critical thinking are fundamental to a contemporary education and should be available throughout school and far beyond, they must not be used as a way of putting responsibility on the user for a company’s design decisions. I am specifically concerned that, in the risk-assessment process, digital literacy is one of the ways in which a company can say it has mitigated a potential risk or harm. I should like to hear from the Minister that this is an additional responsibility, not a substitute for responsibility.

Finally, over all these years I have always asked at the end of the session what the young people care about the most. The second most important thing is that the system should be less addictive—it should have less addiction built into it. Again, I point the Committee in the direction of the safety-by-design amendments in the name of my noble friend Lord Russell that try to get to the crux of that. They are not very exciting amendments in this debate but they get to the heart of it. However, the thing the young people most often say is, “Could you do something to get my parents to put down their phones?” I therefore ask the Minister whether he can slip something into the Bill, and indeed ask the noble Lord, Lord Grade, whether that could emerge somewhere in the guidance. That is what young people want.

Online Safety Bill

Debate between Baroness Bennett of Manor Castle and Baroness Kidron
Baroness Kidron (CB)

I thank the noble Lord for the intervention. For those noble Lords who are not following the numbers, Amendment 285, which I support, would prevent general monitoring. Apart from anything else, I am worried about equivalence and other issues in relation to general monitoring. Quite apart from a principled position against it, I think it is helpful to be explicit.

Ofcom needs to be very careful, and that is what Amendment 190 sets out. It asks whether the alternatives have been thought about, whether the conditions have been thought about, and whether the potential impact has been thought about. That series of questions is essential. I am probably closer to the community that wants to see more powers and more interventions, but I would like that to be in a very monitored and regulated form.

I thank the noble Lord for his contribution. Some of these amendments must be supported because it is worrying for us as a country to have—what did the noble Lord call it?—ambiguity about whether something is possible. I do not think that is a useful ambiguity.

Baroness Bennett of Manor Castle (GP)

My Lords, my name is attached to Amendment 203 in this group, along with those of the noble Lords, Lord Clement-Jones, Lord Strathcarron and Lord Moylan. I shall speak in general terms about the nature of the group, because it is most usefully addressed through the fundamental issues that arise. I sincerely thank the noble Lord, Lord Allan, for his careful and comprehensive introduction to the group, which gave us a strong foundation. I have crossed out large amounts of what I had written down and will try not to repeat, but rather pick up some points and angles that I think need to be raised.

As was alluded to by the noble Baroness, Lady Kidron, this debate and the range of these amendments show that the Bill is currently extremely deficient and unclear in this area. It falls to this Committee to get some clarity and cut-through, to see where we could end up and to change where we are now.

I start by referring to a briefing, which I am sure many noble Lords have received, from a wide range of organisations, including Liberty, Big Brother Watch, the Open Rights Group, Article 19, the Electronic Frontier Foundation, Reset and Fair Vote. It is quite a range of organisations but very much in the human rights space, particularly the digital human rights space. The introduction of the briefing includes a sentence that gets to the heart of why many of us have received so many emails about this element of the Bill:

“None of us want to feel as though someone is looking over our shoulder when we are communicating”.