My Lords, I support all the amendments in this group, and in particular I pay tribute to the noble Baroness, Lady Kidron, for her endless work in this capacity. This is the first time I have spoken on any of these groups of amendments. I find everything the noble Lord, Lord Nash, the noble Baroness, Lady Kidron, and others have said truly shocking. Some 55 years ago, I started a magazine called Spare Rib. If I had ever dreamed, in my wildest and worst nightmares, that I would one day find myself listening to what everyone has been talking about, I suppose we would not have gone on. In so many ways, women, and certainly young girls, now find themselves in a worse situation. I carried on riding a pony till I was 15—that was my childhood—and then I found boys. This is so terrible, and I congratulate every noble Lord, and particularly the noble Baronesses, on the work that they have done.
I will be very brief, as I just want to speak in support of the amendment from the noble Lord, Lord Nash, and Amendment 266. The simple point is that AI is already being used to harm children. Unless we act decisively, this harm will only escalate. The systems that everyone has been discussing today are extraordinary technological achievements—and they are very dangerous. The Internet Watch Foundation has reported an explosion in AI-generated child sexual abuse material. Offenders can now share instructions on how to manipulate the models, how to train them on illegal material and how to evade all the filters. The tools are becoming so accessible and so frictionless that a determined offender can produce in minutes material that once would have required an entire criminal enterprise. Against that backdrop, it is quite staggering that we do not already require AI providers to assess whether their systems can be used to generate illegal child sexual abuse material. Amendment 266 would plug this gap. Quite frankly, I cannot for the life of me see why any responsible company would resist such a requirement.
Amendment 479 addresses a confusion that has gone on for too long. We cannot have a situation where some companies argue that generative AI is a search service and therefore fully in scope of the Online Safety Act, while others argue the opposite. If a model can retrieve, repackage or generate harmful content in response to a query, the public deserve clarity about precisely where the law applies.
On Amendment 480, this really is an issue that keeps me awake at night. These chatbots can be astonishingly persuasive. As the noble Baroness, Lady Kidron, says, they are also addictive: they are friendly, soothing and intimate, and they are a perfect confidant for a lonely child. They can also generate illegal material, encourage harmful behaviour and groom children. We have already seen chatbots modelled on sex offenders and heard reports of chatbots sending sexualised messages to children, including the appalling case of a young boy who took his own life after weeks of interaction with an AI chatbot. We will no doubt hear of more such cases. The idea that such systems might fall through the cracks is unthinkable.
What these amendments do is simple. They say that if a system can generate illegal or harmful content for a child, it should not be allowed to do so. Quite frankly, anything that man or woman can make, man or woman can unmake—that is still just true. We have often said in this Chamber that children deserve no less protection online than they do offline. With AI, however, we should demand more, because these systems are capable of things no human predator could ever manage. They work 24/7, they target thousands simultaneously and they adapt perfectly to the vulnerabilities of every child they encounter. The noble Baroness, Lady Kidron, is right to insist that we act now, not in two years—think how different it was two years ago. We have to act now. I say to the Government that this is a real chance to close some urgent gaps, and I very much hope that they will take it.
My Lords, I support all the amendments in this group, but I will speak to Amendments 479 and 480 in the name of the noble Baroness, Lady Kidron. I declare my interest as a guest of Google at their Future Forum, an AI policy conference.
These amendments are vital to ascertain the Government’s position on AI chatbots and where they stand in relation to the Online Safety Act, but I have to question how we can have been in a state of ambiguity for so long. We are very close to ChatGPT rolling out erotica on its platform for verified adults. Six months ago, the Wall Street Journal highlighted the deeply disturbing issue of digital companion bots engaging in sexual chat with users who had told them they were underage. Further, they willingly played out scenarios such as “submissive schoolgirl”. Another bot, purporting to be a 12 year-old boy, promised that it would not tell its parents about dating a user identifying himself as an adult man. Professor Clare McGlynn KC has already raised concerns about what she has coined “chatbot-driven VAWG”, the technology itself being designed to be sexually suggestive and to engage in grooming and coercive behaviours. Internet Matters found that 64% of children use chatbots. The number of companion apps has grown rapidly, and researchers at Bournemouth University are already warning about the addictive potential of these services.
The Government and the regulator cannot afford to be slow in clarifying the position of these services. It raises a wider question of how we can be much more agile in our response and continually horizon-scan, as legislation will always struggle to keep pace with the evolution of technology. This is the harm we are talking about now, but how will it evolve tomorrow? Where will we be next month or next year? It is vital that both the Government and the regulator become more agile and respond at pace. I look forward to the Minister’s response to the noble Baroness’s amendments.
My Lords, I shall speak very briefly. Earlier—I suppose it was this morning—we talked about child criminal exploitation at some length, thanks particularly to the work of the noble Baroness, Lady Casey, and Professor Jay. Essentially, what we are talking about in this group of amendments is child commercial exploitation. All these engines, all these technologies, are there for a commercial purpose. They have investors who are expecting a return and, to maximise the return, these technologies are designed to drive traffic, to drive addiction, and they do it very successfully. We are way behind the curve—we really are.
I echo what the noble Baroness, Lady Morgan, said about the body of knowledge within Parliament, in both Houses, that was very involved in the passage of the Online Safety Act. There is a very high level of concern, in both Houses, that we were perhaps too ambitious in assuming that a regulator that had not previously had any responsibilities in this area would be able to live up to the expectations held, and indeed some of the promises made, by the Government during the passage of that Act. I think we need to face up to that: we need to accept that we have not got it off to as good a start as we wanted and hoped, and that the technologies we have been hearing about are racing ahead so quickly that we are finding it hard to catch up. Indeed, look at the body language and the physiognomies of your Lordships in the Chamber, and at the expressions on our faces as some of this material is described: if it is having that effect on us, imagine what effect it is having on the children who, in many cases, are the subjects of these technologies.
I plead with the Minister to work very closely with his new ministerial colleague, the noble Baroness, Lady Lloyd, and DSIT. We really need to get our act together and focus; otherwise, we will have repeats of these sorts of discussions where we raise issues that are happening at an increasing pace, not just here but all around the world. I fear that we are going to be holding our hands up, saying “We’re doing our best and we’re trying to catch up”, but that is not good enough. It is not good enough for my granddaughter and not good enough for the extended families of everybody here in this Chamber. We really have to get our act together and work together to try to catch up.
My Lords, I welcome the Minister to her new role, and I very much look forward to working with her. I further welcome the clarification that this Bill brings to the law on spiking and the new offence of taking non-consensual intimate images. I very much look forward to supporting my noble friend Lady Sugg on her amendments on honour-based abuse and my noble friend Lady Bertin on her amendments on online pornography. I want to take this opportunity to congratulate my noble friend Lady Bertin on her brilliant review and to thank her for her tireless efforts in pushing for comprehensive legislation on online pornography.
I turn now to the new taking offence. I greatly welcome the implementation of the Law Commission recommendation to update the pre-existing voyeurism and upskirting offences and implement a single taking offence. I am very pleased to see that, crucially, it is a consent-based offence, removing the unnecessary burden of having to prove the motivation of the perpetrator, which has featured in previous iterations of image-based abuse offences. However, it is vital that we further strengthen this offence by increasing the time limits prosecutors have to bring forward charges, so that victims are not inadvertently timed out by the six-month time limit of a summary offence.
In February, the Government gave me an undertaking to extend the time limits for the non-consensual creation offence in the data Bill after the issue was highlighted by the campaign group #NotYourPorn. The extension of the time limit there means that, for the creation offence, victims have three years from when the offence is committed or, alternatively, from when the CPS has enough evidence to prosecute. Given that we have already established a legal precedent for extending the time limits on image-based sexual abuse, I would be grateful if the Minister, in his summing up, could commit to extending the time limits available in both the new taking offence and the pre-existing sharing offence, to ensure that all image-based abuse offences have parity within the law.
I was pleased to see the updating of the Sentencing Code to reflect the new taking offence and to clarify that the photograph or film to which the offence relates, and anything containing it, is to be regarded as used for the purpose of committing the offence. However, I am keen that we look into further ways to ensure that this content is not kept by perpetrators and remains offline in perpetuity. Further, I will continue my work with survivors of this abuse and with charities to explore ways in which this content can be removed from the internet as rapidly as possible.
Additionally, I am concerned that there does not seem to be a sufficient definition in the offence of what it is to “take” an image or video, and I would therefore also be grateful if the Minister could confirm that the definition of taking will include screenshotting. In its 2022 report on intimate-image abuse, the Law Commission gave the example of a person who may consent to being in an intimate state on a video call but not consent to the other person screenshotting them. The Law Commission concluded that taking a screenshot of a video call should fall under the definition of taking, because this conduct creates a still image that does not otherwise exist.
I turn now to the issue of spiking, which my colleague in the other place, Joe Robertson MP, has highlighted, alongside the campaigners Colin and Mandy Mackie, whose son Greg tragically died after a spiking incident at university. While the clarification of spiking in this new offence is very welcome, I echo the point made by my noble friend Lady Coffey that there is concern that the intention element might be too narrow and might not allow for cases of spiking that do not fall within the categories of intent to injure, aggrieve or annoy. This Bill is a positive step, and I look forward to working with the Government and noble Lords to strengthen it.
My Lords, it is a great pleasure to speak in the debate on the humble Address, and I welcome the new Ministers to their place. I was pleased to see this Government’s commitment to halving violence against women and girls. However, I am keen to understand whether the renewed focus on VAWG will include tech-facilitated abuse, as I was disappointed that no reference was made to the growing crisis of image-based abuse. Over the last few years, we have seen a piecemeal approach to legislating on this issue, with upskirting, cyber flashing and the sharing of intimate images now illegal, but the non-consensual taking of sexually explicit images, as well as the solicitation to create, and the creation itself of, sexually explicit deepfakes, remaining gaping omissions in our patchwork of law in this area.
I was pleased to see the Labour Party manifesto make a commitment to legislate on the creation of non-consensual sexually explicit deepfakes. Ninety-nine per cent of all sexually explicit deepfake videos feature women. If the Government are to succeed in their plan to tackle VAWG, they must not treat online violence in isolation; it can often form part of a much wider picture of abuse. Every day that we delay introducing this legislation is another day when women have to live under the ever-present threat that someone will steal their picture to create sexually explicit images or pornographic videos of them. Every woman should have the right to choose who owns a naked image of her.
I have been privileged in my work in this area to meet Mariano Janin, who understands that sexually explicit deepfake videos were used to bully his beautiful 14 year-old daughter, Mia, leading to her tragically taking her own life. Sadly, this is not an isolated incident. I have also been entrusted with her story by “Jodie”, whose case may be familiar, as she was brave enough to speak to the BBC about the trauma of being deepfaked by someone she counted as her best friend. Jodie discovered that pictures had been taken from her private Instagram page, overlaid on to pornographic images and posted on Reddit and other online forums, with comments asking people to rate her body. Jodie endured this abuse for five years, finding hundreds of pictures of herself, her friends and many other young women.
While it is illegal in the UK to share sexually explicit deepfaked images, it is still not illegal to create them. In Jodie’s case, the perpetrator was soliciting the creation of images from others. It is of the utmost importance that solicitation becomes an offence in itself, to prevent deepfakes being solicited from jurisdictions that may not yet have legislated. We must not underestimate the real impact this digital content has on those, such as Jodie, whose image has been stolen. The content is often used to bully, harass and even extort money. It is not a one-off experience: survivors often have to manage the trauma of this digital content trending, or being subject to further digital abuse, at any given moment.
We must become more agile in our response by ensuring that we view tech-facilitated abuse as a cohesive whole; we must work to find the balance between Parliament having legislative oversight and a regulator having the power to act quickly, not only to remove harms but to anticipate and future-proof against them.
I am determined that we should close the gaps on the taking of non-consensual intimate images, as well as the creation of, and solicitation to create, non-consensual sexually explicit deepfakes. My Private Member’s Bill, being introduced on 6 September, seeks to address this. Urgent legislation is required as part of this new Government’s VAWG strategy to ensure the safety of women and girls online. It is not enough to react to this abuse; we must prevent it happening in the first place.
Lords ChamberThe noble Lord makes an extremely important and welcome point. It is a fact that young men are less likely to record incidents of this sort of thing, for what reason I do not know, although I imagine that embarrassment and shame probably play a major part. Education has to be a factor in this, and we have to make it clear that, if you suspect that you have been a victim of spiking, it is necessary to get tested as soon as you can.
We are dealing with the culture behind some of these aspects in a much broader context. The Angiolini inquiry, which is looking into various incidents that have happened within the police over the last two years, will deliver its results soon. I hope that they go a considerable way towards addressing some of the cultural failings that have perhaps led to these things.
My Lords, given that the data collected by the NSPCC found that “student” was the highest-recorded occupation of those who had been spiked, does my noble friend the Minister agree that the Government should work with universities and colleges to offer support for students and raise awareness about attending events on non-licensed private premises, such as student accommodation?
I thank my noble friend for her question. She is absolutely correct, of course. As I have already said, we all have a part to play in tackling spiking and it is vital that we do this collaboratively. The Government and law enforcement have engaged with the sector, both through the Department for Education’s spiking working group, which is chaired by Professor Lisa Roberts, the vice-chancellor of the University of Exeter, and as part of a range of freshers-related communications activity carried out this year and last. As part of its most recent phase, the Government’s behaviour-change campaign “Enough” has partnered with more than 30 universities in the UK and produced a range of bespoke online and offline communications assets, which look to speak directly to student and university scenarios. Spiking assets form part of this package of work.
I could go on, but I completely agree with my noble friend and there will be a lot more to say on this. A consultation is ongoing with the Office for Students, which is due to deliver its report at the beginning of next year. We will have more to say then.