Lords Chamber
Lord Nash (Con)
My Lords, I thank the Minister for her kind words and for the statement. I thank the Government for their active engagement in the matter of social media, albeit rather last-minute, and for making a binding commitment to impose some form of age or functionality restrictions for children under 16, to be focused on addictive features, harmful, algorithmic-driven content, and features such as stranger pairing, which we know can be most damaging to children’s safety and privacy and have led to so much harm and a number of deaths.
This is very welcome to the millions of parents, voters, teachers, health professionals and others who have been asking for it, and it is exactly what my amendment would have achieved. I would just ask the Government to get these lines to all Ministers, so that when they are on the airwaves, they stick to them, rather than giving long and rather confusing answers—because it is this statement against which we will be holding the Government to account, to deliver as soon as possible.
I thank all noble Peers from across the House who supported my amendment, particularly the noble Baronesses, Lady Berger, Lady Benjamin and Lady Cass, who put their names to it originally. I also very much thank my team, Ben and Molly Kingsley of Safe Screens, Bella Skinner and Becky Foljambe of Health Professionals for Safer Screens, Simon Bailey and Ed Oldfield. I also thank Annabelle Eyre and Henry Mitson, who have advised me on the process. Having taken five Acts through your Lordships’ House as a Minister, I have discovered how different the gamekeeper-turned-poacher process is. I also thank Susannah Street and Connie Walsh in the Public Bill Office for being so available to help me navigate the intricacies of the amendment process.
Above all, I thank the 27 bereaved parents who have campaigned so tirelessly alongside me, particularly Ellen Roome. They did not have to do this; they did it so that no other family would have to live through what they have lived through, and they have ensured that every child in the country will be safer as a result. I thank them for it, and I do hope that the Prime Minister will meet them, as they have requested, very soon.
Turning to the amendment of the noble Lord, Lord Clement-Jones, I share the noble Lord’s concern about timescale. I see no reason why the Government cannot act faster than the longstop they have allowed for, and I understand and have heard their statement that they intend to do so. I also share the concern of the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, about Ofcom. Having met recently with Ofcom and heard the long-winded and convoluted process it has to go through before it can stick anything on the social media companies, I was confirmed—if I needed any confirmation—in my view that we have to put the onus on the companies to get their houses in order by restricting children’s access to harmful features, rather than hoping we can regulate our way out of this problem.
However, we need to improve substantially the Online Safety Act and to strengthen Ofcom’s ability—and, if I might say so, its capacity and boldness. It is disappointing that the rumour is that there will be nothing in the King’s Speech which would enable us to do this. I hope we can live to fight this battle another day but, so far as this Bill is concerned, I feel the moment has passed.
My Lords, I thank the Government for listening to the voices of concern, including those of the bereaved parents, for our children’s safety to be at the forefront of all our minds.
As we move forward to the next steps, it might be a bit late in the day to make this suggestion, but I have an idea to throw into the mix. It may sound radical: require the tech companies and social media platforms to hold a licence from Ofcom to operate in this country. Radio and TV companies need a licence, so why not tech companies and social media platforms? If they do not comply, their licence will be taken away from them or they will be fined huge sums. Are we bold or intrepid enough to do this? It could be the answer to keeping them focused and to keeping our children safe. Age assurance is the key which they need to operate to keep our children safe. As we move forward, I hope that everyone will make it their responsibility to do just that, in every way possible. Ofcom is vital to all this. I look forward to working with the Government on this important issue and to us keeping our minds focused on our children’s safety, happiness and contentment for the future.
Lords Chamber
Lord Nash (Con)
My Lords, as this is the first time I have spoken on Report, I draw attention to my interests in the register, particularly the fact that I am—and have been for many years—an investor in many technology companies, mainly software companies.
I do not think I need to spend too much time telling noble Lords of the appalling worldwide industry of child sexual abuse, as I know many noble Lords are only too aware of it. There have been many powerful speeches about it already today and I went through it in quite a lot of detail in Committee, but I will mention a few facts. It is estimated that in the Philippines alone, one in every 100 children is coerced into this industry, often with their parents’ consent, for the gratification of paedophile customers across the world. It is estimated that around 70 million child sexual abuse images are floating around the internet, many of which are of very young children and some—quite a few, sadly—even of babies, as the noble Lord, Lord Russell of Liverpool, mentioned earlier. Many depict incest.
Some of the victims in these images have been viewed tens of millions of times. Imagine what it is like as a young girl or an adult walking down the street and seeing a man—it would be a man—look at you and peer at you for a few seconds, and to wonder whether that man has seen you raped online. With the advent of AI, it is, as we now know, possible to generate increasingly appalling images from nothing more than a text prompt.
Depending on whose statistics you look at, this country is the second or third-largest consumer of this dreadful stuff in the world. The National Crime Agency issued a report last month saying that it arrests 1,000 paedophiles a month in this country. There are tens of thousands of outstanding investigations, and it is estimated that there are well over half a million offenders in the UK alone. For some offenders, this online abuse is a gateway to real-life contact abuse, as the noble Lord, Lord Stevenson of Balmacara, has mentioned already. There is no doubt that some of this is fuelled by addiction to pornography and the desire for even more extreme content.
Under existing legislation, material can be taken down only once it has been seen—often by children. With livestreaming of this abuse, which is a very large industry, the images are watched in the moment and often immediately taken down. The tech companies already have methods of taking down much of this non-livestreamed material, but most of them are not using these methods effectively. Technology is now available to block on device the viewing of child sexual abuse images, or the making or livestreaming of them.
My amendment would mandate that this technology be installed on smartphones and tablets supplied in the UK. Of course, it would be open to manufacturers to develop their own technology to do that if they did not want to purchase a third-party product. Everyone I have spoken to, from regulators to technology experts and the companies themselves, is completely confident that that can be done. The problem is not the technology, which is now achieving very high accuracy levels, at 99%, and very low false positives, at under 1%.
Of course, the Government will also need to be satisfied that the technology works effectively. Several discussions about this have already taken place between the Home Office, DSIT, the Internet Watch Foundation and the technology company I introduced to them. The Government may also, initially at least, want to bring it into effect for a lower age, because of the difficulty sometimes of telling a 16 or 17 year-old from an 18 year-old. Since at least half of children being abused are under 13, that would be a very good start. My amendment would require the regulations to be brought into force within 12 months, but the regulations could mandate a further period for implementation.
Noble Lords will have noted that in place of my original Amendment 239, I now have down Amendment 239A. The difference is the addition of proposed new Clause 4(b) to ensure user privacy, which is perfectly possible under the technology because it is on the device; the data is not stored and does not go into the cloud.
We have the opportunity under the Bill to effectively hamper this appalling activity—indeed, industry—thereby saving and protecting many children from harm. I believe we have a moral obligation to pass this into law.
My Lords, I have put my name to Amendment 239A in the name of the noble Lord, Lord Nash, as I believe we need to protect our children, however and wherever we can, from child sexual abuse material being created and shared. Shockingly, over 70 million images—yes, 70 million—are being circulated around the world, far beyond these shores, via the scourge of the online world. There is sexual imagery involving children as young as seven to 11, who are being exploited and watched by an ever-growing audience. This is not only immoral but cruel, despicable and illegal. It makes me weep to think that children’s childhoods are being snatched away from them as we speak.
Organisations such as the Internet Watch Foundation have helped to secure the arrest of those responsible for CSAM offences but, despite those arrests, the number of offenders continues to grow. Demand is not diminishing; it is being fed by sick-minded, perverted individuals. Heartbreakingly, where demand for new imagery grows, so does the abuse of real children to produce it.
Social media is central to how offenders operate. Some 40% of CSAM offenders attempted to contact a child after viewing material, with 70% doing so online, mostly through social media, gaming and messaging platforms, while 77% of offenders found CSAM on the open web, with 29% citing social media.
I have met young people who have remained victims of this vile practice years after they became adults. They describe the ongoing harm they suffer because the images of their abuse remain in circulation. They have had their abuse material viewed millions and millions of times. Research has confirmed that survivors with an online element to their abuse report significantly higher levels of long-lasting harm, including depression and anxiety, post-traumatic stress disorder, self-harm, substance abuse, social isolation and sexual dysfunction, compared with survivors whose abuse was never recorded or shared online.
The cruelty that these survivors must endure extends even further. Some are actively hunted in adult life by offenders seeking to see how they look today. Can your Lordships believe this? With AI, offenders are now generating new abuse imagery featuring adult survivors—in some cases producing material in which the survivor appears to be abusing their younger self. Does that not make you want to cry?
Imagine if it was your child or grandchild, and what it means to live that reality. Imagine a survivor, as the noble Lord, Lord Nash, described, walking down the street, catching the eye of a stranger and immediately, involuntarily, thinking, “Have you seen the image of me being abused?” Does that not make your heart bleed? This is the daily experience of people whose abuse is permanently accessible online.