Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.
Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
This information is provided by Parallel Parliament and does not comprise part of the official record.
I beg to move,
That this House has considered the matter of tackling the digital exploitation of women and girls.
It is a pleasure to serve under your chairship, Ms Jardine. Online abuse and digital exploitation are extremely prevalent in the modern-day world. The targeting of women and girls in online spaces is growing into a market where legislation is not keeping up with the speed of the digital world, so much so that the world’s richest man considered it acceptable and a matter of free speech to have his personal artificial intelligence platform undress women without their consent. That is shameful.
There is a growing difference between in-person exploitation—including sex trafficking, grooming, domestic violence and coercive control—and digital abuse and exploitation of someone’s image, where victims are often not known to perpetrators. In most cases they may not have any knowledge that they are even being exploited, and these crimes often happen in a highly organised manner.
In Lancashire, the police and crime commissioner conducted a survey of 4,800 people on violence against women and girls—otherwise known as VAWG—which asked about digital abuse. Half of the women surveyed, 51%, said they had experienced unwanted or inappropriate messages or images online. Only 12% of those women reported it to the police or any official body. Only a third of survey respondents felt confident that the police would act if they reported an incident, and just 8% trusted the wider criminal justice system to deliver any kind of justice.
Research by the domestic abuse organisation Refuge states that almost every survivor they have supported was subject to some form of technology-facilitated abuse. Some 95% of survivors of technology-facilitated abuse said it had impacted their mental health. I work closely with many organisations in Preston that tackle VAWG, many of which I am pleased to say are here today to observe the debate: the Foxton Centre, Lancashire Women, Hope Prevails Preston, Girls Who Walk Preston and Trust House Lancashire.
This is an incredibly difficult subject to address, and I thank the hon. Gentleman for doing it incredibly well. As a grandfather of three beautiful granddaughters and having seen how the online world has made so many women and girls vulnerable to despicable attacks, I certainly share his concerns, and I believe that we must do more to ensure that safety is paramount. Does he agree that not only do we need to make it digitally impossible to carry out exploitation, but we must ensure that our young people are taught the dangers of image sharing, which can lead to image replication online? The Department for Education, in co-ordination with parents, has a key role to play in that.
I totally agree. In fact, I have just been discussing with some of our visitors from Lancashire what needs to happen in schools so that young people are aware of digital exploitation and the damage and distress that it can cause. I hope the Minister, who is in her place, will look at ways in which the Government could facilitate legislation so that, in future, many of these digital crimes could be included in the statute books and not be regarded as things that the police can do nothing about.
Does my hon. Friend agree that consent is vital, teaching about consent and seeking consent is imperative, and that the influencers and politicians who say that is too woke and is unnecessary are actually putting our children in danger?
I totally agree. Without getting too much into the politics of this, there is a very right-wing argument about what free speech means. When I first came to Parliament in the early 2000s, the then Prime Minister used to talk about rights and responsibilities. We all believe in rights, but they need to be balanced against responsibilities. The idea that somebody can print or send a person’s image and not take responsibility for the consequences is ridiculous. A lot of the current free speech arguments do not give that balance. We need to make sure that the young people and older people who are carrying out these acts know fully what they are doing and are willing to take responsibility for it, and that the legal system is equipped to deal with it.
As I said, research by Refuge showed that almost every survivor it supported was subjected to some form of technology-facilitated abuse. When we talk about stalking and abuse, most of us think of conventional stalking, but the digital world has transformed the ways in which perpetrators use online tools to commit intimate partner abuse and coercive control.
In relationships with coercive control as an element, technology is often used to stalk, track and watch a partner’s location at all times. If this were carried out in person and a perpetrator physically followed an individual, that would cross the threshold for criminal prosecution. However, doing the same thing through a device in the digital world has become normalised and is not considered criminal.
This encourages perpetrators to use technology to facilitate abuse, as it is more convenient and often evades prosecution. Tech devices such as Ring doorbells, AirTags and cloning devices are used to track and further stalk victims. In-person stalking is facilitated by the introduction of such devices. Lancashire constabulary has acknowledged the increasing use of social media and messaging apps in coercive control cases. All police forces in the north-west report a year-on-year increase in digital elements in VAWG cases.
We are also seeing the rise of non-consensual intimate image sharing, often referred to as revenge porn. These images are often shared digitally on websites. The person in the image does not have ownership of that image, meaning that it is almost impossible for it to be taken down at the victim’s request. The Revenge Porn Helpline recorded 22,275 reports in 2024, which is a 20.9% increase from the previous year, and 412,000 intimate images have been reported since 2015. We have also seen an increase in online stalking, often through social media, catfishing and doxing, to share personal information such as addresses, workplaces, children’s schools, benefits and immigration status and much more.
The issue of digital abuse is even more severe. Perpetrators are often unknown individuals or organisations, utilising online means such as dark web markets and unregulated servers. Strangers obtain images of women and monetise them. Some of the images are obtained or created to order, and there is increased use of AI to create deepfakes, undress victims and attach false bodies to victims’ faces.
There is also a market for perpetrators to obtain lists of leaked passwords, which can then be used to hack into victims’ personal accounts, where personal photographs are often kept. Widely used dark web servers are monetised to exchange explicit images of women and girls. That represents a significant shift from earlier patterns, where such images were more commonly shared privately by an intimate partner within their social circle.
In the majority of cases of digital abuse, the victims and perpetrators are not known to each other. The threat to share images also happens to under-age boys and girls, including threats to share falsified, AI-created images. That has led to suicide attempts, and, tragically, the completion of suicide in some cases.
One of the most concerning developments in digital exploitation is under-the-radar abuse, where victims have no idea or knowledge that their private images have been accessed, copied or distributed without their consent. We are aware that the non-consensual distribution of intimate images, even of children, has led to perpetrators planning to gang rape some victims in those images. Currently, there is no legal definition of technology-facilitated abuse, and I hope that the Minister will address that point in her remarks. It means that it is almost impossible to prosecute, leaving victims without the ability to seek justice before the court. That situation should not be allowed to happen.
Prosecutors have to rely on other legislation that is often outdated and does not take the growing digital world into account. That results in short sentences and no real justice for the victims. The difficulties of prosecution lead to a lack of understanding by both prosecutors and the police, as well as continued offences and reoffending, which pushes victims into stressful situations, contributing to mental health complexities. The volume of cases put to prosecution by the Crown Prosecution Service is remarkably low, and convictions are even lower. Although we are seeing an increase in the reporting of digital abuse, it often leads to victims feeling disappointed and isolated, as the criminal justice process fails to deliver meaningful outcomes.
The Computer Misuse Act 1990 is outdated, and the police are still having to apply the Police and Criminal Evidence Act 1984 when considering seizing and examining devices. That is also outdated, as mobile phones and cloud-based storage did not exist in the way that they do today. Unfortunately, in some cases, victims need to collate their own evidence, as police forces do not have adequate training or an understanding of the severity of these issues. For example, a member of Girls Who Walk Preston, who is in the Public Gallery, has been pursued by a stalker for the last six years, but the police have been unhelpful because she has not received death threats. The stalker lives in a different police force area and refuses to admit that the accounts in question belong to them. The victim has been forced to collate her own evidence to prove that the accounts do in fact belong to the stalker. That should not be necessary. It is an outrage, and she should not have to endure it.
I supported a constituent who has been stalked at work, which again fell under a different police force. There is a clear gap in how different constabularies understand and can prosecute those cases. While the picture has been improving in that regard in Lancashire, it is not a country-wide push.
My hon. Friend is making a very powerful speech. Is it not also imperative—I think the Government have done some work on this—that women being stalked know who their stalker is? Sometimes they do not know, and it means that they could be at a bus stop, and their stalker could be behind them without them knowing. I am glad that the Government have done some work on that. It is important that, as technologies change, we have the legislation to keep up with them.
I totally concur; my hon. Friend makes some very strong points.
The examples that I just mentioned highlight the need for consistent and mandatory guidance for all police forces to recognise digital abuse and online stalking, and not just take it for granted that there is nothing they can do about it. I call for national mandatory education for all police forces to recognise and deal with victims of digital abuse and exploitation; exploration of a legal definition of technology-facilitated abuse; and the provision of up-to-date legislation for the prosecution of offences, modernised in line with an ever-evolving digital world.
I welcome the commitment from the Minister to work with technology companies to stop online predators and the spread of explicit images stolen through hacking. I also welcome the Government’s commitment to enact, through the Crime and Policing Bill, a new offence of taking or recording an intimate photograph or film without consent. I would be happy to support the Minister in any way to achieve this objective, working with my own local police force.
Several hon. Members rose—
I remind Members to bob if they wish to be called to speak. I will call the Front Benchers at eight minutes past 5, so will Members please keep their speeches to about two and a half minutes each?
I congratulate the hon. Member for Preston (Sir Mark Hendrick) on securing this very important debate. In 2018, I introduced the upskirting Bill, the Voyeurism (Offences) Bill. At the time, an alarming number of men did not consider it harassment or an offence to upskirt a female. Too often, behaviours such as upskirting are dismissed as a laugh or as not that serious. I reject that entirely.
These are not victimless acts. We know that these kinds of violations often cause long-lasting psychological harm to the victims. We must also recognise the strong link between online and offline abuse. After all, it was the offence of upskirting that first led to Dominique Pelicot’s horrific crimes being brought to light in 2020. We know that if perpetrators get away with lower-level offences, they move on to more serious crime.
The law must move as fast as technology does, but it feels as if we are constantly on the back foot in reacting to novel uses of technology that harm women and girls, for example the recent rise of AI-generated indecent images and deepfakes. We must develop more proactive measures, because by the time we legislate against one form of technology-facilitated abuse, another seems to emerge.
It is for Ofcom to hold social media companies to account, but in my view it is currently failing to treat the digital exploitation of women and girls with the seriousness that it deserves. That is why we Liberal Democrats are calling for a dedicated online crime agency to effectively tackle illegal content and activity online. I hope that the Government will take that seriously.
Another example of technology developing faster than regulation is the rise of covert filming using smart glasses. Across social media, footage is being uploaded of women who have been filmed without their consent. Often, it has been taken outside nightclubs and gyms, when women are out walking or running—as we heard in the earlier debate—or on beaches, violating the privacy of women without their even being aware that they are being filmed.
The Government must send a clear message to the tech sector that women’s safety is not optional. If they are serious about tackling the epidemic of violence against women and girls, we must create a safer online environment, backed up by strong legislation and enforcement.
Mr Richard Quigley (Isle of Wight West) (Lab)
It is a beyond fantastic pleasure to serve under your chairship, Ms Jardine. I thank my hon. Friend the Member for Preston (Sir Mark Hendrick) for securing such an important and extremely timely debate. As we have seen over the past weeks, months and years, the exploitation of women and girls comes in many forms; the digital landscape does not automatically protect them from physical harm. We have seen recently how easily predators can sexually exploit women and girls through deepfakes on social media, and how these actions are actively promoted by platforms. That exploitation can also translate into the exploitation of physical insecurities, which can lead women and girls down a dangerous and unattainable path.
Unrealistic body standards in advertising are not a new phenomenon, but AI has deepened the problem. It is beyond the pale: it now enables girls to be exposed to imagery promoting body types that quite literally defy the laws of physics. Such content is not something that they stumble on; it is actively pushed at them through targeted advertising. Once a social media platform identifies a user as a girl or a young woman, the algorithm begins promoting harmful material directly into their feed.
A recent Government report found that 19% of girls aged between 14 and 16 had been exposed to harmful content promoting extreme thinness—nearly three times the figure reported by their male classmates. A leaked internal report from Meta went further, indicating that its eating disorder-related algorithm is more likely to target users who have previously engaged with content about body dissatisfaction. That is a malicious example of how these companies, which by the way are fully aware of these risks, continue to exploit the insecurities of women and girls.
Some of the most distressing cases of deepfakes and technology-facilitated sexual abuse have resulted in women losing their job, their reputation and even access to their children. These incidents send an appalling message that perpetrators can wield overwhelming power over their victims. They confirm the very fears that many survivors live with every day.
If social media and AI companies are willing to stand by while their algorithms amplify content about dangerous eating disorders and give abusers free rein to harm women and their children, we must act. This is a significant fight, but one that I and many of my colleagues are absolutely committed to continuing. If we are not vigilant, we risk allowing our digital world to be shaped by the misogynistic impulses of figures like Elon Musk and the wider manosphere, rather than building an online world that safeguards and protects women and girls.
Lola McEvoy (Darlington) (Lab)
It is a pleasure to serve under your chairmanship, Ms Jardine. The argument for urgent and robust regulation to protect girls online has been won, thanks in no small part to the grit and resilience of survivors who have spoken out, and of the Minister herself, who is a formidable force in this area.
I began campaigning on this issue in defence of 14-year-old girls. Being 14 is tough: your hormones are wild, your body is changing, you take risks and you are desperate to belong. It has not been that long—although it is longer than I care to state to the House—since I was 14, so I do remember it. Some people advocate for an imaginary time gone by, filled with innocence, skipping ropes and cross-stitch, but that world was not real for most girls. Most 14-year-old girls will take risks, will keep secrets, will have a crush or 10, and will say mean things they should not say. That is part of growing up.
What we have allowed to happen to our girls, through the explosion of unsupervised stranger contact and self-published content, is utterly appalling. It is not normal, and we must take action now. I will outline as quickly as possible why I think we must take action to ban any form of stranger contact for under-16s online and why self-published content and functionalities that publish unregulated and unvetted content need to be banned for under-16s, to stop the exploitation.
The first meeting I had when I was elected was with the headteachers of secondary schools in Darlington, about online safety. I wanted to hear what the real issue was in Darlington and how severe it was, because so many parents had raised it with me. The results from the first forum, in which a thousand pupils came forward, were worse than I had feared: 60% of girls had known someone who had been bullied or blackmailed online, 53% of girls had been contacted by somebody lying about their age, 49% had been asked for pictures or personal information, compared with 28% of boys, and over 70% had been contacted by a stranger. One 14-year-old girl told me that they had been added to groups of strangers and that extreme content was then shared. That was on an app where you are not supposed to have contact from strangers.
The platforms say that adults should no longer be able to contact under-16s, but it is obvious that it is still happening. The checks are easy to circumvent: recent analysis of the ban in Australia showed that a lot of children had drawn on moustaches and coloured in fake beards in order to pass the facial recognition age verification test. I urge the Minister to look into that and to offer her support for a more rigorous ban.
Obviously, strangers should not be able to contact girls under 16. The secondary point about enforcement is quite clear, but there is also a point that has not yet been made to the House, about self-publishing and online safety. Where we see self-published content, we see an organised criminal network of people grooming children through links and through more enticing content that might lead them into a darker space that is even less regulated. I urge the Minister to support work to address that.
Finally, we need to address the deep imbalance in who pays the price for online extortion with images of girls. Girls who make a single mistake are made to suffer a permanent price. That is obviously wrong. Their image could be circulated at any time. The threat of it is unbearable: it can be used anywhere and shared with any number of people throughout their life. That is brutal enough, but as one girl in the forum said to me, “It’s so wrong. The girl’s always blamed. She’s totally responsible.” Can the Minister outline how we can do more to support girls who are victims and survivors?
Joe Morris (Hexham) (Lab)
I thank my hon. Friend the Member for Preston (Sir Mark Hendrick) for securing this debate. It could not be timelier, because today represents a significant day for the constituency and community that I represent.
Three years ago, Holly Newton lost her life at the hands of her ex-boyfriend in Hexham. She was a much-loved daughter, granddaughter and sister, a loving friend and a talented dancer. She was just 15 years old. Her ex-boyfriend had become increasingly obsessed, coercive and controlling in the lead-up to her murder, attempting to isolate her from friends and relentlessly bombarding her. Three years ago, he used Snapchat to track her movements before fatally stabbing her. The exploitation of technology was used as one of a range of tools to abuse Holly and ultimately to end her life.
Devastatingly for her family, this abhorrent act of violence against Holly and the abuse that preceded it are not recognised as domestic violence, as Holly was under the age of 16. Micala, Holly’s mother, experienced a horrifyingly similar lack of support or acknowledgment of her own experience of domestic abuse as a teenager. Without recognition for domestic abuse victims under the age of 16, the system will continue to fail children across the country.
I support the family and the campaign for Holly’s law, which would change the age of recognition and the development of relationship education for all young people. It is a critical flaw that we are not legally recognising victims when we know that they exist and that perpetrators in the realm of digital exploitation and abuse are themselves increasingly under the age of 16 or 18. I thank the Minister for the time she took to meet us last week, when we discussed the next steps for the campaign and addressed potential routes to reform.
Technology is being weaponised against women and girls at a speed that far outpaces our systems to safeguard and support victims, prosecute perpetrators and intervene in cases before warning signs escalate into fatalities. I want to touch briefly on a case that my office has been working on with another constituent for well over a year. Not only has she suffered the most appalling digital violation, but she has been a victim of systemic flaws when it comes to this form of abuse. She discovered that her partner had spent several years taking non-consensual intimate photographic images of her and had posted them to websites and forums online. He was arrested, but while he was in custody he refused to share the PIN to access his device.
The investigating force did not have the technology required to effectively review the device, which was key to the perpetrator’s activity. It had no way to prove where the non-consensual images came from or prove their existence with any electronic footprint on the suspect’s devices. With only circumstantial evidence based on who had access to the images, and with the suspect denying the accusations against him, the police could not meet the evidential threshold required for the CPS to charge. After being released, he was free to go straight back into the community, holding the very device that could be used to further perpetrate abuse, which he did. He turned to AI nudification apps to continue to produce non-consensual imagery of the victim and cover his digital tracks in the process.
Investigation resources, appropriate technologies and the boundaries of the evidential threshold have all conspired against this innocent woman whose life has been devastated by digital and domestic abuse. I urge the Minister to look proactively at a cross-departmental approach to ensuring that our commitment to tackling digital exploitation is effective and addresses the systemic gaps.
Several hon. Members rose—
Order. I am sorry, but to get everyone in we will have to go down to two minutes each.
Michelle Welsh (Sherwood Forest) (Lab)
It is a pleasure to serve under your chairmanship, Ms Jardine. I thank my hon. Friend the Member for Preston (Sir Mark Hendrick) for securing this important debate.
With increased access to the internet, the exploitation of vulnerable children, predominantly young girls, has become easier for predators. Over the past few years, we have seen the rise of deepfakes on social media: the National Police Chiefs’ Council estimates that they have increased in prevalence by 1,780% between 2019 and 2024. Let me be clear: the toxic culture of misogynistic behaviour online is not banter. It is not free speech. It is abuse.
We know that the victims of online misogyny, abuse and exploitation are predominantly women and girls, so I welcome the Government’s bold action in making it a criminal offence to create or request the creation of non-consensual intimate images. How we respond to online abuse defines what kind of society we are and what kind of society we are prepared to be. We should be a society that stands up for dignity and equality for all women and girls.
The speed at which these images can be produced and shared is truly alarming. I worry that without social media platforms taking more responsibility to remove this content from their sites, we will never truly be rid of it. There needs to be more emphasis on stopping the predators who create the images and on ensuring that such images can be removed swiftly from sites to protect women and girls. That needs to be backed by legislation.
The answer should never be for girls and women to log off or stay quiet. Exploitation of women and girls online is not inevitable; it is a failure of choice and a failure of systems. If we have the power to design these systems, we have the responsibility to make them safe.
Jas Athwal (Ilford South) (Lab)
I thank my hon. Friend the Member for Preston (Sir Mark Hendrick) for securing this hugely important debate.
At the start of the year, many of us were horrified by the ease and speed at which thousands of women and children had their bodily autonomy violated, simply for existing online, but we should not kid ourselves that this was a one-off or an isolated scandal. Digital abuse of women and girls is not an anomaly; it is systemic. Only a few weeks ago, it was reported that Meta’s smart glasses are being used to secretly film women without their consent. This is not accidental misuse; it is foreseeable harm. Before these technologies are embedded in the fabric of everyday life, we have a duty to regulate and legislate so that women and girls do not become the tragic cost of a tech revolution that prizes innovation over safety. We must be proactive, not reactive, because the ways in which women and girls are degraded and denied bodily autonomy online are constantly evolving.
We must enforce safety by design, and it must mean more than administrative box-ticking. Tech firms must be able to demonstrate that they have seriously considered the harms their products may cause, and that they have meaningfully mitigated those risks. There is so much more I could say on this, particularly in the case of Grok and Meta’s smart glasses. Even a moment’s serious reflection would have made it glaringly obvious that these technologies could be exploited. This failure has already cost thousands of women and girls their sense of safety and dignity online. We must take a firm and sustained approach, and fully empower Ofcom, so that this moment of technological progress does not become a major step backward for women’s safety.
The recent horrific stories about Grok and other AI-enabled abuse have highlighted a trend with which far too many women are very familiar. The online world, for all its benefits, has also brought new opportunities for them to experience exactly the same degrading abuse that they are all too familiar with in day-to-day life.
There are three really important lessons for us all to take from the Grok case. First, given the platform's consistent inability to act on its own before being pushed, we need to continue to be proactive in taking on tech firms. Secondly, the fact that we were able to deliver such fantastic action so quickly teaches us that when tools can be delivered robustly and with confidence, change is possible. Thanks to the brave testimony of many women, many of whom knew that in speaking up they would be making themselves targets for abuse, we were able to drive robust action from Ofcom and get X to back down. Thirdly, if we are to continue to keep pace with developing risk factors, we need to find quicker ways to legislate. The Online Safety Act 2023, for all its strengths, took far too long. We have to get comfortable with more principles-based legislation or secondary legislation options to ensure finally that we can do far better to keep women and children safe online.
Across Government, we need to make sure that our own services cannot be used to enable people to perpetrate abuse. I have been working with one woman who, having escaped domestic abuse and relocated to my constituency, now finds herself unable to object to a simple planning application because, in doing so, the local authority requires her to be comfortable publishing her full name and address online. This important democratic right cannot be denied to women fleeing domestic abuse, of all people. I welcome the chance to talk further with the Minister, after this debate, about what wider work we can do across Government to put this right and to ensure that none of our services, online or offline, are enabling women to be anything other than safe in their homes and thriving in their lives.
Amanda Martin (Portsmouth North) (Lab)
It is a pleasure to serve under your chairship, Ms Jardine. I thank my hon. Friend the Member for Preston (Sir Mark Hendrick) for securing this important debate. I speak today as the MP for Portsmouth North, but also as a former teacher, a mum of three young men and a victim of this crime.
In Portsmouth, our residents live much of their lives online for work, study, information, and socialising, but the reality for many users is worrying. In 2023-24, of the cyber-crimes reported in Portsmouth, 34% were online bullying or harassment, 30% malicious messaging, 10% stalking and 8% sexual offences. Disgustingly, that number includes 60 cases involving images of children. Most of the victims were women under the age of 45, and their perpetrators were known to them, yet only 13% reported these crimes, leaving too many to suffer in silence or to be told that their complaint did not meet the threshold.
The digital revolution has brought opportunity, but it has also brought new and relentless forms of abuse. UN Women warns that AI deepfakes, grooming and image-based abuse are escalating. The recent use of AI to generate non-consensual sexual images, as my hon. Friend the Member for Preston noted in opening this debate, shows how quickly technology can be weaponised, and how tech giants are complicit.
Let me be clear: this is not about blaming girls or young women, or boys and young men. It is about responsibility, consent and respect. It must be clear that technology does not remove consent, and anonymity does not remove accountability. Women and girls should never have to navigate fear, shame or harassment just to live their lives.
I welcome the strengthening of the law and the introduction of the long-awaited VAWG strategy. I am proud to be part of a Government who have presented it and will deliver it, but we all know that laws alone are not enough, because technology moves quickly. We need stronger enforcement, safeguarding at the source, and education for young people and their parents about respect and consent online as well as offline.
There must be joined-up Government working, taking young people, their parents and whole communities with us, so that we can change the landscape and the culture, ensure that young women and girls in Portsmouth and across the country are safe online, and give victims the confidence they need to come forward.
Christine Jardine (in the Chair)
Thank you very much; we managed to get everybody in. I call the Liberal Democrat spokesperson, Marie Goldman.
Marie Goldman (Chelmsford) (LD)
It is a pleasure to serve under your chairship, Ms Jardine. I thank the hon. Member for Preston (Sir Mark Hendrick) for securing this very important debate.
All of us here recognise that rapid technological change is creating new risks and contexts for the exploitation of women and girls. The speed, anonymity and ease of communication provided by social media, combined with increasingly sophisticated AI tools, have led to new forms of sexual abuse emerging at alarming rates.
In 2025, the Internet Watch Foundation discovered 3,440 AI videos of child sexual abuse, a vast increase on the previous year, when only 13 such videos were found. That is why this debate, and our swift action, are so important. Far more needs to be done to keep women and girls safe online, and doing so is becoming at once more difficult and more urgent by the day. If we do not act swiftly, we cannot be surprised when new technologies exploit our lack of urgency.
Lola McEvoy
Does the hon. Member have any observations on the fact that technology has always outstripped legislation, and that this is actually about accountability and the enforcement of regulation?
Marie Goldman
Absolutely, and that has always been the case. Equally, we need to learn from the fact that it has always been the case and not be surprised when these things happen. We must not wring our hands and say, “There is harm being done—what could we possibly do about it?” We need to think smarter than that and bring in legislation that is much more forward-thinking and adaptable, and enables swifter action.
As hon. Members have already pointed out in this debate, digital abuse and exploitation are overwhelmingly targeted at women and girls. Research from Internet Matters found that 99% of new deepfakes are of women and girls. Moreover, according to the Revenge Porn Helpline, 98% of intimate images reported to its service were of women and 99% of deepfake intimate image abuse depicts women. It has also been discovered that many AI nudification tools do not actually work on images of boys and men.
We have now reached a point where AI tools embedded in major platforms are capable of producing sexual abuse material, demonstrating serious failings in our current framework. X’s AI tool, Grok, is a case in point. We have talked about this many times before. Grok facilitated the illegal generation and circulation of non-consensual sexual images, yet Ofcom’s response was, I am sorry to say, woefully slow. The executive summary of the violence against women and girls strategy states that it will
“ensure that the UK has one of the most robust responses to perpetrators of VAWG in the world.”
I agree with that intention, but we must recognise that Ofcom’s response was not wholly robust. We must do something about that; we owe it to women and girls in this country to act sooner and stronger. We need more effective legislation and a regulator with the capability and confidence to take appropriate and, crucially, swift action.
I welcome the move to make the creation of non-consensual intimate AI images a priority offence under the Online Safety Act, but that will be effective only if online platforms and services are held accountable under that Act. My Liberal Democrat colleagues and I have called on the National Crime Agency to launch an urgent criminal investigation into X, which should still happen, and to treat the generation of illegal, sexual abuse material with the seriousness it demands. We must act decisively when social media platforms refuse to comply with the law.
It is also time that we introduced age ratings for online platforms and limited harmful social media to over-16s. How can we expect to tackle violence against women and girls when the next generation is being drip-fed misogynistic content on social media?
The hon. Member is right. Does she agree that online pornography remains an issue that needs to be tackled? The statistics show that more than 50% of young boys aged 11 to 13 have already seen porn, and that it is shaping their minds about what consent is.
Marie Goldman
There are so many aspects to this problem. What we, the parents, saw in the fledgling days of social media is not at all what our children are seeing now. We need to recognise that and act against it. What our children see online is already affecting their worldview. Internet Matters research from 2023 found that 42% of children aged nine to 16 had a favourable or neutral view of the well-known misogynistic influencer Andrew Tate, and that older teenage boys were particularly susceptible. That is incredibly worrying. Decisive action to tackle the digital exploitation of women and girls is needed across the board. Online harm is genuine harm, and we must treat it as such. There is a lot of work to do, but I am keen to work cross-party to get it done. I hope the Minister is too.
Sarah Bool (South Northamptonshire) (Con)
It is a pleasure to serve under your chairmanship, Ms Jardine. Digital exploitation does not affect women and girls exclusively, but, given that four in five victims of online grooming are girls, it is an issue that we must focus on. As MPs, we are all aware of the risks and threats that women face in the online sphere. It is no surprise that the National Society for the Prevention of Cruelty to Children found that only 9% of girls feel safe in online spaces. The accounts of stalking given earlier are terrifying, especially those using Ring doorbells, which are designed to keep people safe; that they would be manipulated in that way is horrific. The case of Holly that involved Snapchat in the constituency of the hon. Member for Hexham (Joe Morris) is frankly horrifying.
There is no doubt that the complexity of the online world has resulted in significant digital exploitation. At this very moment, online content is being produced that takes advantage of women for financial gain. That is particularly worrying given that, according to Ofcom’s 2025 report on the time people spend online, women are spending more time than men across an array of websites. The issues around the digital exploitation of women and girls are particularly prominent on social media sites: over half of girls and women report receiving sexist comments about themselves online. This is a problem on an industrial scale.
The recent Grok sexual imagery debacle brought this into sharp focus. It demonstrated the dangers posed to women who had not even engaged with the technology. People merely used an existing image to take advantage of the technology and spread it using the power of social media. I welcome steps to stop it, but are we equipped to handle the changing digital landscape in the future? The Online Safety Act introduced key changes to the Sexual Offences Act 2003 and criminalised sharing intimate images of another person without their consent. The Government are now adding provisions to the Act to make it a criminal offence to create non-consensual intimate images. Do the Government believe that that will be sufficient, and that Ofcom has the necessary powers to stop this abhorrent practice?
What I have seen from the Government so far is a reactive approach to AI and how it relates to women and girls. The technology is undoubtedly here to stay, but given the uncertainty of its development, is the Minister confident that the Government’s approach is sufficiently agile to prevent people from taking advantage of the technology to exploit women and girls?
As we have heard, AI is only one part of the problem: social media is driving much of the digital exploitation of women and girls. Data from 44 forces provided to the NSPCC showed that the police recorded 7,263 “sexual communications with a child” offences in the last year—a number that has almost doubled since the offence came into force in 2017-18. Data from the Crime Survey for England and Wales showed an increase of 6% in child exploitation offences compared with the previous year, and that comes on top of evidence that these platforms are linked to the fact that girls are twice as likely as boys to experience anxiety. Recent data shows that girls who use social media at the age of 11 report greater distrust of other people at the age of 14.
The problem is only growing. Every day that the Government delay is another day that millions of girls are left at risk. We do not need further reviews or consultations; we need a ban on social media for under-16s. It is time to grip this issue.
Lola McEvoy
Will the hon. Lady elaborate on her definitions of “social media” and “ban”?
Sarah Bool
In terms of social media, I mean platforms such as Facebook and Snapchat; I am not talking about WhatsApp, which is a communication platform that many families use, although we have to be careful how it is used, because images can be shared on it.
A ban is about ensuring that children cannot access these platforms. The issue has been raised at different levels. The problem is the content that children can see, and especially the way the algorithms are used. I recognise that the companies also need to take responsibility for what is being accessed and how people are accessing it, because this is going on at a scale larger than any parent could imagine. This is not the social media that we grew up on, where we used to post a little note on a wall for our friend’s birthday or upload photos from a night out—that is definitely not what children are seeing.
Lola McEvoy
My problem with the hon. Lady’s argument is that we have constantly said that our legislation is lagging behind technological advances, but the proposed solution is to name a number of platforms where there is evidence of exploitation, crime and damage. I agree that we need to do that, but is it not better to make evergreen legislation, as some Members have argued, than to have a list of examples that somebody else has come up with?
Sarah Bool
I agree, but we need to take action now on the ones that we are aware of. Our legislation absolutely needs to be much more agile for the future, and I am not saying that a ban will be a silver bullet, but it will protect many girls from digital exploitation. That is why I am asking the Minister to follow the policy set out by the Conservative party, which was accepted in the House of Lords, and prohibit those under the age of 16 from using social media. If we do not put our children into those arenas, they will be far less exposed to the opportunities for exploitation that stem from the internet and target the young and the vulnerable.
If the Government support those measures, they could move fast and take action without delay. Let me be clear: the challenges posed by digital exploitation will not vanish if we prohibit the use of social media, but that would be a bulwark against the dangers that social media poses, particularly to young people. If we allow people to access these platforms when they are more mature and more educated, we can hopefully achieve reductions in exploitation.
As others have said, it is a real pleasure to serve under your chairship, Ms Jardine. I am very grateful to my hon. Friend the Member for Preston (Sir Mark Hendrick) for securing the debate, and to all Members who have spoken.
From the get-go, I want to set out my stall and talk about exactly how I feel about this issue. If someone makes their money through harming women, and if part of their business model is sharing terrible, sexualised, faked images of people like me—well, I am not really allowed to say what I think about that, but I want to make it completely clear that it is totally and utterly unacceptable. Discussions like this are essential, especially as we know that the technology is developing more quickly than we can write legislation.
The hon. Member for South Northamptonshire (Sarah Bool) asked why we cannot ban these things now. I remember the Online Safety Act going through Parliament, and I have to say that it is a triumph of hope over experience to think that I could just say, “Ban it now,” and that by tomorrow it would be banned. If only I wielded such a great ban hammer, I would be banning stuff all over the shop—no one would be listening to their phone out loud on the train any more. But that pace of change is not one that legislation easily keeps up with, and I say to other hon. Members who have spoken that we need to find backstops and ways to make our legislation more agile, so that it can change without having to go through some of the processes we have—I gave 10 years of my life to the Online Safety Act.
Is it not enforcement that is really lacking? Should legislation make enforcement the prime tool?
The hon. Lady makes an incredibly important point. She is absolutely right that we need to make sure these things are enforced. To Members who spoke about pornography, I would say that there are reasons to be cheerful about the enforcement by Ofcom. I could dance a jig because Pornhub has reported a 77% reduction in traffic since age verification stopped young people being able to access it so easily. We are in the foothills of what that legislation can do. Where pornography companies have not been undertaking age verification, Ofcom has issued £1 million fines, and changes have been made to companies’ roles in the UK, so that they meet our laws. So there are reasons to think that there is some enforcement, but I absolutely agree that we need to grapple with the agility, scale and scope of that enforcement.
I must come to the points raised by my hon. Friend the Member for Preston. Before I came to the debate, my colleague the Minister for Victims was telling me how amazing my hon. Friend and his office have been in Preston in handling online abuse. People in our constituency offices often do not get praised for these things, but I hear that my hon. Friend has a legend working in his office.
My hon. Friend talked about the importance of education in this space, and about this being a country-wide push for change, and I could not agree more. The Government have invested in this issue, and it will be an absolutely fundamental part of the violence against women and girls strategy.
The National Centre for Violence Against Women and Girls and Public Protection will do exactly what my hon. Friend talked about, so that the good standards, for example, in Cheshire—it is not far from him, and the anti-stalking practices are amazing and world-leading—are the same for people in the west midlands and everywhere else. My hon. Friend used the example of stalking legislation and making sure there are standardised systems and standards that police forces have to live by, which will absolutely include upskilling in policing the digital elements of these crimes, whether domestic abuse or online stalking. Stalking online is as illegal as stalking in real life—just to be clear, they are the same crime.
My hon. Friend talked about the richest man in the world. I am not sure there are many people in this building who have quite such a claim against the richest man in the world as me. What happened is unacceptable, and anyone who has existed online will know about the Grok outcry.
Some hon. Members mentioned Meta glasses. If I had been in the meeting where they floated the idea of making Meta glasses, the very first thing I would have said would have been, “These are going to be used to abuse women.” Why is that not being baked into the design of such products?
One of the things the violence against women and girls strategy has absolutely committed to is working on safety by design. In the car industry, we now take safety features for granted. If we are talking about what it was like when we were kids versus now, my dad used to put us in the back of the car and purposefully go round the corners fast so that we would smack into the window. These things are not acceptable now.
We have to go on a journey with this technology. To me, a Ring doorbell is such an obvious way to stalk somebody, as is an AirTag. I see cases again and again. It does not matter what the new technologies are; perpetrators of these abuses will find a way to use them for that purpose, so we need to design in safety functions. On the issue raised by my hon. Friend the Member for Hitchin (Alistair Strathern) about planning, I will take that away and work with him.
The Government obviously took a strong stance—I felt pleased about this—against Grok. We can see that when we stand together and people speak up, we can make change in this area, but we need to make sustainable change. We absolutely are always looking at legislative changes. As people have said, there have been a number of those. There is the issue of Grok being added into the Online Safety Act, so that there can be accountability on that basis.
In the Crime and Policing Bill, we are also banning nudification apps. I have also had it shown to me that they do not work on men and boys, which I am glad about for men’s and boys’ sake, but if you are designing something that will nudify only women, you have a problem. I do not know who I can talk to, but there is something wrong with you. Have a word with yourself; otherwise, we will have a word with you. The ban will target firms and individuals providing and supplying tools that use AI to turn images of real people into fake nudes.
There is a raft of other legislation that we are putting through and that we hope will shift the dial. Obviously, in the violence against women and girls strategy, we have made a very clear commitment to ensuring that we make it impossible for children to take and share naked images of themselves—we will make it impossible for them to do that. My hon. Friend the Member for Darlington (Lola McEvoy) and others talked about children being taken from social media and on to other platforms. I have to say that encrypted spaces are the most dangerous for child abuse imagery. But to the hon. Member for South Northamptonshire, who was talking about that, I say this: 91% of all child sexual abuse images are self-made; they are made by children themselves. People have groomed them—exploited them—to make those images. It may be their peers.
We will not stop this just by looking at the issue of new AI. There is an issue with where our children can go and who has access to them. I agree with the hon. Lady’s sentiment. We have to make sure that we get this right. Even with the 10 years of work on the Online Safety Act, and with the level of detail and, I have to say, the arguments that went into it, it still has all the gaps that we are talking about, so we need to make sure we get this right and legislate in a way that can be agile for the future. That is why I think the Government need to take the time—not too much time, I agree—to make sure we do that.
Others talked about accountability and whether anyone ever actually gets punished for these things. As part of the work we are doing in the Home Office, we are expanding the use of covert officers to address violence against women and girls, and improving the capabilities to counter and reduce the highest harms. We operate a similar system with regard to child abuse online. We are now doing that also for women and girls online, recognising the level of organised crime that is behind this. The hon. Member for Bath (Wera Hobhouse) talked about people who are asleep and being filmed, like Gisèle Pelicot. These issues deserve a police force specifically looking at the covert aspect, and that is what this Government are doing.
I call Sir Mark Hendrick to wind up the debate extremely briefly—in about 5 seconds.
Thank you, Ms Jardine. I thank the Minister and all my colleagues for an excellent debate. I think we had a brilliant response from the Minister. I am sure she is determined to ensure that the digital space can be as safe as possible for women and children, and for everybody else as well.
Question put and agreed to.
Resolved,
That this House has considered the matter of tackling the digital exploitation of women and girls.