Tuesday 24th February 2026

Commons Chamber
15:37
Munira Wilson (Twickenham) (LD)

I beg to move,

That this House makes provision as set out in this Order:

(1) On Monday 9 March 2026:

(a) Standing Order No. 14(1) (which provides that government business shall have precedence at every sitting save as provided in that Order) shall not apply;

(b) any proceedings governed by this Order may be proceeded with until any hour, though opposed, and shall not be interrupted;

(c) the Speaker may not propose the Question on the previous question, and may not put any question under Standing Order No. 36 (Closure of debate) or Standing Order No. 163 (Motion to sit in private);

(d) at 6.00pm, the Speaker shall interrupt any business prior to the business governed by this Order and call the leader of the second largest opposition party or another Member on their behalf to move the order of the day that the Online Services (Age Restrictions) Bill be now read a second time;

(e) in respect of that Bill, notices of Amendments, new Clauses and new Schedules to be moved in Committee may be accepted by the Clerks at the Table before the Bill has been read a second time;

(f) any proceedings interrupted or superseded by this Order may be resumed or (as the case may be) entered upon and proceeded with after the moment of interruption.

(2) The provisions of paragraphs (3) to (19) of this Order shall apply to and in connection with the proceedings on the Online Services (Age Restrictions) Bill in the present Session of Parliament.

Timetable for the Bill on Monday 9 March 2026

(3) (a) Proceedings on Second Reading and in Committee of the whole House, any proceedings on Consideration and proceedings up to and including Third Reading shall be taken at the sitting on Monday 9 March 2026 in accordance with this Order.

(b) Proceedings on Second Reading shall be brought to a conclusion (so far as not previously concluded) at 8.00pm.

(c) Proceedings in Committee of the whole House, any proceedings on Consideration and proceedings up to and including Third Reading shall be brought to a conclusion (so far as not previously concluded) at 10.00pm.

Timing of proceedings and Questions to be put on Monday 9 March 2026

(4) When the Bill has been read a second time:

(a) it shall, notwithstanding Standing Order No. 63 (Committal of bills not subject to a programme Order), stand committed to a Committee of the whole House without any Question being put;

(b) the Speaker shall leave the Chair whether or not notice of an Instruction has been given.

(5) (a) On the conclusion of proceedings in Committee of the whole House, the Chairman shall report the Bill to the House without putting any Question.

(b) If the Bill is reported with amendments, the House shall proceed to consider the Bill as amended without any Question being put.

(6) For the purpose of bringing any proceedings to a conclusion in accordance with paragraph (3), the Chairman or Speaker shall forthwith put the following Questions in the same order as they would fall to be put if this Order did not apply—

(a) any Question already proposed from the Chair;

(b) any Question necessary to bring to a decision a Question so proposed;

(c) the Question on any amendment, new clause or new schedule selected by the Chairman or Speaker for separate decision;

(d) the Question on any amendment moved or Motion made by a designated Member;

(e) any other Question necessary for the disposal of the business to be concluded; and shall not put any other Questions, other than the Question on any motion described in paragraph (15) of this Order.

(7) On a Motion made for a new Clause or a new Schedule, the Chairman or Speaker shall put only the Question that the Clause or Schedule be added to the Bill.

Consideration of Lords Amendments and Messages on a subsequent day

(8) If on any future sitting day any Message on the Bill (other than a Message that the House of Lords agrees with the Bill without amendment or agrees with any Message from this House) is expected from the House of Lords, this House shall not adjourn until that Message has been received and any proceedings under paragraph (9) have been concluded.

(9) On any day on which such a Message is received, if a designated Member indicates to the Speaker an intention to proceed to consider that Message—

(a) notwithstanding Standing Order No. 14(1) any Lords Amendments to the Bill or any further Message from the Lords on the Bill may be considered forthwith without any Question being put; and any proceedings interrupted for that purpose shall be suspended accordingly;

(b) proceedings on consideration of Lords Amendments or on any further Message from the Lords shall (so far as not previously concluded) be brought to a conclusion one hour after their commencement; and any proceedings suspended under subparagraph (a) shall thereupon be resumed;

(c) the Speaker may not propose the Question on the previous question, and may not put any question under Standing Order No. 36 (Closure of debate) or Standing Order No. 163 (Motion to sit in private) in the course of those proceedings.

(10) Paragraphs (2) to (7) of Standing Order No. 83F (Programme Orders: conclusion of proceedings on consideration of Lords amendments) apply for the purposes of bringing any proceedings on consideration of Lords Amendments to a conclusion as if:

(a) any reference to a Minister of the Crown were a reference to a designated Member;

(b) after paragraph (4)(a) there is inserted—

“(aa) the question on any amendment or motion selected by the Speaker for separate decision;”.

(11) Paragraphs (2) to (5) of Standing Order No. 83G (Programme Orders: conclusion of proceedings on further messages from the Lords) apply for the purposes of bringing any proceedings on consideration of a Lords Message to a conclusion as if any reference to a Minister of the Crown were a reference to a designated Member.

Reasons Committee

(12) Paragraphs (2) to (6) of Standing Order No. 83H (Programme Orders: reasons committee) apply in relation to any committee to be appointed to draw up reasons after proceedings have been brought to a conclusion in accordance with this Order as if any reference to a Minister of the Crown were a reference to a designated Member.

Miscellaneous

(13) Standing Order No. 82 (Business Committee) shall not apply in relation to any proceedings on the Bill to which this Order applies.

(14) (a) No Motion shall be made, except by a designated Member, to alter the order in which any proceedings on the Bill are taken, to recommit the Bill or to vary or supplement the provisions of this Order.

(b) No notice shall be required of such a Motion.

(c) Such a Motion may be considered forthwith without any Question being put; and any proceedings interrupted for that purpose shall be suspended accordingly.

(d) The Question on such a Motion shall be put forthwith; and any proceedings suspended under sub-paragraph (c) shall thereupon be resumed.

(e) Standing Order No. 15(1) (Exempted business) shall apply to proceedings on such a Motion.

(15) (a) No dilatory Motion shall be made in relation to proceedings on the Bill to which this Order applies except by a designated Member.

(b) The Question on any such Motion shall be put forthwith.

(16) Proceedings to which this Order applies shall not be interrupted under any Standing Order relating to the sittings of the House.

(17) No private business may be considered at any sitting to which the provisions of this Order apply.

(18) (a) The start of any debate under Standing Order No. 24 (Emergency debates) to be held on a day on which proceedings to which this Order applies are to take place shall be postponed until the conclusion of any proceedings to which this Order applies.

(b) Standing Order No. 15(1) (Exempted business) shall apply in respect of any such debate.

(19) In this Order, “a designated Member” means—

(a) the leader of the second largest opposition party; and

(b) any other Member acting on behalf of the leader of the second largest opposition party.

This afternoon is an opportunity for the House to come together to take urgent and meaningful action and to legislate within weeks—not months or years, but weeks—to keep our children and young people safe online, whether that is protection from harmful social media, artificial intelligence chatbots or addictive gaming. It is clear that we are at a tipping point, with widespread public and cross-party support for decisive action.

Every parent across this country knows the threat that social media poses to our children—to their mental health, to their physical health, to their sleep and to their concentration. They have written in their thousands to every single MP in this House—I want to take this opportunity to thank the 1,500 or so parents and carers in my Twickenham constituency who have written to me—and they are begging for a change in the law, so that they can better protect their children. They are not abdicating parental responsibility, as some people would like to suggest; they are pleading with the Government for help in providing the tools and safeguards that they need when faced with the might and the business models of enormous tech companies profiteering from our children’s attention.

For me, this is personal. My husband and I fight a daily battle at home with our children, aged 11 and seven, on screen time and what platforms and games they can access. Peer pressure is overwhelming for children—especially those just starting out on their secondary school journey, as my daughter recently has—who are desperate for belonging and connection. Parents are torn between wanting to ensure that our children are not left out of online spaces, which all too often we ourselves struggle to understand, and wanting to protect our children.

I believe that it is time we sent this message, loud and clear, to Musk, Zuckerberg and the other tech giants: “If your platform spreads harmful content or relies on addictive and harmful algorithms, you should not be allowed anywhere near our children.” That is why the Liberal Democrats have today introduced a Bill that would provide a range of protections for children from online harms, including the restriction of access to harmful social media.

Before I describe what we would ideally want to include in the Bill, let me emphasise that if the House were to support the motion, we would seek to work on a cross-party basis to introduce workable and effective legislation quickly, given that there is support for action across the House. This is not about one party winning or owning the issue; it is about us—as politicians, policymakers and parents—coming together to protect our children, their safety and their wellbeing.

Jim Shannon (Strangford) (DUP)

I thank and commend the hon. Lady for initiating the debate, and for her devotion to this subject. Does she agree that we should consider education and the role of school principals? In Northern Ireland the Education Minister, Paul Givan, has introduced a pilot scheme on phone-free schools, and I have held an event in my constituency to discuss that very issue. The aim is to prevent children from being harassed while at school, and from understanding things that they should not be understanding or doing. Does the hon. Lady agree that phone-free schools to help our children should be part of the policy and part of what the Liberal Democrats are trying to do?

Munira Wilson

It is always a pleasure to give way to the hon. Gentleman, who is the first to intervene in the debate, and I entirely agree with him. I will touch on the point about phones in schools later, and I believe that we will have a chance to vote on that specific measure shortly, when the Children’s Wellbeing and Schools Bill returns to this House.

As I have said, we want to approach this legislation in a cross-party way, but let me now turn to what the Liberal Democrats would ideally like to see in it.

Anna Dixon (Shipley) (Lab)

The hon. Lady is making her case very personally and passionately, describing the harms to young people’s mental health that result from the predatory algorithms that the tech giants have devised to create addictive content for children. I, too, think that there is cross-party agreement on the need to look very carefully at how we protect children. Today I was on a call with members of the campaign group “36 Months”, discussing how they are approaching the issue in Australia. Does she agree, however, that the right approach is to have a full public consultation—as has been proposed—so that parents, schools and the rest of us can get this right, learning from evidence and learning from places such as Australia in order to protect our kids?

Munira Wilson

I hope the hon. Lady will not mind if I call her my hon. Friend, although we are on opposite sides of the House. I thank her for her intervention, and I take her point, which I have also heard the Government express. I agree that we need to consult, but I think we should be consulting on how we implement some of these proposals, not on whether we do or what we do, because there is clearly a general consensus. When we look at the findings of every opinion poll—certainly when it comes to such measures as banning social media for under-16s—we see overwhelming public support. There is also cross-party support in this House and, as we have seen recently, in the other place. For me, if there is a consultation, it should be about how those things are implemented and not whether we do that or which ones we implement. However, I will touch on the Government’s approach towards the end of my speech.

We Liberal Democrats would introduce a film-style classification system, with social media rated at 16 as a default, and give Ofcom the powers to back up such a framework. That echoes the film and video classification system established in the 1980s, adapting a trusted framework for the digital age. Companies would be required to age-gate their platforms based on the harmfulness of their content, the addictiveness of their design and the impact that that can have on a child’s mental health. The onus would be on social media companies to stop children getting on to their platforms and to take steps to make their apps safer in the meantime.

Gareth Snell (Stoke-on-Trent Central) (Lab/Co-op)

I am sympathetic to all the hon. Lady’s arguments. However, it appears that we are about to have a Second Reading debate on an as yet unpublished Bill, when the motion on the Order Paper is about whether we have a day for that Second Reading debate. I am conscious, because I have been to the Vote Office, that the Bill is not available yet. What are we debating this afternoon? If we were to vote with the hon. Lady this evening, what Bill would we be asked to look at on that future day?

Munira Wilson

It is simple. As I have said, I want us to come together in a cross-party consensus on what should be in that Bill. I have heard what the Conservatives have had to say, I am about to set out what the Liberal Democrats have to say and I am keen to hear what Ministers have to say on what should be in the Bill. We do not have a Bill yet because we think there is an opportunity to work together on this issue.

There have been suggestions that there is party politicking on this issue. I do not think it is a party political issue; I think we all agree that children’s safety and wellbeing is a cross-party priority. The idea is that we agree to move forward, come together and work cross-party on a Bill which, hopefully, we can get through Parliament very quickly and on to the statute book to start protecting our children as soon as possible.

Gareth Snell

I do not wish to make this a procedural debate, but the hon. Lady presented a Bill earlier for First Reading. We have been asked to consider that Bill for Second Reading on a future date. That Bill is not available. Although I absolutely respect her approach for a cross-party consensus to design the Bill, as I understand it the Bill is already written and we are being asked to give over Government time for that to be debated on Second Reading. Again, what are we debating today if the Bill is not available but has been written and we will not have a Second Reading debate until sometime in March?

Munira Wilson

We are debating today the principle of bringing forward legislation quickly. I know that the Government are saying that they want to bring forward legislation sometime in the future. We do not know when that is. I am trying to put a timeframe on it, because we know that what will come back shortly from the other place in the Children’s Wellbeing and Schools Bill will not be accepted by this House. That is why I am trying to find an opportunity for us all to come together and get to a point that we all agree on. This is about agreeing the principle that we should have primary legislation sooner rather than later. I am happy to make time in my diary tomorrow to start those discussions.

Andrew Cooper (Mid Cheshire) (Lab)

The hon. Lady makes a passionate case and spoke movingly about the debate in her own family and how to protect her children. I recognise that. I have two young children and I think carefully about what they look at online. I worry about when they get older and how we will deal with that.

The hon. Lady talks about age-gating as the principle on which she wants to work, but I am concerned. We know that the algorithms are addictive and that they reinforce people’s worst prejudices. What evidence is there that that stops at 16? Is she not concerned that simply focusing on age-gating will diminish the pressure on social media companies to open up the algorithms so that we can have a look at how they affect society more generally?

Munira Wilson

I have not said that we should focus only on age-gating; as I continue through my speech, the hon. Gentleman will hear about the range of other things that I think should be in any legislation that is brought forward—quickly—to protect our children. The age-gating of certain platforms based on their harmfulness, which would be a key principle and part of the legislation, is part of our proposals, but so are various other things that I will talk about in terms of tackling the addictiveness of algorithms that is so damaging to our children.

Lola McEvoy (Darlington) (Lab)

I wanted to ask the hon. Lady about the register and the ranking of age-appropriateness for content. We have sat opposite each other on many occasions discussing this matter. I have grave concerns about who will register those individual self-published bits of content and who will manage it and pay for it, and how it would actually work in practice.

Munira Wilson

I was about to expand further on that before I took the two preceding interventions. Perhaps the hon. Lady will allow me to continue and, if I have not addressed her concerns, she can intervene on me again.

Ofcom would be given the powers to force platforms that do not want to play ball to do so or to face serious consequences. We believe that that would mean a ban on harmful social media for under-16s. Family friendly services such as Wikipedia or Tripadvisor would be available at a lower age, as those sites fall under the current user-to-user definition in the Online Safety Act. We know, however, that even 16 could be too young to access the most harmful of sites—those that host violence and pornography—which is why our proposals would allow what we think are really harmful platforms, such as X, to be age-gated up to 18.

A harms-based approach, like the one we are proposing today, is supported by 42 charities including the likes of the National Society for the Prevention of Cruelty to Children, the Molly Rose Foundation and others, and would protect children from the worst of the web without breaking the parts of the internet that families actually rely on. Crucially, it is future-proofed and could be applied to chatbots, games and other emerging technologies.

I welcome the fact that the Conservatives’ Opposition day motion a few weeks ago, which we were unable to debate, moved towards the Liberal Democrats’ nuanced approach to keeping under-16s away from “harmful” social media. I hope that the Conservatives will be able to support our motion today and this approach going forward, despite the fact that they were unfortunately unable to do so in the other place just a month ago.

Paul Holmes (Hamble Valley) (Con)

The hon. Lady is right that we did table an Opposition day motion in Conservative time on this subject, but the difference between our motion and the Liberal Democrat motion is that ours contained proposals. This afternoon, she is asking us not to debate a motion on the topic in the title of the Bill, but merely to give the Liberal Democrats control of the Order Paper on 9 March. Why did she not choose to bring forward a Bill, allow the House to look at her proposals and have a solid, principled debate on it before she asked us to give her control of the Order Paper on 9 March?

Munira Wilson

While the House would be giving me, or the Liberal Democrats, control of the Order Paper, I have made it abundantly clear that we would work together to bring forward legislation—[Interruption.] The Conservatives have proposals; the Government are consulting on something, although I am not quite sure what, because they have not published the consultation yet. We put forward proposals in the other place that the hon. Gentleman’s party unfortunately chose not to support. However, I do not think we are that far apart.

We have published proposals in the other place and would use those as a basis for discussion. The Technology Secretary has already told me and my hon. Friend the Member for Harpenden and Berkhamsted (Victoria Collins) that she would happily work with us on our proposals. There are proposals out there in the public domain. This is about the principle of legislating soon and quickly to bring forward legislation that we can all agree on to protect our children.

Paul Holmes

I am grateful to the hon. Lady for giving way again, because I have to push this point. She has outlined that her party has published proposals in the other place, but her party is called the Liberal Democrats—this is the democratically elected Chamber, and we should be debating a proposed Bill from the Liberal Democrats on their Opposition day. I agree with her that we need urgent legislation. Why is she depriving Members across the House of detailed proposals that we could vote on and instead asking the House through a procedural motion to give her party control of the Order Paper on 9 March?

Munira Wilson

All I can do is repeat myself: I know that if I had published all these things that I am laying out as a piece of legislation, Members on both sides of the House would probably have voted it down. I have told the House that I am happy to come forward in the spirit of co-operation to draft something together—

Several hon. Members rose—

Munira Wilson

I am going to try to move on now—[Interruption.] I am going to make some progress, because I think we have now tested the procedural approach to death.

It is important that we reach consensus on our approach and reject the unworkable blanket bans that have been proposed elsewhere that put enormous powers in the hands of an individual politician. I do not think any Reform Members are here in the Chamber, but given that Reform wants to scrap the Online Safety Act altogether, I shudder to think what future Ministers might deem acceptable if they were allowed to choose what our children and young people could access, which the amendment to the Children’s Wellbeing and Schools Bill coming from the other place would allow the Secretary of State to do.

Layla Moran (Oxford West and Abingdon) (LD)

In among the discussions around procedure, which are important in this place, I fear that we are missing the nub of what my hon. Friend is trying to get to, which is that this is a nuanced space. This is not a blanket “we say no to everything”. Some people are arguing that we should do nothing, and that it should just be down to parents to deal with it. Does she agree that the thoughtful way that she is putting this across, trying to get us all to come together around this issue with the public, is how we will create something that is future-proof? So much of legislation in this area involves chasing our tails, but this is an opportunity for us to get ahead of it, for once.

Munira Wilson

That is indeed what we are trying to do. Putting forward a blanket ban on a particular list of social media sites determined by any Secretary of State at any given point in time is necessarily acting after the fact. That is not future-proof or particularly effective, and it is subject to politicisation. That is why our harms-based approach, which I want to negotiate to get into legislation soon, would be future-proof and work to act on things such as chatbots and games. I know from the discussions we have at home how addictive games such as Roblox can be, for instance.

We Liberal Democrats have long been pressing for a suite of measures that would make the online world safer and healthier for all. One measure that could be implemented overnight would be to ban tech companies from profiting from our children’s attention by raising the age of digital data consent from 13 to 16. This would end the hold that addictive algorithms have on children.

Gareth Snell

On a point of order, Madam Deputy Speaker. The hon. Lady is making excellent points on the substance, but they bear no resemblance to the motion on the Order Paper. Are you able to give me guidance on what is up for debate this afternoon? Can the hon. Lady point me to where what she is debating sits in the motion?

Madam Deputy Speaker (Caroline Nokes)

What is clear is that there is a motion on the Order Paper on which Members will presumably be asked to divide in due course. That does not give any detail of the proposed Bill, but the motion on the Order Paper is orderly and it will be up to Members to decide how they wish to vote on that.

Paul Holmes

Further to that point of order, Madam Deputy Speaker. I am grateful for your indulgence, and I suspect that I will get the same answer as the hon. Member for Stoke-on-Trent Central (Gareth Snell), but I have never, in my seven years in this House, been in a situation where a motion outlines the timetable for Monday 9 March—including the timings of proceedings and questions to be put on Monday 9 March and of consideration of Lords amendments and messages on a subsequent day—for a Bill that this House has not seen. How can Members vote for a motion that allocates separate procedures for a Bill that has not been published? I want my constituents to know what the Liberal Democrats are proposing in this space. The hon. Lady is now elaborating on the Floor of the House on what she wants her policies to be, but she is asking us to vote for a Bill that has not been put before this House. Can I therefore have your advice, Madam Deputy Speaker, on whether this debate should be going ahead if the House does not have a substantive Bill relating to this procedural motion?

Madam Deputy Speaker

I thank the hon. Member for his point of order. The motion on the Order Paper is perfectly orderly, so Members will be invited to vote on that, not on the substance of any Bill that might come on 9 March. I think it is important that the House is clear on that.

Dame Chi Onwurah (Newcastle upon Tyne Central and West) (Lab)

Further to that point of order, Madam Deputy Speaker. How can I assess what is orderly for my contribution to the debate given that the substance of the motion is about process? To be frank, I do not want to speak about process; I want to speak about protections for children.

Madam Deputy Speaker

The motion is to give consideration to a Bill on the specific matter which has been outlined clearly on the Order Paper: “Protections for children from online harms”. I reassure the hon. Lady that any contribution she chooses to make on that matter would be in order.

Munira Wilson

What I am setting out is what I would want to put forward as suggestions for the Bill. As you have helpfully pointed out, Madam Deputy Speaker, we will be dividing on whether there should be a Bill very soon on the broad subject of protecting children from online harms.

The other measure I would want to bring forward in any legislation is a doomscrolling cap, which would end the infinite scroll feature on short-form online platforms for young people, limiting to two hours the time for which children are pushed TikTok-style video content. I would also want to see health alerts on social media platforms for under-18s. Just like cigarettes and alcohol, these addictive products carry well-documented risks, especially for young people. The evidence is clear that excessive use of these apps exposes children to mental health issues, anxiety and sleep disruption, and causes real harm to attention spans. Do they not deserve to know that? When we pick up a packet of cigarettes, we expect to be told about the harm that product will pose to our health, so why is social media—a key driver of the crisis in our young people’s mental health—any different?

Given that young people themselves say they want a break from the stress of social media at school, and given the impact of phones on children’s concentration and focus, will the Education Secretary finally listen to her own Children’s Minister and put the Government’s guidance on mobile phones in schools into law to give teachers and headteachers the back-up and, crucially, the resources they need to restrict their use? That is also something that could be part of this Bill if the Government refuse to accept the amendment that will be coming from the other place to the Children’s Wellbeing and Schools Bill.

I recognise that the Secretary of State for Science, Innovation and Technology has announced a consultation on children’s online safety and that she will be tabling an amendment to the Children’s Wellbeing and Schools Bill to enable further legislation to come forward on something at some point in the future—all as yet to be determined. Frankly, the Government are kicking the can down the road.

Baroness Kidron in the other place, who is an expert and campaigner on children’s safety online, said the Government’s consultation

“does not concern itself with the gaps in provision or enforcement of the Online Safety Act, nor the emerging or future threats that we repeatedly raise. It does not seek to speed up enforcement or establish why non-compliant companies are not named in Ofcom research or while they are being investigated. The consultation is entirely focused on two amendments that this House might send to the other House, which its Back-Benchers might agree to. The consultation’s purpose is to stave off a Back-Bench rebellion. It is not about child safety or governance; it is about party management. The UK’s children deserve better than that.”—[Official Report, House of Lords, 21 January 2026; Vol. 852, c. 318.]

Those are not my words; they are the words of the esteemed Cross-Bench peer Baroness Kidron in the other place.

Dr Ellie Chowns (North Herefordshire) (Green)

I had understood that the hon. Member’s party was keen on public consultation, and there is clearly a lot of public concern about the very real problem of online harms and the need to protect children. I am therefore puzzled by the fact that she is seeking to control the parliamentary agenda in just a couple of weeks’ time with rushed-through legislation and without any substantive proposals or consultation. If she is concerned about the scope of the consultation the Government have announced, why not try to amend that scope? Why not emphasise the importance of parents in particular having their voices heard rather than rushing through legislation that will probably be quite flawed if there is not sufficient time to ensure that everybody’s voices are heard in this conversation? It feels like politicking, to be honest, rather than a substantive engagement with the details of the issue.

Munira Wilson

I am sorry the hon. Member feels that way. We have brought forward a lot of these proposals previously. It is not politicking; we have long been committed to this issue. A number of these things could be done tomorrow. They do not need to be consulted on. The age of digital data consent could be raised tomorrow without any further consultation. There was flexibility in European law on the age at which it was set, and the UK chose 13. A number of other countries have recently raised the age. Unfortunately, an amendment to the Data (Use and Access) Bill to do just that was rejected. The bit that probably needs consultation is how any ban or restriction on harmful social media would work, but we could legislate for the principle and consult on the operational detail. I do not think that is a problem.

On the hon. Member’s point about making sure that the voices of parents and young people are heard, I think they have been heard loud and clear up and down the country. They have been pushing and pushing for this. They are concerned that the consultation will just delay action further. Parents, teachers and young people are crying out for urgent action now. We need a smart approach that allows young people to benefit from the best of the internet—whether that is learning or staying connected to their friends and family online—while properly tackling the harms it can cause.

Paul Holmes

Will the hon. Member give way briefly on that point?

Munira Wilson

I have given way to the hon. Member a couple of times. I am just about to finish.

Kirsty Blackman (Aberdeen North) (SNP)

I am confused about what the Liberal Democrats’ proposals are. The proposals laid out by the hon. Lady are not those introduced in the House of Lords. In the House of Lords, only user-to-user services were talked about, not addictive online gaming, for example. Are we discussing a Bill containing the proposals laid out in the House of Lords, or is the hon. Member putting forward new, ethereal proposals? I do not understand what this Bill is going to be. I was expecting to actually see it so that we could discuss it today.

Munira Wilson

The harms-based framework that we proposed in the other place would apply to chatbots and gaming as well. The point is that, as I have already laid out, we would come together and come forward with proposals that we can all agree on.

Paul Holmes

Will the hon. Member give way? I want to help her out.

Munira Wilson

I am very grateful that the hon. Member wishes to help me out, but I suspect that he does not have my best interests at heart. [Hon. Members: “Aw!”] Oh, go on; I am happy to take his intervention.

Paul Holmes

The hon. Lady is being characteristically courteous in giving way, and I always have her best interests at heart. She is right to say that people are keen to be heard loud and clear, and she is rightly setting out her position about legislation she wants to see before the House. However, if she thinks that people have been heard loud and clear, can she tell the House whether the things she has outlined today are in a drafted Bill, sitting in a safe somewhere within Liberal Democrat HQ, and why she chose not to publish that this afternoon so that we could have a principled debate on her policy proposals?

Munira Wilson

No, I will not give way; I would not expect the hon. Member to help me out.

At various points, we have tabled all the things I have mentioned as amendments in both Houses, so they have been drafted—although I am happy to admit that they have not been put together in one Bill for me to present today. I apologise on that procedural point, Madam Deputy Speaker, which I can see has upset many Members, but all the proposals that I have outlined have been tabled in both Houses as amendments to various Bills, including the Children’s Wellbeing and Schools Bill and the Data (Use and Access) Bill.

To reiterate, the only consultation that we should focus on now ought to relate to how the restrictions might work in practice, not whether they are needed at all—the public and campaign groups have made their views on that pretty clear already, whether they support or oppose a blanket ban. Although I have been criticised for coming forward without a Bill, the whole point was to say, “Let’s work together,” because I think there is cross-party consensus on this matter.

Bobby Dean (Carshalton and Wallington) (LD)

There seems to be great confusion in the Chamber, even though the Liberal Democrats have time and again set out our proposals quite clearly in different places. I find it fascinating that the official Opposition accuse us of politicking when they probably agree with the substance of our proposals. They are contorting themselves to find a way not to support the motion, which is about urgency and acting more swiftly than the Government propose to do—I, for one, think that is a good thing.

Munira Wilson

Discussing the substance of the issue is exactly what we are seeking to do. It has been a long time since this Chamber has had a proper debate on these issues. In a few weeks’ time, we will discuss amendments that suggest individual parties’ views on the way forward. We are proposing a discussion on what the proposals should be so that we can return with a piece of legislation that meets the needs and requirements of the public—our children and young people, and their parents and carers.

We Liberal Democrats say to Ministers and the official Opposition that we have a set of solutions, and we will work with them in the best interests of children. We need to act now, so they should vote with us today and make time for this Bill on the legislative agenda. If the Government do not want to make time for our Bill, perhaps they will make time for one of their own, but we need one quickly. We stand ready to work across parties to create the safer future that our children deserve—

Anna Dixon

Will the hon. Member give way?

Munira Wilson

I was just about to end my remarks, but I will give way.

Anna Dixon

I thank the hon. Member—I will call her my friend, as she gave way to me on the last sentence of her speech. She has made a powerful case for cross-party working, hearing different perspectives and bringing forward change quickly, but that is the point of consultation: to find out how we should do something, get the views of parents, schools and everyone else, and come out with something that will be effective in the long term. The Online Safety Act 2023 took far too many years, while this proposal bounces us into something when we are not even sure of what we are voting for. I say with huge respect for her that we should use the consultation process.

Munira Wilson

I say gently that although the Prime Minister has promised us legislation at some point following consultation, it would be secondary legislation, which gets far less scrutiny than primary legislation, and I am afraid that his track record for U-turning on commitments is not great—let’s face it. I have tried to be as consensual as possible and not make political jibes, but we have had 14 U-turns. He said just a few months ago that he did not want to bring in any sort of ban on harmful social media for under-16s because of the experience of his teenagers, but he made a speech last week in which he said that, because of his teenagers, he did want to do so. I am not sure which version of his comments to believe. I would like to press this issue so that the Government introduce legislation sooner rather than later. I think it needs to be primary legislation so that we can discuss it, debate it, amend it and look at it thoughtfully, and we need a clear and strong timetable for it.

Joy Morrissey (Beaconsfield) (Con)

Will the hon. Lady give way?

Munira Wilson

I will take one final intervention.

Joy Morrissey

We all want to find a solution to online harms—we would not be in this debate if we did not care about protecting children—but the way to do that is through a long consultation period outside the Chamber before we come forward with a Bill. Procedurally, this is not the way that we debate Bills, assess their merit or take them through the stages of becoming law. If the Liberal Democrats want to take this seriously, they should use the correct procedure for taking forward a Bill. We should all be able to debate it. There should be a long consultation process, and then we can take it forward together.

Munira Wilson

I thank the hon. Lady for her intervention about procedure. I say to her gently that her party and others have in the past used this mechanism to try to force Governments to introduce legislation on various issues.

Joy Morrissey

Will the hon. Lady give way?

Munira Wilson

I will not give way again, because I think we have tested this argument to death. I understand the hon. Lady’s concerns about procedure, but this mechanism is not unheard of. The Labour party did something similar on fracking a few years ago when Liz Truss was Prime Minister. I remember it well—I was in this place, as was the hon. Lady—and we voted on that. There was no substance, but Labour wanted to bring forward legislation.

Wera Hobhouse (Bath) (LD)

Will my hon. Friend give way?

Munira Wilson

I am about to come to an end.

Dr Ben Spencer (Runnymede and Weybridge) (Con)

On a point of order, Madam Deputy Speaker. This is a procedural question. Given that the long title of the Bill is not in the motion, does that mean that the Bill can effectively cover any subject or theme if the Order Paper is seized on that day?

Madam Deputy Speaker (Caroline Nokes)

I thank the hon. Gentleman for that point of order, which I anticipated might come at some point. If he checks the Order Paper, he will see that paragraph (1)(d) says very specifically that it has to be a Bill on online services age restrictions that is brought forward on 9 March.

Dr Spencer

Further to that point of order, Madam Deputy Speaker. Thank you for that clarification, but my understanding is that that is the short title, not the long title. Is it the case that the long title can be used to tag in any related subjects to expand the scope from the narrow one here?

Madam Deputy Speaker

I thank the hon. Gentleman for his further point of order. Clarification on that point had best be sought from the Public Bill Office. It is my understanding that any Bill brought forward will have to cover online services age restriction, but I appreciate the distinction that he makes between the long and the short titles.

Munira Wilson

I am perplexed, because I think there is support on both sides of the House for restricting online harms and protecting our children, and for the principle of bringing forward legislation, although I understand that people are vexed about the procedural point. I fear that there has been some contorting to find a way to justify voting against this motion. I am sorry that that is the case, because we Liberal Democrats are ready to work in a cross-party manner to create the safer future that our children deserve so that they can flourish and thrive in the online and offline worlds. I hope colleagues across the Chamber will support us.

Paul Holmes (Hamble Valley) (Con)

On a point of order, Madam Deputy Speaker. Members might be jaded by my making this point of order, but I am grateful to you for allowing me to do so; as a democrat, I like this Chamber to work properly. Will you clarify the procedural basis of the request by the hon. Member for Twickenham (Munira Wilson) for the Government to make time for the Bill? I ask this because if the motion is accepted, the Government will not be able to pick a time for the legislation; instead the Liberal Democrats would take over the Order Paper and force the Government to accept their legislation on 9 March, with the procedures that are outlined.

May I also ask your guidance, Madam Deputy Speaker, on the motion? It would make a number of amendments to the Order Paper on that day, including that

“No dilatory Motion shall be made in relation to proceedings on the Bill to which this Order applies...

The Question on any such Motion shall be put forthwith.”,

and that only a “designated Member” would be able to make any decision about the order in which a Bill was to be taken. In paragraph (19), that designated Member is

“(a) the leader of the second largest opposition party; and

(b) any other Member acting on behalf of the leader of the second largest opposition party.”

Despite the protestations of the Liberal Democrats that they want this to be a cross-party approach, this is them taking over the Order Paper and giving their leader carte blanche to table what they like on 9 March. It does not give the Government the opportunity to table legislation on a cross-party basis at a timing of their choosing—it has to happen under the jurisdiction of the Liberal Democrat motion, does it not?

Madam Deputy Speaker

I thank the hon. Member for his very long point of order—[Interruption.] Yes, he has made the point that he is trying to be helpful. To clarify, first, it is the House’s time, not Government time, but the powers set out in the motion are as he has outlined them. May I further highlight that it is not without precedent to hold a debate on a motion taking over the Order Paper on a Bill, without the Bill having been published? It last occurred on 6 February 2024 when an Opposition motion was tabled to take over the Order Paper to discuss ministerial severance reform, and that Bill had not yet been published. So it is not without precedent, but the hon. Member is correct in his understanding of what the motion would do were it to be passed by the House.

I call the Minister.

16:21
The Parliamentary Under-Secretary of State for Science, Innovation and Technology (Kanishka Narayan)

It is a pleasure to respond to this debate, not least to further my education in my personal passion area of parliamentary procedure.

Let me begin by responding to the motion, and then I will turn to the substance of the debate. The hon. Member for Twickenham (Munira Wilson) will accept that no Government could accept a motion such as that proposed by the Liberal Democrats. The motion goes against the Standing Orders of the House, which state that the Government as elected by the people control the Order Paper, apart from specific exemptions such as Opposition days. The motion would give the Liberal Democrats free rein to schedule the business on 9 March. Today they introduced a Bill. It is still not available to Members across the House, yet they are asking the House to hand them control of business to complete all stages of the Bill within a day. That is no way to make complex changes to the law in this area.

This is not just a procedural outrage; more than that I am sorry to see the Liberal Democrats join the Conservative party yet again in their usual coalition of putting political desperation on this question ahead of the interests of British children and families. I urge the Liberal Democrats to forget this approach, and to take part in the Government’s consultation, which is a true attempt at engaging across parties and across the country, so that we find the right solution for children and parents. This Government have already set out a way forward that considers those vital issues in a responsible way, and allows for swift action in response. That is how we will give children the childhood that they deserve and prepare them for the future.

Wera Hobhouse

I do not know where the Minister has been, but my inbox has been inundated by families and parents who are calling for action. We are responding to the request of our constituents to take action. Do the Government not see the urgency with which we need to take action?

Kanishka Narayan

The Government are seeing both urgency and responsibility in the correspondence that we are receiving and the consultation we are engaging with, not the desperate lurch to a specific answer that the Liberal Democrats are exemplifying in this instance. I want to take this opportunity to set out our approach.

Victoria Collins (Harpenden and Berkhamsted) (LD)

I say gently to the Minister that if he were to look at the Liberal Democrats’ track record over the past few years, he would see that we have worked really hard to put forward concrete proposals about putting online safety first.

Kanishka Narayan

But not today.

Victoria Collins

No, but we have tried to push that agenda. It is not as if social media came into existence yesterday—Facebook was launched 22 years ago—and the Government brought forward the consultation after pressure from across the House. So I say gently to the Minister that we are trying to work together and that we want to continue to work together in that vein.

Kanishka Narayan

I take the hon. Member’s point about wanting to work together. The Government are committed to doing exactly that. It is not a question of whether we act, but how we implement specific changes to secure our children’s future. I encourage her and the entire Liberal Democrat party to engage with the consultation.

Caroline Voaden (South Devon) (LD)

On that point, will the Minister give way?

Kanishka Narayan

I will make a little progress having already given way twice to Liberal Democrat Members in short succession.

To be clear, it is crucial that we allow for a short, sharp consultation to allow the different parts of the debate to be heard, including crucially the voices of children themselves, who are too often under-represented in the debate. This is a complex area and it is vital that we get it right.

We have already announced that we will act both with speed and appropriate scrutiny to legislate based on the outcome of the consultation. Last month, the Secretary of State set out to the House that technology has huge potential for good: to create goods, to drive growth, to transform our public services and so much more. However, we have also been clear that in order to harness the potential benefits, parents need to have confidence that their children can benefit from the opportunities that the online world offers, ensuring that technology enriches, not harms, children’s lives.

Most children report benefits from being online, such as interacting with their peers, finding useful information or learning a new skill. But we also know that there are concerns about children’s online experience. This Government have always been clear that the protection of children online is our top priority. The Online Safety Act 2023 introduced one of the most robust systems globally for protecting children from harm online.

Anna Dixon

I thank the Minister for his remarks, and I hope that part of the consultation will involve looking at research. The Born in Bradford study is a huge cohort study that has recently looked at social media use by 12 to 15-year-olds in the Bradford district. It found that they are using social media for 3.36 hours per day and that there are associated increases in anxiety and depression. Will the Minister ensure that the harms from social media that we already know about, including that research, will be factored in as he makes decisions, following the consultation, to act swiftly to protect our children from harm?

Kanishka Narayan

I commend my hon. Friend on her consistent commitment to evidence-based policy making in this place, and beyond it too. I commit to her that both the Born in Bradford study, which she mentioned, and wider research will be in the front of the Government’s mind.

Caroline Voaden

Will the Minister tell the House when the consultation will be launched?

Kanishka Narayan

We will be very glad to come to the House as soon as the consultation is launched. It will be very soon indeed. As we have said, Members will expect not just a consultation—[Interruption.] I have not committed to debate the consultation today, prior to having published it. Perhaps the Liberal Democrats will take a lesson from that and follow appropriate procedure in this place.

The illegal content and child safety duties came into effect last year. Those duties represent a major milestone in protecting children from illegal and harmful content online, as well as helping them to have age-appropriate online experiences.

Carla Lockhart (Upper Bann) (DUP)

The consultation and its timeframe are key, because while we procrastinate, online harm is continuing and our children are being put at risk. The statistics around online pornography show that up to 50% of boys aged 11 to 13 have already viewed pornography, and it is influencing their minds on a daily basis with regard to relationships and how they conduct their business. Will the Minister give the House an assurance that the consultation will come to this place very soon? Can he give timeframes thereafter, following the consultation, as to when we will see legislation brought before this House?

Kanishka Narayan

I can confirm to the hon. Member that the Government have committed to act robustly by the summer, which is about as short and sharp as a consultation can get. Instead of procrastinating on this question, I encourage her to engage intensively with the process of consultation and the national conversation.

I mentioned illegal content duties, as well as child safety duties. Under those duties, services must now conduct highly effective age assurance, precisely addressing the point raised by the hon. Member for Upper Bann (Carla Lockhart), to prevent children in the UK from encountering pornography, as well as content that encourages, promotes or provides instructions for self-harm, suicide or eating disorders. Platforms are also now legally required to put in place measures to protect children from other types of harmful content. That includes abusive or hateful content, bullying content and violent content.

Natasha Irons (Croydon East) (Lab)

I thank the Minister for the decisive action that he took over the recent Grok incident. Given the scope of the consultation and the fact that we are talking about online harms, I want to flag the issue we have around content on YouTube, which is a video-sharing platform, not necessarily a social media platform. The type of content that our children are consuming on there is a quick succession of images, which is not very good for a child’s development, rather than the slow-paced stuff we get when we watch a broadcaster. Will the consultation look at the quality of content on these platforms? Not all screentime is equal; some screentime can be quite dangerous for a child’s development in general.

Kanishka Narayan

Both of my hon. Friend’s points—on the scope of which platforms we look at and on their functionalities—are not just considered by the consultation but are deeply important. I engaged with the Australian Minister on this issue just last week, trying to understand their experience of this and the challenge of getting those two things right. That is exactly why the consultation has been an appropriate approach in this context.

Where services fail to comply with their duties in the Act, Ofcom’s enforcement powers include fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater. Ofcom has indicated that it has issued financial penalties to six companies under the Online Safety Act amounting to more than £3 million. I can confirm to the House that just yesterday, Ofcom announced that it has fined a porn company £1.35 million for failing to introduce proper age verification on its websites—the largest fine levied so far under the Act. I welcome this strong action to protect children online.

We have always been clear that while the Online Safety Act provides the foundations, there is more to do to ensure that children live enriching online lives. Like all regulatory regimes, it must remain agile. That is all the more critical given that we are dealing with fast-moving technology. That is why this Government have already taken a number of decisive steps to build on these protections.

The first act of my right hon. Friend the Secretary of State was to make online content that promotes self-harm and suicide a priority offence under the Online Safety Act. That means that platforms must take proactive steps to stop users seeing this content in the first place. If it does appear, platforms must minimise the time that it is online. As well as that, both intimate image abuse and cyber-flashing are now priority offences under the Online Safety Act.

Last month, my right hon. Friend the Secretary of State stood in this Chamber and made it clear that the creation of non-consensual deepfakes on X is shocking, despicable and abhorrent. She confirmed that we would expedite legislation to criminalise the creation of non-consensual intimate images, and I am pleased to confirm to the House that that came into effect earlier this month. That will also be designated as a priority offence under the Online Safety Act, and it complements the existing criminal offence of sharing or threatening to share a deepfake intimate image without consent.

Alongside that, it was announced that we will legislate to criminalise nudification tools to make it illegal for companies to supply tools to be used as generators of non-consensual intimate images. Last week, we went further still and announced that we will introduce a legal duty requiring tech companies to remove non-consensual intimate images within 48 hours of them being reported. These measures will provide real protection for women and girls online.

However, we recognise the strength of feeling up and down the country and right across this House—not least in this debate. We share the concern of many parents about the wider impact of social media and technology on children’s wellbeing. The rapid growth of grassroots campaigns such as Smartphone Free Childhood highlights how concerned parents are about the pull of these technologies and what it means for their children. That includes the potential impacts on mental health, sleep and self-esteem.

We have set out our commitment to supporting parents and children with these issues. We want to find solutions that genuinely support the wellbeing of our children and to give parents the help that they need as they guide children through online spaces safely.

Dr Chowns

I have received contact from hundreds of parents in my constituency and from some young people sharing their huge concern about online harm caused by engagement with social media, so I fully understand the sense of urgency in the Chamber and the desire for quick action. The Government said in January that they would consult. They reiterated that they would consult, and they reiterated that commitment 10 days ago. I understand that the consultation is due to start in March, and the Minister has talked about bringing measures through before the summer. Can he commit to acting with real urgency and bring that consultation forward? What is the delay? Will he commit to bringing legislation—

Madam Deputy Speaker (Caroline Nokes)

Order. The hon. Lady has repeatedly made very long interventions. It was always open to her to attend the opening of the debate and to speak in it.

Kanishka Narayan

I totally agree with the hon. Member’s call for urgency. I assure her that first, the Government will act by the summer in robustly responding to the consultation. Secondly, we have been focused on getting the consultation right, and not just for the wider public; we are ensuring that it is designed for young people’s engagement, which requires particular design features. Thirdly, we are not waiting for the launch of the consultation to have the national conversation. I have been in schools and met parents, as have the Secretary of State and Ministers from across Government, so the conversation has very much started, and I am sure that the consultation is also imminent.

While there is consensus that problems remain, there is not yet consensus on the best way to address them. That is why the Government announced last month that we will be launching our short, sharp consultation and national conversation on further measures. We recognise that while some people support age restrictions on social media for children, there are diverse views on both the “what” and the “how”. Prominent voices in this debate, including the Molly Rose Foundation and the National Society for the Prevention of Cruelty to Children, are concerned that blunt age limits might not be the right approach and risk doing more harm than good. Even among those who support age limits, there are differing views on how to apply them, including which services restrictions should apply to. Those views are worthy of consideration, but we need to consider them properly and responsibly—we owe that to our children.

That is why the consultation approach is the responsible path forward for looking at these issues, considering in a swift and evidence-based way the full range of implications and the most effective way of protecting children and enhancing their lives online. We will consult with parents, the organisations representing children and bereaved families, tech companies and—crucially—children and young people themselves. None of that would be allowed under the motion we are considering today. This consultation, backed by the national conversation, will identify the next steps in our plan to boost and protect children’s wellbeing online. The consultation will include exploring the option of banning social media for children below a certain age, as well as a range of other measures. This will include gathering views and evidence on options such as restricting access to addictive functionalities and understanding what we can do better to support parents in navigating their children’s digital lives. We will also explore whether we should raise the digital age of consent, to give parents more control over how their children’s data is used, and how existing laws on age verification could be better enforced.

John Milne (Horsham) (LD)

The Minister is making lots of observations about the consultation that is going to go ahead—what is going to be in it, and how long it is going to take. What we do not know is when he will commit to bringing legislation before this House to act against social media.

Kanishka Narayan

I am happy to repeat to the hon. Member this Government’s commitment, which is that we will act by the summer. That is about as short and sharp as a consultation period gets. The Online Safety Act took seven years; we are simply asking for one quarter to make sure that young people, parents and families across the country are properly heard from.

John Milne

I understand the consultation, but what about actual legislation?

Kanishka Narayan

I will simply repeat the point I have made, which is that we are going to act by the summer. We have already sought permissive powers to ensure that the Government are able to act on the outcome of the consultation through rapid legislation. I hope the combination of those two commitments gives the hon. Member some assurance.

The engagement and consultation will take place alongside work with counterparts. We will be monitoring developments in Australia on its social media ban for under-16s to share learnings and best practice. We are steadfast in our belief that the right way to deliver the next steps to protect our children online is to be led by the evidence through our short, sharp three-month consultation.

Kirsty Blackman

The Minister has just said that the Government have already sought permissive powers. I understand that they are going to move an amendment in lieu to the Children’s Wellbeing and Schools Bill, but I am not aware that that amendment has been published yet, much less agreement sought from the House. When will that be published, so that we can see what those permissive powers are supposed to be?

Kanishka Narayan

I thank the hon. Member for that point, and commit to her that we are going to try to do that as soon as possible. She will be aware that the legislative process is already very tight, so I will come back to her and the House with the wording of the motion as soon as possible.

Last week, as I have mentioned, the Secretary of State confirmed that we will take new legal powers to allow us to act quickly on the outcomes of the consultation, delivering on our promises to parents. We will make sure that the wording is presented to the House at the earliest opportunity. We also recognise the importance of parliamentary scrutiny and the expertise that parliamentarians in both Houses provide, and have already committed that when regulations are brought forward, they will be debated on the Floor of the House and there will be a vote in both Houses, ensuring proper scrutiny. We are clear that the question is not whether we will act, but what type of action we will take. We will ensure that we do so effectively, in lockstep with our children and in the interests of British families.

Madam Deputy Speaker (Caroline Nokes)

I call the shadow Secretary of State.

16:39
Julia Lopez (Hornchurch and Upminster) (Con)

Today we are debating something that is very important: the protection of children from online harms.

I commend the hon. Member for Twickenham (Munira Wilson) on what I thought was a very heartfelt speech, but I fear that her good intent has been rather thrown under the bus by her party leadership. Setting aside the importance of this subject, let us look at their method of bringing it forward—a point that has been raised rather expertly by Members from across the House. Today the Liberal Democrats are doing what they do best: slightly nutty stunts. With all the menace of Captain Mainwaring, they are attempting to seize control of the Order Paper and effectively declare themselves not only the Government for the day but, with their loosely defined online services Bill, rulers of the internet. It is a gimmick. It is the parliamentary equivalent of boinging into the Chamber on a giant bungee.

Though the hon. Member for Twickenham put a little bit of flesh on the bones in her speech, the motion itself simply requests the power to barge a blank-cheque Bill, for which we have no details, through this House, and in so doing lets the Government Benches clean off the hook. It has all gone a bit Benny Hill. It is a great shame, because it is a distraction when the moment of truth on social media for children is coming to us imminently. We know from the panicked recess briefings that the Prime Minister has been caught on the hop on an issue that is of deep concern to families, children, teachers and communities across the country.

Before too long, the Children’s Wellbeing and Schools Bill will return to this House, and Members will have the chance to vote on a credible proposition: an amendment tabled by the noble Lord Nash providing that no child under the age of 16 should have access to harmful social media.

Kirsty Blackman

If this is the Conservatives’ stance, why, when consideration of the Online Safety Bill lasted for so long—it was even referred back into Committee, which no Bill had been for 20 years—did the Conservatives not ban social media for under-16s through that Bill when they were in government?

Julia Lopez

This is a Conservative amendment in the Lords that has gained cross-party support, so it will be coming back to us. The hon. Member raises an important point about why this policy was not brought in under the Online Safety Act. That Act tried to do many, many things. In many ways, it took so long because it risked becoming a Christmas tree Bill, and many good causes were hung off it. That did cause challenges.

I think that as the debate has moved on we have realised that it is not just about illegal content that children are being exposed to and some of the things that the Online Safety Act was trying to change. There is an issue in general about children being in this space: there are addictive algorithms, and it is not just about illegal material but the fact that it is changing how children are thinking about interacting. Maybe we have to stand back as a society and say, “This is simply not the right place for children to be. We can create adult online spaces, but for children we think that there are other ways in which they should be interacting with the world.”

Victoria Collins

You are talking about the Online Safety Act. Do you think the fact that—

Madam Deputy Speaker (Caroline Nokes)

Order. Does “she” think.

Victoria Collins

Apologies. The hon. Member talks about the Online Safety Act and what happened under the Conservatives. Do you think—

Victoria Collins

Apologies. Does she think that the fact that the Leader of the Opposition tried to water down that Bill and said that we do not legislate for feelings has anything to do with the can being kicked down the road and us not having made the necessary progress?

Julia Lopez

There were very real and important debates during the passage of that Bill about legal but harmful material and whether people should be able to speak freely online. Our approach was to seek to create a space where adults can speak freely while accepting that children should not be in some of these spaces. That was the point that the Leader of the Opposition was trying to make.

We were moving very dangerously into the realm of policing speech, and it is not for an online regulator to start telling people what they can and cannot say online when it is not something that is illegal to say in the real world. That was the challenge that we got ourselves into as a Government, and that is why we changed parts of the approach that we were taking to the Online Safety Bill. I appreciate the concerns that are being raised, and I am trying to answer them as honestly and straightforwardly as I can.

When we consider the amendment from Lord Nash, this House will have its opportunity to make an unequivocal statement of principle: that when we believe that something is harming children at scale, we accept that it is insufficient to leave the status quo unchallenged or simply to commission a consultation. That applies especially when it is a consultation to which this Government have provided absolutely no political direction or view and that has been much trailed but still not actually launched. In truth, this consultation was not ready. It was a mechanism to get the Prime Minister out of another of his tight fixes.

The Tech Secretary might be very good at emoting and telling us all how impatient she is for change, how she cares, and indeed for how many years she has cared, but when she made her statement on social media for children in this Chamber a few weeks ago, she said nothing about what the Government would actually do, beyond seeking more time to take a position. I commend the hon. Member for Twickenham for pointing that out, and I have sympathy with why she is trying to use this mechanism today, because we are all trying to tease out what the Government are seeking to do.

It was extraordinary to listen to the Government Minister, who said with great sincerity, “We will act robustly in responding to a consultation.” What does he actually believe? What do the Government think we should do on this issue? Nobody has a clue. They are talking about a huge range of things that could be done, but it is for a Government to provide political direction; it is not for a Government to seek consensus. [Interruption.] It is for a Government to take a position and to take a view. It is for a Government to have opinions. It is for a Government to have policy positions. It is not for a Government to try to make sure that everybody in this House agrees. [Interruption.] It is pathetic to see those on the Labour Benches getting out of their tree about this.

Natasha Irons

I sincerely thank the hon. Lady for giving way. When we talk about the consultation, it is not necessarily about seeking consensus in this place; it is about seeking consensus with parents and children, and with people outside this place. Banning social media for children is a good approach, but this is not just about that, is it? It is also about the time that our kids are spending on screens. That is what this is about: it is about having a digital childhood that we can all get behind and support.

Julia Lopez

I can agree with that. My point is that this Government are trying to suggest that a consensus can be found in the absence of their having a policy position. They are talking about a consultation, but what on earth are they consulting on? Nobody has a clue. They have not been able to say anything about what they actually want to do, because the Prime Minister has no opinions, which is why he is in such deep trouble. Those on the Labour Benches can get out of their tree and get all uppity about it, but this—[Interruption.] No, the Prime Minister is being blown around like a paper bag on this issue, and everybody knows it. First of all, he said that his children did not want to ban social media; now he says that his children are the reason why he wishes to ban social media. He said there is going to be a consultation, but it has not materialised. What does this man actually think?

Kirsty Blackman

I am glad that the hon. Member has been very clear that her position is that she supports the Lords amendment that seeks to ban social media for children. Is she aware that it would not apply in Scotland, because the territorial extent of the Children’s Wellbeing and Schools Bill, apart from one clause, does not include Scotland? I take it that her position is that she wants a social media ban only for children who do not live in Scotland.

Julia Lopez

I am sure the applicability of the legislation in Scotland is something that can be debated when the Bill comes before the House.

To give them credit, many Labour MPs understand that there is an absence of any Government position, and they will not be taking their foot off the pedal. I suspect that many may have the guts to speak out today—although perhaps not. Those MPs recognised immediately that a consultation is a mechanism for a delay that goes beyond the summer and into another parliamentary year before there is so much as a sniff of legislation. That holding position is now falling apart, as we have seen from the Minister here today. It is the threat of a very large group of Labour MPs backing the Conservatives’ Lords amendment that is pushing this Government into action—it is government by rebellion. We ask the Liberal Democrats not to let us be distracted from the moment of truth that is coming up, when we hope there will be cross-party support for the noble Lord Nash’s amendment.

For too long, the internet has been treated as a space that cannot be governed. It has functioned like a pioneer society, with extraordinary opportunity but minimal rules. However, pioneer societies improvise customs and eventually retrofit themselves with rules to sustain societies, often after hard-won experience and dispute. That is the process through which we are now going, and we are realising that, as the online society was built, we were not vigilant enough when it came to protecting childhood. We did not recognise that this new territory would bleed into the old world. [Interruption.] The Minister is shouting from the Front Bench that I am embarrassing myself. We as a Government brought forward the Online Safety Act, but there are gaps in it, and we have taken a clear position as the Opposition that we think children should not be on social media. He is looking very angry, but what is his view? Can he stand up and tell us what his personal view is? As the Minister with this responsibility, what does he think should be done, having launched his consultation with such earnestness? Come on, tell us! Would he like to tell us?

Madam Deputy Speaker (Caroline Nokes)

Order. Could I just be helpful? A lot of help has been needed this afternoon. The Minister has not asked to intervene, and the hon. Lady cannot force him to intervene on her.

Julia Lopez

Thank you, Madam Deputy Speaker. I was pointing out that the Minister has no manners, but wishes to shout from a sedentary position. I sat listening to him and waiting to see if I could decipher, in his very long and self-regarding diatribe, whether he actually has any opinions, but it turns out that he does not. He is very comfortable to sit on the Front Bench and chunter away at me. [Interruption.] You see, he again says that I am such an embarrassment.

Gareth Snell

I have listened to what the hon. Lady has said, but last week I talked to a 15-year-old, who said to me, “We have no youth clubs. We go on the street, and I don’t feel safe and I get told I’m a nuisance. So I come home, and I interact with my friends online. Now I’m told I can’t do that.” I am not sure what the right answer is, and I sometimes think that not knowing the answer is as good as having absolute certainty all the time about everything. What would she say to that 15-year-old about the outcome for her? She is asking what she can do and how she can stay in touch with her friends. We do not have an answer to that yet, so what are the Conservatives offering?

Julia Lopez

I respect the hon. Member’s intervention for its politeness, but I do not think the answer is suddenly to encourage all children who are finding it hard to find purposeful and meaningful activities in the real world to retreat to their bedrooms. One of the challenges we have seen is that children have felt that the online space is the most stimulating for them. Unfortunately, that has led to an even greater retreat from the real world, and I think we can all recognise that that has been a negative for society.

Sam Carling (North West Cambridgeshire) (Lab)

The hon. Lady has been very clear that she wished the Government had just charged forward in some direction or other. I have had hundreds of constituents email me about this, from various perspectives and with various concerns about the workability of certain solutions. I would like to listen to them, and I think it would be really helpful if the Opposition parties tried to do likewise and engage with this process, rather than just criticising whatever approach we take.

Julia Lopez

I appreciate where the hon. Member is coming from. I do not think it is wrong to seek evidence and ask for people’s views, but the Prime Minister should be honest about what he wants to do. The problem is that he has been floating various opinions, and he is being buffeted by Labour MPs and by the Opposition and others. If he does not think this is the right approach, he should feel confident in saying so. He has said a whole range of different things about this, and the Government are seeking to launch a consultation, but nobody actually knows what precisely is being consulted on.

If Labour MPs were honest with themselves, I think they would recognise that. I suspect they are having very serious conversations with the party’s Whips, saying, “Well, actually, we would like to know what the Prime Minister does think about this issue, because we’re not convinced by this consultation—we think it’s kicking the issue into the long grass, and we’re worried about the length of time that will mean before we get legislation to protect children from various challenges online.” That is the very reason why the Minister has stood up before them today to say, “We are probably going to do something—very definitely, maybe—in the summer.” He is saying that because the pressure is growing from Labour MPs. It is being briefed out that the Government are going to bring forward amendments to the Bill because they are being buffeted into doing so.

The problem is that nobody knows what this Prime Minister believes. On every single issue for the Government at the moment, and despite the very large Labour majority, this Prime Minister is being buffeted around, and that is the problem.

Wera Hobhouse

I am very much enjoying the hon. Member’s speech, and I am wondering why she therefore cannot support our motion this evening.

Julia Lopez

I set out clearly at the beginning of my speech why we cannot support the motion, which is effectively a blank cheque. Notwithstanding the fact that the hon. Member for Twickenham tried to set it out in her speech, nobody actually knows what the Lib Dems are trying to do here. The proposal before us is that the Liberal Democrats take control of the Order Paper and then can say whatever they like on internet governance. I am sorry, but I do not think that is the way to conduct ourselves in Parliament. There have to be clearer proposals.

Gareth Snell

I agree with the hon. Lady on this point. The other problem is that the motion caps the amount of debate at four hours—two hours for Second Reading, and then two hours for Committee and Third Reading. This will presumably have to be a meaty, multi-clause Bill to deal with an issue as complex as internet governance, and it will be unamendable by this place because of the timescales available. It will not have the line-by-line scrutiny that would normally happen in Committee, and most of the amendments that get tabled will fall because there will not be time for Members to propose them. This is not a solution that brings consensus; this is the Lib Dems railroading through policies on a really complex issue that they cannot get through by conventional means.

Julia Lopez

I agree with the hon. Member wholeheartedly.

Until now, we have implicitly decided that childhood must simply adapt to an environment that we as adults find totally overwhelming, undermining of our own sense of self and completely irresistible. We have been exposing our children to this place of no settled social rules where that exposure is constant, the boundaries are porous and responsibility is diffuse. Behaviour that would never be tolerated offline is normalised, monetised and then algorithmically amplified. The Online Safety Act, which we have discussed already, has been a step forward in trying to wrest back control, but it is, of course, an imperfect one. It focuses primarily on illegal content, seeks to keep the most extreme material offline and introduces age-gating for pornography and other over-18 content. That work does matter, but the problem before us today goes well beyond illegality and explicit material. There are also many concerns about the complexity of policing content, in terms of both the implementation and intent.

The central question is not just what children see but how social media works. Social media platforms are addictive by design. Their algorithms are engineered to maximise engagement and stickiness. They reward outrage, comparison, emotional intensity, competition and repetition. They draw children away from purposeful activity and into feedback loops that erode attention and resilience. Not all platforms operate like this globally, funnily enough. The Chinese version of TikTok is time-limited and feeds children content of scientific or patriotic value. In the west, it is emotional arousal that is fed to our kids.

Children are not simply consuming content; they are being shaped by the environment itself. It is happening when their brains are still developing. Their impulse control, emotional regulation and ability to assess risk are not the same as for adults. We recognise this everywhere else in law—in alcohol limits, in safeguarding rules and in age of consent protections—yet online we have decided to suspend that logic, and the consequences are increasingly visible.

Natasha Irons

I am new to this place and clearly still learning, but I am wondering why, in that case, measures on designing out at source the harms that the hon. Member is talking about were watered down in the Online Safety Bill. She is absolutely right: we are creating online worlds, and they should be designed to be safe. Just as we design clothes for children that do not have toxic materials in them, we would hope that the spaces they inhabit online also do not have toxic material in them, so why were those protections not strengthened in the Bill that the Conservative party passed when it was in power?

Julia Lopez

I have set out before what we were trying to achieve with the Online Safety Act and why certain things were in it and others were not. I do not want to go over that again.

The consequences of these design features are increasingly visible, including rising anxiety and low mood, poor sleep, shredded attention spans and cyber-bullying that follows children home.

Freddie van Mierlo (Henley and Thame) (LD)

When I was growing up, social media was genuinely social—we would spend our time on it speaking to our peers and classmates. I remember MSN Messenger and Facebook when it first arrived. Social media has evolved to become this addictive, content-driven place where we are fed information. Does the hon. Member think we should perhaps differentiate between social media platforms that are genuinely for peer-to-peer interaction and help young people, and those that just feed content to them?

Julia Lopez

I thank the hon. Member for that intervention—I went off on a nostalgia trip in my brain, thinking about MSN chatrooms and all the rest of it. That was a time when people were not really aware of the power of the internet, and the predatory behaviours subsequently started to become normalised and industrialised. Although it might be tempting to want to try to go back to that place, I do not know whether we can actually get there, but it is certainly something we can aim towards and aspire to. The hon. Gentleman has made an important point. The essence of social media does not involve bad intent; the problem that we are seeking to solve is the way in which it has been manipulated and changed over the years to amplify negative behaviour.

Freddie van Mierlo

What the hon. Member has just said suggests that she might actually support the Liberal Democrat policy of age-rating social media platforms. That might lead to a new ecosystem of genuinely peer-to-peer, lower-harm products, which would be a good thing for young people.

Julia Lopez

We think that the current priority is ensuring that under-16s are taken off harmful social media platforms, but I am sure that there is room for a market to develop, over time, that will not feature negative algorithms and activity, and that there is a world in which new products could retain the essence of positive social interaction.

Claire Young (Thornbury and Yate) (LD)

Is the hon. Lady not concerned about the possibility that if we simply ban a list of social media platforms, we will provide an opportunity for new ones to develop and cause a problem while not allowing existing ones to develop in ways that will be less harmful?

Julia Lopez

I am sure that the issue of the functionality list can be explored as time goes by.

It is important to point out that this is not a moral panic but a structural problem. Today the Leader of the Opposition gathered a panel of grieving parents who had lost their children, and in that context negative online activity was recognised to have real-world and utterly tragic consequences. The children had been drawn into dangerous challenges, coercive relationships, bullying and bribery, all of which created despair in those young minds.

That showed us plainly why the pioneer phase must now come to an end, at least where children are concerned. Pioneer societies do not remain lawless forever; eventually they are retrofitted with rules, boundaries and protections for the vulnerable. It is striking that, after years of the problem building up, countries around the world are reaching the same conclusion with remarkable synchronicity—not because it is fashionable, not because Governments are copying one another, and not because anyone thinks that this will be particularly easy to impose and enforce, but because the evidence has accumulated to a point at which denial is no longer credible. If social media were broadly harmless for children, this would not be happening, but Governments with very different political traditions are acknowledging the same reality: that when it comes to children, some control must be wrested back. I suspect that this trend will be reflected vividly in the Chamber today, with examples from across the nation of what is happening in the real world because of the laxity in the online world.

Kirsty Blackman (Aberdeen North) (SNP)

I asked the hon. Lady’s Government to ban suicide forums that encourage young people to harm themselves. I asked her Government to ban eating disorder forums that encourage eating disorders. Her Government refused to do that in the Online Safety Act 2023, despite our asking for it to happen. How can she stand there now and take the moral high ground when her Government refused to ban the worst, most egregious, most harmful platforms? The Conservatives do not have a moral high ground on this issue.

Julia Lopez

I am not seeking to occupy a moral high ground. I am seeking to set out a way towards keeping children under 16 off social media platforms, because trying to legislate for specific different activities is very challenging, as I think we saw with the Online Safety Act. There are very good causes and there are very important activities that we sought to stop online, but turning that into a workable law is a huge challenge. That is one of the reasons why we think it important to take a “whole of society” approach that tries to shift the debate and say that certain types of online space for people under 16 are simply not appropriate—a principles-based approach to governing the online world that tries to steer away from some of the difficult debates about how to write implementable law to stop nasty and negative behaviour.

Gareth Snell

I thank the shadow Minister for giving way again; she is being very generous. I confess that I have not made my mind up on this. Let us suppose that there was a blanket ban preventing anyone under 16 from accessing material of this kind. How does the Minister envisage that being enforced? Will enforcement sit with the parents ultimately, and if they are not able to carry out that enforcement, what will be their criminal liability? There are genuine challenges when it comes to what children can access, and who is made ultimately responsible for enforcing a simple approach that could be quite complex to implement.

Julia Lopez

I would not envisage that parents would be responsible for that. There are mechanisms to ensure that platforms are not permitted to provide accounts to under-16s, and they would have to have highly effective age-assurance techniques. In fact, I have spoken recently to representatives of a major platform who said that they had very effective techniques for testing whether somebody trying to open an account is the age that they say they are. I will not take further interventions for a little while so that I can make progress, as I know other people want to speak.

There are serious arguments against implementing a ban, some of which have been heard, and they deserve to be addressed and not dismissed. We are likely to hear more about those doubts today and they must be listened to respectfully. Indeed, I hold some of those anxieties and reservations myself. The first argument is that a ban would be unworkable and that teenagers would find workarounds through virtual private networks, foreign platforms or fake credentials. They will, of course, because teenagers have always tested boundaries. Fake IDs, sneaky booze and under-age rule-breaking are traditional parenting challenges, but we do not abandon age limits simply because they are imperfect. Instead, we impose them because they change norms, shift behaviours and offer parents reinforcement rather than resistance. Of course, the mandatory age limit will not remove every child overnight, but it will remove a critical mass and that matters.

Some fear that such a ban would require de facto compulsory digital ID, undermining anonymity and civil liberties, and again, that concern must be taken extremely seriously. However, as I have just suggested to the hon. Member for Stoke-on-Trent Central (Gareth Snell), age verification does not require a single state-mandated digital identification system. Other jurisdictions have explicitly prevented platforms from requiring accredited digital ID and instead mandated multiple verification techniques, with responsibility placed on platforms and not citizens. As I said, I was speaking to a major tech platform recently that set out some of those techniques, which can now be used very accurately to assess a user’s age. However, we must be clear that we do not have a surveillance state simply because 13-year-olds are kept off Facebook.

A third argument, and a point that has been made, is that social media provides vital support and connection for many children, particularly those who feel isolated offline. That can be true, but it is not an argument for leaving the entire system untouched. This is not about banning the internet, messaging, educational platforms, health support or professional development services; those places can and should remain accessible, and that is happening in other jurisdictions. This is about a specific category of platforms whose business models depend on maximising attention and emotional arousal and which are demonstrably harmful at scale. Another concern is the unintended consequence that children may be pushed into darker corners of the internet. That needs to be included in the Government’s consultation when it eventually sees the light of day, particularly whether parental consent should be required for downloading certain apps.

Doing nothing already leaves children exposed, in plain sight, on platforms that we know are optimised against their wellbeing. Protection will never be perfect, but neither is inaction benign. Doing nothing is not neutral. It leaves parents despairing, schools firefighting and children navigating a digital frontier with no one by their side. There is also a broader freedom argument, which is that by keeping children off adult social media platforms we can restore freedom to adults online and will no longer need to contort those digital spaces to be universally child-friendly, which is where some of the challenges have come in.

Finally, this is about leadership. As I said earlier, a consultation without direction is not leadership, and a consultation that pushes real change 18 months down the line is, in truth, a decision to do nothing now. Labour MPs know that, which is why the coming moment will not rest on this rather nutty Lib Dem takeover attempt. Instead, it will rest on the Nash amendment, when this House will have a clear choice: to accept that the pioneer phase is over; to recognise the sanctity of childhood, which deserves clearer rules; and to acknowledge that giving parents support is not the same as the state stripping them of their ultimate responsibilities. Parents will and must always be the first line of defence. When harm is real and growing, leadership requires a decision, even when the answers are not perfect.

Madam Deputy Speaker (Caroline Nokes)

As Members will know, the debate has to conclude by 7 o’clock. There are slightly more than 10 people bobbing. I plan to move to the wind-ups at 6.40 pm, which should leave everyone plenty of time.

17:09
Dame Chi Onwurah (Newcastle upon Tyne Central and West) (Lab)

I am grateful to the Liberal Democrats for bringing forward this debate on protecting children from online harms, although I remain uncertain as to the measures they are proposing. This debate is happening up and down the country, in homes and at school gates—indeed, wherever people gather—so it is right that we debate it here. If the Conservatives had done something during their critical 14 years of power, our children would be better protected now, but they did not, so it falls to us to take action.

I am going to speak about three things: online platforms, their history and approach; the work of my Select Committee, the Science, Innovation and Technology Committee, on algorithms; and the work of the Committee on digital childhood, all within the context of protecting children from online harms.

The key online players range in age from pre-teen—TikTok was founded in 2016—to their late 20s, as Google was founded in 1998. In human terms, these platforms are just entering or leaving adolescence, and it shows.

As hon. Members across the House may have heard me mention, I am an engineer—chartered, as it happens; thanks for asking—and my last job before entering this place was head of telecoms technology for Ofcom. I remember meeting people from a US platform, which shall remain nameless, around 2005. The company executive commented that they had come to the UK from Silicon Valley on a six-month contract to sort out Government affairs, and they could not understand why, two years later, discussions were still ongoing. Did we not realise that Government had no role in what they did?

I say that to illustrate that tech platforms have their origins in a libertarian, small/no-government tech bro bubble that has spread globally. TikTok, as a Chinese company, has a different background, but public accountability is not necessarily part of it. Unfortunately for all of us, the Conservative-Lib Dem Government of 2010 and their successors shared the view that Government should not be a part of it, which is how we arrived in 2024—20 years later—without online harms regulation, while at the same time the use of social media and life online has exploded. That is why I consider the Tory position in this debate to be a superb example of hypocrisy.

Monica Harding (Esher and Walton) (LD)

The hon. Lady is making a powerful speech about the evolution of social media platforms. I have four children; the first was born in 2004 and the last was born in 2011, so their births have spanned that evolution. Facebook began in 2004; TikTok began in 2016. If that evolution was the industrial revolution, we would be around the spinning jenny stage, with AI chatbots the next destination. Those chatbots are terribly dangerous for our children, and we need to regulate them now. That should be within the Online Safety Act.

Dame Chi Onwurah

I agree that AI chatbots are a further evolution, and I think we should learn from the lack of effective regulation under the Conservatives during that critical period in the evolution of the internet in how we approach AI. I agree with the hon. Lady that AI chatbots should be brought into the regulatory environment of the Online Safety Act.

Matt Rodda (Reading Central) (Lab)

My hon. Friend the Chair of the Select Committee is making an excellent speech. Her background in this area is really showing in the detail with which she is exploring these issues. Part of the challenge here is that we as parents are struggling to catch up with this revolution, which is gaining speed all the time. Perhaps my hon. Friend would highlight some of the challenges that parents face. For me, part of the importance of the consultation is to allow parents to think more deeply about this difficult issue; there are often different opinions from campaigners who have had the most painful experiences.

Dame Chi Onwurah

My hon. Friend makes an excellent point. It is for that exact reason that I support a consultation: this is part of a debate, and we all need to improve our understanding of the impacts of this technology. Parents are in a difficult position. I do not believe parents should have to be technology experts in order to give their children the best start in life, but unfortunately there is so much pressure in the online world that that seems to be the case right now, and that is why it is right that Government take action and consult on the action they take.

Let us think about the evolution of these technologies. I remember that when I joined Facebook in 2005 I had to use my university email address to join—that meant I had to be over 18. Some 20 years later, 13-year-olds and younger are having their lives and brains formed by almost uninhibited access to social media. In the UK, the number of social media users has gone from practically zero to four fifths of the population. I have worked with the Molly Rose Foundation, a charity established by the Russell family after their daughter Molly took her own life at the age of 14 following exposure to self-harm content online; I have spoken to the bereaved parents of children bullied to death online; and I have spoken to the Internet Watch Foundation about the horrendous images its staff see of child exploitation. The fact that the Conservatives did nothing in all those years in government is, in my view, a form of political negligence of the highest order.

As part of my Committee’s inquiry into social media and algorithms, Google, Meta, TikTok and X told us that they accepted their responsibility to be accountable to the British people through Parliament, which I thought was quite a step forward from previous utterances, and ongoing utterances, by some tech billionaires who shall remain nameless. Our inquiry found that our online safety regime should be based on principles that remain sound in the face of technological development. Social media has many important and positive contributions, including helping to democratise access to a public voice and to connect people far and wide, but it also has significant risks—and those risks can evolve with the technology. We spoke about AI as an evolution, and one of the main failings of the Online Safety Act is that it regulates particular services rather than establishing principles that remain true and can be part of a social consensus as technology evolves.

Bobby Dean

The hon. Lady is making an excellent speech. Should one of those principles be related not only to content but to the addictive nature of these platforms? One of the changes I have witnessed on social media over time is algorithmic addiction. The greatest minds in the world are now working out the circuitry of our brains and driving content towards us so that we look at our screens for longer so that they can sell more ads. Does she agree with that point?

Dame Chi Onwurah

I really thank the hon. Member for that intervention, because that is exactly one of the recommendations of the Committee’s inquiry. As he says, the advertisement-based business models of most social media companies mean that they promote addictive content regardless of authenticity. This spills out across the entire internet via the unclear, under-regulated digital advertising market, incentivising the creation of content that will perform well on social media, as we saw during the 2024 unrest following the horrendous Southport attacks.

This is not just a social media problem, though. It is a systemic issue that promotes harmful content and undermines public trust. The Committee identified five key principles that we believe are crucial for building public trust. The first is public safety. Public safety matters; I hope it is not necessary to debate that. The second is free and safe expression, which is also very important. The third is responsibility on the part of the platforms. Right now, they have no legal responsibility for the content they amplify; they just have to follow their own processes in certain specific cases. Our fourth principle involves control, and the fifth and final principle is transparency. We made detailed recommendations on regulating the advertising-based business model so that amplification would not be incentivised in the way that was outlined by the hon. Member for Carshalton and Wallington (Bobby Dean). We also recommended a right to reset—the right of a person to remove their data from any algorithm.

Our report came out not long before the Minister took up his position. The Government accepted all our conclusions but none of our recommendations. I urge them to look again at our recommendations and to consider implementing them, or at least to respond and tell me why they are still not to be implemented. I welcome the Government’s recent actions and interventions and their readiness to intervene. As I said, the consultation is critical. I welcome the desire to promote a consensus and to take measures to ensure swift delivery of the consultation conclusions through the Children’s Wellbeing and Schools Bill. The consideration of the inclusion of AI chatbots is important, as is addressing the risky features in certain models, as well as providing support for bereaved parents. The Committee looks forward to working with the Government to try to achieve their aims. We need evidence to drive policy and regulation based on principles that the public can have confidence in.

Natasha Irons

I wanted to intervene on the point about principles, content and responsibility. I worked for Channel 4 before I came to this place, and we were regulated by Ofcom. Channel 4 did not create its own content, but was responsible for the editorialisation of that content. It was beholden to certain standards. Does she agree that we should be holding these media companies—they are not now “new media” companies, but legacy media companies—just as responsible for the content they put out on their platforms as any broadcaster?

Dame Chi Onwurah

My hon. Friend makes an important point; the insight she brings from her career in the media is critical. For many years, while the platforms were just that—platforms on which other people placed content—there was an argument that they should not be regulated and that they did not have a responsibility for the content on them, but they are at the very least active curators of that content now. Algorithms effectively form digital twins of individuals and then drive individualised content at them. That requires a responsibility. The time is right, as our Committee recommended, to ensure that platforms have responsibility for their content.

The Science, Innovation and Technology Committee will be holding a one-off session on social media age restrictions on 11 March to feed into the Government’s consultation on measures to keep children safe online and to hear from social media companies on their progress in the last year. We will also gauge the strength of the evidence for and against an age-based ban on social media, as well as any evidence relating to proposed alternatives to a ban. In doing so, we will hear from experts and representatives of those with direct experience of harms. We want to hear from both sides of the debate in the UK and will be seeking evidence from Australia on the first few months of the ban that is already in force there. We will be hearing from major social media and technology companies in a follow-up to our algorithms and misinformation inquiry, and we will ask for their views on the proposed age limits.

Finally, the work on social media age restrictions will feed into a larger inquiry on the neuroscience of digital childhood, which we will launch in the coming weeks. We want to find out how young people spending their formative years online affects their brains and what the Government should do to protect them from any negative impact. That could cover the impact of social media and other screentime on brain development, behaviour, and physical and mental health, whether positive or negative. It could also cover the physiological impact on eye development, the impact on socialisation and what actions Governments should take. There is a consensus on the need to do something, but not on what needs to be done. That is why we are seeking to provide evidence.

I always say to the platform companies that the opposite of regulation is not no regulation, but bad regulation. More regulation is coming. Several US states, such as California, have brought in new regulation on big tech. The Spanish Prime Minister has called social media a

“failed state where laws are ignored and crimes are tolerated”.

There is also the increasingly significant issue of technology sovereignty and whether we are too dependent on foreign companies for our online environment. I call myself a tech evangelist, and I am, but I also know how much an engineer costs. The starting salary of an AI engineer—if companies can find one—is well over £100,000 a year. Tech companies are not going to put them to work on protecting and keeping our children safe unless the House puts the right incentives in place. With all due respect to the Minister and the Online Safety Act, which he inherited, they are not in place now.

17:26
Dr Danny Chambers (Winchester) (LD)

This week is Eating Disorders Awareness Week, so I would like to pay tribute to the amazing staff at Leigh House in Winchester, an in-patient unit that cares for people with eating disorders in Winchester and the surrounding area. Eating disorders are possibly some of the most serious mental health conditions people can suffer from, and the most frustrating to treat and care for. They existed before social media, but social media is certainly making things more difficult. The body images that young people—teenagers and younger—are exposed to, and the normalisation of AI-altered images that are impossible to attain but which are presented as normal and aspirational, are hugely unhealthy.

We know that AI chatbots, which are often integrated into social media, are giving people mental health support and advice. I am really concerned about reports that patients with eating disorders are managing to get advice on how better to lose weight or even gain access to weight-loss drugs, which would make their condition much worse. I bring that up because there is a specific problem with AI chatbots. Some research shows that children do not recognise that a chatbot, which is often presented within social media as a companion, friend or cartoon character, does not have feelings, is not a person and does not care for their health and wellbeing.

It is very possible that, with the right regulation, AI and AI chatbots could be part of extending mental healthcare to people in the community at some point in the future. At the moment, it is dangerous and unregulated, and people accessing it are not even aware that it is giving them information that is potentially harmful to their health. I do not want this to fall through the cracks of regulation. Whatever we come forward with, whether it is about social media specifically or broadcasting licences, we should bring forward principles. Banning or regulating specific social media platforms or chatbots will be very unhelpful because of the speed at which these things are developing. It is a bit like whack-a-mole: once we regulate one, another pops up.

I draw everyone’s attention to the GUARD Act—guidelines for user age-verification and responsible dialogue—which was passed in the US last year. Very unusually, it had cross-party support despite the very fractious politics in the US at the moment. It regulates AI chatbots by requiring them to remind users regularly that they are not human or qualified to give medical advice, and it ensures that chatbots are not allowed to provide sexual content or have sexual or grooming-type discussions, and that users do not believe that they are speaking with therapists. I hope that we can focus our minds—especially during Eating Disorders Awareness Week—on the potential danger of young people being given what they believe to be medical advice by chatbots, which may be presented to them as friends or cartoon characters. That advice could be hugely harmful to their health.

I urge the House and the Government to move with extreme speed to address the problem. About two or three years ago, most of the general public had not really heard of ChatGPT. Now, we hear that around 50% of professionals use it regularly, over one in five people use it daily, and one in three adults have already turned to chatbots for mental health advice or emotional support. That is a huge and sudden change. It is penetrating our culture and daily use. We must ensure that we do not look back on this as we did with smoking. We knew for years the damage that smoking was doing to people, but action was not taken, evidence was obfuscated and lawmakers were lobbied. They delayed, and people died needlessly. We must get ahead of this and take action as quickly as possible.

17:31
Emily Darlington (Milton Keynes Central) (Lab)

This week is Eating Disorders Awareness Week, and we must remember the acceleration of online harms. We have heard horrific accounts of ChatGPT giving young people diets of 600 calories per day, which is just appalling. We know the suffering and pain caused by seeing images tagged with the terms “ana”, “thinspiration” and other terms that should go. The promotion of such content is now a category 1 offence, and Ofcom should be weeding it out. The hon. Member for Winchester (Dr Chambers) is absolutely right to say that that measure should be extended to bots.

I thank the Chair of the Science, Innovation and Technology Committee, my hon. Friend the Member for Newcastle upon Tyne Central and West (Dame Chi Onwurah), for her fantastic speech. We have taken this matter seriously since the very beginning of the parliamentary Session, and we have done a lot of work on it. I echo her call for Ministers to look again at the recommendations in our Committee’s “Social media, misinformation and harmful algorithms” report, which goes well beyond misinformation and into how the damage is done.

Protecting our children and young people online is extremely important. The Online Safety Act was an important step forward, but it has not been fully implemented by Ofcom, it is not proactive enough, and it is too dependent on what social media companies themselves tell Ofcom. In the spirit of consultation—I know that we will get to that—I have done my own consultation with 500-plus 14 to 16-year-olds across my Milton Keynes Central constituency. Some 91% of them have a phone, and 80% have social media profiles. However, what will surprise the House is what young people consider social media profiles to be. We consider them to be Facebook or Instagram, while they consider them to be YouTube and Roblox—two organisations not covered by the Australian model. Additionally, 74% of those 14 to 16-year-olds spend two to seven hours online a day. Let me remind the House that, at that age, the brain development of young women is close to finished, while for young men, whose brain development does not finish until they are about 25, it is nowhere near complete. We know that from the science—just to be clear, that is not an opinion. Brain development in young women and girls happens differently, so should we therefore have different rules for young women and men?

Fifty-nine per cent of the 14 to 16-year-olds have been contacted by strangers, and more than a third of those contacts were through Roblox, which is not covered by the Australian social media ban. Thirty-three per cent have been bullied, and a third of those cases were on Roblox. The Australian social media ban—which I assume is what the Liberal Democrats are talking about when they say they are in favour of a ban—does not cover YouTube or Roblox, and we have not even looked at whether it is effective. A ban is a blunt tool that essentially raises the flag of surrender to social media platforms and declares that there is no way of making social media safe. That is essentially what the Conservatives did when the Online Safety Act 2023 was passed: they said, “We cannot go far enough, so we are going to roll back. It is about free speech.” No, it is not about free speech. Freedom of speech was written into law in this country and spread around the world, so we understand how to protect it and limit its harm. The Online Safety Act was a missed opportunity. It also took seven years to get through this House, but we do not have seven years to wait.

There would also be unintended consequences to a ban. I had the pleasure of meeting Ian Russell the other night, and we had a really powerful discussion. My heart goes out to him, as one parent to another, given what his family have been through. He does not jump to the easy solution of a social media ban. The Molly Rose Foundation has done a brilliant briefing paper, which every MP should read, about why it does not support a ban: it wants the online world to be safe for children, but a ban does not make it so.

Matt Rodda

My hon. Friend is making an excellent speech. I commend her work in reaching out to young people; it sounds superb. The lesson may be that we should all do exactly that. I am running a survey myself. She mentioned the Molly Rose Foundation, and I have met some of its staff to discuss its work. A family in my constituency of Reading suffered a terrible incident—their son was murdered in an incident of online bullying—and they have a different view. Does my hon. Friend agree that it is important that we properly listen to the families and consider the different views in the consultation?

Emily Darlington

I absolutely do. My full sympathy goes to that family in my hon. Friend’s constituency—it is the worst thing in the world for a parent to lose a child. But we have to get this right, which is why it is right that we have a consultation. It does no child any good if we jump to a conclusion that does not actually protect children.

Although I maintain an open mind, I worry about a full ban. Some children rely on social media for connection, often including those who are exploring their sexuality—LGBTQ+ people—and those who are neurodivergent. The consequences for them could be devastating, so we need to consider their views. If young people get around the ban, as they do in Australia, they are less likely to report when they see harmful content or are being targeted on social media, because they worry that they will get in trouble for breaking the law.

A ban would create a cliff edge at 16. No matter the person’s maturity—I have already talked about the different brain development in young women and men—their skills or what they have been taught, there is a cut-off at 16. All of a sudden it does not matter, and they go into a world that is not safe. Younger children do not have their own social media profiles; they use their parents’ devices. Often, they start with a video of Peppa Pig, and all of a sudden—who knows where it ends up? A ban would not address that. So, what is the solution? Doing nothing is not an option—I think the whole House can agree on that.

Monica Harding

I was interested in the hon. Member’s survey. I have done my own very unscientific survey of young people, and all of them seem to want some form of regulation. With that in mind, we must hurry up—does the hon. Member agree?

Emily Darlington

I absolutely agree. Young people, particularly those in the mid-teenage years, understand this issue in a way that sometimes we do not because, quite frankly, our online experience is completely different from theirs. If Members want to test that, they should open an app such as Pinterest and compare what is fed into their Pinterest boards with their child’s Pinterest boards. It is a completely different experience. If Members do not have children, they should ask a younger member of staff to open the same app on a different phone, and they will see a completely different world.

Kirsty Blackman

A local organisation in my constituency, CyberSafe Scotland, surveyed children about what they were being fed on TikTok. There is a road in my constituency called North Anderson Drive, and children on one side of North Anderson Drive were being fed different content to the children on the other side of it. It is not just an age thing; it is really specific, and we cannot understand what each individual person is seeing because it is different for everybody.

Emily Darlington

That is a very important point about how sophisticated the technology has become. When we ask companies to take action to stop outcomes, the technology exists to do that. We are not asking them to reinvent the wheel or come up with new technology. It already exists because they are even microtargeting two different sides of the road.

Having discussed this with experts, parents and—most importantly—young people, what do I think we need to consider? First, we need to fully and properly implement the Online Safety Act 2023. That must be done at speed, and it requires nothing from the House. It has been a request of the Secretary of State and the Minister, and I recommend that Ofcom gets on and does that as quickly as possible. We must make safe spaces for children online. How do we do that? Part of the answer is ensuring that content is related to ratings that we already understand as parents, such as those from the British Board of Film Classification. I have been asking YouTube what rating YouTube Kids has for about a year now. Is it rated U? Is it 12A? Is it 15? It cannot tell me because it does not do things on that basis.

As a parent I want to know the rating before allowing my children on an app, because parents have a role in this as well. All apps should be rated like video games. Roblox has a 5+ rating, which does not exist in video game rating systems. We see ratings such as 4+ or 9+, but those are made up. At the parents’ forum that I held after the survey, one parent said that she walked in on her nine-year-old playing “guns versus knives”—on an app that is rated 5+. The ratings on apps mean nothing, yet we have video game ratings that we as parents understand, so why are they not used? Should in-app purchases ever be allowed for young children? What is the age at which in-app purchases should be allowed in a game?

We must consider the time limits for the different stages of brain development. We have guides on fruit and vegetables that recommend five a day to parents. We all know that. Schools use the same language, we use the same language, yet we have nothing to support parents in deciding how long a child should be online at different stages of brain development. I hope that the evidence that the Science, Innovation and Technology Committee collects will help inform that.

We need to change addictive and radicalising platform algorithms. To protect children from child sexual abuse images, we need to talk to those behind iOS and Android to stop the creation of self-generated child sexual abuse images—some 70% to 80% of child sexual abuse images are self-generated—and we need to stop end-to-end encrypted services from sharing them. We have technology that can do that. We should always keep the ability to ban in our back pockets, but any ban should be for particular apps. We should not ban our children and young people from having an online experience that is good.

17:44
Wera Hobhouse Portrait Wera Hobhouse (Bath) (LD)
- View Speech - Hansard - - - Excerpts

Across the country, the dangerous synthetic drug Spice is being brazenly marketed to children over social media. Many vulnerable young people believe that they are buying the less harmful, though still illegal, drug THC only to discover, too late, that what they have been sold is a far more potent and unpredictable substance. In schools, the consequences are already visible. One in six vapes confiscated from pupils now contains Spice—one in six! If we walk through parts of our towns and cities, we see the human cost: Spice users slumped in doorways, trapped in a semi-conscious state, stripped of dignity and control. How terrifying it is that this drug is no longer confined to our streets and prisons, and has entered our classrooms.

Children are collapsing in school corridors. Some are rushed to intensive care and others begin a battle with addiction that may follow them for life. Spice is not simply another illegal drug. Its extreme potency and addictive grip are a fast track to exploitation and criminality. It is always a tragedy when someone falls victim to substance abuse, but when it is an uninformed child who has been misled and targeted over social media, it is not just tragic; it is a profound failure to protect.

I have raised the issue in the House and with this Government repeatedly over the past year and a half, but in that time the situation facing vulnerable children has not improved, but deteriorated. Gone are the days when a young person had to meet a dealer in a dark alley to buy drugs. Today, a child can purchase them from their bedroom, with a few taps on a phone. The marketplace has moved online and our children are paying the price. But do not just take my word for it. The Metropolitan police have warned about children accessing illicit vapes through social media platforms, such as Snapchat and Telegram. A recent BBC investigation revealed how effortlessly an illegal vape laced with Spice can be purchased over Snapchat.

This is not a few small-scale individuals. We are dealing with a global, industrial supply chain, with major chemical suppliers in China providing materials to markets in the UK, the European Union, the United States and Gulf states. Researchers at the University of Bath have identified nearly 10,000 accounts involved in the supply and distribution of Spice, many using TikTok to advertise and communicate. I have met a number of Ministers about this issue, most recently the Minister for Online Safety, who is in his place. I know he understands the scale of the problem and is sympathetic to our concerns, but words are not enough: we need action.

Selling drugs is already a priority offence under the Online Safety Act, and Ofcom has a statutory duty to enforce that. Yet despite clear, sustained evidence that these substances are being openly advertised and sold online, we have not seen the decisive enforcement that the law requires. Instead, the burden is falling on members of the public to report these accounts, effectively asking individual citizens to do the regulator’s job for them.

What happens when an account is removed? Within hours, a near identical profile reappears. An account named “Spice Sales 1” is reported and taken down, only to resurface as “Spice Sales 2”, then “Spice Sales 3” and so on. The name changes slightly, the branding shifts marginally, but the criminality remains the same. This revolving door of reactive takedowns is not a strategy—it is an admission that the current system is not working. If a shop in Bath were openly selling drugs through its front window, the police would intervene immediately. There would be no hesitation and no suggestion that the public should simply keep reporting it. So why, when the shopfront is digital and when the customers are children, are we not treating this with the same seriousness? It is time that we confronted this reality. Social media companies have developed incredibly sophisticated algorithms, as we have already heard this afternoon, that are capable of targeting advertisements to individuals with remarkable precision. They know what we watch, what we like and what we linger on, so it cannot be beyond their capability to deploy artificial intelligence to detect and prevent the sale of illegal drugs on their platforms.

Active detection must replace endless reactive reporting. The technology and resources exist, and the evidence is overwhelming; what is missing is political will and enforcement. It is time to hold social media companies to account, because the safety of our children demands nothing less.

Chris Vince Portrait Chris Vince
- Hansard - - - Excerpts

Will the hon. Lady give way?

Wera Hobhouse Portrait Wera Hobhouse
- Hansard - - - Excerpts

I was about to finish, but yes, I will.

Chris Vince Portrait Chris Vince
- Hansard - - - Excerpts

This is a genuinely friendly intervention. I am raising this point because I know that the hon. Member does a lot to champion and support people with eating disorders. I am completely changing the subject, but does she think that the rise of social media and online platforms has had an increased impact on people with eating disorders?

Wera Hobhouse Portrait Wera Hobhouse
- Hansard - - - Excerpts

I could go on forever about online harm, particularly with regard to eating disorders. It is Eating Disorders Awareness Week, and we will be having a debate on that. I hope that the hon. Gentleman will attend that debate, as he can then raise that point again.

Today I am talking about Spice, the responsibility of social media platforms and how we protect children. I therefore support the provision to bring in a Bill on protecting children from online harms, as proposed by my hon. Friend the Member for Twickenham (Munira Wilson). As I have said before, it is time for action; we can no longer dither and delay. I do not accept all the arguments saying, “Oh! Process this, that and the other.” If we really mean it and are really serious about this issue, we need to act now. I am pleased that my party is prepared to act and show the public that we want change.

17:49
Sam Carling Portrait Sam Carling (North West Cambridgeshire) (Lab)
- View Speech - Hansard - - - Excerpts

I commend the hon. Member for Twickenham (Munira Wilson) on bringing forward this debate, which is a really valuable opportunity to talk about this issue. I also thank the many hundreds of my own constituents who have written to me about this from a variety of perspectives—if I have not got back to them yet, I will do so shortly.

Social media has rightly been described as a wild west. I come to this debate as someone who grew up with it—it has been there all my life—but who thoroughly dislikes traditional social media. Were it not for the importance of it in my job, I would spend very little time looking at it.

We must start by clarifying what problem we are trying to solve when we talk about online harms. The way I see it, there are three main categories. First, there is harmful online content itself. Algorithms are feeding people things that they never asked for, and the evidence that misogynistic and other extreme and deeply wrong content is being pushed on to people is overwhelming.

Secondly, there is the online grooming of children. Everyone knows how serious an issue that is, particularly on some online gaming platforms aimed at younger children, such as Roblox, which has been mentioned previously. It is so bad, and the reaction of the relevant company is so poor, that vigilantes are now active on some of these platforms, conducting sting operations to catch paedophiles. Appallingly, Roblox responded to one such user, who has a YouTube channel under the name Schlep and got six child predators arrested, by banning him from the platform and threatening legal action. Clearly we cannot encourage vigilantism, but if that is the platform’s response when someone is trying to deal with their own failures, something is deeply wrong.

There is also the problem of addictive content. That has got far worse in recent years, with the rise of short-form content and the algorithms that fuel it. Apps such as YouTube can in some circumstances automatically default to their “shorts” function when opened, to be maximally addictive. Other addictive features are rampant, such as Snapchat streaks, which encourage children to open the app first thing in the morning and last thing at night to keep their streaks going.

As the hon. Member for Bath (Wera Hobhouse) just mentioned, there are rampant drug-dealing problems on Snapchat. Some of that is due to the way that it recommends friends to people. There are accounts with the most obvious pseudonyms that we can imagine, such as “snowforsale”, which clearly mean, “Add this account if you want to buy drugs from someone.” So little action is being taken on that issue.

Snapchat is not the only platform with this problem; it is rampant on Instagram as well. Quite recently, I came across an account that was very clearly selling marijuana-infused food, so I reported it, and Instagram did absolutely nothing. There is a real complacency and a lack of willingness to act in these companies that we have to deal with.

I should mention the rise of AI-generated fake content, designed either to mislead people or keep them hooked by showing fantastical things that do not work in reality. There is also the related issue of faked content more generally. There are horrendous examples of viral fake cooking recipes that do not work and could cause serious harm to people, such as by encouraging them to use a microwave in a way that could create something explosive. Online content creators such as Ann Reardon, who is an Australian YouTube creator, are doing amazing work to call that out and try to educate people, but the platforms do not have their backs; in many cases, they are actively undermining those creators’ work because the content they are trying to deal with is what is generating the most money for those platforms, due to its addictive nature.

I was very impressed by a video that my hon. Friend the Member for Bangor Aberconwy (Claire Hughes) put out the other day, exposing—if memory serves—a situation in which people working for an estate agent were recording videos in the homes of people who are from ethnic minorities, then packaging them to look like they are asylum seekers and saying, “Look at the great lives that asylum seekers have.” I encourage everyone to have a look at that video. The way that that content has been able to propagate online is atrocious, and I am so glad that my hon. Friend has been able to call it out.

All of these issues point to a situation that cannot go on. However, like my hon. Friend the Member for Milton Keynes Central (Emily Darlington), I am concerned that trying to solve them in one fell swoop with a ban for young people will not work, and could make some of the issues worse. Young people are incredibly digitally literate and digitally agile, and I am afraid to say that when a platform becomes unavailable to them, they can rapidly switch to another. A recent letter—I was glad to hear the Minister mention it in his speech—signed by the NSPCC, the Centre of Expertise on Child Sexual Abuse, the Molly Rose Foundation and a long list of other child safety experts raised exactly that concern, and referred to blanket bans as

“a blunt response that fails to address the successive shortcomings of tech companies and governments to act decisively and sooner.”

In my view, taking a named-platform approach to a ban is unworkable. I fear that young people and Ofcom will end up in a perpetual game of whack-a-mole, with children moving to other apps as Ofcom tries to follow and shut them down. I know which side I would have my money on in that game; our regulator is nowhere near quick enough. In March last year, I spoke in this Chamber about an app that I was aware of others using when I was a teenager, which essentially functions as a dating app for children but masquerades as social media. The app I refer to has now finally had its age limit increased to 18, but it took 10 years.

This is not just about Ofcom, either—I do not wish to criticise just Ofcom. Regardless of how quickly a regulator moves, I guarantee that our country’s children will move more quickly. Their digital literacy is far higher than they are often given credit for, which will make it much harder to regulate platforms and deal with harms. It will likely become easier for groomers to hide and to find victims, while it will become harder to regulate addictive features and to take action on harmful content. Even if that turns out to be less of an issue than I fear, arbitrarily setting the age at 16 just shifts the cliff edge that we already have to deal with—it does not teach people to deal with and recognise the problems. Frankly, the problems that social media creates are by no means limited to young people. I talked previously about AI-generated and faked content; it is primarily older people who are struggling to identify that content and are not equipped with some of the necessary skills.

In my view, blanket bans also risk serious damage to children aged between 13 and 16 for whom the ability to connect with others online is particularly important. Let me give a very personal example. Around the age of 13, I started to realise that my sexuality was not like those around me—I was not straight—but I was living in a rural community where there was not really anyone else to talk to about that who would understand. It was made worse by some of my early childhood being immersed in a deeply homophobic religious community. As such, finding people with similar experiences online to talk to and be able to provide mutual support was incredibly valuable for me, as it is for other LGBT people, as well as for neurodiverse children and others.

So what do I believe is the solution? We need a functionality-based approach. Through limiting certain functions and features, I am of the view that we can deal with harms without creating a situation where children—they are going to seek out ways to connect online regardless of the law—move on to less regulated platforms. In my view, we should look to restrict addictive functionality on those platforms; that might be linked to age, or it might be something we want to consider for people of all ages, because as I said, it is affecting people of all ages. The explosion of addictive, algorithmically driven short-form content over the past few years, as well as features such as Snapchat streaks that are actively designed to keep people hooked, provide no discernible benefit to society. Social media companies have proven unwilling to act on that front, so we must.

We also need to enforce existing age restrictions much better. We all know that plenty of people under 13—which is usually accepted as the current limit—are already using social media platforms that they should not be using. Earlier, my hon. Friend the Member for Milton Keynes Central mentioned the example of children on their parents’ accounts, which is so widespread.

There is a definitional problem about what actually is social media. Are we counting online gaming platforms such as Roblox? That can be joined from age five. There are ways to prevent children from accessing chat functions—parents can prevent younger children from doing so—but with the continuing prevalence of child abusers on the platform, those measures are clearly not working. We need to be very clear on what we mean by social media because of how much the definitions differ.

To conclude, I really agree with the Government’s approach in opening a meaningful consultation—a national conversation—on how we tackle online harms and on where the pitfalls in workability are, so that we can identify and deal with them. I look forward to engaging further with it alongside my constituents.

None Portrait Several hon. Members rose—
- Hansard -

Judith Cummins Portrait Madam Deputy Speaker (Judith Cummins)
- Hansard - - - Excerpts

Members will realise that time is knocking on. If they could keep their contributions to between five and six minutes, we should be able to get everyone in.

18:00
Claire Young Portrait Claire Young (Thornbury and Yate) (LD)
- View Speech - Hansard - - - Excerpts

As a society, we are raising the first generation of children who spend less time outdoors than prisoners do. Ministry of Justice guidelines state that all prisoners in the United Kingdom should have a minimum of one hour in the fresh air each day, yet research tells us that a worrying number of our children do not meet even that threshold—because they are confined not by bars and locks, but by screens.

Astonishingly, the Centre for Social Justice found that up to 800,000 children under the age of five are already using social media. The Association of Play Industries has released a report that highlights just how little our children are moving. The research places adolescent social media use at three to five hours every single day—a figure that has grown by 50% in under a decade. They socialise through social media, form their identities through social media and build their understanding of the world through social media, but what they find there is not a safe or nurturing space.

Children on TikTok encounter harmful content every 39 seconds. By the age of nine, one in 10 children has already been exposed to pornography. By age 11 that figure rises to more than one in four. This is not accidental exposure; it is a predictable consequence of placing children in unregulated digital environments and hoping for the best. Beyond the content itself, the very act of compulsive scrolling is taking a toll. Children are exhibiting the hallmarks of addiction, including anxiety when separated from their devices. They are experiencing declining attention spans, disrupted sleep and a growing inability to engage with the world in front of them.

Every hour spent staring at a screen is an hour not spent outside—not spent running, exploring, falling over and getting back up. It is an hour not spent in the kind of unstructured, unscripted play that builds resilience, creativity and social intelligence in ways that no algorithm can replicate. This was beginning to happen when my own children were young. At that time, a significant push factor was the fear that children were not safe roaming freely in the physical world. It is a sad irony that the very social media platforms that are providing an ever-increasing pull factor have created an online environment that puts children at risk of harm. Stranger danger is now online.

A 2025 University of Exeter study of 2,500 children aged seven to 12 found that 34% did not play outdoors at all after school on school days, and 20%—one in five—did not play outside on weekends either. Many children do not even get the hour promised to prisoners. The years that children spend outside are crucial. To replace that with an unregulated and harmful social media environment is to actively harm our children. That is why I support stopping under-16s from accessing harmful social media and why it is crucial that we get this right.

Getting it right means focusing on reducing harm. A blanket approach risks removing children’s access to helpful user-to-user platforms such as Childline. We must also ensure that we do not let young people loose at age 16 into a wild west of social media without any training in the safer foothills. These are concerns that charities such as the NSPCC have raised and which a film-style age rating would address.

It also means ensuring that there is flexibility to deal with future developments, such as AI chatbots, which have already been mentioned by a number of Members. We got into this situation because change in the tech world has far outpaced both our ability to adjust as a society and our legislative process. That is why I welcome the careful, collaborative approach being put forward. The 643 constituents who have written to me on this issue would expect nothing less.

18:04
Sojan Joseph Portrait Sojan Joseph (Ashford) (Lab)
- View Speech - Hansard - - - Excerpts

This is an issue in which I have taken a close personal interest, because I have spent 22 years working in mental health services in the NHS. During that time, I have seen a gradual increase in mental health conditions, especially among young people. I do not want to say that this is all because of social media—there could be various reasons, such as 14 years of austerity, the cutting of NHS services or the closure of youth hubs—but I believe, and many studies show, that social media has played a role in the recent increase in mental health conditions and mental illness among young people.

This is not just because of online content; as many Members have said, it is also because screen time takes away young people’s social interactions with the rest of society. I am particularly concerned about the high rates of depression and anxiety caused by cyber-bullying and exposure to the dark side of the internet, to which our children have almost totally unfiltered access through the devices in their pockets.

I pay tribute to the work that has been done in schools across my Ashford constituency. In my visits to local schools, I have seen how effective measures, such as students locking their phones away in sealed pouches at the start of the school day, can ensure that mobile phones do not disrupt learning. Some of the studies done in those schools show that children’s academic work and behaviour have improved, especially their attitude towards teachers and fellow students.

Tom Hayes Portrait Tom Hayes (Bournemouth East) (Lab)
- Hansard - - - Excerpts

My hon. Friend makes a really important point about the restriction of phone use in schools, and he mentions pouches. I met Naomi from Smartphone Free Childhood Dorset last week, and she is concerned that the use of pouches reinforces the idea that children can have smartphones at school. Although access is mediated, this approach still accepts that smartphones can be present. She would prefer children to have brick phones at school. Will my hon. Friend comment on that?

Sojan Joseph Portrait Sojan Joseph
- Hansard - - - Excerpts

Local studies show that when smartphones are locked away, students do not feel that their phones have been taken away. They are still able to hold on to their phones in their pockets, but they are not able to use them. I agree with my hon. Friend’s suggestion that another option is for children to have brick phones, so that they can still make contact or send text messages but are not able to access online content. These sorts of things need to come out in the consultation, so that we know what works and what does not work. That is why it is important to have the consultation. Local studies have shown that locking away smartphones helps to improve students’ ability to concentrate, learn and socialise during the school day, and it has been welcomed by teachers and the overwhelming majority of students.

Last year, I went to see a performance of a play by young people in Kent. The play, “Generation FOMO”, explores the impact of smartphones and social media on young people, with a script drawn from interviews with people aged between 10 and 17. It is a powerful and moving piece of work that highlights some of the harms associated with smartphones and social media, as told by young people themselves. “Generation FOMO” has been performed in schools across Kent, and I know that it has been incredibly well received by teachers, young people and parents. After seeing the impact that the play has had locally, I was delighted to bring the cast to Parliament at the start of January, so that they could perform it to parliamentarians and other policymakers in Westminster.

Following that performance, I joined many of my colleagues in writing a letter to the Prime Minister to ask the Government to take steps to look into this area. The letter set out why technology firms, not parents or teachers, should take responsibility for preventing under-age access to their platforms. I therefore welcome the Government’s announcement of a swift consultation on what further measures are needed to keep children safe online. Ministers have been clear that the consultation is not about whether the Government will take further action, but about what the next steps should be. So I am particularly pleased that, alongside the formal consultation, the Government will run a national conversation to ensure that the views of parents, teachers and young people themselves are placed at the centre of future action.

Although I want to see further measures introduced, I believe that, in order to be truly effective, they must be evidence-based. As the consultation takes place, it is right that Ministers look at what other countries—particularly Australia—are doing to protect their children. Some Members have mentioned the loopholes and how children work around the restrictions, and this will be an opportunity to look at what is and is not working there, so we can get it right from the beginning.

In the meantime, last week the Government announced immediate action to make the online world safer for children, including a crackdown on illegal content created by AI. Some Members have talked about AI chatbots, which young people and other members of the public have been accessing for mental health help, which is dangerous.

We all want to see our children grow up healthy, confident and safe, which means ensuring that the digital world they now inhabit is built with their wellbeing in mind. The actions already announced by this Government are welcome, and I look forward to Ministers returning to the House soon with meaningful, evidence-based measures that will further strengthen the protections on the platforms that shape so much of young people’s lives.

18:11
Liz Jarvis Portrait Liz Jarvis (Eastleigh) (LD)
- View Speech - Hansard - - - Excerpts

I would like to start by thanking the hundreds of parents in my constituency who have written to me about this important issue.

For too long, tech companies have treated children as data to be mined, rather than young people to be protected. We cannot let social media bosses off the hook for the way they have normalised harm, prioritised profit and ignored warning signs. We cannot keep allowing them to act with impunity in putting our children at risk or continuing to escape scot-free while the consequences of their business models are borne by families, schools and already overstretched public services.

As we have heard, harmful content and addictive algorithms are taking a profound emotional and psychological toll, contributing to rising levels of anxiety, depression and self-harm. The dangers cannot and should not be underestimated. It beggars belief that the tech companies have been able to operate without proper regulation.

At the heart of this debate must be the children and young people who have been subjected to appalling online harms. ChatGPT has reportedly given extremely harmful answers to young people experiencing a mental health crisis, while other AI chatbots possess capabilities to foster intense and unhealthy relationships with vulnerable users and to validate dangerous impulses. As we have heard, it has also been reported that ChatGPT and Grok chatbots are advising children with potential eating disorders on dangerous meal plans of just 600 calories a day. This is terrifying.

Although I recognise the difficulties in policing everything online, will the Minister consider establishing a cross-Government approach to ensure that mental health support is expanded and to equip public services to respond effectively to social media-related harms? Can he also clarify whether Ofcom is being given the resources to meet the scale of the challenge and oversee the rapid evolution of online technologies?

Liz Jarvis Portrait Liz Jarvis
- Hansard - - - Excerpts

I am going to carry on.

My constituent Anne, who is a teacher, told me that she sees the mental health ramifications and the impact on education of mobile phone usage in schools, and believes the only way to protect the future of children is to ban their exposure to harmful apps. After meeting a local headteacher a few weeks ago, I asked the Secretary of State what interim measures the Government are considering to help schools manage pupils’ access to social media on mobile phones. She stated in her response that phones should not be used in schools, but schools do need support to enforce this. Without stronger enforcement tools, clear national standards and practical support, it will be very difficult for schools to get a grip on social media use during the school day without spending money they simply do not have.

We must also be clear-eyed about the potential shortcomings of age verification schemes. Early reports from the Australian scheme have highlighted issues with security and privacy. Teenagers can still migrate to smaller apps, borrow credentials or find ways around age verification technology, all of which pose risks to their online safety. Can the Minister confirm that the Government’s consultation will rigorously examine how systems can be designed to minimise data collection and safeguard the privacy of young people?

Social media and online content have changed what it means to be a young person today. One constituent told me that her 11-year-old daughter travels to school every day with fellow pupils, but because those girls have smartphones and like to scroll on the journey, she feels they are not interested in becoming friends with her, resulting in low self-esteem, isolation and not having local friends. My constituent dreads the emotional impact that smartphones will have on her other child when he starts secondary school.

The onus should not be on children and young people to protect themselves from online harms—it should be on platforms to prevent it in the first place. As long as the owners of tech companies allow harmful material to flourish, children are essentially being asked to build resilience in environments engineered to expose them to harm. We can teach children about online safety, respect, decency, courtesy and healthy relationships at home and in the classroom, but that work is actively undermined online by algorithms that reward extreme content and by platforms that are too slow to remove illegal and abusive material. Ofcom should be strong in enforcing clear regulations that protect users.

Liberal Democrats were the first to call for a ban on harmful social media, alongside a future-proof film-style age rating system that focuses on the harms platforms pose. The widespread consensus in this House and across the country that something urgently needs to be done to stop children accessing harmful online content reflects the pressing desire for the Government to get a grip of this crisis. We must act now to protect children, hold tech giants to account and ensure that all children are safe online.

16:59
Gareth Snell (Stoke-on-Trent Central) (Lab/Co-op)

I will constrain my comments to three themes, and I want to start with policy. This has been a very interesting and wide-ranging debate. We have heard from many speakers across the House who have articulated the heartfelt and thoughtful concerns that all of us have about the pervasive way in which social media can influence our children, our friends, our families and young people in our society. I am the parent of a 15-year-old. I know what that battle is like—hearing the chirp of Snapchat going off every few seconds, it sounds like, some weekends, as my daughter and her friends communicate in the modern way, and trying to understand what she is doing on Roblox, the games she is playing, who she might be interacting with and the other platforms that, frankly, are alien to me, as someone who is past the age when that stuff makes much sense or is of interest.

The simple answer is to say, “We should ban it all—just lock them all away until they’re 16, and it will all be fine.” I worry about my daughter walking down the street—I worry about who she is going to meet when she is walking to school and her interactions in the physical world—but simply saying, “Right, you’re staying in your bedroom until you’re 35”, which we discuss on occasion, is not a solution to those real-world problems. Part of it is about how we help young people to understand the misinformation and disinformation that they are coming across, and it is also about the way in which we regulate the content that platforms share.

The part that has been missed today, in the many wonderful contributions from Members across the House, is that this is about not just the platforms that share the content but the creators who make that content in the first place—the people who go online to sow the seeds of hate and division: the homophobic content, the Islamophobic content, the antisemitic content that all too often is passed off as criticism of the Israeli Government, and the many far-right commentators in this country who put out toxic masculine culture commentary as though it is a reasoned point of debate. I understand what Conservative Members say about free speech, but we have always been a country and a society where it is not consequence-free speech—there are consequences to the things we say and the actions we take, and that is how we come to understand what the social norms are. We seem to have abdicated our responsibility for that in the online world.

I turn to my second point. The 15-year-old I mentioned in an intervention earlier was, in fact, my daughter, who has now given me permission to out her in that sense. The facilities that I enjoyed when I was in my teens simply do not exist any more. My daughter’s world is as much her online friends and sphere of activity as it is the physical world in which we live. Disconnecting people from that because we think it is unsafe does a disservice to them. I am also slightly worried about the impact of the fact that we are soon to legislate, I understand and hope, on giving 16 and 17-year-olds the right to vote—a policy that I think will mainly get cross-party support.

I like to think that the political literature that I push through letterboxes in my constituency is of such compelling interest that every young person will snatch it from the letterbox, read it and think, “That is why I am going to vote for Gareth at the next election.” I am sure that the Liberal Democrats’ Focus leaflets have the same impact on young people in their constituencies. The reality is, however, that young people do not read the direct mail that we send out. They do not read our leaflets, or at least not as much as they should. Many young people derive their information, news and views from social media. If we say, “You know what? We are going to cut it off”, where will we force those young people to go?

Chris Vince (Harlow) (Lab/Co-op)

I have not mentioned Harlow yet today, so I feel that I should. When I spoke to some young people at Mark Hall Academy in my constituency of Harlow—there we are, I have done it—about the potential social media ban, I was interested to hear what they had to say. They said, “We don’t care about Facebook”—because only old people like us use Facebook—but they did not want us to ban platforms like WhatsApp, which I had not thought of as being social media, although I suppose it is. Does my hon. Friend agree that it is important for young people’s voices to be heard during the Government’s consultation, so that we can understand their views on this issue?

Gareth Snell

Absolutely. I understand that my hon. Friend was a teacher in a previous career.

When I think of social media, I think of my Twitter account, which has been dormant for years; my Facebook account, which I use for the clips that all of us in this place are obliged to put out and then deal with the comments beneath them; and my WhatsApp, which it seems that every political party has to run with, because without it we would all stop talking to each other. My daughter would think of her Snapchat account. I too now have a Snapchat account with just one friend—her—and we use that to communicate when I am here and she is at home. It means that I get voice notes and little videos from her, and it is how we keep our weekend conversations going during the week.

We must ask ourselves where we draw the line. Members have mentioned access to YouTube. My daughter will freely use YouTube to help her with her homework. She goes to an all-iPad school, so much of the homework is set on iPads. Apparently the subject of screentime will form part of the consultation, and that should be genuinely considered. Will young people be told, “You cannot use your phone—it is the worst possible thing to have—but here is an iPad to look at for six hours a day, and if you get stuck on question 6, go to YouTube video 4 and follow the methodology”? On one hand we are sending one message, and on the other is something that is inconsistent with that approach. Let us be honest: the first job that all the children and young people we are talking about will have is going to be based on the use of some form of AI assistance, such as Copilot, and will depend almost entirely on the use of technology. We are going to have to think about how we integrate that sort of future-proofing into whatever regulation we produce.

My final point is about procedure. I am very sorry to return to that subject, because this has been an excellent debate. I went to the Public Bill Office—there is no Bill that is referenced in the motion. It is completely blank. I understand that the Liberal Democrats intend, if the motion is passed, to engage in a consensus-based process of writing a Bill in the next two weeks that we can debate and pass in one day. It is clear from what we have heard today—from the hon. Member for Winchester (Dr Chambers), who spoke so eloquently about the perils of eating disorders, from the hon. Member for Bath (Wera Hobhouse), who talked about the ability to sell drugs online, and from those on the Government Benches, including my hon. Friend the Member for Milton Keynes Central (Emily Darlington), who talked about the way in which young people interact—that, as I said earlier, this will be a complex piece of legislation.

The idea that we can complete a Second Reading debate in two hours and the full Committee and Third Reading stages in two hours, on a single day, which will include the discussion of amendments, is simply impractical. I genuinely hope that the content of today’s debate will lead to better legislation, as part of the national consultation that the Ministers are leading, but I think that doing this in such a truncated way, through a single motion and on a single day, will lead to bad legislation.

Dr Scott Arthur (Edinburgh South West) (Lab)

It is not just impractical; it is also anti-democratic. As Bills proceed through this place, there is interaction with our constituents who want to influence how we are thinking and how we are voting, so it is important for us to have time to discuss these matters with them as well as in the Chamber.

Gareth Snell

I agree with my hon. Friend; however, I would not say that it is undemocratic. I will be clear: I do not like the principles of Opposition parties taking over the Order Paper. I did not like it or vote for it when my party tried to do it when we were in opposition during the Brexit years, so I will not support it now. I will say that the next time a Minister stands up and says that we are moving at pace, I might pull my hair out—or what is left of it. What we need are some actual timescales for when things will happen. Otherwise, we will find ourselves talking in circles.

Today, we have been able to establish the core principles, which we would agree on. That is a good thing. I hope that when the Minister winds up, he can give a little flavour as to when the consultation will start and how we can all get involved. My hon. Friend the Member for Stafford (Leigh Ingham) and I will be doing events across our two constituencies with our colleges. That way, we can try and make sure that those views are harvested and fed in, and that a complex and nuanced issue gets the hearing it deserves so that we get the legislation right first time.

Madam Deputy Speaker (Judith Cummins)

Order. We have three more speeches left. I will start Front-Bench speeches at 6.40 pm, so let us start with a five-minute time limit.

18:23
Susan Murray (Mid Dunbartonshire) (LD)

We have all learned that if an online service is free, we are the product. That is the business model used by the social media giants. They track what we watch, what we click, what we like and what we fear. They then build detailed profiles of our behaviour and they turn that behaviour into data and advertising revenue.

These platforms do not just host content; they actively shape what we see. They promote material, target adverts and keep users online for as long as possible because the longer we stay, the more money they make, and that is where the problem lies. The algorithms used by social media giants are designed for engagement, not wellbeing. They feed on outrage, division and shock to keep us scrolling. Users are pushed towards more extreme content, not because it is true and not because it is healthy, but because it is profitable. We can see the consequences in the real world.

Young people in particular are vulnerable to these pipelines of harmful content. Misogynistic and extremist figures, including Andrew Tate, rise to prominence through social media ecosystems that reward provocation and repetition. What starts as healthy curiosity or pushing the boundaries in young people can quickly lead to radicalisation. Given the serious harms caused by under-regulated social media, we have a responsibility to act quickly in a defined timeline to protect children and young people.

Helen Maguire (Epsom and Ewell) (LD)

Last week, the chief executive officer of Meta took the stand in Los Angeles as part of a landmark trial examining Instagram’s impact on the mental health of young users. This highlights the confusion about who is responsible for what in the online space. We know that we need Government legislation, but we also need clarity on what social media companies are responsible for. Does my hon. Friend agree that it is time that we establish a clear framework and proper accountability so responsibilities are understood and the right people are held to account for any failings?

Susan Murray

I absolutely agree; it is important that it is clear who is accountable for the harms that occur. That is why I urge the Government to work with the Liberal Democrats to introduce age ratings for social media. As the hon. Member for Milton Keynes Central (Emily Darlington) so clearly laid out, ratings help parents and carers to keep children safe. We already accept them in other areas of life—not every film is suitable for every age group, so we rate them; not every game is suitable for every child, so we rate them. Social media should be no different.

If a platform, unasked, can expose a child to violent content, misogyny, self-harm content or extremist propaganda, it must not be treated as if it were harmless by default. This is not about banning social media or saying it has no value. It can be a brilliant tool for learning, as we have heard. It can help people stay in touch with friends and family, as we have heard. It can open up access to information, support and communities that people might not otherwise find. Yet those benefits do not cancel out the harms. We are not trying to get rid of social media, but we must take a sensible approach to ensure that multibillion-pound companies do not push products that maximise profits while our children pay the price. We regulate risk in other areas. We cannot be beholden to tech giants; we have a responsibility to regulate here.

18:29
Kirsty Blackman (Aberdeen North) (SNP)

This makes me more frustrated than just about anything else in this place: the levels of ignorance, stupidity and hypocrisy from so many people in here, specifically about children’s access to social media. I fully intended to support the Lib Dems’ position, but the longer their spokesperson, the hon. Member for Twickenham (Munira Wilson), spoke, the less I wanted to do so.

I do not believe that the Government’s position on this is 100% right. I am glad that they are having a consultation, but I do not like the way that they are amending the Children’s Wellbeing and Schools Bill, which is a devolved Bill, to change the territorial extent to bring that into scope. A Bill that we have not scrutinised, because it is a devolved Bill, will now have a reserved section in it. At the moment, the Bill does not apply to Scotland, apart from one clause; now, it will apply to Scotland, because it will include this. As we have not had the opportunity to scrutinise it, we have not been involved in that process. I do not think it is right that the proposed amendment should come forward in this way, although I appreciate why the Government are doing it. That is why I am asking for the amendment to be shared with us as soon as possible so that we can see it, because we have not had a chance to look at the Bill as it has gone along.

The Lib Dems have said that they have made their position clear. I have so far been able to find three amendments and new clauses to the Online Safety Act put forward by Lib Dems during its passage through the House. One of them was put forward by the Lib Dem spokesperson, who asked for an independent evaluation within 12 months of whether more platforms should be subject to child safety duties—this is the same party that is currently accusing the Government of kicking the can down the road, despite asking for a 12-month independent evaluation. There is hardly anything in the Lib Dems’ previous positions that helps me to understand their current position.

The Tory party’s position is totally incoherent, too. The Tories refused my amendment on reducing habit-forming and algorithmic features. They also refused my amendments on livestreaming.

By the way, before the Minister’s “Dear colleague” letter, livestreaming had been mentioned 53 times across the two Houses. A third of those mentions were me talking about how livestreaming for children should be banned. Before today, Roblox had been mentioned 32 times across both Houses—15 of those mentions were me saying that Roblox is not a safe platform for children.

I am massively in favour of improving the online world for children. I think social media should be about looking at videos of cats. I love videos of cats—they are absolutely brilliant. That is what it should be for. I also think it is a great place for children to interact with one another.

Like some others in the Chamber, I have been making the case that there are dangers on social media that can be easily tackled by changing the Online Safety Act. We could have got rid of those algorithmic features for children, for example. We could have got rid of livestreaming for children through the amendment I tabled. We could have got rid of children’s access to private messaging features with people they do not know, through another amendment that I tabled to the Online Safety Act.

I do not like the way the Government are doing this, though. They are proposing an amendment to the Children’s Wellbeing and Schools Bill, and then we will have secondary legislation that will, possibly, amend the Online Safety Act—I am not 100% clear on how it is going to go. I appreciate that there needs to be a consultation.

Before the 2024 Parliament, there were about three people in this entire place who had any grip of what the online world might have been like for children. One of them was the hon. Member for Newcastle upon Tyne Central and West (Dame Chi Onwurah), who talked about some of these things. I asked the Minister at the time whether Fortnite would be included in the scope of the Online Safety Act, and they said, “If there’s text chat.” Text chat in Fortnite—it is an online game! There is not enough expertise in this place. Much as I hugely appreciate the people who work on writing Bills and the work of some of the experts at Ofcom, they are not experiencing the online world that children are experiencing. That is why we need to listen to ensure that any changes that are made tackle the most harmful behaviours, places and functionalities on the internet.

I appreciate that the Government are trying to take action on this now. However, one of the few things that has made me cry in frustration in this place was one of the first things this Government did when they came in, when they brought in secondary legislation to categorise platforms and refused to include the small, high-risk platforms that had been added in the House of Lords. They said they were categorising as category 1 only platforms like Facebook, which meet a certain threshold. I was so frustrated by that choice by the Government.

There needs to be more listening and learning about where the actual dangers are, and taking action on them. Please, do that in consultation with those of us who do understand this. Please, listen to experts on this.

18:34
Caroline Voaden (South Devon) (LD)

It is clear that all of us here today want to see legislative change to protect our children online. There is no doubt about that. The only debate left is how we do it. We Liberal Democrats want to see some urgency, yet more than a month on from when the Secretary of State announced the consultation, we have seen absolutely nothing from the Government. The Minister today failed to answer my question about when we could expect to see that consultation. Given how the Government put pressure on the hon. Member for Whitehaven and Workington (Josh MacAlister) to water down his safer phones Bill back in 2024, they will forgive me for having little confidence that they are serious about legislating for this anytime soon.

There is a disconnect in our society where we assume that as soon as our children come home through the front door, they are safe from whatever harms exist in the outside world. We assume that they can happily hop on to the computer, boot up the PlayStation or relax and scroll on their phone, but in reality, the harms that children face online are far more significant, constant and pervasive than any that they face outside in the real world. Children, particularly vulnerable children, are at greater risk of grooming, seeing something violent or harmful, forming an addiction or damaging their mental or physical health from being online than they are from playing outside.

A 2025 survey by Internet Matters found that two thirds of children said they experienced harm online, that one in five had encountered violent content and that over a quarter of children had been contacted by strangers online. These days there are probably far more paedophiles sitting in a dark room in their underpants in front of a screen than there are waiting in the local park for a child to walk by. When Charlie Kirk was shot in the US, children as young as eight were watching that video within hours here in the UK. The bottom line is that parents are simply not aware of the dangers that their children are being exposed to online.

We have heard some brilliant contributions about the lack of socialising and playing outside, the damage to physical and mental health, the impact on eating disorders, the ability to buy drugs online and far more. I have talked a lot in this House about my belief that we should have a ban on phones in schools. One reason for that is that the evidence shows that when a secondary school has a total ban on having or bringing a phone into school, parents of children at the feeder primary schools are not under pressure to buy their children phones at the age of 10 or 11. By delaying giving children a device till 12, 13 or even 14, we give them some precious extra years when they can grow up a bit in the real world. Every single year counts at that age.

Mike Martin (Tunbridge Wells) (LD)

That is exactly what is happening in Tunbridge Wells. The secondary schools there have moved together to be smartphone-free, and now the primary schools are having that conversation with the parents.

Caroline Voaden

I thank my hon. Friend for his intervention.

Liberal Democrats are calling on the Government to ban harmful social media for under-16-year-olds by introducing age ratings similar to film classifications, so that we can rate the platforms according to the harm they present. We have talked a lot about this, and the hon. Member for Milton Keynes Central (Emily Darlington) and others raised the issue of Roblox. The harms of Roblox are clearly something that we need to be aware of. Our approach would include all user-to-user platforms such as forums and online gaming, including Roblox, to ensure that children were properly protected from harm wherever they were engaging with others online.

I want to say one thing about the importance of the online world to children. Back in 2003, my husband died and I was a very young widow. I did not know any other young widows, and I joined an organisation that had a chatroom. This was back in the dark old days when we had very static chatrooms; some Members are too young to even know what that is. Late at night, when I was on my own, that place was a real lifeline for me and a real connection to other people who had been through the same tragedy. My children got to know other bereaved children who had lost a parent. The charity that I later became chair of now has online forums where those bereaved children can speak to each other. They are probably the only kids aged five or six in their school whose dad has died, so it is really important for them to be able to have those conversations with other children. Although we talk a lot about the LGBT community, I know from my personal experience that my kids would have benefited from being able to stay in touch with the other kids that they met occasionally on weekends away if they had been able to chat to them online. So I absolutely know the value of these online spaces for children, but I am also aware of the danger.

Our approach is supported by 42 charities and experts, which work with children, on violence against women and girls, and on online safety. We believe that it is a valid proposal. We want everybody to come together in this House. We want to work cross-party. We know that we need to legislate, and we want to do so together because we owe it to our children—we are the adults in the room. We have to protect them, and we have to do it now.

Madam Deputy Speaker (Judith Cummins)

I call the Liberal Democrat spokesperson.

18:39
Victoria Collins (Harpenden and Berkhamsted) (LD)

I have been quite shocked at some of the procedural discussion for several reasons. First, we are acting like this has just come up, but even in the House of Commons under this mandate, as my hon. Friend the Member for South Devon (Caroline Voaden) mentioned, the safer phones Bill was put forward in 2024. As Liberal Democrats, we put forward amendments to change the age of data consent and to ban addictive algorithms. There have also been calls to act on doomscroll caps, and we have highlighted the harms of AI chatbots. Yet we are at a point—I absolutely respect what the hon. Member for Aberdeen North (Kirsty Blackman) was saying on this—where a consultation was proposed by the Government over a month ago, but we still do not know the details. There are things going through the House of Lords that, again, we do not know the details of. At the very least, Liberal Democrats are trying to give the space for that and say, “Yes, we need to start putting forward that legislation.” If there is another chance to debate that, what is the harm in this motion, given that this is such a crucial issue?

Secondly, it is not as if this is an issue that turned up yesterday. As the hon. Member for Newcastle upon Tyne Central and West (Dame Chi Onwurah) talked about, these harms have been happening for years—over 22 years for Facebook. I will go on to say more about that in a moment. Other countries around the world are showing leadership on this and saying that we have to act now. My point is that at the very least, a consultation could have been launched earlier. This is not something new in this Parliament. We are saying that action needs to be taken.

Most importantly, the parents, children and experts watching this debate want to see us taking this issue seriously. Children and young people are at the heart of this. I think back to the first time I met some of the sixth-form students at Ashlyns school in Berkhamsted. I will never forget sitting around that table with one sixth-former—let’s call him James. He told me about his fears for the mental health of his friends. He warned about the self-harm that he was seeing among his peers, which his teachers were not even aware of, and he talked about the role of social media. A few weeks later, I was pulled to one side at St George’s school in Harpenden, where some young women shared with me their concerns about the growing misogyny lived out by young men, which started on social media.

Since then, I have carried out a “Safer Screens” tour meeting young people. Students have talked about brain rot and seeing the extreme content that the algorithm continues to push on them, even when they try to block it—the hon. Member for North West Cambridgeshire (Sam Carling) talked about that. One student said, “It is as addictive as a drug”, and they see the harms of it every day.

This is the tipping point, and I am surprised that many Members think that it is not. This is that moment. Parents, teachers, experts and even young people are crying out for action, and have been for a long time, to tackle the social media giants that have no care for their mental health. As I said, this tipping point has been years in the making. Facebook was launched 22 years ago. Indeed, a Netflix documentary from six years ago started to highlight the warnings from people who worked in tech about social media. One expert said that it is

“using your own psychology against you.”

Having worked in tech myself, I have read the books and received the training on how these social media giants get us hooked—it is built in.

Awareness is growing. I thank Smartphone Free Childhood, Health Professionals for Safer Screens, the Molly Rose Foundation, the Internet Watch Foundation and the Online Safety Act Network, along with projects such as Digital Nutrition—the hon. Member for Milton Keynes Central (Emily Darlington) and others have made the analogy of an online diet—that have worked to ask what the guidance should be. Those are just a few of the organisations I could name that have worked tirelessly to ensure these voices are heard.

I also thank pupils in my constituency from Roundwood Park, St George’s, Sir John Lawes, Berkhamsted and Ashlyns schools, and students who have openly shared their experiences, hopes and concerns about the online world. Their concerns are not just about content; they are also about addiction. Let me be clear: as my hon. Friend the Member for Mid Dunbartonshire (Susan Murray) mentioned, the core of this issue is that this is the attention economy, so our children are the product. Their attention, focus and time are being sold to line the pockets of tech billionaires. Governments around the world are finally taking action. This is a seatbelt moment where we need to say, “Enough is enough.”

The hon. Member for Stoke-on-Trent Central (Gareth Snell) talked about trying to get this right. I respect that, but I often think that if we were able to walk down the street and see a 3D version of what young people are seeing in their online world, action would have been taken much sooner. My hon. Friend the Member for Eastleigh (Liz Jarvis) talked about holding tech companies to account. We need to start unpacking what children are seeing and finally take action.

The Online Safety Act has done great work, but it does not go far enough. It sets out illegal harms and a code for inappropriate content for children and over-18s, but not a framework of legal harms or age-appropriate content. The social media age of 13 is based on data processing that is managed by the Information Commissioner’s Office and has nothing to do with what is age-appropriate in that context. Dr Kaitlyn Regehr, the author of “Smartphone Nation”, talks about how the Act is reactive, not proactive, and leaves it up to the user to report problems rather than putting the burden of safety on tech giants.

We must ensure that we build on the OSA and learn the lessons from Australia. The hon. Member for Milton Keynes Central talked about this. In Australia, a wide definition of social media has left it to a small group to decide what is appropriate. That has meant that YouTube has been banned for under-16s, but YouTube Kids has not, with no real framework explaining why, apart from the fact that they deem YouTube Kids safer. WhatsApp has not been banned, which is possibly the right thing, but legislators are left to play whack-a-mole as new social media apps pop up. There is no framework for harm from AI.

Kirsty Blackman

Will the hon. Member give way?

Victoria Collins

Very briefly; I want to leave the Minister time.

Kirsty Blackman

Australia just bans children from holding accounts; it does not ban them from using any of the platforms. They can still use YouTube; they just cannot have an account.

Victoria Collins

Absolutely. YouTube is everywhere. It is embedded in almost every website that has videos.

The hon. Member for Aberdeen North (Kirsty Blackman) asked about AI chatbots. Under the proposals we put forward in the Lords, AI chatbots are the user-to-user services, and Ofcom has clarified as much. However, we have highlighted for a long time that the potential harms from AI chatbots, such as the AI psychosis that my hon. Friend the Member for Winchester (Dr Chambers) alluded to, are not covered. That is why the harms-based approach we are putting forward is so important.

As my hon. Friend the Member for Twickenham (Munira Wilson) said when she opened the debate, the Liberal Democrats have been leading the work on online safety in this Parliament. We were the first party to push a vote on banning addictive algorithms. We have called for health warnings and a doomscroll cap. Today, we are calling for a vote on the age for social media and online harms. We are calling for a ban on harmful social media based on a film-style age rating. That harms-based approach holds tech companies to account, sets a pioneering approach to online standards and prepares for the future of AI chatbots and games like Roblox—a future that has already arrived.

In the offline world, anyone buying a toy for young children expects age ratings so that they know it is appropriate and safe, and films have had age ratings for over 100 years, yet we have no equivalent in the online world. The harms-based approach is backed by 42 charities and experts who work to protect children, stop violence against women and girls and make the internet a safer place.

We are also calling for a reset, because enough is enough. That includes a minimum age of 16 for social media and real accountability for tech companies with film-style age ratings. We need to make sure that we get the best out of the internet for young people and protect them from harms.

For me, it comes back to James, his friends and the young women and children I have spoken to around my constituency. We do not have time to waste—that is why we are pushing for these Bills. We are calling for action, and I call on MPs across the House to put children before politics, exactly as we did in the Lords. The amendment in the Lords could mean a blanket ban. We were uncomfortable with that approach—we much prefer ours—but we knew that the future of children came first. We must help the next generation to get the best of the online world—including those young people who have spoken out and shared their concerns and horror stories—and protect them from the worst of it.

18:49
The Minister for Digital Government and Data (Ian Murray)

I thank the Liberal Democrats for securing this debate, although I am slightly disappointed by the way in which that was done. I will not concentrate on procedural issues, but it seems to me that the argument is to give the Liberal Democrats the freedom of the House to introduce a piece of legislation that they want to work on while already having all the answers.

The use of a procedural motion for this serious debate is rather unfortunate. I think that has been demonstrated in the strength of feeling in the debate. I am completely and utterly split. The shadow Secretary of State, the hon. Member for Hornchurch and Upminster (Julia Lopez), asked us to give an opinion, but I do not really know what the best thing to do is. I have a five-year-old girl and a one-year-old girl. The jobs that they will do when they are any of our ages have probably not even been invented yet. I want them to be able to live their lives, and to exploit, experience and enjoy social media and what new tech has to offer, but I want them to do so safely. Denying them that opportunity might not be the answer, but that is why the consultation is in place.

The hon. Member for Twickenham (Munira Wilson), who opened the debate, mentioned her own children and the daily fight between screentime, being online and doing other things. I am sure that I had the same fight with my own parents when they tried to turn the television off at night, so this is not a new battle, but it is a battle that parents will win—whether we negotiate, bribe them with sweets or something else. This highlights the importance of the approach that we are taking, which allows proper consideration of a range of views, and that work is urgent.

I have a whole stack of incredibly sobering but also very contradictory statistics here, which is why consultation and a national conversation are important. Some 99% of 12 to 17-year-olds reported that they benefit from being online. One statistic suggests that half of parents think that the benefits of children being on social media outweigh the risks, while another says that just three in 10 parents think that.

Looking at that data—there is a whole host of it in that context—we might think that this is a difficult issue to resolve, but then we come to the child sexual exploitation and abuse issues. There were 41,000 obscene publication offences in December 2024—an 860% increase in a decade—and 42,000 in September 2025, a near 1,000% increase since 2013. Some 91% of child sexual abuse material found online is self-generated, often under pressure from manipulation. Let us be quite clear, because some Members do not get this: it is illegal to create, possess or distribute child sexual abuse images, including those generated by AI, regardless of whether they depict a real child. That is already against the criminal law. The Online Safety Act requires in-scope services to assess the risks to their users of child sexual abuse material. That cannot be clearer.

Sir John Hayes

Of course, the Minister is right that there is all kinds of data about online activity, but what is plain and truthful is that academic evidence suggests not only that there are risks of the kind that he has just described—of abuse, exploitation and so on—but that children’s very consciousness is being altered, including their ability to socialise, to learn and to comprehend. That of itself requires the Government to act, for a generation of children are being exploited by heartless tech companies that are careless about the damage they do.

Ian Murray

I appreciate the right hon. Gentleman’s intervention. [Interruption.] I am sorry to upset my hon. Friend the Member for Stoke-on-Trent Central (Gareth Snell). The Government are acting at pace, but we want to act in the right way. We must act in the right way because this is such a complex and serious issue. It is important for children to be able to seize the opportunities that being online can offer. We have heard about iPad-only schools. Parents must be confident that their children are safe—that is key. If we do not want to exclude children from age-appropriate services that benefit their wellbeing, we must act on the evidence and ensure that we strike the right balance between protecting children’s safety and wellbeing, and enabling them to use technology in positive and empowering ways.

Dame Chi Onwurah

Does my right hon. Friend share my disappointment that, in this debate on protecting children from some of the most obscene abuse, not one Reform Member is present?

Ian Murray

And the Reform party wants to dismantle the Online Safety Act. We are trying to resolve its potential imperfections, get it implemented and take it forward through the consultation, but Reform wants to scrap it. I have already said that anybody who thinks that the illegal possession and creation of child sexual images is acceptable under the banner of free speech does not deserve to be sitting in this House.

I have only a few minutes, and many Members made superb contributions; let me run through some of them. I have no idea whether the shadow Secretary of State supports the Online Safety Act or taking this proposal forward to a consultation, but we have to be evidence led. We share a desire to keep children safe online, but she has politicised the whole issue. It is not contradictory to want our children to have access to social media and benefit from it, and want to protect them; in fact, most Members said that they want to do that.

Lots of Liberal Democrat Members asked about timelines. Let me just say to them that there are not too many sleeps to go before they will see the consultation. Crucially, the Government will table amendments to the Children’s Wellbeing and Schools Bill that will allow us to implement the outcomes of the consultation through secondary legislation within months—before the summer, as my hon. Friend the Member for Vale of Glamorgan (Kanishka Narayan) said. There is a primary legislative vehicle there already, and we can introduce the secondary legislation later.

The hon. Member for Aberdeen North (Kirsty Blackman) talked about the legislation not having been scrutinised. That is the position of the SNP—it does not scrutinise legislation on matters that are devolved to the Scottish Parliament—but that is the SNP's decision, not the decision of this House, whose procedures do not require it.

I pay tribute to the Chair of the Science, Innovation and Technology Committee, my hon. Friend the Member for Newcastle upon Tyne Central and West (Dame Chi Onwurah), and the Committee members for all their work on this issue. They are experts, and I have enjoyed listening to their speeches.

The hon. Member for Bath (Wera Hobhouse) mentioned advertising and drugs policy. As a joint Minister for the Department for Culture, Media and Sport, I chair the online advertising taskforce, which is trying to deal with issues relating to online advertising. We are working with the Home Office on a fraud strategy and are trying to ensure that online advertising spaces are well regulated and looked after.

The hon. Member for Aberdeen North was absolutely right to talk about eating disorders and suicide. I have heard some absolutely horrific stories—unimaginable stories that nobody could write in their wildest dreams—about online grooming, mainly of young girls, and mainly on platforms on which they are already vulnerable due to eating disorders. We have to deal with that. The live streaming of suicides is creeping up, and we have to do something about that. That is what this consultation is about.

The hon. Member for Winchester (Dr Chambers) raised huge issues including eating disorders and AI chat. That is why chatbots will be included in the consultation. My hon. Friend the Member for Milton Keynes Central (Emily Darlington) does a lot of work in this area—she is one of the pioneers in bringing these issues to the House. Algorithms will also be part of the consultation.

However, young people do need to see what social media is. To me as a 50-year-old, it is Facebook and WhatsApp, but it is not that to younger people. In fact, they frown when I talk about Facebook because they think it is old technology like slates and chalk.

My hon. Friend the Member for North West Cambridgeshire (Sam Carling) said that the platforms must do better to police themselves—they do have to do more—and, like many other Members, talked about their addictive features. Addictive features will be part of the consultation as well, so please do get involved in that consultation.

My hon. Friend the Member for Ashford (Sojan Joseph), along with many hon. Members, spoke about the correlations with mental health. The Department of Health and Social Care has a key role to play in that, as was mentioned by the Liberal Democrat spokesperson.

Ninety-eight per cent of primary schools and 90% of secondary schools already have “no phones in school” policies. We are clear that the guidance was not strong enough, which is why the Department for Education has published updated guidance on the use of mobile phones in schools. Because we want to be as clear as possible with schools, parents and young people that phones should not be used in schools, we reserve the right to put that on a statutory footing, should we be required to do so.

My hon. Friend the Member for Stoke-on-Trent Central (Gareth Snell) said that he uses Snapchat to communicate with his one and only friend on Snapchat: his daughter. I hope that she will not unfriend him after his contribution or he will have no one. He was right to talk about the procedural motion. This is such a serious issue, so we must ensure that we get that right.

Let me finish by talking about international social media bans and the experience of other countries, which many hon. Members mentioned.

In closing, protecting children online and ensuring that online life is as fulfilling as that offline is a responsibility that this Government take incredibly seriously. No child should have to navigate unsafe digital spaces or experience negative impacts on their health and wellbeing, and no parent should feel alone in making decisions to protect and support their children. Members should please get involved in the consultation, which is there to give proper consideration to the most effective ways forward to deal with these problems. It will be short and sharp. It will last for three months, and we have a legislative vehicle to take forward the proposals from that.

Question put.

19:00

Division 434

Ayes: 69

Noes: 279

Question accordingly negatived.

Deferred Divisions
Motion made, and Question put forthwith (Standing Order No. 41A(3)),
That, at this day’s sitting, Standing Order No. 41A (Deferred divisions) shall not apply to the Motion in the name of the Chancellor of the Exchequer relating to Charter for Budget Responsibility.—(Gregor Poynton.)
Question agreed to.