Debates between Catherine McKinnell and Roger Gale during the 2019 Parliament

Support for Ukraine and Countering Threats from Russia

Debate between Catherine McKinnell and Roger Gale
Wednesday 2nd March 2022

Commons Chamber
Catherine McKinnell

People want to do everything they can to help. Local communities are working incredibly hard to support those communities in Ukraine in every way possible here in the UK and in the neighbouring countries. I think everybody should do what they can to help through local organisations and advertised means. BBC Radio Newcastle, for example, has published a list of places in the north-east where people can offer support and donations. Everybody who wants to help can and should do so, because that is something we can all do today.

Sir Roger Gale

I discussed exactly that circumstance with the Secretary of State for Levelling Up, Housing and Communities yesterday. He will be issuing details about how we can go about that, because many communities clearly want to help. The hon. Lady will find that it is in the pipeline.

Catherine McKinnell

I thank the right hon. Gentleman. That is a powerful example of how important it is to work together on a cross-party basis in this House. We are all working in unity to stand up on this issue.

The debate is important because we know that President Putin is banking on cynicism and apathy to win the day. He has doubted the west’s outpouring of solidarity. He thinks that it will not last and that it will wane, and that in the longer term, we will not want to bear the economic costs of what it will take to continue to stand in solidarity with the Ukrainians. We need to show the world that we are better than that and that we will not wane. I say in all support that our Government need to ensure that any economic pain that we have to shoulder as a country is borne by those who can bear it. That is the responsibility of our Government.

Our country has done what is necessary to defend democracy on this continent before and we will do it again. I stand today to declare my support and that of the thousands of constituents who have contacted all hon. Members, and to ensure that it is known that we have that support.

Online Abuse

Debate between Catherine McKinnell and Roger Gale
Monday 28th February 2022

Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record

Sir Roger Gale (in the Chair)

I remind Members to observe social distancing and to wear masks as appropriate, please.

Before I call the hon. Member for Newcastle upon Tyne North to move the motion, I wish to make a short statement about the sub judice measures. There are issues pertinent to the debate that may be relevant to specific cases. I remind Members that, under the terms of the House’s sub judice resolution, no reference should be made to legal proceedings that are currently live before UK courts. For clarification, that applies to coroners’ courts as well as to law courts.

Catherine McKinnell (Newcastle upon Tyne North) (Lab)

I beg to move,

That this House has considered e-petitions 272087 and 575833, relating to online abuse.

Thank you for chairing this incredibly important and timely debate, Sir Roger. The way that we view social media in this country has changed dramatically in the 12 years that I have been in Parliament. In the early days, people saw it as a force for good, whether they were activists using Twitter and Facebook to organise when the Arab spring began in 2011 or just groups of likeminded people sharing photos of their cats and clips from their favourite video games. Many people took the anonymity it offered to be an unqualified positive, allowing people to share deeply personal things and be themselves in a way that, perhaps, they felt they could not be in the real world. All that potential is still there.

Social media has been an invaluable tool in keeping us connected with friends and family during these incredibly challenging two years, but the dark side of social media has also become depressingly familiar to us all. We now worry about what exactly these giant corporations have been doing with our personal information. We read research describing echo chambers that fuel political polarisation and we see the unfolding mental health impact, particularly on young girls, of the heavily edited celebrity images that always seem to be just one or two swipes away.

As the Putin regime’s disastrous invasion of Ukraine proceeds, Ukrainians do not just face the Russian troops who have entered their country. Moscow has, predictably, stepped up its misinformation campaigns on social media, with the intention of sowing confusion and undermining Ukrainian morale. Meanwhile, the ease of creating, editing and sharing videos that appear to document events has left many uncertain what to believe or where to turn for reliable information. The Ukrainian people are seeing their worst nightmare unfold in a tragic, bloody war that they did not want. Before I make my main comments, I am sure I speak for us all here in saying that our thoughts, our solidarity and our resolve are with Ukraine today. I know that many Members wanted to be in this important debate, but events and issues in the main Chamber have rather taken over.

What prompted the Petitions Committee inquiry into online abuse and the report we are debating is the growing concern expressed by petitioners about the abuse that people receive on social media platforms and the painfully slow progress in making online spaces less toxic. The scale of public concern is shown by the popularity of the e-petitions that our Committee has received on this subject in recent years, particularly the petitions created by television personalities Bobby Norris and Katie Price.

Bobby, who is sitting in the Public Gallery, is a powerful advocate and has started two petitions on this issue. The first called for the online homophobia of which he has been a target to be made a specific criminal offence. The second, which prompted our inquiry, was created in September 2019 and called on the Government to require online trolls to be blocked from accessing social media platforms via their IP addresses. It received over 133,000 signatures before closing early due to the 2019 general election.

Members will also be aware that Katie Price has spoken movingly about the vile abuse to which her son Harvey has been subjected. She and her mum Amy told the Committee that platforms fail to take down racist and anti-disability abuse aimed at Harvey and continue to respond poorly to complaints and reports about abusive posts. Katie’s petition was created in March 2021 and called on the Government to require social media users to link their accounts to a verified form of ID, so that abusive users can be more easily identified and prosecuted. The petition said:

“We have experienced the worst kind of abuse towards my disabled son and want to make sure that no one can hide behind their crime.”

It received almost 700,000 signatures by the time it closed in September 2021, with more than 500,000 coming in the weeks after the deplorable racist abuse aimed at England footballers last summer.

The inquiry that we have just concluded took place in the context of the Government’s draft Online Safety Bill. There is not time for me to talk through the Bill in detail today, but I know it will be the subject of intense scrutiny and debate in the coming months, and it is expected to impose new legal requirements on social media and other online platforms, including any platform that enables UK users to post content such as comments, images or videos, or to talk to others via messaging or forums online. I look forward to hearing the Government’s comments on that, although I appreciate that we await the publication of the Bill.

Online abuse is not something that just affects people in the public eye; it is something that most of us have at least witnessed, if not been subjected to ourselves. Ofcom’s pilot online harms survey in 2020-21 found that over a four-week period, 13% of respondents had experienced trolling, 10% offensive or upsetting language, and 9% hate speech or threats of violence. It is not an unfortunate side-effect of social media that victims can just shrug off. Although the abuse takes place online, we know that it can have a significant and devastating impact on the lives of victims offline. Glitch, which we spoke to as part of our inquiry, collated testimonies of online abuse as part of its report, “The Ripple Effect: Covid-19 and the Epidemic of Online Abuse”. One woman said:

“I shared a post on the rise in domestic abuse during lockdown and received dozens of misogynistic messages telling me I deserved to be abused, calling me a liar and a feminazi. My scariest experience, however, was when I shared a photograph of my young son and me. It was picked up by someone I assume to be on the far right, who retweeted it. Subsequently, throughout the day I received dozens of racist messages and threats directed at my son, at his father, and at me. It was terrifying.”

Sadly, distressing accounts of fear, isolation, difficulty sleeping, anxiety and depression are alarmingly familiar for people who are targeted for online abuse and harassment. However, the abuse is not directed equally, and the online world does not stand apart from real-world inequalities. Our inquiry found that women, disabled people, those from lesbian, gay, bisexual and transgender communities, and people from ethnic minority backgrounds are not only disproportionately targeted for abuse; often it is their very identities that are attacked. International research conducted by the Pew Research Center found that 70% of lesbian, gay and bisexual adults have encountered online harassment, compared with about 40% of straight adults.

We heard not only that incidents of antisemitic abuse have increased, but that Jewish women are disproportionately singled out for abuse. Similarly, although women are generally subjected to more online bullying than men are, ethnicity further influences a woman’s vulnerability. Amnesty International’s research suggests that black women are around 84% more likely than white women to be abused online. In this way, online abuse can reflect and amplify the inequalities that exist offline. It also reinforces marginalisation, discouraging the participation of such communities in online spaces. Demos, which we spoke to as part of our inquiry, catalogued the effect of misogynistic abuse on women’s mental health as part of its 2021 report, “Silence, Woman”. Many women quoted in the report talked of wanting to stop their social media presence altogether and leave activities that they otherwise enjoy. One said:

“At the moment, it makes me want to quit everything I do online.”

Another said:

“I can’t even look at social media because I’m so scared that I’ll see more sexism. It’s really affecting my mental health.”

It is essential that any measures to tackle online abuse also recognise and respond to inequalities in the volume and severity of that abuse. Therefore, our report makes several recommendations to Government. First, we recommend that a statutory duty be placed on the Government to consult with civil society organisations representing communities most affected by online harassment. These organisations best understand the needs of victims, and such consultation will ensure that legislation is truly effective in tackling online harms. Their involvement is an important counterbalance to the lobbying efforts of social media companies.

Secondly, we believe that the draft Online Safety Bill should align with protections already established in the Equality Act 2010 and hate crime laws, and should include abuse based on protected characteristics as priority harmful content. It should list hate crime and violence against women and girls offences as specific relevant offences within the scope of the Bill’s illegal content safety duties and specify the offences covered under those headings.

Finally, platforms should be required in their risk assessments to consider the differential risks faced by certain groups of users. Those requirements should be made explicit in the risk assessment duties set out in the draft Online Safety Bill. The evidence is clear: if someone is female, from an ethnic minority or from the LGBT community, they are much more likely to be abused online. Any legislation that assumes online abuse affects everybody equally, separate from real-world inequalities, does not address the problem. For the draft Online Safety Bill to be effective, it must require platforms to assess the vulnerability of certain communities online and tackle the unequal character of online abuse.

The related issues of online anonymity and identification of users also emerged as important and controversial issues, not only in our inquiry and the petitions that prompted it, but in the wider public and policy discussion about online abuse. The evidence we heard on the role of anonymity in facilitating abuse was mixed. Danny Stone of the Antisemitism Policy Trust, with whom I have worked closely as chair of the all-party parliamentary group against antisemitism, told us that the ability to post anonymously enables abusive behaviour and pointed to research demonstrating disinhibition effects from anonymity that can lead to increased hateful behaviour. Danny cited a figure suggesting that 40% of online antisemitic incidents over the course of a month originated from anonymous accounts. Nancy Kelley from Stonewall and Stephen Kinsella from Clean Up The Internet also argued that anonymity should be seen as a factor that increases the risk of users posting abuse and other harmful content.

However, other witnesses took different views, arguing that evidence of a causal link between anonymity and abusive behaviour is unclear. Chara Bakalis from Oxford Brookes University argued that

“focusing so much on anonymity and trying to make people say who they are online”

risks misconstruing the problem as a question of individual behaviour, rather than the overall toxicity of online spaces. We also heard that the ability to post anonymously can be important for vulnerable users. Ruth Smeeth from Index on Censorship told us how valuable it is for victims of domestic abuse to be able to share their stories without fear of being identified, and Ellen Judson from Demos warned that there is no way to reduce anonymity in a way that only hurts abusers.

Tackling the abuse itself, whether or not it comes from anonymous users, should therefore be the focus of efforts to resolve this problem. Allowing users to post anonymously always entails a risk. We recommend that online platforms should be required to specifically evaluate the links between anonymity and abusive content on their platforms, in order to consider what steps should be taken in response to it.

A related question is whether users should be required to identify themselves if they want to use social media, as a way of preventing online abuse. On Friday, the Government announced that the draft Online Safety Bill would require the largest social media companies to allow users to verify their identities on a voluntary basis, and users will therefore have the option to block other users who choose not to verify their identity. This is a positive step forward, giving users control over who they interact with and what they see online.

However, that would not be a silver bullet and should not be presented as such. It is an extra layer of protection, but it should not be the main focus for tackling online abuse. It absolutely does not absolve social media companies of their responsibility to make online spaces less toxic, which must be our focus, nor is it without risks. The Committee heard counter-arguments that, because users would have to choose to block unverified accounts, domestic abuse victims and other vulnerable users might be less willing to verify themselves, and their voices would therefore not be heard by other users. When Ofcom draws up its guidance, it must therefore offer ways to verify identity that are both as robust and as inclusive as possible.

Bobby Norris’s petition argues that it is “far too easy” for social media users who have been banned to simply create a new account and continue the abuse. Katie Price and her mum Amy also raised the issue of banned users who seemingly have no problem returning and behaving in the same appalling way. The major social media platforms told us that they already have rules against previously banned users returning, as well as the tools and data to identify users and prevent them from starting new accounts. However, the evidence that we heard does not support that.

Our inquiry found that preventing the return of abusive banned users is not a priority for social media companies, and some users are taking advantage of the lax enforcement of bans to continue abusing their victims. That is a significant failing, and platforms must be held accountable for it. Robust measures must be put in place to require social media platforms to demonstrate that they can identify previously banned users when they try to create new accounts and must discourage—or, even better, prevent—such accounts from posting abusive content.

Where a platform’s rules prohibit users from returning to the platform, they should be able to show that they are adequately enforcing those rules. The regulations must have teeth, so we also recommend that Ofcom should have the power to issue fines or take other enforcement action if a platform cannot demonstrate that.

We also took evidence from the Law Commission, which has recommended the creation of two new offences covering online communications. The proposed introduction of a harm-based offence would criminalise communications

“likely to cause harm to a likely audience”,

with harm defined as

“psychological harm, amounting at least to serious distress”.

An additional new offence covering threatening communications would criminalise communications that convey

“a threat of serious harm”,

such as grievous bodily harm or rape.

We also heard that if the proposed new offences were introduced, some abusive content may be treated as

“a more serious offence with a more serious penalty”

than if it were prosecuted under existing law. The Committee believes that is a positive step forward that would better capture the context-dependent nature of some online abuse. A photograph of someone’s front door, for example, may be entirely innocent in some contexts, but can take on quite sinister connotations in others, where it quite clearly implies a threat to a person’s safety.

The Government should also monitor how effectively any new communications offences, particularly the Law Commission’s proposed harm-based offence, protect people and provide redress for victims of online abuse, while also protecting freedom of expression online. The Government should publish an initial review of the workings and impact of any new communications offences within the first two years after they come into force. We have to make sure we take this opportunity to get this right and review it within two years to make sure it is as effective as it can be.

The Law Commission also recently concluded a review of hate crime law. It acknowledges two points highlighted in the Petitions Committee’s 2019 report: the unequal treatment of protected characteristics in hate crime law, and the failure to classify abuse of disabled people as a hate crime in cases where the offence may have been motivated by a sense that disabled people are easy targets, rather than being clearly motivated by hostility to disabled people.

The commission recommended extending existing aggravated hate crime offences to cover all characteristics currently protected under hate crime law, and reforming the motivation test for an offence to be treated as a hate crime, proposing an alternative test of motivation on the grounds of “hostility or prejudice”. The Government have stated that hate crime offences will be listed in the draft Online Safety Bill as priority illegal content. That means that the legislation will require platforms to take steps to proactively prevent users from encountering hate crime content.

There is some confusion, however, as we do not yet know if this will be limited to the existing stand-alone racially and religiously aggravated and stirring up hatred offences, or if the intention is to require platforms to proactively prevent users from encountering, for example, communications that involve hostility based on a protected characteristic such as disability. When the Minister responds, will he tell us what the Government expect the practical impact to be on how platforms are required to deal with, for example, the abuse of disabled people online?

Our inquiry heard again and again that changes to the law on online abuse risk becoming irrelevant, when we lack the capacity to even enforce the law as it stands. The uncomfortable truth is that, despite the dedication of our officers, police resources have been diminished to the point where even relatively simple crimes in the offline world go unsolved more often than not, according to Home Office statistics. Meanwhile, the proportion of reported crimes leading to successful prosecutions has reached an all-time low.

It is not surprising that we found such scepticism about the state’s capacity to enforce a law against criminal online abuse, which, in many cases, will be complex and time-consuming to investigate. Ruth Smeeth gave the following evidence to the Committee:

“When I got my first death threat in 2014, at that point the police did not have access to Facebook. It was banned…Although they can now see it, they do not have the resources available to help them prosecute. Whether the legislation is amended or not, it is so incredibly important that the criminal justice system can do its work. To do that, they need resources.”

While we believe the Law Commission’s proposals are eminently sensible, we are deeply concerned that the inadequate resourcing of our police and criminal justice system is the real elephant in the room. It could prevent us from dealing with the most serious forms of online abuse, such as death threats, the sending of indecent images and illegal hate speech.

I suspect that the Treasury is unlikely to look favourably on this resourcing issue any time soon, but the Committee would be neglecting its duty if we failed to draw attention to it. Resources in the police and criminal justice system have to be an essential part of the conversation on tackling online harms. If the Government are serious about tackling the most serious forms of online abuse, they must ensure that our police and courts have the resources to enforce the laws against it.

Although we talk a lot about Twitter, Facebook and TikTok in these discussions, abusive content hosted on smaller platforms also plays a significant role in encouraging prejudicial attitudes and real-world harm. Some of these platforms have become safe havens for some of the most troubling material available online, including Holocaust denial, terrorist propaganda films and covid-19 disinformation. From an internet browser today, anyone can easily access videos that show graphic footage of real-world violence and allege the attacks are part of a Jewish plot, or find an entire channel dedicated to the idea that black people are a biological weapon designed to wipe out western civilisation—I could go on. Danny Stone of the Antisemitism Policy Trust told the Committee:

“It is not just the Twitters and Facebooks of this world; there are a range of harms occurring across a range of different platforms. It is sinister, we have a problem and, at the moment, it is completely unregulated. Something needs to be done.”

We have heard no evidence to suggest that the negative effects of abuse on people’s wellbeing or freedom of expression are any less serious because the abuse comes from a smaller platform. Failure to address such content would risk significantly undermining the impact of the legislation. The duties set out in the draft Online Safety Bill relating to content that is “legal but harmful” to adults must apply to a wide range of platforms to ensure that abusive content is removed from the online sphere, not merely shifted from the larger platforms to darker corners of the internet.

The Committee therefore recommends that the draft Online Safety Bill require smaller platforms to take steps to protect users from content that is legal but harmful to adults, with a particular focus on ensuring that such platforms cannot be used to host content that has the potential to encourage hate or prejudice towards individuals or communities. They do not get a free pass just because they are smaller platforms.

The Minister has previously indicated that the Government have considered amending the conditions for classing a platform as category 1, so that it covers platforms with either a high number of users or posing a high risk of harm to users, rather than both conditions having to be met, as is the case in the draft Bill. We would welcome an update on whether the Government are minded to take that forward.

Legislators have a way of making the debate around online safety sound incredibly complicated and inaccessible. However, the fundamental issue is simple: too many people are exploiting online platforms to abuse others, and not enough has been done to protect the victims and create online spaces where people are free to express themselves in a constructive way. In the offline world, there are rules on acceptable behaviour and how we treat other people. We invest huge amounts of time and energy into ensuring that those rules are followed as much as possible. The same simply cannot be said of the digital sphere.

The online world has changed dramatically in such a short time. Our laws and regulations have not kept up. They have allowed a culture to develop where abuse has become normalised. It was deeply troubling to hear in every single one of the Committee’s school engagement sessions that pupils believe that abuse is just a normal part of the online experience. Is that really what we want our children to grow up believing? We can do so much better than that.

Social media companies make so much money. It is not too much to ask that they invest some of that in ensuring that their platforms are safe, and that people cannot inflict enormous harm on others without consequences. Of course, there will always be some abuse and inappropriate behaviour online, and nobody expects any Government to prevent it all, just as no home security system could stop every clever and determined burglar, but we can certainly do a lot better.

The Committee welcomes the opportunity provided by the draft Online Safety Bill, and our report sets out several ways the Government can improve the legislation. Ministers must recognise the disproportionate way that women, ethnic minorities, people with disabilities and LGBT people are targeted, so that nobody feels they cannot express themselves or engage with others online. We need to hold the platforms accountable if they fail to prevent banned users from rejoining, and we must ensure our police have the resources they need to tackle the most dangerous forms of online abuse. We look forward to the Government addressing our recommendations when their formal response to our report is published, and to the Minister’s response today.

Social media offers such fantastic opportunities to connect with others and is a real source of positivity and enjoyment for so many people. If we get the Bill right, we will be taking the first step towards bringing some much-needed light to the dark side of social media and amplifying the benefits of the unprecedented connectivity of the world we live in. Our report and today’s debate are important steps in bringing to Parliament the concerns of hundreds of thousands of members of the public who want a safer and more equal online world. We will continue to hold Ministers to account on behalf of the petitioners, so that the draft Online Safety Bill makes its way through Parliament and achieves what we know petitioners want.

Childcare

Debate between Catherine McKinnell and Roger Gale
Monday 13th September 2021

Westminster Hall


Sir Roger Gale (in the Chair)

Good afternoon, ladies and gentlemen—welcome back. Before we begin, I encourage Members to wear masks when not speaking, if possible, in line with Government guidance and that of the House of Commons Commission. I apologise to Members for the fact that, having given you that advice, I may not be able to adhere to it myself because my glasses steam up and I might not be able to see anybody. Please give each other and members of staff space when seated and when entering and leaving the room.

Please send speaking notes by email to hansardnotes@parliament.uk. If in any doubt, come and ask and we will repeat that for you. Similarly, officials should communicate with Ministers electronically, where possible.

Catherine McKinnell (Newcastle upon Tyne North) (Lab)

I beg to move,

That this House has considered e-petition 586700, relating to funding and affordability of childcare.

The petition is about the need for an independent review of childcare funding so that we can really think through what we want our childcare and early education sector to be, and what we hope it can do for the families who need it and for us as a society. So many economic and social benefits flow from the sector that it is difficult to summarise in the time we have, but I think most of us would agree on three key reasons why it is so important to support high quality early education.

First, we know from international evidence that so many important life outcomes—from health to wealth and wellbeing—have their origins in the early years. Quality early education can benefit children’s academic and social development, and evidence shows that those benefits are often stronger for children from disadvantaged families, as it starts them off on a more equal footing with their peers when they go to school.

Secondly, access to childcare is crucial for working parents. Closures during the pandemic have served as a real reminder of just how important it is. The pre-school years are a particularly significant time for new mothers: regrettably, decisions around their childcare in that short period can have a huge impact on their lifetime earnings and, consequently, on the gender pay gap.

Finally, helping with the cost of childcare and early education is one of the best ways for the Government to ensure that families with young children—particularly those on low incomes—are not financially crippled by high costs. As the petitioners point out, childcare in the UK is expensive. Statistics from the OECD show that, however we look at it, we are close to the top of the list of developed countries for childcare costs.

I think that most of us would agree on what we want our early years sector to deliver and on those broad criteria, but some may place different emphasis on them. Analysing whether we are meeting those objectives, and how we can improve on them, is a huge task that touches on many complex areas, such as funding, training, accountability and outcomes. I do not think this House has the expertise or the time to cover those in depth, which is why we need an independent review.

During the debate, I want to look specifically at funding, which is the focus of the petition. In that key area, there is strong evidence that we are letting down children, parents and providers, and I will make the case to support the petitioners’ call for an independent review. Determining the right level of funding for the early years is of course the subject of long-running disputes between the Government and sector representatives, but it goes to the heart of what early years really means to us as a country.

Childcare is as necessary for parents to get to work as the roads and the rail network, so why do we not approach and fund it as the vital infrastructure investment that it clearly is? I am sure the Minister will point out that spending on free entitlements—the 15 and 30-hour entitlements for three and four-year-olds, and disadvantaged two-year-olds—has more than doubled to around £3.4 billion since 2010, but it is important to look at what has driven that increase. Most of it has come from successive expansions of eligibility, which are of course hugely welcome. However, what providers are concerned about is a discrepancy between the cost per hour of delivering the free entitlements and the funding per hour that they receive.

The Institute for Fiscal Studies’ latest annual report on education spending shows that funding per hour of childcare is now only about 13% higher in real terms than in 2004, despite an increase of about 150% in total spending. In recent years, funding per hour has declined from its 2017-18 peak, showing that even the modest increase introduced alongside the 30-hour entitlement in 2017 has not been maintained.

Even more importantly, we know that it is not enough just to provide for the costs of delivering childcare. The Department for Education’s publication in June of a much-delayed freedom of information response to the Early Years Alliance showed that the Government were aware of the consequences of introducing the 30 hours policy with an insufficient level of investment. Ministers knew that the investment would meet only around two thirds of costs—meaning higher costs for parents—and force early years staff to look after the maximum legal ratio of children, with significant impacts on quality. With a lack of proper investment in the free entitlement, providers are forced to cover their costs by charging more for the non-funded hours. That means spiralling costs for parents and carers, whose fees have risen three times faster than earnings since 2008—and that is just the average. For the parents of two-year-olds in some parts of the country, childcare costs have risen seven times faster than their wages.

As a working mother both before and since becoming an MP, I have my own experiences of the heart-wrenching stress and pressure of getting the right childcare and support, and of the enormous costs. Our childcare costs are now among the highest of any developed country. In a Petitions Committee survey earlier this year, 77% of parents agreed or strongly agreed that cost had prevented them from getting the kind of childcare they really needed. One respondent said:

“I do not have the option to have family or friends look after my child when I return to work and I can’t afford to not be in work, but childcare costs more than my mortgage for full time hours.”

Another commented:

“My wages will just about cover our childcare costs, therefore I am basically working only to ‘hold my place’ until my baby is old enough not to need childcare i.e., once she starts school.”

That has a huge impact on the gender pay gap. Clearly, it is still by and large women who take on most of the responsibility for childcare. Research by Pregnant Then Screwed found that 62% of women who returned to work worked fewer hours, changed jobs or stopped working because of childcare costs. Sadly, we know that the resulting loss of wages has a long-term impact on far too many women.

Properly funded childcare also means ensuring that providers have the money to pay and train their staff appropriately. I want to thank early years staff and management for their efforts over the last 18 months. Most staff have worked through the entire pandemic, and many settings have kept their doors open the entire time, looking after the children of key workers and others and keeping our country moving through this international crisis. Early years staff and management deserve our thanks and appreciation, and our commitment to tackle the serious issues raised by the petitioners.

According to research by Nursery World, one in 10 childcare workers relies on food banks, and 45% claim some form of benefit. One in eight earns less than £5 an hour, meaning that staff turnover is high, which can impact on the quality of care, the quality of education and the stability provided for children. We also know that in the past decade, there has been a long-term decrease in the number of people who want to work in the early years sector. One nursery manager told me just how difficult it is to retain staff, particularly in a setting with a disadvantaged intake and a high incidence of special educational needs.

Employees feel that they are sacrificing any semblance of work-life balance for minimum wage, leading to higher absence rates and higher staff turnover. That means that a child’s key worker might change multiple times in a year to someone unfamiliar to both the child and their parents, affecting the quality of education that the child receives. It also means that settings are regularly thrown into chaos because they cannot recruit fast enough to fill the gaps. I was told that, at least once a month, staffing issues mean that nurseries hope that not every parent will bring their child to nursery, because if every child attended there would be no way to maintain the required legal ratios. It is shocking that this is what some settings face, and it shows how badly off track we have got.

It cannot be right that while staff are poorly paid and parents pay high costs, the sector’s biggest customer, the Government, get away with paying what they know is insufficient funding. Deciding on the right level of funding and the best way to provide it is, of course, not an easy task, and I think that speaks to the need for a comprehensive, independent expert review to consider the matter in detail. Our answer to the crucial funding question speaks to what we want our early years sector to be.

Is it the state’s role to provide the minimum funding to cover, or just about cover, basic costs so that parents can at least return to work? That would mean maxed-out ratios, stressed-out staff, higher costs for parents, and providers that are unwilling to provide childcare as cheaply as possible being driven out of the market. Or are the benefits of a more generous childcare and early years education system worth it? That is what I would argue, as it means that we can unlock greater productivity, put a big dent in the gender pay gap, narrow the attainment gap at school and, in the long run, reduce other social problems such as poor mental health, unemployment and crime.

Unfortunately, in their written response to the petition, the Government said that there are no plans to commission a review of childcare funding, but I do not think that the Minister should be so quick to dismiss the petitioners’ concerns. We need a childcare system that helps not only to make the lives of families and their children better, but to make our economy work. With both parents and providers struggling and with early years staff undervalued and underpaid, childcare is becoming a big political issue, and it is not going away any time soon. I urge the Government to consider the petitioners’ request for an independent review so that we can get this right for everybody who would benefit from it.