Nadine Dorries's contribution to the Online Safety Act 2023



Online Safety Bill

2nd reading
Tuesday 19th April 2022
Commons Chamber
The Secretary of State for Digital, Culture, Media and Sport (Ms Nadine Dorries)

I beg to move, That the Bill be now read a Second time.

Given the time and the number of people indicating that they wish to speak, and given that we will have my speech, the shadow Minister’s speech and the two winding-up speeches, there might be 10 minutes left for people to speak. I will therefore take only a couple of interventions and speak very fast in the way I can, being northern.

Almost every aspect of our lives is now conducted via the internet, from work and shopping to keeping up with our friends, family and worldwide real-time news. Via our smartphones and tablets, we spend ever more of our lives online rather than in the real world.

In the past 20 years or so, it is fair to say that the internet has overwhelmingly been a force for good, for prosperity and for progress, but Members on both sides of the House will agree that, as technology advances at warp speed, so have the new dangers this progress presents to children and young people.

Mr Mark Francois (Rayleigh and Wickford) (Con)

My right hon. Friend will know that, last Wednesday, the man who murdered our great friend Sir David Amess was sentenced to a whole-life term. David felt very strongly that we need legislation to protect MPs, particularly female MPs, from vile misogynistic abuse. In his memory, will she assure me that her Bill will honour the spirit of that request?

Ms Dorries

Sir David was a friend to all of us, and he was very much at the forefront of my mind during the redrafting of this Bill over the last few months. I give my right hon. Friend my absolute assurance on that.

Jim Shannon (Strangford) (DUP)

A number of constituents have contacted me over the last few months about eating disorders, particularly anorexia and bulimia, and about bullying in schools. Will the Secretary of State assure me and this House that those concerns will be addressed by this Bill so that my constituents are protected?

Ms Dorries

They will. Inciting people to take their own life or encouraging eating disorders in anorexia chatrooms—all these issues are covered by the Bill.

Several hon. Members rose—

Ms Dorries

I will take one more intervention.

Jonathan Gullis (Stoke-on-Trent North) (Con)

I am grateful to my right hon. Friend, and I thank her for her written communications regarding Angela Stevens, the mother of Brett, who tragically took his own life having been coerced by some of these vile online sites. The Law Commission considered harmful online communications as part of the Bill’s preparation, and one of its recommendations is to introduce a new offence of encouraging or assisting self-harm. I strongly urge my right hon. Friend to adopt that recommendation. Can she say more on that?

Ms Dorries

Yes. Exactly those issues will be listed in secondary legislation, under “legal but harmful”. I will talk about that further in my speech, but “legal but harmful” focuses on some of the worst harmful behaviours. We are talking not about an arbitrary list, but about incitement to encourage people to take their own life and encouraging people into suicide chatrooms—behaviour that is not illegal but which is indeed harmful.

Several hon. Members rose—

Ms Dorries

I am going to whizz through my speech now in order to allow people who have stayed and want to speak to do so.

During my two years as the Minister for mental health, I too often heard stories such as the one just highlighted by my hon. Friend the Member for Stoke-on-Trent North (Jonathan Gullis). We have all sat down with constituents and listened as the worst stories any parents could recount were retold: stories of how a 14-year-old girl takes her own life after being directed via harmful algorithms into a suicide chatroom; and of how a child has been bombarded with pro-anorexia content, or posts encouraging self-harm or cyber-bullying.

School bullying used to stop at the school gate. Today, it accompanies a child home, on their mobile phone, and is lurking in the bedroom waiting when they switch on their computer. It is the last thing a bullied child reads at night before they sleep and the first thing they see when they wake in the morning. A bullied child is no longer bullied in the playground on school days; they are bullied 24 hours a day, seven days a week. Childhood innocence is being stolen at the click of a button. One extremely worrying figure from 2020 showed that 80% of 12 to 15-year-olds had at least one potentially harmful online experience in the previous year.

We also see this every time a footballer steps on to the pitch, only to be subjected to horrific racism online, including banana and monkey emojis. As any female MP in this House will tell you, a woman on social media—I say this from experience—faces a daily barrage of toxic abuse. It is not criticism—criticism is fair game—but horrific harassment and serious threats of violence. Trolls post that they hope we get raped or killed, urge us to put a rope around our neck, or say they want to watch us burn alive in a car—my own particular experience.

All this behaviour is either illegal or, almost without exception, explicitly banned in a platform’s terms and conditions. Commercially, it has to be. If a platform stated openly that it allowed such content on its sites, which advertisers, its financial lifeblood, would knowingly endorse and advertise on it? Which advertisers would do that? Who would openly use or allow their children to use sites that state that they allow illegal and harmful activity? None, I would suggest, and platforms know that. Yet we have almost come to accept this kind of toxic behaviour and abuse as part and parcel of online life. We have factored online abuse and harm into our daily way of life, but it should not and does not have to be this way.

This Government promised in their manifesto to pass legislation to tackle these problems and to make the UK the

“safest place in the world to be online”

especially for children. We promised legislation that would hold social media platforms to the promises they have made to their own users—their own stated terms and conditions—promises that too often are broken with no repercussions. We promised legislation that would bring some fundamental accountability to the online world. That legislation is here in the form of the groundbreaking Online Safety Bill. We are leading the way and free democracies across the globe are watching carefully to see how we progress this legislation.

The Bill has our children’s future, their unhindered development and their wellbeing at its heart, while at the same time providing enhanced protections for freedom of speech. At this point, I wish to pay tribute to my predecessors, who have each trodden the difficult path of balancing freedom of speech and addressing widespread harms, including my immediate predecessor and, in particular, my hon. Friend the Member for Gosport (Dame Caroline Dinenage), who worked so hard, prior to my arrival in the Department for Digital, Culture, Media and Sport, with stakeholders and platforms, digging in to identify the scope of the problem.

Let me summarise the scope of the Bill. We have reserved our strongest measures in this legislation for children. For the first time, platforms will be required under law to protect children and young people from all sorts of harm, from the most abhorrent child abuse to cyber-bullying and pornography. Tech companies will be expected to use every possible tool to do so, including introducing age-assurance technologies, and they will face severe consequences if they fail in the most fundamental of requirements to protect children. The bottom line is that, by our passing this legislation, our youngest members of society will be far safer when logging on. I am so glad to see James Okulaja and Alex Holmes from The Diana Award here today, watching from the Gallery as we debate this groundbreaking legislation. We have worked closely with them as we have developed the legislation, as they have dedicated a huge amount of their time to protecting children from online harms. This Bill is for them and those children.

The second part of the Bill makes sure that platforms design their services to prevent them from being abused by criminals. When illegal content does slip through the net, such as child sex abuse and terrorist content, they will need to have effective systems and processes in place to quickly identify it and remove it from their sites. We will not allow the web to be a hiding place or a safe space for criminals. The third part seeks to force the largest social media platforms to enforce their own bans on racism, misogyny, antisemitism, pile-ons and all sorts of other unacceptable behaviour that they claim not to allow but that ruins lives in practice. In other words, we are simply asking the largest platforms to do what they say they will do, as we do in all good consumer protection measures in any other industry. If platforms fail in any of those basic responsibilities, Ofcom will be empowered to pursue a range of actions against them, depending on the situation, and, if necessary, to bring down the full weight of the law upon them.

Several hon. Members rose—

Ms Dorries

I will take just two more interventions and that will be it, otherwise people will not have a chance to speak.

Sir John Hayes (South Holland and The Deepings) (Con)

I am very grateful to my right hon. Friend for giving way. The internet giants that run the kind of awful practices that she has described have for too long been unaccountable, uncaring and unconscionable in the way they have fuelled every kind of spite and fed every kind of bigotry. Will she go further in this Bill and ensure that, rather like any other publisher, if those companies are prepared to allow anonymous posts, they are held accountable for those posts and subject to the legal constraints that a broadcaster or newspaper would face?

Ms Dorries

These online giants will be held accountable to their own terms and conditions. They will be unable any longer to allow illegal content to be published, and we will also be listing in secondary legislation offences that will be legal but harmful. We will be holding those tech giants to account.

Munira Wilson (Twickenham) (LD)

I thank the Secretary of State for giving way. She talked about how this Bill is going to protect children much more, and it is a welcome step forward. However, does she accept that there are major gaps in this Bill? For instance, gaming is not covered. It is not clear whether things such as virtual reality and the metaverse are going to be covered. [Interruption.] It is not clear and all the experts will tell us that. The codes of practice in the Bill are only recommended guidance; they are not mandatory and binding on companies. That will encourage a race to the bottom.

Ms Dorries

The duties are mandatory; it is the Online Safety Bill and the metaverse is included in the Bill. Not only is it included, but, moving forward, the provisions in the Bill will allow us to move swiftly with the metaverse and other things. We did not even know that TikTok existed when this Bill started its journey. These provisions will allow us to move quickly to respond.

Several hon. Members rose—

--- Later in debate ---
Ms Dorries

I will take one more intervention, but that is it.

Damian Green (Ashford) (Con)

I am grateful to my right hon. Friend for giving way. One of the most important national assets that needs protecting in this Bill and elsewhere is our reputation for serious journalism. Will she therefore confirm that, as she has said outside this House, she intends to table amendments during the passage of the Bill that will ensure that platforms and search engines that have strategic market status protect access to journalism and content from recognised news publishers, ensuring that it is not moderated, restricted or removed without notice or right of appeal, and that those news websites will be outside the scope of the Bill?

Ms Dorries

We have already done that—it is already in the Bill.

Daniel Kawczynski (Shrewsbury and Atcham) (Con)

Will my right hon. Friend give way?

Ms Dorries

No, I have to continue.

Not only will the Bill protect journalistic content, democratic content and democratic free speech, but if one of the tech companies wanted to take down journalistic content, the Bill includes a right of appeal for journalists, which currently does not exist. We are doing further work on that to ensure that content remains online while the appeal takes place. The appeal process has to be robust and consistent across the board for all the appeals that take place. We have already done more work on that issue in this version of the Bill and we are looking to do more as we move forward.

As I have said, we will not allow the web to be a hiding place or safe space for criminals and when illegal content does slip through the net—such as child sex abuse and terrorist content— online platforms will need to have in place effective systems and processes to quickly identify that illegal content and remove it from their sites.

The third measure will force the largest social media platforms to enforce their own bans on racism, misogyny, antisemitism, pile-ons and all the other unacceptable behaviours. In other words, we are asking the largest platforms to do what they say they will do, just as happens with all good consumer-protection measures in any other industry. Should platforms fail in any of their basic responsibilities, Ofcom will be empowered to pursue a range of actions against them, depending on the situation, and, if necessary, to bring down upon them the full weight of the law. Such action includes searching platforms’ premises and confiscating their equipment; imposing huge fines of up to 10% of their global turnover; pursuing criminal sanctions against senior managers who fail to co-operate; and, if necessary, blocking their sites in the UK.

We know that tech companies can act very quickly when they want to. Last year, when an investigation revealed that Pornhub allowed child sexual exploitation and abuse imagery to be uploaded to its platform, Mastercard and Visa blocked the use of their cards on the site. Lo and behold, threatened with the prospect of losing a huge chunk of its profit, Pornhub suddenly removed nearly 10 million child sexual exploitation videos from its site overnight. These companies have the tools but, unfortunately, as they have shown time and again, they need to be forced to use them. That is exactly what the Bill will do.

Before I move on, let me point out something very important: this is not the same Bill as the one published in draft form last year. I know that Members throughout the House are as passionate as I am about getting this legislation right, and I had lots of constructive feedback on the draft version of the Bill. I have listened carefully to all that Members have had to say throughout the Bill’s process, including by taking into account the detailed feedback from the Joint Committee, the Digital, Culture, Media and Sport Committee and the Petitions Committee. They have spent many hours considering every part of the Bill, and I am extremely grateful for their dedication and thorough recommendations on how the legislation could be improved.

As a result of that feedback process, over the past three months or so I have strengthened the legislation in a number of important ways. There were calls for cyber-flashing to be included; cyber-flashing is now in the Bill. There were calls to ensure that the legislation covered all commercial pornography sites; in fact, we have expanded the Bill’s scope to include every kind of provider of pornography. There were concerns about anonymity, so we have strengthened the Bill so that it now requires the biggest tech platforms to offer verification and empowerment tools for adult users, allowing people to block anonymous trolls from the beginning.

I know that countless MPs are deeply concerned about how online fraud—particularly scam ads—has proliferated over the past few years. Under the new version of the Bill, the largest and highest-risk companies—those that stand to make the most profit—must tackle scam ads that appear on their services.

We have expanded the list of priority offences named on the face of the legislation to include not just terrorism and child abuse imagery but revenge porn, fraud, hate crime, encouraging and assisting suicide, and organised immigration crime, among other offences.

If anyone doubted our appetite to go after Silicon Valley executives who do not co-operate with Ofcom, they will see that we have strengthened the Bill so that the criminal sanctions for senior managers will now come into effect as soon as possible after Royal Assent— I am talking weeks, not years. We have expanded the things for which those senior managers will be criminally liable to cover falsifying data, destroying data and obstructing Ofcom’s access to their premises.

In addition to the regulatory framework in the Bill that I have described, we are creating three new criminal offences. While the regulatory framework is focused on holding companies to account, the criminal offences will be focused on individuals and the way people use and abuse online communications. Recommended by the Law Commission, the offences will address coercive and controlling behaviour by domestic abusers; threats to rape, kill or inflict other physical violence; and the sharing of dangerous disinformation deliberately to inflict harm.

This is a new, stronger Online Safety Bill. It is the most important piece of legislation that I have ever worked on and it has been a huge team effort to get here. I am confident that we have produced something that will protect children and the most vulnerable members of society while being flexible and adaptable enough to meet the challenges of the future.

Let me make something clear in relation to freedom of speech. Anyone who has actually read the Bill will recognise that its defining focus is the tackling of serious harm, not the curtailing of free speech or the prevention of adults from being upset or offended by something they have seen online. In fact, along with countless others throughout the House, I am seriously concerned about the power that big tech has amassed over the past two decades and the huge influence that Silicon Valley now wields over public debate.

We in this place are not the arbiters of free speech. We have left it to unelected tech executives on the west coast to police themselves. They decide who is and who is not allowed on the internet. They decide whose voice should be heard and whose should be silenced—whose content is allowed up and what should be taken down. Too often, their decisions are arbitrary and inconsistent. We are left, then, with a situation in which the president of the United States can be banned by Twitter while the Taliban is not; in which talkRADIO can be banned by YouTube for 12 hours; in which an Oxford academic, Carl Heneghan, can be banned by Twitter; or in which an article in The Mail on Sunday can be plastered with a “fake news” label—all because they dared to challenge the west coast consensus or to express opinions that Silicon Valley does not like.

It is, then, vital that the Bill contains strong protections for free speech and for journalistic content. For the first time, under this legislation all users will have an official right to appeal if they feel their content has been unfairly removed. Platforms will have to explain themselves properly if they remove content and will have special new duties to protect journalistic content and democratically important content. They will have to keep those new duties in mind whenever they set their terms and conditions or moderate any content on their sites. I emphasise that the protections are new. The new criminal offences update section 1 of the Malicious Communications Act 1988 and section 127 of the Communications Act 2003, which were so broad that they interfered with free speech while failing to address seriously harmful consequences.

Without the Bill, social media companies would be free to continue to arbitrarily silence or cancel those with whom they do not agree, without any need for explanation or justification. That situation should be intolerable for anyone who values free speech. For those who quite obviously have not read the Bill and say that it concedes power to big tech companies, I have this to say: those big tech companies have all the power in the world that they could possibly want, right now. How much more power could we possibly concede?

That brings me to my final point. We now face two clear options. We could choose not to act and leave big tech to continue to regulate itself and mark its own homework, as it has been doing for years with predictable results. We have already seen that too often, without the right incentives, tech companies will not do what is needed to protect their users. Too often, their claims about taking steps to fix things are not backed up by genuine actions.

I can give countless examples from the past two months alone of tech not taking online harm and abuse seriously, wilfully promoting harmful algorithms or putting profit before people. A recent BBC investigation showed that women’s intimate pictures were being shared across the platform Telegram to harass, shame and blackmail women. The BBC reported 100 images to Telegram as pornography, but 96 were still accessible a month later. Tech did not act.

Twitter took six days to suspend the account of rapper Wiley after his disgusting two-day antisemitic rant. Just last week, the Centre for Countering Digital Hate said that it had reported 253 accounts to Instagram as part of an investigation into misogynistic abuse on the platform, but almost 90% remained active a month later. Again, tech did not act.

Remember: we have been debating these issues for years. They were the subject of one of my first meetings in this place in 2005. During that time, things have got worse, not better. If we choose the path of inaction, it will be on us to explain to our constituents why we did nothing to protect their children from preventable risks, such as grooming, pornography, suicide content or cyber-bullying. To those who say protecting children is the responsibility of parents, not the job of the state, I would quote the 19th-century philosopher John Stuart Mill, one of the staunchest defenders of individual freedom. He wrote in “On Liberty” that the role of the state was to fulfil the responsibility of the parent in order to protect a child where a parent could not. If we choose not to act, in the years to come we will no doubt ask ourselves why we did not act to impose fundamental online protections.

However, we have another option. We can pass this Bill and take huge steps towards tackling some of the most serious forms of online harm: child abuse, terrorism, harassment, death threats, and content that is harming children across the UK today. We could do what John Stuart Mill wrote was the core duty of Government. The right to self-determination is not unlimited. An action that results in doing harm to another is not only wrong, but wrong enough that the state can intervene to prevent that harm from occurring. We do that in every other part of our life. We erect streetlamps to make our cities and towns safer. We put speed limits on our roads and make seatbelts compulsory. We make small but necessary changes to protect people from grievous harm. Now it is time to bring in some fundamental protections online.

We have the legislation ready right now in the form of the Online Safety Bill. All we have to do is pass it. I am proud to commend the Bill to the House.

Several hon. Members rose—