Wednesday 26th October 2022

Westminster Hall

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Damian Collins)

It is a pleasure to serve under your chairmanship, Mr Dowd. This is my first appearance as a Minister in Westminster Hall, and your first appearance in the Chair, so we are both making our debuts. I hope we have long and successful reigns in our respective roles.

It is a great pleasure to respond to the debate secured by my right hon. Friend the Member for East Hampshire (Damian Hinds) and to his excellent opening speech. He feels strongly about these issues—as he did both in Government and previously as a member of the Digital, Culture, Media and Sport Committee—and he has spoken up about them. I enjoyed working with him when he was a Minister at the Home Office and I chaired the prelegislative scrutiny Committee, which discussed many important features of the Online Safety Bill. One feature of the Bill, of course, is the inclusion of measures on fraud and scam advertising, which was a recommendation of the Joint Committee. It made my life easier that, by the time I became a Minister in the Department, the Government had already accepted that recommendation and introduced the exemption, and I will come on to talk about that in more detail.

My right hon. Friend, the hon. Member for Pontypridd (Alex Davies-Jones) and other Members raised the case of Molly Russell, and it is important to reflect on that case. I share the sentiments expressed about the tragedy of Molly’s death, its avoidable nature and the tireless work of the Russell family, and particularly her father, Ian Russell, whom I have met several times to discuss this. The Russell family pursued a very difficult and complicated case, which required a huge release of evidence from the social media companies, particularly Instagram and Pinterest, to demonstrate the sort of content to which Molly Russell was exposed.

One of the things Ian Russell talks about is the work done by the investigating officers in the coroner’s inquest. Tellingly, the inquest restricted the amount of time that people could be exposed to the content that Molly was exposed to, and ensured that police officers who were investigating were not doing so on their own. Yet that was content that a vulnerable teenage girl saw repeatedly, on her own, in isolation from those who could have helped her.

When online safety issues are raised with social media companies, they say things like, “We make this stuff very hard to find.” The lived experience of most teenagers is not searching for such material; it is such material being selected by the platforms and targeted at the user. When someone opens TikTok, their first exposure is not to content that they have searched for; it is to content recommended to them by TikTok, which data-profiles the user and chooses things that will engage them. Those engagement-based business models are at the heart of the way the Online Safety Bill works and has to work. If platforms choose to recommend content to users to increase their engagement with the platform, they make a business decision. They are selecting content that they think will make a user want to return more frequently and stay on the platform for longer. That is how free apps make money from advertising: by driving engagement.

It is a fair criticism that, at times, the platforms are not effective enough at recognising the kinds of engagement tools they are using, the content that is used to engage people and the harm that that can do. For a vulnerable person, the sad truth is that their vulnerability will probably be detected by the AI that drives the recommendation tools. That person is far more likely to be exposed to content that will make their vulnerabilities worse. That is how a vulnerable teenage girl can be held by the hand—by an app’s AI recommendation tools—and walked from depression to self-harm and worse. That is why regulating online safety is so important and why the protection of children is so fundamental to the Bill. As hon. Members have rightly said, we must also ensure that we protect adults from some of the illegal and harmful activity on the platforms and hold those platforms to account for the business model they have created.

I take exception to the suggestion from the hon. Member for Pontypridd that this is a content-moderation Bill. It is not; it is a systems Bill. The content that we use, and often refer to, is an exemplar of the problem; it is an exemplar of things going wrong. On all the different areas of harm that are listed in the Bill, particularly the priority legal offences in schedule 7, our challenge to the companies is: “You have to demonstrate to the regulator that you have appropriate systems in place to identify this content, to ensure that you are not amplifying or recommending it and to mitigate it.” Mitigation could be suppressing the content—not letting it be amplified by their tools—removing it altogether or taking action against the accounts that post it. It is the regulator’s job to work with the companies, assess the risk, create codes of practice and then hold the companies to account for how they work.

There is criminal liability for the companies if they refuse to co-operate with the regulator. If they refuse to share information or evidence asked for by the regulator, a named company director will be criminally liable. That was in the original Bill. The recommendation in the Joint Committee report was that that should be commenced within months of the Bill being live; originally it was going to be two years. That is in the Bill today, and it is important that it is there so that companies know they have to comply with requests.

The hon. Member for Pontypridd is right to say that the Bill is world-leading, in the sense that it goes further than comparable legislation elsewhere, but other countries have already enacted their Bills. That is why it is important that we get on with this.

Alex Davies-Jones

The Minister is right to say that we need to get on with this. I appreciate that he is not responsible for the business of this House, but his party and his Government are, so will he explain why the Bill has been pulled from the timetable next week, if it is such an important piece of legislation?

Damian Collins

As the hon. Lady knows, I can speak to the Bill; I cannot speak to the business of the House—that is a matter for the business managers in the usual way. Department officials—some here and some back at the Department—have been working tirelessly on the Bill to ensure we can get it through in a timely fashion. I want to see it complete its Commons stages and go to the House of Lords as quickly as possible. Our target is to ensure that it receives safe passage in this Session of Parliament. Obviously, I cannot speak to the business of the House, which may alter as a consequence of the changes to Government.

Kirsty Blackman

On that point, will the Minister assure us that he will push for the Bill to come back? Will he make the case to the business managers that the Bill should come back as soon as possible, in order to fulfil his aim of having it pass in this Session of Parliament?

Damian Collins

As the hon. Lady knows, I cannot speak to the business of the House. What I would say is that the Department has worked tirelessly to ensure the safe passage of the Bill. We want to see it on the Floor of the House as quickly as possible—our only objective is to ensure that that happens. I hope that the business managers will be able to confirm shortly when that will be. Obviously, the hon. Lady can raise the issue herself with the Leader of the House at the business statement tomorrow.

Mr Jonathan Lord (Woking) (Con)

Could the Minister address the serious issue raised by my hon. Friend the Member for Hexham (Guy Opperman)? There can be no excuse for search engines to give a prominent place, or indeed any place, to fake Aviva sites—scamming sites—once those have been brought to their attention. Likewise, unacceptable scam ads for Aviva, Martin Lewis or whoever are completely avoidable if decent checks are in place. Will the Government address those issues in the Bill and in other ways?

Damian Collins

I am grateful to my hon. Friend. The answer is yes, absolutely. It was always the case that illegal content, including fraud, was within the scope of the Bill; the issue with the original draft was that that scope did not extend to advertising. Advertising can take the form of display advertising on social media platforms; for search services, it can also take the form of boosted search returns. Under the Bill, known frauds and scams that have been identified should not appear in advertising on regulated platforms. That change was recommended by the Joint Committee, and the Government accepted it. It is really important that that is the case, because the company should have a liability; we cannot work just on the basis that the offence has been committed by the person who created the advert and who is running the scam. If an intermediary platform is profiting from someone else's illegal activity, that should not be allowed. It would be within Ofcom's regulatory powers to identify whether that is happening and to see that platforms are taking action against it. If they are not, those companies will be failing in their safety duties, and they will be liable for fines of up to 10% of global revenue in any one year for breaching their obligations under the Online Safety Bill. That power will absolutely be there.

Some companies could choose to have systems in place to make it less likely that scam ads would appear on their platforms. Google has a policy under which it works with the Financial Conduct Authority and does not accept financial product advertising from organisations that are not FCA accredited. That has been quite an effective way to filter out a lot of potential scam ads before they appear. Whether companies have policies such as that, or other ways of doing these things, they will have to demonstrate to Ofcom that those are effective. [Interruption.] Does my hon. Friend the Member for Hexham (Guy Opperman) want to come in on that? I can see him poised to spring forward.

Guy Opperman

No, keep going.

Damian Collins

I would like to touch on some of the other issues that have been raised in the debate. The hon. Member for Leeds East (Richard Burgon) and others made the point about smaller, high-risk platforms. All platforms, regardless of size, have to meet the illegal priority harm standard. For the worst offences, they will already have to produce risk assessments and respond to Ofcom's requests for information. Given that, if Ofcom suspected that serious illegal activity, or other activity causing serious concern, was taking place on a smaller platform, it would have the power to investigate and would probably find that the platform was in breach of those responsibilities. A company that is not a category 1 company is still held to account under the illegal priority harms clauses of the Bill. Those clauses cover a wide range of offences, and it is important—this was an amendment to the Bill recommended by the Joint Committee—that those offences were written into the Bill so that people can see what they are.

The hon. Member for Pontypridd raised the issue of violence against women and girls, but what I would say is that violence against everyone is included in the Bill. The offences of promoting or inciting violence, harassment, stalking and sending unsolicited sexual images are all included in the Bill. The way the schedule 7 offences work is that the schedule lists existing areas of law. Violence against women and girls is covered by lots of different laws; that is why there is not a single offence for it and why it is not listed. That does not mean that we do not take it seriously. As I said to the hon. Lady when we debated this issue on the first day of Report, we all understand that women and girls are far more likely to be victims of abuse online, and they are therefore the group that should benefit the most from the provisions in the Bill.

The hon. Member for Coventry North West (Taiwo Owatemi) spoke about cyber-bullying. Again, offences relating to harassment are included in the Bill. This is also an important area where the regulator's job is to ensure that companies enforce their own terms of service. For example, TikTok, which is very popular with younger users, has in place quite strict policies on preventing bullying, abuse and intimidation on its services. But does it enforce those policies effectively? So far, we have largely relied on the platforms self-declaring whether that is the case; we have never had the ability to really know. Now Ofcom will have that power, and it will be able to challenge companies such as TikTok. I have also raised with TikTok my concern about the prevalence of blackout challenge content, which remains on that platform and which has led to people losing their lives. Could TikTok be more effective at removing more of that content? We will now have the regulatory power to investigate—to get behind the curtain and to see what is really going on.

Peter Dowd (in the Chair)

Minister, can I just say that there may be votes very shortly? That means that we will be suspending the sitting and coming back, so if you can—

Damian Collins

Wrap it up in the next—

Damian Collins

I will just touch on a couple of other points that have been raised. My hon. Friend the Member for Barrow and Furness (Simon Fell) and other Members raised the point about the abuse of footballers. The abuse suffered by England footballers after the final of the European championship is a very good example. Some people have been charged and prosecuted for what they posted. It was a known-about risk; it was avoidable. The platforms should have done more to stop it. This Bill will make sure that they do.

That shows that we have many offences where there is already a legal threshold, and we want them to be included in the regulatory systems. For online safety standards, it is important that the minimum thresholds are based on our laws. In the debate on “legal but harmful”, one of the key points to consider, and one that many Members have brought up, is what we base the thresholds on. To base them on the many offences that we already have written into law is, I think, a good starting point. We understand what those thresholds are. We understand what illegal activity is. We say to the platforms, “Your safety standards must, at a minimum, be at that level.” Platforms do go further in their terms of service. Most terms of service, if properly enforced, would deal with most of the sorts of content that we have spoken about. That is why, if the platforms are to enforce their terms of service properly, the provisions on traceability and accountability are so important. I believe that that will capture the offences that we need.

My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) rightly said—if I may paraphrase slightly—that we should not let the perfect be the enemy of the good. There will always be new things that we wish to add and new offences that we have not yet thought about that we need to include, and the structure of the Bill creates the framework for that. In future, Parliament can create new offences that can form part of the schedule of priority illegal offences. The priority harms are the things that the platforms have to look for proactively; anything else that is illegal is still illegal online, and the regulator can take action against it.

Let me finish by thanking all the Members here, including my hon. Friend the Member for Gosport (Dame Caroline Dinenage), another former Minister. A very knowledgeable and distinguished group of Members have taken part in this debate. Finally, I thank the officials at the Department. Until someone is actually in the Department, they can never quite know what work is being done—that is the nature of Government—but I know how personally dedicated those officials are to the Bill. They have all gone the extra mile in the work they are doing for it. For their sakes and all of ours, I want to make sure that we pass it as soon as possible.

Question put and agreed to.

Resolved,

That this House has considered online harms.