All 9 Debates between Kirsty Blackman and Damian Collins

Tue 12th Dec 2023
Media Bill (Sixth sitting)
Public Bill Committees

Committee stage: 6th sitting
Thu 7th Dec 2023
Media Bill (Third sitting)
Public Bill Committees

Committee stage: 3rd sitting
Tue 5th Dec 2023
Media Bill (Second sitting)
Public Bill Committees

Committee stage: 2nd sitting
Tue 5th Dec 2023
Media Bill (First sitting)
Public Bill Committees

Committee stage: 1st sitting
Tue 21st Nov 2023
Media Bill
Commons Chamber

2nd reading
Tue 13th Dec 2022
ONLINE SAFETY BILL (Second sitting)
Public Bill Committees

Committee stage (re-committed clauses and schedules): 2nd sitting
Tue 13th Dec 2022
ONLINE SAFETY BILL (First sitting)
Public Bill Committees

Committee stage (re-committed clauses and schedules): 1st sitting
Tue 12th Jul 2022
Online Safety Bill
Commons Chamber

Report stage (day 1)

Media Bill (Sixth sitting)

Debate between Kirsty Blackman and Damian Collins
Damian Collins (Folkestone and Hythe) (Con)

Following my hon. Friend’s speech, I want to speak briefly on the issue, in which I have taken an interest over many years. The Minister is nodding and he will remember that I served as a member of the Committee he chaired in 2011 looking at the phone hacking issue and the inquiry that was held at that time. Twelve years or more have passed since then, and the media landscape now is very different.

I agree with my hon. Friend the Member for Aylesbury that having a statutory regulator for the press is not compatible with our media traditions in this country. The threat of commencing section 40, with newspapers having to pay their own costs and those of the claimant even if they won the case—such a provision does not exist elsewhere in English law—would impose an onerous burden, yet the threat of commencement has not forced newspapers to seek to create or join regulatory bodies for the press. The debates we have here on statutory regulation of the media and the debates we continually have when BBC charter renewal comes up show that whenever we create a structure, no matter how arm’s length or benign, Members of this House have points of view about how it is operated, what goes in it and how it should change or be improved. That will continue to be the case. A statutory regulator is not compatible with having a free press.

When we had the Leveson inquiry, the idea of newspapers’ business models being hollowed out by big tech platforms that would destroy their ad-funded business model was not something we considered. Newspapers were then seen as being all-powerful, extremely wealthy and well able to pay whatever charges were levelled at them. The situation is very different now.

The other issue, which I am familiar with as a former chair of the APPG on media freedom, is the issue of lawfare, whereby wealthy people, particularly oligarchs, take spurious legal action against newspapers because of content they do not like, without worrying about whether the case meets any kind of threshold. The libel laws are not absolute; they are not an absolute true-or-false test. To win, the claimant has to demonstrate that what a journalist reported has materially damaged them and their reputation, but very wealthy people do not care about that. They are quite happy to enter into such legal cases now, and even the threat of such actions deters editors from publishing stories that might be in the public interest, for fear of the almost certain legal challenge that will come back against them from people with bottomless pits of money who do not care whether they win or lose. They just seek to grind the publication into the ground with ongoing legal costs.

Commencing a regime that may open the door to yet more litigation from people who, on the whole, can easily afford it anyway, which makes the chances of success greater and which makes the cumulative impact of the costs on those publications even greater, would diminish the power of the press considerably. That would lead to a chilling effect, which was never envisaged when the Leveson report was commissioned, of inhibiting the press for fear of the cost that would come from simply doing their job and reporting the truth.

Of course, the press make mistakes and get things wrong. Newspaper editors have legal liabilities for what they publish. Members of the Committee know from our lengthy debates on measures such as the Online Safety Act 2023 that it is easy now for people to publish all sorts of stuff for which they have no legal liability—and, before the Act was passed, nor did the platforms that distributed it. The challenge that many people face, be they in the public eye or members of the community, is far more likely to be harassment and intimidation through co-ordinated attacks on social media than reporting in a newspaper they do not like.

Kirsty Blackman

Does the hon. Gentleman not understand that the online world is now regulated differently from newspapers as a result of the Online Safety Act? I agree with the Online Safety Act and agree that there should be more regulation online of things that are illegal, but we do not have a change in the regulation of newspapers to ensure truthfulness and lack of harm, whereas we do have some more of that in the online world.

Damian Collins

That is why it was important to include an exemption for media organisations from the regulatory powers that Ofcom will have through the Online Safety Act. The reason those exemptions were there was that newspapers already have liability not only for the copy printed, but for the adverts they accept and run. The newspaper or magazine editor is legally liable for advertising as much as they are for the articles they commission. Those liabilities and that transparency just did not exist for a lot of online publications, and it could be difficult to see who was behind them.

The challenge with the Online Safety Act was to recognise that the platforms were acting as distributors and promoters of the content—even for a lot of the content that is spam-related or comes from misinformation networks and hostile foreign states. If companies like Facebook are actively promoting that content and highlighting its existence to their users, they should have a liability for it. Newspapers and magazines already had those liabilities because it was clear who was publishing them. Under the Online Safety Act, to qualify for the media exemption, it has to be clear who the publisher is, where they are based and who the editor is, and therefore the transparency, liability and risks exist already. They did not in the online world, where many of the publishers were hidden and used that anonymity to spread lies and disinformation.

With that, the onerous costs that lawfare brings to newspapers, and the hollowing out of their business model by the ad platforms that distribute their content for nothing, there is an urgent need to have some sort of compensation mechanism for news organisations, so that local newspapers, national newspapers and magazines get fair compensation for the free distribution of their content across the web. Those are the challenges we face now, and those were things that were never envisaged at the time of Leveson.

As the hon. Member for Aberdeen North has said many times in the debate, things move pretty fast between media Bills. This is another example of how things have moved fast again. Removing section 40 from the statute book reflects the need for us to update the law to match the media world that exists today.

Media Bill (Third sitting)

Debate between Kirsty Blackman and Damian Collins
Kirsty Blackman (Aberdeen North) (SNP)

I agree with the Minister that the clause creates a new and bespoke prominence regime. I have some questions that I was going to ask in interventions, but I figured that I had too many.

I am happy to support the amendments tabled by the shadow Minister relating to the BBC and affirmative approval by the House, but I have some questions arising from the comments that have been made. The comment about personalisation is key. I hope that people who regularly watch S4C, for example, will be offered it. I am slightly concerned that that will conflict with the commercial nature of these devices, and that we will end up in a situation whereby Amazon provides more money to give prominence to a certain television show, which bumps S4C down the list. I am glad that there is flexibility in the Bill to allow things to be updated and changes to be made, because it is important that such conflicts are resolved.

Damian Collins

What may resolve that conflict is the fact that personalisation is often linked to the placement of advertising, so the platform operator may only care about advertising reaching eyeballs and may be agnostic about whether it is placed against S4C content or anyone else’s.

Kirsty Blackman

That is possibly the case. I have never run a platform managing its budget on the basis of advertising, so I do not know exactly how the advertising regime works. As time goes on and the way that people interact with these services changes, that may be updated anyway, provided that there is the flexibility to make changes if we find that people are not being served the content that we expect and that the services are not receiving the level of prominence we expect. It would be great if the new regime works and people are algorithmically served the content that they like and want to see, but I am concerned that it might not always work out like that. It would therefore be incredibly helpful if the Minister can keep that under review.

On the comments about the words “significant” and “appropriate”, I completely understand the BBC’s concerns. I know that not everybody feels quite so strongly about those words. Some people believe that Ofcom will be clear that “appropriate” means “fairly significant” and “quite prominent”, so that people are able to access these broadcasters. Again, the Government need to keep that under review to ensure that there is an appropriate level of prominence, and that Ofcom has the ability and strength to say, “This is not appropriate. We need it to be more prominent than it currently is.” Ofcom must have the teeth to enforce that. It should first work with the platforms to ensure prominence—we do not want to move straight to enforcement—so that people can access the public service broadcasters that they expect.

The comments made about television remote controls were also key, and we might come back to them later in relation to radio selection. Hardware is an issue as well as software. For example, a television remote control may allow people to press a Netflix button but not a BBC button, despite the fact that significant proportions of people would prefer to press a button to access the BBC, STV or whatever service they are keen to get, and that they generally go to for information. I have spoken already about the importance of accessibility. Public service broadcasters need to be accessible, and we must work with people to make PSBs as accessible as possible, and prominent; those are two separate but related things.

Local content and local news content are very important. Yesterday, I had a discussion with the BBC, which now provides Aberdeen and Aberdeenshire local bulletins on some of its on-demand services, after a long-running campaign by the hon. Member for West Aberdeenshire and Kincardine (Andrew Bowie) and me. We have both been constantly pestering the BBC to ensure that our local news bulletins are accessible, particularly so that we can see what is being reported in our local area when we are down here. The BBC has now done that, but I would like commensurate prominence for online and on-demand television services, as well as services on my phone or computer.

On a related note, the Minister talked about the measures applying only to devices that have the main purpose of allowing people to watch television, and I can understand why he has gone down that route. I do not know whether he is aware of Ofcom’s “Media nations” report, which shows that 21% of TV users in Scotland watch through a games console. That is not an insignificant proportion. Some family members may use the games console to game, but others may use it only to watch television. If games consoles are outside the regulatory regime and are not required to give any prominence to public service broadcasters, a chunk of the population is not being properly served and does not have proper access to public service broadcasters.

I appreciate the Minister’s comment about Ofcom being able to update and make changes to the regulated services and providers. However, I am slightly concerned that he has gone too far down the route of saying that the measures apply to devices that are mainly used for television purposes. I am concerned that that will not provide my constituents with the best service, particularly when the percentage of people in Scotland who use games consoles to watch television is double that in England. The proportion is much higher, so this issue will be important. For example, if someone can watch television on a PlayStation, why would they waste money on a Fire Stick? They can already watch television through the games console. If that is the main route by which a not insignificant portion of people watch television, it is important that the Minister considers whether regulating games consoles would improve our constituents’ lives. I genuinely think it would. My concern is mostly that the Minister should not rule it out; I do not necessarily want him to say that he will definitely regulate things such as games consoles.

My other question, which the Minister may not be able to answer today, is about the prominence requirements for smart TV provision. If I say to my Fire TV Cube, “Alexa, play BBC News on BBC iPlayer,” I would expect it to do that. Provisions we come to later relate to asking a smart speaker to play something on a certain provider. That is about not just prominence on screen, but prominence when I use my voice to make a request of my Fire Cube. I hope and expect that Ofcom and the Government intend that I will get BBC News on BBC iPlayer, if that is what I ask my smart TV for. This is about not just navigating the system, but being able to make a request by voice.

Media Bill (Second sitting)

Debate between Kirsty Blackman and Damian Collins
Damian Collins (Folkestone and Hythe) (Con)

It strikes me that a lot of what the hon. Lady is talking about is relevant to the broadcasting code. It is Ofcom’s job to issue guidance in relation to the code and to take action if a broadcaster fails to meet its obligations. If Ofcom feels that a broadcaster has no intention of keeping within the remit of the code, it can withdraw its licence. That is the ultimate sanction, and one that Ofcom has already.

Kirsty Blackman

That is absolutely the case. However, this section of the Bill is about enforcing the public sector remit—sorry, I keep saying “public sector” when I mean “public service”; I spent too much time in local government. It is about enforcing the public service remit and amending this section of the Communications Act. The shadow Minister has made the case for allowing Ofcom to step in with a lighter touch. We do not want Ofcom to have to take licences away. If things are not going in the right direction, it is better for everyone if Ofcom can ensure proper provision and that everybody has access to the public service broadcasting that we would expect. We want Ofcom to have that earlier opportunity to step in and say, “Guys, it’s time to make some changes before it gets to the point of being beyond repair.”

Media Bill (First sitting)

Debate between Kirsty Blackman and Damian Collins
Kirsty Blackman

We covered a little of this in the last debate, in relation to access to terrestrial television services. As I said, there is still significant digital exclusion in our society when it comes to those who access television services and public service broadcasts through non-digital means.

It is possible to do what I do, which is to access television entirely through digital means—I have not had an aerial for a significant time. We moved into our house in 2016 and I am not aware that we have ever watched terrestrial television there, but we are lucky enough to have and be able to pay for a fast broadband connection and to live in a city where we can access one; we are not in any of the excluded and more vulnerable groups that find it more difficult to access television through on-demand means. A significant number of people can still access TV only through terrestrial services.

The amendments are about trying to pin the Minister down on what he means by “an overwhelming majority”. This is about looking at the numbers: is 98.5% of the population the kind of figure that the Minister was thinking about when he said “overwhelming majority”, or did he mean 60% or 70%? I am indebted to my hon. Friend the Member for Paisley and Renfrewshire North (Gavin Newlands), who, like me, has met Broadcast 2040+, which crafted these amendments. My hon. Friend is significantly more of a football fan than I am, and has specifically mentioned the fact that football viewing figures are higher for terrestrial TV than they are for subscription services. Removing access to terrestrial TV, which may happen at some point in the future and may need to happen at some point in the very distant future, will reduce the number of people able to access Scottish football. Therefore, in addition to the comments I was making about the educational provision available on television, I make the point that it is also important that there is the ability to view sport.

Yesterday in the Chamber, there was a ministerial update on the risk and resilience framework, which was published by the Government last year. Ministers have been at pains to state how much more transparency the framework enables than was the case previously. I appreciate the work that the Government are trying to do to update the national risk register, to ensure that it is as public as possible and that people are able to access this information. However, an incredibly important part of local resilience is being able to access up-to-date news, up-to-date and on-the-spot weather, and information when something significant happens.

I will give an example. Recently, there were significant floods in Brechin, which is just down the road from Aberdeen—although I am not sure that people in Brechin would want to be described in relation to Aberdeen; Brechin is a very lovely place in its own right and not just a neighbour of Aberdeen. People in Brechin saw really significant flooding, and a number of properties were evacuated. Without the ability to access information on what was happening through terrestrial TV or radio services, people would have been much less aware that the river was about to break its banks. If there is really significant wind—as there was, during the significant rain—accessing mobile phone masts, for example, is much more difficult. Terrestrial TV service masts, having been up for significantly longer, are significantly less likely to come down in the kinds of winds that we saw during Storm Arwen and Storm Babet, as weather events increase. In terms of resilience, it is important for people to be able to access that.

During the covid pandemic, people were glued to their television screens for updates about what was happening and the latest lockdown news. If some of our most vulnerable communities were struggling to access such content because, after the withdrawal of the terrestrial services, they did not have the broadband speeds necessary to watch television on demand, they would be less likely to be able to comply with and understand the law if another pandemic or national emergency happened.

It is important for the Government to know that they can reach the general population; that is how they could make the case for lockdown restrictions or ensure that people were aware of when the Queen sadly passed away last year. They can make those announcements and ensure people have the understanding and ability to know when significant national events have happened.

If people who are older, in poverty or otherwise digitally excluded are less likely to hear timeously about extreme weather or massive national events of incredible importance, then we further marginalise communities that are already struggling. As I said, I appreciate the Minister using the term “overwhelming majority” but I am just not confident enough that—

Damian Collins

The hon. Lady should recognise that such switchovers are possible only when the technology supports it, which is a question of changing the distribution mechanism at some point. That can lead to more choice.

Take the village in Kent where I live. When we had to do the switchover in 2012, the consequence of turning off the analogue signal and replacing it with a digital one was that we could get Channel 5, which people would otherwise not have been able to get at all. With the improvement in infrastructure, some people may see a significant improvement in services, but only where that infrastructure is ready.

Kirsty Blackman

I appreciate that and think it is important, but my point is about those who cannot get access and do not have the financial ability to do so. If we have a commitment to continue to provide terrestrial services and the legacy infrastructure, the providers of that infrastructure—the public service broadcasters—can continue to invest in it and not just say, “Well, the Government are going to allow us to turn it off in 2040 so there is no point in investing in it now. It has only got 17 years left to run, so we are just going to run the network down.” I am concerned that that may be the direction of travel.

Without a very clear commitment from the Government, I am worried that there will be a lack of investment in terrestrial services and that people will lose out. I would not want anybody to lose out on Channel 5 and I am very glad that people have access to it, but they need to have the choice. I would rather people had access to some public service broadcasting than none, which would be entirely possible if the digitally excluded could no longer access terrestrial TV services.

If the Minister made some really clear commitments today, that would be incredibly helpful. He may not be able to do that, in which case I may press some of the amendments. I will certainly be supporting the Labour party’s new clause. If the Minister cannot make more commitments, will he make clear the Government’s point of view about people likely to be excluded from taking part in a switchover, in relation to current investment in the network and investment to ensure that the network can last the next 15, 20, or 30 years? Would the Minister be happy to see that network diminish and for there to be a lack of investment so that services run down of their own accord, or would he prefer people to continue to be able to access them?

It would be great to have a little more clarity from the Government on the proposed direction of travel. I thank my hon. Friend the Member for Paisley and Renfrewshire North and also Broadcast 2040+ for all the work that they do to try to ensure that marginalised groups can continue to access public service broadcasting.

Media Bill

Debate between Kirsty Blackman and Damian Collins
2nd reading
Tuesday 21st November 2023

Commons Chamber
Kirsty Blackman

I did not even think about the TV schedule as something that people look at. I never look at a TV schedule. I do not know if my Fire Stick or my PlayStation has a TV schedule. On significant prominence, I was picturing the BBC iPlayer app being at the top of the apps list. Does the hon. Gentleman agree that Ofcom should look at both those things: how it appears on the screen and where the public service broadcasters are in any live schedule?

Damian Collins

The hon. Lady makes an important point. It should be easier to find through app stores. Although they are not directly in scope of the legislation because they are not broadcast formats in their own right, that question should be asked—is it easy to find? It should be easy to find on a connected device when it is turned on, and it should be easy to locate the apps.

Ofcom also has to consider whether the business model that underpins connected devices is fair to public service broadcasters. There is no doubt that the business model for Amazon and Google is to try to create a connected device space where all the entertainment exists and is tailored to each person. They also want to build the ad tech into that, so that they are the principal beneficiaries of the ad revenue, by monetising the placement of that content as well and diverting it away from broadcasters who have traditionally sold audiences to make money. That is the underlying problem that public service broadcasting faces today. The sale of audiences to generate advertising revenue to invest in programmes—the model that has fuelled independent public broadcasting for 50 years—is not broken, but it does not work in the way it used to; it is much more diffuse.

The revenue challenges that come from that are extremely real. That is why, on Channel 4, although I am pleased to see the Government’s changes to the remit, we need to keep a watching brief to see whether they go far enough. We have not gone as far as Channel 4 asked to go in its counter-offer to privatisation, which was the ability to go to the markets to raise money from private investors to create a programming fund that would invest £1 billion over two years in new programming. If we simply allow Channel 4 to acquire a stake in the making of programmes that it will broadcast, which will make revenue in the future, will that be enough now to meet the challenges that it will face? Given the ongoing pressures this year on declining ad revenue for TV broadcasting, we need to make sure that that will be enough. We should not assume that the measures in the Bill, which are welcome, will be the last word on that. There may be more challenges to come.

I would like to add two further points. It is right that we try to create more parity between the regulation of on-demand online services and broadcast television. If a viewer turns on their connected TV device, as far as they are concerned Netflix is as much television as the BBC, and there should be some parity in the way the platforms are regulated, the obligations they have to their users and the notifications they give about the suitability of the content. That should apply to advertising too. Often the debate we have is around advertising that targets children, but children are not watching live television; they are watching it on demand. The danger at the moment is that we have a highly regulated live broadcast television environment, but an almost completely unregulated online one. We should be far more worried about the ad rules that apply on YouTube than those on ITV, because that is where the children are. It is vital that the work on the Government’s online advertising review is completed at pace. The project has been worked on for a number of years. There needs to be proper enforceability of the advertising codes that have stood us in good stead in the broadcast world, but do not yet work in the same way online.

Finally, on media ownership and media freedom, which the Secretary of State mentioned in her opening remarks, we should give some consideration—maybe the Bill is not the right place—to the ownership of UK news companies and news assets, particularly if they are acquired by organisations based in jurisdictions overseas where maybe the regard for press freedom is not the same as it is in the UK. The Bill does not address that concern. If we have an ongoing concern about a vibrant news media landscape, there should be some concern about the companies that own media organisations—where they are based, what their interests are and what interest they have in the way the news is reported here. We do not want to see the press regulated in any way—we want to avoid that and in many ways the measures in the Bill are a nod to that as well—but we want certainty about safeguarding media freedom in the future.

ONLINE SAFETY BILL (Second sitting)

Debate between Kirsty Blackman and Damian Collins
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees
Damian Collins

I rise briefly to say that the introduction of the shields is a significant additional safety measure in the Bill and shows that the Government have thought about how to improve certain safety features as the Bill has progressed.

In the previous version of the Bill, as we have discussed at length, there were the priority legal offences that companies had to proactively identify and mitigate, and there were the measures on transparency and accountability on the terms of service. However, if pieces of content fell below the threshold for the priority legal offences or were not covered, or if they were not addressed in the terms of service, the previous version of the Bill never required the companies to act in any particular way. Reports might be done by Ofcom raising concerns, but there was no requirement for further action to be taken if the content was not a breach of platform policies or the priority safety duties.

The additional measure before us says that there may be content where there is no legal basis for removal, but users nevertheless have the right to have that content blocked. Many platforms offer ad tools already—they are not perfect, but people can opt in to say that they do not want to see ads for particular types of content—but there was nothing for the types of content covered by the Online Safety Bill, where someone could say, “I want to make sure I protect myself from seeing this at all,” and then, for the more serious content, “I expect the platforms to take action to mitigate it.” So this measure is an important additional level of protection for adult users, which allows them to give themselves the certainty that they will not see certain types of content and puts an important, additional duty on the companies themselves.

Briefly, on the point about gambling, the hon. Member for Aberdeen North is quite right to say that someone can self-exclude from gambling at the betting shop, but the advertising code already requires that companies do not target people who have self-excluded with advertising messages. As the Government complete their online advertising review, which is a separate piece of work, it is important that that is effectively enforced on big platforms, such as Facebook and Google, to ensure that they do not allow companies to advertise to vulnerable users in breach of the code. However, that can be done outside the Bill.

Kirsty Blackman

My concern is not just about advertising content or stuff that is specifically considered as an advert. If someone put up a TikTok video about how to cheat an online poker system, that would not be classed as an advert and therefore would not be caught. People would still be able to see it, and could not opt out.

Damian Collins

I totally appreciate the point that the hon. Lady makes, which is a different one. For gambling, the inducement to act straightaway often comes in the form of advertising. It usually comes in the form of free bets and immediate inducements to act. People who have self-excluded should not be targeted in that way. We need to ensure that that is rigorously enforced on online platforms too.

--- Later in debate ---
Kirsty Blackman

We have seen that just from the people in external organisations who have contacted us about the Bill. The expertise that they have brought to the table—expertise that we do not have—has significantly improved the debate and, hopefully, the Bill. Even prior to the consultations that have happened, that expertise encouraged the Minister to make the Bill better. Surely that is why the pre-legislative scrutiny Committee looked at the Bill—in order to improve it and to get expert advice. I still think that having specific access to expertise in order to analyse the transparency report has not been covered adequately.

Damian Collins

Annual transparency reporting is an important part of how the system will work. Transparency is one of the most important aspects of how the Online Safety Bill works, because without it companies can hide behind the transparency reports they produce at the moment, which give no transparency at all. For example, Facebook and YouTube report annually that their AI finds 95% of the hate speech they remove, but Frances Haugen said that they removed only 5% of the hate speech. So the transparency report means that they remove 95% of 5%, and that is one of the fundamental problems. The Bill gives the regulator the power to know, and the regulator then has to make informed decisions based on the information it has access to.

ONLINE SAFETY BILL (First sitting)

Debate between Kirsty Blackman and Damian Collins
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees
Rachel Maclean (Redditch) (Con)

I wish to add some brief words in support of the Government’s proposals and to build on the comments from Members of all parties.

We know that access to extreme and abusive pornography is a direct factor in violence against women and girls. We see that play out in the court system every day. People claim to have watched and become addicted to this type of pornography; they are put on trial because they seek to play that out in their relationships, which has resulted in the deaths of women. The platforms already have technology that allows them to figure out the age of people on their platforms. The Bill seeks to ensure that they use that for a good end, so I thoroughly support it. I thank the Minister.

Damian Collins (Folkestone and Hythe) (Con)

There are two very important and distinct issues here. One is age verification. The platforms ask adults who have identification to verify their age; if they cannot verify their age, they cannot access the service. Platforms have a choice within that. They can design their service so that it does not have adult content, in which case they may not need to build in verification systems—the platform polices itself. However, a platform such as Twitter, which allows adult content on an app that is open to children, has to build in those systems. As the hon. Member for Aberdeen North mentioned, people will also have to verify their identity to access a service such as OnlyFans, which is an adult-only service.

Kirsty Blackman

On that specific point, I searched on Twitter for the name—first name and surname—of a politician to see what people had been saying, because I knew that he was in the news. The pictures that I saw! That was purely by searching for the name of the politician; it is not as though people are necessarily seeking such stuff out.

Damian Collins

On these platforms, the age verification requirements are clear: they must age-gate the adult content or get rid of it. They must do one or the other. Rightly, the Bill does not specify technologies. Technologies are available. The point is that a company must demonstrate that it is using an existing and available technology or that it has some other policy in place to remedy the issue. It has a choice, but it cannot do nothing. It cannot say that it does not have a policy on it.

Age assurance is always more difficult for children, because they do not have the same sort of ID that adults have. However, technologies exist: for instance, Yoti uses facial scanning. Companies do not have to do that either; they have to demonstrate that they do something beyond self-certification at the point of signing up. That is right. Companies may also demonstrate what they do to take robust action to close the accounts of children they have identified on their platforms.

If a company’s terms of service state that people must be 13 or over to use the platform, the company is inherently stating that the platform is not safe for someone under 13. What does it do to identify people who sign up? What does it do to identify people once they are on the platform, and what action does it then take? The Bill gives Ofcom the powers to understand those things and to force a change of behaviour and action. That is why—to the point made by the hon. Member for Pontypridd—age assurance is a slightly broader term, but companies can still extract a lot of information to determine the likely age of a child and take the appropriate action.

Paul Scully

I think we are all in agreement, and I hope that the Committee will accept the amendments.

Amendment 1 agreed to.

Amendments made: 2, in clause 11, page 10, line 25, leave out

“(for example, by using age assurance)”.

This amendment omits words which are no longer necessary in subsection (3)(b) of clause 11 because they are dealt with by the new subsection inserted by Amendment 3.

Amendment 3, in clause 11, page 10, line 26, at end insert—

“(3A) Age assurance to identify who is a child user or which age group a child user is in is an example of a measure which may be taken or used (among others) for the purpose of compliance with a duty set out in subsection (2) or (3).”—(Paul Scully.)

This amendment makes it clear that age assurance measures may be used to comply with duties in clause 11(2) as well as (3) (safety duties protecting children).

--- Later in debate ---
Damian Collins

Does the hon. Lady accept that the amendments would give people control over the bit of the service that they do not currently have control of? A user can choose what to search for and which members to engage with, and can block people. What they cannot do is stop the recommendation feeds recommending things to them. The shields intervene there, which gives user protection, enabling them to say, “I don’t want this sort of content recommended to me. On other things, I can either not search for them, or I can block and report offensive users.” Does she accept that that is what the amendment achieves?

Kirsty Blackman

I think that that is what the clause achieves, rather than the amendments that I have tabled. I recognise that the clause achieves that, and I have no concerns about it. It is good that the clause does that; my concern is that it does not take the second step of blocking access to certain features on the platform. For example, somebody could be having a great time on Instagram looking at various people’s pictures or whatever, but they may not want to be bombarded with private messages. They have no ability to turn off the private messaging section.

Damian Collins

They can disengage with the user who is sending the messages. On a Meta platform, often those messages will be from someone they are following or engaging with. They can block them, and the platforms have the ability, in most in-app messaging services, to see whether somebody is sending priority illegal content material to other users. They can scan for that and mitigate that as well.

Kirsty Blackman

That is exactly why users should be able to block private messaging in general. Someone on Twitter can say, “I’m not going to receive a direct message from anybody I don’t follow.” Twitter users have the opportunity to do that, but there is not necessarily that opportunity on all platforms. We are asking for those things to be included, so that the provider can say, “You’re using private messaging inappropriately. Therefore, we are blocking all your access to private messaging,” or, “You are being harmed as a result of accessing private messaging. Therefore, we are blocking your access to any private messaging. You can still see pictures on Instagram, but you can no longer receive any private messages, because we are blocking your access to that part of the site.” That is very different from blocking a user’s access to certain kinds of content, for example. I agree that that should happen, but it is about the functionalities and stopping access to some of them.

We are not asking Ofcom to mandate that platforms take this measure; they could still take the slightly more nuclear option of banning somebody entirely from their service. However, if this option is included, we could say, “Your service is doing pretty well, but we know there is an issue with private messaging. Could you please take action to ensure that those people who are using private messaging to harm children no longer have access to private messaging and are no longer able to use the part of the service that enables them to do these things?” Somebody might be doing a great job of making games in Roblox, but they may be saying inappropriate things. It may be proportionate to block that person entirely, but it may be more proportionate to block their access to voice chat, so that they can no longer say those things, or direct message or contact anybody. It is about proportionality and recognising that the service is not necessarily inherently harmful but that specific parts of it could be.

Kirsty Blackman

I completely agree. The hon. Member put that much better than I could. I was trying to formulate that point in my head, but had not quite got there, so I appreciate her intervention. She is right: we should not put the onus on a victim to deal with a situation. Once they have seen a message from someone, they can absolutely block that person, but that person could create another account and send them messages again. People should be able to choose, and to say, “No, I don’t want anyone to be able to send me private messages,” or “I don’t want any private messages from anyone I don’t know.” We could put in those safeguards.

I am talking about adding another layer to the clause, so that companies would not necessarily have to demonstrate that it was proportionate to ban a person from using their service, as that may be too high a bar—a concern I will come to later. They could, however, demonstrate that it was proportionate to ban a person from using private messaging services, or from accessing livestreaming features. There has been a massive increase in self-generated child sexual abuse images, and a huge amount has come from livestreaming. There are massive risks with livestreaming features on services.

Livestreaming is not always bad. Someone could livestream themselves showing how to make pancakes. There is no issue with that—that is grand—but livestreaming is being used by bad actors to manipulate children into sharing videos of themselves, and once they are on the internet, they are there forever. It cannot be undone. If we were able to ban vulnerable users—my preferred option would be all children—from accessing livestreaming services, they would be much safer.

Damian Collins

The hon. Lady is talking about extremely serious matters. My expectation is that Ofcom would look at all of a platform’s features when risk-assessing the platform and enforcing safety, and in-app messaging services would not be exempt. Platforms have to demonstrate what they would do to mitigate harmful and abusive behaviour, and that they would take action against the accounts responsible.

--- Later in debate ---
Kirsty Blackman

Absolutely, I agree, but the problem is with the way the Bill is written. It does not suggest that a platform could stop somebody accessing a certain part of a service. The Bill refers to content, and to the service as a whole, but it does not have that middle point that I am talking about.

Damian Collins

A platform is required to demonstrate to Ofcom what it would do to mitigate activity that would breach the safety duties. It could do that through a feature that it builds in, or it may take a more draconian stance and say, “Rather than turning off certain features, we will just suspend the account altogether.” That could be discussed in the risk assessments, and agreed in the codes of practice.

Kirsty Blackman

What I am saying is that the clause does not actually allow that middle step. It does not explicitly say that somebody could be stopped from accessing private messaging. The only options are being banned from certain content, or being banned from the entire platform.

I absolutely recognise the hard work that Ofcom has done, and I recognise that it will work very hard to ensure that risks are mitigated, but the amendment ensures what the Minister intended with this legislation. I am not convinced that he intended there to be just the two options that I outlined. I think he intended something more in line with what I am suggesting in the amendment. It would be very helpful if the Minister explicitly said something in this Committee that makes it clear that Ofcom has the power to say to platforms, “Your risk assessment says that there is a real risk from private messaging”—or from livestreaming—“so why don’t you turn that off for all users under 18?” Ofcom should be able to do that.

Could the Minister be clear that that is the direction of travel he is hoping and intending that Ofcom will take? If he could be clear on that, and will recognise that the clause could have been slightly better written to ensure Ofcom had that power, I would be quite happy to not push the amendment to a vote. Will the Minister be clear about the direction he hopes will be taken?

--- Later in debate ---
Kirsty Blackman

Absolutely. The amendment I tabled regarding the accessibility of terms of service was designed to ensure that if the Government rely on terms of service, children can access those terms of service and are able to see what risks they are putting themselves at. We know that in reality children will not read these things. Adults do not read these things. I do not know what Twitter’s terms of service say, but I do know that Twitter managed to change its terms of service overnight, very easily and quickly. Companies could just say, “I’m a bit fed up with Ofcom breathing down my neck on this. I’m just going to change my terms of service, so that Ofcom will not take action on some of the egregious harm that has been done. If we just change our terms of service, we don’t need to bother. If we say that we are not going to ban transphobia on our platform—if we take that out of the terms of service—we do not need to worry about transphobia on our platform. We can just let it happen, because it is not in our terms of service.”

Damian Collins

Does the hon. Lady agree that the Government are not relying solely on terms of service, but are rightly saying, “If you say in your terms of service that this is what you will do, Ofcom will make sure that you do it”? Ofcom will take on that responsibility for people, making sure that these complex terms of service are understood and enforced, but the companies still have to meet all the priority illegal harms objectives that are set out in the legislation. Offences that exist in law are still enforced on platforms, and risk-assessed by Ofcom as well, so if a company does not have a policy on race hate, we have a law on race hate, and that will apply.

Kirsty Blackman

It is absolutely the case that those companies still have to do a risk assessment, and a child risk assessment if they meet the relevant criteria. The largest platforms, for example, will still have to do a significant amount of work on risk assessments. However, every time a Minister stands up and talks about what they are requiring platforms and companies to do, they say, “Companies must stick to their terms of service. They must ensure that they enforce things in line with their terms of service.” If a company is finding it too difficult, it will just take the tough things out of its terms of service. It will take out transphobia, it will take out abuse. Twitter does not ban anyone for abuse anyway, it seems, but it will be easier for Twitter to say, “Ofcom is going to try to hold us to account for the fact that we are not getting rid of people for abusive but not illegal messages, even though we say in our terms of service, ‘You must act with respect’, or ‘You must not abuse other users’. We will just take that out of our terms of service so that we are not held to account for the fact that we are not following our terms of service.” Then, because the abuse is not illegal—because it does not meet that bar—those places will end up being even less safe than they are right now.

For example, occasionally Twitter does act in line with its terms of service, which is quite nice: it does ban people who are behaving inappropriately, but not necessarily illegally, on its platform. However, if it is required to implement that across the board for everybody, it will be far easier for Twitter to say, “We’ve sacked all our moderators—we do not have enough people to be able to do this job—so we will just take it all out of the terms of service. The terms of service will say, ‘We will ban people for sharing illegal content, full stop.’” We will end up in a worse situation than we are currently in, so the reliance on terms of service causes me a big, big problem.

Turning to amendment 100, dealing specifically with the accessibility of this feature for child users, I appreciate the ministerial clarification, and agree that my amendment could have been better worded and potentially causes some problems. However, can the Minister talk more about the level of accessibility? I would like children to be able to see a version of the terms of service that is age-appropriate, so that they understand what is expected of them and others on the platform, and understand when and how they can make a report and how that report will be acted on. The kids who are using Discord, TikTok or YouTube are over 13—well, some of them are—so they are able to read and understand, and they want to know how to make reports and for the reporting functions to be there. One of the biggest complaints we hear from kids is that they do not know how to report things they see that are disturbing.

A requirement for children to have an understanding of how reporting functions work, particularly on social media platforms where people are interacting with each other, and of the behaviour that is expected of them, does not mean that there cannot be a more in-depth and detailed version of the terms of service, laying out potential punishments using language that children may not be able to understand. The amendment would specifically ensure that children have an understanding of that.

We want children to have a great time on the internet. There are so many ace things out there and wonderful places they can access. Lego has been in touch, for example; its website is really pretty cool. We want kids to be able to access that stuff and communicate with their friends, but we also want them to have access to features that allow them to make reports that will keep them safe. If children are making reports, then platforms will say, “Actually, there is real problem with this because we are getting loads of reports about it.” They will then be able to take action. They will be able to have proper risk assessments in place because they will be able to understand what is disturbing people and what is causing the problems.

I am glad to hear the Minister’s words. If he were even more clear about the fact that he would expect children to be able to understand and access information about keeping themselves safe on the platforms, then that would be even more helpful.

--- Later in debate ---
Damian Collins

Can the hon. Lady tell me where in the Bill, as it is currently drafted—so, unamended—it requires platforms to remove legal speech?

Kirsty Blackman

It allows the platforms to do that, and it requires legal but harmful content to be taken into account. It requires the platforms to act—to consider, through risk assessments, the harm done to adults by content that is legal but massively harmful.

Damian Collins

The hon. Lady is right: the Bill does not require the removal of legal speech. Platforms must take the issue into account—it can be risk assessed—but it is ultimately their decision. I think the point has been massively overstated that, somehow, previously, Ofcom had the power to strike down legal but harmful speech that was not a breach of either terms of service or the law. It never had that power.

Kirsty Blackman

Why do the Government now think that there is a risk to free speech? If Ofcom never had that power and it was never an issue, why are the Government bothered about that risk? If it was never a consideration, it obviously was not a risk to free speech, so I am now even more confused as to why the Government have decided that they have to strip this measure out of the Bill on free speech grounds. This is some of the most important stuff in the Bill for the protection of adults, and the Government are keen to remove it.

--- Later in debate ---
Kirsty Blackman

No, I will not give way again. The change will ensure that people can absolutely say what they like online, but the damage and harm that it will cause are not balanced by the freedoms that have been won.

Damian Collins

As a Back-Bench Member of Parliament, I recommended that the “legal but harmful” provisions be removed from the Bill. When I chaired the Joint Committee of both Houses of Parliament that scrutinised the draft Bill, it was the unanimous recommendation of the Committee that the “legal but harmful” provisions be removed. As a Minister at the Dispatch Box, I said that I thought “legal but harmful” was a problematic term and we should not use it. The term “legal but harmful” does not exist in the Bill, and has never existed in the Bill, but it has provoked a debate that has caused huge confusion. There is a belief, which we have heard expressed in debate today, that somehow there are categories of content that Ofcom can designate for removal whether they are unlawful or not.

During the Bill’s journey from publication in draft to where we are today, it has become more specific. Rather than our relying on general duties of care, written into the Bill are areas of priority illegal activity that the companies must proactively look for, monitor and mitigate. In the original version of the Bill, that included only terrorist content and child sexual exploitation material, but on the recommendation of the Joint Committee, the Government moved in the direction of writing into the Bill at schedule 7 offences in law that will be the priority illegal offences.

The list of offences is quite wide, and it is more comprehensive than any other such list in the world in specifying exactly what offences are in scope. There is no ambiguity for the platforms as to what offences are in scope. Stalking, harassment and inciting violence, which are all serious offences, as well as the horrible abuse a person might receive as a consequence of their race or religious beliefs, are written into the Bill as priority illegal offences.

There has to be a risk assessment of whether such content exists on platforms and what action platforms should take. They are required to carry out such a risk assessment, although that was never part of the Bill before. The “legal but harmful” provisions in some ways predate that. Changes were made; the offences were written into the Bill, risk assessments were provided for, and Parliament was invited to create new offences and write them into the Bill, if there were categories of content that had not been captured. In some ways, that creates a democratic lock that says, “If we are going to start to regulate areas of speech, what is the legal reason for doing that? Where is the legal threshold? What are the grounds for us taking that decision, if it is something that is not already covered in platforms’ terms of service?”

We are moving in that direction. We have a schedule of offences that we are writing into the Bill, and those priority illegal offences cover most of the most serious behaviour and most of the concerns raised in today’s debate. On top of that, there is a risk assessment of platforms’ terms of service. When we look at the terms of service of the companies—the major platforms we have been discussing—we see that they set a higher bar again than the priority illegal harms. On the whole, platforms do not have policies that say, “We won’t do anything about this illegal activity, race hate, incitement to violence, or promotion or glorification of terrorism.” The problem is that although they have terms of service, they do not enforce them. Therefore, we are not relying on terms of service. What we are saying, and what the Bill says, is that the minimum safety standards are based on the offences written into the Bill. In addition, we have risk assessment, and we have enforcement based on the terms of service.

There may be a situation in which there is a category of content that is not in breach of a platform’s terms of service and not included in the priority areas of illegal harm. It is very difficult to think of what that could be—something that is not already covered, and over which Ofcom would not have power. There is the inclusion of the new offences of promoting self-harm and suicide. That captures not just an individual piece of content, but the systematic effect of a teenager like Molly Russell—or an adult of any age—being targeted with such content. There are also new offences for cyber-flashing, and there is Zach’s law, which was discussed in the Chamber on Report. We are creating and writing into the Bill these new priority areas of illegal harm.

Freedom of speech groups’ concern was that the Government could have a secret list of extra things that they also wanted risk-assessed, rather than enforcement being clearly based either on the law or on clear terms of service. It is difficult to think of categories of harm that are not already captured in terms of service or priority areas of illegal harm, and that would be on such a list. I think that is why the change was made. For freedom of speech campaigners, there was a concern about exactly what enforcement was based on: “Is it based on the law? Is it based on terms of service? Or is it based on something else?”

I personally believed that the “legal but harmful” provisions in the Bill, as far as they existed, were not an infringement on free speech, because there was never a requirement to remove legal speech. I do not think the removal of those clauses from the Bill suddenly creates a wild west in which no enforcement will take place at all. There will be very effective enforcement based on the terms of service, and on the schedule 7 offences, which deal with the worst kinds of illegal activity; there is a broad list. The changes make it much clearer to everybody—platforms and users alike, and Ofcom—exactly what the duties are, how they are enforced and what they are based on.

For future regulation, we have to use this framework, so that we can say that when we add new offences to the scope of the legislation, they are offences that have been approved by Parliament and have gone through a proper process, and are a necessary addition because terms of service do not cover them. That is a much clearer and better structure to follow, which is why I support the Government amendments.

Online Harms

Debate between Kirsty Blackman and Damian Collins
Wednesday 26th October 2022

Westminster Hall

Damian Collins

As the hon. Lady knows, I can speak to the Bill; I cannot speak to the business of the House—that is a matter for the business managers in the usual way. Department officials—some here and some back at the Department—have been working tirelessly on the Bill to ensure we can get it in a timely fashion. I want to see it complete its Commons stages and go to the House of Lords as quickly as possible. Our target is to ensure that it receives safe passage in this Session of Parliament. Obviously, I cannot talk to the business of the House, which may alter as a consequence of the changes to Government.

Kirsty Blackman

On that point, will the Minister assure us that he will push for the Bill to come back? Will he make the case to the business managers that the Bill should come back as soon as possible, in order to fulfil his aim of having it pass in this Session of Parliament?

Damian Collins

As the hon. Lady knows, I cannot speak to the business of the House. What I would say is that the Department has worked tirelessly to ensure the safe passage of the Bill. We want to see it on the Floor of the House as quickly as possible—our only objective is to ensure that that happens. I hope that the business managers will be able to confirm shortly when that will be. Obviously, the hon. Lady can raise the issue herself with the Leader of the House at the business statement tomorrow.

Online Safety Bill

Debate between Kirsty Blackman and Damian Collins
Damian Collins

My hon. Friend raises an important point that deserves further consideration as the Bill progresses through its parliamentary stages. There is, of course, still a general presumption that any activity that is illegal offline and that could also constitute illegal activity online—for example, promoting or sharing content that could incite people to commit violent acts—is within scope of the legislation. There are some priority illegal offences, which are set out in schedule 7, but the non-priority offences also apply if a company is made aware of content that is likely to be in breach of the law. I certainly think this is worth considering in that context.

In addition, the Bill makes it clear that platforms have duties to mitigate the risk of their service facilitating an offence, including where that offence may occur on another site, such as can occur in cross-platform child sexual exploitation and abuse—CSEA—offending, or even offline. This addresses concerns raised by a wide coalition of children’s charities that the Bill did not adequately tackle activities such as breadcrumbing—an issue my hon. Friend the Member for Solihull (Julian Knight), the Chair of the Select Committee, has raised in the House before—where CSEA offenders post content on one platform that leads to offences taking place on a different platform.

We have also tabled new clause 14 and a related series of amendments in order to provide greater clarity about how in-scope services should determine whether they have duties with regard to content on their services. The new regulatory framework requires service providers to put in place effective and proportionate systems and processes to improve user safety while upholding free expression and privacy online. The systems and processes that companies implement will be tailored to the specific risk profile of the service. However, in many cases the effectiveness of companies’ safety measures will depend on them making reasonable judgments about types of content. Therefore, it is essential to the effective functioning of the framework that there is clarity about how providers should approach these judgments. In particular, such clarity will safeguard against companies over-removing innocuous content if they wrongly assume mental elements are present, or under-removing content if they act only where all elements of an offence are established beyond reasonable doubt. The amendments make clear that companies must consider all reasonably available contextual information when determining whether content is illegal content, a fraudulent advert, content that is harmful to children, or content that is harmful to adults.

Kirsty Blackman (Aberdeen North) (SNP)

I was on the Bill Committee and we discussed lots of things, but new clause 14 was not discussed: we did not have conversations about it, and external organisations have not been consulted on it. Is the Minister not concerned that this is a major change to the Bill and it has not been adequately consulted on?

Damian Collins

As I said earlier, in establishing the threshold for priority illegal offences, the current threshold of laws that exist offline should provide good guidance. I would expect that as the codes of practice are developed, we will be able to make clear what those offences are. On the racial hatred that the England footballers received after the European championship football final, people have been prosecuted for what they posted on Twitter and other social media platforms. We know what race hate looks like in that context, we know what the regulatory threshold should look like, and we know the sort of content we are trying to regulate. I expect that, in the codes of practice, Ofcom can be very clear with companies about what we expect, where the thresholds are and where we expect them to take enforcement action.

--- Later in debate ---
Kirsty Blackman

I will try to avoid too much preamble, but I thank the former Minister, the hon. Member for Croydon South (Chris Philp), for all his work in Committee and for listening to my nearly 200 contributions, for which I apologise. I welcome the new Minister to his place.

As time has been short today, I am keen to meet the Minister to discuss my new clauses and amendments. If he cannot meet me, I would be keen for him to meet the NSPCC, in particular, on some of my concerns.

Amendment 196 is about using proactive technology to identify CSEA content, which we discussed at some length in Committee. The hon. Member for Croydon South made it very clear that we should use scanning to check for child sexual abuse images. My concern is that new clause 38, tabled by the Lib Dems, might exclude proactive scanning to look for child sexual abuse images. I hope that the Government do not lurch in that direction, because we need proactive scanning to keep children protected.

New clause 18 specifically addresses child user empowerment duties. The Bill currently requires that internet service providers have user empowerment duties for adults but not for children, which seems bizarre. Children need to be able to say yes or no. They should be able to make their own choices about excluding content and not receiving unsolicited comments or approaches from anybody not on their friend list, for example. Children should be allowed to do that, but the Bill explicitly says that user empowerment duties apply only to adults. New clause 18 is almost a direct copy of the adult user empowerment duties, with a few extra bits added. It is important that children have access to user empowerment.

Amendment 190 addresses habit-forming features. I have had conversations about this with a number of organisations, including The Mix. I regularly accessed its predecessor, The Site, more than 20 years ago, and it is concerned that 42% of young people surveyed by YoungMinds show addiction-like behaviour in what they are accessing on social media. There is nothing on that in this Bill. The Mix, the Mental Health Foundation, the British Psychological Society, YoungMinds and the Royal College of Psychiatrists are all unhappy about the Bill’s failure to regulate habit-forming features. It is right that we provide support for our children, and it is right that our children are able to access the internet safely, so it is important to address habit-forming behaviour.

Amendment 162 addresses child access assessments. The Bill currently says that providers need to do a child access assessment only if there is a “significant” number of child users. I do not think that is enough and I do not think it is appropriate, and the NSPCC agrees. The amendment would remove the word “significant.” OnlyFans, for example, should not be able to dodge the requirement to child risk assess its services because it does not have a “significant” number of child users. These sites are massively harmful, and we need to ensure changes are made so they cannot wriggle out of their responsibilities.

Finally, amendment 161 is about live, one-to-one oral communications. I understand why the Government want to exempt live, one-to-one oral communications, as they want to ensure that phone calls continue to be phone calls, which is totally fine, but they misunderstand the nature of things like Discord and how people communicate on Fortnite, for example. People are having live, one-to-one oral communications, some of which are used to groom children. We cannot explicitly exempt them and allow a loophole for perpetrators of abuse in this Bill. I understand what the Government are trying to do, but they need to do it in a different way so that children can be protected from the grooming behaviour we see on some online platforms.

Once again, if the Minister cannot accept these amendments, I would be keen to meet him. If he cannot do that, I ask that he meet the NSPCC.

Damian Collins

We have had a wide-ranging debate of passion and expert opinion from Members in all parts of the House, which shows the depth of interest in this subject, and the depth of concern to see the Bill delivered and to make sure we get it right. I speak as someone who only a couple of days ago became the Minister for online safety, although I was previously involved in engaging with the Government on this subject. As I said in my opening remarks, this has been an iterative process, where Members from across the House have worked successfully with the Government to improve the Bill. That is the spirit in which we should complete its stages, both in the Commons and in the Lords, and look at how we operate this regime when it has been created.

I wish to start by addressing remarks made by the hon. Member for Pontypridd (Alex Davies-Jones), the shadow Minister, and by the hon. Member for Cardiff North (Anna McMorrin) about violence against women and girls. There is a slight assumption that if the Government do not accept an amendment that writes “violence against women and girls” into the priority harms in the Bill, somehow the Bill does not address that issue. I think we would all agree that that is not the case. The provisions on harmful content that is directed at any individual, particularly the new harms offences approved by the Law Commission, do create offences in respect of content that is likely to lead to actual physical harm or severe psychological harm. As the father of a teenage girl, who was watching earlier but has now gone to do better things, I say that the targeting of young girls, particularly vulnerable ones, with content that is likely to make them more vulnerable is one of the most egregious aspects of the way social media works. It is right that we are looking to address serious levels of self-harm and suicide in the Bill and in the transparency requirements. We are addressing the self-harm and suicide content that falls below the illegal threshold, where a vulnerable young girl is sent and prompted with content that can make her more vulnerable and could lead her to harm herself, or worse. It is absolutely right that that is within the scope of the Bill.

New clause 3, perfectly properly, cites international conventions on violence against women and girls, and how that is defined. At the moment, with the way the Bill is structured, the schedule 7 offences are all based on existing areas of UK law, where there is an existing, clear criminal threshold. Those offences, which are listed extensively, will all apply as priority areas of harm. If there is, through the work of the Law Commission or elsewhere, a clear legal definition of misogyny and violence against women and girls that is not included, I think it should be included within scope. However, if new clause 3 were approved as tabled, it would create a very different sort of offence, where it would not be as clear where the criminal threshold applied, because it is not cited against existing legislation. My view, and that of the Government, is that existing legislation covers the sorts of offences and breadth of offences that the shadow Minister rightly mentioned, as did other Members. We should continue to look at this—