Maria Miller debates involving the Department for Digital, Culture, Media & Sport

Online Safety Bill (Fifth sitting)

Maria Miller Excerpts
Dan Carden (Liverpool, Walton) (Lab)

It is a pleasure to serve on the Committee. I want to apologise for missing the evidence sessions. Unfortunately, I came down with covid, but I have been following the progress of the Committee.

This is important legislation. We spend so much of our lives online these days, yet there has never been an attempt to regulate the space, or for democratically elected Members to contribute towards its regulation. Clause 1 gives a general outline of what to expect in the Bill. I have no doubt that this legislation is required, but also that it will not get everything right, and that it will have to change over the years. We may see many more Bills of this nature in this place.

I have concerns that some clauses have been dropped, and I hope that there will be future opportunities to amend the Bill, not least with regard to how we educate and ensure that social media companies promote media literacy, so that information that is spread widely online is understood in its context—that it is not always correct or truthful. The Bill, I hope, will go some way towards ensuring that we can rely more on the internet, which should provide a safer space for all its users.

Dame Maria Miller (Basingstoke) (Con)

May I join others in welcoming line-by-line scrutiny of the Bill? I am sure that the Minister will urge us to ensure that we do not make the perfect the enemy of the good. This is a very lengthy and complex Bill, and a great deal of time and scrutiny has already gone into it. I am sure that we will all pay due regard to that excellent work.

The hon. Member for Pontypridd is absolutely right to say that in many ways the world is watching what the Government are doing regarding online regulation. This will set a framework for many countries around the world, and we must get it right. We are ending the myth that social media and search engines are not responsible for their content. Their use of algorithms alone demonstrates that, while they may not publish all of the information on their sites, they are the editors at the very least and must take responsibility.

We will no doubt hear many arguments about the importance of free speech during these debates and others. I would like gently to remind people that there are many who feel that their free speech is currently undermined by the way in which the online world operates. Women are subject to harassment and worse online, and children are accessing inappropriate material. There are a number of areas that require specific further debate, particularly around the safeguarding of children, adequate support for victims, ensuring that the criminal law is future-proof within this framework, and ensuring that we pick up on the comments made in the evidence sessions regarding the importance of guidance and codes of practice. It was slightly shocking to hear from some of those giving evidence that the operators did not know what was harmful, as much has been written about the harm caused by the internet.

I will listen keenly to the Minister’s responses on guidance and codes of practice, and secondary legislation more generally, because it is critical to how the Bill works. I am sure we will have many hours of interesting and informed debate on this piece of legislation. While there has already been a great deal of scrutiny, the Committee’s role is pivotal to ensure that the Bill is as good as it can be.

Question put and agreed to.

Clause 1 accordingly ordered to stand part of the Bill.

Clause 2

Key Definitions

Question proposed, That the clause stand part of the Bill.

The Chair

With this it will be convenient to discuss the following:

Clause 3 stand part.

That schedules 1 and 2 be the First and Second schedules to the Bill.

Clause 4 stand part.

--- Later in debate ---
Kirsty Blackman

Thank you, Sir Roger.

I do not want to get sidetracked, but I agree that there is a major parental knowledge gap. Tomorrow’s parents will have grown up on the internet, so in 20 years’ time we will not have that knowledge gap, but today media literacy is lacking, particularly among parents as well as among children. In Scotland, media literacy is embedded in the curriculum; I am not entirely sure what the system is in the rest of the UK. My children are learning media literacy in school, but there is still a media literacy gap for parents. My local authority is doing a media literacy training session for parents tomorrow night, which I am very much looking forward to attending so that I can find out even more about how to keep my children safe online.

I was asking the Minister about the App Store and the Google Play Store. I do not need an answer today, but one at some point would be really helpful. Do the App Store, the Google Play Store and other stores of that nature fall under the definition of search engines or of user-to-user content? The reality is that if somebody creates an app, presumably they are a user. Yes, it has to go through an approval process by Apple or Google, but once it is accepted by them, it is not owned by them; it is still owned by the person who generated it. Therefore, are those stores considered search engines, in that they are simply curating content, albeit moderated content, or are they considered user-to-user services?

That is really important, particularly when we are talking about age verification and children being able to access various apps. The stores are the key gateways where children get apps. Once they have an app, they can use all the online services that are available on it, in line with whatever parental controls parents choose to put in place. I would appreciate an answer from the Minister, but he does not need to provide it today. I am happy to receive it at a later time, if that is helpful.

Dame Maria Miller

I want to pick up on two issues, which I hope the Minister can clarify in his comments at the end of this section.

First, when we took evidence, the Internet Watch Foundation underlined the importance of end-to-end encryption being in scope of the Bill, so that it does not lose the ability to pick up child abuse images, as has already been referred to in the debate. The ability to scan end-to-end encrypted content is crucial. Will the Minister clarify whether that is in scope, and whether the IWF will be able to continue its important work in safeguarding children?

Kirsty Blackman

A number of people have raised concerns about freedom of speech in relation to end-to-end encryption. Does the right hon. Lady agree with me that there should not be freedom of speech when it comes to child sexual abuse images, and that it is reasonable for those systems to check for such images?

Dame Maria Miller

The hon. Lady is right to pick up on the nuance and the balance that we have to strike in legislation between freedom of speech and the protection of vulnerable individuals and children. I do not think there can be many people, particularly among those here today, who would want anything to trump the safeguarding of children. Will the Minister clarify exactly how the Bill works in relation to such important work?

Secondly, it is important that the Government have made the changes to schedule 2. They have listened closely on the issue of pornography and extended the provisions of the Bill to cover commercial pornography. However, the hon. Member for Pontypridd mentioned nudification software, and I am unclear whether the Bill would outlaw such software, which is designed to sexually harass women. That software takes photographs only of women, because its database relates only to female figures, and makes them appear to be completely naked. Does that software fall in scope of the Bill? If not, will the Minister do something about that? The software is available and we have to regulate it to ensure that we safeguard women’s rights to live without harassment in their day-to-day life.

Dan Carden

This part of the Bill deals with the definitions of services and which services would be exempt. I consider myself a millennial; most people my age or older are Facebook and Twitter users, and people a couple of years younger might use TikTok and other services. The way in which the online space is used by different generations, particularly by young people, changes rapidly. Given the definitions in the Bill, how does the Minister intend to keep pace with the changing ways in which people communicate? Most online games now allow interaction between users in different places, which was not the case a few years ago. Understanding how the Government intend the Bill to keep up with such changes is important. Will the Minister tell us about that?

--- Later in debate ---
Kirsty Blackman

I want to make a couple of brief comments. Unfortunately, my hon. Friend the Member for Ochil and South Perthshire is not here as, ironically, he is at the DCMS Committee taking evidence on the Online Safety Bill. That is a pretty unfortunate clash of timing, but that is why I am here solo for the morning.

I wanted to make a quick comment on schedule 7. The Minister will have heard the evidence given on schedule 7 and the fact that the other schedules, particularly schedule 6, have a Scotland-specific section detailing the Scottish legislation that applies. Schedule 7 has no Scotland-specific section and does not adequately cover the Scottish legislation. I appreciate that the Minister has tabled amendment 126, which deals with Scottish and Northern Irish legislation that may differ from the legislation for England and Wales, but will he give me some comfort that he does intend Scottish-specific offences to be added to schedule 7 through secondary legislation? There is a difference between an amendment on how to add them and a commitment that they will be added if necessary and if he feels that that will add something to the Bill. If he could commit to that—obviously, in discussion with Scottish Ministers if amendment 126 is agreed—I would appreciate it. It would give me a measure of comfort and would assist, given the oral evidence we heard, in overcoming some of the concerns raised about schedule 7 and the lack of inclusion of Scottish offences.

Dame Maria Miller

In many ways, clause 6 is the central meat of the Bill. It brings into play a duty of care, which means that people operating online will be subject to the same rules as the rest of us when it comes to the provision of services. But when it comes to the detail, the guidance and codes that will be issued by Ofcom will play a central role. My question for the Minister is this: in the light of the evidence that we received—I think in panel three—where the providers were unable to define what was harmful because they had not yet seen codes of practice from Ofcom, can he update us on when those codes and guidance might be available? I understand thoroughly why they may not be available at this point, and they certainly should not form part of the Bill, because they need to be flexible enough to be changed in future, but it is important that we know how the guidance and codes work and that they work properly.

Will the Minister update the Committee on what further consideration he and other Ministers have given to the establishment of a standing committee to scrutinise the implementation of the Bill? Unless we have that in place, it will be difficult to know whether his legislation will work.

Dan Carden

Some of the evidence we heard suggested that the current precedent was that the Secretary of State had very little to do with independent regulators in this realm, but that the Bill overturns that precedent. Does the right hon. Lady have any concerns that the Bill hands too much power to the Secretary of State to intervene and influence regulators that should be independent?

Dame Maria Miller

The hon. Gentleman brings up an important point. We did hear about that in the evidence. I have no doubt the Secretary of State will not want to interfere in the workings of Ofcom. Having been in his position, I know there would be no desire for the Department to get involved in that, but I can understand why the Government might want the power to ensure things are working as they should. Perhaps the answer to the hon. Gentleman’s question is to have a standing committee scrutinising the effectiveness of the legislation and the way in which it is put into practice. That committee could be a further safeguard against what he implies: an unnecessary overreach of the Secretary of State’s powers.

Kirsty Blackman

Thank you, Sir Roger, for allowing me to intervene again. I was not expecting the standing committee issue to be brought up at this point, but I agree that there needs to be a post-implementation review of the Bill. I asked a series of written questions to Departments about post-legislative review and whether legislation that the Government have passed has had the intended effect. Most of the Departments that answered could not provide information on the number of post-legislative reviews. Of those that could provide me with the information, none of them had managed to do 100% of the post-implementation reviews that they were supposed to do.

It is important that we know how the Bill’s impact will be scrutinised. I do not think it is sufficient for the Government to say, “We will scrutinise it through the normal processes that we normally use,” because it is clear that those normal processes do not work. The Government cannot say that legislation they have passed has achieved the intended effect. Some of it will have and some of it will not have, but we do not know because we do not have enough information. We need a standing committee or another way to scrutinise the implementation.

Dame Maria Miller

I thank the hon. Lady for raising this point. Having also chaired a Select Committee, I can understand the sensitivities around whether this should fall to the current DCMS Committee, but the reality is that the Bill’s complexity, and the other pressures on the DCMS Committee, mean that this should perhaps be seen as an exceptional circumstance—in no way is that meant as disrespect to that Select Committee, which is extremely effective in what it does.

Kirsty Blackman

I completely agree. Having sat on several Select Committees, I am aware of the tight timescales. There are not enough hours in the day for Select Committees to do everything that they would like to do. It would be unfortunate and undesirable were this matter to be one that fell between the cracks. Perhaps DCMS will bring forward more legislation in future that could fall between the cracks. If the Minister is willing to commit to a standing committee or anything in excess of the normal governmental procedures for review, that would be a step forward from the position that we are currently in. I look forward to hearing the Minister’s views on that.

Online Safety Bill (Sixth sitting)

Maria Miller Excerpts
Chris Philp

Very well; we will debate clause 9 separately. In that case, I will move on to amendments 19 and 20, which seek to address cross-platform risk. Again, we completely agree with the Opposition that cross-platform risk is a critical issue. We heard about it in evidence. It definitely needs to be addressed and covered by the Bill. We believe that it is covered by the Bill, and our legal advice is that it is covered by the Bill, because in clause 8 as drafted—[Interruption.] Bless you—or rather, I bless the shadow Minister, following Sir Roger’s guidance earlier, lest I inadvertently bless the wrong person.

Clause 8 already includes the phrase to which I alluded previously. I am talking about the requirement that platforms risk-assess illegal content that might be encountered

“by means of the service”.

That is a critical phrase, because it means not just on that service itself; it also means, potentially, via that service if, for example, that service directs users onward to illegal content on another site. By virtue of the words,

“by means of the service”,

appearing in clause 8 as drafted, the cross-platform risk that the Opposition and witnesses have rightly referred to is covered. Of course, Ofcom will set out further steps in the code of practice as well.

Dame Maria Miller (Basingstoke) (Con)

I was listening very closely to what the Minister was saying and I was hoping that he might be able to comment on some of the evidence that was given, particularly by Professor Lorna Woods, who talked about the importance of risk assessments being about systems, not content. Would the Minister pick up on that point? He was touching on it in his comments, and I was not sure whether this was the appropriate point in the Bill at which to bring it up.

Chris Philp

I thank my right hon. Friend for raising that. The risk assessments and, indeed, the duties arising under this Bill all apply to systems and processes—setting up systems and processes that are designed to protect people and to prevent harmful and illegal content from being encountered. We cannot specify in legislation every type of harmful content that might be encountered. This is about systems and processes. We heard the Chairman of the Joint Committee on the draft Online Safety Bill, our hon. Friend the Member for Folkestone and Hythe (Damian Collins), confirm to the House on Second Reading his belief—his accurate belief—that the Bill takes a systems-and-processes approach. We heard some witnesses saying that as well. The whole point of this Bill is that it is tech-agnostic—to future-proof it, as hon. Members mentioned this morning—and it is based on systems and processes. That is the core architecture of the legislation that we are debating.

Amendments 25 and 26 seek to ensure that user-to-user services assess and mitigate the risk of illegal content being produced via functions of the service. That is covered, as it should be—the Opposition are quite right to raise the point—by the illegal content risk assessment and safety duties in clauses 8 and 9. Specifically, clause 8(5)(d), on page 7 of the Bill—goodness, we are only on page 7 and we have been going for over half a day already—requires services to risk-assess functionalities of their service being used to facilitate the presence of illegal content. I stress the word “presence” in clause 8(5)(d). Where illegal content is produced by a functionality of the service—for example, by being livestreamed—that content will be present on the service and companies must mitigate that risk. The objective that the Opposition are seeking to achieve, and with which we completely agree, is covered in clause 8(5)(d) by the word “presence”. If the content is present, it is covered by that section.

--- Later in debate ---
Chris Philp

We discussed personal liability extensively this morning. As we discussed, there is personal liability in relation to providing information, with a criminal penalty of up to two years’ imprisonment, to avoid situations like the one we saw a year or two ago, where one of these companies failed to provide the Competition and Markets Authority with the information that it required.

The shadow Minister pointed out the very high levels of global turnover—$71.5 billion—that these companies have. That means that ultimately they can be fined up to $7 billion for each set of breaches. That is a vast amount of money, particularly if those breaches happen repeatedly. She said that such companies will just set up again if we deny their service. Clearly, small companies can close down and set up again the next day, but gigantic companies, such as Meta—Facebook—cannot do that. That is why I think the sanctions I have pointed to are where the teeth really lie.

I accept the point about governance being important as well; I am not dismissing that. That is why we have personal criminal liability for information provision, with up to two years in prison, and it is why governance is referenced in clause 10. I accept the spirit of the points that have been made, but I think the Bill delivers these objectives as drafted.

Dame Maria Miller

Will my hon. Friend give way?

Chris Philp

One last time, because I am conscious that we need to make some progress this afternoon.

Dame Maria Miller

I have huge sympathy with the point that the Minister is making on this issue, but the hon. Member for Pontypridd is right to drive the point home. The Minister says there will be huge fines, but I think there will also be huge court bills. There will be an awful lot of litigation about how things are interpreted, because so much money will come into play. I just reiterate the importance of the guidance and the codes of practice, because if we do not get those right then the whole framework will be incredibly fragile. We will need ongoing scrutiny of how the Bill works or there will be a very difficult situation.

Chris Philp

My right hon. Friend, as always, makes a very good point. The codes of practice will be important, particularly to enable Ofcom to levy fines where appropriate and then successfully defend them. This is an area that may get litigated. I hope that, should lawyers litigating these cases look at our transcripts in the future, they will see how strongly those on both sides of the House feel about this point. I know that Ofcom will ensure that the codes of practice are properly drafted. We touched this morning on the point about timing; we will follow up with Ofcom to make sure that the promise it made us during the evidence session about the road map is followed through and that those get published in good time.

On the point about the Joint Committee, I commend my right hon. Friend for her persistence—[Interruption.] Her tenacity—that is the right word. I commend her for her tenacity in raising that point. I mentioned it to the Secretary of State when I saw her at lunchtime, so the point that my right hon. Friend made this morning has been conveyed to the highest levels in the Department.

I must move on to the final two amendments, 11 and 13, which relate to transparency. Again, we had a debate about transparency earlier, when I made the point about the duties in clause 64, which I think cover the issue. Obviously, we are not debating clause 64 now but it is relevant because it requires Ofcom—it is not an option but an obligation; Ofcom must do so—to require providers to produce a transparency report every year. Ofcom can say what is supposed to be in the report, but the relevant schedule lists all the things that can be in it, and covers absolutely everything that the shadow Minister and the hon. Member for Worsley and Eccles South want to see in there.

That requirement to publish transparently and publicly is in the Bill, but it is to be found in clause 64. While I agree with the Opposition’s objectives on this point, I respectfully say that those objectives are delivered by the Bill as drafted, so I politely and gently request that the amendments be withdrawn.

--- Later in debate ---
Barbara Keeley

The purpose of this clause is to ensure that children at risk of online harms are given protections from harmful, age-inappropriate content through specific children’s safety duties for user-to-user services likely to be accessed by children.

It is welcome that the Bill contains strong provisions to ensure that service providers act upon and mitigate the risks identified in the required risk assessment, and to introduce protective systems and processes to address what children encounter. This amendment aims to ensure that online platforms are proactive in their attempts to mitigate the opportunity for sex offenders to abuse children.

As we have argued with other amendments, there are missed opportunities in the Bill to be preventive in tackling the harm that is created. The sad reality is that online platforms create an opportunity for offenders to identify, contact and abuse children, and to do so in real time through livestreaming. We know there has been a significant increase in online sexual exploitation during the pandemic. With sex offenders unable to travel or have physical contact with children, online abuse increased significantly.

In 2021, UK law enforcement received a record 97,727 industry reports relating to online child abuse, a 29% increase on the previous year, which is shocking. An NSPCC freedom of information request to police forces in England and Wales last year showed that online grooming offences reached record levels in 2020-21, with the number of sexual communications with a child offences in England and Wales increasing by almost 70% in three years. There has been a deeply troubling trend in internet-facilitated abuse towards more serious sexual offences against children, and the average age of children in child abuse images, particularly girls, is trending to younger ages.

In-person contact abuse moved online because of the opportunity there for sex offenders to continue exploiting children. Sadly, they can do so with little fear of the consequences, because detection and disruption of livestreamed abuse is so low. The duty to protect children from sexual offenders abusing them in real time and livestreaming their exploitation cannot be limited to one part of the internet and tech sector. While much of the abuse might take place on the user-to-user services, it is vital that protections against such abuse are strengthened across the board, including in the search services, as set out in clause 26.

At the moment there is no list of harms in the Bill that must be prioritised by regulated companies. The NSPCC and others have suggested including a new schedule, similar to schedule 7, setting out what the primary priority harms should be. It would be beneficial for the purposes of parliamentary scrutiny for us to consider the types of priority harm that the Government intend the Bill to cover, rather than leaving that to secondary legislation. I hope the Minister will consider that and say why it has not yet been included.

To conclude, while we all hope the Bill will tackle the appalling abuse of children currently taking place online, this cannot be achieved without tackling the conditions in which these harms can take place. It is only by requiring that steps be taken across online platforms to limit the opportunities for sex offenders to abuse children that we can see the prevalence of this crime reduced.

Dame Maria Miller

I rise, hopefully to speak to clause 11 more generally—or will that be a separate stand part debate, Ms Rees?

The Chair

That is a separate debate.

Dame Maria Miller

My apologies. I will rise later.

Chris Philp

The Government obviously support the objective of these amendments, which is to prevent children from suffering the appalling sexual and physical abuse that the hon. Member for Worsley and Eccles South outlined in her powerful speech. It is shocking that these incidents have risen in the way that she described.

To be clear, that sort of appalling sexual abuse is covered in clause 9—which we have debated already—which covers illegal content. As Members would expect, child sexual abuse is defined as one of the items of priority illegal content, which are listed in more detail in schedule 6, where the offences that relate to sexual abuse are enumerated. As child sexual exploitation is a priority offence, services are already obliged through clause 9 to be “proactive” in preventing it from happening. As such, as Members would expect, the requirements contained in these amendments are already delivered through clause 9.

The hon. Member for Worsley and Eccles South also asked when we are going to hear what the primary priority harms to children might be. To be clear, those will not include the sexual exploitation offences, because as Members would also expect, those are already in the Bill as primary illegal offences. The primary priority harms might include material promoting eating disorders and that kind of thing, which is not covered by the criminal matters—the illegal matters. I have heard the hon. Lady’s point that if that list were to be published, or at least a draft list, that would assist Parliament in scrutinising the Bill. I will take that point away and see whether there is anything we can do in that area. I am not making a commitment; I am just registering that I have heard the point and will take it away.

The Chair

With this it will be convenient to discuss clause 26 stand part.

Dame Maria Miller

I rise to speak to clause 11, because this is an important part of the Bill that deals with the safety duties protecting children. Many of us here today are spurred on by our horror at the way in which internet providers, platform providers and search engines have acted over recent years, developing their products with no regard for the safety of children, so I applaud the Government for bringing forward this groundbreaking legislation. They are literally writing the book on this, but in doing so, we have to be very careful about the language we use and the way in which we frame our requirements of these organisations. The Minister has rightly characterised these organisations as being entirely driven by finance, not the welfare of their consumers, which must make them quite unique in the world. I can only hope that that will change: presumably, over time, people will not want to use products that have no regard for the safety of those who use them.

In this particular part of the Bill, the thorny issue of age assurance comes up. I would value the Minister’s views on some of the evidence that we received during our evidence sessions about how we ensure that age assurance is effective. Some of us who have been in this place for a while would be forgiven for thinking that we had already passed a law on age assurance. Unfortunately, that law did not seem to come to anything, so let us hope that second time is lucky. The key question is: who is going to make sure that the age assurance that is in place is good enough? Clause 11(3) sets out

“a duty to operate a service using proportionate systems and processes”

that are designed to protect children, but what is a proportionate system? Who is going to judge that? Presumably it will be Ofcom in the short term, and in the long term, I am sure the courts will get involved.

In our evidence, we heard some people advocating very strongly for these sorts of systems to be provided by third parties. I have to say, in a context where we are hearing how irresponsible the providers of these services are, I can understand why people would think that a third party would be a more responsible way forward. Can the Minister help the Committee understand how Ofcom will ensure that the systems used, particularly the age assurance systems, are proportionate—I do not particularly like that word; I would like those systems to be brilliant, not proportionate—and are actually doing what we need them to do, which is safeguard children? For the record, and for the edification of judges who are looking at this matter in future—and, indeed, Ofcom—will he set out how important this measure is within the Bill?

Chris Philp

I thank my right hon. Friend for her remarks, in which she powerfully and eloquently set out how important the clause is to protecting children. She is right to point out that this is a critical area in the Bill, and it has wide support across the House. I am happy to emphasise, for the benefit of those who may study our proceedings in future, that protecting children is probably the single-most important thing that the Bill does, which is why it is vital that age-gating, where necessary, is effective.

My right hon. Friend asked how Ofcom will judge whether the systems under clause 11(3) are proportionate to

“prevent children of any age from encountering”

harmful content and so on. Ultimately, the proof of the pudding is in the eating; it has to be effective. When Ofcom decides whether a particular company or service is meeting the duty set out in the clause, the simple test will be one of effectiveness: is it effective and does it work? That is the approach that I would expect Ofcom to take; that is the approach that I would expect a court to take. We have specified that age verification, which is the most hard-edged type of age assurance—people have to provide a passport or something of that nature—is one example of how the duty can be met. If another, less-intrusive means is used, it will still have to be assessed as effective by Ofcom and, if challenged, by the courts.

I think my right hon. Friend was asking the Committee to confirm to people looking at our proceedings our clear intent for the measures to be effective. That is the standard to which we expect Ofcom and the courts to hold those platforms in deciding whether they have met the duties set out in the clause.

--- Later in debate ---
Dame Maria Miller

For clarification, does the Minister anticipate that Ofcom might be able to insist that a third-party provider be involved if there is significant evidence that the measures put in place by a platform are ineffective?

Chris Philp

We have deliberately avoided being too prescriptive about precisely how the duty is met. We have pointed to age verification as an example of how the duty can be met without saying that that is the only way. We would not want to bind Ofcom’s hands, or indeed the hands of platforms. Clearly, using a third party is another way of delivering the outcome. If a platform were unable to demonstrate to Ofcom that it could deliver the required outcome using its own methods, Ofcom may well tell it to use a third party instead. The critical point is that the outcome must be delivered. That is the message that the social media firms, Ofcom and the courts need to hear when they look at our proceedings. That is set out clearly in the clause. Parliament is imposing a duty, and we expect all those to whom the legislation applies to comply with it.

Question put and agreed to.

Clause 11 accordingly ordered to stand part of the Bill.

Clause 12

Adults’ risk assessment duties

Online Safety Bill (Third sitting)

Maria Miller Excerpts
Committee stage: 3rd sitting
Thursday 26th May 2022
Public Bill Committees

Alex Davies-Jones

Q187 Good morning to our witnesses. Thank you for joining us today. One of the main criticisms of the Bill is that the vast majority of the detail will not be available until after the legislation is enacted, under secondary legislation and so on. Part of the problem is that we are having difficulty in differentiating the “legal but harmful” content. What impact does that have?

William Perrin: At Carnegie, we saw this problem coming some time ago, and we worked in the other place with Lord McNally on a private Member’s Bill—the Online Harms Reduction Regulator (Report) Bill—that, had it carried, would have required Ofcom to make a report on a wide range of risks and harms, to inform and fill in the gaps that you have described.

Mrs Maria Miller (Basingstoke) (Con)

On a point of order, Ms Rees. There is a gentleman taking photographs in the Gallery.

The Chair

There is no photography allowed here.

William Perrin: Unfortunately, that Bill did not pass and the Government did not quite take the hint that it might be good to do some prep work with Ofcom to provide some early analysis to fill in holes in a framework Bill. The Government have also chosen in the framework not to bring forward draft statutory instruments or to give indications of their thinking in a number of key areas of the Bill, particularly priority harms to adults and the two different types of harms to children. That creates uncertainty for companies and for victims, and it makes the Bill rather hard to scrutinise.

I thought it was promising that the Government brought forward a list of priority offences in schedule 7—I think that is where it is; I get these things mixed up, despite spending hours reading the thing. That was helpful to some extent, but the burden is on the Government to reduce complexity by filling in some of the blanks. It may well be better to table an amendment to bring some of these things into new schedules, as we at Carnegie have suggested—a schedule 7A for priority harms to adults, perhaps, and a 7B and 7C for children and so on—and then start to fill in some of the blanks in the regime, particularly to reassure victims.

--- Later in debate ---
The Chair

Thank you. I am going to bring Maria Miller in now.

Mrs Miller

Q This evidence session is underlining to me how complicated these issues are. I am really grateful for your expertise, because we are navigating through a lot of issues. With slight trepidation I open the conversation up into another area—the issue of protection for children. One of the key objectives of the legislation is to ensure a higher level of protection for children than for adults. In your view, does the Bill achieve that? I am particularly interested in your views on whether the risks of harm to children should be set out on the face of the Bill, and if so, what harms should be included. Can I bring Mat in here?

Mat Ilic: Thank you so much. The impact of social media in children’s lives has been a feature of our work since 2015, if not earlier; we have certainly researched it from that period. We found that it was a catalyst to serious youth violence and other harms. Increasingly, we are seeing it as a primary issue in lots of the child exploitation and missing cases that we deal with—in fact, in half of the cases we have seen in some of the areas that we work in it featured as the primary reason rather than as a coincidental reason. The online harm is the starting point rather than a conduit.

In relation to the legislation, all our public statements on this have been informed by user research. I would say that is one of the central principles to think through in the primary legislation—a safety-by-design focus. We have previously called this the toy car principle, which means any content or product that is designed with children in mind needs to be tested in a way that is explicitly for children, as Mr Moy talked about. It needs to have some age-specific frameworks built in, but we also need to go further than that by thinking about how we might raise the floor, rather than necessarily trying to tackle explicit harms. Our point is that we need to remain focused on online safety for children and the drivers of online harm and not the content.

The question is, how can that be done? One way is the legal design requirement for safety, and how that might play out, as opposed to having guiding principles that companies might adopt. Another way is greater transparency on how companies make particular decisions, and that includes creating or taking off content that pertains to children. I want to underline the point about empowerment for children who have been exposed to or experience harm online, or offline as a result of online harm. That includes some kind of recourse to be able to bring forward cases where complaints, or other issues, were not taken seriously by the platforms.

If you read the terms and conditions of any given technology platform, which lots of young people do not do on signing up—I am sure lots of adults do not do that either—you realise that even with the current non-legislative frameworks that the companies deploy to self-regulate, there is not enough enforcement in the process. For example, if I experience some kind of abuse and complain, it might never be properly addressed. We would really chime on the enforcement of the regulatory environment; we would try to raise the floor rather than chase specific threats and harms with the legislation.

Mrs Miller

Q Can I bring Lorna in here? We are talking about moving from content to the drivers of harm. Where would you suggest that should be achieved within the Bill?

Professor Lorna Woods: I think by an overarching risk assessment rather than one that is broken down into the different types of content, because that, in a way, assumes a certain knowledge of the type of content before you can do a risk assessment, so you are into a certain circular mode there. Rather than prejudging types of content, I think it would be more helpful to look at what is there and what the system is doing. Then we could look at what a proportionate response would be—looking, as people have said, at the design and the features. Rather than waiting for content to be created and then trying to deal with it, we could look at more friction at an earlier stage.

If I may add a technical point, I think there is a gap relating to search engines. The draft Bill excluded paid-for content advertising. It seems that, for user-to-user content, this is now in the Bill, bringing it more into line with the current standards for children under the video-sharing platform provisions. That does not apply to search. Search engines have duties only in relation to search content, and search content excludes advertising. That means, as I read it, that search engines would have absolutely no duties to children under their children safety duty in relation to advertising content. You could, for example, target a child with pornography and it would fall outside the regime. I think that is a bit of a gap.

Kim Leadbeater (Batley and Spen) (Lab)

Q Thank you, witnesses, for your time this morning. I am going to focus initially on journalistic content. Is it fair that the platforms themselves are having to try to define what journalistic content is and, by default, what a journalist is? Do you see a way around this?

William Moy: No, no, yes. First, no, it is not fair to put that all on the platforms, particularly because—I think this a crucial thing for the Committee across the Bill as a whole—for anything to be done at internet scale, it has to be able to be done by dumb robots. Whatever the internet companies tell you about the abilities of their technology, it is not magic, and it is highly error-prone. For this duty to be meaningful, it has to be essentially exercised in machine learning. That is really important to bear in mind. Therefore, being clear about what it is going to tackle in a way that can be operationalised is important.

To your second point, it is really important in this day and age to question whether journalistic content and journalists equate to one another. I think this has come up in a previous session. Nowadays, journalism, or what we used to think of as journalism, is done by all kinds of people. That includes the same function of scrutiny and informing others and so on. It is that function that we care about—the passing of information between people in a democracy. We need to protect that public interest function. I think it is really important to get at that. I am sure there are better ways of protecting the public interest in this Bill by targeted protections or specifically protecting freedom of expression in specific ways, rather than these very broad, vague and general duties.

Online Safety Bill (Fourth sitting)

Maria Miller Excerpts
Committee stage: 4th sitting
Thursday 26th May 2022
Public Bill Committees

Alex Davies-Jones

Q One final question from me. I would like to discuss your thoughts on transparency and how we can make social media companies like Meta be more transparent and open with their data, beyond the measures we currently have in the Bill. For instance, we could create statute to allow academics or researchers in to examine their data. Do you have any thoughts on how this can be incentivised?

Stephen Almond: Transparency is a key foundation of data protection law in and of itself. As the regulator in this space, I would say that there is a significant emphasis within the data protection regime on ensuring that companies are transparent about the processing of personal data that they undertake. We think that that provides proportionate safeguards in this space. I would not recommend an amendment to the Bill on this point, because I would be keen to avoid duplication or an overlap between the regimes, but it is critical; we want companies to be very clear about how people’s personal data is being processed. It is an area that we are going to continue to scrutinise.

Mrs Maria Miller (Basingstoke) (Con)

May I ask a supplementary to that before I come on to my main question?

The Chair

Absolutely.

Mrs Miller

Q Thank you so much for coming along. You spoke in your initial comments to my colleague about encryption. The challenges of encryption around child abuse images have been raised with us previously. How can we balance the need to allow people to have encrypted options, if possible, with the need to ensure that this does not adversely affect organisations such as the Internet Watch Foundation, which does so much good in protecting children and rooting out child abuse imagery?

Stephen Almond: I share your concern about this. To go back to what I was saying before, I think the approach that is set out in the Bill is proportionate and targeted. The granting of, ultimately, backstop powers to Ofcom to issue technology notices and to require services to deal with this horrendous material will have a significant impact. I think this will ensure that the regime operates in a risk-based way, where risks can be identified. There will be the firm expectation on service providers to take action, and that will require them to think about all the potential technological solutions that are available to them, be they content scanning or alternative ways of meeting their safety duties.

Mrs Miller

Q My main question is about child safety, which is a prime objective for the Government in this legislation. Do you feel that the Bill’s definition of “likely to be accessed by children” should be more closely aligned with the one used in the ICO’s age-appropriate design code?

Stephen Almond: The objectives of both the Online Safety Bill and the children’s code are firmly aligned in respect of protecting children online. We have reviewed the definitions and, from our perspective, there are distinctions in the definition that is applied in the Bill and the children’s code, but we find no significant tension between them. My focus at the ICO, working in co-operation with Ofcom, will ultimately be on ensuring that there is clarity for business on how the definitions apply to their services, and that organisations know when they are in scope of the children’s code and what actions they should take.

Mrs Miller

Q Do you think any further aspects of the age-appropriate design code should be incorporated into the Bill?

Stephen Almond: We are not seeking to incorporate further aspects of the code into the Bill. We think it is important that the regimes fit together coherently, but that that is best achieved through regulatory co-operation between the ICO and Ofcom. The incorporation of the children’s code would risk creating some form of regulatory overlap and confusion.

I can give you a strong assurance that we have a good track record of working closely with Ofcom in this area. Last year, the children’s code came into force, and not too long after it, Ofcom’s video-sharing platform regime came into force. We have worked very closely to make sure that those regimes are introduced in a harmonised way and that people understand how they fit together.

Mrs Miller

Q Working closely with Ofcom is really good, but do you think there needs to be a duty to co-operate with Ofcom, or indeed with other regulators—to be specified in the Bill—in case relations become more tense in future?

Stephen Almond: The Bill has, in my view, been designed to work closely alongside data protection law. It supports effective co-operation between us and Ofcom by requiring and setting out a series of duties for Ofcom to consult with the ICO on the development of any codes of practice or formal guidance with an impact on privacy. With that framework in mind, I do not think there is a case to instil further co-operation duties in that way. I hope I can give you confidence that we and Ofcom will be working tirelessly together to promote the safety and privacy of citizens online. It is firmly in our interests and in the interest of society as a whole to do so.

Kim Leadbeater (Batley and Spen) (Lab)

Q Thank you for joining us, Mr Almond. You stated the aim of making the UK the

“safest place in the world to be online”.

In your view, what needs to be added or taken away from the Bill to achieve that?

Stephen Almond: I am not best placed to comment on the questions of online safety and online harms. You will speak to a variety of different experts who can comment on that point. From my perspective as a digital regulator, one of the most important things will be ensuring that the Bill is responsive to future challenges. The digital world is rapidly evolving, and we cannot necessarily envisage all the developments in technology that will come, or the emergence of new harms. The data protection regime is a principles-based piece of legislation. That gives us a great degree of flexibility and discretion to adapt to novel forms of technology and to provide appropriate guidance as challenges emerge. I really recommend retaining that risk-based, principles-based approach to regulation that is envisaged currently in the Online Safety Bill.

--- Later in debate ---
The Chair

Lynn Perry is on the line, but we have lost her for the moment. I am afraid we are going to have to press on.

Mrs Miller

Q I want to focus on one particular issue, which is anonymity. Kick It Out has done so much with the FA to raise awareness of that issue. I was interested in your views on how the Bill treats that. The Bill mentions anonymity and pseudonymity, but it does so only once. Should the Bill take a clearer stance on online anonymity? Do you have any views on whether people should be able to use the internet fully anonymously, or should they disclose their identity to the platform? Do you have any thoughts on that? You have done a huge amount of work on it.

Sanjay Bhandari: There is quite a lot in that question. In terms of whether people should be fully anonymous or not, it depends on what you mean by fully. I am a lawyer, so I have 30 years specialising in the grey, rather than in the black and white. It really does depend on what you mean by fully. In my experience, nothing is absolute. There is no absolute right to freedom of speech; I cannot come in here and shout “Fire!” and make you all panic. There is also no absolute right to anonymity; I cannot use my anonymity online as a cloak to commit fraud. Everything is qualified. It is a question of what is the balance of those qualifications and what those qualifications should be, in the particular context of the problem that we are seeking to address.

The question in this context is around the fact that anonymity online is actually very important in some contexts. If you are gay in a country where that is illegal, being anonymous is a fantastic way to be able to connect with people like you. In a country that has a more oppressive regime, anonymity is another link to the outside world. The point of the Bill is to try to get the balance so that anonymity is not abused. For example, when a football player misses a penalty in a cup final, the point of the Bill is that you cannot create a burner account and instantly send them a message racially abusing them and then delete the account—because that is what happens now. The point of the Bill, which we are certainly happy with in general terms, is to draw a balance in the way that identity verification must be offered as an option, and to give users more power over who they interact with, including whether they wish to engage only with verified accounts.

We will come back and look in more detail at whether we would like more amendments, and we will also work with other organisations. I know that my colleague Stephen Kinsella of Clean up the Internet has been looking at those anonymity provisions and at whether verification should be defined and someone’s status visible on the face of the platforms, for instance. I hope that answers those two or three questions.

Mrs Miller

That is very helpful; thank you.

The Chair

I saw you nodding, Ms Perry. Do you wish to add anything?

Lynn Perry: I agree. The important thing, particularly from the perspective of Barnardo’s as a children’s charity, is the right of children to remain safe and protected online and in no way compromised by privacy or anonymity considerations online. I was nodding along at certain points to endorse the need to ensure that the right balance is struck for protections for those who might be most vulnerable.

--- Later in debate ---
Alex Davies-Jones

Q We have heard a lot from other witnesses about the ability of Ofcom to regulate the smaller high-risk platforms. What is your view on that?

Poppy Wood: Absolutely, and I agree with what was said earlier, particularly by groups such as HOPE not hate and Antisemitism Policy Trust. There are a few ways to do this, I suppose. As we are saying, at the moment the small but high-risk platforms just are not really caught in the current categorisation of platforms. Of course, the categories are not even defined in the Bill; we know there are going to be categories, but we do not know what they will be.

I suppose there are different ways to do this. One is to go back to where this Bill started, which was not to have categories of companies at all but to have a proportionality regime, where depending on your size and your functionality you had to account for your risk profile, and it was not set by Ofcom or the Government. The problem of having very prescriptive categories—category 1, category 2A, category 2B—is, of course, that it becomes a race to the bottom in getting out of these regulations without having to comply with the most onerous ones, which of course are category 1.

There is also a real question about search. I do not know how they have wriggled out of this, but it was one of the biggest surprises in the latest version of the Bill that search had been given its own category without many obligations around adult harm. I think that really should be revisited. All the examples that were given earlier today are absolutely the sort of thing we should be worrying about. If someone can google a tractor in their workplace and end up looking at a dark part of the web, there is a problem with search, and I think we should be thinking about those sorts of things. Apologies for the example, but it is a really, really live one and it is a really good thing to think about how search promotes these kinds of content.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q I want to touch on something we have not talked about a lot today, which is enforcement and the enforcement powers in the Bill. There are significant enforcement powers in the Bill, but do our two witnesses here think those enforcement powers are enough? Eva?

Eva Hartshorn-Sanders: Are you specifically asking about the takedown notices and the takedown powers?

Maria Miller Portrait Mrs Miller
- Hansard - -

No, I am talking about director liability and the enforcement on companies.

Eva Hartshorn-Sanders: Right. I think the responsibility on both companies and senior executives is a really critical part of this legislative package. You see how adding liability alongside financial penalties works in health and safety legislation and corporate manslaughter provisions to motivate changes not only within company culture but in the work that they are doing and what they factor into the decisions they make. It is a critical part of this Bill.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q Is there more that could or should be added to the Bill?

Eva Hartshorn-Sanders: I think it is a good start. I would want to have another look at it to say more. There is a review after two years, as set out in clause 149, so there could be a factor that gets added into that, as well.

Maria Miller Portrait Mrs Miller
- Hansard - -

Poppy, do you have anything to add?

Poppy Wood: Yes. I think we could go much further on enforcement. One of the things that I really worry about is that if the platforms make an inadequate risk assessment, there is not much that Ofcom can do about it. I would really like to see powers for Ofcom to say, “Okay, your risk assessment hasn’t met the expectations that we put on you, so we want you to redo it. And while you’re redoing it, we may want to put you into a different category, because we may want to have higher expectations of you.” That way, you cannot start a process where you intentionally make an inadequate risk assessment in order to extend the process of you being properly regulated. I think that is one thing.

Then, going back to the point about categorisation, I think that Ofcom should be given the power to recategorise companies quickly. If you think that a category 2B company should be a category 1 company, what powers are there for Ofcom to do that? I do not believe that there are any for Ofcom to do that, certainly not to do it quickly, and when we are talking about small but high-risk companies, that is absolutely the sort of thing that Ofcom should be able to do—to say, “Okay, you are now acting like a category 1 company.” TikTok, Snapchat—they all started really small and they accelerated their growth in ways that we just could not have predicted. When we are talking about the emergence of new platforms, we need to have a regulator that can account for the scale and the pace at which these platforms grow. I think that is a place where I would really like to see Ofcom focusing.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q I have a question for the Centre for Countering Digital Hate. I raised some of your stats on reporting with Meta—Facebook—when they were here, such as the number of reports that are responded to. They basically said, “This is not true any more; we’re now great”—I am paraphrasing, obviously. Could you please let us know whether the reporting mechanism on major platforms—particularly Facebook—is now completely fixed, or whether there are still lots of issues with it?

Eva Hartshorn-Sanders: There are still lots of issues with it. We recently put a report out on anti-Muslim hatred and found that 90% of the content that was reported was not acted on. That was collectively, across the platforms, so it was not just Facebook. Facebook was in the mid-90s, I think, in terms of its failure to act on that type of harmful content. There are absolutely still issues with it, and this regulation—this law—is absolutely necessary to drive change and the investment that needs to go into it.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Thank you. Maria Miller.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q A great deal of the discussion we are having about this Bill is its scope—what is covered and what is not covered. Many of us will look regularly at newspapers online, particularly the comments sections, which can be quite colourful. Should comments on newspaper publisher platforms be included in the scope of the Bill?

Owen Meredith: Yes, I think they should be included within the news publisher exemption as it is spelt out. As far as I understand, that has always been the intention, since the original White Paper many years ago that led to where we are today. There is a very good reason for that, not least the fact that the comments on news publisher websites are still subject to the responsibility of the editor and the publisher; they are subject to the regulation of the Independent Press Standards Organisation, in the case of those publishers who are regulated under the self-regulation system by IPSO, as the majority of my members are. There is a very different environment in news publisher websites’ comments sections, where you are actively seeking to engage with those and read those as a user, whereas on social media platforms that content can come to you without you wishing to engage with it.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q Can I just probe on that slightly? You say the comments are the responsibility of the editor. Does that mean that if something is published on there that is defamatory, it would then be attributed to the editor?

Owen Meredith: Everything published by the news site is ultimately the responsibility of the editor.

Matt Rogerson: I think there are various cases. I think Delfi is the relevant case in relation to comments, where if a publisher is notified of a defamatory comment within their comments section, they are legally liable for it if they do not take it down. To speak from a Guardian perspective, we would like comments sections to be included within the exemption. The self-regulation we have in place for our comments section has been quite a journey. We undertook quite a big bit of research on all the comments that had been left over an 11-year period. We tightened up significantly the processes that we had in place. We currently use a couple of steps to make sure those comments sections are well moderated. We use machine learning against very tightly defined terms, and then every single comment that is taken down is subject to human review. I think that works in the context of a relatively small website such as The Guardian, but it would be a much bigger challenge for a platform of the size of Facebook.
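A rough sketch of the two-step arrangement Matt Rogerson describes, automated matching against tightly defined terms followed by human review of every removal, might look like the following. This is purely an illustrative outline in Python; the term list, function names and queue are hypothetical and are not drawn from The Guardian's actual system.

```python
# Illustrative two-stage moderation flow: an automated filter matches comments
# against a tightly defined term list, and every provisional removal is queued
# for human review before it becomes final. All terms and names are hypothetical.
import re
from dataclasses import dataclass, field

BLOCKED_TERMS = [r"\bexample-slur\b", r"\bexample-threat\b"]  # placeholder patterns

@dataclass
class ModerationQueue:
    pending_human_review: list = field(default_factory=list)

    def flag(self, comment_id: str, text: str, matched: str) -> None:
        # Nothing is permanently removed until a human moderator confirms.
        self.pending_human_review.append(
            {"id": comment_id, "text": text, "matched_term": matched}
        )

def automated_screen(comment_id: str, text: str, queue: ModerationQueue) -> bool:
    """Return True if the comment is provisionally hidden pending human review."""
    for pattern in BLOCKED_TERMS:
        match = re.search(pattern, text, flags=re.IGNORECASE)
        if match:
            queue.flag(comment_id, text, match.group(0))
            return True
    return False  # published straight away; readers can still report it

queue = ModerationQueue()
automated_screen("c1", "A perfectly ordinary comment.", queue)
automated_screen("c2", "This contains an example-slur.", queue)
print(len(queue.pending_human_review))  # -> 1, awaiting a human decision
```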

None Portrait The Chair
- Hansard -

Kim Leadbeater?

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q One final question from me, because I know others will want to come in. How do you think platforms such as Meta—I know we have used Meta as an example, but there are others—can be incentivised, beyond the statutory duty that we are currently imposing, to publish their data to allow academics and researchers into their platforms to examine exactly what is going on? Or is this the only way?

Frances Haugen: All industries that live in democratic societies must live within democratic processes, so I do believe that it is absolutely essential that we the public, through our democratic representatives like yourself, have mandatory transparency. The only two other paths I currently see towards getting any transparency out of Meta, because Meta has demonstrated that it does not want to give even the slightest slivers of data—for example, how many moderators there are—are via ESG, so we can threaten them with divestment by saying, “Prosocial companies are transparent with their data,” and via litigation. In the United States, sometimes we can get data out of these companies through the discovery process. If we want consistent and guaranteed access to data, we must put it in the Bill, because those two routes are probabilistic—we cannot ensure that we will get a steady, consistent flow of data, which is what we need to have these systems live within a democratic process.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q Turning to the issue of child safety and online abuse with images involving children, what should be added to or removed from the Bill to improve how it protects children online? Have you got any thoughts on that? Some groups have described the Bill’s content as overly broad. Would you make any comments on how effective it will be in terms of online safety for children?

Frances Haugen: I am not well versed on the exact provisions in the Bill regarding child safety. What I can say is that one of the most important things that we need to have in there is transparency around how the platforms in general keep children under the age of 13 off their systems—transparency on those processes—because we know that Facebook is doing an inadequate job. That is the single biggest lever in terms of child safety.

I have talked to researchers at places like Oxford and they talk about how, with social media, one of the critical windows is when children transition through puberty, because they are more sensitive on issues, they do not have great judgment yet and their lives are changing in really profound ways. Having mandatory transparency on what platforms are doing to keep kids off their platforms, and the ability to push for stronger interventions, is vital, because keeping kids off them until they are at least 13, if not 16, is probably the biggest single thing we can do to move the ball down the field for child safety.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q You say that transparency is so important. Can you give us any specifics about particular areas that should be subject to transparency?

Frances Haugen: Specifically for children or across the whole platform?

Maria Miller Portrait Mrs Miller
- Hansard - -

Specifically for children.

Frances Haugen: I will give you an example. Facebook has estimated ages for every single person on the platform, because the reality is that lots of adults also lie about their ages when they join, and advertisers want to target very specific demographics—for example, if you are selling a kit for a 40th birthday, you do not want to mis-target that by 10 years. Facebook has estimated ages for everyone on the platform. It could be required to publish every year, so that we could say, “Hey, for the kids on the platform who you currently believe, using your estimated ages, are 14 years old—based not on how old they say they are, but on your estimate that this person is 14 years old—when did they join the platform? What fraction of your 14-year-olds have been on the platform since they were 10?” That is a vital statistic.

If the platforms were required to publish that every single quarter, we could say, “Wow! You were doing really badly four years ago, and you need to get a lot better.” Those kinds of lagging metrics are a way of allowing the public to grade Facebook’s homework, instead of just trusting Facebook to do a good job.

Facebook already does analyses like this today. They already know that on Facebook Blue, for example, for some age cohorts, 20% of 11-year-olds were on the platform—and back then, not that many kids were online. Today, I would guess a much larger fraction of 11-year-olds are on Instagram. We need to have transparency into how badly they are doing their jobs.
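The lagging metric Frances Haugen describes here, of the users a platform currently estimates to be 14, what fraction joined young enough that they must have been under 13, could in principle be computed from two fields the platform already holds: estimated age and join date. A minimal illustrative sketch follows; the data and field names are hypothetical.

```python
# Illustrative lagging transparency metric: of users estimated to be 14 today,
# what share have accounts old enough that they must have joined while under 13?
from datetime import date

users = [
    {"estimated_age": 14, "joined": date(2019, 3, 1)},   # joined aged roughly 11
    {"estimated_age": 14, "joined": date(2021, 9, 15)},  # joined aged roughly 13
    {"estimated_age": 14, "joined": date(2018, 6, 5)},   # joined aged roughly 10
    {"estimated_age": 17, "joined": date(2020, 1, 1)},   # outside the cohort
]

def joined_under_13_share(users, today, cohort_age=14):
    """Share of the estimated-age cohort whose account age implies they joined under 13."""
    cohort = [u for u in users if u["estimated_age"] == cohort_age]
    if not cohort:
        return 0.0
    under_13 = 0
    for u in cohort:
        years_on_platform = (today - u["joined"]).days / 365.25
        if cohort_age - years_on_platform < 13:
            under_13 += 1
    return under_13 / len(cohort)

print(round(joined_under_13_share(users, today=date(2022, 5, 24)), 2))  # -> 0.67
```

Published quarterly, a figure like this would let outsiders track whether a platform's under-age problem is shrinking, without the platform disclosing any individual user's data.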

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Frances, do you think that the Bill needs to set statutory minimum standards for things such as risk assessments and codes of practice? What will a company such as Facebook do without a minimum standard to go by?

Frances Haugen: It is vital to get minimum standards for things such as risk assessments and codes of conduct into the statute. Facebook has demonstrated time and again—the reality is that other social media platforms have too—that it does the bare minimum to avoid really egregious reputational damage. It does not ensure the level of quality needed for public safety. If you do not put that into the Bill, I worry that it will be watered down by the mountains of lobbyists that Facebook will throw at this problem.

Online Safety Bill (First sitting)

Maria Miller Excerpts
None Portrait The Chair
- Hansard -

Maria Miller has indicated that she would like to ask a question, so if I may, I will bring her in.

Maria Miller Portrait Mrs Maria Miller (Basingstoke) (Con)
- Hansard - -

Not immediately—go on please.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q Thank you, Chair, and thank you, Maria.

I am just trying to get to the intricacies of this, and of what would happen during the time that it would take for you to recategorise. This platform, which is disseminating harm to both children and adults, would be allowed to carry on while the recategorisation process is under way. There is no mechanism in the Bill to stop that from happening.

Richard Wronka: A really important point here is that we will be regulating that platform from the outset for illegal content and, potentially, for how it protects children on its platform, irrespective of the categorisation approach. That is really important. We will be able to take action, and take action quickly, irrespective of how the platform is categorised. Categorisation really determines whether the adult “legal but harmful” provisions apply. That is the bit that really matters in this context.

It is worth reminding ourselves what those provisions mean: they are more a transparency and accountability measure. Those platforms categorised as category 1 will need to have clear terms and conditions applied to adult “legal but harmful” content, and they will need to implement those consistently. We would expect the really serious and egregious concerns to be picked up by the “illegal” part of the regime, and the protection-of-children part of the regime. The categorisation process may go on. It may take a little time, but we will have tools to act in those situations.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q May I bring you on to the powers of the Secretary of State and the question of the regulator’s independence? The Bill will see the Secretary of State, whoever that may be, have a huge amount of personal direction over Ofcom. Do you have any other experience of being directed by a Secretary of State in this way, and what are the consequences of such an approach?

Kevin Bakhurst: We do have some experience across the various sectors that we regulate, but being directed by the Secretary of State does not happen very often. Specifically on the Bill, our strong feeling is that it is entirely appropriate for the Secretary of State to be able to direct us on matters of national security and terrorist content. However, we have some concerns about the wider direction powers of the Secretary of State, particularly the grounds on which the Secretary of State can direct public policy, and we have expressed those concerns previously.

We feel it is important that the independence of a regulator can be seen to be there and is there in practice. Legally, we feel it important that there is accountability. We have some experience of being taken to judicial review, and there must be accountability for the codes of practice that we put in place. We must be able to show why and how we have created those codes of practice, so that we can be accountable and there is absolute clarity between regulator and Government.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q Thank you very much to the witnesses who have taken the time to be with us today. We are really grateful. You have already alluded to the fact that you have quite extensive experience in regulation, even in social media spaces. I think the Committee would be really interested in your view, based on your experience, about what is not in the Bill that should be.

Kevin Bakhurst: Richard has been leading this process, so he can give more detail on it, but suffice to say, we have been engaging closely with DCMS over the last year or so, and we appreciate the fact that it has taken on board a number of our concerns. What we felt we needed from the Bill was clarity as far as possible, and a balance between clarity and flexibility for this regime, which is a very fast-moving field. We feel, by and large, that the Bill has achieved that.

We still have concerns about one or two areas, to pick up on your question. We feel it is really important—hopefully this is something the Committee can contribute to—that the definition of “illegal content” is really clear for platforms, and particularly the area of intent of illegality, which at the moment might be quite tricky for the platforms to pick up on.

Richard Wronka: I completely agree with Kevin that the Bill as it stands gives us a good framework. I think the pre-legislative scrutiny process has been really helpful in getting us there, and I point out that it is already quite a broad and complex regime. We welcome the introduction of issues such as fraudulent advertising and the regulation of commercial pornographic providers, but I think there is a point about ensuring that the Bill does not expand too much further, because that might raise some practical and operational issues for us.

I completely agree with Kevin that clarity in the Bill regarding illegal content and what constitutes that is really important. An additional area that requires clarity is around some of the complex definitions in the Bill, such as journalistic content and democratically important content. Those are inherently tricky issues, but any extra clarity that Parliament can provide in those areas would be welcome.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q You talk about illegal content and say that Ofcom would not have a view on particular laws, but do you think there are harmful areas of content that are not currently covered by the law? I am thinking particularly about the issue of intimate image abuse, which is currently under Law Commission review, with recommendations expected very soon. Have you had any thoughts, particularly in the area of policy, about how you deal with issues that should be against the law but currently are not, given that part of your regulatory process is to determine whether companies are operating within the law?

Richard Wronka: I would start by saying that this is a fluid area. We have had a number of conversations with the Law Commission in particular and with other stakeholders, which has been really helpful. We recognise that the Bill includes four new offences, so there is already some fluidity in this space. We are aware that there are other Law Commission proposals that the Government are considering. Incitement to self-harm and flashing imagery that might trigger epilepsy are a couple of issues that come to mind there. Ultimately, where the criminal law sits is a matter for Parliament. We are a regulator: our role here is to make sure that the criminal law is reflected in the regulatory regime properly, rather than to determine or offer a view on where the criminal law should sit. Linking back to our point just a minute ago, we think it is really important that there is as much clarity as possible about how platforms can take some of those potentially quite tricky decisions about whether content meets the criminal threshold.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q May I press a little further? The four new offences that you talked about, and others, and just the whole approach of regulation will lead more individuals to seek redress and support. You are not responsible for individuals; you are responsible for regulation, but you must have some thoughts on whether the current system of victim support will cope with the changes in the law and the new regulatory process. What might you want to see put in place to ensure that those victims are not all landing at your door, erroneously thinking that Ofcom will provide them with individual redress? Do you have any thoughts on that?

Kevin Bakhurst: One area that is very important, which is in the Bill and is one of our responsibilities, is to make sure there is a sufficiently robust and reactive complaints process from the platforms—one that people feel they can complain to and be heard by—and an appeals process. We feel that that is in the Bill. We already receive complaints at Ofcom from people who have issues about platforms and who have gone to the platforms but do not feel their complaints have been properly dealt with or recognised. That is within the video-sharing platform regime. Those individual complaints, although we are not going to be very specific in looking at individual pieces of material per se, are very useful to alert us where there are issues around particular types of offence or harm that the platforms are not seen to be dealing with properly. It will be a really important part of the regime to make sure that platforms provide a complaints process that is easy to navigate and that people can use quite quickly and accessibly.

Richard Wronka: An additional point I would make, building on that, is that this is a really complex ecosystem. We understand that and have spent a lot of the last two or three years trying to get to grips with that complex ecosystem and building relationships with other participants in the ecosystem. It brings in law enforcement, other regulators, and organisations that support victims of crime or online abuse. We will need to find effective ways to work with those organisations. Ultimately, we are a regulator, so there is a limit to what we can do. It is important that those other organisations are able to operate effectively, but that is perhaps slightly outside our role.

Barbara Keeley Portrait Barbara Keeley (Worsley and Eccles South) (Lab)
- Hansard - - - Excerpts

Q To what extent do you think services should publish publicly the transparency and risk assessments that they will be providing to Ofcom?

Richard Wronka: I think our starting point here is that we think transparency is a really important principle within the regime—a fundamental principle. There are specific provisions in the Bill that speak to that, but more generally we are looking for this regime to usher in a new era of transparency across the tech sector, so that users and other participants in this process can be clearer about what platforms are doing at the moment, how effective that is and what more might be done in the future. That is something that will be a guiding principle for us as we pick up regulation.

Specifically, the Bill provides for transparency reports. Not all services in scope will need to provide transparency reports, but category 1 and 2 services will be required to produce annual transparency reports. We think that is really important. At the moment, risk assessments are not intended to be published—that is not provided for in the Bill—but the transparency reports will show the effectiveness of the systems and processes that those platforms have put in place.

--- Later in debate ---
None Portrait The Chair
- Hansard -

I think Lynn Perry is back. Are you with us, Lynn? [Interruption.] No—okay. We will move on to Maria Miller.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q I have a question for the Children’s Commissioner. You talked just now about doing more on the advocacy of individual cases. I asked a question of Ofcom in the first panel about the issue of support for victims. Its response was that complaints processes will be part of what it will regulate. Do you think that will be enough to answer your concerns, or are you expecting more than simply ensuring that platforms do what they should do?

Dame Rachel de Souza: I absolutely think that we need to look at independent advocacy and go further. I do not think the Bill does enough to respond to individual cases of abuse and to understand issues and concerns directly from children. Children should not have to exhaust platforms’ ineffective complaints routes. It can take days, weeks, months. Even a few minutes or hours of a nude image being shared online can be enormously traumatising for children.

That should inform Ofcom’s policies and regulation. As we know, the risks and harms of the online world are changing constantly. It serves a useful purpose as an early warning mechanism within online safety regulation. I would like to see independent advocacy that allows a proper representation service for children. We need to hear from children directly, and I would like to see the Bill go further on this.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q Is there capacity in the sector to deliver what you are talking about?

Dame Rachel de Souza: I think we need to make capacity. There is some—the NSPCC has its Childline and, as Children’s Commissioner, I have my own advocacy service for children in care. I think this should function in that way, with direct access. So I think that we can create it.

Andy Burrows: May I come in briefly? Our proposals for user advocacy reflect the clear “polluter pays” principle that we think should apply here, to help build and scale up that capacity, but the levy that is covering the direct cost of regulation should also provide really effective user advocacy. That is really important not only in helping to give victims what they need in frontline services, but in ensuring that there is a strong counterbalance to some of the largest companies in the world for our sector, which has clear ambition but self-evident constraints.

Dame Rachel de Souza: One of the concerns that has come to me from children—I am talking about hundreds of thousands of children—over the past year is that there is not strong enough advocacy for them and that their complaints are not being met. Girls in particular, following the Everyone’s Invited concerns, have tried so hard to get images down. There is this almost medieval bait-out practice of girls’ images being shared right across platforms. It is horrendous, and the tech firms are not acting quickly enough to get those down. We need proper advocacy and support for children, and I think that they would expect that of us in this groundbreaking Bill.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q There has not been a huge amount of discussion of online gaming in the context of the Bill, despite the fact that for many young people that is the way in which they interact with other people online. Do you think the Bill covers online gaming adequately? A lot of interaction in online gaming is through oral communication—voice chat messages. Do you think that it is possible to properly regulate oral communications in gaming?

Dame Rachel de Souza: Good question. I applaud the Bill for what it does cover. We are looking at a Bill that, for the first time, is going to start protecting children’s rights online, so I am really pleased to see that. We have looked a bit at gaming in the past. In terms of harms, obviously the Bill does not cover gaming in full, but it does cover the safety aspects of children’s experience.

It is always good for us to be looking further. Gaming, we know, has some extremely harmful and individualistic issues with it, particularly around money and the profile of potential grooming and safety. In terms of communications, one of the reasons that I am so concerned about encryption and communications online is that it happens through gaming. We need to make sure that those elements are really firm.

Andy Burrows: It is vitally important that the gaming sector is in scope. We know that there are high-risk gaming sites—for example, Twitch—and gaming-adjacent services such as Discord. To go back to my earlier point about the need for cross-platform provisions to apply here, in gaming we can see grooming pathways that can take on a different character from those on social networks, for example, where we might see abuse pathways where that grooming is taking place at the same time, rather than sequentially from a gaming streaming service, say, to a gaming-adjacent platform such as Discord. I think it is very important that a regulator is equipped to understand the dynamics of the harms and how they will perhaps apply differently on gaming services. That is a very strong and important argument for user advocacy.

I would say a couple of things on oral communications. One-to-one oral communications are excluded from the Bill’s scope—legitimately—but we should recognise that there is a grooming risk there, particularly when that communication is embedded in a platform of wider functionality. There is an argument for a platform to consider all aspects of its functionality within the risk assessment process. Proactive scanning is a different issue.

There is a broader challenge for the Bill, and this takes us back to the fundamental objectives and the very welcome design based around systemic risk identification and mitigation. We know that right now, in respect of oral communications and livestream communications, the industry response is not as developed in terms of detecting and disrupting harm as it is for, say, text-based chat. In keeping with the risk assessment process, it should be clear that if platforms want to offer that functionality, they should have to demonstrate through the risk assessment process that they have high-quality, effective arrangements in place to detect and disrupt harm, and that should be the price of admission. If companies cannot demonstrate that, they should not be offering their services, because there is a high risk to children.

--- Later in debate ---
None Portrait The Chair
- Hansard -

I am sorry, I have to interrupt because of time. Maria Miller.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q Two hopefully quick questions. I have been listening carefully. Could you summarise the main changes you will make to your products that your users will notice and that will make them safer, whether they are children or adults? I have heard a lot about problems, but what are the changes you will actually make? Within that, could you talk about how you will improve your complaints system, which earlier witnesses said is inadequate?

Katy Minshall: We would certainly look to engage with Ofcom positively on the requirements it sets out. I am sorry to sound repetitive, but the challenge is that the Bill depends on so many things that do not exist yet, such as the definitions of what we mean by content harmful to adults or to children. In practice, that makes it challenging to say to you exactly today what approaches we would take. To be clear, we would of course look to continue working with the Government and now Ofcom with the shared objective of making the online space safer for everyone.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q I want to probe you a little on that. Harmful content not being defined means that you will not make any changes in that particular area, but it is quite a large Bill; surely there are other things you will do differently, no?

Katy Minshall: The lesson of the past three or four years is that we cannot wait for the Bill. We at Twitter are continuing to make changes to our product and our policies to improve safety for everyone, including children.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q So the Bill is irrelevant to you.

Katy Minshall: The Bill is a really important piece of regulation, which is why I was so pleased to come today and share our perspectives. We are continuing to engage positively with Ofcom. What I am trying to say is that until we see the full extent of the requirements and definitions, it is hard to set out exactly what steps we would take with regards to the Bill.

Ben Bradley: To add to that point, it is hard to be specific about some of the specific changes we would make because a lot of the detail of the Bill defers to Ofcom guidance and the codes of practice. Obviously we all have the duties around child safety and adult safety, but the Ofcom guidance will suggest specific measures that we can take to do that, some of which we may take already, and some of which may go further than what we already do. Once we see the details of the codes, we will be able to give a clearer answer.

Broadly from a TikTok perspective, through the design of the product and the way we approach safety, we are in a good place for when the new regime comes in, because we are regulated by Ofcom in the VSP regime, but we would have to wait for the full amount of detail. But outside some of the companies that you will hear from today, this will touch 20,000 companies and will raise the floor for all the companies that will be regulated under the regime.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q But you cannot give any further detail about specific changes you will make as a result of this legislation because you have not seen the guidance and the codes.

Ben Bradley: Yes, the codes of practice will recommend specific steps that we should take to achieve our duties. Until we see the detail of those codes it is hard to be specific about some of the changes that we would make.

None Portrait The Chair
- Hansard -

Barbara, you have just a couple of minutes.

Online Safety Bill (Second sitting)

Maria Miller Excerpts
None Portrait The Chair
- Hansard -

You are nodding, Ms Eagelton.

Jessica Eagelton: I agree with Professor McGlynn. Thinking about the broader landscape and intimate image abuse as well, I think there are some significant gaps. There is quite a piecemeal approach at the moment and issues that we are seeing in terms of enforcing measures on domestic abuse as well.

Maria Miller Portrait Mrs Maria Miller (Basingstoke) (Con)
- Hansard - -

Q Thank you to all the panellists; it is incredibly helpful to have you here. The strength of the Bill will really be underpinned by the strength of the criminal law that underpins it, and schedule 7 lists offences that relate to sexual images, including revenge pornography, as priority offences. Can the witnesses say whether they think the law is sufficient to protect women from having their intimate pictures shared without their consent, or indeed whether the Bill will do anything to prevent the making and sharing of deepfake images? What would you like to see?

Professor Clare McGlynn: You make a very good point about how, in essence, criminal offences are now going to play a key part in the obligations of platforms under this Bill. In general, historically, the criminal law has not been a friend to women and girls. The criminal law was not written, designed or interpreted with women’s harms in mind. That means that you have a very piecemeal, confusing, out-of-date criminal law, particularly as regards online abuse, yet that is the basis on which we have to go forward. That is an unfortunate place for us to be, but I think we can strengthen it.

We could strengthen schedule 7 by, for example, including trafficking offences. There are tens of thousands of cases of trafficking, as we know from yourselves and whistleblowers, that platforms could be doing so much more about, but that is not a priority offence. The Obscene Publications Act distribution of unlawful images offence is not included. That means that incest porn, for example, is not a priority offence; it could be if we put obscene publications in that provision. Cyber-flashing, which again companies could take a lot of steps to act against, is not listed as a priority offence. Blackmail—sexual extortion, which has risen exponentially during the pandemic—again is not listed as a priority offence.

Deepfake pornography is a rising phenomenon. It is not an offence in English law to distribute deepfake pornography at the moment. That could be a very straightforward, simple change in the Bill. Only a few words are needed. It is very straightforward to make that a criminal offence, thanks to Scots law, where it is actually an offence to distribute altered images. The way the Bill is structured means the platforms will have to go by the highest standard, so in relation to deepfake porn, it would be interpreted as a priority harm—assuming that schedule 7 is actually altered to include all the Scottish offences, and the Northern Irish ones, which are absent at the moment.

The deepfake example points to a wider problem with the criminal law on online abuse: the laws vary considerably across the jurisdictions. There are very different laws on down-blousing, deepfake porn, intimate image abuse, extreme pornography, across all the different jurisdictions, so among the hundreds of lawyers the platforms are appointing, I hope they are appointing some Scots criminal lawyers, because that is where the highest standard tends to be.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q Would the other panellists like to comment on this?

Jessica Eagelton: I think something that will particularly help in this instance is having that broad code of practice; that is a really important addition that must be made to the Bill. Refuge is the largest specialist provider of gender-based violence services in the country. We have a specialist tech abuse team who specialise in technology-facilitated domestic abuse, and what they have seen is that, pretty consistently, survivors are being let down by the platforms. They wait weeks and weeks for responses—months sometimes—if they get a response at all, and the reporting systems are just not up to scratch.

I think it will help to have the broad code of practice that Janaya mentioned. We collaborated with others to produce a workable example of what that could look like, for Ofcom to hopefully take as a starting point if it is mandated in the Bill. That sets out steps to improve the victim journey through content reporting, for example. Hopefully, via the code of practice, a victim of deepfakes and other forms of intimate image abuse would be able to have a more streamlined, better response from platforms.

I would also like to say, just touching on the point about schedule 7, that from the point of view of domestic abuse, there is another significant gap in that: controlling and coercive behaviour is not listed, but it should be. Controlling and coercive behaviour is one of the most common forms of domestic abuse. It carries serious risk; it is one of the key aggravating factors for domestic homicide, and we are seeing countless examples of that online, so we think that is another gap in schedule 7.

None Portrait The Chair
- Hansard -

Ms Walker?

Janaya Walker: Some of these discussions almost reiterate what I was saying earlier about the problematic nature of this, in that so much of what companies are going to be directed to do will be tied only to the specific schedule 7 offences. There have been lots of discussions about how you respond to some harms that reach a threshold of criminality and others that do not, but that really contrasts with the best practice approach to addressing violence against women and girls, which is really trying to understand the context and all of the ways that it manifests. There is a real worry among violence against women and girls organisations about the minimal response to content that is harmful to adults and children, but will not require taking such a rigorous approach.

Having the definition of violence against women and girls on the face of the Bill allows us to retain those expectations on providers as technology changes and new forms of abuse emerge, because the definition is there. It is VAWG as a whole that we are expecting the companies to address, rather than a changing list of offences that may or may not be captured in criminal law.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q My first question is for Lulu. Do small tech companies have enough staff with technical expertise to be able to fulfil their obligations under the Bill?

Lulu Freemont: It is a great question. One of the biggest challenges is capacity. We hear quite a lot from the smaller tech businesses within our membership that they will have to divert their staff away from existing work to comply with the regime. They do not have compliance teams, and they probably do not have legal counsel. Even at this stage, to try to understand the Bill as it is currently drafted—there are lots of gaps—they are coming to us and saying, “What does this mean in practice?” They do not have the answers, or the capability to identify that. Attendant regulatory costs—thinking about the staff that you have and the cost, and making sure the regulation is proportionate to the need to divert away from business development or whatever work you might be doing in your business—are really fundamental.

Another real risk, and something in the Bill that smaller businesses are quite concerned about, is the potential proposal to extend the senior management liability provisions. We can understand them being in there to enable the regulators to do their job—information requests—but if there is any extension into individual pieces of content, coupled with a real lack of definitions, those businesses might find themselves in the position of restricting access to their services, removing too much content or feeling like they cannot comply with the regime in a proportionate way. That is obviously a very extreme case study. It will be Ofcom’s role to make sure that those businesses are being proportionate and understand the provisions, but the senior management liability does have a real, chilling impact on the smaller businesses within our membership.

Adam Hildreth: One of the challenges that we have seen over the last few years is that you can have a business that is small in revenue but has a huge global user base, with millions of users, so it is not really a small business; it just has not got to the point where it is getting advertisers and getting users to pay for it. I have a challenge on the definition of a small to medium-sized business. Absolutely, for start-ups with four people in a room—or perhaps even still just two—that do not have legal counsel or anything else, we need to make it simple for those types of businesses to ingest and understand what the principles are and what is expected of them. Hopefully they will be able to do quite a lot early on.

The real challenge comes when someone labels themselves as a small business but they have millions of users across the globe—and sometimes actually quite a lot of people working for them. Some of the biggest tech businesses in the world that we all use had tens of people working for them at one point in time, when they had millions of users. That is the challenge, because there is an expectation for the big-tier providers to be spending an awful lot of money, when the small companies are actually directly competing with them. There is a challenge to understanding the definition of a small business and whether that is revenue-focused, employee-focused or about how many users it has—there may be other metrics.

Ian Stevenson: One of the key questions is how much staffing this will actually take. Every business in the UK that processes data is subject to GDPR from day one. Few of them have a dedicated data protection officer from day one; it is a role or responsibility that gets taken on by somebody within the organisation, or maybe somebody on the board who has some knowledge. That is facilitated by the fact that there is a really clear set of requirements there, and there are a lot of services you can buy and consume that help you deliver compliance. If we can get to a point where we have codes of practice that make very clear recommendations, then even small organisations that perhaps do not have that many staff to divert should be able to achieve some of the basic requirements of online safety by buying in the services and expertise that they need. We have seen with GDPR that many of those services are affordable to small business.

If we can get the clarity of what is required right, then the staff burden does not have to be that great, but we should all remember that the purpose of the Bill is to stop some of the egregiously bad things that happen to people as a result of harmful content, harmful behaviours and harmful contact online. Those things have a cost in the same way that implementing data privacy has a cost. To come back to Lulu’s point, it has to be proportionate to the business.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q Adam, you said a few moments ago that companies are starting to put safety at the core of what they do, which will be welcome to us all—maybe it should have happened a lot earlier. I know you have worked a lot in that area. Regulators and company owners will have to depend on an ethical culture in their organisations if they are going to abide by the new regulations, because company owners cannot micromanage and neither can regulators. Will the Bill do enough to drive that ethical culture? If not, what more could it do or could the industry do? I would be really interested in everybody’s answer to this one, but I will start with Adam.

Adam Hildreth: What we are seeing from the people that are getting really good at this and that really understand it is that they are treating this as a proper risk assessment, at a very serious level, across the globe. When we are talking about tier 1s, they are global businesses. When they do it really well, they understand risk and how they are going to roll out systems, technology, processes and people in order to address that. That can take time. Yes, they understand the risk, who it is impacting and what they are going to do about it, but they still need to train people and develop processes and maybe buy or build technology to do it.

We are starting to see that work being done really well. It is done almost in the same way that you would risk assess anything else: corporate travel, health and safety in the workplace—anything. It should really become one of those pillars. All those areas I have just gone through are regulated. Once you have regulation there, it justifies why someone is doing a risk assessment, and you will get businesses and corporates going through that risk assessment process. We are seeing others that do not do the same level of risk assessment and they do not have that same buy-in.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q Lulu, how do you drive a culture change?

Lulu Freemont: TechUK’s membership is really broad. We have cyber and defence companies in our membership, and large platforms and telcos. We speak on behalf of the sector. We would say that there is a real commitment to safety and security.

To bring it back to regulation, the risk-based approach is very much the right one—one that we think has the potential to really deliver—but we have to think about the tech ecosystem and its diversity. Lots of TechUK members are on the business-to-business side and are thinking about the role that they play in supporting the infrastructure for many of the platforms to operate. They are not entirely clear that they are exempt in the Bill. We understand that it is a very clear policy intention to exempt those businesses, but they do not have the level of legal clarity that they need to understand their role as access facilities within the tech.

That is just one example of a part of the sector that you would not expect to be part of this culture change or regulation but which is being caught in it slightly as an unintended consequence of legal differences or misinterpretations. Coming from that wide-sector perspective, we think that we need clarity on those issues to understand the different functionalities, and each platform and service will be different in their approach to this stuff.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q Ian, how do you drive a culture change in the sector?

Ian Stevenson: I think you have to look at the change you are trying to effect. For many people in the sector, there is a lack of awareness about what happens when the need to consider safety in building features is not put first. Even when you realise how many bad things can happen online, if you do not know what to do about it, you tend not to be able to do anything about it.

If we want to change culture—it is the same for individual organisations as for the sector as a whole—we have to educate people on what the problem is and give them the tools to feel empowered to do something about it. If you educate and empower people, you remove the barrier to change. In some places, an extremely ethical people-centric and safety-focused culture very naturally emerges, but in others, less so. That is precisely where making it a first-class citizen in terms of risk assessment for boards and management becomes so important. When people see management caring about things, that gets pushed out through the organisations.

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q In your view, what needs to be added or taken away from the Bill to help it achieve the Government’s aim of making the UK

“the safest place in the world to be online”?

Lulu Freemont: First, I want to outline that there are some strong parts in the Bill that the sector really supports. I think the majority of stakeholders would agree that the objectives are the right ones. The Bill tries to strike a balance between safety, free speech and encouraging innovation and investment in the UK’s digital economy. The approach—risk-based, systems-led and proportionate—is the right one for the 25,000 companies that are in scope. As it does not focus on individual pieces of content, it has the potential to be future-proof and to achieve longer-term outcomes.

The second area in the Bill that we think is strong is the prioritisation of illegal content. We very much welcome the clear definitions of illegal content on the face of the Bill, which are incredibly useful for businesses as they start to think about preparing for their risk assessment on illegal content. We really support Ofcom as the appropriate regulator.

There are some parts of the Bill that need specific focus and, potentially, amendments, to enable it to deliver on those objectives without unintended consequences. I have already mentioned a few of those areas. The first is defining harmful content in primary legislation. We can leave it to codes to identify the interpretations around that, but we need definitions of harmful content so that businesses can start to understand what they need to do.

Secondly, we need clarity that businesses will not be required to monitor every piece of content as a result of the Bill. General monitoring is prohibited in other regions, and we have concerns that the Online Safety Bill is drifting away from those norms. The challenges of general monitoring are well known: it encroaches on individual rights and could result in the over-removal of content. Again, we do not think that the intention is to require companies of all sizes to look at every piece of content on their site, but it might be one of the unintended consequences, so we would like an explicit prohibition of general monitoring on the face of the Bill.

We would like to remove the far-reaching amendment powers of the Secretary of State. We understand the need for technical powers, which are best practised within regulation, but taking those further so that the Secretary of State can amend the regime in such an extreme way to align with public policy is of real concern, particularly to smaller businesses looking to confidently put in place systems and processes. We would like some consideration of keeping senior management liability as it is. Extending that further is only going to increase the chilling impact that it is having and the environment it is creating within UK investment. The final area, which I have just spoken about, is clarifying the scope. The business-to-business companies in our membership need clarity that they are not in scope and for that intention to be made clear on the face of the Bill.

We really support the Bill. We think it has the potential to deliver. There are just a few key areas that need to be changed or amended slightly to provide businesses with clarity and reassurances that the policy intentions are being delivered on.

Adam Hildreth: To add to that—Lulu has covered absolutely everything, and I agree—the critical bit is not monitoring individual pieces of content. Once you have done your risk assessment and put in place your systems, processes, people and technology, that is what people are signing up for. They are not signing up for this end assessment where, because you find that one piece of harmful content exists, or maybe many, you have failed to abide by what you are really signing up to.

That is the worry from my perspective: that people do a full risk assessment, implement all the systems, put in place all the people, technology and processes that they need, do the best job they can and have understood what investment they are putting in, and someone comes along and makes a report to a regulator—Ofcom, in this sense—and says, “I found this piece of content there.” That may expose weaknesses, but the very best risk assessments are ongoing ones anyway, where you do not just put it away in a filing cabinet somewhere and say, “That’s done.” The definitions of online harms and harmful content change on a daily basis, even for the biggest social media platforms; they change all the time. There was talk earlier about child sexual abuse material that appears as cartoons, which would not necessarily be defined by certain legislation as illegal. Hopefully the legislation will catch up, but that is where that risk assessment needs to be made again, and policies may need to be changed and everything else. I just hope we do not get to the point where the individual monitoring of content, or content misses, is the goal of the Bill—that the approach taken to online safety is this overall one.

--- Later in debate ---
Barbara Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q Jared Sine, did you have anything to add?

Jared Sine: Sure. I would add a couple of thoughts. We run our own age verification scans, which we do through the traditional age gate but also through a number of other scans that we run.

Again, online dating platforms are a little different. We warn our users upfront that, as they are going to be meeting people in real life, there is a fine balance between safety and privacy, and we tend to lean a little more towards safety. We announce to our users that we are going to run message scans to make sure there is no inappropriate behaviour. In fact, one of the tools we have rolled out is called “Are you sure? Does this bother you?”, through which our AI looks at the message a user is planning to send and, if it is an inappropriate message, a flag will pop up that says, “Are you sure you want to send this?” Then, if they go ahead and send it, the person receiving it at the other end will get a pop-up that says, “This may not be something you want to see. Go ahead and click here if you want to.” If they open it, they then get another pop-up that asks “Does this bother you?” and, if it does, you can report the user immediately.
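As a rough illustration of the prompt sequence described here, the flow might be structured as below; the classifier, function names and messages are hypothetical stand-ins rather than Match Group's actual implementation.

```python
# Illustrative sketch of the "Are you sure? / Does this bother you?" flow:
# a classifier scores an outgoing message, the sender is asked to confirm,
# and the recipient is warned and offered a one-tap report. All names are hypothetical.
def looks_inappropriate(message: str) -> bool:
    # Stand-in for the ML model that scores a message before it is sent.
    return "abusive" in message.lower()

def sender_flow(message: str, confirm_send) -> bool:
    """'Are you sure?' step: ask the sender to confirm before sending."""
    if looks_inappropriate(message):
        return confirm_send("Are you sure you want to send this?")
    return True

def recipient_flow(message: str, confirm_open, report_user) -> None:
    """'Does this bother you?' steps on the receiving side."""
    if not looks_inappropriate(message):
        return
    if confirm_open("This may not be something you want to see. Open it?"):
        if report_user("Does this bother you?"):
            print("User reported to the moderation team.")

# Example wiring with simple stand-in callbacks for the UI prompts.
if sender_flow("an abusive message", confirm_send=lambda prompt: True):
    recipient_flow(
        "an abusive message",
        confirm_open=lambda prompt: True,
        report_user=lambda prompt: True,
    )  # prints "User reported to the moderation team."
```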

We think that is an important step to keep our platform safe. We make sure our users know that it is happening, so it is not under the table. However, we think there has to be a balance between safety and privacy, especially when we have users who are meeting in person. We have actually demonstrated on our platforms that this reduces harassment and behaviour that would otherwise be untoward or that you would not want on the platform.

We think we have to be careful not to tie the hands of industry, so that it can come up with technological solutions and advances that work side by side with third-party tools and solutions. We have third-party ID verification tools that we use. If we identify or believe a user is under the age of 18, we push them through an ID verification process.

The other thing to remember, particularly as it relates to online dating, is that companies such as ours and Bumble have done the right thing by saying “18-plus only on our platforms”. There is no law that says that an online dating platform has to be 18-plus, but we think it is the right thing to do. I am a father of five kids; I would not want kids on my platform. We are very vigilant in taking steps to make sure we are using the latest and greatest tools available to try to make sure that our platforms are safe.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q Rachel, we have, in you, what we are told is a leading, pre-eminent authority on the issue of age verification, so we are listening very carefully to what you say. I am thinking about the evidence we had earlier today, which said that it is reasonably straightforward for a large majority of young people to subvert age verification through the use of VPNs. You have been advocating third-party verification. How could we also deal with this issue of subverting the process through the use of VPNs?

Dr Rachel O'Connell: I am the author of the technical standard PAS 1296, an age checking code of practice, which is becoming a global standard at the moment. We worked a lot with privacy and security and identity experts. It should have taken nine months, but it took a bit longer. There was a lot of thought that went into it. Those systems were developed to, as I just described, ensure a zero data, zero knowledge kind of model. What they do is enable those verifications to take place and reduce the requirement. There is a distinction between monitoring your systems, as was said earlier, for age verification purposes and abuse management. They are very different. You have to have abuse management systems. It is like saying that if you have a nightclub, you have to have bouncers. Of course you have to check things out. You need bouncers at the door. You cannot let people go into the venue, then afterwards say that you are spotting bad behaviour. You have to check at the door that they are the appropriate age to get into the venue.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q Can they not just hop on a VPN and bypass the whole system anyway?

Dr Rachel O'Connell: I think you guys will be aware of the DCMS programme of work about the verification of children last year. As part of that, there was a piece of research that asked children what they would think about age verification. The predominant thing that came across from young children is that they are really tired of having to deal with weirdos and pervs. It is an everyday occurrence for them.

To just deviate slightly to the business model, my PhD is in forensics and tracking paedophile activity on the internet way back in the ’90s. At that time, guys would have to look for kids. Nowadays, on TikTok and various livestream platforms, the algorithms recognise that an individual—a man, for example—is very interested in looking at content produced by kids. The algorithms see that a couple of times and go, “You don’t have to look anymore. We are going to seamlessly connect you with kids who livestream. We are also going to connect you with other men that like looking at this stuff.”

If you are on these livestream sites at 3 o’clock in the morning, you can see these kids who are having sleepovers or something. They put their phone down to record whatever the latest TikTok dance is, and they think that they are broadcasting to other kids. You would assume that, but what they then hear is the little pops of love hearts coming on to the screen and guys’ voices saying, “Hey sweetie, you look really cute. Lick your lips. Spread your legs.” You know where I am going with this.

The Online Safety Bill should look at the systems and processes that underpin these platforms, because there is gamification of kids. Kids want to become influencers—maybe become really famous. They see the views counter and think, “Wow, there are 200 people looking at us.” Those people are often men, who will co-ordinate their activities at the back. They will push the boys a little bit further, and if a girl is on her own, they will see. If the child does not respond to the request, they will drop off. The kid will think, “Oh my God. Well, maybe I should do it this one time.”

What we have seen is a quadrupling of child sexual abuse material online that has been termed “self-generated”, because the individual offender has not actually produced it. From a psychological perspective, it is a really bad name, but that is a separate topic. Imagine if that was your kid who had been coerced into something that had then been labelled as “self-generated”. The business models that underpin those processes that happen online are certainly something that should be within scope.

We do not spend enough time thinking about the implications of the use of recommendation engines and so on. I think the idea of the VPN is a bit of a red herring. Children want safety. They do not want to have to deal with this sort of stuff online. There are other elements. If you were a child and felt that you might be a little bit fat, you could go on YouTube and see whether you could diet or something. The algorithms will pick that up also. There is a tsunami of dieting and thinspiration stuff. There is psychological harm to children as a result of the systems and processes that these companies operate.

There was research into age verification solutions and trials run with BT. Basically, the feedback from both parents and children was, “Why doesn’t this exist already?” If you go into your native EE app where it says, “Manage my family”, and put in your first name, last name and mobile number and your child’s first name, last name and date of birth, it is then verified that you are their parent. When the child goes on Instagram or TikTok, they put in their first and last name. The only additional data point is the parent’s mobile number. The parent gets a notification and they say yes or no to access.

There are solutions out there. As others have mentioned, the young people want them and the parents want them. Will people try to work around them? That can happen, but if it is a parent-initiated process or a child-initiated process, you have the means to know the age bands of the users. From a business perspective, it makes a lot of sense because you can have a granular approach to the offerings you give to each of your customers in different age bands.
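
A purely illustrative sketch of the parent-approval flow described above, assuming a hypothetical operator-held family record and notification step; none of the names, fields or services below come from any published specification, and the real trials will differ in detail.

from dataclasses import dataclass
from datetime import date
from typing import Callable, Optional

@dataclass
class ParentRecord:
    # Created when a parent registers the family with their mobile operator.
    parent_name: str
    parent_mobile: str
    child_name: str
    child_dob: date

# Hypothetical store of operator-verified parent/child records.
VERIFIED_FAMILIES = {
    ("+447700900123", "alex example"): ParentRecord(
        "Sam Example", "+447700900123", "Alex Example", date(2011, 5, 2)
    ),
}

def age_in_years(dob: date) -> int:
    today = date.today()
    return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))

def request_child_signup(child_name: str, parent_mobile: str,
                         ask_parent: Callable[[ParentRecord], bool]) -> Optional[str]:
    # Return an age band if the parent approves the sign-up, otherwise None.
    # `ask_parent` stands in for the push notification the parent answers.
    record = VERIFIED_FAMILIES.get((parent_mobile, child_name.lower()))
    if record is None:
        return None          # no verified parent/child match held by the operator
    if not ask_parent(record):
        return None          # parent declined access
    years = age_in_years(record.child_dob)
    return "under-13" if years < 13 else ("13-17" if years < 18 else "18-plus")

# The platform only ever learns the age band, not the date of birth.
print(request_child_signup("Alex Example", "+447700900123", ask_parent=lambda r: True))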

Nima Elmi: Just to add to what Rachel has said, I think she has articulated extremely well the complexities of the issues around not only age verification, but business models. Ultimately, this is such a complex matter that it requires continued consultation across industry, experts and civil society to identify pragmatic recommendations for industry when it comes to not only verifying the age of their users, but thinking about the nuanced differences between platforms, purposes, functionality and business models, and what that means.

In the context of the work we do here at Bumble, we are clear about our guidelines requiring people to be 18-plus to download our products from app stores, as well as ensuring that we have robust moderation processes to identify and remove under-18s from our platforms. There is an opportunity here for the Bill to go further in providing clarity and guidance on the issue of accessibility of children to services.

Many others have said over the course of today’s evidence that there needs to be a bit more colour put into definitions, particularly when certain sections of the Bill refer to what constitutes a “significant number of users” for determining child accessibility to platforms. Coupled with the fact that age verification or assurance is a complex area in and of itself and the nuance between how social media may engage with it versus a dating or social networking platform, I think that more guidance is very much needed and a much more nuanced approach would be welcome.

None Portrait The Chair
- Hansard -

I have three Members and the Minister to get in before 5 o’clock, so I urge brief questions and answers please.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Ms McDonald?

Rhiannon-Faye McDonald: I was just thinking of what I could add to what Susie has said. My understanding is that it is difficult to deal with cross-platform abuse because of the difficulty of sharing information between different platforms—for example, where a platform has identified an issue or an offender and not shared that information with other platforms on which someone may continue the abuse. I am not an expert in tech and cannot present you with a solution to that, but I feel that sharing intelligence would be an important part of the solution.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q What risks do end-to-end encrypted platforms pose to children, and how should the Bill seek to mitigate those risks specifically?

Susie Hargreaves: We are very clear that end-to-end encryption should be within scope, as you have heard from other speakers today. Obviously, the huge threat on the horizon is the end-to-end encryption on Messenger, which would result in the loss of millions of images of child sexual abuse. In common with previous speakers, we believe that the technology is there. We need not to demonise end-to-end encryption, which in itself is not bad; what we need to do is ensure that children do not suffer as a consequence. We must have mitigations and safety mechanisms in place so that we do not lose these child sexual abuse images, because that means that we will not be able to find and support those children.

Alongside all the child protection charities, we are looking to ensure that protections equivalent to the current ones are in place in the future. We do not accept that the internet industry cannot put them in place. We know from experts such as Dr Hany Farid, who created PhotoDNA, that those mechanisms and protections exist, and we need to ensure that they are put in place so that children do not suffer as a consequence of the introduction of end-to-end encryption. Rhiannon has her own experiences as a survivor, so I am sure she would agree with that.

Rhiannon-Faye McDonald: I absolutely would. I feel very strongly about this issue, which has been concerning me for quite some time. I do not want to share too much, but I am a victim of online grooming and child sex abuse. There were images and videos involved, and I do not know where they are and who has seen them. I will never know that. I will never have any control over it. It is horrifying. Even though my abuse happened 19 years ago, I still walk down the street wondering whether somebody has seen those images and recognises me from them. It has a lifelong impact on the child, and it impacts on recovery. I feel very strongly that if end-to-end encryption is implemented on platforms, there must be safeguards in place to ensure we can continue to find and remove these images, because I know how important that is to the subject of those images.

Maria Miller Portrait Mrs Miller
- Hansard - -

Q So what needs to change in the Bill to make sure that happens? I am not clear.

Susie Hargreaves: We just want to make sure that the ability to scan in an end-to-end encrypted environment is included in the Bill in some way.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Q The ability to scan is there right now—we have got that—so you are just trying to make sure we are standing still, basically. Am I correct in my understanding?

Susie Hargreaves: I think with technology you can never stand still. We do not know what is coming down the line. We have to deal with the here and now, but we also need to be prepared to deal with whatever comes down the line. The answer, “Okay, we will just get people to report,” is not a good enough replacement for the ability to scan for images.

When the privacy directive was introduced in Europe and Facebook stopped scanning for a short period, we lost millions of images. What we know is that we must continue to have those safety mechanisms in place. We need to work collectively to do that, because it is not acceptable to lose millions of images of child sexual abuse and create a forum where people can safely share them without any repercussions, as Rhiannon says. One survivor we talked to in this space said that one of her images had been recirculated 70,000 times. The ability to have a hash of a unique image, go out and find those duplicates and make sure they are removed means that people are not re-victimised on a daily basis. That is essential.
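
A rough illustration of the hash-and-match approach the witnesses describe, assuming a hypothetical list of catalogued hashes; real systems such as PhotoDNA use perceptual hashing so that resized or re-encoded copies still match, which this simplified exact-hash sketch does not attempt.

import hashlib
from pathlib import Path

# Hypothetical hashes of images already confirmed and catalogued.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def sha256_of_file(path: Path) -> str:
    # Stream the file in chunks so large images do not have to fit in memory.
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def find_known_duplicates(upload_dir: Path) -> list:
    # Return uploaded files whose hash matches a catalogued image.
    return [p for p in upload_dir.glob("*")
            if p.is_file() and sha256_of_file(p) in KNOWN_HASHES]

# Matching uploads can be queued for removal without anyone viewing the content.
for match in find_known_duplicates(Path("uploads")):
    print(f"Match found, queue for removal: {match}")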

Online Safety Bill

Maria Miller Excerpts
2nd reading
Tuesday 19th April 2022


Commons Chamber
Maria Miller Portrait Mrs Maria Miller (Basingstoke) (Con)
- Hansard - -

For too long, the tech giants have been able to dismiss the harms they create for the people we represent because they do not take seriously their responsibility for how their products are designed and used, which is why this legislation is vital.

The Bill will start to change the destructive culture in the tech industry. We live simultaneously in online and offline worlds, and we expect the rules and the culture to be the same in both, but at the moment, they are not. When I visited the big tech companies in Silicon Valley as Secretary of State in 2014 to talk about online moderation, which was almost completely absent at that stage, and child abuse images, which were not regularly removed, I rapidly concluded that the only way to solve the problem and the cultural deficit I encountered would be to regulate. I think this Bill has its roots in those meetings, so I welcome it and the Government’s approach.

I am pleased to see that measures on many of the issues on which I have been campaigning in the years since 2014 have come to fruition in this Bill, but there is still room for improvement. I welcome the criminalisation of cyber-flashing, and I pay tribute to Grazia, Clare McGlynn and Bumble for all their work with me and many colleagues in this place.

Wera Hobhouse Portrait Wera Hobhouse
- Hansard - - - Excerpts

Scotland banned cyber-flashing in 2010, but that ban includes a motivation test, rather than just a consent test, so a staggering 95% of cyber-flashing goes unpunished. Does the right hon. Lady agree that we should not make the same mistake?

Maria Miller Portrait Mrs Miller
- Hansard - -

I will come on to that shortly, and the hon. Lady knows I agree with her. This is something the Government need to take seriously.

The second thing I support in this Bill is limiting anonymous online abuse. Again, I pay tribute to the Football Association, with which I have worked closely, Glitch, the Centenary Action Group, Compassion in Politics, Hope not Hate and Kick It Out. They have all done a tremendous job, working with many of us in this place, to get to this point.

Finally, I support preventing children from accessing pornography, although I echo what we heard earlier about it being three years too late. It is shameful that this measure was not enacted earlier.

The Minister knows that three demands are coming his way from me. We need to future-proof our approach to the law in this area. Tech moves quickly—quicker than the Government’s approach to legislation, which leaves us playing whack-a-mole. The devious methods of causing harm change rapidly, as do the motivations of perpetrators, to answer the point raised by the hon. Member for Bath (Wera Hobhouse). What stays the same is the lack of consent from victims, so will the Government please look at that as a way of future-proofing our law? A worrying example of that is deepfake technology that creates pornographic images of women. That is currently totally lawful. Nudification software is commercially available and uses images—only of women—to create nude images. I have already stated publicly that that should be banned. It has been banned in South Korea and Taiwan, yet our law is playing catch-up.

The second issue that the Government need to address is the fact that they are creating many more victims as a result of this Bill. We need to make sure that victim support is in place to augment the amazing work of organisations such as the Revenge Porn Helpline. Finally, to echo the point made by my hon. Friend the Member for Watford (Dean Russell), let me say that this is a complex area, as we are proving with every speech in this debate. I pay tribute to the Select Committee Chair, who is no longer in his place, and the Joint Committee Chair, but I believe that we need a joint standing committee to scrutinise the implementation of this Bill when it is enacted. This is a world-class piece of legislation to change culture, but we also need other countries to adopt a similar approach. A global approach is needed if this is to work to end the wild west.

UK City of Culture: Southampton’s Bid

Maria Miller Excerpts
Tuesday 19th April 2022


Commons Chamber
Caroline Nokes Portrait Caroline Nokes
- Hansard - - - Excerpts

Of course, I agree with my hon. Friend; he is absolutely bang-on and I will mention some of the fantastic attributes Eastleigh is bringing to the wider bid. I am heartened by the strength of the partnerships supporting the bid, as my hon. Friend emphasises.

Maria Miller Portrait Mrs Maria Miller (Basingstoke) (Con)
- Hansard - -

On that, may I point out that all parts of Hampshire would be interested in partnering with the city of Southampton in its bid to be city of culture? My own constituency of Basingstoke brings the likes of the Anvil theatre, one of the top 10 concert halls in Europe, as well as the Haymarket and the Proteus theatre. There is a wealth of support there for this bid, and it can also help with the legacy, which is so important and which I know my right hon. Friend puts great store by.

Draft Online Safety Bill Report

Maria Miller Excerpts
Thursday 13th January 2022


Commons Chamber
Maria Miller Portrait Mrs Maria Miller (Basingstoke) (Con)
- View Speech - Hansard - -

Back in 2013, when I was Secretary of State for Culture, Media and Sport—I am one of at least two former Culture Secretaries in this debate—I visited a number of social media companies in Silicon Valley. I spent time looking at their ethos, priorities and vision, and I rapidly came to the conclusion that statutory regulation would be absolutely essential for the sector in the UK; I pay tribute to my right hon. and learned Friend the Member for Kenilworth and Southam (Jeremy Wright) for taking that forward.

I welcome the Joint Committee’s work on this issue. The report is absolutely right to start by saying that self-regulation of online services has failed. It has failed to ensure the design of products that are safe to use, that products are age-appropriate and that content is properly monitored and moderated—hygiene factors in almost any other sector in this country.

When she gave evidence to the Joint Committee, the Secretary of State was completely right in saying that the first priority has to be to end online abuse. Regulation will only ever be part of that, as my hon. Friend the Member for Gosport (Dame Caroline Dinenage) said: there are many other elements to the issue. I would like to touch on a couple of those and underline some of the points in the report.

Better law that properly recognises new types of crime in the online world has to be a priority of the Government now—they have gone a great deal of the way towards recognising that following the Law Commission’s reports, but there is much more to do, particularly given the violence against women and girls strategy and its acknowledgement that online abuse disproportionately affects women. The Joint Committee’s report touches on this issue, but I would go further.

The Government have already recognised the need to criminalise cyber-flashing and deepfakes, but the development of such things as nudification software shows how sickeningly ingenious the sector is in inventing new and different ways to perpetrate harm. When it comes to online harm, we need to make sure that the law is better and future-proofed. As well as putting current laws into the Bill, will the Minister produce a schedule of other areas against which the Government intend to legislate—particularly in the area of intimate image abuse—to make sure that all the law is up to date? The regulator needs a strong criminal law in place if it is to be effective.

I underline the issue of individual redress. I whole-heartedly support the Joint Committee recommendation on reporting mechanisms and the idea of an online safety ombudsman; that will be really important to make sure that the Bill does not appear to our constituents, the people who suffer online abuse, as something remote and not practically helpful in their day-to-day life online. I would go one step further and make sure that victim support groups are also well funded because many more victims will be coming forward.

A number of right hon. and hon. Members have touched on the issue of anonymity. We know that the Bill is silent on that issue at the moment, and the Joint Committee is right to recommend changes. I do not fully agree with its recommendations; in my view, the evidence shows that anonymity creates an atmosphere of impunity for those who are abusive. Polling by Compassion in Politics shows that more than 80% of social media users would provide evidence to get verified accounts, and making verified accounts the default option—not the only option—would help to stop some of the abuse. It would not stop all of the abuse, of course, but it would help to change the ethos.

The Joint Committee’s report underlines the importance of having evidence, as the lack of transparency and reporting makes scrutiny of the impact of social media very difficult. I wholeheartedly support the idea of ensuring better, clearer and more transparent reporting.

A self-regulated online world has failed. Statutory regulation is the only way forward, but we need to encourage others around the world to follow suit. I applaud the Government for their approach, and I would add to the report’s recommendations by saying that the Government should use their work in developing strong relations around the world and in establishing new trade agreements to discuss the stake we all have in a safe and healthy online world.

Elected Women Representatives: Online Abuse

Maria Miller Excerpts
Tuesday 20th April 2021


Westminster Hall


Maria Miller Portrait Mrs Maria Miller (Basingstoke) (Con)
- Hansard - -

I beg to move,

That this House has considered online abuse of elected women representatives.

It is a great pleasure to serve under your chairmanship, Mr Paisley. I start by thanking the Backbench Business Committee for granting this debate, which was called for by members of the all-party parliamentary group on women in Parliament. It is a great privilege for me to be able to open the debate. May I, in advance, thank hon. Members for their overwhelming support for the debate today? I also thank the organisations that have provided briefings, including the Local Government Association, the Centenary Action Group, the Fawcett Society and Compassion in Politics.

Parliaments are at their best when they are diverse. King’s College London’s global institute for women’s leadership, chaired by Julia Gillard, put it well when it showed the evidence that equal participation of men and women strengthens our democracy. We have made huge progress here: one in three of our Members is now female. Record numbers of women stood for election in 2019, and record numbers were elected to this place. They were inspired to make a difference to their community and their country; inspired not to accept the world as it is but to put themselves forward, despite the barriers that remain, to be here and to be able to change things; and inspired by many of the right hon. and hon. Members taking part in the debate today. Many Members, including in this debate, do not want to accept politics as it is today—a politics in which aggression and abuse are part and parcel of our working day. That is not the democracy that we have signed up for. We want a democracy that relies not on that, but on reasoned debate.

The evidence is also clear that online abuse is not only a factor in preventing women from taking on careers in politics; it is also cutting short the careers of those who stand and are successful in being elected. Members of the APPG on women in Parliament called for this debate because tackling online abuse, particularly when it impacts the freedom of speech of elected Members, needs to be a bigger priority, not only for the Government but for Parliament.

Mr Speaker has done an incredible amount of work to improve the physical security of Members. I very much welcome in particular the statement on 9 March 2021 by my hon. Friend the Minister for the Constitution and Devolution, updating the House on progress on tackling intimidation in public life, but three years on from the original review, we are still waiting for a cohesive plan, with Parliament and Government working together. We need to redouble our efforts in this area, ensuring that we have a comprehensive package of the right laws to tackle online abuse, the right support and awareness of the law that is in place, the right punishments for those who seek to silence democratically elected representatives, and the right support for staff and colleagues who are victims of these dreadful abuses.

Let us be clear about what this abuse looks like. It is not offensive language. It is not strongly held views in political debate. For some colleagues, online abuse is a threat of rape, murder, stalking, or physical violence towards them or their families. At other times it is mass co-ordinated online harassment by groups. Online anonymity means the police can find it difficult to take action swiftly. Above all, online abuse is an attempt to keep women away from contributing to political life. As elected representatives, we condemn those who take that approach, and we need to redouble our efforts to stop them.

I want to be clear: MP or not, there is absolutely no place for any vile abuse online and the law needs to protect everyone better. I look forward to the introduction of the Government’s online harms Bill as soon as possible. Legislation needs to specifically address online abuse, which is designed to bully, intimidate and silence. When it comes to elected representatives, we need to recognise that the impact of that abuse is particularly concerning and unacceptable. In a free and open democracy, no elected representative should ever be intimidated or feel constrained in speaking out on behalf of their constituents, but that, at the moment, is exactly what happens.

This debate is about elected female representatives who endure online abuse disproportionately, but I know that many, including the Local Government Association, are concerned that those from black and minority ethnic backgrounds, members of the LGBT+ community and those with disabilities are also being particularly targeted.

The current police advice is to remove victims’ social media presence, which is completely unacceptable as it prevents candidates from campaigning online and engaging effectively with residents once they are elected. It is clear that action is needed swiftly, well before the next general election, if we are to stop that unacceptable impact on our democracy.

Let us look at the evidence before us. According to research from the Fawcett Society, elected women MPs already report that the levels of online abuse and treatment from the media that they experience would have stopped them putting themselves forward for selection, had they known the extent to which it would affect them and their family lives. That might help to account for why women’s tenure as Members of Parliament is significantly shorter than that of male Members. Unfortunately, the abuse is not limited to them. Family members, children and often staff members are also subjected to harassment, and it puts women off standing in the first place.

In its Equal Power project, the Fawcett Society found that almost 70% of respondents cited abuse or harassment as a reason for not pursuing a career in politics. The evidence is really clear. Although the rise in abuse impacts all candidates, the increase in hate and abuse towards women over the past three general elections was almost double that experienced by men, according to data from the Constitution Unit at University College London. The University of Sheffield’s research has shown that rising abuse against women MPs has persisted during the pandemic. The role of online media in delivering that abuse is significant.

The Minister will no doubt say that what is illegal offline is illegal online, so the police can and do take action on threats to murder, rape, sexually assault and stalk. There have been many cases of online abuse involving female MPs and many convictions in the past 12 months. There is no definitive figure, however, on the number of reports, the number of cases taken forward, and the number of convictions. If Parliament is to take its equality duties seriously, it should closely monitor that, and commission research on why male and female elected Members are treated so differently on social media to ensure that measures are in place to deal with it.

This is not just a UK problem. In 2016, a study by the Inter-Parliamentary Union found that 82% of women politicians surveyed in 39 countries had experienced some form of psychological violence. Indeed, 44% had received death threats, or threats of rape, beatings or abduction, and 65% had been subject to sexist remarks. In our year of hosting the G7, let us use this opportunity to draw nations together to voice our solidarity with women across the globe by pledging our support for democratically elected women, and to advocate laws and support that stop the abuse worldwide.

Gone are the days when we did not know what to do about online abuse. We know. We need to have clear laws and the Government must not only promptly bring forward the legislation to introduce the new electoral sanction on intimidation, but use the online harms Bill to recognise the harm faced by women and elected representatives when they deal with social media.

Research by the think-tank Compassion in Politics found that more than one in four people is put off posting on social media because of dreadful abuse. We cannot have an open and honest pluralistic political debate online in an atmosphere in which people are scared to speak up. I hope the Government will also look to tackle anonymous social media accounts in the online harms Bill. I support the idea of giving people the opportunity to create verified accounts by supplying a piece of personal identification, and also of having the ability to filter out unverified accounts. Does the Minister agree that we need to change the culture from the police recommendation that victims move offline, and move to social media platforms banning abusers, with sanctions to incentivise social media platforms to act quickly when abuse actually happens?

The House of Commons itself, as an institution, has a role to play in this issue. Online abuse must be treated by the House of Commons authorities as workplace harassment and a safety hazard, and they should provide training and support—not only to MPs but to our constituency staff, who often have to deal with this abuse on a daily basis—so that we can all feel safer, more prepared and better able to protect ourselves. Political parties have a role, particularly when it comes to candidates. My right hon. Friend the Member for Cannock Chase (Amanda Milling) has already made significant changes in the support provided by the Conservative party. The House of Commons has a duty to protect our democracy and should routinely collect and analyse data that breaks down why we fail to have a fully representative Parliament, including the impact of online abuse on female MPs.

I want the Government to think about how we fund the programmes to tackle online abuse. According to the Office for National Statistics, the new digital services tax of 2% on tech giants such as Facebook, Google and Twitter raised £29 million in the first month of operation alone. Let us take the polluter pays principle and use the money yielded from that to provide the sort of support against online abuse that women and girls need more generally, but that is also needed to protect democratically elected women both at local and national level.

Sixteen months ago, we celebrated record numbers of women being elected to Parliament and we still celebrate and encourage more women to stand for election. All the main parties agree that a 50:50 Parliament would be a better place, but the facts speak loudly. The Fawcett Society’s Equal Power project found that almost 60% of women surveyed said they were unlikely to stand as an MP and 44% were unlikely to stand as a councillor. Nearly a year on from that, those numbers have risen to 74% and 62%, respectively. This is a worrying state of affairs, particularly when we see that 69% of respondents cited abuse or harassment as the key reason for not pursuing a career in politics. Online abuse is a significant part of that. This is a barrier that has to be lifted, for the strength of our democracy. It is in our power to tackle this. Parliament, Government and parties have to work together to do just that.

Ian Paisley Portrait Ian Paisley (in the Chair)
- Hansard - - - Excerpts

Before I call our next speaker, who will be Caroline Nokes—just to give her early warning—Members will have four minutes to speak.

--- Later in debate ---
Maria Miller Portrait Mrs Miller
- Hansard - -

I thank all right hon. and hon. Members for taking part in the debate. The Government need to get the law right, and we have heard from the Minister, but Parliament and the House of Commons need to play their role, too. Online abuse is an affront to our democracy, and it is a workplace safety issue. We cannot allow women to be bullied out of public life by online abuse, because that weakens our democracy. The House of Commons needs to up its game in monitoring and supporting Members and their staff, and the APPG stands ready to assist with that work.

Question put and agreed to.

Resolved,

That this House has considered online abuse of elected women representatives.