All 55 contributions to the Online Safety Act 2023 (Ministerial Extracts Only)


Tue 19th Apr 2022
Online Safety Bill
Commons Chamber

2nd reading
Tue 24th May 2022
Tue 24th May 2022
Thu 26th May 2022
Online Safety Bill (Third sitting)
Public Bill Committees

Committee stage: 3rd sitting
Thu 26th May 2022
Online Safety Bill (Fourth sitting)
Public Bill Committees

Committee stage: 4th sitting
Tue 7th Jun 2022
Tue 7th Jun 2022
Thu 9th Jun 2022
Thu 9th Jun 2022
Tue 14th Jun 2022
Tue 14th Jun 2022
Thu 16th Jun 2022
Thu 16th Jun 2022
Tue 21st Jun 2022
Online Safety Bill (Thirteenth sitting)
Public Bill Committees

Committee stage: 13th sitting
Tue 21st Jun 2022
Thu 23rd Jun 2022
Tue 28th Jun 2022
Tue 28th Jun 2022
Tue 12th Jul 2022
Online Safety Bill
Commons Chamber

Report stage (day 1)
Mon 5th Dec 2022
Mon 5th Dec 2022
Tue 13th Dec 2022
Online Safety Bill (First sitting)
Public Bill Committees

Committee stage (re-committed clauses and schedules): 1st sitting
Tue 13th Dec 2022
Online Safety Bill (Second sitting)
Public Bill Committees

Committee stage (re-committed clauses and schedules): 2nd sitting
Thu 15th Dec 2022
Online Safety Bill (Third sitting)
Public Bill Committees

Committee stage (re-committed clauses and schedules): 3rd sitting
Tue 17th Jan 2023
Wed 1st Feb 2023
Wed 19th Apr 2023
Online Safety Bill
Lords Chamber

Committee stage
Tue 25th Apr 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 1
Tue 25th Apr 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 2
Thu 27th Apr 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 1
Thu 27th Apr 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 2
Tue 2nd May 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 1
Tue 2nd May 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 2
Tue 9th May 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 1
Tue 9th May 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 2
Thu 11th May 2023
Tue 16th May 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 1
Tue 16th May 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 2
Tue 23rd May 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 1
Tue 23rd May 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 2
Thu 25th May 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 1
Thu 25th May 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 2
Thu 22nd Jun 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 1
Thu 22nd Jun 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 2
Thu 6th Jul 2023
Online Safety Bill
Lords Chamber

Report stage: Part 1 & Report stage: Minutes of Proceedings
Thu 6th Jul 2023
Online Safety Bill
Lords Chamber

Report stage: Part 2
Thu 6th Jul 2023
Online Safety Bill
Lords Chamber

Report stage: Part 3
Mon 10th Jul 2023
Online Safety Bill
Lords Chamber

Report stage: Part 1
Mon 10th Jul 2023
Online Safety Bill
Lords Chamber

Report stage: Part 2
Wed 12th Jul 2023
Mon 17th Jul 2023
Wed 19th Jul 2023
Wed 6th Sep 2023
Tue 12th Sep 2023
Online Safety Bill
Commons Chamber

Consideration of Lords amendments
Tue 19th Sep 2023
Online Safety Bill
Lords Chamber

Consideration of Commons amendments

Online Safety Bill

(Limited Text - Ministerial Extracts only)

2nd reading
Tuesday 19th April 2022


Commons Chamber
Online Safety Act 2023

This text is a record of ministerial contributions to a debate held as part of the passage of the Online Safety Act 2023 through Parliament.

In 1993, the House of Lords' decision in Pepper v Hart established that statements made by Government Ministers may be taken into account as evidence of legislative intent when interpreting the law.

This extract highlights statements made by Government Ministers, along with contextual remarks by other Members. The full debate can be read in Hansard.

This information is provided by Parallel Parliament and does not form part of the official record.

The Secretary of State for Digital, Culture, Media and Sport (Ms Nadine Dorries)

I beg to move, That the Bill be now read a Second time.

Given the time and the number of people indicating that they wish to speak, and given that we will have my speech, the shadow Minister’s speech and the two winding-up speeches, there might be 10 minutes left for people to speak. I will therefore take only a couple of interventions and speak very fast in the way I can, being northern.

Almost every aspect of our lives is now conducted via the internet, from work and shopping to keeping up with our friends, family and worldwide real-time news. Via our smartphones and tablets, we increasingly spend more of our lives online than in the real world.

In the past 20 years or so, it is fair to say that the internet has overwhelmingly been a force for good, for prosperity and for progress, but Members on both sides of the House will agree that, as technology advances at warp speed, so have the new dangers this progress presents to children and young people.

Mr Mark Francois (Rayleigh and Wickford) (Con)

My right hon. Friend will know that, last Wednesday, the man who murdered our great friend Sir David Amess was sentenced to a whole-life term. David felt very strongly that we need legislation to protect MPs, particularly female MPs, from vile misogynistic abuse. In his memory, will she assure me that her Bill will honour the spirit of that request?

Ms Dorries

Sir David was a friend to all of us, and he was very much at the forefront of my mind during the redrafting of this Bill over the last few months. I give my right hon. Friend my absolute assurance on that.

Jim Shannon (Strangford) (DUP)

A number of constituents have contacted me over the last few months about eating disorders, particularly anorexia and bulimia, and about bullying in schools. Will the Secretary of State assure me and this House that those concerns will be addressed by this Bill so that my constituents are protected?

Ms Dorries

They will. Inciting people to take their own life or encouraging eating disorders in anorexia chatrooms—all these issues are covered by the Bill.

Several hon. Members rose—

Ms Dorries

I will take one more intervention.

Jonathan Gullis (Stoke-on-Trent North) (Con)

I am grateful to my right hon. Friend, and I thank her for her written communications regarding Angela Stevens, the mother of Brett, who tragically took his own life having been coerced by some of these vile online sites. The Law Commission considered harmful online communications as part of the Bill’s preparation, and one of its recommendations is to introduce a new offence of encouraging or assisting self-harm. I strongly urge my right hon. Friend to adopt that recommendation. Can she say more on that?

Ms Dorries

Yes. Exactly those issues will be listed in secondary legislation, under “legal but harmful”. I will talk about that further in my speech, but “legal but harmful” focuses on some of the worst harmful behaviours. We are talking not about an arbitrary list, but about incitement to encourage people to take their own life and encouraging people into suicide chatrooms—behaviour that is not illegal but which is indeed harmful.

Several hon. Members rose—

Ms Dorries

I am going to whizz through my speech now in order to allow people who have stayed and want to speak to do so.

As the Minister for mental health for two years, too often, I heard stories such as the one just highlighted by my hon. Friend the Member for Stoke-on-Trent North (Jonathan Gullis). We have all sat down with constituents and listened as the worst stories any parents could recount were retold: stories of how 14-year-old girls take their own life after being directed via harmful algorithms into a suicide chatroom; and of how a child has been bombarded with pro-anorexia content, or posts encouraging self-harm or cyber-bullying.

School bullying used to stop at the school gate. Today, it accompanies a child home, on their mobile phone, and is lurking in the bedroom waiting when they switch on their computer. It is the last thing a bullied child reads at night before they sleep and the first thing they see when they wake in the morning. A bullied child is no longer bullied in the playground on school days; they are bullied 24 hours a day, seven days a week. Childhood innocence is being stolen at the click of a button. One extremely worrying figure from 2020 showed that 80% of 12 to 15-year-olds had at least one potentially harmful online experience in the previous year.

We also see this every time a footballer steps on to the pitch, only to be subjected to horrific racism online, including banana and monkey emojis. As any female MP in this House will tell you, a woman on social media—I say this from experience—faces a daily barrage of toxic abuse. It is not criticism—criticism is fair game—but horrific harassment and serious threats of violence. Trolls post that they hope we get raped or killed, urge us to put a rope around our neck, or want to watch us burn in a car alive—my own particular experience.

All this behaviour is either illegal or, almost without exception, explicitly banned in a platform’s terms and conditions. Commercially, it has to be. If a platform stated openly that it allowed such content on its sites, which advertisers, its financial lifeblood, would knowingly endorse and advertise on it? Which advertisers would do that? Who would openly use or allow their children to use sites that state that they allow illegal and harmful activity? None, I would suggest, and platforms know that. Yet we have almost come to accept this kind of toxic behaviour and abuse as part and parcel of online life. We have factored online abuse and harm into our daily way of life, but it should not and does not have to be this way.

This Government promised in their manifesto to pass legislation to tackle these problems and to make the UK the

“safest place in the world to be online”

especially for children. We promised legislation that would hold social media platforms to the promises they have made to their own users—their own stated terms and conditions—promises that too often are broken with no repercussions. We promised legislation that would bring some fundamental accountability to the online world. That legislation is here in the form of the ground- breaking Online Safety Bill. We are leading the way and free democracies across the globe are watching carefully to see how we progress this legislation.

The Bill has our children’s future, their unhindered development and their wellbeing at its heart, while at the same time providing enhanced protections for freedom of speech. At this point, I wish to pay tribute to my predecessors, who have each trodden the difficult path of balancing freedom of speech and addressing widespread harms, including my immediate predecessor and, in particular, my hon. Friend the Member for Gosport (Dame Caroline Dinenage), who worked so hard, prior to my arrival in the Department for Digital, Culture, Media and Sport, with stakeholders and platforms, digging in to identify the scope of the problem.

Let me summarise the scope of the Bill. We have reserved our strongest measures in this legislation for children. For the first time, platforms will be required under law to protect children and young people from all sorts of harm, from the most abhorrent child abuse to cyber-bullying and pornography. Tech companies will be expected to use every possible tool to do so, including introducing age-assurance technologies, and they will face severe consequences if they fail in the most fundamental of requirements to protect children. The bottom line is that, by our passing this legislation, our youngest members of society will be far safer when logging on. I am so glad to see James Okulaja and Alex Holmes from The Diana Award here today, watching from the Gallery as we debate this groundbreaking legislation. We have worked closely with them as we have developed the legislation, as they have dedicated a huge amount of their time to protecting children from online harms. This Bill is for them and those children.

The second part of the Bill makes sure that platforms design their services to prevent them from being abused by criminals. When illegal content does slip through the net, such as child sex abuse and terrorist content, they will need to have effective systems and processes in place to quickly identify it and remove it from their sites. We will not allow the web to be a hiding place or a safe space for criminals. The third part seeks to force the largest social media platforms to enforce their own bans on racism, misogyny, antisemitism, pile-ons and all sorts of other unacceptable behaviour that they claim not to allow but that ruins life in practice. In other words, we are just asking the largest platforms to simply do what they say they will do, as we do in all good consumer protection measures in any other industry. If platforms fail in any of those basic responsibilities, Ofcom will be empowered to pursue a range of actions against them, depending on the situation, and, if necessary, to bring down the full weight of the law upon them.

Several hon. Members rose—

Ms Dorries

I will take just two more interventions and that will be it, otherwise people will not have a chance to speak.

Sir John Hayes (South Holland and The Deepings) (Con)

I am very grateful to my right hon. Friend for giving way. The internet giants that run the kind of awful practices that she has described have for too long been unaccountable, uncaring and unconscionable in the way they have fuelled every kind of spite and fed every kind of bigotry. Will she go further in this Bill and ensure that, rather like any other publisher, if those companies are prepared to allow anonymous posts, they are held accountable for those posts and subject to the legal constraints that a broadcaster or newspaper would face?

Ms Dorries

These online giants will be held accountable to their own terms and conditions. They will be unable any longer to allow illegal content to be published, and we will also be listing in secondary legislation offences that will be legal but harmful. We will be holding those tech giants to account.

Munira Wilson (Twickenham) (LD)

I thank the Secretary of State for giving way. She talked about how this Bill is going to protect children much more, and it is a welcome step forward. However, does she accept that there are major gaps in this Bill? For instance, gaming is not covered. It is not clear whether things such as virtual reality and the metaverse are going to be covered. [Interruption.] It is not clear, and all the experts will tell us that. The codes of practice in the Bill are only recommended guidance; they are not mandatory and binding on companies. That will encourage a race to the bottom.

Ms Dorries

The duties are mandatory; it is the Online Safety Bill and the metaverse is included in the Bill. Not only is it included, but, moving forward, the provisions in the Bill will allow us to move swiftly with the metaverse and other things. We did not even know that TikTok existed when this Bill started its journey. These provisions will allow us to move quickly to respond.

Several hon. Members rose—

--- Later in debate ---
Ms Dorries

I will take one more intervention, but that is it.

Damian Green (Ashford) (Con)

I am grateful to my right hon. Friend for giving way. One of the most important national assets that needs protecting in this Bill and elsewhere is our reputation for serious journalism. Will she therefore confirm that, as she has said outside this House, she intends to table amendments during the passage of the Bill that will ensure that platforms and search engines that have strategic market status protect access to journalism and content from recognised news publishers, ensuring that it is not moderated, restricted or removed without notice or right of appeal, and that those news websites will be outside the scope of the Bill?

Ms Dorries

We have already done that—it is already in the Bill.

Daniel Kawczynski (Shrewsbury and Atcham) (Con)

Will my right hon. Friend give way?

Ms Dorries

No, I have to continue.

Not only will the Bill protect journalistic content, democratic content and democratic free speech, but if one of the tech companies wanted to take down journalistic content, the Bill includes a right of appeal for journalists, which currently does not exist. We are doing further work on that to ensure that content remains online while the appeal takes place. The appeal process has to be robust and consistent across the board for all the appeals that take place. We have already done more work on that issue in this version of the Bill and we are looking to do more as we move forward.

As I have said, we will not allow the web to be a hiding place or safe space for criminals and when illegal content does slip through the net—such as child sex abuse and terrorist content— online platforms will need to have in place effective systems and processes to quickly identify that illegal content and remove it from their sites.

The third measure will force the largest social media platforms to enforce their own bans on racism, misogyny, antisemitism, pile-ons and all the other unacceptable behaviours. In other words, we are asking the largest platforms to do what they say they will do, just as happens with all good consumer-protection measures in any other industry. Should platforms fail in any of their basic responsibilities, Ofcom will be empowered to pursue a range of actions against them, depending on the situation, and, if necessary, to bring down upon them the full weight of the law. Such action includes searching platforms’ premises and confiscating their equipment; imposing huge fines of up to 10% of their global turnover; pursuing criminal sanctions against senior managers who fail to co-operate; and, if necessary, blocking their sites in the UK.

We know that tech companies can act very quickly when they want to. Last year, when an investigation revealed that Pornhub allowed child sexual exploitation and abuse imagery to be uploaded to its platform, Mastercard and Visa blocked the use of their cards on the site. Lo and behold, threatened with the prospect of losing a huge chunk of its profit, Pornhub suddenly removed nearly 10 million child sexual exploitation videos from its site overnight. These companies have the tools but, unfortunately, as they have shown time and again, they need to be forced to use them. That is exactly what the Bill will do.

Before I move on, let me point out something very important: this is not the same Bill as the one published in draft form last year. I know that Members throughout the House are as passionate as I am about getting this legislation right, and I had lots of constructive feedback on the draft version of the Bill. I have listened carefully to all that Members have had to say throughout the Bill’s process, including by taking into account the detailed feedback from the Joint Committee, the Digital, Culture, Media and Sport Committee and the Petitions Committee. They have spent many hours considering every part of the Bill, and I am extremely grateful for their dedication and thorough recommendations on how the legislation could be improved.

As a result of that feedback process, over the past three months or so I have strengthened the legislation in a number of important ways. There were calls for cyber-flashing to be included; cyber-flashing is now in the Bill. There were calls to ensure that the legislation covered all commercial pornography sites; in fact, we have expanded the Bill’s scope to include every kind of provider of pornography. There were concerns about anonymity, so we have strengthened the Bill so that it now requires the biggest tech platforms to offer verification and empowerment tools for adult users, allowing people to block anonymous trolls from the beginning.

I know that countless MPs are deeply concerned about how online fraud—particularly scam ads—has proliferated over the past few years. Under the new version of the Bill, the largest and highest-risk companies—those that stand to make the most profit—must tackle scam ads that appear on their services.

We have expanded the list of priority offences named on the face of the legislation to include not just terrorism and child abuse imagery but revenge porn, fraud, hate crime, encouraging and assisting suicide, and organised immigration crime, among other offences.

If anyone doubted our appetite to go after Silicon Valley executives who do not co-operate with Ofcom, they will see that we have strengthened the Bill so that the criminal sanctions for senior managers will now come into effect as soon as possible after Royal Assent— I am talking weeks, not years. We have expanded the things for which those senior managers will be criminally liable to cover falsifying data, destroying data and obstructing Ofcom’s access to their premises.

In addition to the regulatory framework in the Bill that I have described, we are creating three new criminal offences. While the regulatory framework is focused on holding companies to account, the criminal offences will be focused on individuals and the way people use and abuse online communications. Recommended by the Law Commission, the offences will address coercive and controlling behaviour by domestic abusers; threats to rape, kill or inflict other physical violence; and the sharing of dangerous disinformation deliberately to inflict harm.

This is a new, stronger Online Safety Bill. It is the most important piece of legislation that I have ever worked on and it has been a huge team effort to get here. I am confident that we have produced something that will protect children and the most vulnerable members of society while being flexible and adaptable enough to meet the challenges of the future.

Let me make something clear in relation to freedom of speech. Anyone who has actually read the Bill will recognise that its defining focus is the tackling of serious harm, not the curtailing of free speech or the prevention of adults from being upset or offended by something they have seen online. In fact, along with countless others throughout the House, I am seriously concerned about the power that big tech has amassed over the past two decades and the huge influence that Silicon Valley now wields over public debate.

We in this place are not the arbiters of free speech. We have left it to unelected tech executives on the west coast to police themselves. They decide who is and who is not allowed on the internet. They decide whose voice should be heard and whose should be silenced—whose content is allowed up and what should be taken down. Too often, their decisions are arbitrary and inconsistent. We are left, then, with a situation in which the president of the United States can be banned by Twitter while the Taliban is not; in which talkRADIO can be banned by YouTube for 12 hours; in which an Oxford academic, Carl Heneghan, can be banned by Twitter; or in which an article in The Mail on Sunday can be plastered with a “fake news” label—all because they dared to challenge the west coast consensus or to express opinions that Silicon Valley does not like.

It is, then, vital that the Bill contains strong protections for free speech and for journalistic content. For the first time, under this legislation all users will have an official right to appeal if they feel their content has been unfairly removed. Platforms will have to explain themselves properly if they remove content and will have special new duties to protect journalistic content and democratically important content. They will have to keep those new duties in mind whenever they set their terms and conditions or moderate any content on their sites. I emphasise that the protections are new. The new criminal offences update section 1 of the Malicious Communications Act 1988 and section 127 of the Communications Act 2003, which were so broad that they interfered with free speech while failing to address seriously harmful consequences.

Without the Bill, social media companies would be free to continue to arbitrarily silence or cancel those with whom they do not agree, without any need for explanation or justification. That situation should be intolerable for anyone who values free speech. For those who quite obviously have not read the Bill and say that it concedes power to big tech companies, I have this to say: those big tech companies have all the power in the world that they could possibly want, right now. How much more power could we possibly concede?

That brings me to my final point. We now face two clear options. We could choose not to act and leave big tech to continue to regulate itself and mark its own homework, as it has been doing for years with predictable results. We have already seen that too often, without the right incentives, tech companies will not do what is needed to protect their users. Too often, their claims about taking steps to fix things are not backed up by genuine actions.

I can give countless examples from the past two months alone of tech not taking online harm and abuse seriously, wilfully promoting harmful algorithms or putting profit before people. A recent BBC investigation showed that women’s intimate pictures were being shared across the platform Telegram to harass, shame and blackmail women. The BBC reported 100 images to Telegram as pornography, but 96 were still accessible a month later. Tech did not act.

Twitter took six days to suspend the account of rapper Wiley after his disgusting two-day antisemitic rant. Just last week, the Centre for Countering Digital Hate said that it had reported 253 accounts to Instagram as part of an investigation into misogynistic abuse on the platform, but almost 90% remained active a month later. Again, tech did not act.

Remember: we have been debating these issues for years. They were the subject of one of my first meetings in this place in 2005. During that time, things have got worse, not better. If we choose the path of inaction, it will be on us to explain to our constituents why we did nothing to protect their children from preventable risks, such as grooming, pornography, suicide content or cyber-bullying. To those who say protecting children is the responsibility of parents, not the job of the state, I would quote the 19th-century philosopher John Stuart Mill, one of the staunchest defenders of individual freedom. He wrote in “On Liberty” that the role of the state was to fulfil the responsibility of the parent in order to protect a child where a parent could not. If we choose not to act, in the years to come we will no doubt ask ourselves why we did not act to impose fundamental online protections.

However, we have another option. We can pass this Bill and take huge steps towards tackling some of the most serious forms of online harm: child abuse, terrorism, harassment, death threats, and content that is harming children across the UK today. We could do what John Stuart Mill wrote was the core duty of Government. The right to self-determination is not unlimited. An action that results in doing harm to another is not only wrong, but wrong enough that the state can intervene to prevent that harm from occurring. We do that in every other part of our life. We erect streetlamps to make our cities and towns safer. We put speed limits on our roads and make seatbelts compulsory. We make small but necessary changes to protect people from grievous harm. Now it is time to bring in some fundamental protections online.

We have the legislation ready right now in the form of the Online Safety Bill. All we have to do is pass it. I am proud to commend the Bill to the House.

Several hon. Members rose—

--- Later in debate ---
Darren Jones (Bristol North West) (Lab)

In the interest of time, I will just pose a number of questions, which I hope the Minister might address in summing up. The first is about the scope of the Bill. The Joint Committee of which I was a member recommended that the age-appropriate design code, which is very effectively used by the Information Commissioner, be used as a benchmark in the Bill, so that any services accessed or likely to be accessed by children are regulated for safety. I do not understand why the Government rejected that suggestion, and I would be pleased to hear from the Minister why they did so.

Secondly, the Bill delegates lots of detail to statutory instruments, codes of practice from the regulator, or later decisions by the Secretary of State. Parliament must see that detail before the Bill becomes an Act. Will the Minister commit to those delegated decisions being published before the Bill becomes an Act? Could he explain why the codes of practice are not being set as mandatory? I do not understand why codes of practice, much of the detail of which the regulator is being asked to set, will not be made mandatory for businesses. How can minimum standards for age or identity verification be imposed if those codes of practice are not made mandatory? Perhaps the Minister could explain.

Many users across the country will want to ensure that their complaints are dealt with effectively. We recommended an ombudsman service that dealt with complaints that were exhausted through a complaints system at the regulated companies, but the Government rejected it. Please could the Minister explain why?

I was pleased that the Government accepted the concept of the ability for a super-complaint to be brought on behalf of groups of users, but the decision as to who will be able to bring a super-complaint has been deferred, subject to a decision by the Secretary of State. Why, and when will that decision be taken? If the Minister could allude to who they might be, I am sure that would be welcome.

Lastly, there are a number of exemptions and more work to be done, which leaves significant holes in the legislation. There is much more work to be done on clauses 5, 6 and 50—on democratic importance, journalism and the definition of journalism, on the exemptions for news publishers, and on disinformation, which is mentioned only once in the entire Bill. I and many others recognise that these are not easy issues, but they should be considered fully before legislation is proposed that has gaping holes for people who want to get around it, and for those who wish to test the parameters of this law in the courts, probably for many years. All of us, on a cross-party basis in this House, support the Government’s endeavours to make it safe for children and others to be online. We want the legislation to be implemented as quickly as possible and to be as effective as possible, but there are significant concerns that it will be jammed up in the judicial system, where this House is unacceptably giving judges the job of fleshing out the definition of what many of the important exemptions will mean in practice.

The idea that the Secretary of State has the power to intervene with the independent regulator and tell it what it should or should not do obviously undermines the idea of an independent regulator. While Ministers might give assurances to this House that the power will not be abused, I believe that other countries, whether China, Russia, Turkey or anywhere else, will say, “Look at Great Britain. It thinks this is an appropriate thing to do. We’re going to follow the golden precedent set by the UK in legislating on these issues and give our Ministers the ability to decide what online content should be taken down.” That seems a dangerous precedent.

Darren Jones

The Minister is shaking his head, but I can tell him that the legislation does do that, because we looked at this and took evidence on it. The Secretary of State would be able to tell the regulator that content should be “legal but harmful” and therefore should be removed as part of its systems design online. We also heard that the ability to do that at speed is very restricted and therefore the power is ineffective in the first place. Therefore, the Government should evidently change their position on that. I do not understand why, in the face of evidence from pretty much every stakeholder, the Government agree that that is an appropriate use of power or why Parliament would vote that through.

I look forward to the Minister giving his answers to those questions, in the hope that, as the Bill proceeds through the House, it can be tidied up and made tighter and more effective, to protect children and adults online in this country.

--- Later in debate ---
The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

The piece of legislation before the House this evening is truly groundbreaking, because no other jurisdiction anywhere in the world has attempted to legislate as comprehensively as we are beginning to legislate here. For too long, big tech companies have exposed children to risk and harm, as evidenced by the tragic suicide of Molly Russell, who was exposed to appalling content on Instagram, which encouraged her, tragically, to take her own life. For too long, large social media firms have allowed illegal content to go unchecked online.

Richard Burgon (Leeds East) (Lab)

I have spoken before about dangerous suicide-related content online. The Minister mentions larger platforms. Will the Government go away and bring back two amendments based on points made by the Samaritans? One would bring smaller platforms within the scope of sanctions, and the second would make the protective aspects of the Bill cover people who are over 18, not just those who are under 18. If the Government do that, I am sure that it will be cause for celebration and that Members on both sides of the House will give their support.

Chris Philp

It is very important to emphasise that, regardless of size, all platforms in the scope of the Bill are covered if there are risks to children.

A number of Members, including the right hon. Member for Barking (Dame Margaret Hodge) and my hon. Friend the Member for Brigg and Goole (Andrew Percy), have raised the issue of small platforms that are potentially harmful. I will give some thought to how the question of small but high-risk platforms can be covered. However, all platforms, regardless of size, are in scope with regard to content that is illegal and to content that is harmful to children.

For too long, social media firms have also arbitrarily censored content just because they do not like it. With the passage of this Bill, all those things will be no more, because it creates parliamentary sovereignty over how the internet operates, and I am glad that the principles in the Bill command widespread cross-party support.

The pre-legislative scrutiny that we have gone through has been incredibly intensive. I thank and pay tribute to the DCMS Committee and the Joint Committee for their work. We have adopted 66 of the Joint Committee’s recommendations. The Bill has been a long time in preparation. We have been thoughtful, and the Government have listened and responded. That is why the Bill is in good condition.

Debbie Abrahams

Will the Minister give way?

Chris Philp

I must make some progress, because I am almost out of time and there are lots of things to reply to.

I particularly thank previous Ministers, who have done so much fantastic work on the Bill. With us this evening are my hon. Friend the Member for Gosport (Dame Caroline Dinenage) and my right hon. Friends the Members for Maldon (Mr Whittingdale) and for Basingstoke (Mrs Miller), but not with us this evening are my right hon. and learned Friend the Member for Kenilworth and Southam (Jeremy Wright), who I think is in America, and my right hon. Friends the Members for Hertsmere (Oliver Dowden) and for Staffordshire Moorlands (Karen Bradley), all of whom showed fantastic leadership in getting the Bill to where it is today. It is a Bill that will stop illegal content circulating online, protect children from harm and make social media firms be consistent in the way they handle legal but harmful content, instead of being arbitrary and inconsistent, as they are at the moment.

Chris Philp

I have so many points to reply to that I have to make some progress.

The Bill also enshrines, for the first time, free speech—something that we all feel very strongly about—but it goes beyond that. As well as enshrining free speech in clause 19, it gives special protection, in clauses 15 and 16, for content of journalistic and democratic importance. As my right hon. Friend the Secretary of State indicated in opening the debate, we intend to table a Government amendment—a point that my right hon. Friends the Members for Maldon and for Ashford (Damian Green) asked me to confirm—to make sure that journalistic content cannot be removed until a proper right of appeal has taken place. I am pleased to confirm that now.

We have made many changes to the Bill. Online fraudulent advertisers are now banned. Senior manager liability will commence immediately. Online porn of all kinds, including commercial porn, is now in scope. The Law Commission communication offences are in the Bill. The offence of cyber-flashing is in the Bill. The priority offences are on the face of the Bill, in schedule 7. Control over anonymity and user choice, which was proposed by my hon. Friend the Member for Stroud (Siobhan Baillie) in her ten-minute rule Bill, is in the Bill. All those changes have been made because this Government have listened.

Let me turn to some of the points made from the Opposition Front Bench. I am grateful for the in-principle support that the Opposition have given. I have enjoyed working with the shadow Minister and the shadow Secretary of State, and I look forward to continuing to do so during the many weeks in Committee ahead of us, but there were one or two points made in the opening speech that were not quite right. This Bill does deal with systems and processes, not simply with content. There are risk assessment duties. There are safety duties. There are duties to prevent harm. All those speak to systems and processes, not simply content. I am grateful to the Chairman of the Joint Committee, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), for confirming that in his excellent speech.

If anyone in this House wants confirmation of where we are on protecting children, the Children’s Commissioner wrote a joint article with the Secretary of State in the Telegraph—I think it was this morning—confirming her support for the measures in the Bill.

When it comes to disinformation, I would make three quick points. First, we have a counter-disinformation unit, which is battling Russian disinformation night and day. Secondly, any disinformation that is illegal, that poses harm to children or that comes under the definition of “legal but harmful” in the Bill will be covered. And if that is not enough, the Minister for Security and Borders, who is sitting here next to me, intends to bring forward legislation at the earliest opportunity to cover counter-hostile state threats more generally. This matter will be addressed in the Bill that he will prepare and bring forward.

I have only four minutes left and there are so many points to reply to. If I do not cover them all, I am very happy to speak to Members individually, because so many important points were made. The right hon. Member for Barking asked who was going to pay for all the Ofcom enforcement. The taxpayer will pay for the first two years while we get ready—£88 million over two years—but after that Ofcom will levy fees on these social media firms, so they will pay for regulating their activities. I have already replied to the point she rightly raised about smaller but very harmful platforms.

My hon. Friend the Member for Meriden (Saqib Bhatti) has been campaigning tirelessly on the question of combating racism. This Bill will deliver what he is asking for.

The hon. Member for Batley and Spen (Kim Leadbeater) and my hon. Friend the Member for Watford (Dean Russell) asked about Zach’s law. Let me take this opportunity to confirm explicitly that clause 150—the harmful communication clause, for where a communication is intended to cause psychological distress—will cover epilepsy trolling. What happened to Zach will be prevented by this Bill. In addition, the Ministry of Justice and the Law Commission are looking at whether we can also have a standalone provision, but let me assure them that clause 150 will protect Zach.

My right hon. Friend the Member for Maldon asked a number of questions about definitions. Companies can move between category 1 and category 2, and different parts of a large conglomerate can be regulated differently depending on their activities. Let me make one point very clear—the hon. Member for Bristol North West (Darren Jones) also raised this point. When it comes to the provisions on “legal but harmful”, neither the Government nor Parliament are saying that those things have to be taken down. We are not censoring in that sense. We are not compelling social media firms to remove content. All we are saying is that they must do a risk assessment, have transparent terms and conditions, and apply those terms and conditions consistently. We are not compelling, we are not censoring; we are just asking for transparency and accountability, which is sorely missing at the moment. No longer will those in Silicon Valley be able to behave in an arbitrary, censorious way, as they do at the moment—something that Members of this House have suffered from, but from which they will no longer suffer once this Bill passes.

The hon. Member for Bristol North West, who I see is not here, asked a number of questions, one of which was about—[Interruption.] He is here; I do apologise. He has moved—I see he has popped up at the back of the Chamber. He asked about codes of practice not being mandatory. That is because the safety duties are mandatory. The codes of practice simply illustrate ways in which those duties can be met. Social media firms can meet them in other ways, but if they fail to meet those duties, Ofcom will enforce. There is no loophole here.

When it comes to the ombudsman, we are creating an internal right of appeal for the first time, so that people can appeal to the social media firms themselves. There will have to be a proper right of appeal, and if there is not, they will be enforced against. We do not think it appropriate for Ofcom to consider every individual complaint, because it will simply be overwhelmed, by probably tens of thousands of complaints, but Ofcom will be able to enforce where there are systemic failures. We feel that is the right approach.

I say to the hon. Member for Plymouth, Sutton and Devonport (Luke Pollard) that my right hon. Friend the Minister for Security and Borders will meet him about the terrible Keyham shooting.

The hon. Member for Washington and Sunderland West (Mrs Hodgson) raised a question about online fraud in the context of search. That is addressed by clause 35, but we do intend to make drafting improvements to the Bill, and I am happy to work with her on those drafting improvements.

I have been speaking as quickly as I can, which is quite fast, but I think time has got away from me. This Bill is groundbreaking. It will protect our citizens, it will protect our children—[Hon. Members: “Sit down!”]—and I commend it to the House.

Question put and agreed to.

Bill accordingly read a Second time.

Madam Deputy Speaker (Dame Eleanor Laing)

The Minister just made it. I have rarely seen a Minister come so close to talking out his own Bill.

Online Safety Bill (Programme)

Motion made, and Question put forthwith (Standing Order No. 83A(7)),

That the following provisions shall apply to the Online Safety Bill:

Committal

(1) The Bill shall be committed to a Public Bill Committee.

Proceedings in Public Bill Committee

(2) Proceedings in the Public Bill Committee shall (so far as not previously concluded) be brought to a conclusion on Thursday 30 June 2022.

(3) The Public Bill Committee shall have leave to sit twice on the first day on which it meets.

Consideration and Third Reading

(4) Proceedings on Consideration shall (so far as not previously concluded) be brought to a conclusion one hour before the moment of interruption on the day on which those proceedings are commenced.

(5) Proceedings on Third Reading shall (so far as not previously concluded) be brought to a conclusion at the moment of interruption on that day.

(6) Standing Order No. 83B (Programming committees) shall not apply to proceedings on Consideration and Third Reading.

Other proceedings

(7) Any other proceedings on the Bill may be programmed.—(Michael Tomlinson.)

Question agreed to.

Online Safety Bill (Money)

Queen’s recommendation signified.

Motion made, and Question put forthwith (Standing Order No. 52(1)(a)),

That, for the purposes of any Act resulting from the Online Safety Bill, it is expedient to authorise the payment out of money provided by Parliament of:

(1) any expenditure incurred under or by virtue of the Act by the Secretary of State, and

(2) any increase attributable to the Act in the sums payable under any other Act out of money so provided.—(Michael Tomlinson.)

Question agreed to.

Online Safety Bill (Ways and Means)

Motion made, and Question put forthwith (Standing Order No. 52(1)(a)),

That, for the purposes of any Act resulting from the Online Safety Bill, it is expedient to authorise:

(1) the charging of fees under the Act, and

(2) the payment of sums into the Consolidated Fund.—(Michael Tomlinson.)

Question agreed to.

Deferred Divisions

Motion made, and Question put forthwith (Standing Order No. 41A(3)),

That at this day’s sitting, Standing Order 41A (Deferred divisions) shall not apply to the Motion in the name of Secretary Nadine Dorries relating to Online Safety Bill: Carry-over.—(Michael Tomlinson.)

Question agreed to.

Online Safety Bill (First sitting)

(Limited Text - Ministerial Extracts only)

Committee stage
Tuesday 24th May 2022


Public Bill Committees
Online Safety Act 2023
Amendment Paper: Public Bill Committee Amendments as at 24 May 2022


Dean Russell

Q But as the Bill stands, there is a very clear point about stopping harmful content being sent to people, so I imagine that would cover it at least in that sense, would it not?

Kevin Bakhurst: This is a really important point, which Richard just tried to make. The Bill gives us a great range of tools to try and prevent harm as far as possible; I just think we need to get expectations right here. Unfortunately, this Bill will not result in no harm of any type, just because of the nature of the internet and the task that we face. We are ambitious about driving constant improvement and stopping and addressing the main harms, but it is not going to stop any particular harm. We will absolutely focus on the ones that have a significant impact, but unfortunately that is the nature of the web.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

Q Just to continue the point made by my colleague, you are right to say that Ministry of Justice colleagues are considering the flashing image offence as a separate matter. But would you agree that clause 150, on harmful communications, does criminalise and therefore place into the scope of the Bill communications intended to cause harm to a “likely audience” where such harm is

“psychological harm amounting to serious distress”?

Therefore, sending somebody a flashing image with the intention of inducing an epileptic fit would be likely to be caught under this new harmful communications offence in clause 150, even before a separate future offence that may be introduced.

Richard Wronka: I think we can certainly understand the argument. I think it is important that the Bill is as clear as possible. Ultimately, it is for the courts to decide whether that offence would pick up these kinds of issues that we are talking about around flashing imagery.

Chris Philp

Q I would suggest that the definition in clause 150 would cover epilepsy trolling.

You mentioned that you met recently with European regulators. Briefly, because we are short of time, were there any particular messages, lessons or insights you picked up in those meetings that might be of interest to the Committee?

Kevin Bakhurst: Yes, there were a number, and liaising with European regulators and other global regulators in this space is a really important strand of our work. It is often said that this regime is a first globally. I think that is true. This is the most comprehensive regime, and it is therefore potentially quite challenging for the regulator. That is widely recognised.

The second thing I would say is that there was absolute recognition of how advanced we are in terms of the recruitment of teams, which I touched on before, because we have had the funding available to do it. There are many countries around Europe that have recruited between zero and 10 and are imminently going to take on some of these responsibilities under the Digital Services Act, so I think they are quite jealous.

The last thing is that we see continued collaboration with other regulators around the world as a really important strand, and we welcome the information-sharing powers that are in the Bill. There are some parallels, and we want to take similar approaches on areas such as transparency, where we can collaborate and work together. I think it is important—

The Chair

Order. I am afraid we have come to the end of the allotted time for questions. On behalf of the Committee, I thank our witnesses for their evidence.

Examination of Witnesses

Dame Rachel de Souza, Lynn Perry MBE and Andy Burrows gave evidence.

--- Later in debate ---
Kirsty Blackman

Q I have a quick question on parental digital literacy. You mentioned the panel that you put together of 16 to 21-year-olds. Do you think that today’s parents have the experience, understanding, skills and tools to keep their children properly safe online? Even if they are pretty hands-on and want to do that, do you think that they have all the tools they need to be able to do that?

Dame Rachel de Souza: It is a massive concern to parents. Parents talk to me all the time about their worries: “Do we know enough?” They have that anxiety, especially as their children turn nine or 10; they are thinking, “I don’t even know what this world out there is.” I think that our conversations with 16 to 21-year-olds were really reassuring, and we have produced a pamphlet for parents. It has had a massive number of downloads, because parents absolutely want to be educated in this subject.

What did young people tell us? They told us, “Use the age controls; talk to us about how much time we are spending online; keep communication open; and talk to us.” Talk to children when they’re young, particularly boys, who are likely to be shown pornography for the first time, even if there are parental controls, around the age of nine or 10. So have age-appropriate conversations. There was some very good advice about online experiences, such as, “Don’t worry; you’re not an expert but you can talk to us.” I mean, I did not grow up with the internet, but I managed parenting relatively well—my son is 27 now. I think this is a constant concern for parents.

I do think that the tech companies could be doing so much more to assist parents in digital media literacy, and in supporting them in how to keep their child safe. We are doing it as the Office of the Children’s Commissioner. I know that we are all trying to do it, but we want to see everyone step up on this, particularly the tech companies, to support parents on this issue.

Chris Philp

Q Can I start by thanking the NSPCC and you, Dame Rachel, and your office for the huge contribution that you have made to the Bill as it has developed? A number of changes have been made as a result of your interventions, so I would just like to start by putting on the record my thanks to both of you and both your organisations for the work that you have done so far.

Could you outline for the Committee the areas where you think the Bill, as currently drafted, contains the most important provisions to protect children?

Dame Rachel de Souza: I was really glad to see, in the rewrite of the Online Safety Bill, a specific reference to the role of age assurance to prevent children from accessing harmful content. That has come across strongly from children and young people, so I was very pleased to see that. It is not a silver bullet, but for too long children have been using entirely inappropriate services. The No. 1 recommendation from the 16 to 21-year-olds, when asked what they wish their parents had known and what we should do, was age assurance, if you are trying to protect a younger sibling or are looking at children, so I was pleased to see that. Companies cannot hope to protect children if they do not know who the children are on their platforms, so I was extremely pleased to see that.

Chris Philp

Q Sorry to interject, Dame Rachel, but do you agree that it is not just about stopping under-18s viewing pornography; it also includes stopping children under 13 accessing social media entirely, as per those companies’ purported terms and conditions, which are frequently flouted?

Dame Rachel de Souza: Absolutely. I have called together the tech companies. I have met the porn companies, and they reassured me that as long as they were all brought into the scope of this Bill, they would be quite happy as this is obviously a good thing. I brought the tech companies together to challenge them on their use of age assurance. With their artificial intelligence and technology, they know the age of children online, so they need to get those children offline. This Bill is a really good step in that direction; it will hold them to account and ensure they get children offline. That was a critically important one for me.

I was also pleased to see the holding to account of companies, which is very important. On full coverage of pornography, I was pleased to see the offence of cyber-flashing in the Bill. Again, it is particularly about age assurance.

What I would say is that nudge is not working, is it? We need this in the Bill now, and we need to get it there. In my bit of work with those 2,000 young people, we asked what they had seen in the last month, and 40% of them had not had bad images taken down. Those aspects of the Bill are key.

Andy Burrows: This is a landmark Bill, so we thank you and the Government for introducing it. We should not lose sight of the fact that, although this Bill is doing many things, first and foremost it will become a crucial part of the child protection system for decades to come, so it is a hugely important and welcome intervention in that respect.

What is so important about this Bill is that it adopts a systemic approach. It places clear duties on platforms to go through the process of identifying the reasonably foreseeable harms and requiring that reasonable steps be taken to mitigate them. That is hugely important from the point of view of ensuring that this legislation is future-proofed. I know that many companies have argued for a prescriptive checklist, and then it is job done—a simple compliance job—but a systemic approach is hugely important because it is the basis upon which companies have very clear obligations. Our engagement is very much about saying, “How can we make sure this Bill is the best it can possibly be?” But that is on the bedrock of that systemic approach, which is fundamental if we are to see a culture shift in these companies and an emphasis on safety by design—designing out problems that do not have to happen.

I have engaged with companies where child safety considerations are just not there. One company told me that grooming data is a bad headline today and tomorrow’s chip shop wrapper. A systemic approach is the key to ensuring that we start to address that balance.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you. I obviously strongly agree with those comments.

I would like to turn to one or two points that came up in questioning, and then I would like to probe a couple of points that did not. Dame Rachel mentioned advocacy and ensuring that the voice of particular groups—in this context, particularly that of children—is heard. In that context, I would like to have a look at clause 140, which relates to super-complaints. Subsection (4) says that the Secretary of State can, by regulations, nominate which organisations are able to bring super-complaints. These are complaints whereby you go to Ofcom and say that there is a particular company that is failing in its systemic duties.

Subsection (4) makes it clear that the entities nominated to be an authorised super-complainant would include

“a body representing the interests of users of regulated services”,

which would obviously include children. If an organisation such as the Office of the Children’s Commissioner or the NSPCC—I am obviously not prejudicing the future process—were designated as a super-complainant that was able to bring super-complaints to Ofcom, would that address your point about the need for proper advocacy for children?

Dame Rachel de Souza: Absolutely. I stumbled over that a bit when Maria asked me the question, but we absolutely need people who work with children, who know children and are trusted by children, and who can do that nationally in order to be the super-complainants. That is exactly how I would envisage it working.

Andy Burrows: The super-complaint mechanism is part of the well-established arrangements that we see in other sectors, so we are very pleased to see that that is included in the Bill. I think there is scope to go further and look at how the Bill could mirror the arrangements that we see in other sectors—I mentioned the energy, postal and water sectors earlier as examples—so that the statutory user advocacy arrangements for inherently vulnerable children, including children at risk of sexual abuse, mirror the arrangements that we see in those other sectors. That is hugely important as a point of principle, but it is really helpful and appropriate for ensuring that the legislation can unlock the positive regulatory outcomes that we all want to see, so I think it contributes towards really effective regulatory design.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you, Andy. I am conscious of the time, so I will be brief with my final three questions. You made a valid point about large social media platforms receiving complaints generally, but in this case from children, about inappropriate content, such as photographs of them on a social media platform that do not get taken down—the complaint gets ignored, or it takes a very long time. In clause 18, we have duties on the complaints procedures that the big social media firms will now have to follow. I presume that you would join me in urging Ofcom to ensure that how it enforces the duties in clause 18 includes ensuring that big social media firms are responsive and quick in how they deal with complaints. Children are specifically referred to in the clause—for example, in subsection (3) and elsewhere.

Dame Rachel de Souza: Yes, and I was so pleased to see that. The regulator needs to have teeth for it to have any effect—I think that is what we are saying. I want named senior managers to be held accountable for breaches of their safety duties to children, and I think that senior leaders should be liable to criminal sanctions when they do not uphold their duty of care to children.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Good—thank you. I want to say something about gaming, because Kirsty Blackman asked about it. If messages are being sent back and forth in a gaming environment, which is probably the concern, those messages are in scope of the Bill, because those environments are user-to-user services.

I will put my last two questions together. Are you concerned about the possibility that encryption in messaging services might impede the automatic scanning for child exploitation and abuse images that takes place, and would you agree that we cannot see encryption happen at the expense of child safety? Secondly, in the context of the Molly Russell reference earlier, are you concerned about the way that algorithms can promote and essentially force-feed children very harmful content? Those are two enormous questions, and you have only two minutes to answer them, so I apologise.

Dame Rachel de Souza: I am going to say yes and yes.

Andy Burrows: I will say yes and yes as well. The point about end-to-end encryption is hugely important. Let us be clear: we are not against end-to-end encryption. Where we have concerns is about the risk profile that end-to-end encryption introduces, and that risk profile, when we are talking about it being introduced into social networking services and bundled with other sector functionality, is very high and needs to be mitigated.

About 70% of child abuse reports could be lost with Meta going ahead. That is 28 million reports in the past six months, so it is very important that the Bill can require companies to demonstrate that if they are running services, they can acquit themselves in terms of the risk assessment processes. We really welcome the simplified child sexual exploitation warning notices in the Bill that will give Ofcom the power to intervene when companies have not demonstrated that they have been able to introduce end-to-end encryption in a safe and effective way.

One area in which we would like to see the Bill—

None Portrait The Chair
- Hansard -

Order. I am afraid that brings us to the end of the time allotted for the Committee to ask questions of this panel. On behalf of the Committee, I thank our witnesses for their evidence, and I am really sorry that we could not get Lynn Perry online. Could we move on to the last panel? Thank you very much.

Examination of Witnesses

Ben Bradley and Katy Minshall gave evidence.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Sorry, I have to interrupt you there. I call the Minister.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you for coming to give evidence to the Committee. On the question about user choice around identity verification, is this not conceptually quite similar to the existing blue tick regime that Twitter operates successfully?

Katy Minshall: As I say, we share your policy objective of giving users more choice. For example, at present we are testing a tool where Twitter automatically blocks abusive accounts on your behalf. We make the distinction based on an account’s behaviour and not on whether it has verified itself in some way.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Well, I’d be grateful if you applied that to my account as quickly as possible!

I do not think that the concept would necessarily operate as you suggested at the beginning. You suggested that people might end up not seeing content posted by the Prime Minister or another public figure. The concept is that, assuming a public figure would choose to verify themselves, content that they posted would be visible to everybody because they had self-verified. The content in the other direction may or may not be, depending on whether the Prime Minister or the Leader of the Opposition chose to see all content or just verified content, but their content—if they verified themselves—would be universally visible, regardless of whatever choice anyone else exercised.

Katy Minshall: Yes, sorry if I was unclear. I totally accept that point, but it would mean that some people would be able to reply to Boris Johnson and others would not. I know we are short on time, but it is worth pointing out that in a YouGov poll in April, nearly 80% of people said that they would not choose to provide ID documents to access certain websites. The requirements that you describe are based on the assumption that lots of people will choose to do it, when in reality that might not be the case.

A public figure might think, “Actually, I really appreciate that I get retweets, likes and people replying to my tweets,” but if only a small number of users have taken the opportunity to verify themselves, that is potentially a disincentive even to use this system in the first place—and all the while we were creating such a system, we could instead have been investing in or trying to develop new solutions, such as safety mode, which I described and which tries to prevent abusive users from interacting with you.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q I want to move on to the next question because we only have two minutes left.

Ben, you talked about the age verification measures that TikTok currently takes. For people who do not come via an age-protected app store, it is basically self-declared. All somebody has to do is type in a date of birth. My nine-year-old children could just type in a date of birth that was four years earlier than their real date of birth, and off they would go on TikTok. Do you accept that that is wholly inadequate as a mechanism for policing the age limit of 13?

Ben Bradley: That is not the end of our age assurance system; it is just the very start. Those are the first two measures we have in place to prevent sign-up, but we are also proactive in surfacing and removing under-age accounts. As I said, we publish every quarter how many suspected under-13s get removed.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q If I understood your answer correctly, that is only if a particular piece of content comes to the attention of your moderators. I imagine that only 0.01% or some tiny fraction of content on TikTok comes to the attention of your moderators.

Ben Bradley: It is based on a range of signals that they have available to them. As I said, we publish a number every quarter. In the last quarter, we removed 14 million users across the globe who were suspected to be under the age of 13. That is evidence of how seriously we take the issue. We publish that information because we think it is important to be transparent about our efforts in this space, so that we can be judged accordingly.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you. Forgive me for moving on in the interests of time.

Earlier, we debated content of democratic importance and the protections that that and free speech have in the Bill. Do you agree that a requirement to have some level of consistency in the way that that is treated is important, particularly given that there are some glaring inconsistencies in the way in which social media firms treat content at the moment? For example, Donald Trump has been banned, while flagrant disinformation by the Russian regime, lying about what they are doing in Ukraine, is allowed to propagate—including the tweets that I drew to your attention a few weeks ago, Katy.

Katy Minshall: I agree that freedom of expression should be top of mind as companies develop safety and policy solutions. Public interest should always be considered when developing policies. From the perspective of the Bill, I would focus on freedom of expression for everyone, and not limit it to content that could be related to political discussions or journalistic content. As Ben said, there are already wider freedom of expression duties in the Bill.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q To be clear, those freedom of expression duties in clause 19(2) do apply to everyone.

Katy Minshall: Sorry, but I do not know the Bill in those terms, so you would have to tell me the definition.

None Portrait The Chair
- Hansard -

Order. I am afraid that that brings us to the end of the time allotted for the Committee to ask questions in this morning’s sitting. On behalf of the Committee, I thank our witnesses for their evidence. We will meet again at 2 pm in this room to hear further oral evidence.

Online Safety Bill (Second sitting)

(Limited Text - Ministerial Extracts only)

Read Full debate
Committee stage
Tuesday 24th May 2022

(2 years, 6 months ago)

Public Bill Committees
Online Safety Act 2023. Amendment Paper: Public Bill Committee Amendments as at 24 May 2022 - (24 May 2022)

This text is a record of ministerial contributions to a debate held as part of the Online Safety Act 2023 passage through Parliament.

In 1993, the House of Lords' decision in Pepper v Hart provided that statements made by Government Ministers may be taken as illustrative of legislative intent when interpreting the law.

This extract highlights statements made by Government Ministers along with contextual remarks by other members. The full debate can be read here

This information is provided by Parallel Parliament and does not form part of the official record.

None Portrait The Chair
- Hansard -

I am sorry, but I must move on. Minister, I am afraid you only have five minutes.

Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - - - Excerpts

Q Welcome to the Committee’s proceedings and thank you for joining us this afternoon. I would like to start on the question of the algorithmic promotion of content. Last week, I met the Facebook whistleblower, Frances Haugen, who spoke in detail about what she had found when working for Facebook, so I will start with you, Richard. On the question of transparency, which other Members of the Committee have touched on, would you have any objection to sharing all the information you hold internally with trusted researchers?

Richard Earley: What information are you referring to?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Data, in particular on the operation of algorithmic promotion of particular kinds of content.

Richard Earley: We already do things like that through the direct opportunity that anyone has to see why a single post has been chosen for them in their feed. You can click on the three dots next to any post and see that. For researcher access and support, as I mentioned, we have contributed to the publishing of more than 400 reports over the last year, and we want to do more of that. In fact, the Bill requires Ofcom to conduct a report on how to unlock those sorts of barriers, which we think should be done as soon as possible. Yes, in general we support that sort of research.

I would like to say one thing, though. I have worked at Facebook—now Meta—for almost five years, and nobody at Facebook has any obligation, or any moral incentive, to do anything other than provide people with the best, most positive experience on our platform, because we know that if we do not give people a positive experience, through algorithms or anything else, they will leave our platform and will not use it. They tell us that and they do it, and the advertisers who pay for our services do not want to see that harmful content on our platforms either. All of our incentives are aligned with yours, which are to ensure that our users have a safe and positive experience on our platforms.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Yet the algorithms that select particular content for promotion are optimised for user engagement—views, likes and shares—because that increases user stickiness and keeps them on the site for longer. The evidence seems to suggest that, despite what people say in response to the surveys you have just referenced, what they actually interact with the most—or what a particular proportion of the population chooses to interact with the most—is content that would be considered in some way extreme, divisive, and so on, and that the algorithms, which are optimised for user engagement, notice that and therefore uprank that content. Do you accept that your algorithms are optimised for user engagement?

Richard Earley: I am afraid to say that that is not correct. We have multiple algorithms on our services. Many of them, in fact, do the opposite of what you have just described: they identify posts that might be violent, misleading or harmful and reduce the prevalence of them within our feed products, our recommendation services and other parts of the service.

We optimise the algorithm that shows people things for something called meaningful social interaction. That is not just pure engagement; in fact, its focus—we made a large change to our algorithms in 2018 to focus on this—is on the kinds of activities online that research shows are correlated with positive wellbeing outcomes. Joining a group in your local area or deciding to go to an event that was started by one of your friends—that is what our algorithms are designed to promote. In fact, when we made that switch in 2018, we saw a decrease of more than 50 million hours of Facebook use every day as a result of that change. That is not the action of a company that is just focused on maximising engagement; it is a company that is focused on giving our users a positive experience on our platform.
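To make the distinction being drawn here concrete, the following is a minimal, purely illustrative sketch of the difference between ranking a post on raw engagement and re-weighting it towards “meaningful social interaction” signals, with a demotion for content flagged as harmful. The signal names, weights and scoring functions are assumptions for the purpose of illustration only; they are not Meta's actual ranking system.

```python
# Illustrative only: a toy feed-ranking comparison. Signal names and
# weights are assumptions, not Meta's actual algorithm.

ENGAGEMENT_WEIGHTS = {"views": 1.0, "likes": 2.0, "shares": 3.0}
MEANINGFUL_WEIGHTS = {"comments_from_friends": 4.0, "group_joins": 5.0, "event_responses": 5.0}

def engagement_score(signals: dict) -> float:
    """Score a post purely on raw engagement signals."""
    return sum(ENGAGEMENT_WEIGHTS.get(name, 0.0) * value for name, value in signals.items())

def msi_score(signals: dict) -> float:
    """Add weight for person-to-person interactions and demote posts
    that classifiers have flagged as likely harmful or misleading."""
    base = engagement_score(signals)
    meaningful = sum(MEANINGFUL_WEIGHTS.get(name, 0.0) * value for name, value in signals.items())
    demotion = 0.1 if signals.get("flagged_harmful") else 1.0
    return (base + meaningful) * demotion

divisive_post = {"views": 1000, "likes": 80, "shares": 60, "flagged_harmful": 1}
friendly_post = {"views": 200, "likes": 15, "shares": 2, "comments_from_friends": 12}

print(engagement_score(divisive_post), msi_score(divisive_post))  # high engagement, demoted
print(engagement_score(friendly_post), msi_score(friendly_post))  # lower engagement, promoted
```

In a sketch like this, a post with high raw engagement but a harm flag can rank below a post with fewer interactions overall but more interaction between friends, which is the trade-off the witness and the Minister are debating.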

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q You have alluded to some elements of the algorithmic landscape, but do you accept that the dominant feature of the algorithm that determines which content is most promoted is based on user engagement, and that the things you have described are essentially second-order modifications to that?

Richard Earley: No, because as I just said, when we gave the algorithm this instruction to focus on social interaction, it actually decreased the amount of time people spent on our platform.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q It might have decreased it, but the meaningful social interaction score is, not exclusively, as you said, but principally based on user engagement, isn’t it?

Richard Earley: As I said, it is about ensuring that people who spend time on our platform come away feeling that they have had a positive experience.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q That does not quite answer the question.

Richard Earley: I think that a really valuable part of the Bill that we are here to discuss is the fact that Ofcom will be required, and we in our risk assessments will be required, to consider the impact on the experience of our users of multiple different algorithms, of which we have hundreds. We build those algorithms to ensure that we reduce the prevalence of harmful content and give people the power to connect with those around them and build community. That is what we look forward to demonstrating to Ofcom when this legislation is in place.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Yes, but in her testimony to, I think, the Joint Committee and the US Senate, in a document that she released to The Wall Street Journal, and in our conversation last week, Frances Haugen suggested that the culture inside Facebook, now Meta, is that measures that tend to reduce user engagement do not get a very sympathetic hearing internally. However, I think we are about to run out of time. I have one other question, which I will direct, again, to Richard. Forgive me, Katie and Becky, but it is probably most relevant for Meta.

None Portrait The Chair
- Hansard -

Q Just one moment, please. Is there anything that the other witnesses need to say about this before we move on? It will have to be very brief.

Katie O'Donovan: I welcome the opportunity to address the Committee. It is so important that this Bill has parliamentary scrutiny. It is a Bill that the DCMS has spent a lot of time on, getting it right and looking at the systems and the frameworks. However, it will lead to a fundamentally different internet for UK users versus the rest of the world. It is one of the most complicated Bills we are seeing anywhere in the world. I realise that it is very important to have scrutiny of us as platforms to determine what we are doing, but I think it is really important to also look at the substance of the Bill. If we have time, I would welcome the chance to give a little feedback on the substance of the Bill too.

Becky Foreman: I would add that the Committee spent a lot of time talking to Meta, who are obviously a big focus for the Bill, but it is important to remember that there are numerous other networks and services that potentially will be caught by the Bill and that are very different from Meta. It is important to remember that.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

While the Bill is proportionate in its measures, it is not designed to impose undue burdens on companies that are not high risk. I have one more question for Richard. I think Katie was saying that she wanted to make a statement?

None Portrait The Chair
- Hansard -

We are out of time. I am sorry about this; I regard it as woefully unsatisfactory. We have got three witnesses here, a lot of questions that need to be answered, and not enough time to do it. However, we have a raft of witnesses coming in for the rest of the day, so I am going to have to draw a line under this now. I am very grateful to you for taking the trouble to come—the Committee is indebted to you. You must have the opportunity to make your case. Would you be kind enough to put any comments that you wish to make in writing so that the Committee can have them. Feel free to go as broad as you would like because I feel very strongly that you have been short-changed this afternoon. We are indebted to you. Thank you very much indeed.

Richard Earley: We will certainly do that and look forward to providing comments in writing.

Examination of Witnesses

Professor Clare McGlynn, Jessica Eagelton and Janaya Walker gave evidence.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Minister?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you, Sir Roger, and thank you to the witnesses for coming in and giving very clear, helpful and powerful evidence to the Committee this afternoon. On the question of age verification or age assurance that we have just spoken about, clause 11(14) of the Bill sets a standard in the legislation that will be translated into the codes of practice by Ofcom. It says that, for the purposes of the subsection before on whether or not children can access a particular set of content, a platform is

“only entitled to conclude that it is not possible for children to access a service…if there are systems or processes in place…that achieve the result that children are not normally able to access the service”.

Ofcom will then interpret in codes of practice what that means practically. Professor McGlynn, do you think that standard set out there—

“the result that children are not normally able to access the service or that part of it”

—is sufficiently high to address the concerns we have been discussing in the last few minutes?

Professor Clare McGlynn: At the moment, the wording with regard to age assurance in part 5—the pornography providers—is slightly different, compared with the other safety duties. That is one technicality that could be amended. As for whether the provision you just talked about is sufficient, in truth I think it comes down, in the end, to exactly what is required, and of course we do not yet know what the nature of the age verification or age assurance requirements will actually be and what that will actually mean.

I do not know what that will actually mean for something like Twitter. What will they have to do to change it? In principle, that terminology is possibly sufficient, but it kind of depends in practice what it actually means in terms of those codes of practice. We do not yet know what it means, because all we have in the Bill is about age assurance or age verification.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Yes, you are quite right that the Ofcom codes of practice will be important. As far as I can see, the difference between clauses 68 and 11(14) is that one uses the word “access” and the other uses the word “encounter”. Is that your analysis of the difference as well?

Professor Clare McGlynn: My understanding as well is that those terms are, at the moment, being interpreted slightly differently in terms of the requirements that people will be under. I am just making a point about it probably being easier to harmonise those terms.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you very much. I wanted to ask you a different question—one that has not come up so far in this session but has been raised quite frequently in the media. It concerns freedom of speech. This is probably for Professor McGlynn again. I am asking you this in your capacity as a professor of law. Some commentators have suggested that the Bill will have an adverse impact on freedom of speech. I do not agree with that. I have written an article in The Times today making that case, but what is your expert legal analysis of that question?

Professor Clare McGlynn: I read your piece in The Times this morning, which was a robust defence of the legislation, in that it said that it is no threat to freedom of speech, but I hope you read my quote tweet, in which I emphasised that there is a strong case to be made for regulation to free the speech of many others, including women and girls and other marginalised people. For example, the current lack of regulation means that women’s freedom of speech is restricted because we fear going online because of the abuse we might encounter. Regulation frees speech, while your Bill does not unduly limit freedom of speech.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Okay, I take your second point, but did you agree with the point that the Bill as crafted does not restrict what you would ordinarily consider to be free speech?

Professor Clare McGlynn: There are many ways in which speech is regulated. The social media companies already make choices about what speech is online and offline. There are strengths in the Bill, such as the ability to challenge when material is taken offline, because that can impact on women and girls as well. They might want to put forward a story about their experiences of abuse, for example. If that gets taken down, they will want to raise a complaint and have it swiftly dealt with, not just left in an inbox.

There are lots of ways in which speech is regulated, and the idea of having a binary choice between free speech and no free speech is inappropriate. Free speech is always regulated, and it is about how we choose to regulate it. I would keep making the point that the speech of women and girls and other marginalised people is minimised at the moment, so we need regulation to free it. The House of Lords and various other reports about free speech and regulation, for example, around extreme pornography, talk about regulation as being human-rights-enhancing. That is the approach we need to take.

None Portrait The Chair
- Hansard -

Thank you very much indeed. Once again, I am afraid I have to draw the session to a close, and once again we have probably not covered all the ground we would have liked. Professor McGlynn, Ms Walker, Ms Eagelton, thank you very much indeed. As always, if you have further thoughts or comments, please put them in writing and let us know. We are indebted to you.

Examination of Witnesses

Lulu Freemont, Ian Stevenson and Adam Hildreth gave evidence.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Thank you. I call the Minister.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you, Sir Roger, and thank you very much indeed for joining us for this afternoon’s session. Adam, we almost met you in Leeds last October or November, but I think you were off with covid at the time.

Adam Hildreth: I had covid at the time, yes.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Covid struck. I would like to ask Adam and Ian in particular about the opportunities provided by emerging and new technology to deliver the Bill’s objectives. I would like you both to give examples of where you think new tech can help deliver these safety duties. I ask you to comment particularly on what it might do on, first, age assurance—which we debated in our last session—and secondly, scanning for child sexual abuse images in an end-to-end encrypted environment. Adam, do you want to go first?

Adam Hildreth: Well, if Ian goes first, the second question would be great for him to answer, because we worked on it together.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Fair enough. Ian?

Ian Stevenson: Yes, absolutely. The key thing to recognise is that there is a huge and growing cohort of companies, around the world but especially in the UK, that are working on technologies precisely to try to support those kinds of safety measures. Some of those have been supported directly by the UK Government, through the safety tech challenge fund, to explore what can be done around end-to-end encrypted messaging. I cannot speak for all the participants, but I know that many of them are members of the safety tech industry association.

Between us, we have demonstrated a number of different approaches. My own company, Cyacomb, demonstrated technology that could block known child abuse within encrypted messaging environments without compromising the privacy of users’ messages and communications. Other companies in the UK, including DragonflAI and Yoti, demonstrated solutions based on detecting nudity and looking at the ages of the people in those images, which are again hugely valuable in this space. Until we know exactly what the regulation is going to demand, we cannot say exactly what the right technology to solve it is.

However, I think that the fact that that challenge alone produced five different solutions looking at the problem from different angles shows just how vibrant the innovation ecosystem can be. My background in technology is long and mixed, but I have seen a number of sectors emerge—including cyber-security and fintech—where, once the foundations for change have been created, the ability of innovators to come up with answers to difficult questions is enormous. The capacity to do that is enormous.

There are a couple of potential barriers to that. The strength of the regulation is that it is future proof. However, until we start answering the question, “What do we need to do and when? What will platforms need to do and when will they need to do it?” we do not really create in the commercial market the innovation drivers for the technical solutions that will deliver this. We do not create the drivers for investment. It is really important to be as specific as we can about what needs to be done and when.

The other potential barrier is regulation. We have already had a comment about how there should be a prohibition of general monitoring. We have seen what has happened in the EU recently over concerns about safety technologies that are somehow looking at traffic on services. We need to be really clear that, while safety technologies must protect privacy, there needs to be a mechanism so that companies can understand when they can deploy safety technologies. At the moment there are situations where we talk to potential customers for safety technologies and they are unclear as to whether it would be proportionate to deploy those under, for example, data protection law. There are areas, even within the safety tech challenge fund work on end-to-end encrypted messaging, where it was unclear whether some of the technologies—however brilliant they were at preventing child abuse in those encrypted environments —would be deployable under current data protection and privacy of electronic communications regulations.

There are questions there. We need to make sure that when the Online Safety Bill comes through, it makes clear what is required and how it fits together with other regulations to enable that. Innovators can do almost anything if you give them time and space. They need the certainty of knowing what is required, and an environment where solutions can be deployed and delivered.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Ian, thank you very much. I am encouraged by your optimism about what innovation can ultimately deliver. Adam, let me turn to you.

Adam Hildreth: I agree with Ian that the level of innovation is amazing. If we start talking about age verification and end-to-end encryptions, for me—I am going to say that same risk assessment phrase again—it absolutely depends on the type of service, who is using the service and who is exploiting the service, as to which safety technologies should be employed. I think it is dangerous to say, “We are demanding this type of technology or this specific technology to be deployed in this type of instance,” because that removes the responsibility from the people who are creating it.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Sorry to interject, but to be clear, the Bill does not do that. The Bill specifies the objectives, but it is tech agnostic. The manner of delivering those is, of course, not specified, either in the Bill or by Ofcom.

Adam Hildreth: Absolutely. Sorry, I was saying that I agree with how it has been worded. We know what is available, but technology changes all the time and solutions change all the time—we can do things in really innovative ways. However, the risk assessment has to bring together freedom of speech versus the types of user at risk of abuse. Is it children who are at risk, and if so, what are they at risk from? That changes the space massively when compared with some adult gaming communities, where what is harmful to them is very different from what harms other audiences. That should dictate for them what system and technology is deployed. Once we understand what best of breed looks like for those types of companies, we should know what good is.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you, Adam. We only have one minute left, so what is your prediction for the potential possibilities that emerging tech presents to deal with the issues of age assurance, which are difficult, and CSEA scanning, given end-to-end encrypted environments?

Adam Hildreth: The technology is there. It exists and it is absolutely deployable in the environments that need it. I am sure Ian would agree; we have seen it and done a lot of testing on it. The technology exists in the environments that need it.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Including inside the end-to-end encrypted environment, rather than just at the device level? Quite a few of the safety challenge solutions that Ian mentioned are at the device level; they are not inside the encryption.

Adam Hildreth: There are ways that can work. Again, it brings in freedom of expression, global businesses and some other areas, so it is more about regulation and consumer concerns about the security of data, rather than whether technological solutions are available.

None Portrait The Chair
- Hansard -

Ms Freemont, Mr Hildreth and Mr Stevenson, thank you all very much indeed. We have run out of time. As ever, if you have any further observations that you wish to make, please put them in writing and let the Committee have them; we shall welcome them. Thank you for your time this afternoon. We are very grateful to you.

Examination of Witnesses

Jared Sine, Nima Elmi and Dr Rachel O’Connell gave evidence.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Right. For once, we seem to have run out of questions. Minister, do you wish to contribute?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Everything I was going to ask has already been asked by my colleagues, so I will not duplicate that.

None Portrait The Chair
- Hansard -

Q In that case, given that we have the time, rather than doing what I normally do and inviting you to make any further submissions in writing, if there are any further comments that you would like to make about the Bill, the floor is yours. Let us start with Mr Sine.

Jared Sine: I would just make one brief comment. I think it has been mentioned by everyone here. Everyone has a role to play. Clearly, the Government have a role in proposing and pushing forward the legislation. The platforms that have the content have an obligation and a responsibility to try to make sure that their users are safe. One of the things that Dr O’Connell mentioned is age verification and trying to make sure that we keep young kids off platforms where they should not be.

I think there is a big role to play for the big tech platforms—the Apples and Googles—who distribute our apps. Over the years, we have said again and again to both of those companies, “We have age-gated our apps at 18, yet you will allow a user you know is 15, 14, 16—whatever it is—to download that app. That person has entered that information and yet you still allow that app to be downloaded.” We have begged and pleaded with them to stop and they will not stop. I am not sure that that can be included in the Bill, but if it could be, it would be powerful.

If Apple and Google could not distribute any of our apps—Hinge, Match, Tinder—to anyone under the age of 18, that solves it right there. It is the same methodology that has been used at clubs with bouncers—you have a bouncer at the door who makes sure you are 21 before you go in and have a drink. It should be the same thing with these technology platforms. If they are going to distribute and have these app stores, the store should then have rules that show age-gated apps—“This is for 17-plus or 18-plus”—and should also enforce that. It is very unfortunate that our calls on this front have gone unanswered. If the Bill could be modified to include that, it would really help to address the issue.

Dr Rachel O'Connell: Absolutely. I 100% support that. There is a tendency for people to say, “It is very complex. We need a huge amount of further consultation.” I started my PhD in 1996. This stuff has been going on for all that time. In 2008, there was a huge push by the Attorneys General, which I mentioned already, which brought all of the industry together. That was 2008. We are in 2022 now. 2017 was the Internet Safety Strategy Green Paper. We know what the risks are. They are known; we understand what they are. We understand the systems and processes that facilitate them. We understand what needs to be done to mitigate those risks and harms. Let’s keep on the track that we are going on.

Regarding industry’s concerns, a lot of them will be ironed out when companies are required to conduct risk assessments and impact assessments. They might ask, what are the age bands of your users? What are the risks associated with the product features that you are making available? What are the behaviour modification techniques that you are using, like endless scroll and loot boxes that get kids completely addicted? Are those appropriate for those ages? Then you surface the decision making within the business that results in harms and also the mitigations.

I urge you to keep going on this; do not be deterred from it. Keep the timeframe within which it comes into law fairly tight, because there are children out there who are suffering. As for the harassment—I have experienced it myself, it is horrible.

Those would be my final words.

--- Later in debate ---
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Thank you for your very powerful testimony, Rhiannon. I appreciate that could not have been easy. Going back to the digital literacy piece, it feels like we were talking about digital literacy in the Bill when it started coming through, and that has been removed now. How important do you think it is that we have a digital literacy strategy, and that we hold social media providers in particular to having a strategy on digital education for young people?

Rhiannon-Faye McDonald: It is incredibly important that we have this education piece. Like Susie said, we cannot rely on technology or any single part of this to solve child sexual abuse, and we cannot rely on the police to arrest their way out of the problem. Education really is the key. That is education in all areas—educating the child in an appropriate way and educating parents. We hold parenting workshops. Parents are terrified; they do not know what to do, what platforms are doing what, or what to do when things go wrong. They do not even know how to talk to children about the issue; it is embarrassing for them and they cannot bring it up. Educating parents is a huge thing. Companies have a big responsibility there. They should have key strategies in place on how they are going to improve education.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Can I start by thanking both Rhiannon-Faye and Susie for coming and giving evidence, and for all the work they are doing in this area? I know it has been done over many years in both cases.

I would like to pick up on a point that has arisen in the discussion so far—the point that Susie raised about the risks posed by Meta introducing end-to-end encryption, particularly on the Facebook Messenger service. You have referenced the fact that huge numbers of child sexual exploitation images are identified by scanning those communications, leading to the arrests of thousands of paedophiles each year. You also referenced the fact that when this was temporarily turned off in Europe owing to the privacy laws there—briefly, thankfully—there was a huge loss of information. We will come on to the Bill in a minute, but as technology stands now, if Meta did proceed with end-to-end encryption, would that scanning ability be lost?

Susie Hargreaves: Yes. It would not affect the Internet Watch Foundation, but it would affect the National Centre for Missing and Exploited Children. Facebook, as a US company, has a responsibility to do mandatory reporting to NCMEC, which will be brought in with the Bill in this country. Those millions of images would be lost, as of today, if they brought end-to-end encryption in now.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Why would it not affect the Internet Watch Foundation?

Susie Hargreaves: Because they are scanning Facebook—sorry, I am just trying to unpack the way it works. It will affect us, actually. Basically, when we provide our hash list to Facebook, it uses that to scan Messenger, but the actual images that are found—the matches—are not reported to us; they are reported into NCMEC. Facebook does take our hash list. For those of you who do not know about hashing, a hash list is a list of digital fingerprints of unique images of child sexual abuse. We currently have about 1.3 million unique images of child sexual abuse. Facebook does use our hash list, so yes it does affect us, because it would still take our hash list to use on other platforms, but it would not use it on Messenger. The actual matches would go into NCMEC. We do not know how many matches it gets against our hash list, because it goes into NCMEC.
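For readers unfamiliar with the hash lists described here, the sketch below shows the basic idea in very simplified form: an uploaded image is reduced to a fingerprint and checked against a list of fingerprints of known material. It uses an exact cryptographic hash purely for illustration; real deployments typically use perceptual hashes (PhotoDNA-style) so that resized or re-encoded copies still match, and every name and value in the sketch is an assumption rather than the IWF's or Meta's actual implementation.

```python
import hashlib

# Illustrative hash list of known child sexual abuse images, as hex digests.
# In practice this would be a perceptual-hash list supplied by a body such
# as the IWF; the entry below is a made-up placeholder.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_fingerprint(image_bytes: bytes) -> str:
    """Digital fingerprint of an image (exact-match only in this sketch)."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_material(image_bytes: bytes) -> bool:
    """True if the image's fingerprint appears on the hash list."""
    return image_fingerprint(image_bytes) in KNOWN_HASHES

# A US service finding a match would report it to NCMEC rather than
# storing or forwarding the image itself.
if matches_known_material(b"uploaded image bytes"):
    print("match found: generate report")
else:
    print("no match")
```

The point the witness goes on to make about end-to-end encryption is that a comparison like this can only happen where the service can see the image bytes; with end-to-end encryption on Messenger, the service never sees them.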

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q But its ability to check images going across Messenger against your list would effectively terminate.

Susie Hargreaves: Yes, sorry—I was unclear about that. Yes, it would on Messenger.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Clearly the Bill cannot compel the creation of technology that does not exist yet. It is hoped that there will be technology—we heard evidence earlier suggesting that it is very close to existing—that allows scanning in an end-to-end encrypted environment. Do you have any update on that that you can give the Committee? If there is no such technology, how do you think the Bill should address that? Effectively there would be a forced choice between end-to-end encryption and scanning for CSEA content.

Susie Hargreaves: As I said before, it is essential that we do not demonise end-to-end encryption. It is really important. There are lots of reasons why, from a security and privacy point of view, people want to be able to use end-to-end encryption.

In terms of whether the technology is there, we all know that there are things on the horizon. As Ian said in the previous session, the technology is there and is about to be tried out. I cannot give any update at this meeting, but in terms of what we would do if end-to-end encryption is introduced and there is no ability to scan, we could look at on-device scanning, which I believe you mentioned before, Minister.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Yes.

Susie Hargreaves: That is an option. That could be a backstop position. I think that, at the moment, we should stand our ground on this and say, “No, we need to ensure that we have some form of scanning in place if end-to-end encryption is introduced.”
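The “on-device scanning” backstop mentioned here means running that same kind of hash comparison on the sender's device, before the content is encrypted, so the service never handles plaintext. A minimal sketch of that ordering, with stand-in (not real) cryptography and entirely hypothetical function names, might look like this:

```python
import hashlib

# Placeholder hash list entry; a real list would come from a body such as the IWF.
KNOWN_HASHES = {"3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"}

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Stand-in for a real end-to-end encryption primitive; NOT real crypto."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

def send_message(content: bytes, key: bytes) -> str:
    # 1. The check runs on the device, before encryption, so only a
    #    match/no-match outcome could ever be surfaced to the platform.
    if fingerprint(content) in KNOWN_HASHES:
        return "blocked: matched known material, report generated"
    # 2. Content that does not match is end-to-end encrypted and sent as normal.
    ciphertext = encrypt(content, key)
    return f"sent {len(ciphertext)} encrypted bytes"

print(send_message(b"holiday photo", b"shared-secret"))
```

Whether checks like this can be deployed without undermining the privacy guarantees of end-to-end encryption is exactly the open question the witnesses discuss; the sketch only shows where in the pipeline such a check would sit.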

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q For complete clarity, do you agree that the use of end-to-end encryption cannot be allowed at the expense of child safety?

Susie Hargreaves: I agree 100%.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Good. Thank you.

None Portrait The Chair
- Hansard -

Thank you very much indeed, Ms McDonald and Ms Hargreaves. We are most grateful to you; thank you for your help.

Examination of Witnesses

Ellen Judson and Kyle Taylor gave evidence.

--- Later in debate ---
Baroness Keeley Portrait Barbara Keeley
- Hansard - - - Excerpts

Q I have a really simple question. You have touched on the balance between free speech rights and the rights of people who are experiencing harassment, but does the Bill do enough to protect human rights?

Ellen Judson: At the moment, no. The rights that are discussed in the Bill at the minute are quite limited: primarily, it is about freedom of expression and privacy, and the way that protections around privacy have been drafted is less strong than for those around freedom of expression. Picking up on the question about setting precedents, if we have a Bill that is likely to lead to more content moderation and things like age verification and user identity verification, and if we do not have strong protections for privacy and anonymity online, we are absolutely setting a bad precedent. We would want to see much more integration with existing human rights legislation in the Bill.

Kyle Taylor: All I would add is that if you look at the exception for content of democratic importance, and the idea of “active political issue”, right now, conversion therapy for trans people—that has been described by UN experts as torture—is an active political issue. Currently, the human rights of trans people are effectively set aside because we are actively debating their lives. That is another example of how minority and marginalised people can be negatively impacted by this Bill if it is not more human rights-centred.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Let me start with this concept—this suggestion, this claim—that there is special protection for politicians and journalists. I will come to clause 50, which is the recognised news publisher exemption, in a moment, but I think you are referring to clauses 15 and 16. If we turn to those clauses and read them carefully, they do not specifically protect politicians and journalists, but “content of democratic importance” and “journalistic content”. It is about protecting the nature of the content, not the person who is speaking it. Would you accept that?

Ellen Judson: I accept that that is what the Bill currently says. Our point was thinking about how it will be implemented in practice. If platforms are expected to prove to a regulator that they are taking certain steps to protect content of democratic importance—in the explanatory notes, that is content related to Government policy and political parties—and they are expected to prove that they are taking a special consideration of journalistic content, the most straightforward way for them to do that will be in relation to journalists and politicians. Given that it is such a broad category and definition, that seems to be the most likely effect of the regime.

Kyle Taylor: It is potentially—

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Sorry, Kyle, do come in in a second, but I just want to come back on that point.

Is it not true that a member of the public or anyone debating a legitimate political topic would also benefit from these measures? It is likely that MPs would automatically benefit—near automatically—but a member of the public might equally benefit if the topic they are talking about is of democratic or journalistic importance.

Ellen Judson: Our concern is that defining what is a legitimate political debate is itself already privileging. As you said, an MP is very likely automatically to benefit.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Well, it is likely; I would not say it is guaranteed.

Ellen Judson: A member of the public may be discussing something—for example, an active political debate that is not about the United Kingdom, which I believe would be out of scope of that protection. They would be engaged in political discussion and exercising freedom of expression, and if they were not doing so in a way that met the threshold for action based on harm, their speech should also come under those protections.

Kyle Taylor: I would add that the way in which you have described it would be so broad as to effectively be meaningless in the context of the Bill, and that instead we should be looking for universal free expression protections in that part of the Bill, and removing this provision. Because what is not, in a liberal democracy, speech of democratic importance? Really, that is everything. When does it reach the threshold where it is an active political debate? Is it when enough people speak about it or enough politicians bring it up? It is so subjective and so broad effectively to mean that everything could qualify. Again, this is not taking a harms-based approach to online safety, because the question is not “Who is saying it?” or “In what context?”; the question is, “Does this have the propensity to cause harm at scale?”

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q The harms are covered elsewhere in the Bill. This is saying what you have to take into account. In fact, at the very beginning of your remarks, Kyle, you said that some of the stuff in the US a week or two ago might have been allowed to stand under these provisions, but the provision does not provide an absolute protection; it simply says that the provider has to take it into account. It is a balancing exercise. Other parts of the Bill say, “You’ve got to look at the harm on a systemic basis.” This is saying, “You’ve got to take into account whether the content is of democratic or journalistic importance.” You made a point a second ago about general protection on free speech, which is in clause 19(2).

Kyle Taylor: Can I respond to that?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Yes, sure.

Kyle Taylor: My point is that if there is a provision in the Bill about freedom of expression, it should be robust enough that this protection does not have to be in the Bill. To me, this is saying, “Actually, our free expression bit isn’t strong enough, so we’re going to reiterate it here in a very specific context, using very select language”. That may mean that platforms decide not to act for fear of reprisal, as opposed to pursuing online safety. I suggest strengthening the freedom of expression section so that it hits all the points that the Government intend to hit, and removing those qualifiers that create loopholes and uncertainty for a regime that, if it is systems-based, does not have loopholes.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q I understand the point you are making, logically. Someone mentioned the human rights element earlier. Of course, article 10 of the European convention on human rights expresses the right to freedom of speech. The case law deriving from that ECHR article provides an enhanced level of protection, particularly for freedom of the press relative to other forms of expression, so there is some established case law which makes that point. You were talking about human rights earlier, weren’t you?

Ellen Judson: We absolutely recognise that. There is discussion in terms of meeting certain standards of responsible journalism in relation to those protections. Our concern is very much that the people and actors who would most benefit from the journalistic protections specifically would be people who do not meet those standards and cannot prove that they meet those standards, because the standards are very broad. If you intend your content to be journalistic, you are in scope, and that could apply to extremists as much as to people meeting standards of responsible journalism.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q If you are talking about clause 16, it is not that you intend it to be journalistic content; it is that it is journalistic content. You might be talking about clause 50, which is the general exemption for recognised news publishers from the provisions of the Bill. That of course does not prevent social media platforms from choosing to apply their terms and conditions to people who are recognised news publishers; it is just that the Bill is not compelling them to. It is important to make that clear—that goes back to the point you made right at the beginning, Kyle. A couple of times in your testimony so far, you have said that you think the way the definition of “recognised news publisher” is drafted in clause 50 is too wide, and potentially susceptible to, basically, abuse by people who are in essence pretending to be news publishers, but who are not really. They are using this as a way to get a free pass from the provisions of the Bill. I completely understand that concern. Do you have any specific suggestions for the Committee about how that concern might be addressed? How could we change the drafting of the Bill to deal with that issue?

Kyle Taylor: Remove the exemption.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q You mean completely? Just delete it?

Kyle Taylor: Well, I am struggling to understand how we can look at the Bill and say, “If this entity says it, it is somehow less harmful than if this entity says it.” That is a two-tiered system and that will not lead to online safety, especially when those entities that are being given privilege are the most likely and largest sources and amplifiers of harmful content online. We sit on the frontlines of this every day, looking at social media, and we can point to countless examples from around the world that will show that, with these exemptions, exceptions and exclusions, you will actually empower those actors, because you explicitly say that they are special. You explicitly say that if they cause harm, it is somehow not as bad as if a normal user with six followers on Twitter causes harm. That is the inconsistency and incoherency in the Bill.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

We are talking here about the press, not about politicians—

Kyle Taylor: Yes, but the press and media entities spread a lot of disinformation—

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q I get that. You have mentioned Viktor Orbán and the press already in your comments. There is a long-standing western tradition of treating freedom of the press as something that is sacrosanct and so foundational to the functioning of democracy that you should not infringe or impair it in any way. That is the philosophy that underpins this exclusion.

Kyle Taylor: Except that that is inconsistent in the Bill, because you are saying that for broadcast, they must have a licence, but for print press, they do not have to subscribe to an independent standards authority or code. Even within the media, there is this inconsistency within the Bill.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

That is a point that applies regardless of the Bill. The fact is that UK broadcast is regulated whereas UK newspapers are not regulated, and that has been the case for half a century. You can debate whether that is right or wrong, but—

Kyle Taylor: We are accepting that newspapers are not regulated then.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q That matter stands outside the scope of the Bill. If one was minded to tighten this up—I know that you have expressed a contrary view to the thing just being deleted—and if you were to accept that the freedom of the press is something pretty sacrosanct, but equally you don’t want it to be abused by people using it as a fig leaf to cover malfeasant activity, do you have any particular suggestions as to how we can improve the drafting of that clause?

Kyle Taylor: I am not suggesting that the freedom of the press is not sacrosanct. Actually, I am expressing the opposite, which is that I believe that it is so sacrosanct that it should be essential to the freedom-of-expression portion of the Bill, and that the press should be set to a standard that meets international human rights and journalistic standards. I want to be really clear that I absolutely believe in freedom of the press, and it is really important that we don’t leave here suggesting that we don’t think that the press should be free—

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q I got that, but as I say, article 10 case law does treat the press a little differently. We are about to run out of time. I wanted to ask about algorithms, which I will probably not have a chance to do, but are there any specific changes to the clause that you would urge us to make?

Ellen Judson: To the media exemption—

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

To clause 50, “Recognised news publisher”.

Ellen Judson: One of the changes that the Government have indicated that they are minded to make—please correct me if I misunderstood—is to introduce a right to appeal.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Correct.

Ellen Judson: I would very much urge that content not be required to stay online while the appeal is taking place, on the grounds that the content staying online might then be found to be incredibly harmful, and by the time you have got through an appeals process, it will already have done the damage it was going to do. So, if there is a right to appeal—I would urge there not to be a particular right to appeal beyond what is already in the Bill, but if one is to be included—it would be important not to require platforms to carry the content while the appeal process is ongoing.

Kyle Taylor: You could require an independent standards code as a benchmark at least.

None Portrait The Chair
- Hansard -

Order. I am afraid that brings us to the end of the time allotted for the Committee to ask questions. It also brings us to the end of the day’s sitting. On behalf of the Committee, I thank the witnesses for your evidence. As you ran out of time and the opportunity to frame answers, if you want to put them in writing and offer them to the Minister, I am sure they will be most welcome. The Committee will meet again on Thursday at 11.30 am in this room to hear further evidence on the Bill.

Ordered, That further consideration be now adjourned. —(Steve Double.)

Online Safety Bill (Third sitting)

(Limited Text - Ministerial Extracts only)

Read Full debate
Committee stage & Committee Debate - 3rd sitting
Thursday 26th May 2022

(2 years, 6 months ago)

Public Bill Committees
Online Safety Act 2023. Amendment Paper: Public Bill Committee Amendments as at 26 May 2022

This text is a record of ministerial contributions to a debate held as part of the Online Safety Act 2023 passage through Parliament.

In 1993, the House of Lords Pepper v Hart decision provided that statements made by Government Ministers may be taken as illustrative of legislative intent as to the interpretation of law.

This extract highlights statements made by Government Ministers along with contextual remarks by other members. The full debate can be read here

This information is provided by Parallel Parliament and does not comprise part of the official record

Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - - - Excerpts

Q Thank you to the witnesses for joining us and giving us such thorough and clear responses to the various questions. I want to start on a topic that William Perrin and William Moy touched on—the exemption for recognised news publishers, set out in clause 50. You both said you have some views on how that is drafted. As you said, I asked questions on Tuesday about whether there are ways in which it could be improved to avoid loopholes—not that I am suggesting there are any, by the way. Mr Perrin and Mr Moy, could you elaborate on the specific areas where you think it might be improved?

William Moy: Essentially, the tests are such that almost anyone could pass them. Without opening the Bill, you have to have a standards code, which you can make up for yourself, a registered office in the UK and so on. It is not very difficult for a deliberate disinformation actor to pass the set of tests in clause 50 as they currently stand.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q How would you change it to address that, if you think it is an issue?

William Moy: This would need a discussion. I have not come here with a draft amendment—frankly, that is the Government’s job. There are two areas of policy thinking over the last 10 years that provide the right seeds and the right material to go into. One is the line of thinking that has been done about public benefit journalism, which has been taken up in the House of Lords Communications and Digital Committee inquiry and the Cairncross review, and is now reflected in recent Charity Commission decisions. Part of Full Fact’s charitable remit is as a publisher of public interest journalism, which is a relatively new innovation, reflecting the Cairncross review. If you take that line of thinking, there might be some useful criteria in there that could be reflected in this clause.

I hate to mention the L-word in this context, but the other line of thinking is the criteria developed in the context of the Leveson inquiry for what makes a sensible level of self-regulation for a media organisation. Although I recognise that that is a past thing, there are still useful criteria in that line of thinking, which would be worth thinking about in this context. As I said, I would be happy to sit down, as a publisher of journalism, with your officials and industry representatives to work out a viable way of achieving your political objectives as effectively as possible.

William Perrin: Such a definition, of course, must satisfy those who are in the industry, so I would say that these definitions need to be firmly industry-led, not simply by the big beasts—for whom we are grateful, every day, for their incredibly incisive journalism—but by this whole spectrum of new types of news providers that are emerging. I have mentioned my experience many years ago of explaining what a blog was to DCMS.

The news industry is changing massively. I should declare an interest: I was involved in some of the work on public-benefit journalism in another capacity. We have national broadcasters, national newspapers, local papers, local broadcasters, local bloggers and local Twitter feeds, all of which form a new and exciting news media ecosystem, and this code needs to work for all of them. I suppose that you would need a very deep-dive exercise with those practitioners to ensure that they fit within this code, so that you achieve your policy objective.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Okay, thank you. I am not sure that I can take anything specific away from that. Perhaps that illustrates the difficulty of legislating. The clause, as drafted, obviously represents the best efforts, thus far, to deal with an obviously difficult and complicated issue.

We heard some commentary earlier—I think from Mr Moy—about the need to address misinformation, particularly in the context of a serious situation such as the recent pandemic. I think you were saying that there was a meeting, in March or April 2020, for the then Secretary of State and social media firms to discuss the issue and what steps they might take to deal with it. You said that it was a private meeting and that it should perhaps have happened more transparently.

Do you accept that the powers conferred in clause 146, as drafted, do, in fact, address that issue? They give the Secretary of State powers, in emergency situations—a public health situation or a national security situation, as set out in clause 146(1)—to address precisely that issue of misinformation in an emergency context. Under that clause, it would happen in a way that was statutory, open and transparent. In that context, is it not a very welcome clause?

William Moy: I am sorry to disappoint you, Minister, but no, I do not accept that. The clause basically attaches to Ofcom’s fairly weak media literacy duties, which, as we have already discussed, need to be modernised and made harms-based and safety-based.

However, more to the point, the point that I was trying to make is that we have normalised a level of censorship that was unimaginable in previous generations. A significant part of the pandemic response was, essentially, some of the main information platforms in all of our day-to-day lives taking down content in vast numbers and restricting what we can all see and share. We have started to treat that as a normal part of our lives, and, as someone who believes that the best way to inform debate in an open society is freedom of expression, which I know you believe, too, Minister, I am deeply concerned that we have normalised that. In fact, you referred to it in your Times article.

I think that the Bill needs to step in and prevent that kind of overreach, as well as the triggering of unneeded reactions. In the pandemic, the political pressure was all on taking down harmful health content; there was no countervailing pressure to ensure that the systems did not overreach. We therefore found ridiculous examples, such as police posts warning of fraud around covid being taken down by the internet companies’ automated systems because those systems were set to, essentially, not worry about overreach.

That is why we are saying that we need, in the Bill, a modern, open-society approach to misinformation. That starts with it recognising misinformation in the first place. That is vital, of course. It should then go on to create a modern, harms-based media literacy framework, and to prefer content-neutral and free-speech-based interventions over content-restricting interventions. That was not what was happening during the pandemic, and it is not what will happen by default. It takes Parliament to step in and get away from this habitual, content-restriction reaction and push us into an open-society-based response to misinformation.

William Perrin: Can I just add that it does not say “emergency”? It does not say that at all. It says “reasonable grounds” that “present a threat”—not a big threat—under “special circumstances”. We do not know what any of that means, frankly. With this clause, I get the intent—that it is important for national security, at times, to send messages—but this has not been done in the history of public communication before. If we go back through 50 or 60 years, even 70 years, of Government communication, the Government have bought adverts and put messages transparently in place. Apart from D-notices, the Government have never sought to interfere in the operations of media companies in quite the way that is set out here.

If this clause is to stand, it certainly needs a much higher threshold before the Secretary of State can act—such as who they are receiving advice from. Are they receiving advice from directors of public health, from the National Police Chiefs’ Council or from the national security threat assessment machinery? I should declare an interest; I worked in there a long time ago. It needs a higher threshold and greater clarity, but you could dispense with this by writing to Ofcom and saying, “Ofcom, you should have regard to these ‘special circumstances’. Why don’t you take actions that you might see fit to address them?”

Many circumstances, such as health or safety, are national security issues anyway if they reach a high enough level for intervention, so just boil it all down to national security and be done with it.

Professor Lorna Woods: If I may add something about the treatment of misinformation more generally, I suspect that if it is included in the regime, or if some subset such as health misinformation is included in the regime, it will be under the heading of “harmful to adults”. I am picking up on the point that Mr Moy made that the sorts of interventions will be more about friction and looking at how disinformation is incentivised and spread at an earlier stage, rather than reactive takedown.

Unfortunately, the measures that the Bill currently envisages for “harmful but legal” seem to focus more on the end point of the distribution chain. We are talking about taking down content and restricting access. Clause 13(4) gives the list of measures that a company could employ in relation to priority content harmful to adults.

I suppose that you could say, “Companies are free to take a wider range of actions”, but my question then is this: where does it leave Ofcom, if it is trying to assess compliance with a safety duty, if a company is doing something that is not envisaged by the Act? For example, taking bot networks offline, if that is thought a key factor in the spreading of disinformation—I see that Mr Moy is nodding. A rational response might be, “Let’s get rid of bot networks”, but that, as I read it, does not seem to be envisaged by clause 13(4).

I think that is an example of a more general problem. With “harmful but legal”, we would want to see less emphasis on takedown and more emphasis on friction, but the measures listed as envisaged do not go that far up the chain.

None Portrait The Chair
- Hansard -

Minister, we have just got a couple of minutes left, so perhaps this should be your last question.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Yes. On clause 13(4), the actions listed there are quite wide, given that they include not just “taking down the content”—as set out in clause 13(4)(a)—but also

“(b) restricting users’ access to the content;

(c) limiting the recommendation or promotion of the content;

(d) recommending or promoting the content.”

I would suggest that those actions are pretty wide, as drafted.

One of the witnesses—I think it was Mr Moy—talked about what were essentially content-agnostic measures to impede virality, and used the word “friction”. Can you elaborate a little bit on what you mean by that in practical terms?

William Moy: Yes, I will give a couple of quick examples. WhatsApp put a forwarding limit on WhatsApp messages during the pandemic. We knew that WhatsApp was a vector through which misinformation could spread, because forwarding is so easy. They restricted it to, I think, six forwards, and then you were not able to forward the message again. That is an example of friction. Twitter has a note whereby if you go to retweet something but you have not clicked on the link, it says, “Do you want to read the article before you share this?” You can still share it, but it creates that moment of pause for people to make a more informed decision.
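
To make the mechanism concrete, the sketch below is a minimal, hypothetical illustration of the kind of content-neutral friction Mr Moy describes; the forwarding limit and all the names are invented for the example and are not drawn from any platform’s actual implementation.

```python
# Hypothetical sketch of content-neutral "friction": the rules look only at how
# content is being shared, never at what it says. All names and limits are invented.

MAX_FORWARDS = 5  # e.g. a forwarding cap of the kind WhatsApp applied in the pandemic


def can_forward(times_already_forwarded: int) -> bool:
    """Block further forwarding once a message has been passed on too many times."""
    return times_already_forwarded < MAX_FORWARDS


def reshare_prompt(user_opened_link: bool) -> str:
    """If the user has not opened the linked article, pause and ask them to confirm
    before resharing; they can still choose to share it."""
    if user_opened_link:
        return "share immediately"
    return "Do you want to read the article before you share this?"


if __name__ == "__main__":
    print(can_forward(3))         # True: under the cap
    print(can_forward(6))         # False: forwarding blocked regardless of content
    print(reshare_prompt(False))  # the pause-for-thought prompt
```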

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you. Would you accept that the level of specificity that you have just outlined there is very difficult, if not impossible, to put in a piece of primary legislation?

William Moy: But that is not what I am suggesting you do. I am suggesting you say that this Parliament prefers interventions that are content-neutral or free speech-based, and that inform users and help them make up their own minds, to interventions that restrict what people can see and share.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q But a piece of legislation has to do more than express a preference; it has to create a statutory duty. I am just saying that that is quite challenging in this context.

William Moy: I do not think it is any more challenging than most of the risk assessments, codes of practice and so on, but I am willing to spend as many hours as it takes to talk through it with you.

None Portrait The Chair
- Hansard -

Order. I am afraid that we have come to the end of our allotted time for questions. On behalf of the Committee, I thank the witnesses for all their evidence.

Examination of Witnesses

Danny Stone MBE, Stephen Kinsella OBE and Liron Velleman gave evidence.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Would any other witness like to contribute? No.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you again to the witnesses for joining us this morning. I will start with Stephen Kinsella. You have spoken already about some of the issues to do with anonymity. Can you share with the Committee your view on the amendments made to the Bill, when it was introduced a couple of months ago, to give users choices over self-verification and the content they see? Do you think they are useful and helpful updates to the Bill?

Stephen Kinsella: Yes. We think they are extremely helpful. We welcome what we see in clause 14 and clause 57. There is thus a very clear right to be verified, and an ability to screen out interactions with unverified accounts, which is precisely what we asked for. The Committee will be aware that we have put forward some further proposals. I would really hesitate to describe them as amendments; I see them as shading-in areas—we are not trying to add anything. We think that it would be helpful, for instance, when someone is entitled to be verified, that verification status should also be visible to other users. We think that should be implicit, because it is meant to act as a signal to others as to whether someone is verified. We hope that would be visible, and we have suggested the addition of just a few words into clause 14 on that.
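
As a rough, purely illustrative sketch of the system effect being requested here, the following code shows a visible verification flag and a user-controlled filter on unverified interactions; the data structures are invented for the example and are not taken from the Bill or from any platform.

```python
# Hypothetical sketch: a visible verification flag plus a user-chosen filter on
# interactions from unverified accounts. Structures are invented for illustration.
from dataclasses import dataclass


@dataclass
class Account:
    handle: str
    verified: bool  # the verification status that, ideally, other users can see


@dataclass
class Reply:
    sender: Account
    text: str


def visible_replies(replies: list[Reply], screen_unverified: bool) -> list[Reply]:
    """Apply the user's choice: screen out interactions from unverified accounts."""
    if not screen_unverified:
        return replies
    return [r for r in replies if r.sender.verified]


if __name__ == "__main__":
    replies = [
        Reply(Account("@named_user", verified=True), "A reply you will see"),
        Reply(Account("@anon12345", verified=False), "A reply you have chosen not to see"),
    ]
    for r in visible_replies(replies, screen_unverified=True):
        print(r.sender.handle, "-", r.text)
```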

We think that the Bill would benefit from a further definition of what it means by “user identity verification”. We have put forward a proposal on that. It is such an important term that I think it would be helpful to have it as a defined term in clause 189. Finally, we have suggested a little bit more precision on the things that Ofcom should take into account when dealing with platforms. I have been a regulatory lawyer for nearly 40 years, and I know that regulators often benefit from having that sort of clarity. There is going to be negotiation between Ofcom and the platforms. If Ofcom can refer to a more detailed list of the factors it is supposed to take into account, I think that will speed the process up.

One of the reasons we particularly welcomed the structure of the Bill is that there is no wait for detailed codes of conduct because these are duties that we will be executing immediately. I hope Ofcom is working on the guidance already, but the guidance could come out pretty quickly. Then there would be the process of—maybe negotiating is the wrong word—to-and-fro with the platforms. I would be very reluctant to take too much on trust. I do not mean on trust from the Government; I mean on trust from the platforms—I saw the Minister look up quickly then. We have confidence in Government; it is the platforms we are a little bit wary of. I heard the frustration expressed on Tuesday.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

indicated assent.

Stephen Kinsella: I think you said, “If platforms care about the users, why aren’t they already implementing this?” Another Member, who is not here today, said, “Why do they have to be brought kicking and screaming?” Yet, every time platforms were asked, we heard them say, “We will have to wait until we see the detail of—”, and then they would fill in whatever thing is likely to come last in the process. So we welcome the approach. Our suggestions are very modest and we are very happy to discuss them with you.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Yes, and thank you for the work that you have done on this issue, together with Siobhan Baillie, my hon. Friend the Member for Stroud, which the Government adopted. Some of the areas that you have referred to could be dealt with in subsequent Ofcom codes of practice, but we are certainly happy to look at your submissions. Thank you for the work that you have done in this area.

Danny, we have had some fairly extensive discussions on the question of small but toxic platforms such as 4chan and BitChute—thank you for coming to the Department to discuss them. I heard your earlier response to the shadow Minister, but do you accept that those platforms should be subject to duties in the Bill in relation to content that is illegal and content that is already harmful to children?

Danny Stone: Yes, that is accurate. My position has always been that that is a good thing. The extent and the nature of the content that is harmful to adults on such platforms—you mentioned BitChute but there are plenty of others—require an additional level of regulatory burden and closer proximity to the regulator. Those platforms should have to account for it and say, “We are the platforms; we are happy that this harm is on our platform and”—as the Bill says—“we are promoting it.” You are right that it is captured to some degree; I think it could be captured further.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q I understand; thank you. Liron, in an earlier answer, you referred to the protections for content of democratic importance and journalistic content, which are set out in clauses 15 and 16. You suggested and were concerned that they could act as a bar to hateful, prohibited or even illegal speech being properly enforced against. Do you accept that clauses 15 and 16 do not provide an absolute protection for content of democratic importance or journalistic content, and that they do not exempt such content from the Bill’s provisions? They simply say that in discharging duties under the Bill, operators must use

“proportionate systems and processes…to ensure that…content of democratic”—

or journalistic—

“importance is taken into account”.

That is not an absolute protection; it is simply a requirement to take into account and perform a proportionate and reasonable balancing exercise. Is that not reasonable?

Liron Velleman: I have a couple of things to say on that. First, we and others in civil society have spent a decade trying to de-platform some of the most harmful actors from mainstream social media companies. What we do not want to see after the Bill becomes an Act are massive test cases where we do not know which way they will go and where it will be up to either the courts or social media companies to make their own decisions on how much regard they place in those exemptions at the same time as all the other clauses.

Secondly, one of our main concerns is the time it takes for some of that content to be removed. If we have a situation in which there is an expedited process for complaints to be made, and for journalistic content to remain on the platform for an announced time until the platform is able to take it down, that could move far outside the realms of that journalistic or democratically important content. Again, using the earlier examples, it does not take long for content such as a livestream of a terrorist attack to be up on the Sun or the Daily Mirror websites and for lots of people to modify that video and bypass content, which can then be shared and used to recruit new terrorists and allow copycat attacks to happen, and can go into the worst sewers of the internet. Any friction that is placed on stopping platforms being able to take down some of that harm is definitely of particular concern to us.

Finally, as we heard on Tuesday, social media platforms—I am not sure I would agree with much of what they would say about the Bill, but I think this is true—do not really understand what they are meant to do with these clauses. Some of them are talking about flowcharts and whether this is a point-scoring system that says, “You get plus one for being a journalist, but minus two for being a racist.” I am not entirely sure that platforms will exercise the same level of regard. If, with some of the better-faith actors in the social media space, we have successfully taken down huge reams of the most harmful content and moved it away from where millions of people can see it to where only tens of thousands can see it, we do not want to open up, in any way, the risk that hundreds of people could argue that they should be back on platforms from which they are currently absent.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Okay, thank you. My last question touches on those issues and is for each of the panel in turn. Some people have claimed—I think wrongly—that the provisions in the Bill in some way threaten free speech. As you will have seen in the article I wrote in The Times earlier this week, I do not think, for a number of reasons, that that is remotely true, but I would be interested in hearing the views of each of the panel members on whether there is any risk to freedom of speech in the work that the Bill does in terms of protecting people from illegal content, harm to children and content that is potentially harmful to adults.

Danny Stone: My take on this—I think people have misunderstood the Bill—is that it ultimately creates a regulated marketplace of harm. As a user, you get to determine how harmful a platform you wish to engage with—that is ultimately what it does. I do not think that it enforces content take-downs, except in relation to illegal material. It is about systems, and in some places, as you have heard today, it should be more about systems, introducing friction, risk-assessing and showing the extent to which harm is served up to people. That has its problems.

The only other thing on free speech is that we sometimes take too narrow a view of it. People are crowded out of spaces, particularly minority groups. If I, as a Jewish person, want to go on 4chan, it is highly unlikely that I will get a fair hearing there. I will be threatened or bullied out of that space. Free speech has to apply across the piece; it is not limited. We need to think about those overlapping harms when it comes to human rights—not just free speech but freedom from discrimination. We need to be thinking about free speech in its widest context.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you. You made a very important point: there is nothing in the Bill that requires censorship or prohibition of content that is legal and harmless to children. That is a really important point.

Stephen Kinsella: I agree entirely with what Danny was saying. Of course, we would say that our proposals have no implications for free speech. What we are talking about is the freedom not to be shouted at—that is really what we are introducing.

On disinformation, we did some research in the early days of our campaign that showed that a vast amount of the misinformation and disinformation around the 5G covid conspiracy was spread and amplified by anonymous or unverified accounts, so they play a disproportionate role in disseminating that. They also play a disproportionate role in disseminating abuse, and I think you may have a separate session with Kick It Out and the other football bodies. They have some very good research that shows the extent to which abusive language is from unverified or anonymous accounts. So, no, we do not have any free speech concerns at Clean up the Internet.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Good. Thank you, Stephen. Liron?

Liron Velleman: We are satisfied that the Bill adequately protects freedom of speech. Our key view is that, if people are worried that it does not, beefing up the universal protections for freedom of speech should be the priority, instead of what we believe are potentially harmful exemptions in the Bill. We think that freedom of speech for all should be protected, and we very much agree with what Danny said—that the Bill should be about enhancing freedom of speech. There are so many communities that do not use social media platforms because of the harm that exists currently on platforms.

On children, the Bill should not be about limiting freedom of speech, but a large amount of our work covers the growth of youth radicalisation, particularly in the far right, which exists primarily online and which can then lead to offline consequences. You just have to look at the number of arrests of teenagers for far-right terrorism, and so much of that comes from the internet. Part of the Bill is about moderating online content, but it definitely serves to protect against some of the offline consequences of what exists on the platform. We would hope that, if people are looking to strengthen freedom of speech, it is as a universalist principle in the Bill, applying to everyone rather than to some groups but not others.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Good. Thank you. I hope the Committee is reassured by those comments on the freedom of speech question.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Q I will use the small amount of time we have left to ask one question. A number of other stakeholders and witnesses have expressed concerns regarding the removal of a digital media literacy strategy from the Bill. What role do you see a digital media literacy strategy playing in preventing the kind of abuse that you have been describing?

Danny Stone: I think that a media literacy strategy is really important. There is, for example, UCL data on the lack of knowledge of the word “antisemitism”: 68% of nearly 8,000 students were unfamiliar with the term’s meaning. Dr Tom Harrison has discussed cultivating cyber-phronesis—this was also in an article by Nicky Morgan in the “Red Box” column some time ago—which is a method of building practical knowledge over time to make the right decisions when presented with a moral challenge. We are not well geared up as a society—I am looking at my own kids—to educate young people about their interactions, about what it means when they are online in front of that box and about to type something, and about what might be received back. I have talked about some of the harms people might be directed to, even through Alexa, but some kind of wider strategy, which goes beyond what is already there from Ofcom—during the Joint Committee process, the Government said that Ofcom already has its media literacy requirements—and which, as you heard earlier, updates it to make it more fit for purpose for the modern age, would be very appropriate.

Stephen Kinsella: I echo that. We also think that that would be welcome. When we talk about media literacy, we often find ourselves with the platforms throwing all the obligation back on to the users. Frankly, that is one of the reasons why we put forward our proposal, because we think that verification is quite a strong signal. It can tell you quite a lot about how likely it is that what you are seeing or reading is going to be true if someone is willing to put their name to it. Seeing verification is just one contribution. We are really talking about trying to build or rebuild trust online, because that is what is seriously lacking. That is a system and design failure in the way that these platforms have been built and allowed to operate.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q The shadow Minister’s question is related to the removal of what was clause 103 in the old draft of the Bill. As she said, that related to media literacy. Does the panel draw any comfort from three facts? First, there is already a media literacy duty on Ofcom under section 11 of the Communications Act 2003—the now deleted clause 103 simply provided clarification on an existing duty. Secondly, last December, after the Joint Committee’s deliberations, but before the updated Bill was published, Ofcom published its own updated approach to online media literacy, which laid out the fact that it was going to expand its media literacy programme beyond what used to be in the former clause 103. Finally, the Government also have their own media literacy strategy, which is being funded and rolled out. Do those three things—including, critically, Ofcom’s own updated guidance last December—give the panel comfort and confidence that media literacy is being well addressed?

Liron Velleman: If the Bill is seeking to make the UK the safest place to be on the internet, it seems to be the obvious place to put in something about media literacy. I completely agree with what Danny said earlier: we would also want to specifically ensure—although I am sure this already exists in some other parts of Ofcom and Government business—that there is much greater media literacy for adults as well as children. There are lots of conversations about how children understand use of the internet, but what we have seen, especially during the pandemic, is the proliferation of things like community Facebook groups, which used to be about bins and a fair that is going on this weekend, becoming about the worst excesses of harmful content. People have seen conspiracy theories, and that is where we have seen some of the big changes to how the far-right and other hateful groups operate, in terms of being able to use some of those platforms. That is because of a lack of media literacy not just among children, but among the adult population. I definitely would encourage that being in the Bill, as well as anywhere else, so that we can remove some of those harms.

Danny Stone: I think it will need further funding, beyond what has already been announced. That might put a smile on the faces of some Department for Education officials, who looked so sad during some of the consultation process—trying to ensure that there is proper funding. If you are going to roll this out across the country and make it fit for purpose, it is going to cost a lot of money.

None Portrait The Chair
- Hansard -

Thank you. As there are no further questions from Members, I thank the witnesses for their evidence. That concludes this morning’s sitting.

Ordered, That further consideration be now adjourned. —(Steve Double.)

Online Safety Bill (Fourth sitting)

(Limited Text - Ministerial Extracts only)

Read Full debate
Committee stage & Committee Debate - 4th sitting
Thursday 26th May 2022

(2 years, 6 months ago)

Public Bill Committees
Online Safety Act 2023. Amendment Paper: Public Bill Committee Amendments as at 26 May 2022

This text is a record of ministerial contributions to a debate held as part of the Online Safety Act 2023 passage through Parliament.

In 1993, the House of Lords Pepper v Hart decision provided that statements made by Government Ministers may be taken as illustrative of legislative intent as to the interpretation of law.

This extract highlights statements made by Government Ministers along with contextual remarks by other members. The full debate can be read here

This information is provided by Parallel Parliament and does not comprise part of the official record

Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q You have no concerns about that.

Stephen Almond: No.

Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - - - Excerpts

Q Mr Almond, welcome to the Committee. Thank you for joining us this afternoon. Can I start with co-operation? You mentioned a moment ago in answer to Maria Miller that co-operation between regulators, particularly in this context the ICO and Ofcom, was going to be very important. Would you describe the co-operative work that is happening already and that you will be undertaking in the future, and comment on the role that the Digital Regulation Cooperation Forum has in facilitating that?

Stephen Almond: Thank you very much. I will start by explaining the Digital Regulation Cooperation Forum. It is a voluntary, not statutory, forum that brings together ourselves, Ofcom, the Competition and Markets Authority and the Financial Conduct Authority—some of the regulators with the greatest interest in digital regulation—to make sure that we have a coherent approach to the regulation of digital services in the interests of the public and indeed the economy.

We are brought together through our common interest. We do not require a series of duties or statutory frameworks to make us co-operate, because the case for co-operation is very, very clear. We will deliver better outcomes by working together and by joining up where our powers align. I think that is what you are seeing in practice in some of the work we have done jointly—for example, around the implementation of the children’s code alongside Ofcom’s implementation of the video-sharing platform regime. A joined-up approach to questions about, for example, how you assure the age of children online is really important. That gives me real confidence in reassuring the Committee that the ICO, Ofcom and other digital regulators will be able to take a very joined-up approach to regulating in the context of the new online safety regime.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you very much. That is extremely helpful. From the perspective of privacy, how satisfied are you that the Bill as constructed gives the appropriate protections to users’ privacy?

Stephen Almond: In our view, the Bill strikes an appropriate balance between privacy and online safety. The duties in the Bill should leave service providers in no doubt that they must comply with data protection law, and that they should guard against unwarranted intrusions into privacy. In my discourse with firms, I am very clear that this is not a trade-off between online safety and privacy: it is both. We are firmly expecting that companies take that forward and work out how they are going to adopt both a “privacy by design” and a “safety by design” approach to the delivery of their services. They must deliver both.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you. My final question is this: do you feel the Bill has been constructed in such a way that it works consistently with the data protection provisions, such as UK GDPR and the Data Protection Act 2018?

Stephen Almond: In brief, yes. We feel that the Bill has been designed to work alongside data protection law, for which we remain the statutory regulator, but with appropriate mechanisms for co-operation with the ICO—so, with this series of consultation duties where codes of practice or guidance that could be issued by Ofcom may have an impact on privacy. We think that is the best way of assuring regulatory coherence in this area.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

That is very helpful. Thank you very much indeed.

None Portrait The Chair
- Hansard -

Mr Almond, we are trying to get a pint into a half-pint pot doing this, so we are rushing a bit. If, when you leave the room, you have a “I wish I’d said that” moment, please feel free to put it in writing to us. We are indebted to you. Thank you very much indeed.

Examination of Witnesses

Sanjay Bhandari and Lynn Perry gave evidence.

--- Later in debate ---
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q Should the Bill commit to that?

Lynn Perry: As a recommendation, we think that could only strengthen the protections of children.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Picking up that last point about representation for particular groups of users, including children, Ms Perry, do you agree that the ability to designate organisations that can make super-complaints might be an extremely valuable avenue, in particular for organisations that represent user groups such as children? Organisations such as yours could get designated and then speak on behalf of children in a formal context. You could raise super-complaints with the regulator on behalf of the children you speak for. Is that something to welcome? Would it address the point made by my colleague, Kim Leadbeater, a moment ago?

Lynn Perry: We would welcome provision to be able to bring particularly significant evidence of concern. That is certainly something that organisations, large charities in the sector and those responsible for representing the rights of children and young people would welcome. On some of these issues, we work in coalition to make representations on behalf of children and young people, as well as of parents and carers, who also raise some concerns. The ability to do that and to strengthen the response is something that would be welcomed.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q I am glad you welcome that. I have a question for both witnesses, briefly. You have commented in some detail on various aspects of the Bill, but do you feel that the Bill as a whole represents a substantial step forward in protecting children, in your case, Ms Perry, and those you speak for, Sanjay?

Sanjay Bhandari: Our beneficiaries are under-represented or minority communities in sports. I agree, I think that the Bill goes a substantial way to protecting them and to dealing with some of the issues that we saw most acutely after the Euro 2020 finals.

We have to look at the Bill in context. This is revolutionary legislation, which we are not seeing anywhere else in the world. We are going first. The basic sanctions framework and the 10% fines I have seen working in other areas—anti-trust in particular. In Europe, that has a long history. The definition of harm being in the manner of dissemination will pick up pile-ons and some forms of trolling that we see a lot of. Hate crime being designated as priority illegal content is a big one for us, because it puts the proactive duty on the platforms. That too will take away quite a lot of content, we think. The new threatening communications offence we have talked about will deal with rape and death threats. Often the focus is on, quite rightly, the experience of black professional footballers, but there are also other people who play, watch and work in the game, including our female pundits and our LGBT fan groups, who also get loads of this abuse online. The harm-based offence—communications sent to cause harm without reasonable excuse—will likely cover things such as malicious tagging and other forms of trolling. I have already talked about the identification, verification and anonymity provisions.

I think that the Bill will go a substantial way. I am still interested in what fits into that residual category of content harmful to adults, but rather than enter into an arid philosophical and theoretical debate, I will take the spirit of the Bill and try to tag it to real content.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Before I turn to Ms Perry with the same question about the Bill’s general effect, Sanjay, you mentioned the terrible incidence of abuse that the three England footballers got after the penalties last summer. Do you think the social media firms’ response to that incident was adequate, or anywhere close to adequate? If not, does that underline the need for this legislation?

Sanjay Bhandari: I do not think it was adequate because we still see stuff coming through. They have the greatest power to stop it. One thing we are interested in is improving transparency reporting. I have asked them a number of times, “Someone does not become a troll overnight, in the same way that someone does not become a heroin addict overnight, or commit an extremist act of terrorism overnight. There is a pathway where people start off, and you have that data. Can I have it?” I have lost count of the number of times that I have asked for that data. Now I want Ofcom to ask them for it.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Yes. There are strong powers in the Bill for Ofcom to do precisely that. Ms Perry, may I ask you same general question? Do you feel that the Bill represents a very substantial step forward in protecting children?

Lynn Perry: We do. Barnardo’s really welcomes the Bill. We think it is a unique and once-in-a-generation opportunity to achieve some really long-term changes to protect children from a range of online harms. There are some areas in which the Bill could go further, which we have talked about today. The opportunity that we see here is to make the UK the safest place in the world for children to be online. There are some very important provisions that we welcome, not least on age verification, the ability to raise issues through super-complaints, which you have asked me about, and the accountability in various places throughout the Bill.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you, Ms Perry. Finally, Mr Bhandari, some people have raised concerns about free speech. I do not share those concerns—in fact, I rebutted them in a Times article earlier this week—but does the Bill cause you any concern from a free-speech perspective?

Sanjay Bhandari: As I said earlier, there are no absolute rights. There is no absolute right to freedom of speech—I cannot shout “Fire!” here—and there is no absolute right to privacy; I cannot use my anonymity as a cloak for criminality. It is a question of drawing an appropriate balance. In my opinion, the Bill draws an appropriate balance between the right to freedom of speech and the right to privacy. I believe in both, but in the same way that I believe in motherhood and apple pie: of course I believe in them. It is really about the balancing exercise, and I think this is a sensible, pragmatic balancing exercise.

None Portrait The Chair
- Hansard -

Ms Perry, I am very pleased that we were finally able to hear from you. Thank you very much indeed—you have been very patient. Thank you very much, Mr Bhandari. If either of you, as a result of what you have heard and been asked today, have any further thoughts that you wish to submit, please do so.

Examination of Witnesses

Eva Hartshorn-Sanders and Poppy Wood gave evidence.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Thank you. Minister.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you for joining us this afternoon and for giving us your evidence so far. At the beginning of your testimony, Ms Hartshorn-Sanders, I think you mentioned—I want to ensure I heard correctly—that you believe, or have evidence, that Instagram is still, even today, failing to take down 90% of inappropriate content that is flagged to it.

Eva Hartshorn-Sanders: Our “Hidden Hate” report was on DMs—direct messages—that were shared by the participants in the study. One in 15 of those broke the terms and conditions that Instagram had set out related to misogynist abuse—sexual abuse. That was in the wake of the World Cup, so after Instagram had done a big promotion about how great it was going to be in having policies on these issues going forward. We found that 90% of that content was not acted on when we reported it. This was not even them going out proactively to find the content and not doing anything with it; it was raised for their attention, using their systems.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q That clearly illustrates the problem we have. Two parts of the Bill are designed to address this: first, the ability for designated user representation groups to raise super-complaints—an issue such as the one you just mentioned, a systemic issue, could be the subject of such a super-complaint to Ofcom, in this case about Instagram—and, secondly, at clause 18, the Bill imposes duties on the platforms to have proper complaints procedures, through which they have to deal with complaints properly. Do those two provisions, the super-complaints mechanism for representative groups and clause 18 on complaints procedures, go a long way towards addressing the issue that you helpfully and rightly identified?

Eva Hartshorn-Sanders: That will depend on transparency, as Poppy mentioned. How much of that information can be shared? We are doing research at the moment on data that is shared personally, or is publicly available through the different tools that we have. So it is strengthening access to that data.

There is this information asymmetry that happens at the moment, where big tech is able to see patterns of abuse. In some cases, as in the misogyny report, you have situations where a woman might be subject to abuse from one person over and over again. The way that is treated in the EU is that Instagram will go back and look at the last 30 historically to see the pattern of abuse that exists. They are not applying that same type of rigour to other jurisdictions. So it is about having access to that data in the audits that take place. Everyone should be safe online, so this should be a safety-by-design feature that the companies have.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Meta claimed in evidence to the Committee on Tuesday that it gave researchers good access to its data. Do you think that is true?

Eva Hartshorn-Sanders: I think it depends on who the researchers are. I personally do not have experience of it, but I cannot speak to that. On transparency, at the moment, the platforms generally choose what they share. They do not necessarily give you the data that you need. You can hear from my accent that I am originally from New Zealand. I know that in the wake of the Christchurch mosque terrorist attack, they were not prepared to provide the independent regulator with data on how many New Zealanders had seen the footage of the livestream, which had gone viral globally. That is inexcusable, really.

None Portrait The Chair
- Hansard -

Q Ms Wood, do you want to comment on any of this before we move on?

Poppy Wood: On the point about access to data, I do not believe that the platforms go as far as they could, or even as far as they say they do. Meta have a tool called CrowdTangle, which they use to provide access to data for certain researchers who are privileged enough to have access. That does not even include comments on posts; it is only the posts themselves. The platforms pull the rug out all the time from under researchers who are investigating things that the platforms do not like. We saw that with Laura Edelson at New York University, who they just cut off—that is one of the most famous cases. I think it is quite egregious of Meta to say that they give lots of access to data.

We know from the revelations of whistleblowers that Meta do their own internal research, and when they do not like the results, they just bury it. They might give certain researchers access to data under certain provisions, but independent researchers who want to investigate a certain emergent harm or a certain problem are not being given the sort of access that they really need to get insights that move the needle. I am afraid that I just do not believe that at all.

The Bill could go much further. A provision on access to data in clause 136 states that Ofcom has two years to issue a report on whether researchers should get access to data. I think we know that researchers should have access to data, so I would, as a bare minimum, shorten the time that Ofcom has to do that report from two years to six months. You could turn that into a question of how to give researchers access to data rather than of whether they should get it. The Digital Services Act—the EU equivalent of the Bill—goes a bit further on access to data than our Bill. One result of that might be that researchers go to the EU to get their data because they can get it sooner.

Improving the Bill’s access to data provisions is a no-brainer. It is a good thing for the Government because we will see more stuff coming out of academia, and it is a good thing for the safety tech sector, because the more research is out there, the more tools can be built to tackle online harms. I certainly call on the Government to think about whether clause 136 could go further.

None Portrait The Chair
- Hansard -

Thank you. Last brief question, Minister.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Goodness! There is a lot to ask about.

None Portrait The Chair
- Hansard -

Sorry, we are running out of time.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q I appreciate that; thank you, Sir Roger. Ms Wood, you mentioned misinformation in your earlier remarks—I say “misinformation” rather than “state-sponsored disinformation”, which is a bit different. It is very difficult to define that in statute and to have an approach that does not lead to bias or to what might be construed as censorship. Do you have any particular thoughts on how misinformation could be concretely and tangibly addressed?

Poppy Wood: It is not an easy problem to solve, for sure. What everybody is saying is that you do it in a content-neutral way, so that you are not talking about listing specific types of misinformation but about the risks that are built into your system and that need to be mitigated. This is a safety by design question. We have heard a lot about introducing more friction into the system, checking the virality threshold, and being more transparent. If you can get better on transparency, I think you will get better on misinformation.

If there is more of an obligation on the platforms to, first, do a broader risk assessment outside of the content that will be listed as priority content and, secondly, introduce some “harm reduction by design” mechanisms, through friction and stemming virality, that are not specific to certain types of misinformation, but are much more about safety by design features—if we can do that, we are part of the way there. You are not going to solve this problem straightaway, but you should have more friction in the system, be it through a code of practice or a duty somewhere to account for risk and build safer systems. It cannot be a content play; it has to be a systems play.
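
A minimal sketch of what a “harm reduction by design” rule of that kind could look like follows; the threshold and field names are invented for illustration and are not taken from the Bill, any code of practice or any platform.

```python
# Hypothetical sketch of a content-neutral virality check: once a post spreads
# unusually fast, slow amplification and add friction, without judging the content.
from dataclasses import dataclass


@dataclass
class Post:
    shares_in_last_hour: int


VIRALITY_THRESHOLD = 1_000  # invented figure purely for illustration


def amplification_decision(post: Post) -> str:
    """Above the threshold, pause algorithmic promotion and prompt before resharing;
    below it, treat the post normally. The rule never inspects what the post says."""
    if post.shares_in_last_hour >= VIRALITY_THRESHOLD:
        return "pause promotion and prompt users before they reshare"
    return "promote normally"


if __name__ == "__main__":
    print(amplification_decision(Post(shares_in_last_hour=40)))
    print(amplification_decision(Post(shares_in_last_hour=5_000)))
```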

None Portrait The Chair
- Hansard -

Thank you. I am sorry, but that brings us to the end of the time allotted to this session. Ladies, if either of you wishes to make a submission in writing in the light of what you have not answered or not been able to answer, please do. Ms Wood, Ms Hartshorn-Sanders, thank you very much indeed for joining us.

Examination of Witnesses

Owen Meredith and Matt Rogerson gave evidence.

--- Later in debate ---
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q My only concern is that someone who just decides to call themselves a journalist will be able to say what they want.

Owen Meredith: I do not think that would be allowable under the Bill, because of the distinction between a recognised news publisher publishing what we would all recognise as journalistic content, versus the journalistic content exemption. I think that is why they are treated differently.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Can I start by clarifying a comment that Owen Meredith made at the very beginning? You were commenting on where you would like the Bill to go further in protecting media organisations, and you said that you wanted there to be a wholesale exemption for recognised news publishers. I think there already is a wholesale exemption for recognised news publishers. The area where the Government have said they are looking at going further is in relation to what some people call a temporary “must carry” provision, or a mandatory right of appeal for recognised news publishers. Can I just clarify that that is what you meant?

Owen Meredith: Yes. I think the issue is how that exemption will work in practice. I think that what the Government have said they are looking at and will bring forward does address how it will operate in practice.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you. Can I move on to the question that Kim Leadbeater asked a moment ago, and that a number of Members have raised? You very kindly said a moment ago that you thought that clause 50, which sets out the definition of “recognised news publisher”, works as drafted. I would like to test that a bit, because some witnesses have said that it is quite widely drawn, and suggested that it would be relatively easy for somebody to set themselves up in a manner that met the test laid out in clause 50. Given the criticism that we have heard a few times today and on Tuesday, can you just expand for the Committee why you think that is not the case?

Owen Meredith: As I alluded to earlier, it is a real challenge to set out this legal definition in a country that believes, rightly, in the freedom of the press as a fourth pillar of democracy. It is a huge challenge to start with, and therefore we have to set out criteria that cover the vast majority of news publishers but do not end up with a backdoor licensing system for the press, which I think we are all keen to avoid. I think it meets that criterion.

On the so-called bad actors seeking to abuse that, I have listened to and read some of the evidence that you have had from others—not extensively, I must say, due to other commitments this week—and I think that it would be very hard for someone to meet all those criteria as set out in order to take advantage of this. I think that, as Matt has said, there will clearly be tests and challenges to that over time. It will rightly be challenged in court or go through the usual judicial process.

Matt Rogerson: It seems to me that the whole Bill will be an iterative process. The internet will not suddenly become safe when the Bill receives Royal Assent, so there will be this process whereby guidance and case law are developed, in terms of what a newspaper is, against the criteria. There are exemptions for news publishers in a whole range of other laws that are perfectly workable. I think that Ofcom is perfectly well equipped to create guidance that enables it to be perfectly workable.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you. So you are categorically satisfied, despite the risks that we have heard articulated, that maleficent actors would not be able to set themselves up in such a way that they benefit from this exemption?

Matt Rogerson: Subject to the guidance developed by Ofcom, which we will be engaged in developing, I do think so. The other thing to bear in mind is that the platforms already have lists of trusted publishers. For example, Google has a list in relation to Google News—I think it has about 65,000 publishers—which it automates to push through Google News as trusted news publishers. Similarly, Facebook has a list of trusted news publishers that it uses as a signal for the Facebook newsfeed. So I do not buy the idea that you can’t automate the use of trusted news sources within those products.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you; that is very helpful. I have only one other question. In relation to questions concerning freedom of speech, the Government believe, and I believe, that the Bill very powerfully protects freedom of speech. Indeed, it does so explicitly through clause 19, in addition to the protections for recognised news publishers that we have discussed already and the additional protections for content of journalistic and democratic importance, notwithstanding the definitional questions that have been raised. Would you agree that this Bill respects and protects free speech, while also delivering the safety objectives that it quite rightly has?

Owen Meredith: If I can speak to the point that directly relates to my members and those I represent, which is “Does it protect press freedom?”, which is perhaps an extension of your question, I would say that it is seeking to. Given the assurances you have given about the detailed amendments that you intend to bring forward—if those are correct, and I am very happy to write to the Committee and comment once we have seen the detail, if it would be helpful to do so—and everything I have heard about what you are intending to do, I believe it will. But I do not believe that the current draft properly and adequately protects press freedom, which is why, I think, you will be bringing forward amendments.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Yes, but with the amendment committed to on Second Reading, you would say that the Bill does meet those freedom of speech objectives, subject to the detail.

Owen Meredith: Subject to seeing the drafting, but I believe the intention—yes.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Thank you. That is very helpful. Mr Rogerson?

Matt Rogerson: As we know, this is a world first: regulation of the internet, regulation of speech acts on the internet. From a news publisher perspective, I think all the principles are right in terms of what the Government are trying to do. In terms of free speech more broadly, a lot of it will come down to how the platforms implement the Bill in practice. Only time will tell in terms of the guidance that Ofcom develops and how the platforms implement that at vast scale. That is when we will see what impact the Bill actually has in practice.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q From a general free speech perspective—which obviously includes the press’s freedom of speech, but everybody else’s as well—what do you think about the right enshrined in clause 19(2), where for the first time ever the platforms’ duty to have regard to the importance of protecting users’ right to freedom of speech is put on the face of a Bill? Do you think that is helpful? It is a legal obligation they do not currently have, but they will have it after the passage of the Bill. In relation to “legal but harmful” duties, platforms will also have an obligation to be consistent in the application of their own terms and conditions, which they do not have to be at the moment. Very often, they are not consistent; very often, they are arbitrary. Do you think those two changes will help general freedom of speech?

Matt Rogerson: Yes. With the development of the online platforms to the dominant position they are in today, that will be a big step forward. The only thing I would add is that, as well as this Bill, the other Bill that will make a massive difference when it comes through is the digital markets unit Bill. We need competition to Facebook so that consumers have a choice and so that they can decide which social network they want to be on, not just the one dominant social network that is available to them in this country.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I commend your ingenuity in levering an appeal for more digital competition into this discussion. Thank you.

None Portrait The Chair
- Hansard -

One final quick question from the Opposition Front Bench.

--- Later in debate ---
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

Q I thank the witnesses for coming. In terms of regulation, I was going to ask whether you believe that Ofcom is the most suitable regulator to operate in this area. You have almost alluded to the fact that you might not. On that basis, should we specify in the Bill a duty for Ofcom to co-operate with other regulators—for example, the Competition and Markets Authority, the Financial Conduct Authority, Action Fraud or whoever else?

Tim Fassam: I believe that would be helpful. I think Ofcom is the right organisation to manage the relationship with the platforms, because it is going to be much broader than the topics we are talking about in our session, but we do think the FCA, Action Fraud and potentially the CMA should be able to direct, and be very clear with Ofcom, that action needs to be taken. Ofcom should have the ability to ask for things to be reviewed to see whether they break the rules.

The other area where we think action probably needs to be taken is where firms are under investigation, because the Bill assumes it is clear cut whether something is fraud, a scam, a breach of the regulations or not. In some circumstances, that can take six months or a year to establish through investigation. We believe that if, for example, the FCA feels that something is high risk, it should be able to ask Ofcom to suspend an advert, or a firm from advertising, pending an investigation to assess whether it is a breach of the regulation.

Rocio Concha: I agree that Ofcom is the right regulator, the main regulator, but it needs to work with the other regulators—with the FCA, ASA and CMA—to enforce the Bill effectively. There is another area. Basically, we need to make sure that Ofcom and all the regulators involved have the right resources. When the initial version of the Bill was published, Ofcom got additional resources to enable it to enforce the Bill. But the Bill has increased in scope, because now it includes fraud and fraudulent advertising. We need to make sure that Ofcom has the right resources to enforce the full Bill effectively. That is something that the Government really need to consider.

Martin Lewis: I was going to make exactly that point, but it has just been made brilliantly so I will not waste your time.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q I thank the witnesses for joining us this afternoon, and particularly Martin Lewis for his campaigning in this area.

I will start by agreeing with the point that Martin Lewis made a minute or two ago—that we cannot trust these companies to work on their own. Mr Lewis, I am not sure whether you have had a chance to go through clause 34, which we inserted into the Bill following your evidence to the Joint Committee last year. It imposes a duty on these companies to take steps and implement systems to

“prevent individuals from encountering content consisting of fraudulent advertisements”.

There is a clear duty to stop them from doing this, rather as you were asking a minute ago when you described the presentation. Does that strong requirement in clause 34, to stop individuals from encountering fraudulent advertisement content, meet the objective that you were asking for last year?

Martin Lewis: Let me start by saying that I am very grateful that you have put it in there and thankful that the Government have listened to our campaign. What I am about to say is not intended as criticism.

It is very difficult to know how this will work in practice. The issue is all about thresholds. How many scam adverts can we stomach? I still have, daily—even from the platform that I sued, never mind the others—tens of reports directly to me of scam adverts with my face on. Even though there is a promise that they will try to mitigate that, the companies are not doing it. We have to have a legitimate understanding that we are not going to have zero scam adverts on these platforms; unless they were to pre-vet, which I do not think they will, the way they operate means that will not happen.

I am not a lawyer but my concern is that the Bill should make it clear, and that any interpretation of the Bill from Ofcom should be clear, about exactly what threshold of scam adverts is acceptable—we know that they are going to happen—and what threshold is not acceptable. I do not have the expertise to answer your question; I have to rely on your expertise to do that. But I ask the Committee to think properly about what the threshold level should be.

What is and is not acceptable? What counts as “doing everything they can”? They are going to get big lawyers involved if you say there must be zero scam adverts—that is not going to happen. How many scam adverts are acceptable and how many are not? I am so sorry to throw that back as a question when I am a witness, but I do not have the expertise to answer. But that is my concern: I am not 100% convinced of the threshold level that you are setting.

None Portrait The Chair
- Hansard -

Q Mr Fassam, do you have the answer?

Tim Fassam: I think we are positive about the actions that have been taken regarding social media; our concern is that the clause is not applied to search and that it excludes paid-for ads that are also user-generated content—promoted tweets or promoted posts, for example. We would want to ensure that that applied to all paid-for adverts and that it was consistent between social media and search.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Mr Fassam, I will address those two questions, if I may. Search is covered by clause 35, and user-generated content is subject to the Bill’s general provisions on user-generated content. Included in the scope of those are the priority illegal offences defined in schedule 7. Among those, on page 185—not that I expect you to have memorised the Bill—are financial services offences, including a number to do with pretending to carry out regulated financial activity when in fact you are not regulated. Also included are the fraud offences—the various offences under the Fraud Act 2006. Do come back if you think I have this wrong, but I believe that we have search covered in clause 35 and promoted user-generated content covered via schedule 7, on page 185.

Tim Fassam: You absolutely do, but to a weaker standard than in clause 34.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q In clause 35 there is the drafting point that we are looking at. It says “minimise the risk” instead of “prevent”. You are right to point out that drafting issue. In relation to the user-generated stuff, there is a duty on the platforms to proactively stop priority illegal content, as defined in schedule 7. I do take your drafting point on clause 35.

Tim Fassam: Thank you.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q I want to pick up on Martin Lewis’s point about enforcement. He said that he had to sue Facebook himself, which was no doubt an onerous, painful and costly enterprise—at least costly initially, because hopefully you got your expenses back. Under the Bill, enforcement will fall to Ofcom. The penalties that social media firms could be handed by Ofcom for failing to meet the duties we have discussed include a fine amounting to 10% of global revenue as a maximum, which runs into billions of pounds. Do the witnesses feel that level of sanction—10% of global revenue and ultimately denial of service—is adequately punitive? Will it provide an adequate deterrent to the social media firms that we are considering?

None Portrait The Chair
- Hansard -

Mr Lewis, as you were named, I think you had better start.

Martin Lewis: Ten per cent. of the global revenue of a major social media or search player is a lot of money—it certainly would hit them in the pocket. I reiterate my previous point: it is all about the threshold at which that comes in and how rigidly Ofcom is enforcing it. There are very few organisations that have the resources, legally, to take on big institutions of state, regulators and Governments. If any does, it is the gigantic tech firms. Absolutely, 10% of global revenue sounds like a suitable wall to prevent them jumping over. That is the aim, because we want those companies to work for people; we don’t want them to do scam ads. We want them to work well, and we want them never to be fined because there is no reason to fine them.

The proof of the pudding will be in how robust Ofcom feels it can be, off the back of the Bill, taking those companies on. I go back to needing to understand how many scam ads you permit under the duty to prevent scam ads. It clearly is not zero—you are not going to tell me it is zero. So how many are allowed, what are the protocols that come into place and how quickly do they have to take the ads down? Ultimately, I think that is going to be a decision for Ofcom, but it is the level of stringency that you put on Ofcom in order for it to interpret how it takes that decision that is going to decide whether this works or not.

Rocio Concha: I completely agree with Martin. Ofcom needs to have the right resources in order to monitor how the platforms are doing that, and it needs to have the right powers. At the moment, Ofcom can ask for information in a number of areas, including fraud, but not advertising. We need to make sure that Ofcom can ask for that information so that it can monitor what the platforms are doing. We need to make sure that it has the right powers and the right resources to enforce the Bill effectively.

Tim Fassam: You would hope that 10% would certainly be a significant disincentive. Our focus would be on whether companies are contributing to compensating the victims of fraud and scams, and whether they have been brought into the architecture that is utilised to compensate victims of fraud and scams. That would be the right aim in terms of financial consequences for the firms.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q I have one final question that again relates to the question of reporting scams, which I think two or three witnesses have referred to. I will briefly outline the provisions in the Bill that address that. I would like to ask the witnesses if they think those provisions are adequate. First, in clause 18, the Bill imposes on large social media firms an obligation to have a proper complaints procedure so that complaints are not ignored, as appears to happen on a shockingly frequent basis. That is at the level of individual complaints. Of course, if social media firms do not do that, it will be for Ofcom to enforce against them.

Secondly, clauses 140 and 141 contain a procedure for so-called super-complaints, where a body that represents users—it could be Which? or an organisation like it—is able to bring something almost like a class action or group complaint to Ofcom if it thinks a particular social media firm has systemic problems. Will those two clauses address the issue of complaints not being properly handled or, in some cases, not being dealt with at all?

Martin Lewis: Everything helps. I think the super-complaint point is really important. We must remember that many victims of scams are not so good at complaining and, by the nature of the crossover of individuals, there is a huge mental health issue at stake with scams. There is both the impact on people with mental health issues and the impact on people’s mental health of being scammed, which means that they may not be as robust and up for the fight or for complaining. As long as it works and applies to all the different categories that are represented here, the super-complaint status is a good measure.

We absolutely need proper reporting lines. I urge you, Minister—I am not sure that this is in the Bill—to standardise this so that we can talk about what someone should do when they report: the same imagery, the same button. With that, people will know what to do. The more we can do that, the easier and better the system will be.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q That is a really important point—you made it earlier—about the complaints process being hidden. Clause 18(2)(c) says that the complaints system must be

“easy to access, easy to use (including by children) and transparent.”

The previous paragraph, (b), requires a system that

“provides for appropriate action to be taken by the provider of the service in response to complaints of a relevant kind”.

The Bill is saying that a complaints process must do those two things, because if it does not, Ofcom will be on the company’s back.

Martin Lewis: I absolutely support all of that. I am just pushing for that tiny bit more leadership, whether it is from you or Ofcom, that comes up with a standardised system with standardised imagery and placing, so that everybody knows that on the top left of the advert you have the button that you click to fill in a form to report it. The more we have that cross-platform and cross-search and cross-social media, the easier it will be for people. I am not sure it is a position for the Bill in itself, but Government leadership would work really well on that.

Tim Fassam: They are both welcome—the super-complaint and the new complaints process. We want to ensure that we have a system that looks not just at weight of number of complaints, but at the content. In particular, you may find on the super-complaint point that, for example, the firm that a fraudster is pretending to be is the organisation that has the best grasp of the issue, so do not forget about commercial organisations as well as consumer organisations when thinking about who is appropriate to make super-complaints.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Well, your organisation, as one that represents firms in this space, could in fact be designated as a super-complainant to represent your members, as much as someone like Which? could be designated to represent the man on the street like you or me.

Tim Fassam: Absolutely. We suggested to Meta when we met them about 18 months ago that we could be a clearing house to identify for them whether they need to take something seriously, because our members have analysed it and consider it to represent a real risk.

None Portrait The Chair
- Hansard -

Last word to Rocio Concha.

Rocio Concha: I completely agree about the super-complaint. We as a consumer organisation have super-complaint powers. As with other regulators, we would like to have it in this context as well. We have done many super-complaints representing consumers in particular areas with the regulators, so I think we need it in this Bill as well.

On reporting, I want to clarify something. At the moment, the Bill does not have a requirement for users to be able to complain and report to platforms in relation to fraudulent advertising. It happens for priority illegal content, but our assessment of the Bill is that it is unclear whether it applies to fraudulent advertising. We probably do not have time to look at this now, but we sent you amendments on the areas where we thought the Bill had weaknesses. We agree with you that users should have an easy and transparent way to report illegal or fraudulent advertising, and they should have an easy way to complain about it. At the moment, it is not clear that the Bill will require that for fraudulent advertising.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Yes, that is a very good question. Please do write to us about that. Clause 140, on super-complaints, refers to “regulated services”. My very quick, off-the-cuff interpretation is that that would include everything covered and regulated by the Bill. I notice that there is a reference to user-to-user services in clause 18. Do write to us on that point. We would be happy to look at it in detail. Do not take my comment as definitive, because I have only just looked at it in the last 20 seconds.

Rocio Concha: My comment was in relation not to the super-complaints but to the requirements. We already sent you our comments with suggestions on how you can fix this in the Bill.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am very grateful. Thank you.

None Portrait The Chair
- Hansard -

Ms Concha and Mr Fassam, thank you very much. Do please write in if you have further comments. Mr Lewis, we are deeply grateful to you. You can now go back to your day job and tell us whether we are going to be worse or better off as a result of the statement today—please don’t answer that now.

Martin Lewis: I am interviewing the Chancellor in 15 minutes.

--- Later in debate ---
Caroline Ansell Portrait Caroline Ansell
- Hansard - - - Excerpts

Thank you. That is very helpful.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Thank you for joining us and giving evidence, Frances; it is nice to see you again. We had evidence from Meta, your former employer, on Tuesday, in which its representative suggested that it engages in open and constructive co-operation with researchers. Do you think that testimony was true?

Frances Haugen: I think that shows a commendable level of chutzpah. Researchers have been trying to get really basic datasets out of Facebook for years. When I talk about a basic dataset, it is things as simple as, “Just show us the top 10,000 links that are distributed in any given week.” When you ask for information like that in a country like the United States, no one’s privacy is violated: every one of those links will have been viewed by hundreds of thousands, if not millions of people. Facebook will not give out even basic data like that, even though hundreds if not thousands of academics have begged for this data.

The idea that they have worked in close co-operation with researchers is a farce. The only way that they are going to give us even the most basic data that we need to keep ourselves safe is if it is mandated in the Bill. We need to not wait two years after the Bill passes—and remember, it does not even say that it will happen; Ofcom might say, “Oh, maybe not.” We need to take a page from the Digital Services Act and say, “On the day that the Bill passes, we get access to data,” or, at worst, “Within three months, we are going to figure out how to do it.” It needs to be not, “Should we do it?” but “How will we do it?”

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q When I was asking questions on Tuesday, the representative of Meta made a second claim that raised my eyebrow. He claimed that, in designing its algorithms, it did not primarily seek to optimise for engagement. Do you think that was true?

Frances Haugen: First, I left the company a year ago. Because we have no transparency with these companies, they do not have to publish their algorithms or the consequences of their algorithms, so who knows? Maybe they use astrology now to rank the content. We have no idea. All I know is that Meta definitely still uses signals—did users click on it, did they dwell on it, did they re-share it, or did they put a comment on it? There is no way it is not using those. It is very unlikely that they do not still use engagement in their ranking.

The secondary question is, do they optimise for engagement? Are they trying to maximise it? It is possible that they might interpret that and say, “No, we have multiple things we optimise for,” because that is true. They look at multiple metrics every single time they try to decide whether or not to shift things. But I think it is very likely that they are still trying to optimise for engagement, either as their top metric or as one of their top metrics.

Remember, Meta is not trying to optimise for engagement to keep you there as long as possible; it is optimising for engagement to get you and your friends to produce as much content as possible, because without content production, there can be no content consumption. So that is another thing. They might say, “No, we are optimising for content production, not engagement,” but that is one step off.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q The Bill contains provisions that require companies to do risk assessments that cover their algorithms, and then to be transparent about those risk assessments with Ofcom. Do you think those provisions will deliver the change required in the approach that the companies take?

Frances Haugen: I have a feeling that there is going to be a period of growing pains after the first time these risk assessments happen. I can almost entirely guarantee you that Facebook will try to give you very little. It will likely be a process of back and forth with the regulator, where you are going to have to have very specific standards for the level of transparency, because Facebook is always going to try to give you the least possible.

One of the things that I am actually quite scared about is that, in things like the Digital Services Act, penalties go up to 10% of global profits. Facebook as a company has something like 35% profit margins. One of the things I fear is that these reports may be so damning—that we have such strong opinions after we see the real, hard consequences of what they are doing—that Facebook might say, “This isn’t worth the risk. We’re just going to give you 10% of our profits.” That is one of the things I worry about: that they may just say, “Okay, now we’re 25% profitable instead of 35% profitable. We’re that ashamed.”

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Q Let me offer a word of reassurance on that. In this Bill, the penalties are up to 10% of global revenue, not profit. Secondly, in relation to the provision of information to Ofcom, there is personal criminal liability for named executives, with a period of incarceration of up to two years, for the reason you mentioned.

Frances Haugen: Oh, good. That’s wonderful.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

We had a case last year where Facebook—it was actually Facebook—failed to provide some information to the CMA in a takeover case, and it paid a £50 million fine rather than provide the information, hence the provision for personal criminal liability for failing to provide information that is now in this Bill.

My final question is a simple one. From your perspective, at the moment, when online tech companies are making product design decisions, what priority do they give to safety versus profit?

Frances Haugen: What I saw when I was at Facebook was that there was a culture that encouraged people to always have the most positive interpretation of things. If things are still the same as when I left—like I said, I do not know; I left last May—what I saw was that people routinely had to weigh little changes in growth versus changes in safety metrics, and unless they were major changes in safety metrics, they would continue to pursue growth. The only problem with a strategy like that is that those little deficits add up to very large harms over time, so we must have mandated transparency. The public have to have access to data, because unless Facebook has to add the public cost of the harm of its products, it is not going to prioritise enough those little incremental harms as they add up.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Thank you very much.

None Portrait The Chair
- Hansard -

Ms Haugen, thank you very much indeed for joining us today, and thank you also for the candour with which you have answered your questions. We are very grateful to you indeed.

The Committee will meet again on Tuesday 7 June at 9.25 am for the start of its line-by-line consideration of the Bill. That session will be in Committee Room 14.

Ordered, That further consideration be now adjourned. —(Steve Double.)

Online Safety Bill (Fifth sitting)

(Limited Text - Ministerial Extracts only)

Read Full debate
Committee stage
Tuesday 7th June 2022

(2 years, 6 months ago)

Public Bill Committees
Online Safety Act 2023
Amendment Paper: Public Bill Committee Amendments as at 7 June 2022

This text is a record of ministerial contributions to a debate held as part of the Online Safety Act 2023’s passage through Parliament.

In 1993, the House of Lords decision in Pepper v Hart provided that statements made by Government Ministers may be taken as illustrative of legislative intent as to the interpretation of law.

This extract highlights statements made by Government Ministers along with contextual remarks by other members. The full debate can be read here

This information is provided by Parallel Parliament and does not form part of the official record

None Portrait The Chair
- Hansard -

Good morning, ladies and gentlemen. If anybody wishes to take their jacket off, they are at liberty to do so when I am in the Chair—my co-Chairman is joining us, and I am sure she will adopt the same procedure. I have a couple of preliminary announcements. Please make sure that all mobile phones are switched off. Tea and coffee are not allowed in the Committee, I am afraid. I think they used to be available outside in the corridor, but I do not know whether that is still the case.

We now start line-by-line consideration of the Bill. The selection and grouping list for the sitting is available on the table in the room for anybody who does not have it. It shows how the clauses and selected amendments have been grouped for debate. Grouped amendments are generally on the same subject or a similar issue.

Now for a slight tutorial to remind me and anybody else who is interested, including anybody who perhaps has not engaged in this arcane procedure before, of the proceedings. Each group has a lead amendment, and that amendment is moved first. The other grouped amendments may be moved later, but they are not necessarily voted on at that point, because some of them relate to matters that appear later in the Bill. Do not panic; that does not mean that we have forgotten them, but that we will vote on them—if anybody wants to press them to a Division—when they are reached in order in the Bill. However, if you are in any doubt and feel that we have missed something—occasionally I do; the Clerks never do—just let us know. I am relaxed about this, so if anybody wants to ask a question about anything that they do not understand, please interrupt and ask, and we will endeavour to confuse you further.

The Member who has put their name to the lead amendment, and only the lead amendment, is usually called to speak first. At the end of the debate, the Minister will wind up, and the mover of the lead amendment—that might be the Minister if it is a Government amendment, or it might be an Opposition Member—will indicate whether they want a vote on that amendment. We deal with that first, then we deal with everything else in the order in which it arises. I hope all that is clear, but as I said, if there are any questions, please interrupt and ask.

We start consideration of the Bill with clause 1, to which there are no amendments. Usually, the Minister would wind up at the end of each debate, but as there are no amendments to clause 1, the Minister has indicated that he would like to say a few words about the clause.

Clause 1

Overview of Act

Question proposed, That the clause stand part of the Bill.

Chris Philp Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)
- Hansard - - - Excerpts

Thank you, Sir Roger; it is a pleasure to serve under your chairmanship once again. It may be appropriate to take this opportunity to congratulate my right hon. Friend the Member for Basingstoke on her damehood in the Queen’s birthday honours, which was very well deserved indeed.

This simple clause provides a high-level overview of the different parts of the Bill and how they come together to form the legislation.

None Portrait The Chair
- Hansard -

The Minister was completely out of order in congratulating the right hon. Lady, but I concur with him. I call the shadow Minister.

--- Later in debate ---
Dan Carden Portrait Dan Carden
- Hansard - - - Excerpts

This part of the Bill deals with the definitions of services and which services would be exempt. I consider myself a millennial; most people my age or older are Facebook and Twitter users, and people a couple of years younger might use TikTok and other services. The way in which the online space is used by different generations, particularly by young people, changes rapidly. Given the definitions in the Bill, how does the Minister intend to keep pace with the changing ways in which people communicate? Most online games now allow interaction between users in different places, which was not the case a few years ago. Understanding how the Government intend the Bill to keep up with such changes is important. Will the Minister tell us about that?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Let me briefly speak to the purpose of these clauses and then respond to some of the points made in the debate.

As the shadow Minister, the hon. Member for Pontypridd, touched on, clauses 2 and 3 define some of the key terms in the Bill, including “user-to-user services” and “search services”—key definitions that the rest of the Bill builds on. As she said, schedule 1 and clause 4 contain specific exemptions where we believe the services concerned present very low risk of harm. Schedule 2 sets out exemptions relating to the new duties that apply to commercial providers of pornography. I thank the shadow Minister and my right hon. Friend the Member for Basingstoke for noting the fact that the Government have substantially expanded the scope of the Bill to now include commercial pornography, in response to widespread feedback from Members of Parliament across the House and the various Committees that scrutinised the Bill.

The shadow Minister is quite right to say that the range of platforms to which the Bill applies is very wide. [Interruption.] Bless you—or bless my hon. Friend the Member for North West Durham, I should say, Sir Roger, although he is near sanctified already. As I was saying, we are necessarily trying to protect UK users, and with many of these platforms not located in the UK, we are seeking to apply these duties to those companies as well as ones that are domestically located. When we come to discuss the enforcement powers, I hope the Committee will see that those powers are very strong.

The shadow Minister, the hon. Member for Liverpool, Walton, and others asked about future technologies and whether the Bill will accommodate technologies that we cannot even imagine today. The metaverse is a good example: it did not exist when the Bill was first contemplated and the White Paper produced. Actually, I think Snapchat did not exist when the White Paper that preceded the Bill was first conceived. For that reason, the Bill is tech agnostic. We do not talk about specific technologies; we talk about the duties that apply to companies and the harms they are obligated to prevent.

The whole Bill is tech agnostic because we as parliamentarians today cannot anticipate future developments. When those future developments arise, as they inevitably will, the duties under the Bill will apply to them as well. The metaverse is a good example, because even though it did not exist when the structure of the Bill was conceived, anything happening in the metaverse is none the less covered by the Bill. Anything that happens in the metaverse that is illegal or harmful to children, falls into the category of legal but harmful to adults, or indeed constitutes pornography will be covered because the Bill is tech agnostic. That is an extremely important point to make.

The hon. Member for Aberdeen North asked about gaming. Parents are concerned because lots of children, including quite young children, use games. My own son has started playing Minecraft even though he is very young. To the extent that those games have user-to-user features—for example, user-to-user messaging, particularly where those messages can be sent widely and publicly—those user-to-user components are within the scope of the Bill.

The hon. Member for Aberdeen North also asked about the App Store. I will respond quickly to her question now rather than later, to avoid leaving the Committee in a state of tingling anticipation and suspense. The App Store, or app stores generally, are not in the scope of the Bill, because they are not providing, for example, user-to-user services, and the functionality they provide to basically buy apps does not count as a search service. However, any app that is purchased in an app store, to the extent that it has either search functionality, user-to-user functionality or purveys or conveys pornography, is in scope. If an app that is sold on one of these app stores turns out to provide a service that breaks the terms of the Bill, that app will be subject to regulatory enforcement directly by Ofcom.

The hon. Members for Aberdeen North and for Liverpool, Walton touched on media literacy, noting that there has been a change to the Bill since the previous version. We will probably debate this later, so I will be brief. The Government published a media literacy strategy, backed by funding, to address this point. It was launched about a year ago. Ofcom also has existing statutory duties—arising under the Communications Act 2003, I believe. The critical change made since the previous draft of the Bill—it was made in December last year, I believe—is that Ofcom published an updated set of policy intentions around media literacy that went even further than we had previously intended. That is the landscape around media literacy.

--- Later in debate ---
Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

I am sure we will discuss this topic a bit more as the Bill progresses.

I will make a few points on disinformation. The first is that, non-legislatively, the Government have a counter-disinformation unit, which sits within the Department for Digital, Culture, Media and Sport. It basically scans for disinformation incidents. For the past two years it has been primarily covid-focused, but in the last three or four months it has been primarily Russia/Ukraine-focused. When it identifies disinformation being spread on social media platforms, the unit works actively with the platforms to get it taken down. In the course of the Russia-Ukraine conflict, and as a result of the work of that unit, I have personally called in some of the platforms to complain about the stuff they have left up. I did not have a chance to make this point in the evidence session, but when the person from Twitter came to see us, I said that there was some content on Russian embassy Twitter accounts that, in my view, was blatant disinformation—denial of the atrocities that have been committed in Bucha. Twitter had allowed it to stay up, which I thought was wrong. Twitter often takes down such content, but in that example, wrongly and sadly, it did not. We are doing that work operationally.

Secondly, to the extent that disinformation can cause harm to an individual, which I suspect includes a lot of covid disinformation—drinking bleach is clearly not very good for people—that would fall under the terms of the legal but harmful provisions in the Bill.

Thirdly, when it comes to state-sponsored disinformation of the kind that we know Russia engages in on an industrial scale via the St Petersburg Internet Research Agency and elsewhere, the Home Office has introduced the National Security Bill—in fact, it had its Second Reading yesterday afternoon, when some of us were slightly distracted. One of the provisions in that Bill is a foreign interference offence. It is worth reading, because it is very widely drawn and it criminalises foreign interference, which includes disinformation. I suggest the Committee has a look at the foreign interference offence in the National Security Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I am grateful for the Minister’s intervention in bringing in the platforms to discuss disinformation put out by hostile nation states. Does he accept that if Russia Today had put out some of that disinformation, the platforms would be unable to take such content down as a result of the journalistic exemption in the Bill?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

We will no doubt discuss in due course clauses 15 and 50, which are the two that I think the shadow Minister alludes to. If a platform is exempt from the duties of the Bill owing to its qualification as a recognised news publisher under clause 50, it removes the obligation to act under the Bill, but it does not prevent action. Social media platforms can still choose to act. Also, it is not a totally straightforward matter to qualify as a recognised news publisher under clause 50. We saw the effect of sanctions: when Russia Today was sanctioned, it was removed from many platforms as a result of the sanctioning process. There are measures outside the Bill, such as sanctions, that can help to address the shocking disinformation that Russia Today was pumping out.

The last point I want to pick up on was rightly raised by my right hon. Friend the Member for Basingstoke and the hon. Member for Aberdeen North. It concerns child sexual exploitation and abuse images, and particularly the ability of platforms to scan for those. Many images are detected as a result of scanning messages, and many paedophiles or potential paedophiles are arrested as a result of that scanning. We saw a terrible situation a little while ago, when—for a limited period, owing to a misconception of privacy laws—Meta, or Facebook, temporarily suspended scanning in the European Union; as a result, loads of images that would otherwise have been intercepted were not.

I agree with the hon. Member for Aberdeen North that privacy concerns, including end-to-end encryption, should not trump the ability of organisations to scan for child sexual exploitation and abuse images. Speaking as a parent—I know she is, too—there is, frankly, nothing more important than protecting children from sexual exploitation and abuse. Some provisions in clause 103 speak to this point, and I am sure we will debate those in more detail when we come to that clause. I mention clause 103 to put down a marker as the place to go for the issue being raised. I trust that I have responded to the points raised in the debate, and I commend the clause to the Committee.

Question put and agreed to.

Clause 2 accordingly ordered to stand part of the Bill.

Clause 3 ordered to stand part of the Bill.

Schedules 1 and 2 agreed to.

Clause 4 ordered to stand part of the Bill.

None Portrait The Chair
- Hansard -

Before we move on, we have raised the issue of the live feed. The audio will be online later today. There is a problem with the feed—it is reaching the broadcasters, but it is not being broadcast at the moment.

As we are not certain we can sort out the technicalities between now and this afternoon, the Committee will move to Committee Room 9 for this afternoon’s sitting to ensure that the live stream is available. Mr Double, if Mr Russell intends to be present—he may not; that is up to you—it would be helpful if you would let him know. Ms Blackman, if John Nicolson intends to be present this afternoon, would you please tell him that Committee Room 9 will be used?

It would normally be possible to leave papers and other bits and pieces in the room, because it is usually locked between the morning and afternoon sittings. Clearly, because we are moving rooms, you will all need to take your papers and laptops with you.

Clause 5

Overview of Part 3

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Dan Carden Portrait Dan Carden
- Hansard - - - Excerpts

I want to add my voice to the calls for ways to monitor the success or failure of this legislation. We are starting from a position of self-regulation where companies write the rules and regulate themselves. It is right that we are improving on that, but with it comes further concerns around the powers of the Secretary of State and the effectiveness of Ofcom. As the issues are fundamental to freedom of speech and expression, and to the protection of vulnerable and young people, will the Minister consider how we better monitor whether the legislation does what it says on the tin?

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

Clause 5 simply provides an overview of part 3 of the Bill. Several good points have been raised in the course of this discussion. I will defer replying to the substance of a number of them until we come to the relevant clause, but I will address two or three of them now.

The shadow Minister said that the Bill is complex, and she is right; it is 193-odd clauses long and a world-leading piece of legislation. The duties that we are imposing on social media firms and internet companies do not already exist; we have no precedent to build on. Most matters on which Parliament legislates have been considered and dealt with before, so we build on an existing body of legislation that has been built up over decades or, in some cases in the criminal law, over centuries. In this case, we are constructing a new legislative edifice from the ground up. Nothing precedes this piece of legislation—we are creating anew—and the task is necessarily complicated by virtue of its novelty. However, I think we have tried to frame the Bill in a way that keeps it as straightforward and as future-proof as possible.

The shadow Minister is right to point to the codes of practice as the source of practical guidance to the public and to social media firms on how the obligations operate in practice. We are working with Ofcom to ensure that those codes of practice are published as quickly as possible and, where possible, prepared in parallel with the passage of the legislation. That is one reason why we have provided £88 million of up-front funding to Ofcom in the current and next financial years: to give it the financial resources to do precisely that.

My officials have just confirmed that my recollection of the Ofcom evidence session on the morning of Tuesday 24 May was correct: Ofcom confirmed to the Committee that it will publish, before the summer, what it described as a “road map” providing details on the timing of when and how those codes of practice will be created. I am sure that Ofcom is listening to our proceedings and will hear the views of the Committee and of the Government. We would like those codes of practice to be prepared and introduced as quickly as possible, and we certainly provided Ofcom with the resources to do precisely that.

There was a question about the Scottish offences and, I suppose, about the Northern Irish offences as well—we do not want to forget any part of the United Kingdom.

Chris Philp Portrait Chris Philp
- Hansard - - - Excerpts

We are in agreement on that. I can confirm that the Government have tabled amendments 116 to 126—the Committee will consider them in due course—to place equivalent Scottish offences, which the hon. Member for Aberdeen North asked about, in the Bill. We have done that in close consultation with the Scottish Government to ensure that the relevant Scottish offences equivalent to the England and Wales offences are inserted into the Bill. If the Scottish Parliament creates any new Scottish offences that should be inserted into the legislation, that can be done under schedule 7 by way of statutory instrument. I hope that answers the question.

The other question to which I will briefly reply was about parliamentary scrutiny. The Bill already contains a standard mechanism that provides for the Bill to be reviewed after a two to five-year period. That provision appears at the end of the Bill, as we would expect. Of course, there are the usual parliamentary mechanisms—Backbench Business debates, Westminster Hall debates and so on—as well as the DCMS Committee.

I heard the points about a standing Joint Committee. Obviously, I am mindful of the excellent prelegislative scrutiny work done by the previous Joint Committee of the Commons and the Lords. Equally, I am mindful that standing Joint Committees, outside the regular Select Committee structure, are unusual. The only two that spring immediately to mind are the Intelligence and Security Committee, which is established by statute, and the Joint Committee on Human Rights, chaired by the right hon. and learned Member for Camberwell and Peckham (Ms Harman), which is established by Standing Orders of the House. I am afraid I am not in a position to make a definitive statement about the Government’s position on this. It is of course always open to the House to regulate its own business. There is nothing I can say today from a Government point of view, but I know that hon. Members’ points have been heard by my colleagues in Government.

We have gone somewhat beyond the scope of clause 5. You have been extremely generous, Sir Roger, in allowing me to respond to such a wide range of points. I commend clause 5 to the Committee.

Question put and agreed to.

Clause 5 accordingly ordered to stand part of the Bill.

Clause 6

Providers of user-to-user services: duties of care

None Portrait The Chair
- Hansard -

Before we proceed, perhaps this is the moment to explain what should happen and what is probably going to happen. Ordinarily, a clause is taken with amendments. This Chairman takes a fairly relaxed view of stand part debates. Sometimes it is convenient to have a very broad-ranging debate on the first group of amendments because it covers matters relating to the whole clause. The Chairman would then normally say, “Well, you’ve already had your stand part debate, so I’m not going to allow a further stand part debate.” It is up to hon. Members to decide whether to confine themselves to the amendment under discussion and then have a further stand part debate, or whether to go free range, in which case the Chairman would almost certainly say, “You can’t have a stand part debate as well. You can’t have two bites of the cherry.”

This is slightly more complex. It is