All 2 Lord Browne of Belmont contributions to the Online Safety Act 2023


Online Safety Bill
Lords Chamber
Wed 1st Feb 2023

Lord Browne of Belmont Excerpts

Lord Browne of Belmont (DUP)

My Lords, it is beyond any doubt that an Online Safety Bill is needed. The internet has been left uncontrolled and unfettered for too long. While the Bill is indeed welcome, it is clear that more work needs to be done to ensure that it adequately protects children online.

There is a substantial body of evidence that exposure to pornography is harmful to children and young people. Many have spoken in this debate already about the harm of easy access to pornography, which is carried into adult life and has a damaging impact on young people’s views of sex and relationships. For many young men, an addiction to pornography that starts in the teenage years can lead to the belief that women should be dehumanised and objectified. Pornography is becoming a young person’s main reference point for sex, with no conversation about important issues such as consent. That is why the Bill needs proper and robust age verification measures to ensure that children cannot access online pornography and are protected from the obvious harms.

Even if the Bill is enacted with robust age verification, experience tells us that this is no guarantee that age verification will be implemented. Parliament passed Part 3 of the Digital Economy Act in 2017, yet the Government chose not to implement the will of this House. That cannot be allowed to be repeated. Not only must robust age verification be in the Bill, but a commencement date must be added to it to ensure that what happened in the past cannot happen again.

I know that some Members of the House are still fearful that age verification presents an insurmountable threat to privacy: that those who choose to view pornography will have to provide their ID documents to those sites and that their interests may be tracked and exposed or used for blackmail purposes. We live in an age where there is little that technology cannot deliver. Verifying your age without disclosing who you are is not a complex problem. Indeed, it has been central to the age verification industry since it first began to prepare for the Digital Economy Act, because neither consumers nor the sites they access would risk working with an age verification provider that could not offer strong reassurance and protection for privacy.

The age verification sector is built on privacy by design and data minimisation principles, which are at the heart of our data protection law. The solutions are created on what the industry calls a double-blind basis. By this, I mean that the adult websites can never know the identity of their users, and the age verification providers do not keep any records of which sites ask them to confirm the age of any particular user. To use the technical terms, it is an anonymised, tokenised solution.

The Government should include in the Bill provisions to ensure that robust age verification is put in place, along with a clear, time-limited commencement clause to ensure that, on this occasion, age verification is brought in and enforced. I support the Bill, but I trust that, as it makes its way through the House, its provisions can be strengthened.

Online Safety Bill
Lords Chamber
Tue 25th Apr 2023
Committee stage: Part 2

Lord Browne of Belmont Excerpts

These amendments, in my name and that of the noble Lord, Lord Morrow, will ensure that all pornographic content is regulated in the same way, at the same time, and that Part 5 can be brought into force more quickly to ensure all content is treated in the same way. I believe that was certainly the will of your Lordships at Second Reading. I look forward to hearing the Minister’s views on how this will be achieved. I beg to move.
Lord Browne of Belmont (DUP)

My Lords, first, I tender an apology from my noble friend Lord Morrow, whose name is attached to the amendments. Unfortunately, he is unable to participate in tonight’s debate, as he had to return home at very short notice. I will speak to the amendments in this group. I thank the noble Baroness, Lady Ritchie, and my noble friend Lord Morrow for tabling the amendments, which allow us to debate how the duties of Part 5 should apply to Part 3 services and to probe which sites Part 5 will cover once it is implemented.

The Government have devised a Bill which attempts carefully to navigate the regulation of several different types of service. I am sure that it will eventually become an exemplar emulated around the world, so I understand why there may be a general reluctance on the part of the Government to tamper with the Bill’s architecture. However, these amendments are designed to treat pornographic content as a clear exception wherever it is found online. This can be achieved because we already know the harm caused by pornography, and Part 5 already creates a duty to ensure that rigorous age verification is in place to stop children accessing it.

The Government recognised that the original drafting of the Bill would not address the unfinished business of Part 3 of the Digital Economy Act. In 2017, as many will recall, this House and the other place expressed the clear demand that online pornography should not be accessible to children. Part 5 of the Bill is the evolution of that 2017 debate, but, regrettably, it was bolted on belatedly after pre-legislative scrutiny. That bolt-on approach has had the unfortunate consequence of creating two separate regimes to deal with pornography. Part 5 applies only to “provider pornographic content”, which is content

“published or displayed on the service by the provider … or by a person acting on behalf of the provider”.

Clause 70 makes it clear:

“Pornographic content that is user-generated content … is not to be regarded as provider pornographic content”;


in other words, if pornography is on social media or the large tube sites, it falls under Part 3, not Part 5. That means that not all content will be regulated in the same way or at the same time.

Amendment 125A addresses an issue raised by this two-tier approach to regulation. Clause 49 defines “user-generated content” as content

“generated directly on the service by a user of the service, or … uploaded to or shared on the service by a user of the service, and … that may be encountered by another user”.

Encounter is defined broadly, meaning to

“read, view, hear or otherwise experience content”,

including adding “comments and reviews”. The inclusion of reviews makes this a broad definition. Does it include a like, an upvote or an emoji? That is an important question that Amendment 125A probes. On this basis, it seems that almost all of the most popular pornographic websites are user-to-user services and will therefore fall into Part 3.