Digital Economy Bill Debate
Committee: 2nd sitting (Hansard): House of Lords
Thursday 2nd February 2017

Lords Chamber
Amendment Paper: HL Bill 80-III Third marshalled list for Committee (2 Feb 2017)

Baroness Howe of Idlicote (CB)

My Lords, as I said at Second Reading, I am pleased to be speaking today on a subject that I have regularly brought before your Lordships’ House over recent years in my several online safety Bills—the importance of protecting children online, which is very much today’s subject. I have tabled my probing Amendment 55 so that the Government can set out their plans for which organisations will act as the age verification regulator for which sections of this Bill, as this is crucial to ensure the child protection provisions of Part 3 are successfully implemented.

My amendment would designate the British Board of Film Classification as the age verification regulator for the whole of this process. I know that many in this House and the other place were delighted to hear the Government’s announcement that the BBFC will be the notification regulator for Part 3, a position for which it will be formally designated later this year. I am sure we can all agree that it will bring a level of expertise to that role which will be really invaluable. The use of the term “notification” regulator, however, suggests that the BBFC will provide only part of the regulatory function and that another kind of regulator will have a role to play. Indeed, this was backed up by the BBFC, which stated in its evidence to the Public Bill Committee in another place that it did not intend to have any role in enforcement under Clauses 21 and 22 on fines and informing payment providers and ancillary service providers.

This same message is repeated in the Explanatory Notes:

“The BBFC is expected to be the regulator for the majority of the functions of the regulator (including issuing notices to ISPs to prevent access to material), but is not intended to take on the role of issuing financial penalties and enforcement notices to non-compliant websites”.


This raises an important question to which the Minister must now provide an answer. Who will be the other regulator? It is one thing not to have clarified this when the Bill was introduced. It is, however, quite another for the Bill to have passed entirely through one House and be well on its way through another without any update.

In asking this question, I should say that I was very pleased to see that the Government have said that the BBFC will assume the enforcement role in relation to Clause 23, which was introduced on Report in another place. This, however, still leaves questions about the enforcement regulator for Clauses 21 and 22 and how the enforcement regulator in these clauses will interact with the BBFC in its role as “notification” regulator.

In its report on the Bill, the Delegated Powers and Regulatory Reform Committee criticised the lack of information about the regulator, saying:

“The decision as to who to appoint as regulator should be taken before, not after, a Bill is introduced so that it can be fully scrutinised by Parliament. This is especially because the regulator will have important and significant powers conferred by Part 3 which include the ability to impose substantial civil penalties”.


Parliament should know who the enforcement regulator will be, since it will be able to impose these substantial penalties.

Your Lordships’ House should be informed how the enforcement regulator, assuming the Government are still planning on a second regulator, will operate with the BBFC in terms of the mechanics of deciding whether to issue a fine, an enforcement notice, or a notice to internet service providers to block certain sites, and how the two regulators will produce consistent guidance for this part of the Bill. Ofcom would be an obvious option for enforcement, but last November it made it clear that it does not want the role. I ask the Minister: who do the Government have in mind? When will he bring that information to the House? I look forward to hearing what he has to say.

Lord Paddick (LD)

My Lords, I apologise to the Committee for not taking part in Second Reading. Having led on the Investigatory Powers Bill and the Policing and Crime Bill I was hoping for some time off for good behaviour, but apparently a policeman’s lot is not a happy one, even when he has retired.

My noble friend Lord Clement-Jones and I have Amendment 55B in this group. The first thing to say is that we on these Benches believe everything that can be demonstrated to be effective should be done to restrict children’s access to adult material online. We also believe that everything should be done to ensure that adults can access websites that contain material that it is legal for them to view. That is why Amendment 55B would require the age-verification regulator to produce an annual report on how effective the measures in the Bill have been in general in reducing the number of children accessing adult material online, and how effective each enforcement mechanism has been. We also share the concerns expressed by the noble Baronesses, Lady Jones of Whitchurch and Lady Howe of Idlicote, that these provisions were introduced somewhat at the last minute and may not have been completely thought through.

The aims of the Bill and the other amendments in the group are laudable. The ideal that children should have the same protection online as they do offline is a good one, but it is almost impossible to achieve through enforcement alone. We have to be realistic about how relatively easy it is to prevent children accessing physical material sold on physical premises and how relatively difficult, if not impossible, it is to prevent determined children accessing online material, much of which is free. An increasing proportion of adult material is not commercially produced.

That is not to say that we should not do all we can to prevent underage access to adult material, but we must not mislead by suggesting that doing all we can to prevent access is both necessary and sufficient to prevent children accessing adult material online, the detail of which I will come to in subsequent amendments. Of course internet service providers and ancillary service providers should do all they can to protect children, but there are also issues around freedom of expression that need to be taken into account.

Baroness Kidron (CB)

My Lords, in light of these and some later amendments, I want to raise the matter of ancillary service providers. My understanding is that social media platforms continue to argue that they do not fall within the definition of ancillary service providers and are seeking confirmation from government that they have no role to play in preventing children accessing pornography online.

I am aware that the Minister stated at Second Reading:

“The Government believe that services, including Twitter, can be classified by regulators as ancillary service providers where they are enabling or facilitating the making available of pornographic or prohibited material”.—[Official Report, 13/12/16; col. 1228.]


I was pleased to hear him say that, but I would like confirmation that it remains the Government’s position. Unless such platforms are included, I simply do not understand what Part 3 of the Bill hopes to achieve.

I am unconvinced that it is possible to remove all adult content from the purview of children, but it is imperative to make it clear to young people that viewing adult sexual content is a transgressive act and not a cultural norm, so, at a minimum, it should be as difficult as reaching the top shelf in a newsagent or being served under age in a pub. That is imperative for reasons I set out in great detail at Second Reading, so I will not repeat them here but simply say that children and young people are turning in large numbers to pornography to learn about sex, with unhappy consequences. Often violent, mainly misogynistic, unrealistic adult male fantasy is not a good starting point for a healthy, happy, consensual sex life.

I would have preferred for the age verification system to be fully thought out, prototyped and beta-tested before it came to the House in the form of legislation. None the less, I agree that Part 3 is a valiant attempt to stem the flow of adult material into the hands and lives of children. In the absence of a better, more thought out plan, I support it. But if this is the path we are taking, we must be clear in our message: this material is unsuitable for those under the age of 18.

The BBFC says that it intends to take a proportionate approach to its new role and will target the top 50 adult websites as accessed by viewers in the UK. Its research shows that 70% of all those who access such sites in the UK visit the top 50. Among children, concentration among those top sites is even higher. In that respect, I understand that age-verifying the sites that account for 70% of UK access to adult material sends a clear message.

However, a brief search on Twitter, which has a joining age of 13, shows that commercial pornography is readily available, with popular accounts attracting hundreds of thousands of followers. Many of those who access pornographic social media accounts do not publicly follow them, so it is more than likely that the follower figures are dwarfed by the number of actual viewers. In the case of younger viewers, such platforms, if accessed via an app, leave no browser footprint that might be discovered by parents: a very attractive proposition.

If social media companies provide alternative access to the same or similar pornographic material with no restriction, surely the regulator should be entitled to take the same proportionate approach and target pornographic social media accounts with similar viewer numbers to those for adult websites. For most young people, social media platforms are the gateway to the internet. Unless they are included within the definition of ASPs, the problem of young people accessing pornography will not be solved, nor will the ambition of setting a social norm that puts adult sexual material beyond the easy reach of children and young people be achieved. The traffic will simply migrate.

I note that social media platforms are not homogenous and that some, including Facebook and Instagram, already take steps to prevent pornography being posted and act quickly to take it down when it does go up. It is disappointing that not all platforms take this approach. I do not want to focus on Twitter, but noble Lords might like to go to the account, @gspot1177, with its 750,000 public followers, which has been publishing pornography with impunity since 2009. Surely it is necessary to bring this into the scope of the regulator. Nobody is claiming that the measures set out in the Bill will prevent 100% of pornography being seen by children, and I understand Ministers’ arguments that doing something is better than doing nothing, but I am concerned that in the lack of clarity about what does and does not fall within the definition of ASP there may lie a lack of political will about holding certain stakeholders to account.

I would love to hear from the Minister whether major social media platforms including Tumblr and Twitter have confirmed to the Government how they would respond to requests from the BBFC to withdraw services from a non-compliant site—and whether his statement at Second Reading that social media platforms may be considered ASPs by the regulator still stands.