Internet Encryption

Lord Clement-Jones Excerpts
Tuesday 14th May 2019

Lords Chamber
Lord Ashton of Hyde

My Lords, I thank the noble Baroness for discussing this with me beforehand, which was very welcome. I agree that there may be serious consequences from DoH. The DoH protocol has been defined by the Internet Engineering Task Force. Where I do not agree with the noble Baroness is in her description of it as an obscure organisation; it has been the dominant internet technical standards organisation for 30-plus years and has attendees from civil society, academia and the UK Government as well as the industry. The proceedings are available online and are not restricted. It is important to know that DoH has not been rolled out yet and the picture is complex—there are pros to DoH as well as cons. We will continue to be part of these discussions; indeed, there was a meeting last week, convened by the NCSC, with DCMS and industry stakeholders present.

Lord Clement-Jones (LD)

My Lords, the noble Baroness has raised a very important issue, and it sounds from the Minister’s Answer as though the Government are somewhat behind the curve on this. When did Ministers actually get to hear about the new encrypted DoH protocol? Does it not risk blowing a very large hole in the Government’s online safety strategy set out in the White Paper?

Lord Ashton of Hyde

As I said to the noble Baroness, the Government attend the IETF. The protocol was discussed from October 2017 to October 2018, so it was during that process. As far as the online harms White Paper is concerned, the technology will potentially cause changes in enforcement by online companies, but of course it does not change the duty of care in any way. We will have to look at the alternatives to some of the most dramatic forms of enforcement, such as DNS blocking.
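The enforcement concern underlying this exchange is easier to see with a concrete sketch of how DoH works. Under RFC 8484, an ordinary DNS query is wrapped inside an HTTPS request to a resolver, so a network operator sees only encrypted traffic to that resolver and cannot apply name-based DNS blocking. The following Python sketch builds such a request URL without sending it; the Cloudflare resolver URL is just one public example, not one endorsed anywhere in the debate.

```python
import base64
import struct

def build_doh_url(hostname, resolver="https://cloudflare-dns.com/dns-query"):
    """Encode a DNS A-record query for `hostname` as an RFC 8484 DoH GET URL.

    Because the DNS message travels inside ordinary HTTPS traffic, an ISP
    sees only a TLS connection to the resolver, not the name being looked
    up -- which is why DoH sidesteps DNS-based blocking.
    """
    # 12-byte DNS header: ID=0, flags=0x0100 (recursion desired), 1 question
    header = struct.pack("!HHHHHH", 0, 0x0100, 1, 0, 0, 0)
    # QNAME: each label length-prefixed, terminated by a zero byte
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in hostname.split(".")
    ) + b"\x00"
    question = qname + struct.pack("!HH", 1, 1)  # QTYPE=A, QCLASS=IN
    wire = header + question
    # RFC 8484: base64url without padding, passed as the `dns` query parameter
    encoded = base64.urlsafe_b64encode(wire).rstrip(b"=").decode("ascii")
    return f"{resolver}?dns={encoded}"

print(build_doh_url("example.com"))
```

The resulting URL can be fetched over plain HTTPS; nothing in the on-path traffic reveals which hostname was queried, which is the property that complicates resolver-level enforcement.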

Online Harms

Lord Clement-Jones Excerpts
Monday 8th April 2019

Lords Chamber
Lord Griffiths of Burry Port (Lab)

My Lords, it is with pleasure and a great deal of relief that I speak to this Statement and the White Paper that lies behind it. Having sat through endless hours of the previous debates and the acrimony generated by them, and having found ourselves in places where I suspect none of us wanted to be, it is a pleasure to come to proper business again and to look at something that affects the whole of our society. We must find remedies and seek a legislative way forward that deals with the problems that we know are part and parcel of this innovative and brilliant thing that we call the internet and the technological advances that go with it.

Having read the White Paper and listened to the Statement, I am convinced that, across the Benches of this House, we must see this as unique in a party-political system in that we must act together. Consensual approaches and sensible resolutions to the problem are a duty that falls upon all of us. After all, the internet affects every part of our society—all of us have felt the questions it raises and enjoyed the wonderful opportunities it affords—so I hope that we can approach this in a consensual and cross-party way.

I congratulate the Government—is it not wonderful to hear someone from these Benches saying that?—on generating a report that is lucid and clear and will generate the kind of discussion that the consultative period, now beginning, will need. It is well laid out; my son is a printer, and he constantly beleaguers me about layout as I understand it and layout as he understands it, and he would be pleased with this. I can give no higher commendation. Congratulations are in order.

I know that we will have detailed, forensic debates when the results of the consultation are before us. At the moment, highlighting some of the headline aspects will have to do. The duty of care has been spoken to already and we must emphasise it; after all, we are all aware of those who are harmed by the abuse of the internet. Some well-publicised cases leave their images constantly before our eyes, especially when we think that some of them, indeed a lot of them, are children. In previous legislation that we have debated on the Floor of the House, we have talked about designing the internet in such a way that the interests and rights of children are protected. I am quite sure that we will take all that forward in the outworking of the further proposals in this White Paper.

We want to protect people from harms, and we will no doubt want to discuss what we think constitutes harms in the proper sense. There are indeed in this White Paper, rather conveniently, tabulated harms: those that are illegal, those that are dangerous, and those that deserve attention. These are indicative lists, and no doubt we will want to move things from here to there and there to here, and add to and subtract from them as time goes on, but it is a pretty good starting point to show us the range of conducts and activities that we will need to give attention to.

It is a bold White Paper. It claims to be bold and boasts of being bold. For me, there is one aspect that teases me, and I hope the Minister can give us some reassurance on it. It is the whole idea that while the internet and online activity affect us locally—in our homes and elsewhere—this has to be balanced against the fact that the companies, across whose platforms the material that generates these problems comes, are global. We have seen how difficult it is to deal with the taxation aspects of these global companies. It will be equally difficult to think about legislation that could bring them all into line, and a word about that would be very helpful as we steer our way into the consideration of these proposals.

Statutory measures are mentioned, and I am delighted about that, of course, because these proposals and this way forward need to be underpinned by the full force of the law, and the regulator will be endowed with powers that are appropriate to the importance of the job. I wonder how we will bring a regulator to birth; some suggest that it should perhaps be an offshoot of Ofcom in the first place, that under the aegis of Ofcom we can get regulation built in to our way forward, and that it can evolve into something more complete later.

Any legislation that we bring forward will need to be nimble and flexible, because technology moves faster than the making of laws, and since the making of a law, as we know from the one we have been discussing, can be interminable, I hope that we will never be accused of tardiness in acting promptly, flexibly and nimbly to combat the downside of online activities.

So I congratulate the Government and I look forward to further debates and in greater detail.

Lord Clement-Jones (LD)

My Lords, we, too, on these Benches welcome the fact that the Government’s proposals have come forward today, and we support the placing of a statutory duty of care on social media companies. We agree that the new arrangements should apply to any sites,

“that allow users to share or discover user-generated content, or interact with each other online”.

We think that is a fair definition.

We are all aware of the benefits of social media networks and the positive role they can play. There is, however, far too much illegal content and harmful activity on social media that goes undealt with by social media platforms and creates social harm. The self-harming material on Instagram and the footage of the Christchurch killings are perhaps the most recent examples.

Proper enforcement of existing laws is, of course, vital to protect users from harm, but, as the White Paper proposes, social media companies should have a statutory duty of care to their users—above all, to children and young people—and, as I say, we fully support the proposed duty of care. It follows that, through the proposed codes, Parliament and Government have an important role to play in defining that duty clearly. We cannot leave it to big private tech firms, such as Facebook and Twitter, to decide the acceptable bounds of conduct and free speech on a purely voluntary basis, as they have been doing to date.

It is good that the Government recognise the dangers that exist online and the inadequacy of current protections. However, regulation and enforcement must be based on clear evidence of well-defined harm, and must respect the rights to privacy and free expression of those who use social media legally and responsibly. I welcome the Government’s stated commitment to these two aspects.

We also very much welcome the Government’s adherence to the principle of regulating on a basis of risk and proportionality when enforcing the duty of care and drawing up the codes. Will the codes, as the Lords Communications Committee called for, when exercising powers of oversight, set out clearly the distinction between criminal, harmful content and antisocial content? By the same token, upholding the right to freedom of expression does not mean a laissez-faire approach. Does the Minister agree that bullying and abuse prevent people expressing themselves freely and must be stamped out? Will there be a requirement that users must be able to report harmful or illegal content to platforms and have their reports dealt with appropriately, including being kept informed of the progress and outcome of any complaint?

Similarly, there must be transparency about the reasons for decisions and any enforcement action, whether by social media companies or regulators. Users must have the ability to challenge a platform’s decision to ban them or remove their content. We welcome the proposed three-month consultation period; indeed, I welcome the Government’s intention to achieve cross-party consensus on the crucial issue of regulating online harms. I agree that with a national consensus we could indeed play an international leadership role in this area.

Then we come to the question of the appropriate regulator to enforce this code and duty. Many of us assumed that this would naturally fall to Ofcom, with its experience and expertise, particularly in upholding freedom of speech. If it is not to be Ofcom, with all its experience, what criteria will be used in determining what new or existing body will be designated? The same appears to me to apply to the question of whether the ICO is the right regulator for the algorithms used by social media. I see that the Home Office will be drawing up certain codes. Who will be responsible for the non-criminal codes? Have the Government considered the proposals by Doteveryone and the Lords Communications Select Committee for a new “Office for Internet Safety” as an advisory body to analyse online harms, identify gaps in regulation and enforcement and recommend new regulations and powers to Parliament?

At the end of the day, regulation alone cannot address all these harms. As the noble Baroness, Lady Kidron, has said, children have the right to a childhood. Schools need to educate children about how to use social media responsibly and be safe online, as advocated by the PSHE Association and strongly supported by my party. Parents must be empowered to protect their children through digital literacy, advice and support. I very much hope that that is what is proposed by the online media literacy strategy.

At the end of the day, we all need to recognise that this kind of regulation can only do so much. We need a change of culture among the social media companies. They should be proactively seeking to prevent harm. The Government refer to a culture of continuous improvement being a desired goal. We on these Benches thoroughly agree that that is vital.

Lord Ashton of Hyde

My Lords, I am very grateful for the welcome by both noble Lords for this White Paper. Nevertheless, I am not complacent; I have worked with noble Lords opposite on several big Bills on digital matters and I know there is a lot of detail that will need to be included in the legislation. However, the principle that this is generally welcome and the fact that the main bones of the proposal are welcome—namely, the duty of care and the independent regulator—is good. We have made a point of saying that we want to work on a cross-party, consensual basis and one of the reasons for having an extensive consultation is to achieve that. In some ways, this is an old-fashioned way of making legislation, to the extent that we have had a Green Paper and a consultation, then a White Paper and a consultation: we hope that a lot of the issues can be ironed out, and some of the detail. The way we worked on the Digital Economy Act and the Data Protection Act shows that we can bring in some fairly big and complicated Bills in a consensual way.

The noble Lord, Lord Griffiths, talked about children. They are very important to our thinking. We have not written a specific chapter on the subject because we want it hard-wired throughout the whole White Paper. From the day the regulator is formed, any company in scope will have to show that it is thinking about the customers and users of its products in the design of its website and products; as part of its duty of care, it will have to think about the age, vulnerability and sort of people who will use them. That is built into the system.

We thought a lot about the international aspects of regulating the internet, because there is no point having a regulator or enforcement system that cannot cope with the way the internet works, which is, by definition, international. We will therefore think and consult on some of the further sanctions we could put on internet companies, such as individual liability. We might require representatives in the country in the same way as the GDPR does. Ultimately, we are consulting on whether we should take powers to block websites completely. These are, in the main, money-making organisations—Google’s second-largest advertising market is in this country, for example. The internet giants have significant economic stakes in this country, and they could be faced with a very serious penalty.

Above all, we are not expecting the internet companies, large or small, to do anything unreasonable. Some appalling things go on the internet, and the regulator will look at the duty of care—as said in the Statement—as a risk-based and proportionate approach. The big internet giants will be held to a different standard from the small start-ups.

Both noble Lords talked about the regulator. There is a possibility that an existing regulator could either take on this job or create the regulator which may be divested later. We are consulting on that, and would be interested in the views of noble Lords and other stakeholders. It is important to bear in mind that time is of the essence. We want to get on with this. We want to get it right—but we want to get a move on.

The noble Lord, Lord Clement-Jones, talked about some of the harms that are not just illegal. We absolutely agree. In some ways, the harms that are illegal are easy to deal with—they are illegal, and should be so offline as well as online—but things that are not specifically illegal, such as cyberbullying, can have a tremendous effect on people’s lives. We certainly take those into account. The internet companies will have to take a reasonable and balanced approach; they need to show that they are taking seriously harms that can really affect people’s lives, and that they are building their approach to them into the way they operate their companies. Terms and conditions should be met and abided by; there should be a proper complaints procedure, which we will demand be taken seriously, and there will be an appeals process.

The consultation actually started today. We have so far got eight responses. It will go on for three months, after which we will look at it. As I say, noble Lords are very welcome to contribute.

Finally, the noble Lord, Lord Clement-Jones, talked about a change of culture. I think the noble Lord, Lord Griffiths, implied the same thing. The point about this White Paper is that we are moving to a proactive system of regulation where we expect every company, be it large or small, to think in a proportionate way about the harms it could do and to take sensible measures not only to deal with them but to explain to the regulator what it is doing and to have transparent reporting. The regulator will be given powers to inquire of the internet companies what they are doing about these matters.

Public Authorities: Algorithms

Lord Clement-Jones Excerpts
Thursday 14th March 2019

Lords Chamber
Asked by
Lord Clement-Jones

To ask Her Majesty’s Government what consideration they have given to the standards and certifications required for the algorithms used in decision-taking by public authorities and agencies.

The Parliamentary Under-Secretary of State, Department for Digital, Culture, Media and Sport (Lord Ashton of Hyde) (Con)

My Lords, last year the Government published the Data Ethics Framework, which sets out clear principles and standards for how data is used in the public sector—an important tool guiding the ethical use of algorithms and AI technologies. The Government have also recently set up the Centre for Data Ethics and Innovation, which will provide independent, expert advice on the governance of data and AI technology. The centre’s first two projects will study the use of data in shaping people’s online experiences and the potential for bias in decisions made using algorithms. This work and the centre’s future work will play a leading role in ensuring transparency and accountability in the ethical use and design of algorithms.

Lord Clement-Jones (LD)

My Lords, some 53 local authorities and about a quarter of police authorities are now using algorithms for prediction, risk assessment and assistance in decision-making. The Centre for Data Ethics and Innovation, for all its virtues, is not a regulator. The Data Ethics Framework does not cover all aspects of algorithms. As the Minister will know, it was quite difficult finding a Minister to respond to this Question. Is it not high time that we appointed a Minister—as recommended by the Commons Science and Technology Committee—who is responsible for making sure that standards are set for algorithm use in local authorities and the public sector and that those standards enforce certain principles such as transparency, fairness, audit and explainability and set up a kitemark so that our citizens are protected?

Lord Ashton of Hyde

My Lords, there was no difficulty in finding a Minister in this House: answering the noble Lord’s very sensible Question was pinned on me at a very early stage. The point about the Centre for Data Ethics and Innovation, which will publish its interim report on algorithms in the summer—relatively soon—is that it will look across the whole area and highlight what should be done in regulation terms. This will be one of the things that we expect the centre to look at, so the genuine concerns raised by the noble Lord can be considered by this forward-looking body.

Social Media: Online Anonymity

Lord Clement-Jones Excerpts
Wednesday 6th February 2019

Lords Chamber
Lord Ashton of Hyde

One of the things we are considering is a duty of care. That might include holding directors personally responsible. We have not decided that yet, but it is certainly an idea worth considering. As it is a White Paper that is coming out this winter, there will be a consultation on it, so we welcome views from my noble friend.

Lord Clement-Jones (LD)

My Lords, the Law Commission, in its scoping report last November into abusive and offensive online communications, said that one of the key barriers to the pursuit of online offenders was,

“tracing and proving the identity of perpetrators, and the cost of doing so”.

I heard what the Minister said about the White Paper’s contents, but will the Government include a provision allowing the stripping of anonymity in circumstances of online crime? Have the Government had any discussions with the police or other enforcement agencies to understand the issues they face in tracking these perpetrators and bringing them to justice?

Lord Ashton of Hyde

It is certainly something worth considering in the White Paper, but as far as dealing with the police is concerned, the Home Office is working with policing to identify ways to tackle this when it goes over the threshold into criminality. These are relatively new crimes; the police will have to evolve methods to deal with them. We have also worked with the office of the Director of Public Prosecutions. There is a digital intelligence investigation programme, aiming to ensure policing has the ability to investigate the digital elements of all crime types. Also, the Home Office is working with the College of Policing to drive improvements in overall police capability to investigate and prosecute online offences.

Children and Young People: Digital Technology

Lord Clement-Jones Excerpts
Thursday 17th January 2019

Lords Chamber
Lord Clement-Jones (LD)

My Lords, I add my thanks to the noble Baroness, Lady Kidron, for initiating this important and extremely well-informed debate. She did it in such a thoughtful way, especially in emphasising the positive right of the child to flourish and the importance of harm prevention in this context.

Since we debated the first Digital Economy Act 10 years ago, public understanding of and attitudes towards the internet have changed markedly. Several noble Lords emphasised the benefits of digital technology, but in that time evidence has mounted of the effect of social media and connected devices on young people in particular, impacting on their health, mental well-being and educational attainment. The noble Lord, Lord Ramsbotham, unpacked that issue in an extremely instructive way.

Of course one could debate further the impact of the internet and digital technology on our democracy, as the noble Baroness, Lady O’Neill, demonstrated, but today I fear I have little time and it is necessary to concentrate on online harms to children. It has become clear that people—children and adults—should have the same rights online as they have offline. As the noble Baroness, Lady Redfern, said, we must align online and offline behaviour and recognise the unique dangers that online access sometimes poses.

This House has already had an impact through the limited amount of regulation we have been able to impose on the internet. Too many Members are involved for me to mention them all, but there are the age-verification provisions; the age-appropriate design code, which was the inspiration of the noble Baroness, Lady Kidron; and the new offence of revenge porn, which my noble friend Lady Grender was instrumental in introducing, with government support, through the Criminal Justice and Courts Act.

However, so far, government efforts specifically to deal with the abuses of social media have been extremely limited and there is still a culture of hands-off regulation of the internet, which favours the platforms. Indeed, as my noble friend Lord Storey pointed out, in the case of classification of video games, we have gone backwards. As he mentioned, we have had the Government’s digital charter, a Green Paper before that, and the Government’s response last May to the internet safety Green Paper. As many noble Lords have mentioned, we are also promised shortly a White Paper on internet safety strategy, which will set out plans for legislation covering,

“the full range of online harms, including both harmful and illegal content”.

Can the Minister convert that promise of “shortly” to “imminently” today? That would be an improvement to many minds.

The Secretary of State for Health last October asked the Chief Medical Officer to review the impact of too much social media use on children’s mental health and draw up guidance to help patients. Simon Stevens, the chief executive of NHS England, suggested that Ministers should consider taxing social media giants such as Facebook and Twitter to,

“help stem the tide of mental ill-health”,

or,

“at least help pick up the pieces”.

That is all heading in the right direction, and I hope it demonstrates the White Paper’s direction of travel. However, where is the promised interim review from the Chief Medical Officer?

In her report last year, Who Knows What About Me?, the Children’s Commissioner, Anne Longfield, set out a series of recommendations on what our policy-makers should do to protect children. As advocated by Carnegie UK Trust, she believes that a statutory duty of care should govern relations between social media companies and the audiences they target. Recently, Ofcom has argued for tech companies such as Facebook and Google to be regulated in the same way as the mobile phone and broadband industry. I do not believe that this goes far enough, but it is interesting nevertheless that Ofcom, which is not known for its proactivity in this area, is prepared to argue for that. The noble Baroness, Lady Williams, has said that the Home Office is considering the idea of an online safety commissioner. Those are all good indications.

Of course, many broadcasters have also got together to call for the independent regulation of online platforms’ operations in the UK. I pay tribute to the noble Baroness, Lady Lane-Fox, who has been almost as redoubtable a campaigner in this area as the noble Baroness, Lady Kidron. Last year, doteveryone produced a report entitled Regulating for Responsible Technology: Capacity, evidence and redress: a new system for a fairer future. As a number of noble Lords mentioned, the NSPCC has come up with an interesting combined scheme with suggestions for not just a duty of care but a regulator to enforce a set of compulsory standards through that duty. What the noble Lord, Lord Bichard, said about the possible ingredients of that was very good. The right reverend Prelate the Bishop of Chelmsford mentioned the House of Lords Communications Select Committee, of which he is a member. We all await with bated breath what I hope will be a worthy successor to its excellent report, Growing Up with the Internet.

It is becoming clear that we need the Government’s internet safety White Paper to be much more strategic and comprehensive in nature, and to have real teeth in terms of standards, regulation, transparency of reporting and enforcement. To cap it all, if the Government have not written the White Paper already, I hope that they will take serious note of the excellent 5Rights paper, Towards an Internet Safety Strategy, for which the noble Baroness, Lady Kidron, was responsible. It sets out seven pillars of a safety strategy in a comprehensive framework. As the noble Baroness, Lady McIntosh, my noble friend Lord Storey, the right reverend Prelate the Bishop of Chelmsford and other noble Lords have emphasised, it is down to the Government to regulate this area. The Government should absolutely be proactive here.

As my noble friend Lady Grender stated so eloquently in the November debate initiated by the noble Lord, Lord Stevenson, this is about recognising that parents can do only so much to protect their children from online harms. I am the parent of an online gamer and the uncle of a pioneering addiction researcher, so I am particularly aware of some of the issues here. Of course, the noble Baroness, Lady Greenfield, is the expert, but the former Facebook president backed her. He let the cat out of the bag by stating that social networks had been designed to “exploit” the psychological vulnerabilities of their users, and that “dopamine hits” are built in to create addiction. That is what the algorithms are designed to deliver. It applies to gambling and gaming just the same.

We heard from a number of noble Lords, including the noble Baronesses, Lady Watkins and Lady Greenfield, about screen time. It is very instructive, is it not, that so many senior tech executives in Silicon Valley send their children to Waldorf schools—the equivalent of our Rudolf Steiner schools—which limit screen time? They believe that screen time has a major impact. I am not sure that I buy what techUK said in its briefing to us about the impact of screen time. The jury may be out on this, but I am afraid that I am pretty sure in what direction it is going.

We might pick and choose which regulator would be specific to this area. It could be the ICO, which has been very effective in the data field, it could be Ofcom or it could be a special commissioner. Nevertheless, we need to make sure that that body has the right resources and that we put the responsibility on to a single organisation so that we know who is accountable.

I do not have time to follow up on many of the points made by my noble friend Lord Storey about education, but it is absolutely crucial that our children are digitally literate—indeed, it is important that adults are digitally literate. That can be achieved in part through PSHE and partly through the kind of creative education that the noble Baroness, Lady McIntosh, talked about. However, ranging more widely, I would mention again the doteveryone organisation, because its identification of digital blind spots and how we are targeted by social media and digital technology is extremely important. We have to make sure that this is not just the responsibility of our teachers, and that we have in place other mechanisms to ensure that we achieve a high level of digital literacy. I have a huge amount of time not only for doteveryone but for people like the Good Things Foundation, which is doing a great deal in the community in this respect. We must ensure that we know who has power over our children, what values are in play and when that power is exercised. It is vital to the future of our children, to the proper functioning of our society and to the maintenance of public trust.

Online Pornography (Commercial Basis) Regulations 2018

Lord Clement-Jones Excerpts
Tuesday 11th December 2018

Lords Chamber
The Earl of Erroll (CB)

My Lords, I want to say a few words before the summing up. We need to remind ourselves that the purpose of these regulations is to protect children, including those coming up to adulthood. We are trying to prevent them thinking that some fairly unsavoury habits that are not medically good for them are normal. That is the challenge. These websites have teaser adverts to try to get people drawn into pornography sites to buy harder-core or more detailed pornography. We are not trying to do anything about people who are willing to enter into a payment arrangement with the site but to make sure that children are stopped at the front end and are prevented from seeing the stuff that will give them the wrong impression about how you chat to a girl or a girl chats to a boy and how you behave with members of the same sex or the opposite sex in a sexual relationship. We need to be quite quick on this sort of stuff because if we are going to try to stop this being widespread we need to block it.

There is an awful lot of guff in this. It has taken a long time for these regulations to get here—we really expected them about a year ago. I do not know what DCMS has been doing during this time. I know it had some draft guidelines a long time ago, but perhaps they were so young that they were uneducated too and tried to learn about these things—I do not know.

The point about the adverts is they sit there in front. We are probably going to have buttons on the front of the website stating that people have to verify their age. That will take people off, probably to third-party sites which know them and anonymously verify that they are over 18 and that is when they can get into the website. However, the website is going to want to put something up for that first encounter. I wonder whether this is not an opportunity to think positively and perhaps put up something about understanding the beginning of a relationship and how you can get excited and go forward without going to the harder aspects which involve penetrative sex et cetera. There may be an opportunity there. That is a bit of a red herring because we are talking about the regulations, but it may be a positive thought for the future.

The thing that worries me particularly is paragraph 2.5 of the BBFC guidance which refers to sites that are,

“most frequently visited, particularly by children”,

and are,

“most likely to be sought out by children”.

Social media may not be marketed as carrying or giving access to pornography, but it does so on a huge scale. This one-third rule is very odd because it is easily abused. There are about 39 million UK users of Facebook, so do we say that if 12 million are putting up pornography that is okay because it is under the one-third threshold? Earnings would be very hard to measure, given Facebook’s turnover, so how are we going to do the one-third? It is very odd. The purpose of this is to protect children, so I do not think we should be having very high thresholds to let people get away with it.

There are two things that really worry me. Paragraphs 2.6, 2.7 and 2.8 of the guidance are on enforcement. It is going to be very slow. By the time the BBFC has sent out a warning and it is received, given another notification, published this, waited for the website to write back, et cetera, how long will it take? Websites that want to get round it will game the system. If they start doing that, the big websites—they are on side with this and want to help because they have got teenage children and are not paedophiles but are trying to sell adult pornography to adults and therefore want to help, believe it or not—will lose too much business; they will have to go with the flow and play the same game, in which case the whole thing will get wrecked.

If the Internet Watch Foundation, without a true legal basis, can get sites blocked immediately, why cannot we, with proper law? Everyone has had warnings about it. The whole of the industry around the world has apparently been talking about it for the past year. The BBFC has spoken at such events. Everyone knows, so I cannot understand why we cannot act more quickly and go live from day one. If anyone does not comply, that is bad luck. We could set up some pre-notification stating: “If you do not comply by tomorrow, you have had it”.

The other matter is the certification scheme, which is voluntary. A big hole is that because this is under a DCMS Bill, it could not touch privacy and data security. That is an ICO responsibility. The security of people’s data is regulated elsewhere, and the ICO has only recently started to show an interest in this, because it is overloaded with other things. There is now a memorandum of understanding between the BBFC and the ICO, which is very good. They could be brought together in a certification scheme. The BBFC cannot enforce data security and privacy, because that is an ICO responsibility, but a certification scheme could state that a site cannot be certified unless it complies with all the legal standards—both the Data Protection Act 2018, which the ICO is looking at, and the BBFC rules on age verification for websites and providers. That could be good.

If your Lordships want to know how to do it, I fear I shall give a plug for the British standard for which I chaired the steering group, PAS 1296; it includes a whole section on how to do the GDPR stuff, as it was then called. We could not mandate it in the British standard because other standards mandate it, but that tells you how to do it.

The certification needs to be clear, otherwise there will be a whole lot of wishy-washy stuff. I am not sure that a voluntary scheme is a good idea, because the BBFC will have a lot of hard work trying to check sites that decide not to comply, so it will have to certify them by another method. That will be difficult.

However, at the end of the day, there is a lot of willingness between all the parties to try to get this to work. The world is watching us—quite a few other countries are waiting to see whether this will work here. That will help enormously. We should try to get a lot of cross-stakeholder information and co-operation, a round table of all interested parties from child protection all the way through to those running the adult sites. Perhaps some good could come out of that. Certainly, everyone wants to help the BBFC and DCMS, the parent body. Everyone wants to help the ICO. We would like to get this to work: there is a lot of good will out there if only we could get moving to make it work properly.

Lord Clement-Jones (LD)

My Lords, we on these Benches want the regulations and draft guidance to come into effect. The child protection provisions are a significant element of the Digital Economy Act which, although not entirely in line with what we argued for during its passage, we supported in principle at the time and still do, while realising, as my noble friend Lord Paddick said, that they are not the conclusive answer to children’s access to pornography. As he also said, a number of areas need to be addressed in the course of today’s debate.

For a start, as several noble Lords said, it seems extraordinary that we are discussing these sets of guidance nearly two years after the Digital Economy Act was passed and nearly a year after the Government published their guidance to the regulator, the BBFC. What was the reason for the delay?

Next, there is the question of material that falls within the definition of being provided on a commercial basis under the Online Pornography (Commercial Basis) Regulations, the subject of today’s debate. Several noble Lords mentioned this. As drafted, they do not currently include social media or search engines, and on these Benches we regret that the Government have decided to carve out social media from the definition. This is a potentially significant loophole in the regime. It is important that it is monitored and addressed if it damages effectiveness. It is, in particular, a major concern that social media and search engines do not have any measures in place to ensure that children are protected from seeing pornographic images.

The Secretary of State’s guidance to the AV regulator asks the BBFC to report 12 to 18 months after the entry into force of the legislation, including commenting on the impact and effectiveness of the current framework and changes in technology which may require alternative or additional means of achieving the objectives of the legislation. In addition, under Section 29 of the Digital Economy Act, 12 to 18 months after the entry into force of the scheme, the Secretary of State must produce a report on the impact and effectiveness of the regulatory framework.

This is therefore a clear opportunity to look again at social media. The Government have made some reference to legislating on social media, but it is not clear whether they intend to re-examine whether the definition of commercial pornography needs to be broadened. Can the Minister assure the House that this will be dealt with in the internet safety White Paper, that the Secretary of State’s report will cover the level of co-operation by services such as social media and search engines, which are not obliged to take enforcement action on notification, and that, in doing so, it will firmly tackle the question of access by children to pornography via social media?

Next is the question of resources for the age-verification regulator. This is a completely new regime, and with fast-changing technology, it is vital that the BBFC, as the AV regulator, has the necessary financial resources and stable grant funding to meet the important child protection goals. Can the Minister assure us that the Government will keep resources available to the BBFC in its AV regulator role under review and undertake explicitly in the Secretary of State’s annual report to deal with the question of resources enabling the BBFC to carry out its work?

Next is the question of the BBFC having chosen to adopt a voluntary scheme. On these Benches, we welcome the voluntary scheme for age-verification providers referenced in annexe 5 to the draft Guidance on Age-verification Arrangements. In fact, it bears a striking resemblance to the scheme that we proposed when the Act was passing through Parliament, which would have ensured that a scheme involving third-party companies providing identity services to protect individual privacy and data security would be engaged. As I recall, the noble Earl, Lord Erroll, helped greatly in convening providers of digital identity schemes to show what was possible. I think he is still ahead of us today.

Our key objections were that what was originally proposed did not sufficiently protect personal privacy. The BBFC is to be congratulated on establishing the certification scheme. As I understand it, it already expects all the major providers to undertake the certification process. Furthermore, because the scheme is voluntary, these assessments will be for foreign-based as well as UK providers, which is a major achievement and could not be accomplished with a UK statutory scheme.

The key to the success of the voluntary scheme, however, is public awareness. I hope that the Minister can tell us what DCMS is doing to support the promotion of the BBFC’s kitemark in the three months before the scheme comes into effect.

Next, there are the JCSI criticisms set out in its report on 28 November. This House rightly always takes the criticisms of the JCSI seriously, and the Minister set out a careful response to them. I do not always pray a government memorandum in aid, but the BBFC was following the Secretary of State’s guidance to the AV regulator. Under the terms of Section 27 of the Digital Economy Act, as a result of amendments in the Lords during its passage, the BBFC was charged with having regard to the Secretary of State’s guidance. The JCSI suggests that the BBFC could have chosen to ignore “incorrect” Secretary of State guidance, but that would have put it in an impossible position.

I shall not adumbrate all the different areas, but the inclusion of what was necessary in compliance with Section 27, the advice on best practice, the annexe setting out the voluntary scheme and the role of the ICO all seem to be helpful as part of the guidance and proportionate in terms of what the AV regulator prioritises.

There are a number of other aspects of these sets of guidance worthy of mention too. As we have heard, this age-verification framework is the first of its kind in the world, and there is international interest in it. Are the Government discussing with the BBFC what lessons there are in terms of encouraging robust AV for younger age groups and for other types of potentially harmful content? Will the Government use the expertise developed by the BBFC as the age-verification regulator in the internet safety White Paper?

Centre for Data Ethics and Innovation

Wednesday 21st November 2018

Lords Chamber
Lord Stevenson of Balmacara (Lab)

My Lords, I am grateful to the Minister for repeating the Statement made elsewhere. He was present for part of the debate on artificial intelligence on Monday. On reflection, it is a bit surprising that the Government were not able to accelerate the announcement of this new body. It would have helped a lot in that debate. No doubt the tyranny of the grid is to blame again, but many of us would have felt the benefit had we known, not least, that the membership of the board had been enhanced by those Members of your Lordships’ House already referred to.

To go back in history a bit, the Centre for Data Ethics and Innovation came out of amendments we proposed during the passage of the Data Protection Bill, but it was built on excellent work by the Royal Society and others. We should pay tribute to the groundwork that led to today’s announcement. Those amendments had a lot of support from around the House and would have gone into the Bill had we been able to push them further, but we could not get them within the bounds of the Bill’s framing. We should say clearly that the model we had in mind then was the independent Human Fertilisation and Embryology Authority. In preparing the thinking in this new area of advanced technology and data processing and protection, one needed a carefully balanced body that could regulate in the context of difficult ethical issues raised by research and development.

I will now ask a number of questions about the body itself, and I hope that the Minister will respond, in writing later if not now. The body was originally intended to be an independent statutory body, but it is not because no powers have yet been established. What is the progress on that? The reports I have read suggest that that is still an objective of the Government, although they are making a virtue of the fact that it is an advisory committee in the interim period. In some senses, they will probably be judging its success, which is a bit worrying given that the whole benefit would be that it was independent of government, long-term and able to look without fear or favour at the big issues. If it is an advisory committee of the department, how independent will it be in practice? Is funding secured? Can it spend what it needs to get the research and advice it needs? How much of the original thinking about the HFEA remains? As an advisory committee, can it request information? One problem is the difficulty of extracting information from the behemoths that populate the international information society.

The press release rightly describes the membership as “stellar”. Given the names already mentioned here, I think we should recognise that. I confess that my application was weeded out very early in the game. This was unfortunate, because I would have been delighted to be part of that. Having seen the full list and heard why they were chosen, it is clear that the right decisions have been reached and I bear no malice to those responsible—honest. If the membership question comes up later, I am still around.

In the absence of the new centre starting up, we have only two or three areas of activity. We have a statement as a result of the consultations that took place. It talks about the focuses being to provide clear guidance and regulation and to lead debate about how data can be used in the future. But there are still some problems that need to be resolved, and I will be interested to hear the Minister’s comments. The AI report we discussed at length in a very good debate on Monday, when there were notable speeches from the right reverend Prelate the Bishop of Oxford and the noble Lords, Lord Reid and Lord Browne, shows the range of issues that are going to be up for discussion. These are very abstruse areas of intellectual activity such as ethics and the nature of machines—whether they are responsible for their actions and, if so, how any redress can be obtained. The noble Lord, Lord Browne, posed questions about intelligent weapons and what controls must be placed on them. It is a very stretching agenda. All we know is that issues currently in the list include data trusts, algorithms and consumer experiences. I do not think there will be a shortage of those. Can the Minister explain what the process will be? I gather an overall strategy document will be revealed.

There are some concerns about the balance between advice and regulatory action. I think the plan would be for advice to be offered to government and regulatory action to be taken by existing or other bodies. Could we have confirmation of that? There is a question about the balance between ethics and innovation. Clearly, innovations are difficult to support if they raise big ethical issues too quickly; they often need to be tested over time and analysed. It would be useful if there were a way forward on that. Of course, there is the whole question of how the Government intend to treat public data, its use and value for money, and the extent to which it will be available.

Lastly, the new centre, which I wish extremely well, enters a rather crowded space with the Information Commissioner’s Office, Ofcom and the CMA, all of which have statutory functions in this area, but perhaps I may counsel that also to come are the Alan Turing Institute, which is now up and running, and the Open Data Institute. Therefore, there will be a need for some time for this whole process to settle down and for leadership from the Government on how it will work.

The responses to the consultation showed a clear public wish for consistency and coherence, and I hope that in that process there will be room for consultation. I do not wish the new body to be a proselytiser for data or indeed for artificial intelligence, but there is a difference between proselytising and being in an explanatory mode, reassuring people and explaining to them the benefits as well as the risks of this new technology. The centre needs to be public facing and fully engaged in that process, and I wish it well.

Lord Clement-Jones (LD)

My Lords, I too thank the Minister for repeating the Statement. He was missed in the debate on Monday. I have had the benefit of reading the Government’s response to the consultation on the Centre for Data Ethics and Innovation. I share the enthusiasm for the centre’s creation, as did the Select Committee, and, now, for the clarification of the centre’s role, which will be very important in ensuring public trust in artificial intelligence. I am also enthusiastic about the appointments—described, as the noble Lord, Lord Stevenson, said, as “stellar” in the Government’s own press release. In particular, I congratulate Members of this House and especially the noble Baroness, Lady Rock, and the right reverend Prelate the Bishop of Oxford, who contributed so much to our AI Select Committee. I am sure that both will keep the flame of our conclusions alive. I am delighted that we will also see a full strategy for the centre emerging early next year.

I too have a few questions for the Minister and I suspect that, in view of the number asked by me and by the noble Lord, Lord Stevenson, he will much prefer to write. Essentially, many of them relate to the relations between the very crowded landscape of regulatory bodies and the government departments involved.

Of course, the centre is an interim body. It will eventually be statutory but, as an independent body, where will the accountability lie? To which government department or body will it be accountable? Will it produce its own ethics framework for adoption across a wide range of sectors? Will it advocate such a framework internationally, and through what channels and institutions? Who will advise the Department of Health and Social Care and the NHS on the use of health data in AI applications? Will it be the centre or the ICO, or indeed both? Will the study of bias, which has been announced by the centre, explore the development of audit mechanisms to identify and minimise bias in algorithms?

How will the centre carry out its function of advising the private sector on best practice, such as ethics codes and advisory boards? What links will there be with the Competition and Markets Authority over the question of data monopolies, which I know the Government and the CMA are both conscious of? In their consideration of data trusts, will the government Office for Artificial Intelligence, which I see will be the responsible body, also look at the benefits of and incentives for “hubs of all things”? These are beginning to emerge as a very important way of protecting private data.

What links will there be with other government departments in giving advice on the application of AI and the use of datasets? The noble Lord, Lord Stevenson, referred to lethal autonomous weapons, which emerged as a major issue in our debate on Monday. What kind of regular contact will there be with government departments—in particular, with the Ministry of Defence? One of the big concerns of the Select Committee was: what formal mechanisms for co-ordinating policy and action between the Office for Artificial Intelligence, the AI Council, the Centre for Data Ethics and Innovation and the ICO will there be? That needs to be resolved.

Finally, the centre will have a major role in all the above in its new studies of bias and micro-targeting, and therefore the big question is: will it be adequately resourced? What will its budget be? In the debate on Monday, I said that we need to ensure that we maintain the momentum in developing our national strategy, and this requires government to will the means.

Lord Ashton of Hyde

I am tempted to say that I will write, but I will try to answer some of the questions, and I will write regarding some that I do not get around to. I was in at the beginning of the debate on AI and I listened to the noble Lord’s speech.

Lord Clement-Jones

My Lords, that is all that he needed to listen to.

Lord Ashton of Hyde

Not everyone would agree with that, but I did indeed listen to it. I have read that AI is a joint responsibility with BEIS, and my noble friend Lord Henley coped more than adequately, so I do not think that I really was missed.

There was a great deal of support for this innovation—the centre—both in the response to the consultation and, as the noble Lord, Lord Stevenson, said, in proceedings on the then Data Protection Bill, so I am grateful for that today, but I accept the very reasonable questions. On the centre’s independence as it stands now and its statutory establishment, I say that we have deliberately set this up as an advisory body so that it can consider some of the difficult issues that noble Lords have raised. Policy is the Government’s responsibility, so there should not be any confusion about who is held accountable for policy—and it is not the Centre for Data Ethics and Innovation. When this has been established, when we have seen how it has worked and when we have addressed the questions of the crowded space that both noble Lords mentioned, it is our intention to put this on a statutory basis. Then we will see how it has worked in practice. When it comes to putting it on a statutory basis, I have no doubt that there will be lots of back and forth in Committee and things like that on the exact definitions and its exact role.

There are some differences from the Human Fertilisation and Embryology Authority, although of course that was a particularly successful body. One of the main differences was that a lot of those things were considered in advance of the science, if you like, and before the science was put into place. With AI, it is here and now and operating, so we do not have a chance to sit back, think about it in theory and then come up with legislation or regulation. We are dealing with a moving target, so we want to get things going.

As far as I am aware—I will check and write to the noble Lord, Lord Stevenson—the centre has no specific powers to demand information. That is, of course, something that we can look at when it comes to being on a statutory basis.

I am sorry that the application for membership by the noble Lord, Lord Stevenson, was not accepted. There can be only one reason: he spends so much time on the Front Bench that he would not have time, because we expect the directors to spend two to three days a month attending this, so it is a very large work commitment.

As noble Lords will know, the work plan includes two initial projects, which were announced in last year’s Budget: micro-targeting and algorithm bias. We expect the centre, in discussion with the Secretary of State, to come up with a work plan by spring 2019. As the noble Lord, Lord Stevenson, mentioned, there is a tension, if you like, between ethics and innovation, but we are very keen that it consider both because we have to be aware of the potential for innovation, which is constrained in some cases. We would not want a situation where the opportunities for AI for this country are avoided. As the report by the noble Lord, Lord Clement-Jones, made clear, there are tremendous opportunities in this sector. We are aware of the tension, but it is a good tension for the centre to consider.

Both noble Lords talked about the crowded space in this area. We expect the centre to produce memorandums of understanding to outline how it relates to bodies such as the AI Council, which has a slightly different focus and is more about implementation of the AI sector deal than considering the ethics of artificial intelligence. We understand that they need to work together and expect the centre to come back on that.

The noble Lord, Lord Clement-Jones, asked about accountability. The centre will be accountable to the Secretary of State for the DCMS. That is clear. He will agree its work plan. Of course, in terms of independence, once he has established that work plan, what the centre says will not be up to him, so there is independence there. We included in our response that the Government will be expected to reply within six months, so there is a time limit on that. It will apply to all government departments, not just the DCMS. The Ministry of Defence and the department of health have obvious issues and the centre can provide advice to them as well.

The noble Lord, Lord Clement-Jones, asked whether the centre, when it considers bias, would include audit mechanisms. It absolutely might. It is not really for us to say exactly what the centre will consider. In fact, that would be contrary to its independence, having been given the subject to think about. In our response we said some of the things that might be considered, such as audit mechanisms.

There is an obvious issue about competition, which the House of Lords Select Committee mentioned. Work is going on. The Chancellor commissioned the Furman review to look at that and we expect the centre to come up with a discussion on how it will work with the Competition and Markets Authority, but obviously competition is mainly a matter for that authority.

At the moment, the body is resourced by the DCMS. In the 2017 Budget, it was provided with £9 million in funding over three years. We expect that to be sufficient but, clearly, we will have to provide adequate resources to do an adequate job.

Public Service Television Content

Thursday 25th October 2018

Lords Chamber
Lord Ashton of Hyde

I have outlined that things are moving fast. The consultation finished on 5 October. Ofcom has said it will report at the beginning of 2019. Then, as the noble Lord, Lord Griffiths, alluded to, it is up to the business managers—if Ofcom decides that legislation is necessary; we will have to look at the report. This is a complex area. The new technologies do not make it simple. It is not just like an old, linear EPG. But we understand the urgency and we know that the commercial interests do make it difficult for public service broadcasters. The key is that we support public service broadcasting.

Lord Clement-Jones (LD)

My Lords, we have heard from my noble friend and other noble Lords about the urgent need to change the EPG regulations, but is there not another aspect? The chief executive of Channel 4 has pointed out that there is no regulation at all of so-called smart voice search controls, which are increasingly being introduced by the major television manufacturers. That aspect is barely covered by the Ofcom report. Will the Minister guarantee that it will be covered in any new regulations?

Lord Ashton of Hyde

I accept, as I said before, that this is a complex area. We are talking not only about linear, satellite and aggregator services, but also about TV and video that are just on the internet. As noble Lords will know, as well as looking at the prominence regime, we are looking at online harms generally. We expect to publish a White Paper on that in the winter.

Distributed Ledger Technologies

Wednesday 18th July 2018

Lords Chamber
Lord Ashton of Hyde

The noble Lord is absolutely right. That is a very good example of where this distributed technology could be used, and there are other, similar areas. One of the benefits of this technology, and the fact that it is distributed and everyone has the same copy of the database, is that it builds trust in data, and this is an important area across many departments. I do not know specifically what proofs of concept the Home Office is doing at the moment, but I will certainly take that back to my noble friend the Minister. As I said in my previous answer, there is a cross-governmental officials group and we are currently looking at how best to co-ordinate across government.

Lord Clement-Jones (LD)

My Lords, to take the question from the noble Lord, Lord Harris, a stage further and add to the convivial atmosphere, has not the Government Digital Service fallen behind the times with the development of its Verify digital identity system? It is not regarded as fit for purpose by HMRC, for example. Should we not be creating a single online identity for citizens through distributed ledger technology?

Lord Ashton of Hyde

The first question is whether we should be creating a single digital identity, and I defer to the Home Office on that. If that decision was made, whether distributed ledger technology is the right technology for it is, I think, a secondary question.

Brexit: Media Hubs

Monday 9th July 2018

Lords Chamber
Lord Ashton of Hyde

I am very pleased to move seamlessly from the digital part of my brief to sport, and of course I agree with everything my noble friend said.

Lord Clement-Jones (LD)

My Lords, the Minister has put a brave face on it but is it not a fact that, once the Prime Minister had ruled out membership of the digital single market in her Mansion House speech, the chances of reaching an agreement on country of origin principle with a single UK regulator were nil? Does that not mean that it is a question of when—not if—these broadcasters will move their licences, particularly as the Government can give absolutely no certainty, which is what they need?

Lord Ashton of Hyde

It is a good thing that the noble Lord is not in charge of our negotiations if he goes in with that attitude. As I tried to point out, there are good reasons for us to continue with a bespoke deal that is to our mutual advantage. I pointed out the fact that our regulation is widely supported around the EU. He asked for certainty; of course there is not 100% certainty, but you never go into a negotiation with that. As we have said, we are preparing a contingency position, just in case the country of origin principle or equivalent is not negotiated.