Wednesday 10th March 2021

Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record

11:00
Darren Jones (Bristol North West) (Lab)

I beg to move,

That this House has considered the legal status of automatic computer-based decisions.

It is a pleasure to serve under your chairmanship, Mr Hollobone. I should apologise to a number of viewers of the debate for advertising its start time as 10.30 am, as opposed to 11 am. I would like to blame Microsoft Outlook, but in fact it was entirely my own fault.

I should also declare my interests. I am the chair of the Institute of Artificial Intelligence, which brings together legislators from around the world to discuss the implications of regulation of artificial intelligence. As Chair of the Business, Energy and Industrial Strategy Committee, I call attention to our currently suspended but still live inquiry into the Post Office Horizon scandal, which I shall refer to today. Lastly, I draw attention to the fact that I used to be employed as a solicitor with a law firm now called Womble Bond Dickinson, which represented the Post Office on the Horizon issue, but confirm that I did not personally act for the Post Office on that issue. I should also thank Paul Marshall, a barrister at Cornerstone Barristers, whose note to me has underpinned much of my contribution today, and Stephen Mason, a research fellow at the University of London’s Institute for Advanced Legal Studies, and his colleagues for their analysis.

The Minister knows that I come to the debate as a technology evangelist—someone who advocates harnessing the potential of technology to modernise our economy and our public services. There is, of course, a much wider debate about the need to update our laws and regulations, and indeed how we run government, due to the technological revolution, but today I will focus on the specific and important issue of how the law of evidence applies to automated computer-based decisions. That has wide-ranging implications across the public and private sector.

The case of the Post Office’s failed private prosecution of more than 1,000 sub-postmasters for accounting errors created by a computer system and not by the sub-postmasters resulted in what looks to be one of the largest miscarriages of justice in our country—a tragedy for our justice system, but also a personal tragedy for the sub-postmasters involved and their families, all stemming from a computer system. I understand that one victim of the Post Office Horizon scandal pleaded guilty to false accounting merely because she had been overwhelmed by the errors and because she could not face the prospect of a jury trial for theft, which was being threatened by the Post Office at the time. Another was wrongly imprisoned when eight weeks pregnant. I stress this point because computer-based decisions can lead not only to not getting a credit card but to untold human suffering in the face of miscarriages of justice. That is just one example.

As the Minister knows, automated computer-based decision making is more and more widespread with every passing year. The problem is that computer systems are not continuously reliable; latent errors can occur frequently. Achieving reliability in a computer system in the first place is difficult; it is even harder to assess and assure that reliability on an ongoing basis. Indeed, artificial intelligence, in its capacity as a general-purpose technology across every aspect of our economy, means that the number of decisions coming from software-based systems will increase and increase, both in the private sector and in the delivery of public services. The Minister, of course, understands that. The Centre for Data Ethics and Innovation, which is part of his Department, published a report only last year on algorithmic decision making, which concluded:

“We must ensure decisions can be scrutinised, explained and challenged so that our current laws and frameworks do not lose effectiveness, and indeed can be made more effective over time.”

However, the fact is that, as highlighted in the judgments of Mr Justice Fraser in the Post Office Horizon case of Bates v. Post Office, our laws are dramatically out of date. This is evidenced by the dates of the legislation that applies to criminal cases, such as the Youth Justice and Criminal Evidence Act 1999. We all recognise, of course, that technology has moved on a great deal since then. That Act repealed section 69(1) of the Police and Criminal Evidence Act 1984, which until its repeal meant that computer-derived documents could not be used as evidence unless it could be shown that, at all material times, the computer was operating properly. At the time, the Law Commission recommended the repeal because of concerns that it was increasingly difficult to meet that threshold—in other words, in 1999 it was increasingly difficult to show that a computer was operating properly at all material times.

The change in the law left an absence of formal statutory guidance, with the result that the courts applied to computers the presumption, developed for traditional machines, that they were functioning properly. In practice, that means that a party can rely on the presumption that a computer was operating reliably at all material times—that is to say, that the computer was always right. It is for the objector—in the Post Office case, the sub-postmaster—to prove that the computer was not operating reliably. This is perhaps an obvious point, but in my view that results in an unacceptable imbalance of power.

The owners of computer-based decisions are usually big companies or the state, and an advanced computer system is not the same as a factory-floor machine. In contrast, the objectors are employees, customers or citizens who have no real prospect of being able to prove that a computer system owned by a company or the Government was not operating reliably. That has very wide-ranging implications.

If people found it difficult to prove a computer was operating reliably in the early 1990s, we can only imagine how difficult it might be to do that today, not least when machine-learning algorithms come to conclusions for reasons even the computer programmer does not understand. If the Post Office had been required to prove that its computer system was operating reliably, it would not have been able to do so, because we now know that it was not, and sub-postmasters would not have been wrongly imprisoned. The legal presumption that a computer is always right is therefore unsafe and liable to cause significant harm and injustice.

I am not suggesting a return to the pre-1999 approach, but we need to find a new way to manage the risk and update our laws appropriately. As the Centre for Data Ethics and Innovation said in its algorithmic decision-making report,

“we have a window of opportunity to get this right and ensure that these changes serve to promote equality, not to entrench existing biases.”

That is important, because if we are to harness the full potential of technology in our economy, the public need to have confidence in the way in which it is being used, and that there are appropriate rights of redress for those who fall foul of it.

I understand that the Under-Secretary of State for Justice, the hon. Member for Cheltenham (Alex Chalk), has also been engaging with this issue and has referred the matter to the Lord Chief Justice, Lord Burnett of Maldon, in his capacity as chair of the Criminal Procedure Rule Committee. The reason I asked for a debate with the Digital team is my belief that this is broader than an issue merely for the Criminal Procedure Rule Committee. I am therefore calling on the Minister to use the powers of the Government Department responsible for digital and technology issues to refer this matter to the Law Commission for formal consideration. I look forward to hearing his response.

11:08
The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Matt Warman)

I congratulate the Chair of the Select Committee on securing this important debate. He is absolutely right to say that the potential of technology to enhance the decision-making process, in the public sector just as much as the private sector, is something that this Government are absolutely committed not only to getting the maximum out of, but to getting right. He is also absolutely right to highlight that legislation from decades ago is perhaps not 100% where we would wish it to be.

First, let me say that I share the concerns raised by him and other Members about the specific example he has raised and the treatment of postmasters, who are vital members of the community, in this whole affair. I also acknowledge that it highlights essential legal issues. I will address those shortly, although I should perhaps start by saying that he has been comprehensive in his own circumnavigation of the issues at hand.

On Horizon, the Government recognise that the dispute has had a hugely damaging effect on the lives of the affected postmasters and their families. Its repercussions are still being felt today. Over the years, the Horizon accounting system recorded shortfalls in cash in branches. At the time, the Post Office believed that those shortfalls were caused by postmasters, leading to dismissals, recovery of losses and, in some cases, criminal prosecution. Many hon. Members, me included, have listened to the stories of the postmasters affected and have been deeply moved by the impact on their livelihoods, their finances and often their health.

A group of 555 of those postmasters, led by former postmaster Alan Bates, brought a group litigation claim against the Post Office in 2017. In the findings of Mr Justice Fraser, it is clear just how wrong the Post Office was in its relationship with postmasters and that there were clear failings in the Horizon system. As I will explain, the Government are taking steps through an independent inquiry to ensure that lessons are learned and that a full analysis takes place.

The Post Office reached a full and final settlement with the group litigation claimants in December 2019 and apologised for its failings. That settlement was an important step towards addressing the wrongs of the past, but it was only the start of a long journey for the Post Office to repair and strengthen its relationship with postmasters.

As part of the settlement, the Post Office agreed to set up the historical shortfall scheme, open to current and former postmasters who may have experienced and repaid Horizon shortfalls but did not participate in the group litigation. That is an important step in ensuring that all those who were affected have the opportunity to seek resolution.

A number of postmasters with criminal convictions have applied to the Criminal Cases Review Commission to have their cases referred for appeal. To date, the commission has referred 51 cases either to the Court of Appeal or to the Crown court. The Government welcome the decision made by the Crown court in December 2020 to overturn six of those convictions.

However, a number of cases—42 in total—are still to be heard in the relevant Appeal Court at the end of March. It would not be appropriate for the Government to comment on those cases while the courts are still considering them, but I assure hon. Members that the Post Office is co-operating with the commission to the fullest extent.

More broadly, we must ensure that such a situation can never be allowed to occur again. In September 2020, therefore, the Government launched the Post Office Horizon IT inquiry, an independent inquiry led by Sir Wyn Williams. Sir Wyn’s inquiry will work to understand fully what happened, gather available evidence and ensure that lessons have been learned so that this cannot occur again. The inquiry will look specifically at whether the historical shortfall scheme is being delivered properly. The Government look forward to receiving that report in the summer.

In recent years, however, a lot has changed on standards and ethics relating to the management of algorithms and data in general. The hon. Member for Bristol North West (Darren Jones) rightly pointed out the work of the Centre for Data Ethics and Innovation. Crucially, that centre has not only “data ethics” but “innovation” in its title—those two things go hand in hand.

The centre was established by my Department in 2017, but that is not the only area in which we have implemented change. Substantial steps have been taken to consider and address deficiencies in the application of algorithms where that lies within the remit of the DCMS and, crucially, beyond. I am confident that we are in a much stronger position than when the worst excesses of the Horizon affair took place, but there is more work to do.

If an automated decision is based on personal data, the UK general data protection regulation already applies. It provides regulatory tools to safeguard data subjects, that is, identified or identifiable persons, in automated decision making. Organisations processing personal data must also adhere to strong transparency requirements. Organisations, including public authorities, should ensure that the algorithms they deploy and procure, where based on personal data, generate sound and impartial decisions, and should consider that before such algorithms are used.

The UK GDPR contains provisions for protecting the interests of data subjects and their data. In particular, data protection impact assessments are mandatory for data processing that is high risk and require organisations to weigh up the impacts on privacy of data processing activities, including automated decision making.

In addition, the Government have introduced non-legislative tools that will be important as we move towards a world where not just algorithms but the ability of computers to amend algorithms—artificial intelligence—becomes more commonplace. Let me run through some of them. We were the first Government to publish a data ethics framework, which is a set of principles to guide the design of appropriate data use in the public sector, aimed at anyone working with data there. We published an ethics, transparency and accountability framework for automated decision making, and we have commissioned the Government Digital Service to deliver the review of artificial intelligence adoption in the public sector. We have also published an AI guide for Government.

We have also published guidelines on AI procurement, in collaboration with the World Economic Forum’s Centre for the Fourth Industrial Revolution, which will inform and empower buyers in the public sector, helping them to evaluate suppliers and then confidently and responsibly procure the right AI technologies for the benefit of citizens. Along with the Information Commissioner’s Office and the Alan Turing Institute, we have also published “Explaining decisions made with AI”. This guidance gives organisations practical advice to help them explain the processes, services and decisions delivered or assisted by AI to the individuals affected by them. That is a crucial action that the hon. Member for Bristol North West mentioned.

Those various documents are updated with new thinking and insight from our public sector, civil society, industry and academic partners. We have also launched the new AI dynamic purchasing system, which is a framework that offers public sector customers a direct route to AI services in an emerging market, addressing ethical considerations when organisations buy AI services for use in the public sector.

The new and independent Regulatory Horizons Council has been appointed to scan the horizons for new technological innovations and provide the Government with impartial, expert advice on the regulatory reform required to support their rapid and safe introduction. More broadly, the Government are always monitoring how algorithms and data affect people’s lives. As they grow in importance in all our lives, we will consider what more we can do. That is why we are active in the international debates on algorithm and artificial intelligence regulations at the Council of Europe and, beyond that, at the OECD and in the Global Partnership on Artificial Intelligence.

The hon. Gentleman specifically asked whether the status of algorithms in the courts might be referred to the Law Commission, especially given the role played by the commission in first adjusting the Police and Criminal Evidence Act 1984 on this topic. It is a suggestion worth very serious consideration, and my colleagues in the Ministry of Justice and I are grateful for it. He will know that it is not in the Law Commission’s current three-year plan of work, and it will take considerable time to establish the necessary work in order to address the underlying legal issue.

While we consider that route, the Government are also investigating whether there may be faster methods that we can use to address the legal status of algorithms in a court of law—the hon. Gentleman mentioned that himself. For example, once the Court of Appeal has made a determination in respect of the Criminal Cases Review Commission referrals, the judiciary-led Criminal Procedure Rule Committee could consider making changes in this area. The courts are expected to make their determination shortly, after which I look forward to taking up the matter with the Ministry of Justice and the Lord Chief Justice, the chair of that committee.

To close, I thank you, Mr Hollobone, and the hon. Gentleman. This is the beginning of the next phase in an ongoing debate. It is a hugely important issue, and seizing these opportunities for the benefit of citizens and everyone around the world is in all our interests. It will be a complex and evolving conversation, and I look forward to having more conversations with the hon. Gentleman.

Mr Philip Hollobone (in the Chair)

I am afraid that the hon. Member does not have the right of reply in half-hour debates. I know it is confusing and I am sorry to be the bearer of bad news, but we enjoyed his initial contribution.

Question put and agreed to.

11:19
Sitting suspended.