Public Authorities (Fraud, Error and Recovery) Bill

Debate between Baroness Bennett of Manor Castle and Lord Vaux of Harrowden
Baroness Bennett of Manor Castle Portrait Baroness Bennett of Manor Castle (GP)

My Lords, I rise briefly to support the amendments so powerfully, and with considerable detail, explained by the noble Baroness, Lady Fox of Buckley. I want to cross-reference a couple of things. I was unable to be here for the whole discussion on the last group in this Committee but I came in and heard the Minister reassuring us that there are layers of support in the DWP for identifying the vulnerable and that there is regular vulnerability training.

I have to contrast that with one of my last contributions in this Committee and this Room, talking about the horrendous case of Nicola Green. I try to share as much as I can of what I am doing in the Chamber so that it is available to the world. I have to say that the little parliamentary video of that exchange, with its less-than-ideal lighting—no offence to anyone who is doing the best they can with the television—has, you could reasonably say, gone viral, because there is a flood of comments from people saying what the DWP has done to them. I cannot attest, of course, to the truth of every one of those comments, but there is a profound problem of trust with the DWP.

I fully acknowledge that the Minister, when she was on the Opposition Benches, and I have often spoken out strongly on this matter. The Government actually called an inquiry into the DWP’s treatment of disabled people after the EHRC expressed concern that equality law had been breached. That is the context in which we are looking at these amendments.

The noble Baroness is calling for people to have a day in court—to be able to have a genuinely independent voice in our greatly respected courts and put the case. Whether they have indeed committed fraud and can afford the repayments, whether it is in fact entirely an error by the DWP, or whether the DWP is at fault or is not being realistic about how much people need to eat and live, the court will make a ruling. That, surely, is regarded as a basic principle and right in our law.

Lord Vaux of Harrowden Portrait Lord Vaux of Harrowden (CB)

My Lords, I will speak briefly to Amendments 102 and 122, which would require the Secretary of State to apply to the court for a direct deduction order—a DDO. I confess that I am struggling a bit to understand the circumstances in which the Secretary of State would be able to make a direct deduction order, as the Bill is drafted. I hope the Minister will be able to help me.

When we discussed the DDOs in relation to Part 1 of the Bill, the noble Baroness, Lady Anderson, correctly pointed out that a direct deduction order could be made only in circumstances where either there had been a final determination of the amount of the liability by a court or the person concerned had agreed that the amount was payable. I agreed then that that was an important safeguard, as it is a significant restriction on when the DDO process could be used under Part 1. I asked why, if the court was making the determination of liability, we did not just leave the court to determine the way in which it should be repaid, rather than requiring new powers for the Minister to make that decision. The noble Baroness was kind enough to offer to write to me on that, and I very much look forward to receiving her letter.

However, I think the same issue may arise here, except that I am struggling to find the definition of the amount recoverable described in paragraph 1(1) of new Schedule 3ZA, inserted by Schedule 5 to the Bill. Can the Minister please explain how the amount recoverable is determined, and by whom? Does this part have the same safeguard as Part 1, which is either final court determination or agreement by the person concerned, or is it at the discretion of the Secretary of State? I can see, in Clause 89, that the person must have been convicted of an offence or agreed to pay a penalty. That raises the question: does this DDO regime apply in cases of error, or not? Presumably, in cases of error there will not be a conviction or a penalty, so it does not apply in the case of error, but I am confused.

I cannot find anywhere the amount being determined by a court; that is where I am struggling a bit. If the recoverable amount has not been decided by the court, then the amendment in the name of the noble Baroness, Lady Fox, is likely to be necessary. That is particularly important because, just as it does in Part 1, for understandable reasons, the appeal process to the First-tier Tribunal against a DDO prevents a person appealing with respect to the amount that is recoverable. If that is the case, and the amount recoverable has not been determined by a court, I think there is an issue here.

Lord Vaux of Harrowden Portrait Lord Vaux of Harrowden (CB)

My Lords, we have been debating Part 1, which gives substantial powers to the Cabinet Office when the Minister has reasonable grounds to suspect fraud, and we are about to kick off on Part 2, which gives substantial powers to the DWP. Those include police-style powers to enter private premises, search them and seize property, as well as powers to demand information. Those are potentially very intrusive powers, so it is essential that they can be exercised only when it is genuinely appropriate to do so.

The two amendments in this group cover both Parts 1 and 2, and they provide essential clarification as to how the DWP and PSFA should interpret the legal threshold for most of the investigative powers in the Bill, which is the requirement to have “reasonable grounds” of suspicion of fraud.

The amendments are intended to ensure that, when the DWP and PSFA are exercising their investigative powers under this Bill, reasonable grounds do not include generalisations or stereotypes of certain categories of people—for example, that members of a particular social group are more likely to be involved in fraudulent activity than others. Investment in data analytics and other emerging technologies, such as AI, for fraud risk detection is inevitably, and probably rightly, increasing. The Government have signalled their intention to turbocharge AI and to mainline AI into the veins of the nation, including the public sector.

The Government are, as we speak, trying to pass the Data (Use and Access) Bill, which would repeal the current ban on automated decision-making and profiling of individuals. The DWP has invested heavily in artificial intelligence, widening its scope last year to include use of a machine-learning tool to identify fraud in universal credit advances applications, and it intends to develop further models. This is despite a warning from the Auditor-General in 2023 of

“an inherent risk that the algorithms are biased towards selecting claims for review from certain vulnerable people or groups with protected characteristics”.

The DWP admitted that its

“ability to test for unfair impacts across protected characteristics is currently limited”.

There are real concerns about the inaccuracy of algorithms, particularly when such inaccuracy is discriminatory—when mistakes disproportionately impact a certain group of people. It is well evidenced that machine-learning algorithms can learn to discriminate in a way that no democratic society would wish to incorporate into any reasonable decision-making process about individuals. An internal DWP fairness analysis of the universal credit payments algorithm, which was published only due to a freedom of information request, revealed a “statistically significant outcome disparity” according to people’s age, disability, marital status and nationality.

This is not just a theoretical concern. Recent real-life experiences in both the Netherlands and Sweden should provide a real warning for us, and are clear evidence that we must have robust safeguards in place. Machine-learning algorithms used in the Netherlands’ child tax credit scandal learned to profile those with dual nationality and low income as being suspects for fraud. From 2015 to 2019, the authorities penalised families over suspicion of fraud based on the system’s risk indicators. Tens of thousands of families, often with lower incomes or belonging to ethnic minorities, were pushed into poverty. Some victims committed suicide. More than a thousand children were taken into foster care. The scandal ultimately led to the resignation of the then Prime Minister, Mark Rutte.

In Sweden in 2024, an investigation found that the machine-learning system used by the country’s social insurance agency is disproportionately flagging certain groups for further investigation over social benefits fraud, including women, individuals with foreign backgrounds, low-income earners and people without university degrees. Once cases are flagged, fraud investigators have the power to trawl through a person’s social media accounts, obtain data from institutions and even interview an individual’s neighbours as part of their investigations.

The two amendments that I have tabled are based on paragraph 2.2 of Code A to the Police and Criminal Evidence Act 1984, in relation to police stop and search powers, which states that:

“Reasonable suspicion cannot be based on generalisations or stereotypical images of certain groups or categories of people as more likely to be involved in criminal activity”.


These amendments would not reduce the ability of departments to go after fraud. Indeed, I argue that, by ensuring that reasonable suspicion is genuine rather than based on stereotypes, they should improve the targeting of investigations and therefore make them more effective, not less so.

The Bill extends substantial intrusive powers to the Cabinet Office, the PSFA and the DWP, and those powers must be subject to robust safeguards in the Bill. The use of “generalisations or stereotypes”, whether through automated systems or otherwise, should never be seen as grounds for reasonable suspicion. I hope the Minister will see the need for these safeguards in that context, just as they are needed and exist in relation to stop and search powers. I beg to move.

Baroness Bennett of Manor Castle Portrait Baroness Bennett of Manor Castle (GP)

My Lords, it is a pleasure to follow the noble Lord, Lord Vaux of Harrowden, and to speak in favour of Amendments 75A and 79A, to which I have attached my name and which noble Lords will see have rather broad support in political terms—perhaps not the broadest I have ever seen but it is certainly up there. I must also pay tribute to Justice, a cross-party law reform and human rights organisation that is the UK section of the International Commission of Jurists, which has been most determined in ensuring that these issues are raised in this Bill, in this context.

I have already addressed these issues in the Chamber in a number of amendments to the Employment Rights Bill that I tabled and spoke to. I am not going to repeat all that I said there, but I cross-reference those amendments. If noble Lords want to find out more about this issue, there is an excellent book by the researcher Shannon Vallor, The AI Mirror, which offers a useful metaphor: we live in a biased society, and those biases risk being reflected back at us and magnified by the use of artificial intelligence and algorithms. That is very much what these two amendments seek to address.

The noble Lord has already given us two international examples of where using AI, algorithms, stereotypes and generalisations in investigations has gone horribly wrong. I have to add a third example: the infamous Australian case of “Robodebt”. That was an automated debt assessment and recovery programme run by the rough equivalent of the DWP. There was controversy before and throughout its implementation, and it was an unmitigated disaster. I point the Minister and others to the fact that there was a Royal Commission in Australia which said the programme had been

“a costly failure of public administration in both human and economic terms”.

I note that the House of Representatives in Australia passed a public apology to the huge number of people who were affected.

In a way, I argue that these amendments are a protection for the Government: written into law, they provide a stop that says, “No, we cannot allow things to run out of control in the way we have seen in so many international examples”. I think these are truly important amendments. I hope we might hear positive things from the Minister but, if not, we are going to have to keep pursuing these issues, right across the spectrum. I was very taken by the noble Lord, Lord Vaux—Hansard will not record the tone of voice in which he said that the Government wish “to mainline AI”—but it is important to note the concerning approach being taken by the Government to the whole issue of artificial so-called intelligence.