Biometric Recognition Technologies in Schools



Lord Strasburger Excerpts
Thursday 4th November 2021


Lords Chamber
Lord Strasburger (LD) [V]:

My Lords, I thank my noble friend Lord Clement-Jones for securing this important debate on a topic that has shocked the public and caused widespread concern and alarm. I also declare my interest as chair of Big Brother Watch.

It is hard to know where to start on the use of facial recognition technology to administer something as mundane as payment for school meals. Deploying airport-style security methods to ensure that a hungry child is paying for their lunch is such obvious overkill that it would be funny—if the implications were not so serious. As Fraser Sampson, the biometrics commissioner for England and Wales, said, just because schools can use the technology does not mean that they should. There are plenty of less intrusive and less risky ways to do the same task that are already in use in many schools.

Introducing facial recognition technology brings schools into the realm of data protection law, under which any processing must be lawful, transparent and fair. This means that a school would need to consider, in a structured analysis, whether the use of such technology is a proportionate measure to achieve its aims, or whether the interference with the child’s rights is of a level that renders the use of the technology unacceptable. I can only assume that, in the case of the schools that have adopted this technology, this analysis was not done, or was not done properly, because the answer is so obviously that it is not proportionate.

That is particularly the case when we remember that the GDPR stresses that children merit special protection when it comes to their data. By law, children have the right to refuse to participate in the use of intrusive technologies, and their wishes override those of their parents. Where a child refuses, the school must put in place reasonable alternatives, which would presumably negate the claimed efficiency benefits of the new system.

I should also point out that the facial recognition systems being installed in schools reportedly cost £12,000, plus £3,000 a year thereafter. Would that money not be better spent on free school meals in the holidays, which the Government seem to have so much trouble funding?

I also have a wider concern. The use of biometric systems to police something as trivial as payment for school meals is training our children to accept that their private data is not theirs to be kept private and protected. As Silkie Carlo, director of Big Brother Watch, says:

“We are supposed to live in a democracy, not a security state. This is highly sensitive, personal data that children should be taught to protect, not to give away on a whim … there are some red flags here for us.”

The data protection principles that my noble friend Lord Clement-Jones has spoken of—consent, proportionality and safeguards around data storage and sharing—all derive from the GDPR, which is broadly incorporated into UK law through the Data Protection Act 2018. Now that we have left the EU, the Government are seeking to overhaul our data protection framework and water down citizens’ rights, encouraging institutions and businesses to use AI tools such as facial recognition and personal data such as facial images, with substandard protections compared with those of our neighbours. They even want to do away with the Biometrics and Surveillance Camera Commissioner, who oversees the uses of this technology. So my first question to the Minister is: will it be easier or harder for schools or data-gathering companies to take children’s sensitive biometric data under the Government’s forthcoming attack on UK GDPR?

My noble friend Lord Clement-Jones also referenced the police’s use of live facial recognition, which has been going on for five years now with Home Office funding and the Mayor of London’s blessing, despite there being no explicit legal basis and no parliamentary scrutiny. In addition, there has been a judgment in the challenge brought by the Liberal Democrat councillor Dr Ed Bridges, finding that South Wales Police’s use of live facial recognition had been unlawful because appropriate safeguards were not in place. Another factor was the well-documented problems with the technology’s race and sex bias, which have not been appropriately explored and mitigated.

Here is another area where the Government’s reckless attitude to new technologies, rights and liberties has impacted on the rights of children. Civil liberties group Big Brother Watch, which I chair, observed a Metropolitan Police trial of live facial recognition that resulted in an innocent 14 year-old black schoolboy, walking home in his school uniform, being accosted by four plain-clothes police officers. He was pulled into a side street, held up against a wall and asked for his ID, fingerprints and phone. Of course, it was another case of mistaken identity, as is the case in 93% of all so-called matches generated by the Metropolitan Police’s facial recognition systems. This unforgivable incident could easily traumatise a child.

This dangerously authoritarian technology diminishes trust in the police and other public authorities at a time when it is already very low, and it makes Britain less of a free country to live in. So my second and final question to the Minister is: will the Government bring forward legislation to impose an urgent moratorium on public authorities’ use of live facial recognition technology in order to give Parliament an opportunity to properly assess it before any further harm is done?