Digital Understanding

Baroness Jones of Moulsecoomb Excerpts
Thursday 7th September 2017


Lords Chamber
Baroness Jones of Moulsecoomb (GP)

I too congratulate the noble Baroness, Lady Lane-Fox, on introducing this debate. She has already forced somebody with few digital skills into a little bit of digital understanding, and I thank her very much for that. It is a pleasure to follow the noble Lord, Lord Berkeley. He put this issue into the context of music and I shall put it into the context of policing. It was a delight that the noble Baroness mentioned climate change, but I am going to avoid that topic today and talk about policing.

A high level of digital understanding is obviously important for the police; it will be essential in fighting crime. The problem is that the rapid pace of technological change leaves many unknown unknowns: for example, the policing issues that might arise from driverless cars or quantum computers. As new crimes emerge, such as cyberbullying and phishing, the police need the technological skills and support to face these 21st-century crimes. As we navigate these challenges, we must also maintain a constant focus on protecting civil liberties while encouraging and facilitating innovation.

Digital crime differs greatly from traditional crime, because most digital crime can be committed from the comfort of the perpetrator's own home, and the actions of a computer-savvy criminal can rapidly affect thousands of people. The ransomware attack on the NHS in May showed the devastating effect that cybercrime can have on core public services. To meet these challenges, all police officers and police staff need the knowledge and skills to use digital technology and to be aware of emerging trends. Police leaders must have a deep understanding of the developing issues, and the vision for a new strategy to seize the initiative on these new crimes.

I want to talk about big data, which the police use a lot. That means drawing huge amounts of data from diverse sources, assessing their accuracy and reliability, and then making critical analyses—and sometimes difficult decisions based on what has been learned. This is an important issue, as it has wide-ranging implications for civil liberties and discrimination within society. It offers opportunities for the police to add data-driven insights to their traditional policing expertise. Complex algorithms can make useful predictions from a range of data as diverse as historical crime data, location of cashpoints, census data, football results, weather patterns and temperature changes.

The opportunity is that big data models can give deeper insight into the trends that affect crime and allow police to direct scarce resources better. Often this can make policing easier, but sometimes IT goes badly wrong, and I shall give your Lordships an example of that. Last month, London's Met police used controversial, inaccurate and largely unregulated automated facial recognition technology to spot troublemakers at the Notting Hill Carnival. This is the second year running that it has trialled it, and once again it did more harm than good. Last year it actually proved useless, so that was okay, but this year it proved worse than useless, with 35 false matches and one wrongful arrest of someone erroneously tagged as wanted on a warrant for a rioting offence. Silkie Carlo, the technology policy officer for the civil rights group Liberty, saw the technology in action and, in a blog post, described the system as showing,

“all the hallmarks of the very basic pitfalls technologists have warned of for years—policing led by low-quality data and low-quality algorithms”.

Yet, in spite of its lack of success, the Met's project leads viewed the weekend not as a failure but as a resounding success. It had come up with one solitary successful match, and even that was undermined by sloppy record-keeping that got an individual wrongly arrested. The automated facial recognition was accurate, but the person had already been processed by the justice system and was erroneously included on a suspect database. So often it comes back to basic record-keeping, not to the technology that is supposed to make things easier.

I see two particular problems for the police force: understanding what digital products are available, and having the judgment to know which of them are appropriate to use.