Advanced Artificial Intelligence

Monday 24th July 2023

Lords Chamber
Lord Anderson of Ipswich (CB)

My Lords, machine learning models, most famously AlphaFold, have a well-known role in the discovery of useful drugs. Drugs need to be safe, so open-source toxicity datasets are used to screen new molecules and discard those which are predicted to be toxic—a justly celebrated benefit of artificial intelligence.
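
To make that screening step concrete, the following is a minimal sketch of the benign pipeline just described: a classifier trained on an open-source toxicity dataset is used to discard candidate molecules predicted to be toxic. It assumes the RDKit and scikit-learn libraries; the SMILES strings and toxicity labels are invented placeholders rather than real chemistry.

```python
# Minimal sketch of toxicity screening in a drug-discovery pipeline:
# train a classifier on a labelled toxicity dataset, then discard
# candidate molecules the model predicts to be toxic.
from rdkit import Chem
from rdkit.Chem import AllChem
from sklearn.ensemble import RandomForestClassifier

def fingerprint(smiles: str) -> list:
    """Convert a SMILES string to a 2048-bit Morgan fingerprint."""
    mol = Chem.MolFromSmiles(smiles)
    return list(AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048))

# Placeholder stand-in for an open-source toxicity dataset:
# SMILES strings labelled 1 = toxic, 0 = non-toxic. Labels are invented
# for illustration only.
train_smiles = ["CCO", "c1ccccc1", "CC(=O)O", "CCN(CC)CC"]
train_labels = [0, 1, 0, 1]

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit([fingerprint(s) for s in train_smiles], train_labels)

# The screening step itself: keep only candidates predicted non-toxic.
candidates = ["CCCC", "CCOC(=O)C"]
safe = [s for s in candidates if model.predict([fingerprint(s)])[0] == 0]
print("retained for further development:", safe)
```

In practice one would train on a genuine public toxicity dataset such as Tox21 and validate the classifier before relying on its predictions to discard candidates.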

On a darker note, suppose that a bad actor wishes to create a new nerve agent. They could take an open-source generative model and set it to work with the same toxicity dataset, but with the instruction to seek out, rather than avoid, molecular structures predicted to be toxic. There will be false positives, and any molecule so identified would still have to be synthesised, but it is now feasible to find multiple previously unknown chemical warfare agents with little more than a computer and an internet connection, as shown in a recent paper published after some agonising, and as footnoted in last month's thought-provoking Blair/Hague report.

Crimes, as well as threats to national security, can be facilitated by AI techniques. Take high-value spear phishing, historically a labour-intensive enterprise. The diffusion of efficient and scalable AI systems will allow more actors to carry out such attacks, at a higher rate and volume, on targets who can be researched through data-extraction attacks or social-media scraping, and who can be more cunningly deceived with the help of speech-synthesis systems and fake images. Similar disinformation techniques will no doubt be used by others to diminish our capacity to know what is real, and thus to threaten our democracy.

Democracy is not a suicide pact; accordingly, those who protect us from serious crime and threats to our security must themselves be able to use AI, subject to legal constraints founded on civil liberties and set by Parliament. My independent review of the Investigatory Powers Act, concentrating particularly on the work of the UK’s intelligence community, UKIC, was presented to the Prime Minister in April and quietly published last month. As part of this most timely debate, which I congratulate my noble friend Lord Ravensdale on securing, I will summarise three of its conclusions.

First, as one would hope, UKIC makes use of AI. It underlies existing capabilities such as cyber defence against malicious actors and the child abuse image database. UKIC has for many years employed machine learning automation techniques such as image-to-text conversion, language translation, audio processing and the use of classifiers to pick information of interest out of huge datasets. Models can be trained on labelled content to detect imagery of national security concern, such as weapons, allowing the work of human analysts to be focused on the most promising images. Other techniques of significant potential value include speech-to-text and speaker identification.
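
As an illustration of the analyst-triage pattern described here, the following is a hedged sketch in which a classifier scores incoming images and ranks them so that human analysts review the most promising first. It assumes PyTorch and torchvision; the single-logit ResNet head and the image file names are hypothetical placeholders standing in for an operationally trained model and real data.

```python
# Sketch of classifier-assisted triage: score images with a trained
# model, then rank them so analysts see the highest-scoring ones first.
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Hypothetical model: a ResNet-18 whose head has been replaced with a
# single logit and fine-tuned elsewhere on labelled imagery of concern.
model = models.resnet18(weights=None)
model.fc = nn.Linear(model.fc.in_features, 1)
model.eval()

def score(path: str) -> float:
    """Return the model's probability that an image is of interest."""
    x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        return torch.sigmoid(model(x)).item()

# Triage step: rank a batch so the most promising images come first.
paths = ["img_001.jpg", "img_002.jpg", "img_003.jpg"]  # hypothetical files
for p in sorted(paths, key=score, reverse=True):
    print(f"{score(p):.3f}  {p}")
```

The point of the design is precisely the one made above: the model does not replace the analyst, it orders the queue so that scarce human attention goes to the images most likely to matter.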

Secondly, UKIC itself, and those entrusted with its oversight, are alert to the ethical dilemmas. The Technology Advisory Panel of the Investigatory Powers Commissioner's Office (IPCO), a body recommended in my bulk powers review of 2016 and ably led by the computer scientist Professor Dame Muffy Calder, is there to guide the senior judicial commissioners who, quite rightly, have the final say on the issue of warrants. The research report from the Centre for Emerging Technology and Security (CETaS) published in May, Privacy Intrusion and National Security in the Age of AI, sets out the factors that could determine the intrusiveness of automated analytic methods. Over the coming years, the focus on how bulk data is acquired and retained may further evolve, under the influence of bulk analytics and AI, towards a focus on how it is used. Perhaps the Information Commissioner's Office, which already oversees the National Crime Agency's use of bulk datasets, will have a role.

Thirdly, in a world where everybody is using open-source datasets to train large language models, UKIC is uniquely constrained by Part 7 of the Investigatory Powers Act 2016. I found that these constraints—designed with different uses in mind, and comparable to the safeguards on far more intrusive powers such as the bulk interception of communications—impinge in certain important contexts on UKIC’s agility, on its co-operation with commercial partners, on its ability to recruit and retain data scientists, and, ultimately, on its effectiveness. My conclusion was that a lighter-touch regime should be applied, with the consent of a judicial commissioner, to certain categories of dataset in respect of which there is a low or no expectation of privacy. That would require a Bill to amend the IPA. I do not always welcome Home Office Bills, but I hope this one will come sooner rather than later.