Cyber-troop Activity: UK

Owen Thompson Excerpts
Tuesday 9th March 2021

Westminster Hall

Owen Thompson (Midlothian) (SNP)

I beg to move,

That this House has considered cyber troop activity in the UK.

It is a pleasure to serve under your chairmanship, Sir Charles. I secured this debate because I feel that we cannot go into another round of elections in May with our heads in the sand about a very real threat to our democracy: industrialised disinformation by state and political actors. Across political divides we must stand against the forces that seek to smear, manipulate, speak untruths and undermine the legitimacy of Governments or political opponents, through underhand and under-regulated techniques.

Politics, by its nature, will always host opposing and differing views, and that is absolutely right, as is the opportunity to debate the points around these views, but it is incumbent on all of us to ensure that the public can have confidence in the information that they see presented from politicians and those who report on political events. The old adage that a lie can get halfway around the world before the truth gets its boots on certainly applies here, but at a whole new industrialised level, with mass distribution only ever a mouse click away.

Social media has been both a blessing and a curse. It is, in theory, a great leveller, providing an open platform for the discussion of ideas and helping the disadvantaged to organise groups to get their voices heard. It opens up publishing to citizen journalists, speaking without gatekeepers. However, in many ways, instead of widening the debate, it has become increasingly polarised and dominated by echo chambers, with information provision ruled by mysterious algorithms. The lack of editorial content control has created a nightmare for fact checking and fairness, and increasing numbers of nefarious actors have learned how to manipulate the system, fuel conspiracy theories and sow division. The waters have become murky and it is a pool in which many people no longer want to swim.

It cannot be dismissed simply as modern-day political spin. The new technologies create far more poisonous possibilities for the most Machiavellian practitioners of the dark arts, and there is plenty of evidence that they are taking advantage of these new superpowers. Those who want to see standards and integrity in public life maintained cannot simply stand by and ignore it.

Millions are being spent on orchestrated disinformation in what the Electoral Reform Society described as the unregulated “wild west” of online political campaigning. Organised cyber-troop operations use an increasingly sophisticated armoury to alter the nature and course of legitimate political debate, to smear and discredit opponents, to interfere in foreign affairs and generally to create distrust in the very processes on which democracy relies. Facts get confused, opposing points of view are tainted and people are turned off by an onslaught of hate, misleading propaganda and deliberately divisive content.

Techniques used by these cyber-troops include armies of trolls or political bots amplifying particular opinions or hate speech, harassment, doxxing, smearing, doctoring images and videos, mass reporting of content and illegally harvesting data to micro-target with misleading information. They do it because it works.

Fergus Bell, the co-founder of London-based media consultancy Fathm, has worked on many elections and believes that false information shared online has been “very successful” at swaying voters. It does not have to be direct in its influence but, as he says,

“if you cause division between people, or if you can change someone’s mind on one tiny thing that might make them vote differently, you can push an election”.

The cyber-troops have precise, data-driven strategies to home in on the soft spots, and they know exactly where those are.

People who are targeted by these tactics may be disenfranchised by the process, become disillusioned with everyone involved in politics and no longer bother to participate in democracy. In some cases, that appears to be the very purpose of cyber-troop activity: in the US elections, Channel 4 found evidence that the Trump campaign used micro-targeting to deter 3.5 million black Americans from voting at all. That type of voter suppression should alarm us all.

The rapid rise of disinformation industries is evidenced in the Oxford Internet Institute’s report, “Industrialized Disinformation: 2020 Global Inventory of Organized Social Media Manipulation”. It is quite a wake-up call for those who think that these things could not happen or do not happen here. The report found that 81 countries are now using social media to spread computational propaganda and disinformation about politics, including the UK, which is a jump from 70 countries in the previous year.

The report found evidence of Chinese, Russian and Iranian-backed disinformation campaigns about covid-19 designed to amplify anti-democratic narratives and undermine trust in health officials. Microsoft has also warned that hackers operating out of Russia, China and Iran were targeting the staff and associates of both Donald Trump and Joe Biden ahead of the US election last year. In Argentina, a “deepfake” video was used to make the Minister of Security appear drunk.

As for China, a 2017 Harvard paper estimated that the Chinese Government employ 2 million people to write 448 million social media posts a year. The primary purpose of this activity is to keep online discussions away from sensitive political topics. Closer to home, the long-delayed Russia report from the Intelligence and Security Committee confirmed that there was “credible open source commentary” suggesting that Russia tried to influence the Scottish independence referendum and subsequent elections. Yet astonishingly it seems that the Government have not yet sought to find evidence of interference in the EU referendum and instead took an ostrich-like approach to defending our democratic process. At the very least, I would hope that the Government could be looking to implement the recommendations of the ISC report.

It is not just foreign interference that is at stake here; the UK has to get its own house in order. There are questions about data-driven profiling and Facebook advertising by political actors in the UK. In the 2019 general election, 90% of the Conservative party’s Facebook advertisements in early December were labelled as misleading by Full Fact. The real danger of this kind of misleading content is that cyber-troop tactics can then be used to amplify it to the extent that, by the time it is rebutted, it has already reached thousands of feeds. The Conservatives even tried to rebrand their Twitter output during a debate as coming from “factcheckUK”, changing its logo to hide its political origins and pushing pro-Conservative material in a way that deliberately confused it with independent fact-checking sites.

Another question is why Topham Guerin, one of the communications companies behind the 2019 campaign, was awarded a £3 million covid-19 contract by the Government. It is yet more evidence of the need for my Ministerial Interests (Emergency Powers) Bill, which aims to hold the Government to account, to be supported in all quarters of the House—but that matter is for another day.

Although it is not always clear who is behind these actions, there is clear evidence of bots being used to swell numbers artificially and drive political positions. A study by the Institute for Strategic Dialogue identified that almost all of the 10 most active accounts on Twitter discussing the Brexit party appeared to be automated bots, while prior to the 2019 general election a report found that a third of the Prime Minister’s own Twitter followers were bots.

Tackling this issue is not about silencing voices; it is about getting back some semblance of a level playing field, recognising the range of genuine voices and turning down the noise from the fakes. The UK is one of 48 countries identified in the Oxford report where cyber-troop manipulation campaigns are being run by private firms on behalf of Government or political actors. The report found that almost $60 million had been spent on hiring these firms since 2009, but I suspect that this figure is only the tip of the iceberg. There needs to be greater transparency and a tightening of the links between the public sector and private contractors.

Cyber-troops sometimes work in conjunction with civil society organisations, internet subcultures, youth groups and fringe movements; groups who may be motivated by genuinely held beliefs but whose causes may ultimately be damaged by those who strategically spread disinformation or computational propaganda. Take, for example, Turning Point, a right-wing youth pressure group. A US Senate report found that its social media activity was regularly co-opted and reposted by the Internet Research Agency, which is known in Russian slang as the “trolls from Olgino”.

The use of third-party campaigning organisations can also be a way to rig the system—to channel illegal levels of funds and campaigns, or at the very least to exploit gaps in our outdated electoral laws in order to press political agendas. Many questions have rightly been asked about the official Vote Leave campaign’s techniques, their links to other groups, the “dark money” spent and their micro-targeting techniques, used in breach of privacy laws.

As the Vote Leave campaign demonstrated, tougher rules are needed in the conduct of future referenda, as well as elections. The Scottish Government introduced the Referendums (Scotland) Act 2020 to better regulate the conduct of any future referendum, where they have the power to do so, including on campaign spending and donations. I would like to see further action to tighten the rules in this place too.

Fighting cyber-troops is complex and has to be tackled on several fronts, with governments, civil society, academia and technology businesses all having a role to play. The social media giants must certainly be better regulated and take greater responsibility for what is published. I therefore welcome the moves to improve regulation through the online safety Bill.

However, the misinformation and disinformation being propagated by cyber-troops is clearly an ongoing and growing aspect of online harms, so it is disappointing that this aspect has not been robustly tackled through these proposals. The Government’s half-hearted plans for digital imprints are a move in the right direction towards greater transparency, but they do not go far enough or fast enough. The get-out clause, which is that the imprint can be located in an

“accessible alternative location linked to the material”,

is not good enough.

Online political advertising remains largely unregulated, and there is nothing from the Government so far that shows a determination to better regulate against indecent and dishonest material, dark ads or data targeting. At the very least, we need to see who is using citizens’ data and why, as well as why people see particular ads. I believe that, on this front, the European regulatory plans go further than those of the UK.

I am aware of the challenges with regulating and fact-checking political content, but it is not impossible to overcome these, and it is essential that this is looked at urgently. It is no longer enough simply to rely on a sense of fair play and “a fair crack of the whip for all sides” to manage the truth amidst the overwhelming barrage of information being dumped upon us. There is no chance for rebuttals from opponents when so much content can spread so widely and maliciously, without any clarity or transparency on the sources.

It is not enough to treat the threat of cyber-troops as solely an electoral phenomenon. The Government’s counter-disinformation unit is usually operational only during periods of heightened vulnerability, even though we know that cyber-troops are working to sow division and discord every minute of every day.

Much needs to be done to reform the rules, strengthen democracy and restore faith in our democratic processes, yet there has been disappointingly slow progress so far. Many organisations, such as Reset and the Fair Vote Project, are working on this alongside the all-party parliamentary groups on electoral campaigning transparency and digital regulation and responsibility. They are doing the research and taking forward proposals on a cross-party basis, so a lot of the heavy lifting has already been done on the Government’s behalf.

However, the Government have given no indication that they collect data on cyber-troop activity, despite the important role that they should be playing in analysing and assessing this threat. When I have raised questions about cyber-troops, I have been advised, in response, that the Government’s fine-sounding “defending democracy programme” is tackling this. However, from what I have found so far, it does not seem to be doing very much. Perhaps the Minister can point me to something other than that when she responds today.

We need to stop kicking this into the long grass. There is plenty of evidence of the threats from both within and outwith the UK. I have previously called for a debate, in Government time, on the need for electoral reforms to protect free and fair elections. However, if I cannot have that, we need to take it forward on another basis.

This is not a party political issue; it is about integrity in public life. Political differences are healthy, as is debate, but the tactics of division and disinformation from cyber-troops are a cancer on all political discourse, and it is spreading too fast to ignore. We all have a moral imperative to take action, and I call on this Government to do so.

--- Later in debate ---
Owen Thompson

Thank you, Sir Charles. I will briefly thank all hon. Members for their contributions this afternoon. I think we have seen a very clear understanding that it is in all our interests to ensure that we tackle this issue and get it right. I very much endorse the comments of my hon. Friend the Member for Glasgow South (Stewart Malcolm McDonald) about seeking a strategy, because we are starting to see a swell of opinion for tackling some of these things, especially misinformation online. We have seen the importance of that through the current pandemic. The public need to be able to have confidence in the information that they access.

In a nutshell, the issue comes back to what the hon. Member for Strangford (Jim Shannon) said. He very ably made the point that it is so important that we are able to agree to disagree. I do not think that anybody is suggesting that we need to have any kind of thought control or that everybody has to have the same opinions. It is important that we do not, but it is important also that we can have confidence that those views and opinions are presented in a way that is accurate and factual.

Sir Charles Walker (in the Chair)

I thank colleagues for facilitating and conducting such an excellent debate.

Question put and agreed to.

Resolved,

That this House has considered cyber troop activity in the UK.