Online Filter Bubbles: Misinformation and Disinformation

John Nicolson Excerpts
Tuesday 16th January 2024


Westminster Hall


John Nicolson (Ochil and South Perthshire) (SNP)

I thank the hon. Member for Weston-super-Mare (John Penrose) for securing this debate. He and I sat together recently at a Quaker dinner in London, where we discussed disinformation and the coarsening of public debate, and I think that the small cross-party group present at the event all agreed that social media had been one of the driving factors behind that, if not the only one.

In 2015, as a new MP and a new user of social media, it took me quite some time to adapt. At first, I thought that when people wrote to me on Twitter the rules of normal social intercourse applied—that I might disagree with someone but if I responded courteously and offered facts, a respectful dialogue would then ensue or we could agree to disagree amicably.

Historywoman, a professor from Edinburgh University no less, soon disabused me of that view. The venom was staggering: apparently just because we disagreed on facts about the constitution, she screamed abuse. Then there was Wings Over Scotland, with more eyeball-bulging, temple-throbbing hate. I had offered some facts about trans people, which he did not like; in fact, he hated them so much that he pounded his keyboard for months in a frenzy.

I got to understand the concept of pile-ons when a sinister organisation called the LGB Alliance decided to reward folk who gave them money by reposting disinformation and abuse about me from their account—a charity account, no less. Finally, when someone called me a “greasy bender” and Twitter moderators judged that comment to be factual and fair comment, I realised that courteous replies did not quite cut it and I became a fan of the block button.

Why are these people so angry, and why do they believe that they can behave online in a way that would be considered certifiable offline? I sit on the Culture, Media and Sport Committee, which has undertaken long and detailed inquiries into disinformation and misinformation, and into the impact of online filter bubbles. So what are filter bubbles? They are the result of algorithms designed to increase user engagement rather than correct inaccuracies; in other words, they are designed to show people content again and again based on their viewing biases. For some people, that can be innocent enough—I seem to be directed towards endless posts and films about house restoration options and Timothée Chalamet’s latest outfits—but for others, the filter bubbles are far from benign. Indeed, Facebook itself warned its own staff that its algorithms

“exploit the human brain’s attraction to divisiveness”.

What does that mean in practice? It means that if someone watches one conspiracy video, the chances are 70% or more that another conspiracy video reinforcing their paranoia will be recommended for them to watch immediately afterwards. The result is to drive some users into a frenzy. That is why some people blow up 5G masts, convinced that they caused covid. It is not just the underprivileged and ignorant who fall prey; even graduates of the world’s most elite universities can become victims. Donald Trump thought that injecting bleach could cure covid, and we now know from the covid inquiry that Boris Johnson wondered whether blowing a hairdryer up his nostrils might save him from the pandemic.

Filter bubbles pose an enormous threat to our democracy. We know how heavily engaged Vladimir Putin was in encouraging people to vote for Brexit by spreading disinformation online. He believed that Brexit would weaken the European Union and Britain’s economy. He was successful but only half right. In the United States, swept away in a tsunami of ignorance, prejudice and shared disinformation, those who stormed the Capitol believed that the victor had lost and the loser had won. Who knew that one of the world’s great democracies would be so vulnerable?

At the Select Committee, we have heard harrowing stories about vulnerable young people fed content persuading them to commit suicide. One father’s testimony, in particular, I will never forget. So what responsibility should Members of Parliament take? Surely we should have been much tougher, and dealt much sooner with cynical and unscrupulous social media companies that are driven only by profits and scared only by threats to those profits.

Of course, politicians are directly responsible for the way in which disinformation that they initiate is spread offline and online. All of us—at least almost all—condemned Nigel Farage’s overtly racist Brexit campaign poster, with its image of outsiders supposedly queuing to get into the UK; it had hideous echoes of the 1930s. But what of the much mocked and seemingly more innocuous Tory conference speeches last September? Delegates were told that the UK Government had prevented bans on meat and single-car usage, and had stopped the requirement of us all having seven household bins. The claims were risible, false and mocked but, strikingly, Cabinet Minister after Cabinet Minister tried to defend them when questioned by journalists. Does it matter? Yes, it does. It has a corrosive effect on voters’ trust. Knowingly spreading disinformation helps only those who would undermine our democratic institutions. Some call it post-truth politics: conditioning voters to believe no one and nothing—to believe that there is no difference between truth and lies, and no difference between “Channel 4 News” and GB News.

Our Committee found that there have been repeated successful attempts by bad-faith actors to insert their talking points into our democratic discourse online. The social media companies have shown little interest in tackling them. They were disdainful witnesses when we summoned them and, disturbingly, we have seen our once proudly independent broadcasting sector polluted with the arrival of GB News to challenge long-standing, universally accepted standards. Its aim: to become as successful as Fox News in the dissemination of on-air propaganda online and offline. We all hope that the Online Safety Act 2023 will help but, alas, I fear that the evidence hitherto suggests that our woefully passive regulator, Ofcom, will continue to be found wanting.