Online Filter Bubbles: Misinformation and Disinformation Debate

Full Debate: Read Full Debate
Department: Department for Science, Innovation & Technology

Online Filter Bubbles: Misinformation and Disinformation

Mark Hendrick Excerpts
Tuesday 16th January 2024

(3 months, 2 weeks ago)

Westminster Hall
Read Full debate Read Hansard Text Read Debate Ministerial Extracts

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the offical record

Stewart Malcolm McDonald Portrait Stewart Malcolm McDonald
- Hansard - - - Excerpts

Indeed. The hon. Lady is entirely correct. The fact that so much of this has spread like a great blob—some might say—around Whitehall benefits only our adversaries and those who wish to pursue disinformation in this country. That is before we get to the growing problem of the things the hon. Member for Weston-super-Mare mentioned—deep fakes and AI-generated disinformation—all of which is going to get worse and worse. As long as responsibility and lines of accountability and policy formation are a bit all over the place, when in my mind the obvious place for them to lie would be with the Cabinet Office, that will be of benefit only to those who want to sow disinformation.

In June 2021, in the spirit of trying to be a helpful Scottish nationalist, which might be an oxymoron to some people, I published a report that made nine recommendations on how, in fairness to the UK Government and Scottish Government, they can better counter disinformation in public life. I want to go through a couple of those. First, we need a proper national strategy that is a whole-society national strategy, imitating the excellent work done in countries such as Finland and Latvia, where countering disinformation and hybrid threats is not the job of the Department of Defence or even the Government but involves public institutions, other public bodies, the private sector, non-governmental organisations, civil society and private citizens. There is much that can be done. Surely we saw that in the generosity people showed during the pandemic. There is so much good will out there among the population to counter hybrid threats when they arise.

Although we have the counter disinformation unit, I would suggest a commissioner, perhaps similar to the Information Commissioner, with statutory powers on implementing the national strategy and countering disinformation. There is a job for our friends in the media, too. The media need to open up to explain to the public how stories are made. There is a job to be done in newspapers and broadcast media. It would be to the benefit of mainstream media—that phrase is often used in a derisory way, although I like my media to be mainstream—as the more the media explain to the public how they make news, the better that would be for those of us who consume it.

There should also be an audit of the ecosystem. One thing I suggested in the report is an annual update to Parliament of a threat assessment of hostile foreign disinformation to this country. The better we understand the information ecosystem, the better we can equip ourselves to counter hostile foreign disinformation. I also suggest literacy programmes across all public institutions, especially for public servants, whether elected or unelected. My goodness, some of them could do with that in this House.

I also suggest we look to host an annual clean information summit. There is so much good work that goes on, especially in Taiwan, and right on our own doorstep in Europe. So much good work goes on that we could learn from, and hopefully implement here. If we do not have a whole-society approach, involving public bodies, faith groups, trade unions, private enterprise and even political parties, fundamentally any strategy will fail.

I will end on this: political parties need to get their acts together, and not just on some of the stuff that gets put out. I am not going into things that individual parties have put out. But at either this election or the next—I would argue that the upcoming election is already at risk of a hostile foreign disinformation attack—what will happen when that disinformation gets more sophisticated, better funded and better resourced than anything we have to see it off? I come back to the conference I attended with the hon. Member for Folkestone and Hythe, where we took part in a war game: it was a presidential election, and our candidate was subject to a hostile foreign disinformation attack to spread smears and lies about them. We need to get used to this now. Political parties need to set their arms to one side and work together so that we can preserve that thing we call democracy. I think it is worth fighting for. I look forward to the other suggestions we will hear in the rest of the debate.

Mark Hendrick Portrait Sir Mark Hendrick (in the Chair)
- Hansard - -

I note the number of people present, and ask Members to keep their contributions to around seven minutes so that we can get everybody in.

--- Later in debate ---
Mark Hendrick Portrait Sir Mark Hendrick (in the Chair)
- Hansard - -

I am conscious of the time, so I will limit Back-Bench contributions to six minutes each.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - - - Excerpts

It is a pleasure to speak in this debate under your chairship today, Sir Mark. I thank the hon. Member for Weston-super-Mare (John Penrose) for securing this timely and important debate on such an important issue.

Let us be clear that the Online Safety Act is an extremely important and very long-overdue piece of legislation. In reality, however, gaps remain in that legislation that need to be addressed. In this debate, we have heard about what are hopefully some positive ways forward.

There is huge cross-party consensus. It is a shame and a frustration that, when cross-party amendments were tabled to the Online Safety Bill, they were not taken seriously enough in Committee and were not carried forward. We raised serious questions about social media platforms, their responsibility to protect users, young and old, and to tackle the rise in disinformation. That was a clear opportunity that has now sadly passed, but the fact remains that this issue needs to be tackled.

This debate is timely, but when the Bill was progressing through Parliament, the debate focused on misleading information around the conflict in Ukraine. We all know that an alarming amount of information has recently been shared online regarding the situation in Israel and the middle east. As the hon. Member for Brigg and Goole (Andrew Percy) mentioned, the horrendous atrocities that occurred on 7 October were livestreamed by Hamas. They wore GoPros and uploaded the footage directly to social media platforms, yet an incredible number of people still did not believe it, saying it was not true or that it was a hoax. How far do we have to go for women in particular to be believed when they report crimes against them and to take this seriously? I cannot help but think that if the Government had listened to the concerns that I and others raised at that time, then we would be in a much better position to deal with these issues. Sadly, we are where we are.

As colleagues have mentioned, we also need to consider the role that AI plays in relation to misinformation and disinformation, particularly the impact of generative AI. That has the power and the potential to be at the forefront of economic growth in the country but, as others have mentioned, with a huge number of elections happening across the world this year, there has never been a more crucial time to tackle the spread of this misinformation and disinformation and the impact that it could have on our democracy. I would be grateful if the Minister reassured us that the Government have a plan; I would welcome his assurances, specifically in light of whatever discussions he has had with the Electoral Commission regarding plans to tackle this ahead of the next UK general election.

The Minister knows that, despite the passing of the Online Safety Act, many of the provisions in the legislation will not be in place for quite some time. In the meantime, Twitter—now X—has given the green light for Donald Trump’s return. Political misinformation has increased since Twitter became X, and right-wing extremists continue to gain political traction on controversial YouTube accounts and on so-called free speech platform Rumble. Platforms to facilitate the increase in political misinformation and extremist hate are sadly readily available and can be all-encompassing. As colleagues have rightly noted, that is nothing new. We only need to cast our minds back to 2020 to remember the disturbing level of fake news and misinformation that was circulating on social media regarding the covid pandemic. From anti-vaxxers to covid conspiracists, the pandemic brought that issue to the forefront of our attention. Only today, it was announced in the media that the UK is in the grip of a sudden spike in measles. Health officials have had to declare the outbreak a national incident, and the surge has been directly linked to a decline in vaccine uptake as a result of a rise in health disinformation from anti-vax conspiracy theories. That causes real-world harm and it needs to be addressed.

Misinformation causes anxiety and fear among many people, and I fear that the provisions in the Act would not go far enough if we faced circumstances similar to the pandemic. We all know that this is wide-ranging, from conspiracy theories about the safety of 5G to nonsense information about the so-called dangers of 15-minute cities, about which my hon. Friend the Member for Ellesmere Port and Neston (Justin Madders) spoke so ably. Sadly, those conspiracy theories were not just peddled by lone-wolf actors on social media; they were promoted by parliamentarians. We have to take that issue very seriously.

There are dangerous algorithms that seem to feed off popular misinformation and create these echo chambers and filter bubbles online. They have not helped but have amplified the situation. Would the Minister explain why the Government have decided to pursue an Online Safety Act that has failed to consider platforms’ business models and has instead become entirely focused on regulating content?

Moving on, possibly my biggest concern about misinformation and disinformation is the relationship between what is acceptable online and what is acceptable offline. As we all know, the issue of misinformation and disinformation is closely connected to online extremism. Although the Minister may claim that people can avoid harm online simply by choosing not to view content, that is just not realistic. After all, there is sadly no way to avoid abuse and harassment offline if individuals choose to spout it. In fact, only recently, when I dared to raise concerns and make comments here in Parliament about online misogyny and incel culture and ideology, I experienced a significant level of hate and harassment. Other colleagues have had similar experiences, as we have heard today.

This is a worrying trend, because we all know that online extremism can translate into the radicalisation of people in real-life terms, which can then heighten community tensions and make minority groups more susceptible to verbal and physical abuse and discrimination.

Online harm costs the UK economy £1.5 billion a year; we cannot afford to get this wrong. Online harm not only has that real-world impact, but it puts our country at an economic disadvantage. Recent research has shown that when there is a spike in online abuse towards one specific demographic, that translates to real-world abuse and real-world harm two weeks later, as the police themselves have said. There is only a two-week lag before online harm results in real-world attacks on certain individuals. No one should have to fear for their safety, online or offline.

In short, the Online Safety Act 2023 had the potential to be as world-leading as it was once billed to be, but the Minister must know that is far from being a perfect piece of legislation, particularly when it comes to misinformation and disinformation.

I am clearly not alone in expressing these views. I hope that the Minister has heard the concerns expressed in this wide-ranging debate, and that he will take seriously some of the proposals and move them forward.

Mark Hendrick Portrait Sir Mark Hendrick (in the Chair)
- Hansard - -

We now move to the contributions from the Front Benches.

--- Later in debate ---
Saqib Bhatti Portrait The Parliamentary Under-Secretary of State for Science, Innovation and Technology (Saqib Bhatti)
- Hansard - - - Excerpts

I am conscious of time and of the broad range of this debate, but I will try to address as many issues as possible. I commend my hon. Friend the Member for Weston-super-Mare (John Penrose) for securing this important debate on preventing misinformation and disinformation in online filter bubbles, and for all his campaigning on the subject throughout the passage of the Online Safety Act. He has particularly engaged with me in the run-up to today’s well-versed debate, for which I thank hon. Members across the Chamber.

May I echo the sentiments expressed towards my hon. Friend the Member for Brigg and Goole (Andrew Percy)? I thank him for sharing his reflections. I was not going to say this today, but after the ceasefire vote I myself have faced a number of threats and a lot of abuse, so I have some personal reflections on the issue as well. I put on the record my invitation to Members across the House to share their experiences. I certainly will not hesitate to deal with social media companies where I see that they must do more. I know anecdotally, from speaking to colleagues, that it is so much worse for female Members. Across the House, we will not be intimidated in how we vote and how we behave, but clearly we are ever vigilant of the risk.

Since the crisis began, the Technology Secretary and I have already met with the large social media platforms X, TikTok, Meta, Snap and YouTube. My predecessor—my hon. Friend the Member for Sutton and Cheam (Paul Scully)—and the Technology Secretary also held a roundtable with groups from the Jewish community such as the Antisemitism Policy Trust. They also met Tell MAMA to discuss Muslim hate, which has been on the rise. I will not hesitate to reconvene those groups; I want to put that clearly on the record.

It is evident that more and more people are getting their news through social media platforms, which use algorithms. Through that technology, platform services can automatically select and promote content for many millions of users, tailored to them individually following automated analysis of their viewing habits. Many contributors to the debate have argued that the practice creates filter bubbles, where social media users’ initial biases are constantly reaffirmed with no counterbalance.

The practice can drive people to adopt extreme and divisive political viewpoints. This is a hugely complex area, not least because the creation of nudge factors in these echo chambers raises less the question of truth, but of how we can protect the free exchange of ideas and the democratisation of speech, of which the internet and social media have often been great drivers. There is obviously a balance to be achieved.

I did not know that you are a Man City fan, Sir Mark. I am a Manchester United fan. My hon. Friend the Member for Weston-super-Mare talked about fish tackle videos; as a tortured Manchester United fan, I get lots of videos from when times were good. I certainly hope that they return.

The Government are committed to preserving freedom of expression, both online and offline. It is vital that users are able to choose what content they want to view or engage with. At the same time, we agree that online platforms must take responsibility for the harmful effects of the design of their services and business models. Platforms need to prioritise user safety when designing their services to ensure that they are not being used for illegal activity and ensure that children are protected. That is the approach that drove our groundbreaking Online Safety Act.

I will move on to radicalisation, a subject that has come up quite a bit today. I commend my hon. Friend the Member for Folkestone and Hythe (Damian Collins) for his eloquent speech and his description of the journey of the Online Safety Act. Open engagement-driven algorithms have been designed by tech companies to maximise revenue by serving content that will best elicit user engagement. There is increasing evidence that the recommender algorithms amplify extreme material to increase user engagement and de-amplify more moderate speech.

Algorithmic promotion, another piece of online architecture, automatically nudges the user towards certain online choices. Many popular social media platforms use recommender algorithms, such as YouTube’s filter bubble. Critics argue that they present the user with overly homogeneous content based on interests, ideas and beliefs, creating extremist and terrorist echo chambers or rabbit holes. There are a multitude of features online that intensify and support the creation of those echo chambers, from closed or selective chat groups to unmoderated forums.

Research shows that individuals convicted of terrorist attacks rarely seek opposing information that challenges their beliefs. Without diverse views, online discussion groups grow increasingly partisan, personalised and compartmentalised. The polarisation of online debates can lead to an environment that is much more permissive of extremist views. That is why the Online Safety Act, which received Royal Assent at the end of October, focuses on safety by design. We are in the implementation phase, which comes under my remit; we await further evidence from the data that implementation will produce.

Under the new regulation, social media platforms will need to assess the risk of their services facilitating illegal content and activity such as illegal abuse, harassment or stirring up hatred. They will also need to assess the risk of children being harmed on their services by content that does not cross the threshold of illegality but is harmful to them, such as content that promotes suicide, self-harm or eating disorders.

Platforms will then need to take steps to mitigate the identified risks. Ofcom, the new online safety regulator, will set out in codes of practice the steps that providers can take to mitigate particular risks. The new safety duties apply across all areas of a service, including the way in which it is designed, used and operated. If aspects of a service’s design, such as the use of algorithms, exacerbate the risk that users will carry out illegal activity such as illegal abuse or harassment, the new duties could apply. Ofcom will set out the steps that providers can take to make their algorithms safer.

I am conscious of time, so I will move on to the responsibility around extremism. Beyond the duties to make their services safe by design and reduce risk in that way, the new regulation gives providers duties to implement systems and processes for filtering out and moderating content that could drive extremism. For example, under their illegal content duty, social media providers will need to put systems in place to seek out and remove content that encourages terrorism. They will need to do the same for abusive content that could incite hatred on the basis of characteristics such as race, religion or sexual orientation. They will also need to remove content in the form of state-sponsored or state-linked disinformation aimed at interfering with matters such as UK elections and political decision making, or other false information that is intended to cause harm.

Elections have come up quite a bit in this debate. The defending democracy taskforce, which has been instituted to protect our democracy, is meeting regularly and regular discussions are going on; it is cross-nation and cross-Government, and we certainly hope to share more information in the coming months. We absolutely recognise the responsibilities of Government to deal with the issue and the risks that arise from misinformation around elections. We are not shying away from this; we are leading on it across Government.

The idea put forward by my hon. Friend the Member for Weston-super-Mare has certainly been debated. He has spoken to me about it before, and I welcome the opportunity to have this debate. He was right to say that this is the start of the conversation—I accept that—and right to say that he may not yet have the right answer, but I am certainly open to further discussions with him to see whether there are avenues that we could look at.

I am very confident that the Online Safety Act, through its insistence on social media companies dealing with the issue and on holding social media companies to account on their terms and conditions, will be a vital factor. My focus will absolutely be on the implementation of the Act, because we know that that will go quite a long way.

We have given Ofcom, the new independent regulator, the power to require providers to change their algorithms and their service design where necessary to reduce the risk of users carrying out illegal activity or the risk of children being harmed. In overseeing the new framework, Ofcom will need to carry out its duties in a way that protects freedom of expression. We have also created a range of new transparency and freedom-of-expression duties for the major social media platforms; these will safeguard pluralism in public debate and give users more certainty about what they can expect online. As I have said, the Government take the issue incredibly seriously and will not hesitate to hold social media companies to account.

Mark Hendrick Portrait Sir Mark Hendrick (in the Chair)
- Hansard - -

John Penrose has 30 seconds to wind up.