Online Filter Bubbles: Misinformation and Disinformation

Debate between Gregory Campbell and John Penrose
Tuesday 16th January 2024

Westminster Hall

This information is provided by Parallel Parliament and does not comprise part of the official record

John Penrose (Weston-super-Mare) (Con)

I beg to move,

That this House has considered the matter of preventing misinformation and disinformation in online filter bubbles.

It is good to see you in the Chair and in charge of our proceedings, Sir Mark. It is also good to see the Minister in his place. He confessed gently to me beforehand that this is the first Westminster Hall debate in which he has had the honour of being a Minister. I am sure that he will mark his debut with elan.

This important issue has not had enough exposure and discussion so far in Parliament, so I am pleased to see so many colleagues present. I suspect that they have all seen more examples than they can possibly count of not just fake news but fake science, fake medicine or online memes of one kind or another landing in their in-trays from constituents. This is not just about constituents writing to Members of Parliament; it is a much broader issue that affects the whole tenor and fabric of our democracy and public debate. It is particularly important because public debate, in these online days, happens in a far wider and more varied collection of forums than it did before the internet was so commonly and widely used. It is something that needs to be addressed.

Mr Gregory Campbell (East Londonderry) (DUP)

I congratulate the hon. Member on securing this timely debate. Does he agree that the prevalence of fake news is all-consuming? There was a quote put to me, as a Member of Parliament, on social media a few years ago. It was along the lines of, “In the words of Abraham Lincoln, don’t believe all you read on the internet.”

John Penrose

The hon. Gentleman is absolutely right. The saying used to be, “Don’t believe everything you read in the newspapers,” but it applies equally strongly in the modern digital world to everything that we read on the internet.

Fake news, or misinformation and disinformation, really matters, particularly this year. A large proportion of the democratic world will have general elections in the next 12 months. The scope for interference in those elections by malign actors, whether they are foreign states, organised criminals or people of a particular religious or political persuasion, is very strong. They try to sway public debate through fair means or foul, and if we are talking about misinformation and disinformation, that means foul means. The potential in the next 12 months for bad things to happen is very high, and that is just when it comes to democracy. That does not cover the other examples of fake news or fake science that I mentioned, such as fake medicine. Believing quack cures can lead to more deaths, more people getting the wrong kinds of medical treatments and so on. There are many different opportunities.

There is also a serious issue around radicalisation. Somebody who is fed a diet of alt-left or alt-right political views, or extremist religious views—it does not really matter what it is—can easily disappear down a rabbit hole, into an echo chamber of views where only one particular strand of opinion is put in front of them again and again. That way leads to radicalisation on a whole range of different topics, and it undermines both our society and our democracy, and science in many cases. It means that societies as a whole become much more brittle and more divided, and it is much harder for democracy to flourish.

What are misinformation and disinformation, without getting sucked into technocratic definitions? It is rather like trying to define pornography. As the famous phrase goes, “You may not be able to define it, but like a hippopotamus, you recognise it when you see it.” [Interruption.] I will ignore the heckling on my right; it will not help. There are two underlying facets to misinformation and disinformation. One is that if someone is simply making stuff up, telling lies and saying things that are factually inaccurate and false, that can easily become misinformation and disinformation. The second is when things are factually accurate and correct but really one-sided and biased. That matters too; it is extremely important, and we have long had rules for our broadcasters, to which I will return in a minute, that are designed to prevent it.

The good news is that the Online Safety Act 2023 took a few early steps to do something about factual inaccuracy, at least. It does not do a great deal—it should do more—but it takes some early steps, and it would be churlish to pretend that there is nothing there at all. I tabled a couple of early amendments to get us to think about factual inaccuracy and to work out where it came from—provenance, in the jargon—so that we could tell whether something comes from a trusted source. We ended up with some useful points, particularly duties on Ofcom relating to media literacy and making sure that people know what questions to ask when they see something on the internet and do not, as we were just hearing, necessarily believe everything they read online but ask questions about where it came from, who produced it and whether it has been altered. Ofcom has that duty now; it has not yet grown teeth and claws or started to bite but at least, in principle, that power is there and is very welcome.

There is also the advisory committee enshrined in the Act, which ought to make a difference, although precisely how will depend on how actively it flexes the muscles it has been given. Separately from the Online Safety Act, there are the national security laws about foreign interference too. There is some protection, therefore, but it is not nearly enough. The Minister’s predecessors, in what used to be the Department for Digital, Culture, Media and Sport before it was reorganised, will say that in the early days of the Online Safety Act’s gestation, it was intended to cover misinformation and disinformation, but that was hived off and fell away at an early stage. That is an important omission, and we need to come back to it now.

I want to make a modest proposal. The Online Safety Act will start to make modest progress towards media literacy and people understanding and asking questions about factual accuracy and where something comes from when they see it on the web. It will go some way to addressing the first of the two sources of misinformation and disinformation—people telling lies, making stuff up, deepfakes of one kind or another. The sad fact is that the chances of deepfakes getting better with the advent of artificial intelligence are very high indeed so that, even if we think we can spot them now, we are probably kidding ourselves and in a year or two’s time it will be doubly, trebly or quadruply difficult to work out what is real and what is completely made up.

If we accept that at least something is in place in this country to deal with factual inaccuracy, we are still stuck with absolutely nothing, as yet, to deal with the one-sided and deeply biased presentation of factually correct narratives. I therefore want to draw a comparison, as I mentioned earlier, with what we already do and have been doing very successfully for decades in the broadcasting world, where Ofcom, through the broadcasting code, has been in charge of the duty of balance and undue prominence. That duty has, very successfully and for decades, required the analogue broadcasting world to make sure that, when it presents something that is supposedly factual in a broadcast news programme, it must be balanced and must not give undue prominence to one side of the argument. That works really rather well, and has been a core part of ensuring that our public debates in this country are not sidetracked by fake news.

I suspect that every one of us here will, at various different times, have gnashed our teeth and shouted at the telly because we felt that the BBC, ITV or Sky News was presenting something in a slightly partisan way; depending on which side of the House we are on, we may have thought that the partisanship was on one side of the argument rather than the other. However, the fact remains that we all know the way they are supposed to do it and that there is some kind of redress, and there is generally an acceptance that it is the right thing to do. The duty matters not just because politicians think it is important, but because it has—very successfully, I would argue—made sure that there is a tendency towards reasoned, evidence-based consensus in British public debate, online and in broadcast news, over more than half a century.

The title of this debate is not just, “Misinformation and Disinformation”; it is about those two things in online filter bubbles. Online filter bubbles bear some quite important similarities to what broadcast news editorial room decision making has long been doing. The reason is that when we go online, we all have our own personal online filter bubble. Whether we use Google, Facebook, TikTok, all of the above, or whatever it might be, those platforms have an algorithm that says, “John Penrose likes looking at stuff to do with fishing tackle—we’re going to send him more stuff about fishing tackle.” I am not sure what the equivalent would be for the Minister; I am sure he will tell us in due course, unless he is too shy.

The algorithm works out what we have personally chosen to look at and gives us more of the same. That can also lead to radicalisation. If I start looking at things to do with Islamic jihad, it will say, “Oh! He’s interested in Islamic jihad”, and send me more and more things about Islamic jihad—or the alt-left, the alt-right, or whatever it might be. The algorithm’s decision to send people more of what they have already chosen, or things they have not chosen but which it thinks they will like, is effectively a digital editorial decision that is, in principle, very similar to the editorial decisions going on in the Sky, ITV or BBC newsrooms, either for radio or for broadcast TV.

We need to come up with a modern, digital version of the long-established and, as I said, very successful principle of the duty of balance and undue prominence and apply it to the modern, digital world. Then, if I started looking at Islamic jihad, and I got sent more and more things about Islamic jihad, as I saw more and more things about Islamic jihad, the algorithm that was creating my personal filter bubble would start sending me things saying, “You do know that there is an alternative here? You do know that there is another side of this argument? You do know that the world is not just this, and this particular echo chamber—this rabbit hole of radicalisation that you are enthusiastically burrowing your way down—may actually be exactly that? You need to understand that there is more to it.” That is something that happens to all of us all the time in the old, analogue world, but does not happen in the digital world. I would argue that it is one of the reasons that many of us here, across the political spectrum, are so worried about the divisive nature of the online world and the rising levels of disrespect and potential incitement of violence there.

I plan to do something rather unusual for politicians and stop talking very soon, because I hope that this has served as a proposal for colleagues to consider. It is something that would need cross-party consensus behind it in order to be taken forward, and there may be better ways of doing it, but I am absolutely certain that we do not have anything in our legal arsenal in this area at the moment. I would argue that we need to act quite promptly. As I have said, the stakes in the next 12 months democratically are very high, but the stakes have been very high in other areas, such as medical disinformation, for a very long time because we have just come through a pandemic. The scope for damage—to our society, to our health and to our entire way of life—is very high.

Therefore, I hope that colleagues will consider what I have said, and if they have a better answer I am all ears and I would be absolutely delighted to hear it. This is a very early stage in the political debate about this issue, but we cannot carry on not having a political debate about it; we cannot carry on not addressing this central issue. So, I am glad that everybody is here today and I hope that we will all go forth and tell our political friends and neighbours that this issue is important and that they need to address it as well. And as I say, if people have better solutions than the one that I have just politely proffered, then my pen is poised and I look forward to taking notes.