Lord Allan of Hallam debates involving the Department for Digital, Culture, Media & Sport during the 2019 Parliament

Baroness Kidron (CB)

My Lords, I draw attention to my interests in the register, which I declared in full at Second Reading. It is an absolute pleasure to follow the noble Lord, Lord Stevenson, and, indeed, to have my name on this amendment, along with those of fellow members of the pre-legislative committee. It has been so long that it almost qualifies as a reunion tour.

This is a fortuitous amendment on which to start our deliberations, as it sets out the very purpose of the Bill—a North Star. I want to make three observations, each of which underlines its importance. First, as the pre-legislative committee took evidence, it was frequently remarked by both critics and supporters that it was a complicated Bill. We have had many technical briefings from DSIT and Ofcom, and they too refer to the Bill as “complicated”. As we took advice from colleagues in the other place, expert NGOs, the tech sector, academics and, in my own case, the 5Rights young advisory group, the word “complicated” repeatedly reared its head. This is a complex and ground-breaking area of policy, but there were other, simpler structures and approaches that have been discarded.

Over five years of ever-changing leadership and political pressures, the Bill has ballooned with caveats and a series of very specific, and in some cases peculiar, clauses—so much so that today we start with a Bill that even those of us who are paying very close attention are often told we do not understand. That should make the House very nervous.

It is a complicated Bill with intersecting and dependent clauses—grey areas from which loopholes emerge—and it is probably a big win for the deepest pockets. The more complicated the Bill is, the more it becomes a bonanza for the legal profession. As the noble Lord, Lord Stevenson, suggests, the Minister is likely to argue that the contents of the amendment are already in the Bill, but the fact that the word “complicated” is firmly stuck to its reputation and structure is the very reason to set out its purpose at the outset, simply and unequivocally.

Secondly, the OSB is a framework Bill, with vast amounts of secondary legislation and a great deal of work to be implemented by the regulator. At a later date we will discuss whether the balance between the Executive, the regulator and Parliament is exactly as it should be, but as the Bill stands it envisages a very limited future role for Parliament. If I might borrow an analogy from my previous profession, Parliament’s role is little more than that of a background extra.

I have some experience of this. In my determination to follow all stages of the age-appropriate design code, I found myself earlier this week in the Public Gallery of the other place to hear DSIT Minister Paul Scully, at Second Reading of the Data Protection and Digital Information (No. 2) Bill, pledge to uphold the AADC and its provisions. I mention this in part to embed it on the record—that is true—but primarily to make this point: over six years, there have been two Information Commissioners and Secretaries of State and Ministers in double figures. There have been many moments at which the interpretation, status and purpose of the code have been put at risk, at least once to a degree that might have undermined it altogether. Each time, the issue was resolved by establishing the intention of Parliament beyond doubt. Amendment 1 moves Parliament from background extra to star of the show. It puts the intention of Parliament front and centre for the days, weeks, months and years ahead in which the work will still be ongoing—and all of us will have moved on.

The Bill has been through a long and fractured process in which the pre-legislative committee had a unique role. Many attacks on the Bill have been made by people who have not read it. Child safety was incorrectly cast as the enemy of adult freedom. While some wanted to apply the existing and known concepts and terms of public interest, protecting the vulnerable, product safety and the established rights and freedoms of UK citizens, intense lobbying has seen them replaced by untested concepts and untried language over which the tech sector has once again emerged as judge and jury. This has further divided opinion.

In spite of all the controversy, when published, the recommendations of the committee report received almost universal support from all sides of the debate. So I ask the Minister not only to accept the committee’s view that the Bill needs a statement of purpose, the shadow of which will provide shelter for the Bill long into the future, but to undertake to look again at the committee report in full. In its pages lies a landing strip of agreement for many of the things that still divide us.

This is a sector that is 100% engineered and almost all privately owned, and within it lie solutions to some of the greatest problems of our age. It does not have to be as miserable, divisive and exploitative as this era of exceptionalism has allowed it to be. As the Minister is well aware, I have quite a lot to say about proposed new subsection (1)(b),

“to provide a higher level of protection for children than for adults”,

but today I ask the Minister to tell us which of these paragraphs (a) to (g) are not the purpose of the Bill and, if they are not, what is.

Lord Allan of Hallam (LD)

My Lords, I am pleased that we are starting our Committee debate on this amendment. It is a pleasure to follow the noble Lord, Lord Stevenson, and the noble Baroness, Lady Kidron.

In this Bill, as has already been said, we are building a new and complex system and we can learn some lessons from designing information systems more generally. There are three classic mistakes that you can make. First, you can build systems to fit particular tools. Secondly, you can overcommit beyond what you can actually achieve. Thirdly, there is feature creep, through which you keep adding things on as you develop a new system. A key defence against these mistakes is to invest up front in producing a really good statement of requirements, which I see in Amendment 1.

On the first of these, there is a genuine risk that, as we go through the debate, we get bogged down in the details of specific measures that the regulator might or might not include in its rules and guidance, and that we lose sight of our goals. Developing a computer system around a particular tool—for example, building everything with Excel macros or with Salesforce—invariably ends in disaster. If we can agree on the goals in Amendment 1 and on what we are trying to achieve, that will provide a sound framework for our later debates as we try to consider the right regulatory technologies that will deliver those goals.

The second cardinal error is overcommitting and underdelivering. Again, it is very tempting when building a new system to promise the customer that it will be all-singing, all-dancing and can be delivered in the blink of an eye. Of course, the reality is that in many cases, things prove to be more complex than anticipated, and features sometimes have to be removed while timescales for delivering what is left are extended. A wise developer will instead aim to undercommit and overdeliver, promising to produce a core set of realistic functions and hoping that, if things go well, they will be able to add in some extra features that will delight the customer as an unexpected bonus.

This lesson is also highly relevant to the Bill, as there is a risk of giving the impression to the public that more can be done quicker than may in fact be possible. Again, Amendment 1 helps us to stay grounded in a realistic set of goals once we put those core systems in place. The fundamental and revolutionary change here is that we will be insisting that platforms carry out risk assessments and share them with a regulator, who will then look to them to implement actions to mitigate those risks. That is fundamental. We must not lose sight of that core function and get distracted by some of the bells and whistles that are interesting, but which may take the regulator’s attention away from its core work.

We also need to consider what we mean by “safe” in the context of the Bill and the internet. An analogy that I have used in this context, which may be helpful, is to consider how we regulate travel by car and aeroplane. Our goal for air travel is zero accidents, and we regulate everything down to the nth degree: from the steps we need to take as passengers, such as passing through security and presenting identity documents, to detailed and exacting safety rules for the planes and pilots. With car travel, we have a much higher degree of freedom, being able to jump in our private vehicles and go where we want, when we want, pretty much without restrictions. Our goal for car travel is to make it incrementally safer over time; we can look back and see how regulation has evolved to make vehicles, roads and drivers safer year on year, and it continues to do so. Crucially, we do not expect car travel to be 100% safe, and we accept that there is a cost to this freedom to travel that, sadly, affects thousands of people each year, including my own family and, I am sure, many others in the House. There are lots of things we could do to make car travel even safer that we do not put into regulation, because we accept that the cost of restricting freedom to travel is too high.

Without over-labouring this analogy, I ask that we keep it in mind as we move through Committee—whether we are asking Ofcom to implement a car-like regime whereby it is expected to make continual improvements year on year as the state of online safety evolves, or we are advocating an aeroplane-like regime whereby any instance of harm will be seen as a failure by the regulator. The language in Amendment 1 points more towards a regime of incremental improvements, which I believe is the right one. It is in the public interest: people want to be safer online, but they also want the freedom to use a wide range of internet services without excessive government restriction, and they accept some risk in doing so.

I hope that the Minister will respond positively to the intent of Amendment 1 and that we can explore in this debate whether there is broad consensus on what we hope the Bill will achieve and how we expect Ofcom to go about its work. If there is not, then we should flush that out now to avoid later creating confused or contradictory rules based on different understandings of the Bill’s purpose. I will keep arguing throughout our proceedings for us to remain focused on giving the right goals to Ofcom and allowing it considerable discretion over the specific tools it needs, and for us to be realistic in our aims so that we do not overcommit and underdeliver.

Finally, the question of feature creep is very much up to us. There will be a temptation to add things into the Bill as it goes through. Some of those things are essential; I know that the noble Baroness, Lady Kidron, has some measures that I would also support. This is the right time to do that, but there will be other things that would be “nice to have”, and the risk is that putting them in might detract from those core mechanisms. I hope we are able to maintain our discipline as we go through these proceedings to ensure we deliver the right objectives, which are incredibly well set out in Amendment 1, which I support.

--- Later in debate ---
We need to be bravely dispassionate in our discussions on protecting children online, and to scrutinise the Bill carefully for unintended consequences for children. But we must also avoid allowing our concern for children to spill over into infantilising adults and treating adult British citizens as though they are children who need protection from speech. There is a lot to get through in the Bill but the amendment, despite its good intentions, does not resolve the dilemmas we are likely to face in the following weeks.
Lord Allan of Hallam (LD)

My Lords, I have had a helpful reminder about declarations of interest. I once worked for Facebook; I divested myself of any financial interest back in 2020, but of course a person out there may think that what I say today is influenced by the fact that I previously took the Facebook shilling. I want that to be on record as we debate the Bill.

Baroness Stowell of Beeston (Con)

My Lords, I have not engaged with this amendment in any particular detail—until the last 24 hours, in fact. I thought that I would come to listen to the debate today and see if there was anything that I could usefully contribute. I have been interested in the different points that have been raised so far. I find myself agreeing with some points that are perhaps in tension or conflict with each other. I emphasise from the start, though, my complete respect for the Joint Committee and the work that it did in the pre-legislative scrutiny of the Bill. I cannot compare my knowledge and wisdom on the Bill with those who, as has already been said, have spent so much intensive time thinking about it in the way that they did at that stage.

Like my noble friend Lady Harding, I always have a desire for clarity of purpose. It is critical for the success of any organisation, or anything that we are trying to do. As a point of principle, I like the idea of setting out at the start of this Bill its purpose. When I looked through the Bill again over the last couple of weeks in preparation for Committee, it was striking just how complicated and disjointed a piece of work it is and so very difficult to follow.

There are many reasons why I am sympathetic towards the amendment. I can see why bringing together at the beginning of the Bill what are currently described as “Purposes” might help it to meet its overall aims. But that brings me to some of the points that the noble Baroness, Lady Fox, has just made. The Joint Committee’s report recommends that the objectives of the Bill

“should be that Ofcom should aim to improve online safety for UK citizens by ensuring that service providers”—

it then set out objectives aimed at Ofcom, rather than these being the purposes of the Bill itself.

I was also struck by what the noble Lord, Lord Allan, said about what we are looking for. Are we looking for regulation of the type that we would expect of airlines, or of the kind we would expect from the car industry? If we are still asking that question, that is very worrying. I think we are looking for something akin to the car industry model as opposed to the airline model. I would be very grateful if my noble friend the Minister was at least able to give us some assurance on that point.

If I were to set out a purpose of the Bill at the beginning of the document, I would limit myself to what is currently in proposed new subsection (1)(g), which is

“to secure that regulated internet services operate with transparency and accountability in respect of online safety”.

That is all I would say, because that, to me, is what this Bill is trying to do.

The other thing that struck me when I looked at this—I know that there has been an approach to this legislation that sought to adopt regulation that applies to the broadcasting world—was the thought, “Somebody’s looked at the BBC charter and thought, well, they’ve got purposes and we might adopt a similar sort of approach here.” The BBC charter and the purposes set out in it are important and give structure to the way the BBC operates, but they do not give the kind of clarity of purpose that my noble friend Lady Harding is seeking—which I too very much support and want to see—because there is almost too much there. That, to me, is the place to start when setting out a very simple statement of purpose for this Bill.

--- Later in debate ---
Lord Clement-Jones (LD)

My Lords, I thank the noble Baroness. I hope I have not appeared to rush the proceedings, but I am conscious that there are three Statements after the Bill. I thank the noble Lord, Lord Stevenson, for tabling this amendment, speaking so cogently to it and inspiring so many interesting and thoughtful speeches today. He and I have worked on many Bills together over the years, and it has been a real pleasure to see him back in harness on the Opposition Front Bench, both in the Joint Committee and on this Bill. Long may that last.

It has been quite some journey to get to this stage of the Bill; I think we have had four Digital Ministers and five Prime Ministers since we started. It is pretty clear that Bismarck never said, “Laws are like sausages: it’s best not to see them being made”, but whoever did say it still made a very good point. The process leading to today’s Bill has been particularly messy, with Green and White Papers; a draft Bill; reports from the Joint Committee and Lords and Commons Select Committees; several versions of the Bill itself; and several government amendments anticipated to come. Obviously, the fact that the Government chose to inflict last-minute radical surgery on the Bill to satisfy what I believe are the rather unjustified concerns of a small number in the Government’s own party made it even messier.

It is extremely refreshing, therefore, to start at first principles, as the noble Lord, Lord Stevenson, has done. He has outlined them and the context in which we should see them—namely, we should focus essentially on the systems, what is readily enforceable and where safety by design and transparency are absolutely the essence of the purpose of the Bill. I share his confidence in Ofcom and its ability to interpret those purposes. I say to the noble Baroness, Lady Stowell, that I am not going to dance on the heads of too many pins about the difference between “purpose” and “objective”. I think it is pretty clear what the amendment intends, but I do have a certain humility about drafting; the noble Baroness, Lady Chakrabarti, reminded us of that. Of course, one should always be open to change and condensation of wording if we need to do that. But we are only at Amendment 1 in Committee, so there is quite a lot of water to flow under the bridge.

It is very heartening that there is a great deal of cross-party agreement about how we must regulate social media going forward. These Benches—and others, I am sure—will examine the Bill extremely carefully and will do so in a cross-party spirit of constructive criticism, as we explained at Second Reading. Our Joint Committee on the draft Bill exemplified that cross-party spirit, and I am extremely pleased that all four signatories to this amendment served on the Joint Committee and readily signed up to its conclusions.

Right at the start of our report, we made a strong case for the Bill to set out these core objectives, as the noble Lord, Lord Stevenson, has explained, so as to provide clarity—that word has been used around the Committee this afternoon—for users and regulators about what the Bill is trying to achieve and to inform the detailed duties set out in the legislation. In fact, I believe that the noble Lord, Lord Stevenson, has improved on that wording by including a duty on the Secretary of State, as well as Ofcom, to have regard to the purposes.

We have heard some very passionate speeches around the Committee for proper regulation of harms on social media. The case for that was made eloquently to the Joint Committee by Ian Russell and by witnesses such as Edleen John of the FA and Frances Haugen, the Facebook whistleblower. A long line of reports by Select Committees and all-party groups have rightly concluded that regulation is absolutely necessary given the failure of the platforms even today to address the systemic issues inherent in their services and business models.

The introduction to our Joint Committee report makes it clear that without the original architecture of a duty of care, as the White Paper originally proposed, we need an explicit set of objectives to ensure clarity for Ofcom when drawing up the codes and when the provisions of the Bill are tested in court, as they inevitably will be. Indeed, in practice, the tests that many of us will use when judging whether to support amendments as the Bill passes through the House are inherently bound up with these purposes, several of which many of us mentioned at Second Reading. Decisions may need to be made on balancing some of these objectives and purposes, but that is the nature of regulation. I have considerable confidence, as I mentioned earlier, in Ofcom’s ability to do this, and those seven objectives—as the right reverend Prelate reminded us, the rule of seven is important in other contexts—set that out.

In their response to the report published more than a year ago, the Government repeated at least half of these objectives in stating their own intentions for the Bill. Indeed, they said:

“We are pleased to agree with the Joint Committee on the core objectives of the Bill”,

and, later:

“We agree with all of the objectives the Joint Committee has set out, and believe that the Bill already encapsulates and should achieve these objectives”.

That is exactly the point of dispute: we need this to be explicit, and the Government seem to believe that it is implicit. Despite agreeing with those objectives, at paragraph 21 of their response the Government say:

“In terms of the specific restructure that the Committee suggested, we believe that using these objectives as the basis for Ofcom’s regulation would delegate unprecedented power to a regulator. We do not believe that reformulating this regulatory framework in this way would be desirable or effective. In particular, the proposal would leave Ofcom with a series of high-level duties, which would likely create an uncertain and unclear operating environment”.

That is exactly the opposite of what most noble Lords have been saying today.

It has been an absolute pleasure to listen to so many noble Lords across the Committee set out their ambitions for the Bill and their support for this amendment. It started with the noble Baroness, Lady Kidron, talking about this set of purposes being the “North Star”. I pay tribute to her tireless work, which drove all of us in the Joint Committee on in an extremely positive way. I am not going to go through a summing-up process, but what my noble friend had to say about the nature of the risk we are undertaking and the fact that we need to be clear about it was very important. The whole question of clarity and certainty for business and the platforms, in terms of making sure that they understand the purpose of the Bill—as the noble Baroness, Lady Harding, and many other noble Lords mentioned—is utterly crucial.

If noble Lords look at the impact assessment, they will see that the Government seem to think the cost of compliance is a bagatelle—but, believe me, it will not be. It will be a pretty expensive undertaking to train people in those platforms, across social media start-ups and so on to understand the nature of their duties.

Lord Allan of Hallam (LD)

I was just refreshing myself on what the impact assessment says. It says that the cost of reading and understanding the regulations will range from £177 for a small business to £2,694 for a large category 1 service provider. To reinforce my noble friend’s point: it says it will cost £177 to read and understand the Bill. I am not sure that will be what happens in practice.

Lord Clement-Jones (LD)

I thank my noble friend for having the impact assessment so close to hand; that is absolutely correct.

The noble Baroness, Lady Fox, talked about unintended consequences—apart from bringing the people of Ukraine into the argument, which I thought was slightly extraneous. I think we need a certain degree of humility about the Bill. As the noble Lord, Lord Knight, said, this may well be part 1; we may need to keep iterating to make sure that this is effective for child safety and for the various purposes set out in the Bill. The Government have stated that this amendment would create greater uncertainty, but that is exactly the opposite of what our committee concluded. I believe, as many of us do, that the Government are wrong in taking the view that they have; I certainly hope that they will reconsider.

At Second Reading, the noble Lord, Lord Stevenson, made something that he probably would not want, given the antecedents of the phrase, to characterise as a big open offer to the Minister to work on a cross-party basis to improve the Bill. We on these Benches absolutely agree with that approach. We look forward to the debates in Committee in that spirit. We are all clearly working towards the same objective, so I hope the Government will respond in kind. Today is the first opportunity to do so—I set out that challenge to the Minister.

Broadband: Price

Thursday 2nd February 2023

Lords Chamber

Lord Parkinson of Whitley Bay (Con)

Ofcom does have an important role to play here as the independent regulator, but, as I say, mindful of the particular challenges that households are facing, my right honourable friend the Secretary of State spoke directly to companies, asking them to consider very carefully the decisions they are making and the impact on their customers.

Lord Allan of Hallam (LD)

My Lords, was the Minister struck, as I was, by the observation in Ofcom’s December pricing trends report that there are millions of consumers who are out of contract, and so free to switch, but have not yet done so? Does he agree that these people could make significant savings, often without having to switch at all, as many providers will drop their prices as soon as you ring and threaten to leave? What are the Government doing to make this group aware that they can do this?

Lord Parkinson of Whitley Bay (Con)

Yes, it is very striking. Many people could be saving money and are not aware of it. That is why it is important that contracts are clear, but it also highlights the importance of consumer advice groups and, indeed, debates such as this, to draw the attention of people to the contracts they have signed.

Lord Allan of Hallam (LD)

My Lords, I have two observations, two pleas, one offer of help and four minutes to deliver all this, so here goes.

Observation one is that this Bill is our answer to the age-old question of “quis custodiet ipsos custodes?” or, in the vernacular, “Who watches the watchmen?” With several thousand strokes of the pen, Parliament is granting to itself the power to tell tens of thousands of online services how they should manage their platforms if they wish to access the UK market. Parliament will give directions to Ofcom about the outcomes it wants to see and Ofcom will translate these into detailed instructions and ensure compliance through a team of several hundred people that the platforms will pay for. In-scope services will be given a choice—pay up and follow Ofcom’s instructions or get out of the UK market. We are awarding ourselves significant superpowers in this Bill, and with that power must come great scrutiny, as I am sure there will be in this House.

My second observation is that regulating online content is hard. It is hard because of scale. If regulating traditional media is like air traffic controllers managing a few thousand flights passing over the UK each day, then regulating social media is more like trying to control all the 30 million private cars that have access to UK roads. It is hard because it requires judgment. For many types of speech there is not a bright line between what is legal and illegal so you have to work on the basis of likelihoods and not certainties. It is hard because it requires trade-offs—processes designed to remove “bad” content will invariably catch some “good” content and you have to decide on the right balance between precision and recall for any particular system, and the noble Baroness, Lady Anderson of Stoke-on-Trent, has already referred to some of these challenges with specific examples.

I make this observation not to try and elicit any sympathy for online services, but rather some sympathy for Ofcom as we assign it the most challenging of tasks. This brings me to my first plea, which is that we allow Ofcom to make decisions about what constitutes compliance with the duties of care in the Bill without others second-guessing it. Because judgments and trade-offs are a necessary part of content moderation, there will always be people who take opposing views on where lines should have been drawn. These views may come from individuals, civil society or even Ministers and may form important and valuable input for Ofcom’s deliberations. But we should avoid creating mechanisms that would lead to competing and potentially conflicting definitions of compliance emerging. One chain of command—Parliament to Ofcom to the platforms—is best for accountability and effective regulation.

My second plea is for us to avoid cookie banner syndrome. The pop-ups that we all click on when visiting websites are not there for any technical reason but because of a regulatory requirement. Their origins lie in a last-minute amendment to the e-privacy directive from Members of the European Parliament who had concerns about online behavioural advertising. In practice, they have had little impact on advertising while costing many millions and leaving most users at best mildly irritated and at worst at greater risk, as they learn to click through anything to close banners and get to websites.

There are several elements in this Bill that are at risk of cookie banner syndrome. Measures such as age and identity verification and content controls can be useful if done well but could also be expensive and ineffective if we mandate solutions that look good on paper but do not work in practice. If you see me mouthing “cookies” at you as we discuss the Bill, please do not see it as an offer of American biscuits but as a flag that we may be about to make an expensive mistake.

This brings me to my final point, which is an offer of technical advice for any noble Lords trying to understand how the Bill will work in practice: my door and inbox are always open. I have spent 25 years working on internet regulation as poacher turned gamekeeper, turned poacher, turned gamekeeper. I may have a little more sympathy with the poachers than most politicians, but I am all gamekeeper now and keen to see this Bill become law. For those who like this kind of thing, I share more extensive thoughts on the Bill than I can get into four minutes in a blog and podcast called “Regulate Tech”.

Freedom of Expression (Communications and Digital Committee Report)

Thursday 27th October 2022

Lords Chamber

Lord Allan of Hallam (LD)

My Lords, I shall seek not to go over time. I congratulate the noble Lord, Lord Gilbert, and the committee on the report. It is very timely to debate it today—the day on which the EU’s Digital Services Act comes into law, and as we ourselves eagerly anticipate the Online Safety Bill. I want to make a short contribution on the basis of having spent a decade inside one of the platforms, making decisions about how to manage content.

We are here with the Online Safety Bill and the Digital Services Act because we, the politicians, do not trust private companies to make decisions about their platforms. The noble Lord, Lord Gilbert, outlined some of the reasons why that trust has evaporated. The position now is that we are taking power to ourselves to tell platforms how to manage content, as a condition of operating in the UK market, and we will delegate the day-to-day enforcement of those rules to our chosen regulator, Ofcom.

An important question that arises from this, which the report rightly focuses on, is whether we should instruct Ofcom to consider only illegal speech or to bring in a wide range of other types of harmful speech. Because of concerns about whether the regulator should enforce against legal speech, there is now an interest in whether the definitions of “legal” and “harmful” could be more closely aligned. Today, I want to make a necessarily condensed argument for why this would be a mistake, both as a matter of principle and as a practical matter.

Turning first to the principle, we often hear calls to align online and offline standards. In our real-world interactions, we do not rely solely on the law to manage speech behaviour; this is to build on some of the arguments made by the noble Baroness, Lady O’Neill. To take an example, I could cover myself in swastikas and hand out copies of Mein Kampf entirely legally in the United Kingdom. There is no law that prohibits me. Yet were I to try to do that in most public spaces, such as by going to a football ground, I would be stopped on the basis that the speech norms prohibit my doing that, rather than because I had broken the law. We have a gap between what is unacceptable speech and what is illegal speech. This is not a bug but a feature of our speech norms in the United Kingdom.

It would be a mistake to try to make all unacceptable speech illegal or, equally, to deem all legal speech acceptable and try to force platforms to carry it. We are left with a sustained situation where there will be a gap between what we as a population believe is acceptable and what the law outlines, and that is right. We want to keep the legal prohibitions—the criminalisation of speech—as minimal as possible.

Turning to the practical considerations, which the noble Lord, Lord Gilbert, again talked about, it is sometimes assumed that there is a bright line between legal and illegal content. My experience over many years is that there is no such bright line but many shades of grey. Again, to illustrate this with a specific example, many people would post on social media pictures of Abdullah Öcalan, the leader of the PKK, a proscribed terrorist organisation in the UK. Now, when someone posts that picture, are they supporting the peace process he is engaged in in Turkey? Are they supporting him as a terrorist? Are they supporting his socialist ideals or the YPG in Syria, which also looks to Abdullah Öcalan? There is no way to understand the purpose and intent from that photo, so you have to make a judgment. At one end of the spectrum, you could say, “Look, I am so worried about terrorist content that I am going to take down every picture of Abdullah Öcalan”, knowing that you will be taking down many forms of legal expression. At the other end, you could say, “I will leave them all up, and if I do so I know that I will be permitting some expressions of support for terrorism, or some illegality to take place.” There are of course many points in between.

We have an opportunity now to shift where those judgments are made in the new structure outlined in the Online Safety Bill. Platforms will have to respond to guidance and codes of conduct precisely on these issues of how they make judgments, and we, as Parliament, will have a role in setting that guidance and those codes of conduct, as Ofcom will bring them to us. We are moving into a world where decisions will not necessarily get any easier but will no longer be the sole preserve of the platforms. It is a benefit for public accountability that there will be an official government or parliamentary view expressed through Ofcom’s codes of conduct. Equally, we as Parliament—or the British establishment—will be responsible in future for the decisions made around content moderation. I fear that I may have jumped out of the platform frying pan into the regulatory fire by engaging from this side of the argument, but the Online Safety Bill will be a significant improvement.

Social Media: Deaths of Children

Thursday 20th January 2022

Grand Committee

Lord Allan of Hallam (LD)

My Lords, I will speak to one particular issue that the noble Baroness has raised, quite rightly in my opinion, in this debate and in the report of the Draft Online Safety Bill Joint Committee, of which I know she was a very active member. This is the question of access to data from the accounts of people who have sadly taken their own lives where there is a view that it may reveal something useful and important for their grieving relatives.

I do this as somebody who used to work for a social media platform and took part in the decision-making process on responding to requests for data in these tragic circumstances. In the internal debate, we had to weigh two potential harms against each other. It was obvious that refusing to disclose data would add to the pain and distress of grieving families, which the noble Baroness eloquently described for us, and, importantly, reduce opportunities for lessons to be learned from these awful situations. But there was also a fear that disclosing data might lead to other harms if it included sensitive information related to the connections of the person who had passed away.

The reluctance to disclose is sometimes described as being for “privacy reasons”. We should be more explicit; the concern in these cases is that, in trying to address one tragedy, we take an action that leads to further tragedy. The nightmare scenario for those discussing these issues within the companies is that another young person becomes so distressed by something that has been disclosed that they go on to harm themselves in turn. This genuine fear means that platforms will likely err on the side of non-disclosure as long as providing data is discretionary for them. If we want to solve this problem, we need to move to a system where disclosure is mandated in some form of legal order. I will briefly describe how this might work.

Families should not have to go directly to companies at a time of serious distress; they should instead be able to turn to a specialist unit within our court system which can assess their request and send disclosure orders to relevant companies. The noble Baroness eloquently described the problem we have with the status quo, where people approach companies directly. The platforms would then be required to provide data to the courts, which would need to be able to carry out two functions before making it available to families and coroners as appropriate.

First, they should be able to go through the data to identify whether there are particular sensitivities that might require them to withhold or effectively anonymise any of the content. To the extent possible, they should notify affected people and seek consent to the disclosure. In many cases, the platforms will have contact details for those individuals. Secondly, they must be able to consider any conflicts of law that might arise from disclosure, especially considering content related to individuals who may be protected by laws outside of the jurisdiction of the UK courts. This would need to include making decisions on content where consent has been withheld. If we could set up a structure such as this, we could have a workable regime that would work for all interested parties.

A few minutes is obviously not long enough to cover all these issues in detail, so I will publish a more comprehensive post on my blog, which is aptly named regulate.tech. I thank the noble Baroness for creating an opportunity to consider this important issue, one I am sure we will return to during the passage of the online safety Bill.