Viscount Colville of Culross debates involving the Department for Digital, Culture, Media & Sport during the 2019 Parliament

Thu 25th May 2023 - Online Safety Bill (Lords Chamber) - Committee stage: Part 1
Tue 23rd May 2023 - Online Safety Bill (Lords Chamber) - Committee stage: Part 1
Thu 11th May 2023
Tue 9th May 2023 - Online Safety Bill (Lords Chamber) - Committee stage: Part 2
Tue 9th May 2023 - Online Safety Bill (Lords Chamber) - Committee stage: Part 1
Tue 2nd May 2023 - Online Safety Bill (Lords Chamber) - Committee stage: Part 1
Wed 1st Feb 2023
Wed 11th Jan 2023

Baroness Berridge (Con)

I, too, thank my noble friend the Government Whip. I apologise as well if I have spoken discourteously in the Committee: I was not sure whose name was on which amendment, so I will continue.

Physically, I am, of course, working in my home. If that behaviour had happened in the office, it would be an offence, an assault: “intentional or reckless application of unlawful force to another person”. It will not be an offence in the metaverse and it is probably not harassment because it is not a course of conduct.

Although the basic definition of user-to-user content covers the metaverse, as does the concept of encountering, which has been mentioned in relation to content under Clause 207 and is broad enough to cover haptic suits, the restriction to illegal content could be problematic, because the metaverse is a complex of live interactions that mimics real life and its behaviours, including criminal ones. Also, the avatar of an adult could sexually assault the avatar of a child in the metaverse, and with haptic technologies this would not be just a virtual experience. Potentially even more fundamentally than Amendment 125, the Bill is premised on the internet being a solely virtual environment when it comes to content that can harm. But what I am seeking to outline is that conduct can also harm.

I recognise that we cannot catch everything in this Bill at this moment. This research is literally hot off the press; it is only a few weeks old. At the very least, it highlights the need for future-proofing. I am aware that some of the issues I have highlighted about the fundamental difference between conduct and content refer to clauses noble Lords may already have debated. However, I believe that these points are significant. It is just happenstance that the research came out and is hot off the press. I would be grateful if the Minister would meet the Dawes Centre urgently to consider whether there are further changes the Government need to make to the Bill to ensure that it covers the harms I have outlined.

Viscount Colville of Culross (CB)

My Lords, I have put my name to Amendments 195, 239 and 263. I also strongly support Amendment 125 in the name of my noble friend Lady Kidron.

During this Committee there have been many claims that a group of amendments is the most significant, but I believe that this group is the most significant. This debate comes after the Prime Minister and the Secretary of State for Science, Innovation and Technology met the heads of leading AI research companies in Downing Street. The joint statement said:

“They discussed safety measures … to manage risks”


and called for

“international collaboration on AI safety and regulation”.

Surely this Bill is the obvious place to start responding to those concerns. If we do not future-proof this Bill against changes in digital technology, which are arriving at an ever-faster rate, it will be obsolete even before it is implemented.

My greatest concern is the arrival of AI. The noble Baroness, Lady Harding, has reminded us of the warnings from the godfather of AI, Geoffrey Hinton. If he is not listened to, who on earth should we be listening to? I wholeheartedly support Amendment 125. Machine-generated content is present in so much of what we see on the internet, and its presence is increasing daily. It is the future, and it must be within scope of this Bill. I am appalled by the examples that the noble Baroness, Lady Harding, has brought before us.

In the Communications and Digital Committee inquiry on regulating the internet, we decided that horizon scanning was so important that we called for a digital authority to be created which would look for harms developing in the digital world, assess how serious a threat they posed to users and develop a regulatory response. The Government did not take up these suggestions. Instead, Ofcom has been given the onerous task of enforcing the triple shield which, under this Bill, will protect users to different degrees into the future.

Amendment 195 in the name of the right reverend Prelate the Bishop of Oxford will ensure that Ofcom has knowledge of how well the triple shield is working, which is essential. Surveys of thousands of users undertaken by companies such as Kantar give an invaluable snapshot of what is concerning users now. These must be fed into research by Ofcom to ensure that future developments across the digital space are monitored, updated and brought to the attention of the Secretary of State and Parliament on a regular basis.

Amendment 195 will reveal trends in harms which might not be picked up by Ofcom under the present regime. It will look at the risk arising for individuals from the operation of Part 3 services. Clause 12 on user empowerment duties has a list of content and characteristics from which users can protect themselves. However, the characteristics for which or content with which users can be abused will change over time and these changes need to be researched, anticipated and implemented.

This Bill has proved in its long years of gestation that it takes time to change legislation, while changes on the internet take just minutes or are already here. The regime set up by these future-proofing amendments will at least go some way to protecting users from these fast-evolving harms. I stress to your Lordships’ Committee that this is very much precautionary work. It should be used to inform the Secretary of State of harms which are coming down the line. I do not think it will give power automatically to expand the scope of harms covered by the regime.

Amendment 239 inserts a new clause for an Ofcom future management of risks review. This will help feed into the Secretary of State review regime set out in Clause 159. Clause 159(3)(a) currently looks at ensuring that regulated services are operating using systems and processes which, so far as relevant, are minimising the risk of harms to individuals. The wording appears to mean that the Secretary of State will be viewing all harms to individuals. I would be grateful if the Minister could explain to the Committee the scope of the harms set out in Clause 159(3)(a)(i). Are they meant to cover only the harms of illegality and harms to children, or are they part of a wider examination of the harms regime to see whether it needs to be contracted or expanded? I would welcome an explanation of the scope of the Secretary of State’s review.

The real aim of Amendment 263 is to ensure that the Secretary of State looks at research work carried out by Ofcom. I am not sure how politicians will come to any conclusions in the Clause 159 review unless they are required to look at all the research published by Ofcom on future risk. I would like the Minister to explain what research the Secretary of State would rely on for this review unless this amendment is accepted. I hope Amendment 263 will also encourage the Secretary of State to look at possible harms not only from content, but also from the means of delivering this content.

This aim was the whole point of Amendment 261, which has already been debated. However, it needs to be borne in mind when considering that harms come not just from content, but also from the machine technology which delivers it. Every day we read about new developments and threats posed by a fast-evolving internet. Today it is concerns about ChatGPT and the race for the most sophisticated artificial intelligence. The amendments in this group will provide much-needed reinforcement to ensure that the Online Safety Bill remains a beacon for continuing safety online.

The Lord Bishop of Chelmsford

My Lords, I shall speak in favour of Amendments 195, 239 and 263, tabled in the names of my right reverend friend the Bishop of Oxford, the noble Lord, Lord Clement-Jones, and the noble Viscount, Lord Colville of Culross, whom I thank for his comments.

My right reverend friend the Bishop of Oxford regrets that he is unable to attend today’s debate. I know he would have liked to be here. My right reverend friend tells me that the Government’s Centre for Data Ethics and Innovation, of which he was a founding member, devoted considerable resource to horizon scanning in its early years, looking for the ways in which AI and tech would develop across the world. The centre’s analysis reflected a single common thread: new technologies are developing faster than we can track them and they bring with them the risk of significant harms.

This Bill has also changed over time. It now sets out two main duties: the illegal content duty and the children duty. These duties have been examined and debated for years, including by the joint scrutiny committee. They are refined and comprehensive. Risk assessments are required to be “suitable and sufficient”, which is traditional language from 20 years of risk-based regulation. It ensures that the duties are fit for purpose and proportionate. The duties must be kept up to date and in line with any service changes. Recent government amendments now helpfully require companies to report to Ofcom and publish summaries of their findings.

However, in respect of harms to adults, in November last year the Government suddenly took a different tack. They introduced two new groups of duties as part of a novel triple shield framework, supplementing the duty to remove illegal harms with a duty to comply with their own terms of service and a duty to provide user empowerment tools. These new duties are quite different in style to the illegal content and children duties. They have not benefited from the prior years of consultation.

As this Committee’s debates have frequently noted, there is no clear requirement on companies to assess in the round how effective their implementation of these new duties is or to keep track of their developments. The Government have changed this Bill’s system for protecting adults online late in the day, but the need for risk assessments, in whatever system the Bill is designed around, has been repeated again and again across Committee days. Even at the close of day eight on Tuesday, the noble Lords, Lord Allan of Hallam and Lord Clement-Jones, referred explicitly to the role of risk assessment in validating the Bill’s systems of press reforms. Surely this persistence across days and groups of debate reflects the systemically pivotal role of risk assessments in what is, after all, meant to be a systems and processes rather than a content-orientated Bill.

But it seems that many people on many sides of this Committee believe that an important gap in risk assessment for harms to adults has been introduced by these late changes to the Bill. My colleague the right reverend Prelate is keen that I thank Carnegie UK for its work across the Bill, including these amendments. It notes:

“Harms to adults which might trickle down to become harms to children are not assessed in the current Bill”.


The forward-looking parts of its regime need to be strengthened to ensure that Parliament and the Secretary of State review the new ways in which harms manifest as technology races along, and to ensure that they then have the right advice for deciding what to do about them. To improve that advice, Ofcom needs to risk assess the future and then to report its findings.

Online Safety Bill

Viscount Colville of Culross Excerpts
I also hope that the Government support Parliament in enhancing its oversight of the regulators in which so much power is being vested. However expert, independent and professional they may be—I note that my noble friend Lord Grade is not in the Chamber today, as I believe he is overseas this week, but no one respects and admires my noble friend more than I do, and I am not concerned in any way about the expertise and professionalism of Ofcom—none the less we are in a situation where they are being vested with a huge amount of power and we need to make sure that the oversight of them is right. Even if I do not support that which is specifically put forward by the noble Lord, Lord Stevenson, this is an area where we need to move forward but we need the Government to support us in doing so if we are going to make it happen. I look forward to what my noble friend has to say in response to this group.
Viscount Colville of Culross (CB)

My Lords, I have put my name to Amendments 113, 114, 117, 118, 120 and 257. As the noble Baroness, Lady Stowell, has said, it is crucial that Ofcom both has and is seen to have complete independence from political interference when exercising its duty as a regulator.

On Ofcom’s website there is an article titled “Why Independence Matters in Regulating TV and Radio”—for the purposes of the Bill, I suggest that we add “Online”. It states:

“We investigate following our published procedures which contain clear, transparent and fair processes. It’s vital that our decisions are always reached independently and impartially”.


I am sure there are few Members of the Committee who would disagree with that statement. That sentiment is supported by a recent UNESCO conference to create global guidance for online safety regulation, whose concluding statement said that

“an independent authority is better placed to act impartially in the public interest and to avoid undue influence from political or industry interests”.

As the noble Baroness, Lady Stowell, has said, that is what successive Governments have striven to do with Ofcom’s regulation of broadcast and radio. Now the Government and Parliament must succeed in doing the same by setting up this Bill to ensure absolute independence for Ofcom in regulating the digital space.

The codes of practice drawn up by Ofcom will be central to the guidance, setting the parameters within which the tech companies must operate, so it is essential that the regulator, when drawing them up, can act independently of political interference. In my view and that of many noble Lords, Clause 39 does not provide that level of independence from political interference. No impartial observer can think that the clause as drafted allows Ofcom the independence that it needs to shape the limits of the tech platforms’ content. In my view, this is a danger to freedom of expression in our country, because it gives the Secretary of State permission to interfere continually and persistently in Ofcom’s work.

Amendments 114 and 115 would ensure a badly needed reinforcement of the regulator’s independence. I see why the Minister would want a Secretary of State to have the right to direct the regulator, but I ask him to bear in mind that it will not always be a Minister he supports who is doing the directing. In those circumstances, surely he would prefer a Secretary of State to observe or have regard to the views on the draft codes of practice. Likewise, the endless ping-pong envisaged by Clause 39(7) and (9) allows huge political pressure and interference to be placed on the regulator. This would not be allowed in broadcast regulation, so why is it allowed for online regulation, which is already the dominant medium and can get only more dominant and more important?

Amendment 114 is crucial. Clause 39(1)(a), allowing the Minister’s direction to cover public policy, covers almost everything and is impossibly broad and vague. If the Government want an independent regulator, can the Minister explain how this power would facilitate that goal? I am unsure of how the Government will approach this issue, but I am told that they want to recognise the concerns about an overmighty Secretary of State by bringing forward their own amendment, limiting the powers of direction to specific policy areas. Can the Minister confirm that he is looking at using the same areas as in the Communications Act 2003, which are

“national security … relations with the government of a country … compliance with international obligations of the United Kingdom … the safety of the public or of public health”?

I worry about any government amendment which might go further and cover economic policy and burden to business. I understand that the Government would want to respond to the concerns that this Bill might create a burden on business and therefore could direct Ofcom to ease regulations in these areas. However, if this area is to be included, surely it will create a lobbyists’ charter. We all know how effective the big tech companies have been at lobbying the Government and slowing down the process of shaping this Bill. The Minister has only to talk to some of the Members who have helped to shape the Bill to know the determination and influence of those lobbying companies.

To allow the DCMS Secretary of State to direct Ofcom continuously to modify the codes of practice until they are no longer a burden to business would dramatically dilute the power and independence of the UK’s world-respected media regulator. Surely this is not what the people of Britain would want; the Minister should not want it either. The words “vague” and “broad” are used repeatedly by freedom of speech campaigners when looking at the powers of political interference in the Bill.

When the draft Bill came out, I was appalled by the extraordinary powers that it gave the Secretary of State to modify the content covered by “legal but harmful”, and I am grateful to the Government for responding to the Joint Committee and many other people’s concerns about this potentially authoritarian power. Clause 39 is not in the same league, but for all of us who want to ensure that Ministers do not have the power to interfere in the independence of Ofcom, I ask the Minister to accept the well-thought-through solutions represented by these amendments and supported by all Benches. I also support the request made by the noble Baroness, Lady Stowell, that Parliament should be involved in the oversight of Ofcom. I ask the Minister to respond to these widely supported amendments, either by accepting them or by tabling amendments of his own which guarantee the independence of the regulator.

Lord Lucas (Con)

My Lords, I also have a pair of amendments in this group. I am patron of a charity called JobsAware, which specialises in dealing with fraudulent job advertisements. It is an excellent example of collaboration between government and industry in dealing with a problem such as this. Going forward, though, they will be much more effective if there is a decent flow of information and if this Bill provides the mechanism for that. I would be very grateful if my noble friend would agree to a meeting, between Committee and Report, to discuss how that might best be achieved within the construct of this Bill.

It is not just the authorities who are able to deter these sorts of things from happening. If there is knowledge spread through reputable networks about who is doing these things, it becomes much easier for other people to stop them happening. At the moment, the experience of using the internet must bear some similarity to walking down a Victorian street in London with your purse open. It really is all our responsibility to try to do something about this, since we now live so much of our life online. I very much look forward to my noble friend’s response.

Viscount Colville of Culross (CB)

My Lords, I had the great privilege of serving as a member of this House’s Fraud Act 2006 and Digital Fraud Committee under the excellent chairing of the noble Baroness, Lady Morgan. She has already told us of the ghastly effects that fraud has on individuals and indeed its adverse effects on businesses. We heard really dramatic statistics, such as when Action Fraud told us that 80% of fraud is cyber enabled.

Many of us here will have been victims of fraud—I have been a victim—or know people who have been victims of fraud. I was therefore very pleased when the Government introduced the fraudulent advertising provisions into the Bill, which will go some way to reducing the prevalence of online fraud. It seems to me that it requires special attention, which is what these amendments should do.

We heard in our inquiry about the problems that category 1 companies had in taking down fraudulent advertisements quickly. Philip Milton, the public policy manager at Meta, told us that it takes between 24 and 48 hours to review possibly harmful content after it has been flagged to the company. He recognised that, due to the deceptive nature of fraudulent advertising, Meta’s systems do not always recognise that advertising is fraudulent and, therefore, take-down rates would be variable. That is one of the most sophisticated tech platforms—if it has difficulties, just imagine the difficulty that other companies have in both recognising and taking down fraudulent advertising.

Again and again, the Bill recognises the difficulties that platforms have in systematising the protections provided in the Bill. Fraud has an ever-changing nature and is massively increasing—particularly so for fraudulent advertising. It is absolutely essential that the highest possible levels of transparency are placed upon the tech companies to report their response to fraudulent advertising. Both Ofcom and users need to be assured that not only do the companies have the most effective reporting systems but, just as importantly, they have the most effective transparency to check how well they are performing.

To do this, the obligations on platforms must go beyond the transparency reporting requirements in the Bill. These amendments would ensure that they include obligations to provide information on the incidence of fraudulent advertising, in line with other types of priority illegal content. These increased obligations are part of checking the effectiveness of the Bill when it comes to being implemented.

The noble Baroness, Lady Stowell, told us on the fifth day of Committee, when talking about the risk-assessment amendments she had tabled:

“They are about ensuring transparency to give all users confidence”.—[Official Report, 9/5/23; col. 1755.]


Across the Bill, noble Lords have repeatedly stated that there needs to be a range of ways to judge how effectively the protections provided are working. I suggest to noble Lords that these amendments are important attempts to help make the Bill more accountable and provide the data to future-proof the harms it is trying to deal with. As we said in the committee report:

“Without sufficient futureproofing, technology will most likely continue to create new opportunities for fraudsters to target victims”.


I ask the Minister to at least look at some of these amendments favourably.

Baroness Kidron (CB)

My Lords, I shall say very briefly in support of these amendments that in 2017, the 5Rights Foundation, of which I am the chair, published the Digital Childhood report, which in a way was the thing that put the organisation on the map. The report looked at the evolving capacity of children through childhood, what technology they were using, what happened to them and what the impact was. We are about to release the report again, in an updated version, and one of the things that is most striking is the introduction of fraud into children’s lives. At the point at which they are evolving into autonomous people, when they want to buy presents for their friends and parents on their own, they are experiencing what the noble Baroness, Lady Morgan, expressed as embarrassment, loss of trust and a sense of deserting confidence—I think that is probably the phrase. So I just want to put on the record that this is a problem for children also.

Online Safety Bill

Viscount Colville of Culross Excerpts
Baroness Bull (CB)

My Lords, I will speak to the amendments in the name of the noble Baroness, Lady Stowell, to which I have added my name. As we heard, the amendments originally sat in a different group, on the treatment of legal content accessed by adults. Noble Lords will be aware from my previous comments that my primary focus for the Bill has been on the absence of adequate provisions for the protection of adults, particularly those who are most vulnerable. These concerns underpin the brief remarks I will make.

The fundamental challenge at the heart of the Bill is the need to balance protection with the right to freedom of expression. The challenge, of course, is how. The noble Baroness’s amendments seek to find that balance. They go beyond the requirements on transparency reporting in Clause 68 in several ways. Amendment 46 would provide a duty for category 1 services to maintain an up-to-date document for users of the service, ensuring that users understand the risks they face and how, for instance, user empowerment tools can be used to help mitigate these risks. It also provides a duty for category 1 services to update their risk assessments before making any “significant change” to the design or operation of their service. This would force category 1 services to consider the impact of changes on users’ safety and make users aware of changes before they happen, so that they can take any steps necessary to protect themselves and prepare for them. Amendment 47 provides additional transparency by providing a duty for category 1 services to release a public statement of the findings of the most recent risk assessment, which includes any impact on freedom of expression.

The grouping of these amendments is an indication, if any of us were in doubt, of the complexity of balancing the rights of one group against the rights of another. Regardless of the groupings, I hope that the Minister takes note of the breadth and depth of concerns, as well as the willingness across all sides of the Committee to work together on a solution to this important issue.

Viscount Colville of Culross (CB)

My Lords, I put my name to Amendment 51, which is also in the name of the noble Lords, Lord Stevenson and Lord McNally. I have done so because I think Clause 15 is too broad and too vague. I declare an interest, having been a journalist for my entire career. I am currently a series producer of a series of programmes on Ukraine.

This clause allows journalism on the internet to be defined simply as the dissemination of information, which surely covers all posts on the internet. Anyone can claim that they are a journalist if that is the definition. My concern is that it will make a nonsense of the Bill if all content is covered as journalism.

I support the aims behind the clause to protect journalism in line with Article 10. However, I am also aware of the second part of Article 10, which warns that freedom of speech must be balanced by duties and responsibilities in a democratic society. This amendment aims to hone the definition of journalism to that which is in the public interest. In doing so, I hope it will respond to the demands of the second part of Article 10.

It has never been more important to create this definition of journalism in the public interest. We are seeing legacy journalism of newspapers and linear television being supplanted by digital journalism. Both legacy and new journalism need to be protected. This can be a single citizen journalist, or an organisation like Bellingcat, which draws on millions of digital datapoints to create astonishing digital journalism to prove things such as that Russian separatist fighters shot down flight MH17 over Ukraine.

The Government’s view is that the definition of “in the public interest” is too vague to be useful to tech platforms when they are systematically filtering through possible journalistic content that needs to be protected. I do not agree. The term “public interest” is well known to the courts from the Defamation Act 2013. The law covers the motivation of a journalist, but does not go on to define the content of journalism to prove that it is in the public interest.

--- Later in debate ---
Amendment 51 in the name of the noble Lord, Lord Stevenson of Balmacara, seeks to change the duty of category 1 services to protect journalistic content so it applies only to journalism which they have judged to be in the public interest. This would delegate an inappropriate amount of power to platforms. Category 1 platforms are not in a position to decide what information is in the interests of the British public. Requiring them to do so would undermine why we introduced the Clause 15 duties—
Viscount Colville of Culross (CB)

Why would it not be possible for us to try to define what the public interest might be, and not leave it to the platforms to do so?

Lord Parkinson of Whitley Bay (Con)

I ask the noble Viscount to bear with me. I will come on to this a bit later. I do not think it is for category 1 platforms to do so.

We have introduced Clause 15 to reduce the powers that the major technology companies have over what journalism is made available to UK users. Accordingly, Clause 15 requires category 1 providers to set clear terms of service which explain how they take the importance of journalistic content into account when making their moderation decisions. These duties will not stop platforms removing journalistic content. Platforms have the flexibility to set their own journalism policies, but they must enforce them consistently. They will not be able to remove journalistic content arbitrarily. This will ensure that platforms give all users of journalism due process when making content moderation decisions. Amendment 51 would mean that, where platforms subjectively reached a decision that journalism was not conducive to the public good, they would not have to give it due process. Platforms could continue to treat important journalistic content arbitrarily where they decided that this content was not in the public interest of the UK.

In his first remarks on this group the noble Lord, Lord Stevenson, engaged with the question of how companies will identify content of democratic importance, which is content that seeks to contribute to democratic political debate in the UK at a national and local level. It will be broad enough to cover all political debates, including grass-roots campaigns and smaller parties. While platforms will have some discretion about what their policies in this area are, the policies will need to ensure that platforms are balancing the importance of protecting democratic content with their safety duties. For example, platforms will need to consider whether the public interest in seeing some types of content outweighs the potential harm it could cause. This will require companies to set out in their terms of service how they will treat different types of content and the systems and processes they have in place to protect such content.

Amendments 57 and 62, in the name of my noble friend Lord Kamall, seek to impose new duties on companies to protect a broader range of users’ rights, as well as to pay particular attention to the freedom of expression of users with protected characteristics. As previously set out, services will have duties to safeguard the freedom of expression of all users, regardless of their characteristics. Moreover, UK providers have existing duties under the Equality Act 2010 not to discriminate against people with characteristics which are protected in that Act. Given the range of rights included in Amendment 57, it is not clear what this would require from service providers in practice, and their relevance to service providers would likely vary between different rights.

Amendment 60, in the name of the noble Lord, Lord Clement-Jones, and Amendment 88, in the name of the noble Lord, Lord Stevenson, probe whether references to privacy law in Clauses 18 and 28 include Article 8 of the European Convention on Human Rights. That convention applies to member states which are signatories. Article 8(1) requires signatories to ensure the right to respect for private and family life, home and correspondence, subject to limited derogations that must be in accordance with the law and necessary in a democratic society. The obligations flowing from Article 8 do not apply to individuals or to private companies and it would not make sense for these obligations to be applied in this way, given that states which are signatories will need to decide under Article 8(2) which restrictions on the Article 8(1) right they need to impose. It would not be appropriate or possible for private companies to make decisions on such restrictions.

Providers will, however, need to comply with all UK statutory and common-law provisions relating to privacy, and must therefore implement safeguards for user privacy when meeting their safety duties. More broadly, Ofcom is bound by the Human Rights Act 1998 and must therefore uphold Article 8 of the European Convention on Human Rights when implementing the Bill’s regime.

Online Safety Bill

Viscount Colville of Culross Excerpts
I very much hope that my noble friend will say what I want to say, which is that, yes, there is an issue and we would like to do something. We understand the motivation here, but this is very much the wrong way of going about it. It is inimical to free speech and it leads to absurd conclusions.
Viscount Colville of Culross (CB)

I support Amendment 44. I am pleased that, as part of the new triple shield, the Government have introduced Clause 12 on “User empowerment duties”, which allow users to protect themselves, not just from abusive posts from other users but from whole areas of content. In the Communications and Digital Committee’s inquiry, we had plenty of evidence from organisations representing minorities and people with special characteristics who are unable adequately to protect themselves from the hate they receive online. I am glad that subsections (10) to (12) recognise specific content and users with special characteristics who are targets of abuse and need to be able to protect themselves, but subsection (3) requests that these features should be

“designed to effectively … reduce the likelihood of the user encountering content”

they want to avoid. I am concerned that “effectively” will be interpreted subjectively by platforms in scope and that each will interpret it differently.

At the moment, it will not be possible for Ofcom to assess how thoroughly the platforms have been providing these empowerment tools of protection for users. If the features are to work, there must be an overview of how effective they are being and how well they are working. When the former Secretary of State, Michelle Donelan, was asked about this, she said that there was nothing in this clause to pin an assessment on. It seems to me that the lists in Clause 12 create plenty of criteria on which to hang an assessment.

The new duties in Clause 12 provide for control tools for users against very specific content that is abusive or incites hatred on the basis of race, ethnicity, religion, disability, sex, gender reassignment or sexual orientation. However, this list is not exhaustive. There will inevitably be areas of content for which users have not been given blocking tools, including pornography, violent material and other material that is subject to control in the offline world.

Not only will the present list for such tools need to be assessed for its thoroughness in allowing users to protect themselves from specific harms, but surely the types of harm from which they need to protect themselves will change over time. Ofcom will need regularly to assess where these harms are and make sure that service providers regularly update their content-blocking tools. Without such an assessment, it will be hard for Ofcom and civil society to understand what the upcoming concerns are with the tools.

The amendment would provide a transparency obligation, which would demand that service providers inform users of the risks present on the platform. Surely this is crucial when users are deciding what to protect themselves from.

The assessment should also look for unintended restrictions on freedom of expression created by the new tools. If the tools are overprotective, they could surely create a bubble and limit users’ access to information that they might find useful. For example, the user might want to block material about eating disorders, but the algorithm might interpret that to mean limiting the user’s access to content on healthy lifestyles or nutrition. We are also told that the algorithms do not understand irony and humour. When the filters are used to stop content that is abusive or incites hatred on the basis of users’ particular characteristics, they might also remove artistic, humorous or satirical content.

Repeatedly, we are told that the internet creates echo chambers, where users read only like-minded opinions. These bubbles can create an atmosphere where freedom of expression is severely limited and democracy suffers. A freedom of expression element to the assessment would also, in these circumstances, be critical. We are told that the tech platforms often do not know what their algorithms do and, not surprisingly, the algorithms often evolve beyond their original intentions. The tools demanded by Clause 12 therefore need to be carefully assessed, both to ensure that they keep up to date with the trends of abuse on the internet and to check for the unintended consequences they might create in curbing freedom of expression.

Throughout the Bill, there is a balancing act between freedom of expression and protection from abuse. The user empowerment tools are potentially very powerful, and neither the service providers, the regulators nor the Government know what their effects will be. It is incumbent on the Government to introduce an assessment to check regularly how the user empowerment duties are working; otherwise, how can they be updated, and how can Ofcom discover what content is being unintentionally controlled? I urge the Minister, in the name of common sense, to ensure that these powerful tools unleashed by the Bill will not be misused or become outdated in a fast-changing digital world.

Baroness Kidron (CB)

My Lords, I thank the noble Lord, Lord Moylan, for his words—I thought I was experiencing time travel there—and am sympathetic to many of the issues that he has raised, although I think that some of the other amendments in the group tackle those issues in a slightly different way.

I support Amendments 44 and 158 in the name of the right reverend Prelate the Bishop of Oxford. Requiring a post-rollout assessment to ensure that the triple shield acts as we are told it will seems to be a classic part of any regulatory regime that is fit for purpose: it needs to assess whether the system is indeed working. The triple shield is an entirely new concept, and none of the burgeoning regulatory systems around the world is taking this approach, so I hope that both the Government and Ofcom welcome this very targeted and important addition to the Bill.

I will also say a few words about Amendments 154 and 218. It seems to me that, in moving away from legal but harmful—which as a member of the pre-legislative committee I supported, under certain conditionality that has not been met, but none the less I did support it—not enough time and thought have been given to the implications of that. I do not understand, and would be grateful to the Minister if he could help me understand, how Ofcom is to determine whether a company has met its own terms and conditions—and by any means, not only by the means of a risk assessment.

I want to make a point that the noble Baroness, Lady Healy, made the other day—but I want to make it again. Taking legal but harmful out and having no assessment of whether a company has met its general safety duties leaves the child safety duties as an island. They used to be something that was added on to a general system of safety; now they are the first and only port of call. Again, because of the way that legal but harmful fell out of the Bill, I am not sure whether we have totally understood how the child risk assessments sit without a generally cleaned up or risk-assessed digital environment.

Finally, I will speak in support of Amendment 160, which would have Ofcom say what “adequate and appropriate” terms are. To a large degree, that is my approach to the problem that the noble Lord, Lord Moylan, spoke about: let Parliament and the regulator determine what we want to see—as was said on the data protection system, that is how it is—and let us have minimum standards that we can rightly expect, based on UK law, as the noble Lord suggested.

I am not against the triple shield per se, but it radically replaced an entire regime of assessment, enforcement and review. I think that some of the provisions in this group really beg the Government’s attention, in order to make sure that there are no gaping holes in the regime.

Viscount Colville of Culross (CB)

I too wish my noble friend Lady Kidron a happy birthday.

I will speak to Amendment 261. Having sat through the Communications Committee’s inquiries on regulating the internet, it seemed to me that the real problem was the algorithms and the way they operated. We have heard that again and again throughout the course of the Bill. It is no good worrying just about the content, because we do not know what new services will be created by technology. This morning we heard on the radio from the Google AI expert, who said that we have no idea where AI will go or whether it will become cleverer than us; what we need to do is to keep an eye on it. In the Bill, we need to make sure that we are looking at the way technology is being developed and the possible harms it might create. I ask the Minister to include that in his future-proofing of the Bill, because, in the end, this is a very fast-moving world and ecosystem. We all know that what is present now in the digital world might well be completely changed within a few years, and we need to remain cognisant of that.

Lord Clement-Jones (LD)

My Lords, we have already had some very significant birthdays during the course of the Bill, and I suspect that, over many more Committee days, there will be many more happy birthdays to celebrate.

This has been a fascinating debate and the Committee has thrown up some important questions. On the second day, we had a very useful discussion of risk which, as the noble Lord, Lord Russell, mentioned, was prompted by my noble friend Lord Allan. In many ways, we have returned to that theme this afternoon. The noble Baroness, Lady Fox, who I do not always agree with, asked a fair question. As the noble Baroness, Lady Kidron, said, it is important to know what harms we are trying to prevent—that is how we are trying to define risk in the Bill—so that is an absolutely fair question.

The Minister has shown flexibility. Sadly, I was not able to be here for the previous debate, and it is probably because I was not that he conceded the point and agreed to put children’s harms in the Bill. That takes us a long way further, and I hope he will demonstrate that kind of flexibility as we carry on through the Bill.

The noble Lord, Lord Moylan, and I have totally different views about what risk it is appropriate for children to face. I am afraid that I absolutely cannot share his view that there is this level of risk. I do not believe it is about eliminating risk—I do not see how you can—but the Bill should be about preventing online risk to children; it is the absolute core of the Bill.

As the noble Lord, Lord Russell, said, the Joint Committee heard evidence from Frances Haugen about the business model of the social media platforms. We listened to Ian Russell, the father of Molly, talk about the impact of an unguarded internet on his daughter. It is within the power of the social media companies to do something about that; this is not unreasonable.

I was very interested in what the noble Viscount, Lord Colville, said. He is right that this is about algorithms, which, in essence, are what we are trying to get to in all the amendments in this really important group. It is quite possible to tackle algorithms if we have a requirement in the Bill to do so, and that is why I support Amendment 261, which tries to address that.

However, a lot of the rest of the amendments are trying to do exactly the same thing. There is a focus not just on moderating harmful content but on the harmful systems that make digital services systematically unsafe for children. I listened with great interest to what the noble Lord, Lord Russell, said about the 5Rights research which he unpacked. We tend to think that media platforms such as Reddit are relatively harmless but that is clearly not the case. It is very interesting that the use of avatars is becoming quite common in the advertising industry to track where advertisements are ending up—sometimes, on pornography sites. It is really heartening that an organisation such as 5Rights has been doing that and coming up with its conclusions. It is extremely useful for us as policymakers to see the kinds of risks that our children are undertaking.

We were reminded about the origins—way back, it now seems—of the Carnegie duty of care. In a sense, we are trying to make sure that that duty of care covers the systems. We have talked about the functionality and harms in terms of risk assessment, about the child safety duties and about the codes of practice. All those need to be included within this discussion and this framework today to make sure that that duty of care really sticks.

I am not going to go through all the amendments. I support all of them: ensuring functionalities for both types of regulated service, and the duty to consider all harms and not just harmful content. It is absolutely not just about the content but making sure that regulated services have a duty to mitigate the impact of harm in general, not just harms stemming from content.

The noble Baroness, Lady Harding, made a terrific case, which I absolutely support, for making sure that the codes of practice are binding and principle based. At the end of the day, that could be the most important amendment in this group. I must admit that I was quite taken with her description of the Government’s response, which was internally contradictory. It was a very weak response to what I, as a member of the Joint Committee, thought was a very strong and clear recommendation about minimum standards.

This is a really important group of amendments and it would not be a difficult concession for the Government to make. They may wish to phrase things in a different way but we must get to the business model and the operation of the algorithms; otherwise, I do not believe this Bill is going to be effective.

I very much take on board what the noble Viscount said about looking to the future. We do not know very much about some of these new generative AI systems. We certainly do not know a great deal about how algorithms within social media companies operate. We will come, no doubt, to later amendments on the ability for researchers to find out more and so on, but transparency was one of the things our Joint Committee was extremely keen on, and this is a start.

Online Safety Bill

Viscount Colville of Culross Excerpts
Viscount Colville of Culross (CB)

My Lords, I declare an interest as a series producer of online and linear content. I, like many noble Lords, can hardly believe that this Bill has finally come before your Lordships’ House. It was in 2017, when I first joined the Communications and Digital Committee, that we started to look at online advertising. We went on to look at regulating the internet in three separate inquiries. I am pleased to see some of those recommendations in the Bill.

It is not surprising that I support the words of the present chair of the committee, the noble Baroness, Lady Stowell, when she said that the Secretary of State still has far too many powers over the regulator. Draft codes of practice, in which Ofcom can give the parameters and direction for the tech companies, and the review of their implementation, are going to be central in shaping the companies’ terms of service. Generally, in democracies, we are seeing regulators of the media given increasing independence, with Governments limiting themselves to setting up the framework and then allowing the regulators to get on with the task at hand. I fear the Bill is not doing that. I understand that the codes will be laid before Parliament, but I would support Parliament having a much stronger power over the shaping of those regulations.

I know that Labour supports a Select Committee having the power to scrutinise this work, but having served on the Communications and Digital Committee, I fear that the examination of consultations from Ofcom would monopolise its entire work. I support the pre-legislative committee’s suggestion of a Joint Committee of Parliament, whose sole job would be to examine regulations and give input. I will support amendments to this effect.

I am also worried about Clauses 156 and 157. I listened to the Minister when he said that amendments to the Secretary of State’s powers of guidance will be brought before the House and that they will be used only in exceptional circumstances. However, the list of subjects on which I understand the Minister will then be able to intervene is still substantial, ranging from public safety through economic policy and burdens to business. Are the Government prepared to consider further limiting these powers to intervene?

I will also look at risk assessments in the Bill. They need to go further than illegal content and child safety. The empowerment lists in Clause 12 are not risk assessed and do not seem to have enough flexibility for what noble Lords know is an ever-changing world of harms. The volume of online content means that moderation is carried out by algorithms. During the inquiries in which I was involved, we were told repeatedly that algorithms are very bad at distinguishing humour and context when deciding on harmful content. Ensuring that the platforms’ systems moderate correctly is difficult. There was a recent case of that: the farcical blocking by Twitter of the astronomer Dr Mary McIntyre, whose account was suspended because her six-second video of a meteor shower was mistaken by the Twitter algorithms for a porn video. For weeks, she was unable to get any response from Twitter. Such mistakes happen only too frequently. Dr McIntyre’s complaint is only one of millions made every year against the tech companies, for being either too keen or not keen enough to take down content and, in some cases, to block accounts. So the Bill needs to include a risk assessment which looks at the threat to free speech from any changes in those systems. Ofcom needs to be able to create those risk assessments and to produce annual reports which can then be laid before a Joint Committee for Parliament’s consideration. That should be supported by an ombudsman.

I would also like to see the definition of safety duties on platforms to take down illegal content changed from “reasonable grounds” to the platform being aware that the content is “manifestly illegal”—and, if possible, for third parties, such as the NCA, to be involved in the process. That will reduce the chance of chilling free speech online as much as possible.

I am also aware that there has been concern over the duties to protect news publishers and journalistic content. Like other noble Lords, I am worried that the scope in respect of the latter is drawn too widely in the Bill, and that it covers all content. I would support amendments which concentrate on protecting journalism in the public interest. The term “in the public interest” is well known to the courts, is present in Section 4 of the Defamation Act, and is used to great effect to protect journalism which is judged to be in the public interest.

I welcome the Bill after its long journey to this House. I am sure that the hard work of fellow Peers and collaboration with the Minister will ensure that it leaves this House in a clearer, more comprehensive and safer state. The well-being of future generations of internet users in this country depends on us getting it right.

Channel 4

Viscount Colville of Culross Excerpts
Wednesday 11th January 2023


Lords Chamber
Lord Parkinson of Whitley Bay (Con)

My noble friend will know, as a former leader of your Lordships’ House, that that decision is above my pay grade, but it is our intention to bring the media Bill forward when parliamentary time allows. I am grateful to her and the other members of your Lordships’ committee for their thoughts, which have been part of the evidence that my right honourable friend and colleagues at the department have weighed up.

Viscount Colville of Culross (CB)

My Lords, I declare my interest as a series producer for an independent television production company. Like many other noble Lords, I welcome the Minister’s Statement. I heard him say that the Government have now given Channel 4 the freedom to produce its own content in order to stimulate the independent sector. At the moment, Channel 4 commissions over 55% of its content from small qualifying production companies and is a major customer for many of the larger producers. Can the Minister explain how he can ensure that setting up the in-house production base will not adversely affect the independent content producer ecosystem?

Lord Parkinson of Whitley Bay (Con)

We know how important our independent production sector is, not just to British television but to our creative industries more widely. We are absolutely committed to ensuring that Channel 4 plays its part in supporting what is a £3 billion sector. We will increase the level of Channel 4’s independent production quota, and, in doing so, we are looking at the potential for introducing specific protections for smaller independent producers.

Public Service Broadcasting: BBC Centenary

Viscount Colville of Culross Excerpts
Thursday 3rd November 2022


Lords Chamber
Viscount Colville of Culross (CB)

My Lords, I declare an interest as the series producer of a new series on the people of Ukraine to be made for international public service broadcasters.

In the words of the Communications and Digital Committee report, Public Service Broadcasting: As Vital as Ever:

“Public service broadcasting can bring the nation together in a way in which other media cannot and can ‘raise the level’ of quality, as well as ensuring continued investment in original UK content”.


In a broadcasting environment in which the PSBs are facing massive threats from the global streamers, they need economic and political support from the Government to remain relevant to British audiences.

Noble Lords have mentioned the uncertainty surrounding Channel 4, with the continued political indecision about whether to go ahead with its privatisation. I take heart from the Secretary of State’s answer in the other place that she is looking at the business case for its sale. As there is no business case, I would suggest to the Minister that it should not take long to resolve the issue. The channel had its most profitable year last year and if its borrowing limits need to be raised to compete with the streamers, then that should be facilitated.

As noble Lords have said, the other important policy for the Government to enact is the prominence regime on digital platforms. At the moment the channels are finding themselves thwarted by the massive power imbalance with the streamers. This matter is urgent. TV manufacturers are demanding huge fees to ensure the prominence of PSB tiles on their platforms. Channel 4 has just had to pull out of talks with the manufacturer of LG TV sets because it demanded too much money to place the channel’s tile in the most prominent position on the home page, while the Amazon platform has just demoted the position of the All 4 tile to make way for the promotion of its own Freevee tiles. Channel 4 has asked Ofcom to investigate the move. The BBC is better placed because of its “must carry” obligations. The broadcasting White Paper had some very important promises to enshrine the principles of “appropriate prominence” on digital platforms, but with every month that the Government delay the new regime, the PSBs lose money.

PSB commercial channels need further support in their business relationships with the streamers. Noble Lords only have to look at ITV’s anguished negotiations with these global giants to see why it is important. The channel is confronting variants of closed platforms when dealing with Amazon, Google and other tech companies, which define the terms on which content is included. This is particularly important with the upcoming launch of ITVX and its wide-ranging digital offer. Amazon has told ITV that it must accept the standard terms of a 30% share of advertising revenue, take it or leave it. It also will not share data with content providers. Netflix is famous for failing to provide any audience data to content producers. However, that seems to be an own goal, as content commissioners will obviously commission material better tailored to audiences if they have the data on who is watching and how they are watching.

The imbalance of power between the platforms and the PSBs strikes me as similar to that affecting news publishers, which I spoke about in last week’s debate on the Free for All? report. I suggested a variation of the mandatory code set up in Australia for news publishers. Surely something similar could be established for TV content providers on platforms in this country. This Government pledged to support the growth of British business and surely our indigenous PSBs should be given all possible help to break open the dominance of the platforms when creating terms for use of their content. I ask the Minister whether such measures could be included in the media or digital markets Bills.

I would also like to put in a plea for Ofcom-licensed radio stations. Last week the latest RAJAR audience figures showed that the percentage of online radio listening has increased from 18% last year to 24% this year. That is a huge increase which, if continued, will mean that the majority of listening will be online within five years. Smart speakers make up half of that and voice-controlled, in-car IP platforms are also a growing online market. It is important that Britain’s radio stations have protected positions on these devices. Without them, there is the temptation for platforms to drive listeners away from UK radio and towards their own playlists.

There also needs to be a regime which supports the ad revenue of UK commercial stations, ensuring that they can place their own adverts around their content rather than being forced to accept the platforms’ ad offer, with the consequent loss of revenue. I hope the media Bill will have specific clauses to support radio and protect listeners in these fields. I understand that the stations are having fruitful conversations with DCMS. I would be grateful if the Minister would give your Lordships’ House his thinking on this.

I welcome the Minister back to the Front Bench and I hope that his second term of office will be filled with the long-awaited DCMS Bills on media and digital spaces. They need to come before Parliament as soon as possible to protect our media industry from the onslaught of the streaming giants. The content production sector is booming, but its mainstay and driver is the power of our public service broadcasters in this new age. I ask the Government to embrace them in the tender arms of legislative support.

Freedom of Expression (Communications and Digital Committee Report)

Viscount Colville of Culross Excerpts
Thursday 27th October 2022

Lords Chamber
Viscount Colville of Culross Portrait Viscount Colville of Culross (CB)
- View Speech - Hansard - -

My Lords, I declare an interest as a freelance TV producer. I had the honour of serving on the Communications Committee when this report was published. I too thank the noble Lord, Lord Gilbert, for his very able chairing of this inquiry.

The noble Lord, Lord Gilbert, suggested that the Government should amend the Online Safety Bill clauses on content that is legal but harmful to adults. I agree with the fears that these clauses will have an extremely deleterious effect on free speech. It is not just that the definition of this material is so vague, but that the Bill gives such dangerous powers to the Secretary of State to specify what is harmful by regulations. I support the recommendations in this report, which were then taken further by the Joint Committee on the Bill, to set up a parliamentary committee with the power to interrogate these changes further. I understand that the last Government were minded to drop these clauses. I would be grateful if the Minister would share with your Lordships’ House the new Government’s thinking on this issue.

I want to concentrate my speech on the later recommendations in the report. Recommendations 33 and 34 call for the Digital Markets Unit to be given statutory powers. It has been established for over a year and a half but has still not been given these powers. This could not be a more urgent issue. The big tech companies are still shockingly dominant. Your Lordships have heard this week of the falls in their share prices, but they still have enormous power in the markets.

In the tech ad market, this power is supreme. The CMA’s report into online platforms and digital advertising found that Google and Facebook, as it was then called, account for 80% of digital advertising spend. It declared that the market is “no longer … contestable”. Such dominance is an obvious threat to innovative start-ups. Even if they manage to get a share of the advertising revenue, they face the ever-present threat of being bought up by the big players before they have grown to scale, further entrenching those players’ dominance.

The problem is that the CMA’s monopoly rules concentrate on consumer price benefit. Obviously, when so many of the services offered by the platforms are free, that does not apply. Instead, different metrics must be introduced which take into account how the platforms use data, consumers’ privacy and freedom of expression.

The Government’s response to the committee’s recommendation is to acknowledge that competition is central to unlocking the full potential of the digital economy. They promise to deliver reforms that will bring more vibrant markets, greater innovation and increased productivity. Who in this House does not agree with that?

I echo the noble Baroness, Lady Stowell, who asked why the Government have been so slow to enact these pledges. The Queen’s Speech dangled before your Lordships the hope of a draft digital markets and competition Bill, which promised to give the DMU statutory powers so that it can tackle tech companies’ abuse of their dominant positions. As the Government delay on this matter, regular businesses and consumers are losing out. The CMA suggests that they are losing £2.4 billion annually from the overpricing of the big platforms on ad sales alone.

Instead, the Government have used valuable legislative time to bring forward a media Bill which, although containing useful elements, promises to privatise Channel 4, a move driven by blind ideology rather than any business case. Can the Minister give the House an indication of when the digital markets Bill will come before it? I hope he will give us an assurance that goes beyond “when parliamentary time allows”.

I should also like to draw your Lordships’ attention to recommendation 42 of the report, which calls for a mandatory bargaining code to be set up to ensure fair negotiations between platforms and news publishers. Since 2010, over 265 regional newspapers in the UK have closed. Those that remain have seen their circulations collapse and this lost revenue is not being replaced by digital subscriptions. The industry faces an existential threat.

The big hope is that the industry can be resurrected digitally: 38% of visits to news publishers’ websites come from links on Google or Facebook. However, at the moment the platforms get the content free or at very little cost, even though news content is one of the biggest drivers of traffic. The tech companies have made contracts with some newspaper publishers to pay for their content, but many say that the power imbalance is so great in the platforms’ favour that they are not being paid the true value of that content.

A bargaining code has already been introduced in Australia. It is not perfect, because it is not sufficiently inclusive of regional players, and some people are worried about a mandatory contract for news content being imposed on the platforms. However, Rod Sims, the former head of Australia’s competition commission, told me that this has not happened and that he has not been forced to use his powers. The threat of an imposed contract has changed the dynamic in the market enough to bring the platforms to agree an equitable price with news publishers for the use of their content.

The report needs to see more of its recommendations taken up by the Government. There is still important work to be done if this country is to become a digital world leader. I urge the Minister to do all he can to ensure that there is legislation which allows for freedom of expression and for a competitive digital market with a plurality of platforms in which those voices can be heard.