Lord Knight of Weymouth debates involving the Department for Science, Innovation & Technology during the 2019 Parliament

Digital Markets, Competition and Consumers Bill

Lord Knight of Weymouth Excerpts
The Parliamentary Under-Secretary of State, Department for Science, Innovation and Technology (Viscount Camrose) (Con)

My Lords, I am pleased to speak on this first day of Committee and thank all noble Lords for their continued and valued engagement on the DMCC Bill, which, as many noble Lords have observed, will drive innovation, grow the economy and deliver better outcomes for consumers. I am grateful for noble Lords’ continued scrutiny and am confident that we will enjoy a productive debate.

I start by briefly speaking to government Amendments 11 and 12, which I hope noble Lords will support. They make the strategic market status notice provisions consistent by obliging the Competition and Markets Authority to provide reasons for its decision not to designate a firm following an initial SMS investigation.

I turn to Amendment 1, tabled by the noble Baroness, Lady Jones of Whitchurch. The amendment seeks to ensure that the CMA will be able to use, in its SMS investigations, previous analysis undertaken in related contexts. I agree entirely that the CMA should not have to repeat work that it has already done and should be able to draw on insights from previous analysis when carrying out an SMS investigation, when it is appropriate and lawful to do so.

I offer some reassurance to the noble Baroness that the Bill as drafted permits the CMA to rely on evidence that it has gathered in the past, so long as it is appropriate and lawful to do so. As she highlighted, a strength of the regime is the flexibility for the CMA to consider different harms in digital markets. I suspect that this is a theme that we will return to often in our deliberations, but being prescriptive about what information the CMA can rely on risks constraining the broad discretion that we have built into the legislation.

Amendments 3, 4, 5 and 6, tabled by the noble Lord, Lord Clement-Jones, would make it explicit that the CMA must consider currently available evidence of expected or foreseeable developments when assessing whether a firm holds substantial and entrenched market power in a digital activity. Amendment 3 would remove the duty for the CMA to consider such developments over a five-year period. The regime will apply regulation to firms for a five-year period; it is therefore appropriate that the CMA takes a forward look over that period to assess whether a firm’s market power is substantial and entrenched, taking account of expected or foreseeable developments that might naturally reduce the firm’s market power, if it were not designated.

Without an appropriate forward look, there is a risk that designation results in firms facing disproportionate or unnecessary regulation that harms innovation and consumers. However, the CMA will not be required to prove that a firm will definitely have substantial and entrenched market power for the next five years—indeed, that would be impossible. The CMA will have to give reasons for its decisions to designate firms and support any determination with evidence. As a public body, it will also be subject to public law principles, which require it to act reasonably and take into account relevant considerations. Therefore, in our view, these amendments are not necessary.

Amendment 7, tabled by the noble Viscount, Lord Colville of Culross, seeks to remove the power for the Secretary of State to amend by regulations subject to the affirmative procedure the conditions to be met for the CMA to establish a position of strategic significance. I recognise, first, that Henry VIII powers should be used in legislation only when necessary. To the point raised by my noble friend Lady Harding, I also recognise the importance of limiting the scope for too much disputation around this and for too many appeals. In this case, however, the power helps to ensure that the regime can adapt to digital markets that evolve quickly and unpredictably.

Changes in digital markets can result from developments in technology, business models, or a combination of both. The rapid pace of evolution in digital markets, to which many have referred, means that the CMA’s current understanding of power in these markets has changed over the past decade. The concept of strategic significance may therefore also need to evolve in future, and the conditions may need to be updated quickly, so that the regime remains effective in addressing harms to competition and consumers. The affirmative resolution procedure will give Parliament the opportunity to scrutinise potential changes. It will provide a parliamentary safeguard to ensure that the criteria are not watered down, and should address the noble Lord’s concerns regarding lobbying. For these reasons, I believe that it is important to retain this power.

Lord Knight of Weymouth (Lab)

Looking at Clause 6, the four conditions laid down there appear pretty generic: size; the number of undertakings; a position in respect of digital activity that would allow an extension of market power; and the ability to influence the ways in which other undertakings conduct themselves. Since they are such generic conditions, can the Minister give us a bit more of a taste of the kind of thing that might crop up? I know that he does not have a crystal ball, but could he tell us what might arise that would require these Henry VIII powers to be used?

--- Later in debate ---
Lord Clement-Jones (LD)

My Lords, I was looking forward to hearing the noble Lord, Lord Knight, introduce these amendments but, owing to a glitch in timing when tabling the amendments, I am unfortunately in the hot seat this afternoon. As well as moving Amendment 2, I will speak to Amendments 18, 23, 56 and 61.

These amendments, developed by the Institute for the Future of Work, are aimed in particular at highlighting the direct and indirect impacts on job creation, displacement and conditions and on the work environment in the UK, which are important considerations that are relevant to competition and should be kept closely under review. I look forward to hearing what the noble Lord, Lord Knight, says, as co-chair of the All-Party Parliamentary Group on the Future of Work, which helped the Institute for the Future of Work to develop the amendments.

Digital markets and competition are shaping models for work, the distribution of work, access to work and the conditions and quality of work for several different reasons. Digital connected worker and labour platforms are used across the economy, not just for online or gig work. There is concentration in digital markets, with the emergence of a few dominant actors such as Amazon and Uber, which impacts the number and nature of local jobs created or lost. There are specific anti-competitive practices, such as wage and price fixing, which is currently subject to litigation in the US, and there are secondary and spillover impacts from all the above, including the driving of new models of business that may constrain wages, terms and work quality, directly or indirectly.

A good example is cloud-based connected worker platforms, which use behavioural and predictive algorithms to nudge and predict performance, match and allocate work and set standards. There is also increased market dominance in cloud computing, on which a growing number of UK businesses depend: Amazon Web Services, for example, leads the four companies that together control 67% of the world’s cloud infrastructure, and alone holds over 30% of the market.

Other examples are algorithmic hiring, job matching and task-allocation systems, which are trained on data that represents past practices and, as a result, can exclude or restrict groups from labour market opportunities. Social, environmental and well-being risks and impacts, including on work conditions and environments, are under increasing scrutiny from both the consumer and the corporate sustainability perspective—seen, for instance, in the World Economic Forum’s Global Risks Report 2024, and the EU’s new corporate sustainability due diligence directive, due to be formally approved this year, which obliges firms to integrate their human rights and environmental impact into their management systems.

This suggests that consumer interests can extend to local and supply-chain impacts, and informed decision-making will need better information on work impacts. For a start, key definitions such as “digital activity” in Clause 4 need to take into account impacts on UK work and workers in determining whether there is a sufficient link to the UK. Amendment 2 is designed to do this. Secondly, the CMA’s power to impose conduct requirements in Chapter 3 of the Bill should make sure that a designated undertaking can be asked to carry out and share an assessment on work impacts. Similarly, the power in Chapter 4, Clause 46, to make pro-competition interventions, which hinges on having an adverse effect, should be amended to include certain adverse impacts on work. Amendments 18, 23 and 56 are designed to do this.

Thirdly, information and understanding about work impacts should be improved and monitored on an ongoing basis. For example, the CMA should also be able to require an organisation to undertake an assessment to ascertain impacts on work and workers as part of a new power to seek information in Clause 69. This would help investigations to ascertain relevant impacts and to decide whether to exercise powers and functions under the Bill.

Evidence is emerging of vertical price fixing at a platform level, which might directly impact the pay of UK workers, including payment of the minimum wage and, therefore, compliance with labour law, as well as customer costs. Such anti-competitive practices via digital platforms are not limited to wages, or to gig, remote or office work. Ongoing research on the gigification of work includes connected worker platforms, which tend to be based on the cloud. This is indicative of tight and increasing control, and of the retention of scale advantages, as these platforms capture information from the workplace to set standards, penalise or incentivise certain types of behaviour, and even advise on business models, such as moving to more flexible and less secure contracts. At the more extreme end, wages are driven so low that workers have no choice but to accept game-like compensation packages that offer premiums for completing a high number of tasks in short or unsociable periods of time, for engaging in risky behaviours or for limiting their mobility.

The Institute for the Future of Work has developed a model which could serve as a basis for this assessment: the good work algorithmic impact assessment. It is supported by the UK Information Commissioner’s Office grants programme and is published on the DSIT website. The assessment covers the 10 dimensions of the Good Work Charter, which serves as a checklist of workplace impacts in the context of the digitisation of work: work that promotes dignity, autonomy and equality; work that has fair pay and conditions; and work where people are properly supported to develop their talents and have a sense of community. The proposed good work AIA is designed to help employers and engineers to involve workers and their representatives in the design, development and deployment of algorithmic systems, with a procedure for ongoing monitoring.

In summary, these amendments would give the CMA an overarching duty to monitor and consider all these impacts as part of monitoring adverse effects on competition and/or a relevant public interest. We should incorporate this important aspect of digital competition into the Bill. I beg to move.

Lord Knight of Weymouth (Lab)

My Lords, I congratulate the noble Lord, Lord Clement-Jones, on the way he occupied the hot seat and introduced his amendments. I had hoped to add my name to them but other things prevented me doing so. As he said, I co-chair the All-Party Group on the Future of Work with Matt Warman in the other place. I am grateful to the Institute for the Future of Work, and to Anna Thomas in particular for her help in putting these amendments together.

I start with a reflection on industrialisation, which in its own way created a massive explosion in economic activity and wealth, and the availability of goods and opportunities. There was innovation and it was good for consumers, but it also created considerable harms to the environment and to workers. The trade union movement grew up as a result of that.

In many ways, the technological revolution that we are going through, which this legislation seeks to address and, in part, regulate, is no different. As the Minister said a few moments ago, we see new opportunities with the digital tools and products that are being produced as part of this revolution, more jobs, more small and medium-sized enterprises able to grow, more innovation and more opportunities for consumers. These are all positive benefits that we should celebrate when we think about and support the Bill, as we do on all sides of the Committee.

However, the risks for workers, and the other social and environmental risks, are too often ignored. The risks to workers were totally ignored at the AI summit that the Government held last year. That is a mistake. During the Industrial Revolution, it took Parliament quite a while to get to the Factory Acts and to the legislation needed to provide protection for society and the environment. We might be making the same mistake again, at a time when people are being hired by algorithm and, as the noble Lord, Lord Clement-Jones, pointed out, managed by algorithm, particularly at the lower end of the labour market and in more insecure employment.

The Institute for the Future of Work’s report, The Amazonian Era, focused on the logistics sector. If you have ever wondered why your Amazon delivery arrives with a knock on the door, yet nobody is there when you open it to say hello and check that the parcel has been delivered, it is because the worker does not have time to stop and check that someone is alive on the other side of the door—they have to get on. They are being managed by machine to achieve a certain level of productivity. Those working in the big warehouses wear personalised devices that monitor even how long their loo breaks are. There is a huge amount of technological, algorithmic management of workers that is dehumanising, and that is something we should all be concerned about.

In turn, having been hired and managed by algorithms, people may well be being fired by algorithm as well. We have seen examples such as Amazon resisting trade union recognition in a dispute with the GMB, as the trade union movement also tries to catch up with this and do something about it. Recently, we saw strikes in the creative sector, with writers and artists concerned about the impact on their work of algorithms being used to create, and about how rapidly that is deskilling them. I have been contacted by people in the education world who are exam markers—again, they are being managed algorithmically on the throughput of the exams that they have to mark, despite marking being an intensive, knowledge-based, reflective activity of looking at people’s scripts.

In this legislation we have a “user”, “consumer”, “worker” problem, in that all of them might be the same person. We are concerned here about users and consumers, but we fail to recognise that the same person may also be a worker: someone now being sold, along with the technology, as part of an integrated service, and sitting at the wrong end of an information asymmetry. We have lots of consumer-centric data, and plenty of understanding of the impacts on consumers, but very little data on the impacts on that same person in their function as a worker.

In the United States, we have seen the Algorithmic Accountability Act. Last month, the Council of Europe published its recommendations on AI. Both are shifting the responsibility towards the companies, giving them a burden of proof to ensure that they are meeting reasonable standards around worker rights and conditions, environmental protection and so on. These amendments seek to do something similar. They seek to ensure that impacts on work, and on workers in particular, are taken into account in SMS designation, competition decisions, the imposition of conduct requirements and compliance reports. Had the Government delivered on their long-standing promise to bring forward an employment Bill, we could have dealt with some of these things in that way. But we do not have that opportunity and will not have it for some time.

As I have said, the collective bargaining option for workers is extremely limited; trade union membership has made only very limited inroads into the digital economy. It is incumbent on your Lordships’ House to use the opportunities presented by digital legislation to put in place a floor of minimum standards for the way in which vulnerable workers across the economy, not just in specific digital companies, are subjected to algorithmic decision-making that works to their disadvantage. We need to do something about it.

--- Later in debate ---
Lord Knight of Weymouth (Lab)

Which regulators is the Minister thinking of? I am interested in Clauses 107 and 108, which are about regulatory co-ordination and information sharing, and whether there is something we should do there with those regulators. If he could give us a hint as to which regulators he is thinking of, that would be really helpful.

Viscount Camrose (Con)

I refer to the digital regulators themselves—the ICO, the FCA and Ofcom—or indeed regulators with oversight of employment law.

Amendment 61 would enable the CMA to require algorithmic impact assessments, to assess the impact of algorithms on society and the environment, including working conditions, if it considered such information relevant to its digital markets functions. I agree wholeheartedly with the noble Lord about the importance of understanding the impact of algorithmic systems on society, the environment and working conditions in the UK.

Viscount Camrose (Con)

Yes, I think that I am saying that. The CMA, over the course of its investigations, can come across information beyond its own competition remit but relevant to other regulators, and it could and should then choose to advise those other regulators of a possible path for action.

Lord Knight of Weymouth (Lab)

In that sense, could the CMA ask for an impact assessment on the algorithmic harm that might be carried out? Would that be in the power and remit of the CMA?

Viscount Camrose (Con)

The CMA does have the power and remit to request an algorithmic impact assessment. I will take advice on this, because I believe that any algorithmic assessment it undertakes must be directed at understanding anti-competitive behaviours rather than serving a broader purpose. I will happily take advice on that.

As the Bill stands, the CMA will already have sufficient investigatory powers to understand the impact of complex algorithms on competition and consumers. The suggested expansion of this power would fall outside the role and remit of the CMA. Moreover, the CMA would not have appropriate tools to address such issues, if it did identify them. The Government will continue to actively look at whether new regulatory approaches are needed in response to developments in AI, and will provide an update on their approach through the forthcoming AI regulation White Paper response.

I thank the noble Lord once again for raising these important issues and hope that he feels able to withdraw the amendment.

Data Protection and Digital Information Bill

Lord Knight of Weymouth Excerpts
Lord Knight of Weymouth (Lab)

My Lords, I start with apologies from my noble friend Lady Jones of Whitchurch, who cannot be with us due to illness. We wish her a speedy recovery in time for Christmas. I have therefore been drafted in temporarily to open for the Opposition, shunting my noble friend Lord Bassam to close for us at the end of the debate. As a result, what your Lordships will now get with this speech is based partly on his early drafts and partly on my own thoughts on this debate—two for the price of one. I reassure your Lordships that, while I am flattered to be in the super-sub role, I look forward to returning to the Back Benches for the remaining stages in the new year.

I remind the House of my technology interests, particularly in chairing the boards of CENTURY Tech and EDUCATE Ventures Research—both companies working with AI in education. I very much welcome the noble Lord, Lord de Clifford, to his place and look forward to his maiden speech.

Just over six years ago, I spoke at the Second Reading of the Data Protection Bill. I said then that:

“We need to power the economy and innovation with data while protecting the rights of the individual and of wider society from exploitation by those who hold our data”.


For me, that remains the vision. We are grateful to the Minister for setting out his vision in his speech, but it feels to me that one of the Bill’s failings is the weakening of that protection from exploitation which would follow if it passes in its current form. In that 2017 Second Reading speech, I also said that:

“No consent regime can anticipate future use or the generation of intelligent products by aggregating my data with that of others. The new reality is that consent in its current form is dead”.—[Official Report, 10/10/17; cols. 183-5.]


Now that we have moved squarely into the age of AI, I welcome the opportunity to update GDPR to properly regulate data capture, storage and sharing in the public interest.

In the Online Safety Act, we strengthened Ofcom to regulate technology providers and their algorithmic impacts. In the Digital Markets, Competition and Consumers Bill, we are strengthening the Competition and Markets Authority to better regulate these powerful acquisitive commercial interests. This Bill is the opportunity to strengthen the Information Commissioner to better regulate the use of data in AI and some of the other potential impacts discussed at the recent AI summit.

This is where the Bill is most disappointing. As the Ada Lovelace Institute tells us in its excellent briefing, the Bill does not provide any new oversight of cutting-edge AI developments, such as biometric technologies or foundation models, despite well-documented gaps in existing legal frameworks. Will the Minister be coming forward with anything in Committee to address these gaps?

While we welcome the change from an Information Commissioner to a broader information commission, the Bill further weakens the already limited legal safeguards that currently exist to protect individuals from AI systems that make automated decisions about them in ways that could lead to discrimination or disadvantage—another lost opportunity.

I co-chair the All-Party Parliamentary Group on the Future of Work, and will be seeking to amend the Bill in respect of automated decision-making in the workplace. The rollout of ChatGPT-4 now makes it much easier for employers to quickly and easily develop algorithmic tools to manage staff, from hiring through to firing. We may also want to provide safeguards over public sector use of automated decision-making tools. The latter is of particular concern when reading the legal opinion of Stephen Cragg KC on the Bill. He says that:

“A list of ‘legitimate interests’ (mostly concerning law and order, safeguarding and national security) has been elevated to a position where the fundamental rights of data subjects (including children) can effectively be ignored where the processing of personal data is concerned … The Secretary of State can add to this list without the need for primary legislation, bypassing important Parliamentary controls”.


Furthermore, on lost opportunities, the Bill does not empower regulators with the tools or capabilities that they need to implement the Government’s plans for AI regulation or the commitments made at the AI Safety Summit. In this, I personally support the introduction of a duty on all public regulators to have regard to the principles on AI that were published in the Government’s White Paper. Would the Minister be willing to work with me on that?

There are other lost opportunities. I have argued elsewhere that data trusts are an opportunity to build public trust in people’s data being used both to develop better technology and to generate revenue back to the taxpayer. I remain interested in whether personal data could be defined as an asset that can be bequeathed in one’s estate, to avoid the situation we discussed in our debates on what is now the Online Safety Act, where bereaved families have had a terrible experience trying to access the content their children saw online that contributed to their deaths—and not just from suicide.

This takes me neatly on to broken promises and lessons not learned. I am confident that, whether the Government like it or not, the House will use this Bill to keep the promises made to families by the Secretary of State in respect of coroners being able to access data from technology providers in the full set of scenarios that we discussed, not just self-harm and suicide. It is also vital that the Bill does nothing to contradict or otherwise undermine the steps that this country has taken to keep children safe in the digital world. I am sure we will hear from the noble Baroness, Lady Kidron, on this subject, but let me say at this stage that we support her and, on these Benches, we are fully committed to the age-appropriate design code. The Minister must surely know that in this House, you take on the noble Baroness on these issues at your peril.

I am also confident that we will use this Bill to deliver an effective regime on data access for researchers. During the final parliamentary stages of the Online Safety Bill, the responsible Ministers, Paul Scully MP and the noble Lord, Lord Parkinson, recognised the importance of going further on data access and committed in both Houses to exploring this issue and reporting back on the scope to implement it through other legislation, such as this Bill. We must do that.

The Bill has lost opportunities and broken promises, but in other areas it is also failing. The Bill is too long—probably like my speech. I know that one should not rush to judgment, but the more I read the Bill and the various interpretations of its impact, the more I worry about it. That has not been helped by the tabling of some 260 government amendments, amounting to around 150 pages of text, on Report in another place—that is, after the Bill had already undergone its line-by-line scrutiny by MPs. Businesses need to be able to understand this new regime. If they have any data relationship with the EU, they also potentially need to understand how this regime interacts with the EU’s GDPR. On that, will the Minister agree to share quickly with your Lordships’ House his assessment of whether the Bill meets the adequacy requirements of the EU? We hear noises to the contrary from the Commission, and it is vital that we have the chance to assess this major risk.

After the last-minute changes in another place, the Bill increasingly seems designed to meet the Government’s own interests: first, through changes to rules on direct marketing during elections, but also by giving Ministers extensive access to the bank account data of benefit claimants and pensioners without spelling out the precise limitations or protections that go alongside those powers. I note the comments of the Information Commissioner himself in his updated briefing on the Bill:

“While I agree that the measure is a legitimate aim for government, given the level of fraud and overpayment cited, I have not yet seen sufficient evidence that the measure is proportionate ... I am therefore unable, at this point, to provide my assurance to Parliament that this is a proportionate approach”.


In starting the scrutiny of these provisions, it would be useful if the Minister could confirm in which other countries such provisions already exist. What consultation have they been subject to? Does HMRC already have these powers? If not, why go after benefit fraud but not tax fraud?

Given the lack of detailed scrutiny these provisions can ever receive in the other place, I of course assume that the Government will respect the will of this House when we have debated these measures.

As we did during last week’s debate on the Digital Markets, Competition and Consumers Bill, I will now briefly outline a number of other areas where we will be seeking changes or greater clarity from the Government. We need to see a clear definition of high-risk processing in the Bill. While the Government might not like subject access requests after recent experience of them, they have not made a convincing case for significantly weakening data-subject rights. Although we support the idea of smart data initiatives such as extending the successful open banking framework to other industries, we need more information on how Ministers envisage this happening in practice. We need to ensure the Government’s proposals with regards to nuisance calls are workable and that telecommunications companies are clear about their responsibilities. With parts of GDPR, particularly those on the use of cookies, having caused so much public frustration, the Bill needs to ensure appropriate consultation on and scrutiny of future changes in this area. We must take the public with us.

So a new data protection Bill is needed, but perhaps not this one. We need greater flexibility to move with a rapidly changing technological landscape while ensuring the retention of appropriate safeguards and protections for individuals and their data. Data is key to future economic growth, and that is why it will be a core component of our industrial strategy. However, data is not just for growth. There will be a clear benefit in making data work for the wider social good and the empowerment of working people. There is also, as we have so often discussed during Oral Questions, huge potential for data to revitalise the public services, which are, after 13 years of this Government, on their knees.

This Bill seems to me to have been drafted before the thinking that went into the AI summit. It is already out of date, given its very slow progress through Parliament. There is plenty in the Bill that we can work with. We are all agreed there are enormous opportunities for the economy, our public services and our people. We should do everything we can to take these opportunities forward. I know the Minister is genuinely interested in collaborating with colleagues to that end. We stand ready to help the Government make the improvements that are needed, but I hope the Minister will acknowledge that there is a long way to go if this legislation is to have public confidence and if our data protection regime is to work not just for the tech monopolies but for small businesses, consumers, workers and democracy too. We must end the confusion, empower the regulators and in turn empower Parliament to better scrutinise the tsunami of digital secondary legislation coming at us. There is much to do.

Artificial Intelligence: Regulation

Lord Knight of Weymouth Excerpts
Tuesday 14th November 2023

Lords Chamber
Viscount Camrose (Con)

I am pleased to tell my noble friend that, following a request from the Secretary of State, the safety policies of Amazon, Anthropic, Google DeepMind, Inflection, Meta, Microsoft, OpenAI and others have been published and will go into what we might call a race to the top—a competitive approach to boosting AI safety. As for enshrining those practices in regulation, that is something we continue to look at.

Lord Knight of Weymouth (Lab)

My Lords, further to the question from the noble Lord, Lord Holmes, around data—that the power of the AI is in many ways defined by the quality of the data—does the Minister have any concern that the Prime Minister’s friend, Elon Musk, for example, owns a huge amount of sentiment data through Twitter, a huge amount of transportation data through Tesla, and a huge amount of communication data through owning more than half the satellites orbiting the planet? Does he not see that there might be a need to regulate the ownership of data across different sectors?

Viscount Camrose (Con)

Indeed. Of course, one of the many issues with regulating AI is that it falls across so many different jurisdictions. It would be very difficult for any one country, including the US, to have a single piece of legislation that acted on the specific example that the noble Lord mentions. That is why it is so important for us to operate on an international basis, and why we continue not just with the AI safety summit at Bletchley Park but also to work closely with the G7 and G20, bodies of the UN, GPAI and others.

King’s Speech

Lord Knight of Weymouth Excerpts
Tuesday 14th November 2023

Lords Chamber
Lord Knight of Weymouth (Lab)

My Lords, it is always a pleasure to follow the noble Baroness. I particularly endorse her last comments around digital exclusion. I very much look forward to the three maiden speeches from the welcome new Members of your Lordships’ House.

Our cultural sector, our sports industries and our flair for design are a large part of what defines this country globally. They are also critical in defining our future success and that of our children. By the time a child currently in reception leaves school in 2037, we will need to have shifted to a new economic model and a new social contract, and embedded a new way of working that is no longer exploiting people or the planet. We need a truly sustainable and equitable future if we are to give that child the opportunity to thrive.

At the same time, as AI moves apace, it will, as we heard from the AI summit, largely disrupt the labour market; workers will be deskilled as machines combine highly agile robotics with the ability to recognise patterns, predict language and assimilate vast amounts of information, while learning to constantly improve their accuracy. New and exciting jobs will emerge, with inventiveness, curiosity and unpredictability at their heart, to work alongside these machines of prediction. These are big challenges.

The gracious Speech is the chance to hear the Government’s vision to address these huge challenges. By contrast, it reminded me of a decaf cappuccino. On the surface, there were some good sprinklings of legislative cocoa powder: I am a fan of the proposals on football governance and will want to do my bit on the digital markets and data protection Bills; I will be especially keen to explore how forms of data trust can help build public confidence in data sharing so that we can exploit the potential of AI for good to the full. The Speech also had plenty of froth: we will have to see whether warm words on support for the creative industries, making AI safe, or fixing apprenticeship take-up will amount to anything at all. But at its heart, the Speech lacked substance and offered not a glimmer of hope for future generations that things can get better. It is as though the Government have run out of ideas and are incapable of thinking long-term beyond the next general election.

Yesterday saw a commitment to recycling ministerial resources but lacked the punch we need to get Britain building again, to drive us forward into a sustainable green economy. The Speech lacked the shot of stimulant that the country needs to get us moving again and to fix the NHS and our wider public services that remain underinvested and broken following David Cameron’s years of austerity. Most importantly, it lacked a future vision to offer the opportunities our young people need to face the uncertainties of the future with confidence.

I should draw noble Lords’ attention to my entry in the register, in particular as a director of CENTURY Tech and EDUCATE Ventures, both businesses deploying AI in education. I know from that commercial work that hiring talent in the tech sector is really challenging. What does this Speech have to offer that challenge? I know from work I recently completed for Engineering UK with the noble Lord, Lord Willetts, who I am delighted to see in his place, that without reversing the decline in engineering apprenticeships we will fail to have the technical skills we need to transition to a growing green economy. What does this Speech have to offer that challenge?

For the technology and cultural sectors, access to skills is key and a constraint on current growth. What did the Speech have to say on reversing the catastrophic decline in the creative subjects and in design and technology in our schools? How can we build a pipeline of talent into design, engineering and technology if the only applied subject in the national curriculum is fading away? The number of students entering design and technology GCSE has more than halved since 2009. In that time, the numbers teaching the subject have also halved; teacher recruitment for D&T met just 23% of its overall target in 2021-22, and it is getting worse. I wish the new Schools Minister well and hope that he recognises that school accountability measures, the English Baccalaureate and Progress 8, have all contributed to this decline.

This Minister at the Dispatch Box will know that one of our most inspirational Britons is Sir Jony Ive, the designer of a suite of Apple products, including my iPad here, that have changed the world. His dad was a D&T teacher and inspector, and Jony left school to study industrial design in Newcastle. For this Minister the question is simply this: does he agree that the decline in every creative subject except art in schools at GCSE, and the decline in design and technology, are now at a critical point for the long-term future of the sectors we are focused on in this debate? Will he be meeting the new Minister, Damian Hinds, and pushing him to accelerate and extend the ambition of the advanced British standard? The Speech promised that it will bring technical and academic routes into a single qualification, but we know that that will take 10 years and will not extend below age 16. That is too little and way too late for this country’s technology, creative and sustainable futures.

We are all proud of our cultural and digital sectors in this country. As we can see, the rapid adoption of AI means that the competitive advantage of humans over machines can be sustained only if we are better humans: more creative, more expressive, more caring, more inventive. That is our future. We need a Government who understand the urgency of making the changes we need to deliver that hopeful opportunity.