Wednesday 7th May 2025


Commons Chamber
Consideration of Bill, as amended in the Public Bill Committee
[Relevant documents: Correspondence between the Joint Committee on Human Rights, the Secretary of State for Science, Innovation and Technology and the Minister of State for Data Protection and Telecoms, on the Data (Use and Access) Bill [Lords], reported to the House on 30 April, 13 April and 26 February.]
New Clause 16
Economic impact assessment
“(1) The Secretary of State must, before the end of the period of 12 months beginning with the day on which this Act is passed—
(a) prepare and publish an assessment of the economic impact in the United Kingdom of each of the four policy options described in section B.4 of the Copyright and AI Consultation Paper, read with relevant parts of section C of that Paper (policy options about copyright law and the training of artificial intelligence models using copyright works), and
(b) lay a document containing the assessment before Parliament.
(2) The document may include an assessment of the economic impact in the United Kingdom of policy options which are alternatives to the options described in subsection (1)(a).
(3) An assessment included in the document must, among other things, include assessment of the economic impact of each option on—
(a) copyright owners, and
(b) persons who develop or use AI systems,
including the impact on copyright owners, developers and users who are individuals, micro businesses, small businesses or medium-sized businesses.
(4) In this section—
‘AI system’ means a machine-based system that, from the input it receives, can infer how to—
(a) generate predictions, digital content, recommendations, decisions or other similar outputs, or
(b) influence a physical or virtual environment,
with a view to achieving an explicit or implicit objective;
‘the Copyright and AI Consultation Paper’ means the command paper ‘Copyright and AI: Consultation’, numbered CP1205, published on 17 December 2024;
‘copyright owner’ has the same meaning as in Part 1 of the Copyright, Designs and Patents Act 1988;
‘develop’ an AI system means carry on an activity involved in producing the system, such as (for example) designing, programming, training or testing the system (and related terms are to be interpreted accordingly);
‘digital content’ means data which is produced and supplied in digital form;
‘medium-sized business’ means a business with at least 50 but fewer than 250 staff;
‘micro business’ means a business with fewer than 10 staff;
‘small business’ means a business with at least 10 but fewer than 50 staff;
‘use’ an AI system means instruct an AI system to generate outputs or to influence an environment (and related terms are to be interpreted accordingly).”—(Chris Bryant.)
This new clause requires the Secretary of State to prepare, publish and lay before Parliament an assessment of the economic impact in the UK of the policy options described in section B.4 of the government’s recent consultation paper on Copyright and Artificial Intelligence.
Brought up, and read the First time.
14:22
The Minister for Data Protection and Telecoms (Chris Bryant)

I beg to move, That the clause be read a Second time.

Madam Deputy Speaker (Ms Nusrat Ghani)

With this it will be convenient to discuss the following:

Government new clause 17—Report on the use of copyright works in the development of AI systems.

New clause 1—Age of consent for social media data processing—

“(1) The UK GDPR is amended as follows.

(2) In Article 8 of the UK GDPR (Conditions applicable to child’s consent in relation to information society services), after paragraph 1 insert—

‘(1A) References to 13 years old in paragraph 1 shall be read as 16 years old in the case of social networking services processing personal data for the purpose of delivering personalised content, including targeted advertising and algorithmically curated recommendations.

(1B) For the purposes of paragraph 1A “social networking services” means any online service that—

(a) allows users to create profiles and interact publicly or privately with other users, and

(b) facilitates the sharing of user-generated content, including text, images, or videos, with a wider audience.

(1C) Paragraph 1B does not apply to—

(a) educational platforms and learning management systems provided in recognised educational settings, where personal data processing is solely for educational purposes.

(b) health and well-being services, including NHS digital services, mental health support applications, and crisis helplines, where personal data processing is necessary for the provision of care and support’”.

This new clause would raise the age for processing personal data in the case of social networking services from 13 to 16.

New clause 2—Compliance with UK copyright law by operators of web crawlers and general-purpose AI models—

“(1) The Secretary of State must by regulations make provision (including any such provision as might be made by Act of Parliament), requiring the operators of web crawlers and general-purpose artificial intelligence (AI) models whose services have links with the United Kingdom within the meaning of section 4(5) of the Online Safety Act 2023 to comply with United Kingdom copyright law, including the Copyright, Designs and Patents Act 1988, regardless of the jurisdiction in which the copyright-relevant acts relating to the pre-training, development and operation of those web crawlers and general-purpose AI models take place.

(2) Provision made under subsection (1) must apply to the entire lifecycle of a general-purpose AI model, including but not limited to—

(a) pre-training and training,

(b) fine tuning,

(c) grounding and retrieval-augmented generation, and

(d) the collection of data for the said purposes.

(3) The Secretary of State must lay before Parliament a draft of the statutory instrument containing regulations under subsection (1) within six months of the day on which this Act is passed and the regulations are subject to the affirmative procedure.”

This new clause requires web crawlers and general-purpose AI models with UK links to comply with UK copyright law across all stages of AI development.

New clause 3—Transparency of crawler identity, purpose and segmentation

“(1) The Secretary of State must by regulations make provision requiring operators of web crawlers and general-purpose artificial intelligence (AI) models whose services have links with the United Kingdom within the meaning of section 4(5) of the Online Safety Act 2023 to disclose information regarding the identity of crawlers used by them or by third parties on their behalf, including but not limited to—

(a) the name of the crawler,

(b) the legal entity responsible for the crawler,

(c) the specific purposes for which each crawler is used,

(d) the legal entities to which operators provide data scraped by the crawlers they operate, and

(e) a single point of contact to enable copyright owners to communicate with them and to lodge complaints about the use of their copyrighted works.

(2) The information disclosed under subsection (1) must be available on an easily accessible platform and updated at the same time as any change.

(3) The Secretary of State must by regulations make provision requiring operators of web crawlers and general-purpose AI models to deploy distinct crawlers for different purposes, including but not limited to—

(a) web indexing for search engine results pages,

(b) general-purpose AI model pre-training, and

(c) retrieval-augmented generation.

(4) The Secretary of State must by regulations make provision requiring operators of web crawlers and general-purpose AI models to ensure that the exclusion of a crawler by a copyright owner does not negatively impact the findability of the copyright owner’s content in a search engine.

(5) The Secretary of State must lay before Parliament a draft of the statutory instrument containing regulations under this section within six months of the day on which this Act is passed and the regulations are subject to the affirmative procedure.”

This new clause requires operators of web crawlers and AI models to disclose their identity, purpose, data-sharing practices, and use separate crawlers for different functions.

New clause 4—Transparency of copyrighted works scraped

“(1) The Secretary of State must by regulations make provision requiring operators of web crawlers and general-purpose artificial intelligence (AI) models whose services have links with the United Kingdom within the meaning of section 4(5) of the Online Safety Act 2023 to disclose information regarding text and data used in the pre-training, training and fine-tuning of general purpose AI models, including but not limited to—

(a) the URLs accessed by crawlers deployed by them or by third parties on their behalf or from whom they have obtained text or data,

(b) the text and data used for the pre-training, training and fine-tuning, including the type and provenance of the text and data and the means by which it was obtained, and

(c) information that can be used to identify individual works, and

(d) the timeframe of data collection.

(2) The disclosure of information under subsection (1) must be updated on a monthly basis in such form as the regulations may prescribe and be published in such manner as the regulations may prescribe so as to ensure that it is accessible to copyright owners upon request.

(3) The Secretary of State must lay before Parliament a draft of the statutory instrument containing regulations under subsection (1) within six months of the day on which this Act is passed and the regulations are subject to the affirmative procedure.”

This new clause mandates transparency about the sources and types of data used in AI training, requiring monthly updates accessible to copyright owners.

New clause 5—Enforcement

“(1) The Secretary of State must by regulations make provision requiring the Information Commission (under section 114 of the Data Protection Act 2018) (‘the Commissioner’) to monitor and secure compliance with the duties by an operator of a web crawler or general-purpose artificial intelligence (AI) model whose service has links with the United Kingdom within the meaning of section 4(5) of the Online Safety Act 2023 (‘a relevant operator’), including but not limited to the following—

(a) the regulations must provide for the Commissioner to have the power by written notice (an ‘information notice’) to require a relevant operator to provide the Commissioner with information that the Commissioner reasonably requires for the purposes of investigating a suspected failure to comply with the duties;

(b) the regulations must provide for the Commissioner to have the power by written notice (an ‘assessment notice’) to require and to permit the Commissioner to carry out an assessment of whether a relevant operator has complied or is complying with the duties and to require a relevant operator to do any of the acts set out in section 146(2) of the Data Protection Act 2018;

(c) the regulations must provide that where the Commissioner is satisfied that a relevant operator has failed, or is failing, to comply with the duties, the Commissioner may give the relevant operator a written notice (an ‘enforcement notice’) which requires it—

(i) to take steps specified in the notice, or

(ii) to refrain from taking steps specified in the notice;

(d) the regulations must provide that where the Commissioner is satisfied that a relevant operator has failed or is failing to comply with the duties or has failed to comply with an information notice, an assessment notice or an enforcement notice, the Commissioner may, by written notice (a ‘penalty notice’), require the person to pay to the Commissioner an amount in sterling specified in the notice, the maximum amount of the penalty that may be imposed by a penalty notice being the ‘higher maximum amount’ as defined in section 157 of the Data Protection Act 2018; and

(e) the regulations may provide for the procedure and rights of appeal in relation to the giving of an information notice, an assessment notice, an enforcement notice or a penalty notice.

(2) The regulations must provide that any failure to comply with the duties by a relevant operator shall be directly actionable by any copyright owner who is adversely affected by such failure, and that such copyright owner will be entitled to recover damages for any loss suffered and to injunctive relief.

(3) The regulations must provide that the powers of the Commissioner and the rights of a copyright owner will apply in relation to a relevant operator providing a service from outside the United Kingdom (as well as such a service provided from within the United Kingdom).

(4) The Secretary of State must lay before Parliament a draft of the statutory instrument containing the regulations under this section within six months of the day on which this Act is passed and the regulations are subject to the affirmative procedure.”

This new clause grants the Information Commissioner enforcement powers to ensure compliance with AI and web crawler transparency rules, including penalties for breaches.

New clause 6—Technical solutions—

“(1) The Secretary of State must conduct a review of the technical solutions that may be adopted by copyright owners and by the operators of web crawlers and general-purpose artificial intelligence (AI) models whose services have links with the United Kingdom within the meaning of section 4(5) of the Online Safety Act 2023 to prevent and to identify the unauthorised scraping or other unauthorised use of copyright owners’ text and data.

(2) Within 18 months of the day on which this Act is passed, the Secretary of State must report on such technical solutions and must issue guidance as to the technical solutions to be adopted and other recommendations for the protection of the interests of copyright owners.”

This new clause requires the Secretary of State to review and report on technical measures to prevent unauthorised data scraping by web crawlers and AI models.

New clause 7—Right to use non-digital verification services

“(1) This section applies when an organisation—

(a) requires an individual to use a verification service; and

(b) uses a digital verification service for that purpose.

(2) Where it is reasonably practicable for an organisation to offer a non-digital method of verification, the organisation must—

(a) make a non-digital alternative method of verification available to any individual required to use a verification service; and

(b) provide information about digital and non-digital methods of verification to those individuals before verification is required.”

This new clause would create a duty upon organisations to support digital inclusion by offering non-digital verification services where practicable.

New clause 8—Data Vision and Strategy

“Within six months of Royal Assent of this Act, the Secretary of State must publish a ‘Data Vision and Strategy’ which outlines—

(a) the Government’s data transformation priorities for the next five years; and

(b) steps the Government will take to ensure the digitisation of Government services.”

New clause 9—Departmental Board Appointments

“(1) Within six months of the day on which this Act is passed—

(a) Government departments;

(b) NHS England; and

(c) NHS trusts

shall appoint to their departmental board or equivalent body at least one of the following—

(i) Chief Information Officer;

(ii) Chief Technology Officer;

(iii) Chief Digital Information Officer;

(iv) Service Transformation Leader; or

(v) equivalent postholder.

(2) The person or persons appointed as under subsection (1) shall provide an annual report on the progress of the department or body towards the Government’s Data Vision and Strategy.”

This new clause would require digital leaders to be represented at executive level within Government departments and other bodies.

New clause 10—Data use in Public Service Delivery Review

“(1) The Secretary of State must, every 12 months, lay before Parliament a ‘Data use in Public Service Delivery Review’.

(2) The Data use in Public Service Delivery Review shall include, but is not limited to assessment of the steps being taken to—

(a) improve the Government’s use of data in public service delivery over the previous 12 months;

(b) expand the use of data to support increased and improved digital services in public service delivery;

(c) improve expertise and digital talent within Government departments to help expand the use of data for public service delivery; and

(d) facilitate and regulate for better use of data in the delivery of public services.”

This new clause would require an annual assessment by the Secretary of State to examine the steps being taken to facilitate and regulate the use of data in the delivery of public services using digital and online technologies.

New clause 11—Access to a deceased child’s social media data

“(1) Where a person under 18 years of age has deceased, a parent or legal guardian (the ‘requestor’) may request from any internet service provider (ISP) the child’s user data from up to 12 months prior to the date of death.

(2) The ISP must provide a copy of the requested data, or direct account access, upon verification of the requestor’s identity and relationship to the deceased person, and no court order shall be required for such disclosure.

(3) ‘User data’ includes all content, communications, or metadata generated by or associated with the deceased person’s online activity, including stored messages and posts, except where the deceased person had explicitly directed otherwise prior to death.

(4) The ISP may refuse or redact specific data only where—

(a) disclosure would unduly infringe the privacy rights of another individual,

(b) the deceased person had explicitly opted out before death,

(c) there is a conflicting court order, or

(d) a serious risk to public safety or national security would result.

(5) In providing data under this section, the ISP must comply with data protection legislation.

(6) This section constitutes a lawful basis for disclosure under Article 6 of the UK GDPR.

(7) The Secretary of State may, by regulations subject to the affirmative resolution procedure—

(a) provide guidance on verifying parent or guardian status,

(b) clarify any additional grounds for refusal, and

(c) prescribe safeguards to protect third-party confidentiality.

(8) For the purposes of this section—

‘internet service provider (ISP)’ includes any provider of social media, messaging, or other online platforms; and

‘data protection legislation’ has the meaning given in section 51 of this Act.”

This new clause would allow parents of a deceased minor to obtain that child’s social media data without a court order, subject to privacy safeguards for third parties.

New clause 12—Raising the minimum age at which users can consent to processing of personal data

“(1) The UK GDPR is amended in accordance with subsection (2) of this section.

(2) After paragraph 1 of Article 8 of the UK GDPR (Conditions applicable to child’s consent in relation to information society services) insert—

‘(1A) References to “13 years old” and “age of 13 years” in paragraph 1 shall be read as “16 years old” and “age of 16 years” in the case of processing of personal data.

(1B) Paragraph (1A) does not apply to—

(a) platform systems and services operated where the primary purpose of processing of personal data is for the advancement of a charitable purpose as defined in the Charities Act 2011;

(b) publicly owned platform systems and services operated for the primary purpose of law enforcement, child protection, education, or healthcare;

(c) cases in which the Secretary of State determines it is in the best interests of the child for an operator to accept the child’s own consent.’”

This new clause would raise the age for processing personal data from 13 to 16 years old with certain exceptions for charitable purposes and child safety.

New clause 13—Code of practice for the use of children’s educational data

“(1) Within 6 months of the passage of this Act, the Information Commissioner must prepare a code of practice which contains such guidance as the Information Commissioner considers appropriate on the processing of children’s data in connection with the provision of education.

(2) Guidance under subsection (1) must consider—

(a) all aspects of the provision of education including learning, school management, and safeguarding;

(b) all types of schools and learning settings in the development of guidance;

(c) the use of AI systems in the provision of education;

(d) the impact of profiling and automated decision-making on children’s access to education opportunities;

(e) children’s consent to the way their personal data is generated, collected, processed, stored and shared;

(f) parental consent to the way their children’s personal data is being generated, collected, processed, stored and shared;

(g) the security of children’s data;

(h) the exchange of information for safeguarding purposes.”

This new clause requires the Information Commissioner to produce a code of practice for accessing children’s educational data.

New clause 14—Transparency of business and customer data used in training Artificial Intelligence models

“(1) The Secretary of State must by regulations make provision requiring operators of general-purpose AI models to disclose upon request information about business data and customer data processed for the purposes of pre-training, training, fine-tuning, and retrieval-augmented generation in an AI model, or any other data input to an AI model.

(2) Business data and customer data must include, but is not limited to, the whole or any substantial part of a literary, dramatic, musical or artistic work, sound recording, film or broadcast included in any text, images and data used for the purposes set out in subsection (1).

(3) Information disclosable under subsection (1) must include but is not limited to:

(i) Digital Object Identifiers and file names;

(ii) Details of how the work was identified, including metadata;

(iii) The source from which it was scraped or otherwise obtained; and

(iv) The URLs accessed by crawlers deployed by operators, or by third parties, to obtain the data.

(4) The owner of rights in any individual work identifiable in information disclosed under subsection (1) must be provided upon request to the relevant operator with information as to whether and how they have complied with the laws of the United Kingdom in respect to that work.

(5) The Secretary of State must lay before Parliament a draft of the statutory instrument containing regulations under subsection (1) within six months of the day on which this Act is passed and the regulations are subject to the affirmative procedure.”

This new clause would require the Secretary of State to set out transparency provisions requiring generative AI developers to provide information to enable individuals and creative businesses to determine whether their data, works and other subject matter have been used in training datasets.

New clause 15—Complaints procedure for vulnerable individuals

“(1) The Data Protection Act 2018 is amended in accordance with subsections (2) to (4).

(2) After section 165(3) insert—

‘(3A) For complaints under subsection (2), the Information Commissioner must provide appropriate complaints-handling procedures for—

(a) victims of modern slavery,

(b) victims of domestic abuse,

(c) victims of gender-based violence, or

(d) data subjects otherwise in a position of vulnerability.

(3B) Procedures under subsection (3A) must include—

(a) appropriate support for vulnerable individuals;

(b) provision of specialised officers for sensitive cases;

(c) signposting to support services;

(d) provision of a helpline;

(e) de-escalation protocols.’

(3) After section 166(1)(c) insert—

‘(d) fails to investigate a complaint appropriately or take adequate action to remedy findings of inadequacy.’

(4) After section 166(2)(b), insert—

‘(c) to use formal powers as appropriate to investigate a complaint and to remedy any findings of inadequacy, unless the request from the data subject is manifestly unfounded or excessive.’”

This new clause would require the Information Commission to introduce a statutory complaints procedure for individuals in a position of vulnerability and new grounds of appeal to an Information Tribunal.

New clause 18—Report on the introduction of a public interest test for allowing access to NHS data by third-parties and companies

“(1) The Secretary of State must within six months of the passing of this Act—

(a) prepare and publish a report examining the need for a specific statutory public interest test to determine and safeguard access to NHS data by third-parties and companies.

(b) within 28 days of a report being laid under subsection (1) the Government must schedule a debate and votable motion on the findings of the report in each House.

(2) The report must consider—

(a) whether and in what situations it would be necessary, proportionate and lawful to share NHS data with third-parties and companies when the interests and risks to both the individual and/or public is considered.

(b) when it would be in the public interest and in the best interests of patients and the NHS to allow access by third-parties and companies to NHS data in relation to the provision of health care services and for promotion of health.”

This new clause would require the Secretary of State to produce a report on the introduction of a public interest test for allowing access to NHS data by third-parties and companies and then to schedule a debate on it in each House.

New clause 19—Secretary of State’s duty to review the age of consent for data processing under the UK GDPR

“(1) The Secretary of State must, within 12 months of Royal Assent of this Act, have conducted a review and published a report into the operation of Article 8 (Conditions applicable to child's consent in relation to information society services) of the UK GDPR in relation to the data processed by social media platforms of children under the age of 16.

(2) As part of this review, the Secretary of State must consider—

(a) the desirability of increasing the digital age of consent under the UK GDPR from 13 to 16, taking into account the available evidence in relation to the impact of social media platforms on the educational, social and emotional development of children; and

(b) the viability of increasing the digital age of consent under Article 8 of the UK GDPR in relation to specific social media platforms which are shown by the evidence to be unsuitable for use by children under the age of 16.

(3) Within six months of the publication of the report under subsection (1), the Secretary of State must lay a plan before Parliament for raising the digital age of consent to 16 through amendments to Article 8 GDPR, unless the review concludes that such changes are unnecessary.”

New clause 20—Duties of the Secretary of State in relation to the use by web-crawlers and artificial intelligence models of creative content

“The Secretary of State must—

(a) by 16 September 2025, issue a statement, by way of a copyright notice issued by the Intellectual Property Office or otherwise, in relation to the application of the Copyright, Designs and Patents Act 1988 to activities conducted by web-crawlers or artificial intelligence models which may infringe the copyright attaching to creative works;

(b) by 16 September 2025, lay before Parliament a report which includes a plan to help ensure proportionate and effective measures for transparency in the use of copyright materials in training, refining, tuning and generative activities in AI;

(c) by 16 September 2025, lay before Parliament a report which includes a plan to reduce barriers to market entry for start-ups and smaller AI enterprises on use of and access to data;

(d) by 1 July 2026, publish a technological standard for a machine-readable digital watermark for the purposes of identifying licensed content and relevant information associated with the licence.”

New clause 21—Directions to public authorities on recording of sex data

“(1) The Secretary of State must, within three months of the passage of this Act, issue regulations relating to the code of practice set out in section 49 of this Act which require public authorities to—

(a) collect, process and retain sex data only where it is lawful to do so in accordance with data protection legislation;

(b) request and record sex data accurately, in every circumstance where sex data is collected, in accordance with following category terms and definitions—

(i) ‘Sex’ meaning male or female only based on ‘sex at birth’, ‘natal sex’ or ‘biological sex’ (these terms carrying the same meaning and capable of being used interchangeably); and,

(ii) in addition, where it is lawful to do so in accordance with data protection legislation and the Gender Recognition Act 2004, ‘Acquired Gender’ meaning male or female only, as recorded on a gender recognition certificate issued in accordance with the Gender Recognition Act 2004;

(c) have updated relevant organisation guidance to stipulate that, where sex data is collected, this must be done in accordance with the definitions set out by subsection (1)(b) within three months of these regulations coming into force;

(d) have conducted a review of the accuracy of data held in relation to the sex of data subjects to ensure that the data is accurate in recording sex at birth and, where relevant and collected lawfully, acquired gender as recorded on a gender recognition certificate within 12 months of these regulations coming into force;

(e) have taken every reasonable step to ensure that any data held in relation to the sex and, where relevant and collected lawfully, acquired gender as recorded on a gender recognition certificate of a data subject that is found to be inaccurate is rectified or erased within 18 months of these regulations coming into force; and

(f) have produced and submitted to the Secretary of State a report setting out the findings of its review in relation to the matters set out by subsection (1)(d) and, where relevant, a description of the steps taken to ensure that the data held by the relevant public authority is accurate within the definitions set out in subsection (1)(b) within 18 months of these regulations coming into force.

(2) The Secretary of State may, on receipt of a report in accordance with subsection (1)(f), instruct a public authority to take any further remedial steps within a specified timeframe reasonably necessary to ensure the accuracy of the sex and acquired gender data held by the relevant public authority.

(3) The Secretary of State must, within one month of the passage of this Act, establish and maintain a register of public authorities approved to act as sources of data relating to the attribute of sex for persons providing digital verification services.

(4) The register in subsection (3) must be published on the website of the Office for Digital Identities & Attributes or any successor body.

(5) Until such time as a public authority is added to the register under subsection (3), persons providing digital verification services may only obtain data on the sex of an individual requesting the provision of digital verification services from the record of births held by the General Register Office in accordance with subsection (6).

(6) Information supplied by the General Register Office pursuant to subsection (5) must specify sex as recorded at birth, as well as any subsequent corrections to the register in the field marked ‘Sex’.

(7) The Secretary of State may, from time to time, add public authorities to the register as under subsection (3) only upon being satisfied, on the basis of a report issued under subsection (1)(f) or satisfaction of such further steps required by the Secretary of State under subsection (2), that the data held by the relevant public authority in relation to sex and, where relevant, acquired gender as recorded on a gender recognition certificate, as defined in subsection (1)(b), is accurate.”

This new clause requires the Secretary of State to issue regulations relating to the code of practice in section 49 requiring public authorities to record sex data in line with these regulations when data are collected. This clause is linked to amendments 39 and 40.

New clause 22—Recording of ethnicity data for the purposes of public service delivery

“(1) The Secretary of State must make regulations which make provision for the collection of individual ethnicity data in the process of public service delivery and associated data collection.

(2) The regulations set out by subsection (1) must make provision for ethnic classifications to include Jewish and Sikh categories.

(3) The Secretary of State must lay before both Houses of Parliament a draft of the statutory instrument containing regulations under this section within six months of the day on which this Act is passed which will be subject to the affirmative procedure.”

This new clause requires the Secretary of State to make statutory provision for individual ethnicity data to be collected in the process of public service delivery.

New clause 23—Recording of ethnicity data on the Register of Births and Deaths

“(1) The Secretary of State must make regulations which make provision for the collection of individual ethnicity data during birth and death registration.

(2) The regulations set out by subsection (1) must make provision for ethnic classifications to include Jewish and Sikh categories.

(3) The Secretary of State must lay before both Houses of Parliament a draft of the statutory instrument containing regulations under this section within six months of the day on which this Act is passed which will be subject to the affirmative procedure.”

This new clause requires the Secretary of State to make statutory provision for individual ethnicity data to be able to be collected during birth and death registration.

Government amendments 11 to 32.

Amendment 39, in clause 45, page 42, line 30, at the beginning insert—

“Save in respect of data relating to sex,”.

This amendment is consequential on NC21.

Amendment 40, page 43, line 15, at end insert—

“‘gender recognition certificate’ means a gender recognition certificate issued in accordance with the Gender Recognition Act 2004.”

This amendment is consequential on NC21.

Government amendments 1 to 8.

Amendment 37, in clause 67, page 75, line 24, at end insert—

“(2A) For the purposes of paragraph 2, ‘scientific research’ means creative and systematic work undertaken in order to increase the stock of knowledge, including knowledge of humankind, culture and society, and to devise new applications of available knowledge.

(2B) To meet the reasonableness test in paragraph 2, the activity being described as scientific research must be conducted according to appropriate ethical, legal and professional frameworks, obligations and standards.”

This amendment incorporates clarifications to help reduce potential misuse of the scientific research exception. The first is a definition of scientific research based on the Frascati Manual. The second is a requirement that research be conducted in line with frameworks and standards in the UKRI Code of Practice for Research.

Amendment 41, in clause 80, page 95, line 19, at end insert—

“3. For the purposes of paragraph 1(a), a human’s involvement is only meaningful if they are a natural person with the necessary competence, authority and capacity to understand, challenge and alter the decision.”

See explanatory statement for Amendment 44.

Amendment 45, page 96, line 2, at end insert—

“5. Consent in accordance with paragraph 2 cannot be given by persons under the age of 18 where—

(a) the automated decision-making is likely to produce legal or similarly significant effects on the child, or

(b) the processing involves the profiling of a child to determine access to essential services, education, or other significant opportunities.

6. The controller shall not be obliged to maintain, acquire or process additional information in order to identify the age of a data subject for the sole purpose of complying with this Regulation.

7. A significant decision may not be taken based solely on automated processing, if the data subject is a child or may be a child unless the provider is satisfied that the decision is in, and compatible with, the best interests of a child, taking into account their rights and development stage, authorised by law to which the controller is subject, and after suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests are made publicly available.

8. Profiling or solely automated processing of children’s data may not occur for the purposes of targeted advertising or behavioural analysis.”

This amendment ensures that automated decision-making cannot take place in circumstances where it would affect a child’s access to significant opportunities or would not be in their best interests, as well as protections against practices such as behavioural analysis.

Amendment 46, page 96, leave out lines 13 to 19 and insert—

“(a) communicate to the data subject before and after the decision is taken the fact that automated decision-making is involved in the decision, the extent of any human involvement, and the availability of safeguards under this Article;

(b) provide the data subject with information about decisions described in paragraph 1 taken in relation to the data subject including meaningful information about the logic involved, the significance and the envisaged consequences of such processing for the data subject, and a personalised explanation for the decision;

(c) enable the data subject to make representations about such decisions;

(d) enable the data subject to obtain human intervention on the part of the controller in relation to such decisions;

(e) enable the data subject to contest such decisions.

3. For the purposes of paragraph 2(b), a personalised explanation must—

(a) be clear, concise and in plain language of the data subject’s choice in a readily available format;

(b) be understandable, and assume limited technical knowledge of algorithmic systems;

(c) address the reasons for the decision and how the decision affects the individual personally, which must include—

(i) the inputs, including any personal data;

(ii) parameters that were likely to have influenced or were decisive to the decision, or a counterfactual of what change would have resulted in a more favourable outcome;

(iii) the sources of parameters and inputs;

(d) be available free of charge and conveniently accessible to the data subject, free of deceptive design patterns.

4. Where the safeguards apply after a decision is made, the controller must give effect to data subject requests as soon as reasonably practicable and within one month of the request.

5. The controller must ensure the safeguards are fully in place and complete a data protection impact assessment under Article 35 before a decision under Article 22A is taken, documenting their implementation of the safeguards in addition to the requirements of that Article.

6. The controller must publish details of their implementation of the safeguards and how data subjects can make use of them.”

This amendment would ensure that data subjects are informed of automated decisions made about them in a timely way, and that that explanation is personalised to enable them to understand why it was made. It also ensures processors are incentivised to put the safeguards in place before commencing automated decision-making.

Amendment 42, page 96, line 23, after “Article 22A(1)(a),” insert

“and subject to Article 22A(3)”.

See explanatory statement for Amendment 44.

Amendment 43, page 97, line 19, at end insert—

“(3) To qualify as meaningful human involvement, the review must be performed by a person with the necessary competence, training, authority to alter the decision and analytical understanding of the data.”

See explanatory statement for Amendment 44.

Amendment 44, page 98, line 31, after “and 50C(3)(c),” insert “and subject to 50A(3)”.

This amendment and Amendments 41, 42 and 43 would make clear that in the context of new Article 22A of the UK GDPR, for human involvement to be considered as meaningful, the review must be carried out by a competent person who is empowered to change the decision in practice.

Amendment 9, in clause 81, page 100, line 7, at end insert—

“Age assurance

1C. Information society services which are likely to be accessed by children must use highly effective age verification or age estimation measures for the purpose of delivering on children’s higher protection matters.”

This amendment requires services which are likely to be accessed by children to use highly effective age verification measures.

Amendment 38, in clause 86, page 103, line 22, at end insert—

“(2A) Where personal data is processed for the purposes of scientific research under section 87(4) of the 2018 Act (‘reuse’), the processor or controller must publish details of the data sources used.

(2B) These details must as a minimum include a description of the scientific research, the provenance and method of acquisition of the personal data being reused, the original lawful basis for processing, the number of data subjects affected, and whether the data subjects have been notified of the reuse.

(2C) The processor or controller must notify the Information Commission when processing data for the purposes of scientific research under section 87(4) of the 2018 Act with the same details.”

This amendment ensures transparency for the use of scientific research exemptions by requiring those reusing personal data to publish details of that reuse and notify the Information Commission of that reuse.

Government amendments 33 and 34.

Amendment 10, in schedule 7, page 201, line 5, at end insert—

“(1B) A third country cannot be considered adequate or capable of providing appropriate safeguards by any authority where there exists no credible means to enforce data subject rights or obtain legal remedy.

(1C) For the purposes of paragraph 1A, the Secretary of State must make a determination as to whether credible means are present in a third country.

(1D) In making a determination regarding credible means, the Secretary of State must have due regard to the view of the Information Commissioner.

(1E) Credible means do not exist where the Secretary of State considers that any of the following are true:

(a) judicial protection of persons whose personal data is transferred to that third country is insufficient;

(b) effective administrative and judicial redress are not present;

(c) effective judicial review mechanisms do not exist; and

(d) there is no statutory right to effective legal remedy for data subjects.”

The amendment would prohibit personal data transfer to countries where data subject rights cannot be adequately upheld and prohibit private entities from using contracts to give the impression that data security exists.

Government amendments 35 and 36.

Chris Bryant

Earlier I appeared as a Department for Culture, Media and Sport Minister, and now I appear as a Department for Science, Innovation and Technology Minister. I hate to embarrass Members, but they will get two bouts of me today. I will start with the Government amendments, and then once I have heard the arguments from Members advancing other amendments, I will speak to those later in the debate. If I do not cover subjects in this initial speech, I will get back to them later.

Chris Bryant

The right hon. Gentleman is enticing me. I hope he will be nicer to me than the Chair of the Culture, Media and Sport Committee, the hon. Member for Gosport (Dame Caroline Dinenage), was earlier.

Sir John Whittingdale

I am sure that the Chair of the Committee and I will always be nice to the Minister. I was only going to say that I have experienced the slight schizophrenia he has referred to in holding roles in the Department for Science, Innovation and Technology and in DCMS at the same time. Although he is appearing as a DSIT Minister this afternoon, can he assure the House that he will not forget his responsibilities as a DCMS Minister for the creative industries?

Chris Bryant

I model myself in all things on the right hon. Gentleman, apart from the fact that I left the Tory party many years ago, and it is about time that he came over to the Labour Benches.

Chris Bryant

No, the right hon. Member for Maldon (Sir John Whittingdale) could come over here; I am not going back over there.

The point I was going to make is that I am fully cognisant of my duties. I think the right hon. Gentleman was referring to the artificial intelligence copyright issues that we will be addressing fairly shortly. I like the fact that I am in both Departments, because it means I can bring the knowledge of both sectors to bear on each other. If we are lucky, and if we work hard at it, I hope that I will be able to persuade him that we can come to a win-win solution. As he knows, this is not easy. When I had my first meeting with him after I was appointed in the post, he said, “This is not an easy area to resolve.” I hope I am not breaking a confidence—but he is smiling.

I have a large number of topics to cover, and I am conscious that many Members will think this is the data Bill, when we will actually be dealing with an awful lot of subjects this afternoon that do not feel as if they have anything to do with the measures in the original version brought forward by the right hon. Gentleman and previously. I hope that Members will bear with me. I intend to address the Government’s amendments as follows: first, AI and copyright; secondly, deepfakes; thirdly, the national underground assets register; and then smart data and other minor and technical amendments.

I will start with AI and intellectual property. As Members know, it was never the Government’s intention to legislate on that issue at all in this Bill. It is a complex and important issue, which is why we have consulted on a package of measures. That consultation had more than 11,500 responses, which we are still considering. Several hon. Members have said to me, “Will you remove the opt-out clause in the Bill?” I need to make it absolutely clear that no such opt-out clause is in the Bill. We never laid one in the Bill, so there is not an opt-out clause to remove.

As Members will also know, the Lords inserted a set of amendments on AI and copyright, which we removed in Committee. They reappear on the amendment paper today as new clauses 2 to 6, tabled by the hon. Member for Harpenden and Berkhamsted (Victoria Collins). A similar measure has been tabled as new clause 14 by my hon. Friend the Member for Leeds Central and Headingley (Alex Sobel).

We oppose all these new clauses for several reasons. First, they pre-empt the results of the consultation. It must surely be better to legislate on this complex subject in the round rather than piecemeal. The amendments are also unworkable. New clause 5, for instance, would make the Information Commissioner the regulator of transparency requirements, but the Information Commissioner’s Office has neither the skills nor the resources to perform that function. Obviously, transparency requirements without an effective enforcement mechanism are worse than useless, which means the other clauses on transparency are also unworkable in this context. The new clauses also fail to address some of the most important questions in this area. They effectively legislate piecemeal rather than in the round. Whenever Parliament has done that in the past, it has rued the day, and I think the same is true today.

Pete Wishart (Perth and Kinross-shire) (SNP)

Does the Minister not understand the urgency? Generative AI is ingesting our whole creative catalogue as we speak. We need something in place now. We cannot wait a year for reports or three years for legislation; we need action now. Does he not understand that something needs to be brought forward here today? These amendments offer that.

Chris Bryant

I do not think the amendments do offer that, because I do not think they work. We need to legislate in the round, as I say, and not piecemeal. I point out to the hon. Member that there is something of a two-edged sword here. I have been repeatedly told—and I understand the point—that there is no legal uncertainty as to the copyright status of works that are being scraped. At the same time, people are saying they want legislative change. Those two things cannot be true at the same time. I am determined to get us to a better place on this, as I will perhaps explain in a couple of moments.

I think there is an intention to push new clause 2 to a vote later, which I urge hon. Members not to do, although I do not always get my way. New clause 2 basically says that people should comply with the law. I mean, it is a simple fact: people should comply with the law. We cannot legislate to tell people that they should comply with the law; the law is the law. If none of these amendments is passed today, the law will remain as it is today and copyright law in the UK will be robust and clear.

For the absolute avoidance of doubt, some people have talked to me about text and data mining exceptions, which, as Members will know, exist, for instance, in the European Union. There is a text and data mining exception already in UK law. It was introduced in 2014 via a statutory instrument, which added section 29A to the Copyright, Designs and Patents Act 1988. However, it is an exception for the sole purpose of non-commercial research. I think that that is absolutely clear in law, and I do not think it needs any clarifying.

Ms Stella Creasy (Walthamstow) (Lab/Co-op)

I understand the point that the Minister is making about existing copyright law, but, as he has said, the Government opened a consultation that has, for many of our constituents who work in the creative industries, prefigured a substantial change in copyright when it comes to AI. Does he see the merit that many of us see in making it clear that the principles behind copyright from which our creative constituents should be able to benefit, and which should protect their own works, are what is at stake here? Having said that the existing law stands, will he at least make a commitment that that is what the Government want as well? I think he can understand why people are concerned, and the source of the concerns that have merited these amendments.

14:31
Chris Bryant

I completely understand and, in large measure, share those concerns. We wanted to ensure, in this fast-changing world, that the creative industries in the United Kingdom could be remunerated for the work they had produced. We are not in the business of giving away other people’s work to third parties for nothing: that would be to sell our birthright for a mess of pottage, to use a term from an old translation of the Bible, and we are determined not to do it. As my hon. Friend—and several other Members—will have heard me say many times before, we would only proceed with the package of measures included in the consultation if we believed that we were advancing the cause of the creative industries in the UK, rather than putting them in danger or legal peril.

I think that some of the things I will say in a moment will be of assistance. We want to reach a point at which it is easier for the creative industries—whether they are large businesses with deep pockets and able to use lawyers, or very small individual photographers or painters—to assert and protect their rights, and to say, if they wish, “No, you cannot scrape my material for the purpose of large language model learning, unless you remunerate me.” That remuneration might happen via a collective licensing scheme, or it might happen individually. Either way, we want to get to more licensing rather than less. As, again, I have said several times at this Dispatch Box, we have looked at what has happened in the European Union and what is happening in the United States of America, and we believe that although the EU said that its package was designed to deliver more licensing, it has not led to more licensing or to more remuneration of the creative industries, and we want to avoid that pitfall.

As I have said, I take the concerns of the creative industries seriously, both as a DSIT Minister and as a DCMS Minister; of course I do. I agree—we, the Government, agree—that transparency is key. We want to see more licensing of content. We believe that the UK is a creative content superpower, and we want UK AI companies to flourish on the basis of high-quality data. I have spoken to a fair number of publishing companies, in particular UK companies such as Taylor & Francis, a largely academic publisher. As Members will know, the UK is the largest exporter of books in the world. Those companies are deliberately trying to get all their material licensed to AI companies, for two reasons: first, they want to be remunerated for the work that they have provided, and secondly, just as importantly, they want AI to come up with good answers. If you put dirty water into a pipe, dirty water will come out at the other end, and if you put good data into AI, good answers will come out of AI. That is an important part of why we want to ensure that we have strong AI based on high-quality data, and much of that is premium content from our creative industries.

We also agree that the Government must keep an open mind, and must take full account of the economic evidence. That is why we have tabled new clauses 16 and 17, which set out binding commitments to assess the impact of any and all proposals and to consider and report on the key areas raised in debate. That includes any and all of the options that were involved in the consultation that we published after the amendments were tabled in the House of Lords. As the Government take forward the commitments made by these amendments, they will consider all potential policy options. I must emphasise that the Government have not prejudged the outcome of the consultation, and take the need to consider and reflect on the best approach for all parties very seriously.

Members will, I am sure, have read new clause 17; it requires the Government to report on four matters. First, there is the issue of technical solutions that would enable copyright owners to control whether their copyright works could be used to develop AI.

Chris Bryant

Will the hon. Lady just let me finish this paragraph, because it might read better in Hansard? Actually, I have now added that bit, so it is ruined, and I might as well give way to her.

Victoria Collins

The question of technical solutions is very important, but my challenge is this. I have spoken to representatives of some of the big tech companies who are pushing for that, and who are saying that it is hard for them to do it at scale but creatives can do it. Why can the tech companies not be leading on an opt-in system for creatives? Let me hand that back to the Minister.

Madam Deputy Speaker (Ms Nusrat Ghani)

I should point out that the hon. Lady, as the spokesperson for the Liberal Democrat party, will be speaking very shortly.

Chris Bryant

I know, but she is wonderful, so we will let her—or you will let her, Madam Deputy Speaker.

This is a really important point. Surely it cannot be impossible for us to find a technical solution. People who can develop AI—and they are now developing AI on their laptops, especially following DeepSeek; they do not need massive computers—should be able to develop a very simple system, as I have said before, whereby all creatives who are copyright owners are able to assert their rights, very simply, across all platforms, without any great exertion. That is what I want to achieve.

The hon. Lady was quite right to raise that question, so what are we going to do next? We say in new clause 17 that we will report in 12 months’ time. If we were to report in 12 months’ time that we had done absolutely nothing, I think that everyone would rightly rant and rave at us. It is our intention that the Secretary of State for Science, Innovation and Technology and the Secretary of State for Culture, Media and Sport will together co-ordinate a special taskforce specifically to consider how we can facilitate, incentivise and enable the development of these technical solutions. I suspect that, if we can get there, opt-out will look remarkably like opt-in.

The second matter on which new clause 17 requires us to report is access to data for AI developers to train AI systems in the UK, the third is transparency, and the fourth relates to measures to facilitate the licensing of copyright works for AI training. The publication will be required within 12 months of Royal Assent, and will of course be laid before Parliament. New clause 16 supplements these reports with a full economic impact assessment that will go further than previous assessments, and will present an analysis of the economic impact of a range of policy options available in this context, supported by the additional evidence that the Government have received in response to their consultation. The reporting requirements are important: they mean that we will have to engage with each of these issues apace and in depth, and we will do that. We are determined to find and incentivise technical solutions that support our objectives, and I believe that if we do that we can be a world leader. As I said earlier, the two Secretaries of State will convene working groups to tackle each of these issues.

I have heard people say that we are legislating to water down copyright, but that is simply not true. If Members support the Government’s position today, the UK’s copyright law will remain precisely as robust tomorrow as it is today. For activities in the UK, people will, in law, only be able to use copyright material if they are permitted and licensed to do so or if a copyright exception allows it, such as the existing copyright exceptions for education, public libraries and non-commercial work.

Chris Bryant

I thought that that might engender something.

Alice Macdonald

It was a pleasure to serve on the Bill Committee. May I take up the point about timelines in the new clause? The Minister has said that the reports must be made before the end of a period of 12 months, but, as other Members have said, there is a great deal of concern about what may happen. Does he expect this to take a year, or might it be possible to work faster so that more reassurance can be given? I accept that there will need to be further consultation, and examination of the responses.

Chris Bryant

Obviously, a series of different things will happen. We will have to respond to the consultation at some point, and I guess that the Culture, Media and Sport Committee will want to respond as well. In the meantime, we will be running a working group. I am very happy to keep the House updated on how that work progresses, but I do not want to commit to producing something within 12 months without being absolutely certain that I can do so. If new clause 17 is carried today, it will be a requirement by law that we produce a response within 12 months.

I fully get the point about urgency. As the right hon. Member for Maldon knows well, this issue has been hanging around for a considerable period of time. We in the UK have perhaps been a little slow, but I want to make sure that we get it right, rather than legislate piecemeal.

Marsha De Cordova (Battersea) (Lab)

I apologise if I have missed this, but has the Minister outlined when the Government will respond to the consultation?

Chris Bryant

No, I have not—my hon. Friend has not missed anything. Obviously, we want to respond as soon as possible, but we have 11,500 consultation responses to consider.

Some issues have hardly been referred to in the public debate on this matter. One issue that Equity is understandably pursuing, and that we referred to in the consultation, is about personality rights, which exist in some states in the United States of America. That is quite complicated to legislate for, which is one of the reasons we have consulted on it.

We have also consulted on the question—again, nobody has referred to this in the public debate—of whether a work that is generated by AI has any copyright attached to it. If so, who owns that copyright? It is slightly moot in British law. One could argue that British copyright law has always presumed that copyright applies only where a work is the expression of an individual, so it does not apply to AI-generated material, but there are other elements. Section 9(3) of the Copyright, Designs and Patents Act 1988 says that machine-generated material can have copyright attached to it, which is one of the other issues that we want to address.

As I said earlier, one of the issues to which nobody has yet come up with an answer is how we will provide proper enforcement of whatever transparency requirements we propose. I am conscious that in discussions I have had with our European counterparts, including my Spanish counterpart and members of the European Commission, there has been some concern about precisely what they will do by virtue of transparency. This issue is made more complicated by the advent of DeepSeek—for a whole series of different reasons, which I am happy to explain at some other point—but we need to end up with a transparency system that is both effective and proportionate. Simply dumping a list of millions and millions of URLs that have been visited on the internet is neither effective nor proportionate, so we will have to come up with something.

Jonathan Davies (Mid Derbyshire) (Lab)

Does the Minister envisage that any model of enforcement around transparency will be compulsory and not a voluntary system?

Chris Bryant

By its nature, enforcement would have to be compulsory, but we are running ahead of ourselves, because nobody has actually come up with a system that has an enforcement mechanism. Who would do it? What body would do it? How would that body be resourced? That is one of the things that we need to look into, and it is one of the elements of the consultation.

I will move on to another subject: the issue of purported intimate images. Government amendment 34 deals with the creation of intimate images or deepfakes. Earlier in the Bill’s passage, my colleague Lord Ponsonby added a new offence of creating purported intimate images without consent or reasonable belief in consent, and I am sure all hon. Members agree that this is a really important addition. In Committee, we introduced the offence of requesting the creation of purported images without consent or reasonable belief in consent, as hon. Members who were on the Public Bill Committee with me will know. It seems axiomatic that the courts should have the power to deprive offenders of the image and anything containing it that relates or is connected to the offence. This is already the case for the creating offence, which was introduced in the House of Lords. Government amendment 34 amends the sentencing code to achieve that for the requesting offence. It ensures that the existing regime of court powers to deprive offenders of property also applies to images and devices containing the image that relate to the requesting offence.

We have tabled a series of amendments to clauses 56 to 59 to reflect our discussions with the devolved Governments on the national underground asset register. The amendments will require the Secretary of State to obtain the consent of Welsh Ministers and the Department for Infrastructure in Northern Ireland, rather than merely consult them, before making regulations in relation to the provisions. Co-operation with the devolved Governments has been consistent and constructive throughout the Bill’s passage. We have secured legislative consent from Scotland, and the Senedd in Wales voted in favour of granting the Bill legislative consent only yesterday. We regret that for procedural reasons, the process with Northern Ireland has not yet reached the stage of legislative consent. We are, however, working constructively with the Department of Finance to ensure that we can make progress as quickly as possible. We continue to work closely with the Northern Ireland Executive to secure legislative consent, and to ensure that citizens and businesses of Northern Ireland feel the full benefits of the Bill.

Before I finish, I turn to our amendments to help ensure that smart data schemes can function optimally, and that part 1 of the Bill is as clear as possible. Amendments to fee charging under clauses 11 and 15 follow extensive stakeholder engagement, and will maximise the commercial viability of smart data systems by enabling regulations to make tailored provision on fee charging within each smart data scheme. For example, amendments 19 to 21 enable the fees charged to exceed expenses where appropriate. This is necessary to fulfil the commitment in the national payments vision to establish a long-term regulatory framework for open banking. Outside smart data, Government amendment 35

“adds references to investigating crime to existing references in the Data Protection Act 2018 to detecting or preventing crime”,

which will bring these references into line with other parts of the legislation.

Madam Deputy Speaker (Ms Nusrat Ghani)

I call the shadow Minister.

Dr Ben Spencer (Runnymede and Weybridge) (Con)

It is a privilege to respond to this debate on behalf of His Majesty’s official Opposition, and to speak to the new clauses and amendments. This is an ambitious piece of legislation, which will enable us to harness data—the currency of our digital age—and use it in a way that drives the economy and enhances the delivery of public services. Since its inception under the Conservatives in the last Parliament, the Bill has also become the platform for tackling some of the most pressing social and technological issues of our time. Many of these are reflected in the amendments to the Bill, which are the subject of debate today.

I start with new clause 20. How do we regulate the interaction of AI models with creative works? I pay tribute to the work of many Members on both sides of this House, and Members of the other place, who have passionately raised creatives’ concerns and the risks posed to their livelihoods by AI models. Conservative Members are clear that this is not a zero-sum game. Our fantastic creative and tech industries have the potential to turbocharge economic growth, and the last Government rightly supported them. The creative and technology sectors need and deserve certainty, which provides the foundation for investment and growth. New clause 20 would achieve certainty by requiring the Government to publish a series of plans on the transparency of AI models’ use of copyrighted works, the removal of market barriers for smaller AI market entrants, digital watermarking and, most importantly, a clear restatement of the application of copyright law to AI-modelling activities.

I cannot help but have a sense of déjà vu in relation to Government new clause 17: we are glad that the Government have taken several of the actions we called for in Committee, but once again they have chosen PR over effective policy. Amid all the spin, the Government have in effect announced a plan to respond to their own consultation—how innovative!

What is starkly missing from the Government new clauses is a commitment to make it clear that copyright law applies to the use of creative content by AI models, which is the primary concern raised with me by industry representatives. The Government have created uncertainty about the application of copyright law to AI modelling through their ham-fisted consultation. So I offer the Minister another opportunity: will he formally confirm that copyright law applies to the use of creative works by AI, and will he provide legal certainty and send a strong signal to our creative industries that they will not be asked to pay the price for AI growth?

Dr Spencer

I thank the Minister for making that statement at the Dispatch Box. As he knows, we need to have that formally, in writing, as a statement from the Government to make it absolutely clear, given that the consultation has muddied the waters.

Chris Bryant

I am sorry, but I said that in my speech, and I have said it several times in several debates previously.

Dr Spencer

I would therefore be grateful if the Minister said why there remains uncertainty among creatives about the application of copyright in this area. Is that not why we need to move this forward?

I now turn to Government amendment 34 and others. I congratulate my noble Friend Baroness Owen on the tremendous work she has done in ensuring that clauses criminalising the creation of and request for sexually explicit deepfake images have made it into the Bill. I also thank the Government for the constructive approach they are now taking in this area.

Chris Bryant

I should have said earlier that, as the shadow Minister knows, in Committee we changed the clause on “soliciting” to one on “requesting” such an image, because in certain circumstances soliciting may require the exchange of money. That is why we now have the requesting offence.

Dr Spencer

I thank the Minister for his clarification and reiteration of that point, and again for his work with colleagues to take forward the issue, on which I think we are in unison across the House.

New clause 21 is on directions to public authorities on recording of sex data. One does not need to be a doctor to know that data accuracy is critical, particularly when it comes to health, research or the provision of tailored services based on protected characteristics such as sex or age. The accuracy of data must be at the heart of this Bill, and nowhere has this been more high-profile or important than in the debate over the collection and use of sex and gender data. I thank the charity Sex Matters and the noble Lords Arbuthnot and Lucas for the work they have done to highlight the need for accurate data and its relevance for the digital verification system proposed in the Bill.

Samantha Niblett (South Derbyshire) (Lab)

The recent decision by the Supreme Court that “sex” in the Equality Act 2010 refers to biological sex at birth, regardless of whether someone holds a gender recognition certificate or identifies as of a different gender, has already left many trans people feeling hurt and unseen. Does the shadow Minister agree with me that any ID and digital verification service must consider trans people, not risk making them more likely to feel that their country is forgetting who they are?

Dr Spencer

I thank the hon. Member for her intervention, and I will shortly come on to the impact on all people of the decision of the Supreme Court. Our new clause’s focus and scope are simple. The Supreme Court ruling made it clear that public bodies must collect data on biological sex to comply with their duties under the Equality Act. The new clause ensures that this data is recorded and used correctly in accordance with the law. This is about data accuracy, not ideology.

New clause 21 is based in part on the work of Professor Alice Sullivan, who conducted a very important review, with deeply concerning findings on inaccurate data collection and the conflation of gender identity with biological sex data. She found people missed off health screening, risks to research integrity, inaccurate policing records, problems with management through the criminal justice system, and many other concerns. These concerns present risks to everyone, irrespective of biological sex, gender identity or acquired gender. Trans people, like everyone else, need health screening based on their biological sex. Trans people need protecting from sexual predators, too, and they have the right to dignity and respect.

The Sullivan report shows beyond doubt that the concerns of the last Government and the current Leader of the Opposition were entirely justified. The Government have had Professor Sullivan’s report since September last year, but the Department for Science, Innovation and Technology has still not made a formal statement about it or addressed the concerns raised, which is even more surprising given its relevance to this Bill. The correction of public authority data on sex is necessary and urgent, but it is made even more critical by the implementation of the digital verification services in the Bill.

Tonia Antoniazzi (Gower) (Lab)

I appreciate that the shadow Minister is making an important point on the Sullivan review and the Supreme Court judgment, but there are conversations in Government and with Labour Members to ensure that the Supreme Court judgment and the Sullivan review are implemented properly across all Departments, and I hope to work with the Government on that.

Dr Spencer

I thank the hon. Member for her intervention, and for all the work that she and colleagues on both sides of the House are doing in this area. I hope that the findings of the Sullivan report are implemented as soon as possible, and part of that implementation would be made possible if Members across the House supported our new clause.

For the digital verification services to be brought in, it is important that the data used to inform them is accurate and correct. Digital verification could be used to access single-sex services, so it needs to be correct, and if sex and gender data are conflated, as we know they are in many datasets, a failure to act will bring in self-ID by the back door. To be clear, that has never been the legal position in the UK, and it would conflict with the ruling of the Supreme Court. Our new clause 21 is simple and straightforward. It is about the accurate collection and use of sex data, and rules to ensure that data is of the right standard when used in digital verification services so that single-sex services are not undermined.

New clause 19 is on the Secretary of State’s duty to review the age of consent for data processing under the UK GDPR. What can or should children be permitted to consent to when using or signing up to online platforms and social media? How do we ensure children are protected, and how do we prevent harms from the use of inappropriate social media itself, separate from the content provided? How do we help our children in a world where social media can import the school, the playground, the changing room, the influencer, the stranger, the groomer, the radical and the hostile state actor all into the family home?

Our children are the first generation growing up in the digital world, and they are exposed to information and weaponised algorithms on a scale that simply did not exist for their parents. In government, we took measures to improve protections and regulate harmful content online, and I am delighted to see those measures now coming into force. However, there is increasing evidence that exposure to inappropriate social media platforms is causing harm, and children as young as 13 may not be able to regulate and process this exposure to such sites in a safe and proportionate way.

I am sure every Member across the House will have been contacted by parents concerned about the impact of social media on their children, and we recognise that this is a challenging area to regulate. How do we define and target risky and inappropriate social media platforms, and ensure that education and health tech—or, indeed, closed direct messaging services—do not fall within scope? How effective are our provisions already, and can age verification be made to work for under-16s? What IDs are available to use? What will the impact of the Online Safety Act 2023 be now that it is coming into force? What are the lessons from its implementation, and where does it need strengthening? Finally, how do we support parents and teachers in educating and guiding children so they are prepared to enter the digital world at whatever age they choose and are able to do so?

The Government must take action to ensure appropriate safeguards are in place for our children, not through outright bans or blanket restrictions but with an evidence-based approach that takes into account the recent legal changes and need for effective enforcement, including age verification for under-16s. Too often in this place we focus on making more things illegal rather than on the reasons for lack of enforcement in the first place. There is no point in immediate restrictions if they cannot be implemented.

15:00
Munira Wilson (Twickenham) (LD)

I agree with all the points the shadow Minister is making about keeping our children safe online, so why does new clause 19 commit only to a review of the digital age of data consent and of raising from 13 to 16 the age at which parental consent is no longer required? Why does he not support the Liberal Democrats’ new clause 1, which would start to implement this change? We can still, through implementation, do all the things the hon. Gentleman proposes to do, so why the delay?

Dr Spencer

There are a few issues with new clause 1. One is the scope in terms of the definition of networking services and ensuring platforms such as WhatsApp are not captured within it. Looking at new clause 19, there are challenges to implementation in this area. There is no point in clicking our fingers and saying, “Let’s change the age of digital consent,” without understanding the barriers to implementation, and without understanding whether age verification can work in this context. We do not want to create a system and have people just get around it quite simply. We need the Government to do the work in terms of setting it up so that we can move towards a position of raising the age from 13 to 16.

Max Wilkinson (Cheltenham) (LD)

The press have obviously been briefed by Conservatives that the Conservatives are pushing for a ban on social media for under-16s, but it seems that what is actually being suggested is a review of the digital age of consent with a view to perhaps increasing it to 16. The two positions are very different, and I wonder whether the tough talk in the press matches what is actually being proposed by the Opposition today.

Dr Spencer

I have been very clear on this, and it is important in such a complex area to look at the detail and nuance of the challenges around—[Interruption.] Well, it is very easy to create a new clause where we click our fingers and say, “Let’s make this more illegal; let’s bring in x, y or z restriction.” As a responsible Opposition, we are looking at the detail and complexities around implementing something like this. [Interruption.] I have been asked a few questions and the hon. Member for Cheltenham (Max Wilkinson) might want to listen to the rationale of our approach.

One question is how to define social media. Direct messaging services such as WhatsApp and platforms such as YouTube fall within the scope of social media. There are obviously social media platforms that I think all of us are particularly concerned about, including Snapchat and TikTok, but by changing the age of digital consent we do not want to end up capturing lower-risk social media platforms that we recognise are clearly necessary or beneficial, such as education technology or health technology platforms. And that is before we start looking at whether age verification can work, particularly in the 13-to-16 age group.

Chris Bryant

Sorry, I am getting a bit lost. Does the Minister think, and does the Conservative party think, that the digital age of consent should rise from 13 to 16 or not?

Madam Deputy Speaker (Ms Nusrat Ghani)

Order. I point out to Mr Bryant that Dr Ben Spencer is the shadow Minister.

Dr Spencer

I think that was wishful thinking by the Minister in this debate.

Our new clause says that we need to look at the desirability of raising the digital age of consent for data processing from 13 to 16 in terms of its impact particularly on issues such as the social and educational development of children, but also the viability of doing so in terms of the fallout and the shaking out of the Online Safety Act and with regard to age verification services. Should there be no evidence to demonstrate that it is unnecessary, we would then raise the digital age of consent from 13 to 16. It might be the case that, over the next six months, the shaking out of the Online Safety Act demonstrates that this intervention is not necessary. Perhaps concerns around particular high-risk social media platforms will change as technology evolves. We are saying that the Government should do the work with a view to raising the age in 18 months unless there is evidence to prove the contrary. [Interruption.] I have made this crystal clear, and if the Minister would choose to look at the new clause, rather than chuckling away in the corner, he might see the strategy we are proposing.

Max Wilkinson

I thank the shadow Minister for giving way. As ever, he is extremely polite in his presentation and in his dealing with interventions, but I am not sure that he dealt with my intervention, which was basically asking whether the Conservative party position is as it has briefed to the press—that it wishes to ban social media for under-16s—or that it wishes to have a review on raising the age of data consent. It cannot be both.

Dr Spencer

I say again that the position is that, following a careful look at the evidence regarding the desirability and validity of doing so—taking into account findings regarding the impact and implementation of the Online Safety Act and age verification and how one defines social media, particularly high-risk platforms—unless there is direct evidence to show that raising the age from 13 to 16 is unnecessary, which there may be, then we should raise it from 13 to 16. If that has not provided clarity, the hon. Gentleman is very welcome to intervene on me again and I will try and explain it a third time, but I think Members have got a grasp now.

This new clause will also tackle some of the concerns at the heart of the campaign for Jools’ law, and I pay tribute to Ellen Roome for her work in this area. I am very sympathetic to the tragic circumstances leading to this campaign and welcome the additional powers granted to coroners in the Bill, but I know that they do not fully address Ellen Roome’s concerns. The Government need to explain how they can be sure that data will be retained in the context of these tragedies, so that a coroner will be able to make sure, even if there are delays, that it can be accessed. If the Minister could provide an answer to that in his winding-up speech, and detail any further work in the area, that would be welcome.

On parental access to children’s data more broadly, there are difficult challenges in terms of article 8 rights on privacy and transparency, especially for children aged 16 to 17 as they approach adulthood. Our new clause addresses some of these concerns and would also put in place the groundwork to, de facto, raise the digital age of consent for inappropriate social media to 16 within 18 months, rendering the request for parental access to young teenage accounts obsolete.

I urge colleagues across the House to support all our amendments today as a balanced, proportionate and effective response to a generational challenge. The Bill and the votes today are an opportunity for our Parliament, often referred to as the conscience of our country, to make clear our position on some of the most pressing social and technological issues of our time.

Madam Deputy Speaker (Ms Nusrat Ghani)

I call the Chair of the Science, Innovation and Technology Committee.

Chi Onwurah (Newcastle upon Tyne Central and West) (Lab)

I would like to thank colleagues in the other place and in this House who have worked so hard to improve the Bill. By modernising data infrastructure and governance, this Bill seeks to unlock the secure, efficient use of data while promoting innovation across sectors. As a tech evangelist, as well as the Chair of the Science, Innovation and Technology Committee, I welcome it, and I am pleased to see colleagues from the Select Committee, my hon. Friend the Member for Stoke-on-Trent South (Dr Gardner) and the right hon. Member for North West Hampshire (Kit Malthouse), here for this debate.

Having spent many unhappy hours when working for Ofcom trying to find out where British Telecom’s ducts were actually buried, I offer a very personal welcome to the national underground asset register, and I thank the Minister for his work on this Bill as well as for his opening comments.

I agree with the Minister that there is much to welcome in this Bill, but much of the Second Reading debate was consumed by discussion on AI and copyright. I know many Members intend to speak on that today, so I will just briefly set out my view.

The problem with the Government’s proposals on AI and copyright is that they give all the power to the tech platforms, which—let us be frank—have a great deal of power already, as well as trillions of dollars in stock market capitalisation and a determination to return value to their shareholders. What they do not have is an incentive to design appropriate technology for transparency and rights reservation if they believe that in its absence they will have free access to our fantastic creators’ ingenuity. It is essential that the Minister convinces them that if they do not deliver this technology—I agree with him that it is highly possible to do so—then he will impose it.

Perhaps the Minister could announce an open competition, with a supplier contract as the prize, for whichever innovative company designs something. The Science, Innovation and Technology Committee, sitting with the Culture, Media and Sport Committee, heard from small companies that can do just that. The tech giants might not like it, but I often say that the opposite of regulation is not no regulation—it is bad regulation. If the tech platforms do not lead, they will be obliged to follow because the House will not allow the copyright of our fantastic creators to be put at risk. The Minister knows that I think him extremely charismatic and always have done, but I do not believe that “Chris from DSIT” can prevail against the combined forces of Björn from Abba and Paul from The Beatles.

The prospects for human advancement opened by using data for scientific research are immense. As a world-leading science powerhouse, the UK must take advantage of them. That is why, despite being a strong advocate of personal data rights, I welcome the Bill’s proposals to allow the reuse of data without consent for the purposes of scientific research. I am concerned, however, that the exemption is too broad and that it will be taken advantage of by data-hungry tech companies using the exemption even if they are not truly advancing the cause of scientific progress but simply, as with copyright, training their AI models.

Huge amounts of data are already collected by platforms, such as direct messages on Instagram or via web-scraping of any website that contains an individual’s personal data, such as published records or people’s public LinkedIn pages. We know it can be misused because it has been, most recently with Meta’s controversial decision to use Instagram-user data to train AI models, triggering an Information Commissioner’s Office response because of the difficulty users encountered in objecting to it. Then there is the risk of data collected via tracking cookies or the profiling of browsing behaviour, which companies such as Meta use to fingerprint people’s devices and track their browsing habits. Could the data used to create ads also be freely reusable under this exemption? The US tech firm Palantir has the contract for the NHS federated data platform. Amnesty International has already raised concerns about the potential for patients’ data being mishandled. Does the Bill mean that our health data could be reused by Palantir for what it calls research purposes?

David Davis (Goole and Pocklington) (Con)

Before the hon. Lady moves on from Palantir, I think the House should know that it is an organisation with its origins in the American security state—the National Security Agency and the Central Intelligence Agency—and I cannot understand for the life of me why we are willing to commit the data of our citizens to an organisation like that.

Chi Onwurah

I thank the right hon. Member for that intervention. I will leave it to the Minister to address his point.

The concern that is probably of most interest to my constituents is reflected in the recent report by The Sunday Times that Chelsea football club claims research and development tax credits. Will the Minister confirm that if Chelsea were to collect data on Newcastle United fans attending an away match at Stamford Bridge, it could be reused for whatever research it is undertaking as a consequence of the exemption?

My amendments 37 and 38 would incorporate into the Bill two clarifications to help reduce the potential misuse of the scientific research exemption. I thank the Ada Lovelace Institute for its help in drafting the amendments. Amendment 37 proposes placing in the Bill a basic definition of scientific research based on the “Frascati Manual” used by the ICO, enabling the “reasonably described” test to be assessed against an objective standard.

15:19
Amendment 38 is a requirement that such research be conducted in line with relevant frameworks and standards, based on the UK Research and Innovation code of practice for research, ensuring transparency for the use of scientific research exemptions. Researchers would have to publish—for example, on their website—a short statement with details of what they are reusing and send it to the ICO. This is a minimal, proportionate amount of information that researchers should already have to hand, meaning that over time we get a picture of how much people’s data is being reused and for what kinds of research, so that there is some transparency here.

When the Minister responds, could he please engage with the examples I have given? His letter to me of 17 April, which was perhaps drafted by his officials, seemed to say that the exemption is okay because we can trust companies to understand our undefined understanding of scientific research, and that the ICO would act if that were not the case. One problem with that suggestion is that we will not know how the data is being used for scientific research, because there are no transparency requirements on algorithms in general and on this exemption in particular.

Science in this country is respected—it is more respected than politicians and big tech. We must not allow that respect to be contaminated by uses that are not truly scientific. I have a real fear that if we allow this exemption to be abused, we will undermine public trust in data sharing and in science.

I will briefly speak to other parts of the Bill and amendments. On new clause 21, I hope the Government will set out how definitions of sex and gender will be consistent and appropriate to the need for which the data is being collected and verified under the digital verification provisions.
Marsha De Cordova

I thank my hon. Friend for giving way and for the speech she is making. We all know the importance of data. Does she agree that it is right that when we are recording sex, it is based on what the Equality Act 2010 determines as sex, being biological sex?

Chi Onwurah

I thank my hon. Friend for that intervention. The Minister referred to that briefly, describing it, in relation to AI, as a pipeline where bad data in would mean bad data out. My hon. Friend knows that the definition of sex and gender has been controversial and contested. The Supreme Court brought some clarity and it is important that data collection reflects consistency and clarity. If we have bad data definitions, we will undoubtedly have bad consequences. As I said, it is important that we have consistency and definition when it comes to the collection of data for these purposes, and I look forward to hearing how that will be achieved.

I also want to speak briefly in support of clause 125, which introduces rules allowing researchers to access data from online services for online safety research. The Science, Innovation and Technology Committee’s inquiry into social media algorithms and misinformation heard considerable evidence on the role of algorithms in pushing misinformation generally, and particularly to children. I very much welcome this clause, which will increase transparency, but could the Minister clarify that it will fully cover the recommender algorithms used by social media platforms, which drive new content to users?

My constituents often feel that advances in technology are done to them rather than with them and for their benefit. Critically, our constituents need to feel that they have agency over the way data impacts their lives. Rather than feeling empowered by digital innovation, too many feel the opposite: disempowered, undermined, dehumanised, tracked and even attacked. Delivering the improvements promised by the Bill must therefore go hand in hand with respecting the rights of citizens to control and manage their data, and with driving the benefits of innovation and scientific research to them.

Madam Deputy Speaker (Ms Nusrat Ghani)

I call the Liberal Democrat spokesperson.

Victoria Collins

Thank you for calling me, Madam Deputy Speaker, and for your patience regarding my earlier intervention. I am very passionate about all elements of the Bill.

On Second Reading, I said:

“Data is the new gold”—[Official Report, 12 February 2025; Vol. 762, c. 302.]

—a gold that could be harnessed to have a profound impact on people’s daily lives, and I stand by that. With exponential advances in innovation almost daily, this has never been truer, so we must get this right.

I rise today to speak to the amendments and new clauses tabled in my name specifically, and to address two urgent challenges: protecting children in our digital world and safeguarding the rights of our creative industry in the age of artificial intelligence. The Bill before us represents a rare opportunity to shape how technology serves people, which I firmly believe is good for both society and business. However, I stand here with mixed emotions: pride in the cross-party work we have accomplished, including with the other place; hope for the progress we can still achieve; but also disappointment that we must fight so hard for protections that should be self-evident.

New clause 1 seeks to raise the age of consent for social media data processing from 13 to 16 years old. We Liberal Democrats are very clear where we stand on this. Young minds were not designed to withstand the psychological assault of today’s social media algorithms. By raising the age at which children can consent to have their data processed by social media services, we can take an important first step towards tackling those algorithms at source. This is a common-sense measure, bringing us in line with many of our European neighbours.

The evidence before us is compelling and demands our attention. When I recently carried out a safer screens tour of schools across Harpenden and Berkhamsted to hear exactly what young people think about the issue, I heard that they are trapped in cycles of harmful content that they never sought out. Students spoke of brain rot and described algorithms that pushed them towards extreme content, despite their efforts to block it.

The evidence is not just anecdotal; it is overwhelming. Child mental health referrals have increased by 477% in just eight years, with nearly half of teenagers with problematic smartphone use reporting anxiety. One in four children aged 12 to 17 have received unwanted sexual images. We know that 82% of parents support Government intervention in this area, while a Liberal Democrat poll showed that seven in 10 people say the Government are not doing enough to protect children online.

Freddie van Mierlo (Henley and Thame) (LD)

I welcome new clause 1, tabled by my hon. Friend. Does she agree that raising the age of consent for processing personal data from 13 to 16 will help reduce the use of smartphones in schools by reducing their addictiveness, thereby also improving concentration and educational performance in schools?

Victoria Collins

That is exactly what is at the heart of this matter—the data that drives that addictiveness and commercialises our children’s attention is not the way forward.

Many amazing organisations have gathered evidence in this area, and it is abundantly clear that the overuse of children’s data increases their risk of harm. It powers toxic algorithms that trap children in cycles of harmful content, recommender systems that connect them with predators, and discriminatory AI systems that are used to make decisions about them that carry lifelong consequences. Health Professionals for Safer Screens—a coalition of child psychiatrists, paediatricians and GPs—is pleading for immediate legislative action.

This is not a partisan issue. So many of us adults can relate to the feeling of being drawn into endless scrolling on our devices—I will not look around the Chamber too much. Imagine how much more difficult it is for developing minds. This is a cross-party problem, and it should not be political, but we need action now.

Let me be absolutely clear: this change is not about restricting young people’s digital access or opposing technology and innovation; it is about requiring platforms to design their services with children’s safety as the default, not as an afterthought. For years we have watched as our children’s wellbeing has been compromised by big tech companies and their profits. Our call for action is supported by the National Society for the Prevention of Cruelty to Children, 5Rights, Health Professionals for Safer Screens, Girlguiding, Mumsnet and the Online Safety Act network. This is our chance to protect our children. The time to act is not 18 months down the line, as the Conservatives suggest, but now. I urge Members to support new clause 1 and take the crucial steps towards creating a digital world where children can truly thrive.

To protect our children, I have also tabled amendment 45 to clause 80, which seeks to ensure that automated decision-making systems cannot be used to make impactful decisions about children without robust safeguards. The Bill must place a child’s best interests at the heart of any such system, especially where education or healthcare are concerned.

We must protect the foundational rights of our creators in this new technological landscape, which is why I have tabled new clause 2. The UK’s creative industries contribute £126 billion annually to our economy and employ more than 2.3 million people—they are vital to our economy and our cultural identity. These are the artists, musicians, writers and creators who inspire us, define us and proudly carry British creativity on to the global stage. Yet today, creative professionals across the UK watch with mounting alarm as AI models trained on their life’s work generate imitations without permission, payment or even acknowledgment.

New clause 2 would ensure that operators of web crawlers and AI models comply with existing UK copyright law, regardless of where they are based. This is not about stifling innovation; it is about ensuring that innovation respects established rights and is good for everyone. Currently, AI companies are scraping creative works at an industrial scale. A single AI model may be trained on thousands of copyrighted works without permission or compensation.

The UK company Polaron is a fantastic example, creating AI technology to help engineers to characterise materials, quantify microstructural variation and optimise microstructural designs faster than ever before. Why do I bring up Polaron? It is training an AI model built from scratch without using copyright materials.

David Davis

I am emphatically on the hon. Lady’s side in her intent to protect British creativity, but how does she respond to the implicit threat from artificial intelligence providers to this and other elements of the Bill to effectively deny AI to the UK if they find the regulations too difficult to deal with?

Victoria Collins

We have a thriving innovation sector in the UK, so those companies are not going anywhere—they want to work with the UK. We actually have a system now that has a fantastic creative industry and we have innovation and business coming in. There are many ways to incentivise that. I talk a lot about money, skills and infrastructure—that is what these innovative companies are looking for. We can make sure the guardrails are right so that it works for everyone.

By ensuring that operators of web crawlers and AI models comply with existing UK copyright law, we are simply upholding established rights in a new technological context. The UK led the world in establishing trustworthy financial and legal services, creating one of the largest economies by taking a long-term view, and we can do the same with technology. By supporting new clause 2, we could establish the UK as a base for trustworthy technology while protecting our creative industries.

Finally, I will touch on new clause 4, which would address the critical gap in our approach to AI regulation: the lack of transparency regarding training data. Right now, creators have no way of knowing if their work has been used to train AI models. Transparency is the foundation of trust. Without it, we risk not only exploiting creators, but undermining public confidence in these powerful new technologies. The principle is simple: if an AI system is trained using someone’s creative work, they deserve to know about it and to have a say in how it is used. That is not just fair to creators, but essential for building an AI ecosystem that the public trust. By supporting new clause 4, we would ensure that the development of AI happens in the open, allowing for proper compensation, attribution and accountability. That is how we will build responsible AI that serves everyone, not just the tech companies.

On the point of transparency, I will touch briefly on a couple of other amendments. We must go further in algorithmic decision making. That is why I have tabled amendment 46, which would ensure that individuals receive personalised explanations in plain language when an automated decision system affects them. We cannot allow generic justifications to stand in for accountability.

Pete Wishart

I will support the hon. Lady’s new clause 2 tonight, if she pushes it to a vote, and I encourage her also to push new clause 4 to a vote. This is a most important issue. We must ensure that transparency is available to all artists and creators. Does she agree that there is no good technological barrier to having transparency in place right now?

15:30
Victoria Collins

That has been my challenge to the tech companies, which I absolutely support in innovating and driving this—but if they are saying that it would be easy for creatives to do this, why is it not easy for big tech companies with power and resources to lead the way?

Amendments 41 to 44 would ensure that the decisions made about people, whether through data profiling, automated systems or algorithms, are fair. They would clarify that meaningful human involvement in automated decision making must be real, competent and capable of changing the outcome, not just a box-ticking exercise.

The amendments before us offer a clear choice to protect our children and creators or to continue to delay while harm grows—the choice to build a future in which technology either builds trust or destroys it. We have the evidence and the solutions, and the time for action is now. Let us choose a future in which technology empowers, rather than exploits—one that is good for society and for business. I urge all Members to support our amendments, which would put people and the wellbeing of future generations first.

Samantha Niblett

I am pleased to speak in this debate in support of new clause 14, in the name of my hon. Friend the Member for Leeds Central and Headingley (Alex Sobel), to which I have added my name. The clause would give our media and creative sectors urgently needed transparency over the use of copyright works by AI models. I am sure that my speech will come as no surprise to the Minister.

I care about this issue because of, not in spite of, my belief in the power of AI and its potential to transform our society and our economy for the better. I care because the adoption of novel technologies by businesses and consumers requires trust in the practices of firms producing the tech. I care about this issue because, as the Creative Rights in AI Coalition has said:

“The UK has the potential to be the global destination for generative firms seeking to license the highest-quality creative content. But to unlock that immense value, we must act now to stimulate a dynamic licensing market: the government must use this legislation to introduce meaningful transparency provisions.”

Although I am sure that the Government’s amendments are well meant, they set us on a timeline for change to the copyright framework that would take us right to the tail end of this Parliament. Many in this House, including myself, do not believe that an effective opt-out mechanism will ever develop; I know it is not in the Bill right now, but it was proposed in the AI and copyright consultation. Even if the Government insist on pursuing this route, it would be a dereliction of duty to fail to enforce our existing laws in the intervening period.

Big tech firms claim that transparency is not feasible, but that is a red herring. These companies are absolutely capable of letting rights holders know whether their individual works have been used, as OpenAI has been ordered to do in the Authors Guild v. OpenAI copyright case. Requiring transparency without the need for a court order will avoid wasting court time and swiftly establish a legal precedent, making the legal risk of copyright infringement too great for AI firms to continue with the mass theft that has taken place. That is why big tech objects to transparency, just as it objects to any transparency requirements, whether they are focused on online safety, digital competition or copyright. It would make it accountable to the individuals and businesses that it extracts value from.

The AI companies further argue that providing transparency would compromise their trade secrets, but that is another red herring. Nobody is asking for a specific recipe of how the models are trained: they are asking only to be able to query the ingredients that have gone into them. Generative AI models are made up of billions of data points, and it is the weighting of data that is a model’s secret sauce.

The Government can do myriad things around skills, access to finance, procurement practices and energy costs to support AI firms building and deploying models in the UK. They insist that they do not see the AI copyright debate as a zero-sum game, but trading away the property rights of 2.4 million UK creatives—70% of whom live outside London—to secure tech investment would be just that.

There are no insurmountable technical barriers to transparency in the same way that there are no opt-outs. The key barrier to transparency is the desire of tech firms to obscure their illegal behaviour. It has been shown that Meta employees proactively sought, in their own words,

“to remove data that is clearly marked as pirated/stolen”

from the data that they used from the pirate shadow library, LibGen. If they have technical means to identify copyright content to cover their own backs, surely they have the technical means to be honest with creators about the use of their valuable work.

I say to the Minister, who I know truly cares about the future of creatives and tech businesses in the UK—that is absolutely not in question—that if he cannot accept new clause 14 as tabled, he should take the opportunity as the Bill goes back to the Lords to bring forward clauses that would allow him to implement granular transparency mechanisms in the next six to 12 months. I and many on the Labour Benches—as well as the entire creative industries and others who do not want what is theirs simply to be taken from them—stand ready to support the development of workable solutions at pace. It can never be too soon to protect the livelihoods of UK citizens, nor to build trust between creators and the technology that would not exist without their hard work.

Madam Deputy Speaker (Judith Cummins)

I call the Chair of the Culture, Media and Sport Committee.

Dame Caroline Dinenage (Gosport) (Con)

I rise to support new clauses 2 to 5 in the name of the hon. Member for Harpenden and Berkhamsted (Victoria Collins); to pay tribute to Baroness Kidron, who has driven forward these amendments in the other place; and to speak in favour of new clause 20 in the name of the official Opposition.

I am beginning to sound a bit like a broken record on this matter, but our creative industries are such a phenomenal UK success story. They are our economic superpower and are worth more than automotive, aerospace and life sciences added together, comprising almost 10% of UK registered businesses and creating nearly 2.5 million jobs. More than that, our creative industries have so much intrinsic value; they underpin our culture and our sense of community. Intellectual property showcases our nation around the world and supports our tourism sector. As a form of soft power, there is simply nothing like it—yet these social and economic benefits are all being put at risk by the suggested wholesale transfer of copyright to AI companies.

The choice presented to us always seems, wittingly or unwittingly, to pit our innovative AI sector against our world-class creative industries and, indeed, our media sector. It is worth noting that news media is often overlooked in these debates, but newspapers, magazines and news websites license print and online content. In turn, that helps to support high-quality and independent journalism, which is so vital to underpinning our democratic life. That is essential considering recent news that the global average press freedom score has fallen to an all-time low.

I want to push back against the false choice that we always seem to be presented with that, somehow, our creative industries are Luddites and are not in favour of AI. I have seen time and again how our creators have been characterised by big tech and its lobbyists as somehow resistant to technological progress, which is of course nonsensical.

Chris Bryant

I want to knock on the head the idea that any Government Minister thinks that the creative industries are Luddites. As I said in the debate in Westminster Hall—I know that the hon. Lady was not able to be there—many creative industries use all sorts of technical innovations every single day of the week. They are not Luddites at all; they are the greatest innovators in the country.

Dame Caroline Dinenage

I thank the Minister for that reassurance. I did take part in a Westminster Hall debate on this matter a couple of weeks ago, but one of his colleagues was responding. I made the same point then. Quite often in the media or more generally, AI seems to be pitted against our creative industries, which should not be the case, because we know that our creative industries embrace technology virtually more than any other sector. They want to use AI responsibly. They do not want to be replaced by it. The question before us is how lawmakers can ensure that AI is used ethically without this large-scale theft of IP. We are today discussing amendments that go somewhere towards providing an answer to that question.

David Davis

On this issue of Luddites, surely one of the problems for English language creators is that what they create is of more value because of the reach of the English language over others. Therefore, they are more likely to have their product scraped and have more damage done to them.

Dame Caroline Dinenage

My right hon. Friend makes a very good observation, but the fact is that so much content has already been scraped. Crawlers are all over the intellectual property of so many of our creators, writers and publishers—so much so that we are almost in a position where we are shutting the gate after the horse has bolted. Nevertheless, we need to do what we can legislatively to get to a better place on this issue.

New clause 2 would simply require anyone operating web crawlers for training and developing AI models to comply with copyright law. It is self-evident and incontrovertible that AI developers looking to deploy their systems in the UK should comply with UK law, but they often claim that copyright is not very clear. I would argue that it is perfectly clear; it is just that sometimes they do not like it. It is a failure to abide by the law that is creating lawsuits around the world. The new clause would require all those marketing their AI models in the UK to abide by our gold-standard copyright regime, which is the basis that underpins our thriving creative industries.

New clause 3 would require web crawler operators and AI developers to disclose who is operating a crawler, what it is collecting, why, and when it is being used. It would also require them to use different crawlers for different purposes and to ensure that rights holders are not punished for blocking them. A joint hearing of the Culture, Media and Sport Committee and the Science, Innovation and Technology Committee heard how publishers are being targeted by thousands of web crawlers with the intention of scraping content to sell to AI developers. We heard that many, if not most, web crawlers are not abiding by current opt-out protocols—robots.txt, for example. To put it another way, some developers of large language models are buying data scraped by third-party tech companies, in contravention of robots.txt protocols, to evade accusations of foul play. All this does is undermine existing licensing and divert revenues that should be returning to our creative industries and news media sector. New clause 3 would provide transparency over who is scraping copyrighted works and give creators the ability to assert and enforce their rights.
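For context, the robots.txt opt-out protocol referred to here is simply a plain-text file served at a website’s root, listing which crawlers may access which paths. A minimal illustration of how a publisher might try to block AI training crawlers (the crawler names shown are examples of real, publicly documented user agents, not an exhaustive list):

```
# robots.txt served at the site root, e.g. https://example-publisher.co.uk/robots.txt
# Block two well-known AI training crawlers from the whole site.
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

# All other crawlers may access everything.
User-agent: *
Allow: /
```

Compliance with robots.txt is entirely voluntary on the crawler’s part, which is precisely the weakness identified in the evidence described above.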

New clause 4 would require AI developers to be transparent about what data is going into their AI models. Transparency is fundamental to this debate. It is what we should all be focusing on. We are already behind the drag curve on this. California has introduced transparency requirements, and no one can say that the developers are fleeing Silicon Valley just yet.

New clause 20, tabled by the official Opposition, also addresses transparency. It would protect the AI sector from legal action by enabling both sides to come to the table and get a fair deal. A core part of this new clause is the requirement on the Secretary of State to commit to a plan to help support creators where their copyright has been used in AI by requiring a degree of transparency.

New clause 5 would provide the means by which we could enforce the rules. It would give the Information Commissioner the power to investigate, assess and sanction bad actors. It would also entitle rights holders to recover damages for any losses suffered, and to injunctive relief. Part of the reason why rights holders are so concerned is that the vast majority of creators do not have deep enough pockets to take on AI developers. How can they take on billion-dollar big tech companies when those companies have the best lawyers that money can buy, who can bog cases down in litigation and red tape? Rights holders need a way of enforcing their rights that is accessible, practical and fair.

In their AI and copyright consultation, the Government say that they want to ensure

“a clear legal basis for AI training with copyright material”.

That is what the new clauses that I have spoken to would deliver. Together they refute the tech sector’s claims of legal uncertainty, while providing transparency and enforcement capabilities for creators.

Ultimately, transparency is the main barrier to greater collaboration between AI developers and creators. Notwithstanding some of the unambitious Government amendments, the Opposition’s amendments would provide the long-overdue redress to protect our creative industries by requiring transparency and a widening of the scope of those who are subject to copyright laws.

The amendments would protect our professional creators and journalists, preserve the pipeline of young people looking to make a career in these sectors themselves, and cement the UK as a genuine creative industries superpower, maintaining our advantage in the field of monetising intellectual property. One day we may make a commercial advantage out of the fact that we are the place where ethical AI companies can set up—we could be the envy of the world.

Preet Kaur Gill (Birmingham Edgbaston) (Lab/Co-op)

I rise to support the Bill and speak to new clauses 22 and 23 tabled in my name. The measures in the Bill will unlock the power of data to grow the economy, to improve public services and to make people’s lives easier. By modernising the way in which consumers and businesses can safely share data, the Bill will boost the economy by an estimated £10 billion over the next decade. The Bill will also make our public services more efficient and effective, saving our frontline workers from millions of hours of bureaucracy every year, which they can use to focus on keeping us safe and healthy.

15:45
I welcome the Minister’s mission to update Government for the modern digital age. Any responsible Government must use data to build an accurate picture of who they are serving. It is because I recognise the incredible power of data that I tabled my probing new clauses on the need to record ethnicity data for Sikhs and Jews. If some communities are invisible to public bodies, how can we expect them to be served equally? I tabled these new clauses to put right a wrong. The ethnicity data that public bodies collect is often patchy and incomplete, and for Sikhs and Jews specifically data is not collected at all.

Jews and Sikhs have been legally recognised as ethnic groups for over 40 years, since the Mandla v Dowell-Lee case in 1983. That was reaffirmed in the Equality Act 2010, where they are recognised as both religious and ethnic groups. As those are protected characteristics, we would expect public bodies to be instructed to collect information on Jews and Sikhs routinely—especially when public bodies use ethnic categories only to deliver public services—but they are not. That is why this really matters.

Jews and Sikhs therefore face discrimination, which was further evidenced by the 2017 race disparity audit that found no data on Sikhs in the 340 datasets assessed across Government. In recent years, Jews and Sikhs have been recorded only in religion data. However, that data is rarely collected to a good standard—if at all—it excludes secular and non-practising Jews and Sikhs, and is not used by public bodies to monitor and reduce inequalities or provide public services. How are we meant to tackle discrimination and inequality if Jews and Sikhs are invisible to policymakers? New clause 22, in my name, would therefore make it a statutory requirement for public bodies to collect ethnicity data and ensure that, within that, they collect data on Jews and Sikhs who are currently excluded.

New clause 23 is on a more specific but no less important point: the health inequalities that persist as a result of inadequate data. It would specifically target the provisions in the Bill about the register of births and deaths. In order for that data to be meaningful, surely the Minister will agree that Jewish and Sikh deaths matter. If they are not included, what message is that sending to both those groups?

The pandemic revealed the tragic consequences of failing to recognise health inequalities among different ethnic groups. Many experts in public health now accept that we were too slow to recognise that some ethnic groups were dying at a far higher rate than others. The Office for National Statistics belatedly started analysing covid-related deaths data by religious group—a short-term exercise that has since been discontinued—and found that Sikhs died disproportionately from covid, even after adjusting for other factors. Not only that; it showed that Sikhs were affected at a different rate from other predominantly south Asian groups, demonstrating the inadequacy of the existing ethnic categories.

The same goes for Jews, who are often missing from data that the NHS collects, as acknowledged in a report from the NHS Race and Health Observatory published in December. British Jews also died from covid at almost twice the rate of the rest of the population, and certain genetic conditions also have a higher prevalence among Jewish people, such as breast cancer in Ashkenazi Jewish women. The new clause would take forward one of the Race and Health Observatory’s key recommendations, to mandate the inclusion of Jewish as an option for ethnicity, allowing us to collect essential data for tracking health outcomes. It is of course right that we do the same for the Sikh community. Will the Minister commit to accepting these new clauses, given the significance of the arguments in their favour?

England and Wales are behind other countries on this issue. The United States Department of Health and Human Services has recorded ethnicity on death certificate records for decades, and Scotland has recorded ethnicity as part of the deaths registration process since 2012. In October 2020—during the pandemic—the previous Government announced plans to adopt the measure in England, too, but despite warm words from the then Minister for Equalities, the right hon. Member for North West Essex (Mrs Badenoch)—now the Leader of the Opposition—no action was taken.

I am pleased that the Government are taking forward important changes to ethnicity data collection within the deaths registration process, but as it stands, the process would pretty much replicate the in-built bias against Jews and Sikhs, who do not have their own specific categories in ONS-designed ethnicity questionnaires. They simply rely on existing recorded information for ethnicity. My new clauses would change that, making it mandatory for public bodies to collect ethnicity data for Jews and Sikhs. After all, it is not religious differences that make Ashkenazi women more susceptible to breast cancer. The distinction between recording ethnicity versus religion is the difference between Jewish and Sikh communities being counted and not. Good quality data saves lives. That is why I introduced a ten-minute rule Bill on the issue in the House last year, and my campaign had wide support from both the Jewish and Sikh communities.

I do not intend to press my new clauses to a vote, but I hope the Government will bring forward changes that mandate the inclusion of Jewish and Sikh ethnicity categories for the purposes of public service delivery. I fully support the Bill’s mission to unlock the power of data to transform lives. That data must be fair. Sikhs and Jews must no longer be invisible to policymakers.
Sir Iain Duncan Smith

I rise to speak in support of amendment 10 tabled by the hon. Member for Leeds Central and Headingley (Alex Sobel), in my name and that of others. I congratulate him on the amendment, as it is worth talking about.

The amendment is quite simple, in a way, as its key point is that it prevents the transfer of UK user data to jurisdictions where data rights cannot be enforced and there is no credible right of redress. The core principle of data protection law is accountability, yet current UK law allows UK companies to transfer user data to their international partners in jurisdictions where there is no credible appeals process and no predictable rule of law. That basically puts power in the hands of those who have signed contracts containing standard data protection clauses. Those contracts create the illusion of protection, but in reality the data transfer is unsafe, either because the prospect of state interference is real or because the conditions for protection of data transfer simply are not present. We rely too much on the idea that, somehow, contract law in the UK will protect the data being transferred across to other countries, but this is about countries where such rules do not apply.

Transferring data to regimes such as China, for example, is not just a threat to UK citizens’ privacy but a national security risk. British citizens’ personal information, health records, financial details, biometrics, genomics or location data could be accessed under China’s national intelligence law, which compels organisations to co-operate with state intelligence work in secret. That is not speculation; it is the well-known and established law in China.

This is not only about China, but I use that country as a good example because it is a regular abuser of data. We have been unbelievably stupid across the board, in companies and so on, in assuming straightaway that the rules would apply to Chinese companies and they would enforce them. They cannot, because under the national intelligence law, they are told, “You will provide data as and when we require it from whatever source you have access to.”

The situation right now in Ireland is interesting. The Irish Data Protection Commission recently fined TikTok the not inconsiderable sum of €530 million and found that the company had illegally transferred data from users in the European economic area to China. The commission determined that Chinese law offers no essentially equivalent protection to the GDPR, owing to state surveillance laws and the lack of judicial oversight.

That is not a lone example. I have written on a number of occasions about the stupidity of the contract law covering things such as pregnancy tests and covid tests, which were dominated by a Chinese company called BGI. It is the biggest genomics company in the world and it was allowed to hold about 15% of the data gathered for tests for use back in China. We now know that China is using that data, working with AI companies, to develop tests and to reference weaknesses in certain ethnic groups. We see what is already going on in Xinjiang, where a troublesome ethnic group is being deliberately targeted through genocide to get rid of it, but it is also looking at areas and weaknesses in Europe that may well in turn be usable. We have allowed it under this contract to have that data presuming that it would be protected. It is not protected at all; it has simply been transferred and is now being used for military purposes.

Those are just two examples, but it is interesting that Ireland has already taken action. Let us not forget the Shanghai police database leak in 2022 in which the personal data of over 1 billion Chinese citizens, including criminal records and biometric details, was left openly accessible online for over a year without any enforcement action or Government accountability.

I congratulate the hon. Gentleman on tabling the amendment, because it goes to the heart of what it means to be a democracy that values the rule of law, privacy and the dignity of the individual. It rightly states that no third country can be considered adequate if it lacks credible means for judicial protection, administrative redress or statutory legal remedy. It aligns closely with the high threshold set by the Schrems II judgment, and it ensures that the standards do not fall below those we uphold, and are upheld among our friends in the European Union.

David Davis

My right hon. Friend makes a formidably important point. The amendment highlights one of the extraordinary weaknesses of the Bill, which is that it in effect reverses GDPR on a large number of citizen protections. To reiterate the point he gently made, that enormous fine will not stop TikTok, because it operates under legal compulsion. Even though it paid £450 million, it will continue to commit the criminal offence for which it has just been convicted.

Sir Iain Duncan Smith

I agree with my right hon. Friend: that is the peculiarity. The Minister knows only too well about the nature of what goes on in countries such as China. Chinese companies are frankly scared stiff of cutting across what their Government tell them they have to do, because what happens is quite brutal.

We have to figure out how we protect data from ill use by bad regimes. I use China as an example because it is simply the most powerful of those bad regimes, but many others do not observe data protection in the way that we would assume under contract law. For example, BGI’s harnessing of the data it has gleaned from covid tests, and its dominance in the pregnancy test market, is staggering. It has been officially allowed to take 15% of the data, but it has taken considerably more, and that is just one area.

Genomics is a huge and vital area right now, because it will dominate everything in our lives, and it populates AI with an ability to describe and recreate the whole essence of individuals, so this is not a casual or small matter. We talk about AI being used in the creative industries—I have a vested interest, because my son is in the creative industries and would support what has been said by many others about protecting them—but this area goes a whole quantum leap in advance of that. We may not even know in the future, from the nature of who they are, who we are talking to and what their vital statistics are.

This amendment is not about one country; it is about providing a yardstick against which all third countries should be measured. If we are to maintain the UK’s standing as a nation that upholds privacy, the rule of law, democracy and accountability, we must not allow data to be transferred to regimes that fundamentally do not share those values. It is high time that we did this, and I am glad to see the Minister nodding. I hope therefore that he might look again at the amendment. Out of old involvement in an organisation that he knows I am still part of, he might think to himself that maybe this is worth doing or finding some way through.

Chris Bryant

I do not resile from my views just because I have become a Minister, just as the right hon. Member did not when he became a Minister. He makes an important set of points. I do think, however, that they are already met by the changes in the schedule to article 45B, which is not an exhaustive list of things that the Secretary of State may consider. The points he refers to are certainly things that the Secretary of State could—and should, I would argue—consider.

Sir Iain Duncan Smith

I am grateful to the Minister, and I hope that that might find its way on to the face of the Bill with a little more description, but I understand that and I acknowledge that he does as well.

16:00
Although we will not press this to a vote, I hope that the Government realise that we sometimes have no comprehension of how big and important the powers we are up against are, or of how they dominate their own producing companies in every international market. We must wake up to that and understand the real threat, not just to the ability to create but to the ability to exist as who we are. At that point we may come to terms with this. Having listened to the Minister—we have worked together on many issues to do with both Russia and China—I hope the Government will think again and come up with a slightly tighter version in line with the amendment tabled by the hon. Member for Leeds Central and Headingley (Alex Sobel).
Jon Trickett (Normanton and Hemsworth) (Lab)

I have tabled new clause 18, which is about health and instituting a new public interest test. There is an existing test, but it is not very accessible or useful nowadays. I will explain why. Probably the largest—or the most sensitive—database in the United Kingdom is the one held by the NHS. Almost every human being in our country is on a file somewhere, hopefully for beneficial and clinical reasons, in the NHS. I have had a series of medical issues in the last few months, and I am astonished by how much data is held about me. That will be the same for all of us, I guess.

That database is filled with the most vital, private and intimate details about all our lives, and if it is going to be sustained, it is important that it retains the confidence of patients and of citizens as a whole. The truth is that the Information Commissioner is not convinced that the Bill goes far enough in protecting us and the NHS data, and if the Minister has time at the end, I hope he will be able to comment on whether he is persuaded by the Information Commissioner that we should move further.

This data, which is intimate and private, as I have just said, can also be the basis of major advances in human welfare. We can imagine all sorts of ways that our interests as human beings are being advanced by the use of that data every day. I am thinking, for example, of the search for a covid vaccine, which was led by a British scientist and partly based on the data that was available from the NHS. We should be celebrating that—that scientific research is essential—but there is a threat to this database as well, and in my view it comes from two separate sources. First, there are hostile—or even friendly—state actors. Secondly, there are private interests who want to use the data not for human welfare, though that might be a fig leaf that they use, but for private profit. That cannot be right, given what that data has been created for.

Christine Jardine (Edinburgh West) (LD)

The hon. Gentleman makes an important point. Data can be vulnerable, and its breach is a breach of privacy. Does he agree that one danger in new clause 21, about the data on individuals’ sex at birth, is that it risks breaching someone’s privacy if they have kept that fact private and that data becomes public knowledge through the means being discussed?

Jon Trickett

I thank the hon. Member for making that important point, and of course she is right.

I go back to this question of the threats to the database, which are not simply the product of my imagination; they are real. First, all data can be monetised, but this database is so large that huge commercial interests are now trying to get access to that health data. I do not want to cause offence to any hon. Members, all of whom I know follow the rules, but it is interesting that nearly £3 million from the private health sector was made available to over 150 different Members of Parliament. I do not suggest that any Member has done anything inappropriate—that would be wrong of me—but one wonders how almost £3 million was found by a private sector that has no commercial interest in pursuing those investments.

Secondly, on commercial interests, will the Minister confirm that at no stage will any data or any other aspect of the NHS be up for sale as part of negotiations with the United States on a trade deal? Will the Government provide some guidance on that? If the House reflects on private sector interests—which are not necessarily in the best interests of humanity—and how they make money, there is an interesting thought about health insurance. A party represented in the House is led by an individual who has suggested that we should end the way that we fund the NHS and replace it with an insurance system. If the insurance industry got access to the data held on all of us by the NHS, they would be able to see the genome of each person or of groups of people, and provide differential rates of insurance according to people’s genetic make-up. That is a serious threat. I do not think the party that has recently entered the House has thought that through, but companies providing insurance could commercialise that data. That is one reason we must never follow the track towards a national insurance system to replace the NHS.

Yesterday, the Secretary of State for Health and Social Care told the House that we will not be privatising the NHS, and I welcome that statement. Reference has already been made to Palantir—the right hon. Member for Goole and Pocklington (David Davis) mentioned it earlier—and the contract that we inherited from the previous Government. It is extraordinary that Palantir, a company that has deep roots in the United States defence establishment, should be handling the data of millions of people, when its chair has said that he is completely opposed to the central principle of the NHS and that he effectively wants a private health system in the UK. How could a £500 million contract to handle our personal data have been handed over to such a company, led by a person whose purpose seems to be to destroy the principles of our NHS? How our data is handled should be our decision, in the United Kingdom.

The Information Commissioner says that it is important that this precious and vital data, which is personal to each of us, should be protected against any possibility of cyber-attacks. However, there has already been a cyber-attack. Qilin—the way I am pronouncing it makes it sound as if someone is trying to commit murder, but there may be another way of saying it—is a Russian cyber-criminal group that obtained access to 400 GB of private information held by a company dealing with pathology testing. That is an enormous amount of data. Qilin then attempted to extort money from the company that held the data. I do not know whether enough provision is made in the Bill for the protection of our data, so I suggest that there should be a new public interest test, with a report to Parliament within six months, which we can all debate and then examine whether the legislation has gone far enough.

Finally, the Information Commissioner says three things. First, the database must retain public confidence. Media discussions and opinion polling show that people are losing confidence that their personal data is secure, and I understand why that should be the case. Secondly, data should be properly protected and built from the beginning with proper safeguards against cyber-attacks. Thirdly, and perhaps most importantly, the Bill refers to an effective exemption for scientific research. As my hon. Friend the Member for Newcastle upon Tyne Central and West (Chi Onwurah) said, private companies, and perhaps US companies, might use the idea of promoting scientific research as a fig leaf to hide their search for profit from the precious commodity—data—that we have because we created our NHS. That is a very dangerous thought, and the Information Commissioner says he is not convinced that the definition of scientific research in the Bill is sufficiently strong to protect us from predatory activity by other state actors or private companies.

David Davis

The hon. Gentleman is making an excellent speech and some very perceptive points. I remind him that previous attempts by the NHS to create a single data standard have all failed, because the GPs did not believe that the security levels were sufficient. It is not just the Information Commissioner; the GPs refused to co-operate, which highlights the powerful point that the hon. Gentleman is making.

Jon Trickett

I am grateful to the right hon. Gentleman for making that very serious point. When the clinicians—whose duty is to protect their patients—say they are not convinced about the safety of data being handed over to a central database, we have to listen to their reactions.

I do not intend to press my new clause to the vote, but it is important that we continue to debate this matter, because this enormous database—which can contribute to the general welfare of all humanity—must be protected in such a way that it retains confidence and ensures the security of the whole system. With that, I leave the discussion to continue on other matters.

Pete Wishart

Thank you ever so much, Madam Deputy Speaker—other matters we shall attend to.

I speak in support of new clauses 2 to 6 and new clause 14, which I enthusiastically support. I believe that those new clauses represent our very last chance to guarantee at least a bit of security for our creative industries in the face of what can only be described as the almost existential threat posed by generative AI. This is critical. I listened to the Minister very carefully, but this lackadaisical approach and the progress he is intending do not properly reflect the scale of the threat and challenge that our creative industries are currently confronted with. I accept that we have come a long way in this debate, and I accept the positive tone the Minister tries to take when dealing with these issues. I believe that he is sincere about trying to find a solution—he wants to get to a place where both the AI companies and the creative industries are satisfied. I am not entirely sure that we will get to that place, but I wish him all the best in those efforts.

We have certainly come a long way since the first statement we had in this House. I am sure that hon. Members will remember the belligerent way in which the Secretary of State presented that first statement— I am surprised that he is not here today. He was almost saying to the creative industries that they had to take it on the chin in order to satisfy this Government’s attempts to find some economic growth—which they have so far found elusive—in the shape of unfettered artificial intelligence, and that we should just get on with that agenda.

Alison Bennett (Mid Sussex) (LD)

Yesterday, I spoke to a local author in Mid Sussex, Chris Bradford. He has written a number of brilliant children’s books, including the “Young Samurai” series, which my own children enjoyed a few years ago. Going back to the point made by the hon. Member for Gosport (Dame Caroline Dinenage), Chris told me that he is not against AI—he can see that it has uses—but that what we are seeing is blatant theft. Does the hon. Member for Perth and Kinross-shire (Pete Wishart) agree that the creative industries are part of the answer to growing our economy?

Pete Wishart

I agree with the hon. Lady, and I will give her a personal example—I should have declared my interests, as set out in the register. Throughout at least the past five decades, artists have worked with the technology that is available. It is the first thing they turn to when going to the studio to make a new recording. The first thing they do in the film industry is to look for all sorts of innovation. It is absurd to suggest that somehow people who work in the creative sector will not embrace this new technology and use it for all it is worth, so I fully accept what she said.

16:14
I want to refer to the public campaign, because I am sure I am not alone in having an inbox full of correspondence from concerned constituents. Our constituents are right with our artists in the creative sector when it comes to this issue. Members will know that I have been evangelising on the value of copyright and intellectual property rights for at least two decades, and I always felt I was having quite a job in getting through and reaching our constituents, but by God have they got it now. They understand and appreciate that our gold-standard copyright regime underpins our success in the creative sectors right around the world. It is our IP rights and copyright that ensure that the artists our constituents love, and on whose behalf they write, are properly rewarded for the works they produce and recognised for the wonderful things they bring forward.
Is it not ironic that just at the point when our constituents are starting to get it and see the value of copyright, we have a Government who are moving away from the central principles of that copyright regime and are maybe at the point of watering it down, when it has given us sustained success over all these years? I want to thank the campaign and the artists for doing this, because it has moved the Government on significantly. We have now seen them abandon their preferred opt-out mechanism when it comes to their consultation, and we welcome that. Instead of resolving the debate, it seems that the Government are kicking it into the long grass, to this no man’s land of economic assessments, reports and consulting with the sector. That might take years, and we have not got years. We have artificial intelligence, in the form of the bots and training machines, hoovering up all our cultural heritage at this very moment. The matter is pressing, and we need solutions now.
The Government have introduced new clauses 16 and 17, and I do not think anybody has any problem with them—it is good to see that they have finally put a brake on some of their ambitions with the opt-out. What we need is real action. Let us hear more about the dynamic licensing and rights enforcement system that the Minister continually goes on about. If we have to have legislation, it has to be done quickly. We have not got years left to do all this. It could take up to 12 months just to get the reports that the Government mention in their amendments. Beyond that, legislation has to be designed, presented to Parliament and debated here, and then it has to be implemented. We have not got the years to do that. Every day that we delay, more of our cultural heritage is scraped, reused and monetised by tech giants. We cannot afford to sit back and wait for years. Our creators deserve protection now, not after irreversible harm has been done.
The amendments that we have before us today offer that protection. That is what the creative sector wants, and I hope that the Government will take the amendments seriously. Our creators simply ask to be treated fairly and compensated for all the work that they do, and they want to have the right to control how their work is used.
Transparency is where we now hoist our flag, and it is the hill we fight on. It is the most practical solution, and there is no technical reason why we cannot have full transparency. There is no reason why artists, creators and inventors should not know that their work is being scraped by generative AI bots. There is no reason why we cannot make sure that we deal with this and make sure that all our musicians have the right to assert their rights. The only thing stopping that is a fear of angering the tech lobbyists, but we must take seriously the property rights of 2.4 million creatives who contribute £126 billion to the UK economy every year.
John McDonnell (Hayes and Harlington) (Ind)

The right hon. Gentleman has rightly referred to creatives throughout the debate. As I have said in earlier debates, I am the secretary of the parliamentary group of the National Union of Journalists, and we have expressed our concern about journalists and photographers, whom we also represent. The union position is very straightforward, espousing adherence to copyright but also to openness and transparency, and regulation of the mechanisms that will be used in future for scraping in particular. This is now having an impact on the quality of journalism, on which our democracy rests.

Pete Wishart

The right hon. Gentleman is right to remind us about journalism. What has been notable, along with the clumsy way in which the Government have approached these issues, is the unity that exists throughout the creative sector, taking on board what is happening in journalism. We have seen some fantastic coalitions of interests emerging from all this. That is another positive development, and I just hope that the Government are satisfied when they see the outcomes.

AI and creativity can work together. I gave an example of that to the hon. Member for Mid Sussex (Alison Bennett). We have all been encouraged to think that there is a divergence between the position of those in the creative sector, such as artists, and the position of those who are involved in the tech sector and, in particular, AI. There should be an approach that works for everyone involved. The AI companies know that our content is immensely valuable. They refuse to pay for anything at present, not because they do not understand the value but because they have spotted an opportunity to hoodwink Governments around the world into believing that they should not have to pay for an essential resource.

Ms Polly Billington (East Thanet) (Lab)

One of my colleagues had a conversation with representatives of the AI sector. She was very enthusiastic about the idea that there would be an enormous amount of growth in this country if they were able to adopt what they wanted, so she asked, “What would you like?” They replied, “We want the BBC archive, for free.” In circumstances of this kind, we need to think not only about transparency but about the second stage, which is licensing. Without the opportunity for small creators to have the power to permit and therefore to be paid, all this will be for nothing.

Pete Wishart

The hon. Lady is spot on. The Minister continues to go on about licensing arrangements, and I think that is the territory we want to move this on to. We need to hear more about the Government’s ambitions right now, and about what they are planning to do. The hon. Lady should have a look at the submissions to the consultation from the big tech companies such as OpenAI—it is a horror show. An opt-out is even too far for them.

I have enjoyed working with Labour colleagues during these debates. They have said all the right things, and I think that, as usual, they recognise some of the difficulties in the sector, but I appeal to them now to support, in particular, new clause 2, if it goes to a vote. It is no good just saying all the right things; this is about voting in the right direction. There is no other chance, because this is the only opportunity. We must offer some protection to our creative sector over the next few years, because nothing else will appear during that time. We will all become involved in the consultation and we will all be taking part in the legislation when it comes here, but that is years away. This is the only thing that we can do to offer some support to the creative sector, and I urge everyone to support the new clauses.

Dr Allison Gardner (Stoke-on-Trent South) (Lab)

I welcome the opportunity to speak in support of the Bill and to address some of the amendments proposed, particularly Government new clauses 16 and 17.

New clause 17 is entitled “Report on the use of copyright works in the development of AI systems”. I am pleased to note, in subsection (3)(b), that the report will consider

“the effect of copyright on access to, and use of, data by developers of AI systems (for example, on text and data mining)”.

I also note that “developers” are specifically broken down into

“individuals, micro businesses, small businesses or medium-sized businesses”.

It is right to provide for that level of granularity. Similarly, I note that the report will

“consider, and make proposals in relation to… the disclosure of information by developers of AI systems about”

their use of copyright data to develop AI systems and “how they access” that copyrighted data,

“for example, by means of web crawlers”.

I am pleased to see discussions of licensing included in the report, and an exploration, again in granular detail, of the impact of a licensing system on all levels of developers. However, I would have liked to see an equal level of granularity for copyright owners to understand the effects of proposals outlined in subsection (3). Subsection (4) states that

“In preparing the report, the Secretary of State must consider the likely effect of proposals, in the United Kingdom, on… copyright owners”

as well as developers and users of AI systems. Although I note that new subsection (4) refers to individuals, microbusinesses and so on, I feel that there is a little vagueness as to whether this level of granularity is afforded to copyright owners as well.

Chris Bryant

That is not intentional. It is exactly the same level of granularity that we will go into in our reporting.

Dr Gardner

Well, I will just throw the rest of my speech away, then. I shall persevere. Will the report explore the effects of the proposed solutions and the resulting protections on individual creators?

Dr Gardner

Micro creative businesses?

Dr Gardner

And small publishers?

Dr Gardner

Right, so can I push it further?

Dr Gardner

There seem to be an awful lot of David Attenborough TikTok videos, but it is not him. I wonder whether this measure will apply to personality rights, and about the definition of a “small rights owner”. I will just squeeze that in.

Chris Bryant

Personally, I am in favour of doing something about personality rights, but it is one of the things that is in the consultation, to which we will respond. It is one of the things for which we will need to legislate in the round.

Dr Gardner

Perfect.

I asked the Secretary of State what reassurances can be given that smaller creatives, including microbusinesses and small creative businesses, will be considered in the report so that they can have confidence that the systems finally applied will work for them, particularly when we consider an individual’s early career—think of Ed Sheeran strumming away in his bedroom in his pre-fame days—and how they can protect their copyrighted works against the global tech giants.

New clause 16 addresses the economic impact on both copyright owners and AI developers, and I want to switch from talking about copyright owners to trying to defend the AI industry. If we do not get the controls right, we risk the medium and long-term success of the AI industry. If we do not get a fair solution for the creative and AI industries, we risk a reduction in the quantity, and potentially in the quality, of human-created data and an increase in AI-generated creative data.

I will briefly segue, because we are developing a lot of AI-created content that might be subject to copyright. A report recently pointed out that 18% of Spotify content is now AI-generated. People might remember the big hoo-ha when an AI-generated image won a photographic competition, which caused a lot of disturbance, but a lot of creative skill was involved in how the photographer developed and produced that image. No, it was not a photograph, but it is in a category of its own. I feel that is also creative content and copyrighted data, so there is a grey area.

If we start to generate more and more AI-created data and less and less high-quality human-generated data, because of the challenges to the creative industry, there is a danger that AI models will start scraping and training on AI-generated data, potentially leading to a reductive spiral into mediocrity, with some even suggesting that this could result in model collapse. On new clauses 16 and 17, I encourage the House to consider the impact of not employing proposals such as licensing and protecting the generation of new human-created content, given the risks posed to AI models and developers in the long term.

I will briefly comment on amendments 37 and 38, tabled by my hon. Friend the Member for Newcastle upon Tyne Central and West (Chi Onwurah). She ably outlined the reasons for the amendments, so I will not go into a lot of detail, but I want to point out that getting the definitions correct will prevent a loophole whereby AI companies can misuse personal data by claiming that their commercial development is scientific research. The amendments would provide transparency on the use of data by researchers in order to maintain confidence in this country’s ethical, legal and professional high standards in academic research. I hope the Minister will give careful consideration to the points I have raised.

I am now going to give my Whip, my hon. Friend the Member for Cardiff North (Anna McMorrin), a heart attack because I am going to refer to amendments 41 to 46 to clause 80 on article 22 of the UK GDPR, which is close to my heart. She is not to worry, though; I read those amendments with great interest and I understand the back-up they would provide. Although I am a newbie MP, as I read them, and given the little work I did with a regulator in my previous job, I felt that they were more like secondary legislation. They could be considered for the future, particularly amendment 46, which includes some very welcome additions. However, when it comes to primary legislation, I feel that the Bill works better as it stands.

16:29
I want to go off-piste a little bit more to support my hon. Friend the Member for Birmingham Edgbaston (Preet Kaur Gill) in trying to make sure that we have accurate data and fill in the gaps that exist in databases. It is important that we address that issue, and her new clauses 22 and 23 are worthy of consideration.
Equally, I approached new clause 21 with an open mind, because it is vital that we collect biological sex data to protect women and trans people, but as I read it I had a developing sense of unease—because how does the determination of accuracy of data impact on the individual, and if we start looking at those two protected characteristics, what about the others? I feel it is a little bit of a slippery slope; I wonder if I would have to go around with my baptism certificate to prove my religion, and how would I prove my sexuality? I am afraid I developed a growing unease about that new clause, but I support the idea of accurate data collection for both gender identity and biological sex, which is very important.
I will not bore the House with information about where the alcohol dehydrogenase enzyme sits in male and female bodies, but there is a reason why men do not get drunk, or rather why women get drunk more easily. Oh, I am going to tell everyone, because I have started to do so—I will make it relevant. ADH in men is located and primarily active in the stomach, while in women it tends to be in the liver, so in men, alcohol gets broken down before it even gets into the bloodstream; it has nothing to do with body mass ratio. Understanding biology in that way highlights the importance of getting accurate data for health and scientific research, and with that extra knowledge about alcohol dehydrogenase enzymes, I will leave it there.
Several hon. Members rose—

Madam Deputy Speaker (Judith Cummins)

Order. From the next speaker, there will be a five-minute time limit.

Max Wilkinson (Cheltenham) (LD)

As many Members will be aware, my constituent Ellen Roome knows only too well the tragedies that can take place as a result of social media. I am pleased that Ellen joins us in the Gallery to hear this debate in her pursuit of Jools’ law.

In 2022, Ellen came home to find her son Jools not breathing. He had tragically lost his life, aged just 14. In the following months, Ellen battled the social media giants—and she is still battling them—to try to access his social media data, as she sought answers about what had happened leading up to his death. I am grateful to the shadow Minister, the hon. Member for Runnymede and Weybridge (Dr Spencer), for raising this in his speech. In her search for answers, Ellen found herself blocked by social media giants that placed process ahead of compassion. The police had no reason to suspect a crime, so they did not see any reason to undertake a full investigation into Jools’ social media. The inquest did not require a thorough analysis of Jools’ online accounts. None of the social media companies would grant Ellen access to Jools’ browsing data, and a court order was needed to access the digital data, which required eye-watering legal fees.

The legal system is unequipped to tackle the complexities of social media. In the past, when a loved one died, their family would be able to find such things in their possession—perhaps in children’s diaries, in school books or in cupboards. However, now that so much of our lives is spent online, personal data is kept by the social media giants. New clause 11 in my name would change that, although I understand that there are technical and legal difficulties.

The Minister and the Secretary of State met Ellen and me this morning, along with the hon. Member for Darlington (Lola McEvoy), and we are grateful for the time they gave us. My new clause will not go to a vote today, but we will keep pushing because Ellen and other parents like her should not have to go through this to search for answers when a child has died. I understand that there are provisions in the Bill that will be steps forward, but we will keep pushing and we will hold the Government’s and all future Governments’ feet to the fire until we get a result.

Chris Bryant

It was great to meet this morning, although I am sorry it was so late and so close to Report stage; I wish it had been earlier. We were serious in the meeting this morning: we will do everything we possibly can to make sure that coroners understand both their powers and their duties in this regard, and how they should be operating with families and the prosecuting authorities as well if necessary. We will also do everything we can to ensure that the technical companies embrace the point that they need to look after the families of those who have lost loved ones when they are young.

Max Wilkinson

I thank the Minister for his intervention. He is absolutely right. There are clear issues of process here. There are differential approaches across the country—different coroners taking different approaches and different police forces taking different approaches. The words of Ministers have weight and I hope that coroners and police forces are taking note of what needs to happen in the future so that there are proper investigations into the deaths of children who may have suffered misadventure as a result of social media.

On related matters, new clause 1 would gain the support of parents like Ellen up and down this country. We need to move further and faster on this issue of social media and online safety—as this Government promised on various other things—and I am pleased that my party has a very clear position on it.

I will now turn to the issue of copyright protections. I held a roundtable with creatives in Cheltenham, which is home to many tech businesses and AI companies. The creative industries in my town are also extremely strong, and I hear a lot of concern about the need to protect copyright for our creators. The industry, which is worth £124 billion or more every year, remains concerned about the Government’s approach. The effects of these issues on our culture should not be understated.

We would be far poorer both culturally and financially if our creatives were unable to make a living from their artistic talents. I believe there is still a risk of the creative industry being undermined if the Government remove protections to the benefit of AI developers. I trust that Ministers are listening, and I know that they have been listening over the many debates we have had on this issue. If they were to remove those protections, they would tip the scales in favour of AI companies at the cost of the creative industry. When we ask AI companies and people in tech where the jobs are going to come from, the answers are just not there.

The amendments tabled by my hon. Friend the Member for Harpenden and Berkhamsted (Victoria Collins) would reinstate copyright protections at all levels of AI development and reinforce the law as it currently stands. It is only fair that when creative work is used for AI development, the creator is properly compensated. The Government have made positive noises on this issue in multiple debates over the last few months. That is a positive sign, and I think that in all parts of this House we have developed a consensus on where things need to move—but creatives remain uneasy about the implications for their work and are awaiting firm action.

Ministers may wish to tackle this issue with future action, and I understand that it might not be dealt with today, but our amendments would enable that to happen. They also have an opportunity today: nothing would send a stronger signal than Government support and support from Members from across the House for my hon. Friend’s amendments, and I implore all Members to back them.

Jonathan Davies (Mid Derbyshire) (Lab)

I rise to speak to new clauses 4, 16 and 17, but first let me say that this is a very ambitious and weighty piece of legislation. Most of us can agree on sections or huge chunks of it, but there is anxiety in the creative industries and in the media—particularly the local media, which have had a very torrid time over the last few years through Brexit and the pandemic. I thank UK Music, the News Media Association and Directors UK for engaging with me on this issue and the Minister for his generosity in affording time to Back Benchers to discuss it.

AI offers massive opportunities to make public services and businesses more effective and efficient, and this will improve people’s lives. However, there is a fundamental difference between using AI to manage stock in retail or distribution, or for making scientific breakthroughs that will improve people’s health, and the generative AI that is used to produce literature, images or music. The latter affects the creative industries, which have consistently seen faster and more substantial growth than the overall economy. Between 2010 and 2022, the creative industries’ gross value added grew by over 50% in real terms, while the overall UK economy grew by around a fifth. That is why the Government are right to have identified the creative industries as a central plank of their industrial strategy, and it is right to deliver an economic assessment within 12 months, as outlined in Government new clauses 16 and 17. I welcome all that.

I know it is not the Government’s intention to deal with copyright and licensing as part of the Bill, but because of the anxiety in the sector the issues have become conflated. Scraping is already happening, without transparency, permission or remuneration, in the absence of a current adequate framework. The pace of change in the sector, and the risk of tariffs from across the pond, mean it is imperative that we deal with the threat posed to the creative industries as soon as possible. We are now facing 100% tariffs on UK films going to the USA, which increases that imperative.

I welcome the Government’s commitment to engage with the creative industries and to implement a programme to protect them, following consultation. I would welcome an overview from the Minister in his summing up about progress in that regard. The more we delay, the worse the impact could be on our creative sector. I am also concerned that in the Government’s correct mission to deliver economic growth, they may inadvertently compromise the UK’s robust copyright laws. Instead, we should seek to introduce changes, so that creatives’ work cannot be scraped by big AI firms without providing transparency or remunerating the creatives behind it. Failure to protect copyright is not just bad for the sector as a whole, or the livelihoods of authors, photographers, musicians and others; it is bad for our self-expression, for how robust the sector can be, and for how it can bring communities together and invite us to ask the big questions about the human condition. Allowing creators to be uncredited and undercut, with their work stripped of attribution and their livelihoods diluted in a wave of synthetic imitation, will disrupt the creative market enormously. We are not talking about that enough.

It is tempting to lure the big US AI firms into the UK, giving the economy a sugar rush and attracting billions of pounds-worth of data centres, yet in the same breath we risk significantly draining economic value from our creative industries, which are one of the UK’s most storied pillars of our soft power. None of this is easy. The EU has grappled with creating a framework to deal with this issue for years without finding an equitable solution. I do not envy what the Government must navigate. However, I ask the Minister about the reports that emerged over the weekend, and whether the Government are moving away from an opt-out system for licensing, which creatives say will not work. Will that now be the Government’s position?

Harnessing the benefits of AI—economic, social and innovative—is not diametrically opposed to ensuring that the rights of creatives are protected. We must ensure transparency in AI, as covered in new clause 4, so that tech companies, some of which are in cahoots with some of the more troubling aspects of the US Administration, do not end up with the power to curate an understanding of the world that reflects their own peculiar priorities. Big AI says transparency will effectively reveal its trade secrets, but that need not necessarily be the case, as my hon. Friend the Member for South Derbyshire (Samantha Niblett) said. A simple mechanism to alert creators when their content is used is well within the abilities of these sophisticated companies. They just need the Government to prod them to do it.

The Government are working hard. I know that they care passionately about the sector, and the economic and social value it brings. I look forward to hearing how they will now move at pace to address the concerns I have outlined, even if they cannot do so through the Bill.

Sir John Whittingdale

The Minister referred, in his opening remarks, to the fact that the Bill has been a long time in its gestation. It is very nearly two years since the first meeting of the Bill Committee, which I attended, to take through what was pretty much an identical Bill. At that time, it was uncontroversial and the Opposition supported it—indeed, I support it today. There are a lot of measures we have not discussed because they are universally accepted, such as the national underground asset register, smart data provisions and the relief on some of the burden of GDPR.

I congratulate Baroness Kidron, who very successfully attached to the Bill amendments to address a different, but vital issue: protection of the creative industries with respect to copyright. Therefore, I support new clauses 2 to 6, which are essentially Baroness Kidron’s amendments that were passed in the House of Lords. The Minister said that it was not the intention to legislate at this time, that the Government want to wait and are consulting, and that they have tabled two amendments. However, one of the measures is to conduct an economic impact assessment, which the Government would always have had to do anyway, and the other is to commission a report into such things as technical standards and transparency. As the hon. Member for Perth and Kinross-shire (Pete Wishart) has pointed out, that will simply delay things even longer, and this is an issue that must be addressed now, because generative AI models are currently scraping and using material.

In our view, the law is clear. The Minister asks why we need new clause 2 if all it says is that people should obey the law, and if we also believe the law is clear. One of the reasons it is so important is that we can enforce the law only if we know that it is being broken. That is why transparency is absolutely vital; it is only with transparency that rights owners can discover the extent to which their content is being used by generative AI, and then know how to take action against it.

I absolutely agree that it is not that the creative industries are against artificial intelligence. Indeed, a lot of creatives are using it; a lot of them are developing licensing models. However, for some, it is an existential question.

16:44
Iqbal Mohamed (Dewsbury and Batley) (Ind)

On the point about transparency, the law is the law—it already exists. However, the law can be enforced, and people can be punished, only if actions that break our current laws come to light. Does the right hon. Gentleman agree that this is another reason that new clause 2 is essential?

Sir John Whittingdale

I completely agree. The hon. Gentleman has stated the case: in order to enforce the law, we have to know who is breaking it.

There are all sorts of legal actions already under way, but this issue is about the extent to which scraping is going on. I agree with the right hon. Member for Hayes and Harlington (John McDonnell) on the importance of newspapers and the press. The press face the particular problem of retrieval-augmented generation—a phrase I did not think I would necessarily be introducing—which is the use of live data, rather than historic data; if historic data is used, it often produces the wrong results. The big tech companies therefore rely on retrieval-augmented generation, which means using current live data—that which is the livelihood of the press. It is absolutely essential for publishers that they should know when their material is being used and that they should have the ability to license it or not, as they choose.

John McDonnell

The issue the right hon. Gentleman is addressing is the immediacy of the threat within the journalistic sector at the moment. I missed the opening remarks by both Front Benchers because I was in the debate on the personal independence payment, but I am sure my hon. Friend the Minister was as eloquent as ever in advocating for the Government amendments; he is a very persuasive fellow. However, those amendments are merely about publishing a report in 12 months’ time—that is all. There will be parts of the journalistic sector that will no longer exist in 12 months’ time as a result of this legislation.

Sir John Whittingdale

I completely agree. I do not doubt the Minister’s sincerity in wanting proper close examination, but this matter is urgent. New clause 2 and the associated measures simply state the law as it currently stands and give rights owners the essential ability to know when their material is being used, so that they can choose whether they wish to license it, and, if they do not, to take action against its use.

There is only one other point I want to raise today, as a number of speeches have been made in this debate that have very eloquently set out the case for each of the new clauses, including by the Liberal Democrat spokesperson, the hon. Member for Harpenden and Berkhamsted (Victoria Collins), and indeed by the Chair of the Culture, Media and Sport Committee, my hon. Friend the Member for Gosport (Dame Caroline Dinenage). For the other concern that I want to raise, the Minister will need to put his other hat back on for a moment. Earlier in the day, he was speaking as the Minister for Creative Industries, Arts and Tourism about the threat from the possibility of tariffs on the film industry. Obviously, we are concerned about the general question of US tariffs, and there is talk about trying to achieve a trade deal—in the President’s words, a “beautiful trade deal”—which would mean that the UK was protected. However, we are told that one of the prices that could be attached to such a deal could be relieving the burden of regulation on tech companies.

I am afraid that we know how the tech companies define burdensome regulation. In their view, copyright is a burdensome regulation, not a legal obligation or moral right of rightsholders. I hope the Minister will make it clear that we will not sacrifice the rights of creative industries and copyright owners in order to obtain a trade agreement and that, at the same time, we will not dilute other, very important digital legislation, such as the Digital Markets, Competition and Consumers Act 2024, which I understand is also potentially on the table.

I will not speak any longer, because the case has already been made. I will say only to the Minister that although it is clear that new clauses 2 to 6 command quite a lot of support on both sides of the House, I have no doubt that the Government will defeat them if they choose to do so tonight. However, he will be aware that they were originally made in the House of Lords, and he may find it harder if that House chooses to push the amendments through. I would not like to be back here next year once again trying to put through a data Bill because this one has failed.

Jen Craft (Thurrock) (Lab)

I rise to support the Government’s amendments and new clauses, particularly new clause 16, which addresses the relationship between artificial intelligence and copyright and which I strongly welcome. By slightly broadening the scope of the Bill, the amendments demonstrate Ministers’ attention to this pressing detail and reflect some of the comments by colleagues and the creative sector.

The existing legal framework with regard to copyright is not fit for purpose in the face of new and developing AI technologies. Colleagues who have much greater expertise and knowledge than me have contributed to this debate, but I want to offer a reflection and draw attention to the experience of an individual—one of my constituents—as I believe it highlights the real human impact that big tech companies can have in running rampant over copyright laws.

My constituent, Susan, is an author. She has had 32 of her books and, she calculates, more than 1 million published words used by Meta without her consent. The pirating of material has serious human impacts on those in the creative industries. Susan’s life’s work and source of income were downgraded and devalued almost instantaneously. Her intellectual property was accessed without her permission and used to inform an AI system designed to mimic her work. Susan described that to me and said that she felt violated, as if someone had come into her house and stolen her things, and she is not alone.

I have been contacted by other professionals in the creative industries in my constituency who have also had published material used without their consent by AI. A local author has had their works harvested through an online library of pirated books, and a local illustrator said that her work was scraped to train an AI model with images and videos taken from websites and social media without her permission. That practice is widespread and plainly wrong, even to a lay observer who is not versed in the technicalities, yet rightsholders are often impotent against big tech companies and their sizeable financial and legal assets.

Ms Billington

I observe that there is an issue of territoriality here. We have actually managed to get carve-outs and protect this country from deepfakes, for example; if something is made abroad, it cannot be used in this country. Does my hon. Friend agree that we should be able to have similar carve-outs for creatives, such as her constituent and my constituents, so that if AI-generated material is made elsewhere, it cannot be deployed in this country, in order to preserve a proper legislative framework to protect the rights of our creatives?

Jen Craft

My hon. Friend highlights a very strong issue. I agree that our current copyright laws are basically being infringed on and people who are rightsholders are unable to seek the recourse that they fully deserve under the law. There should be a carve-out, so that if there is illegal content in this country, people should have recourse to the law and be able to protect their own copyrighted material. I am pleased to see the Government commit to action on this complex issue. I hope that time will be allowed in the House for us to scrutinise this issue and to investigate properly the impact of policy options, which will be considered as part of the consultation.

I understand the complexities of legislating in this area, but those in the creative industries want to see action now, which is understandable. We must create a system that can feasibly and effectively enforce existing copyright law, bring transparency to the use of materials by AI systems, and remunerate rights holders. I support the Government’s plans to do this through primary legislation with proper scrutiny of the measures, rather than through an addendum to a broader piece of legislation. I appreciate that there is a balance to be struck, where growth is supported in both the creative and tech industries, but creatives must never be expected to forfeit their rights to serve that purpose.

As my constituent is at pains to point out, real people and real livelihoods are already being impacted by unregulated AI. It is crucial that we get this right, and provide much needed legal certainty to protect intellectual property in the creative industries. This must happen soon, because, while infringements of copyright law go unaddressed, it is those in our vital creative industries who are losing out.

Several hon. Members rose—

Madam Deputy Speaker (Judith Cummins)

Order. Many people wish to speak in this debate, so before I call the next speaker I ask Members please to be mindful when taking interventions. I will now impose a four-minute time limit.

Vikki Slade (Mid Dorset and North Poole) (LD)

We live in a rapidly changing world. Like everyone else, I am sure that I am guilty of handing my data to organisations every hour of every day, oblivious to the impact on my privacy. I am also guilty of absorbing and using content assuming that it is trustworthy and that it has been obtained fairly.

On the other hand, my generation has been fortunate to have seen the introduction of social media and the online world, and to have experienced the time before it, which perhaps provides us with a level of scepticism about what we see, and an ability to switch it off and distance ourselves from the onslaught to our senses that digital content can provide.

Like other interventions of the past, we are now at a crossroads where we must pause and not simply plough on. The Bill gives us the opportunity to make it clear to the tech giants that we are not giving them everything that we have created, that they cannot own our children, and that we value our data as part of our identity.

Some of the amendments give us a great opportunity to improve the Bill—to make the most of this moment in time and to make sure that we do not leave people behind. We know that children’s brains continue to develop until they are in their early 20s. We know that young people’s development leads them to be risk takers in their adolescence and teenage years, and, as adults, we sometimes have to take decisions to curtail their fun to protect them. My own children have enjoyed social media from the age of 13, but, as the sector develops, and our understanding of its addictive nature improves, it is critical that we reflect that in law. Lifting the age of consent for social media data collection, as in new clause 1, will help to protect our children at the time they need it.

It is unimaginable to lose a child and to do so in the circumstances where the reasons behind their death are unclear, which is why I signed new clause 11 tabled by my hon. Friend the Member for Cheltenham (Max Wilkinson), which would allow bereaved parents access to their child’s social media content. This should not be necessary given that GDPR and privacy rights do not apply to those who have died. The fact that we even need such legislation calls into question the motivation of tech giants and tells us where their interests lie. I urge the Government to support this and welcome the assurance today that more work will be done.

Trust is at an all-time low not only in the Government but in other authorities such as the NHS. As AI changes how we interact with the state, commerce and each other, the public should have a right to know how and when AI is involved in the decisions made. Transparency matters, which is why I am supporting the new clauses proposed by my hon. Friend the Member for Harpenden and Berkhamsted (Victoria Collins). We know that if we use each other’s content we must pay for it, or at least credit it if we are not profiting from it. We know that if we do not, we infringe that copyright, so why should tech giants, probably based in some far-flung place, have a right to scrape that content without knowledge or payment? The idea that they even need to train their systems off the backs of people who have used their talent and time and made their living through creativity is obscene.

I really must speak strongly against new clause 21. I have been overwhelmed by the scale of distress brought about by this awful proposal. It is cruel and it completely undermines the privacy of people who are transgender at a time when they are already feeling victimised.

Those who have transitioned socially, medically or surgically are protected in law, and we were told that the Supreme Court decision last month does not change that. But new clause 21 does. If it were passed, sex at birth would be recorded on a driving licence or passport, outing every trans person whenever they buy an age-restricted product, change their job, travel abroad, or even come to Parliament to visit their MP. Not only is this a fundamental breach of privacy, but it is potentially dangerous. They would be prevented from travelling to countries with poor records on rights, and they would be at higher risk of suicide and self-harm than they already are. A constituent said,

“This is a direct attempt to erase me from the public record.”

Please reject new clause 21.

17:00
Ms Creasy

In the short time available to me, I want to speak to four amendments. On two of them, I would like to urge the Minister to think again. On one, I am in total agreement with the Minister that we should oppose it; the other is one that I want to draw to the House’s attention.

First, I join the Chair of the Culture, Media and Sport Committee, the hon. Member for Gosport (Dame Caroline Dinenage), the Chair of the Science, Innovation and Technology Committee, my hon. Friend the Member for Newcastle upon Tyne Central and West (Chi Onwurah), my hon. Friend the Member for South Derbyshire (Samantha Niblett) and the indomitable Baroness Kidron, who joins us today from the Gallery, in encouraging the Minister to look again at amendments on AI and copyright. We know that this problem will come back and that we need to move at pace.

I represent Walthamstow, the home of William Morris, the creators and makers—and creatives abound. At least William Morris could protect his wallpaper patterns. With the AI technologies we see now moving so quickly, unless we stand up for British copyright, we will be in a very different place. The Minister says that if we do not pass new clause 2, we will still have copyright law tomorrow, and he is right, but we will not have the tools to deal with the technology we are dealing with now.

This issue is about not just the Elton Johns, the Ed Sheerans, the Richard Osmans or the Jilly Coopers, but the thousands of creators in our country—it is their bread and butter. Nobody is opposing technology, but they are saying that we need to act more quickly. I hope to hear from the Minister what he will do in this area. New clause 14, which has not been selected, is about the question of transparency and would help creatives exercise their rights.

Briefly, I want to support what the hon. Member for Mid Dorset and North Poole (Vikki Slade) said about new clause 21. I have always supported the appropriate collection of data, but this is not an appropriate collection of data. It is a targeting of the trans community, which is deeply regressive.

I praise the Government for what they are doing with schedule 11—and I wager that nobody else in this Chamber has looked at it. The Victims and Prisoners Act received Royal Assent in May 2024. Section 31 of the Act provides a mechanism to delete data that has been created as part of a malicious campaign of harassment. Schedule 11 is a technical amendment to GDPR laws that will make that Act, which got cross-party support, possible to enact.

For parents and carers, the thought that someone who disagrees with them might use the auspices of social services to try to remove their children because of that disagreement is impossible to comprehend. It is a nightmare that I have lived through myself. Thanks to my local authority, I am still living through it, because the record created by the person who did this to me remains on the statute book, along with the allegation that I am a risk to my children because of the views that I hold.

The primary intent of the man who made this complaint was to trigger an investigation into my private life. The judge who convicted him of harassment said that it was one of the worst examples of malicious abuse in public life that he had seen. The judge demanded that the file be stricken, as did I when it first came to light and when the man was subsequently convicted of harassment. However, Waltham Forest council continue to argue that they have to retain that data to protect my own children from me. This is an example not of how data is used to safeguard but how data can be used to harm by its existence. It is not a benign matter to have such a record associated with one’s name. Anyone who has ever been to A&E knows that the question, “Is your child known to social services?” is not a neutral inquiry. Not having a way of removing data designed to harass will perpetuate the harassment.

My local authority has not labelled the fathers who are MPs in my borough in the same way, but it argues that it must retain this data about me under section 47 of the Children Act 1989, regarding children who might reasonably be considered at risk of harm from an individual. To add insult to injury, the council has not offered to delete this data but told me that I can add to it a note to dispute the claims by the person who has been convicted of harassing me about my fitness to be a parent, and then the council might consider including the note—add more data to a file, therefore, rather than remove it. That will keep the link between me, my family, these allegations and the gentleman who harassed me in the first place. I have never received any form of apology or acknowledgement.

There have always been strong grounds and legal processes to remove malicious records. It is also right that we set a high bar, as the 2024 Act did. This consequential amendment in the Bill should now mean that the Government can use the affirmative resolution to make that law a reality. We cannot delete the misogyny at the heart of Waltham Forest council’s response, but we could finally delete the records and those of others like them and move on with our lives—

Madam Deputy Speaker (Judith Cummins)

Order. I call Steff Aquarone.

Steff Aquarone (North Norfolk) (LD)

My new clause 7 would ensure that, alongside the creation of a digital verification framework, there would be a right to use non-digital ID. Digital exclusion is a major challenge for many communities around the country, including in North Norfolk. Part of the answer is to improve digital education and bring those numbers down, but, as Age UK rightly says,

“it will never be possible to get everyone online.”

The progress we make in the digital age must ensure provision for those who will not be supported by it, so that they are not left behind or excluded.

Older people are not the only ones who struggle with digital exclusion—poverty is also a significant driver. A study in 2021 showed that more than half of those who are offline earned less than £20,000 a year. The Government told the Lords that if it turned out that people were being excluded, they could consider legislating, but how many people earning less than £20,000 a year will be taking a case through the courts—perhaps as far as the Supreme Court—to secure their rights? Why are we waiting for it to go wrong, placing the onus on vulnerable people to generate test cases and legal precedent when we could put this matter to bed once and for all with this simple addition to the Bill?

I will also speak in support of new clause 1. It has become abundantly clear to us all that we cannot trust the social media giants to keep our children safe. In fact, I would go as far as to say that they have very little interest in keeping children safe. The algorithms that drive these platforms, which are designed to keep users scrolling for as long as possible to maximise ad revenue, can be deeply damaging to children and young people. It is important to emphasise just how pervasive the content stream can be. Not every hon. Member may have experienced it, but pervasive, targeted content is not the same as a child seeing something distressing on the news. Once seen—if only fleetingly—there is the potential for them to be exposed to unsubstantiated, misleading or even traumatic content, or versions of that content, over and over again every few swipes as the algorithm realises it can suck them in, keep them scrolling and make profit for the social media giants. That is not what social networks set out to do, but it is what they have become.

Whatever the social media giants told the Government or the Opposition, whether “It is too complex,” “It would cost too much,” or, “The flux capacitor is not big enough,” that is just rubbish. If we simply removed the right to process personal data for under-16s, we would remove the algorithms’ ability to target them with content based on what they say and do. If the social networks cared about children’s wellbeing, they would have done that already. I hope that today we will finally take the action necessary to protect the next generation.

Overall, my views on the Bill remain broadly similar to the frustrations I expressed months ago on Second Reading. There is important, commendable and sensible stuff in the Bill, and I welcome that, but what is not in the Bill is more frustrating, as including it could have put the Bill in a much better position to harness the power of data. We could have addressed the litany of failures in public sector data use that the Government’s own review outlined just months ago. We could be equipping our civil service and public sector with the talent, culture and leadership to make us a global trailblazer in data-driven government. It is really frustrating that the Bill does not contain any of the steps necessary to make those improvements.

If we use data better, we do government better. I am sure that the whole House and all our constituents are keen to see that.

James Naish (Rushcliffe) (Lab)

As all hon. Members in the Chamber know, data is the DNA of our modern life. It drives our economy, our NHS and our public services, often silently but ultimately powerfully. For too long, outdated data infrastructure across the British state has held us back, costing us billions and draining frontline resources. This ambitious Bill sets out to change that, so it should be welcomed.

For our NHS, the Bill will mean faster and safer care. More transferable and accurate patient data could save 140,000 hours of staff time annually, reduce duplicate lab tests, prevent up to 20 deaths each year from medication errors and improve overall patient safety and outcomes—that is real impact. It will also free up 1.5 million hours of police time, letting officers spend more time on our streets, not behind desks. The national underground asset register, as has been mentioned, could bring £4 billion to our economy by preventing costly infrastructure delays and accidents. The Bill will seek not just to cut red tape, but boost research, protect personal data and allow scientists and online safety experts to better access information to keep us all safer. I welcome that.

On Government amendment 16 on artificial intelligence and the creative industries, we have all seen the potential of AI, but that promise must not come at the expense of those who create. In that regard, I thank the dozens of constituents who have contacted me about their concerns. I welcome the Government’s recognition of the complex intersection between AI and copyright, and the need to get this right. It is clear that we must tread carefully and base any changes on robust evidence. I am therefore pleased that the Government are committed to publishing a full economic impact assessment and reporting to Parliament on key concerns, such as how AI developers access copyrighted materials, the transparency of their methods and licensing. I will also continue to support Equity with its work on the principle of personality rights.

New clause 21 proposes mandatory recording of sex at birth across all public authorities. The new clause would require all public authorities, whether the NHS, which I could potentially understand, or the Driver and Vehicle Licensing Agency, which I certainly could not, to record and retain people’s sex at birth even when someone has a gender recognition certificate. The new clause would seemingly require that regardless of context, purpose or relevance. That feels neither proportionate nor respectful of existing legal frameworks or the trans community at this difficult time.

It is important that we acknowledge that transgender, non-binary and intersex people already face considerable barriers in public life, and many of my constituents have shared with me in recent weeks just how much fear and uncertainty they are experiencing. Rushed amendments and changes, without dialogue with those impacted, are not in any way welcomed and could have very negative consequences.

Finally, on the theme of privacy, proportionality and protecting vulnerable people, will the Minister say whether his Department will take any steps to end the collection of sensitive personal data when people use police, public, university or other websites to report crimes or abuses, and the subsequent sharing of that data with third parties through tracking pixels? The use of such tools means people inadvertently share information with advertisers. I hope the Government will look into that and take it seriously.

Sarah Olney (Richmond Park) (LD)

I was glad to add my name to new clauses 1 to 6, tabled by my hon. Friend the Member for Harpenden and Berkhamsted (Victoria Collins).

I speak to new clause 1 on the age of data consent. Currently in the UK, the minimum age of digital consent—the age at which children can consent to having their data processed—is 13. The new clause would raise the minimum age for social media data processing to 16, meaning social media companies would not be able to process the data of children for their algorithms. I hope that that would allow children to enjoy many of the educational benefits and relevant services that social media can offer, as well as continuing to enjoy the freedoms of engaging with friends but without the risks of addictive algorithmic content.

The mental health issues associated with social media use in young people, particularly children, should be treated as a public health crisis. The Government are missing a vital opportunity with this legislation to reform how social media is used by not including provisions around children’s online safety. The Bill offers an important opportunity to start the process of removing harmful social media mechanisms and, as such, I urge the Minister to support the new clause.

New clauses 2 to 6 would ensure transparency in how AI systems are trained and give rights holders more control over the use of their works. Concerns about the impact of AI on the creative industries have been raised hundreds of times by my constituents, and I have been raising those concerns in Parliament for years. In my Westminster Hall debate back in February 2023 on artificial intelligence and intellectual property rights, I raised with the former Conservative Government my concerns about the bypassing of copyright laws in relation to AI and intellectual property. However, the former Minister blamed the former Government’s failure to introduce regulation protecting creatives from the rise of AI on the political turmoil in the Tory leadership at that time, and their neglectful attitude towards leadership was mirrored in their attitude towards introducing protection for creatives in this space.

Their conclusion from my Westminster Hall debate was

“not to legislate in periods of political turmoil”—[Official Report, 1 February 2023; Vol. 727, c. 163WH.]

and the need for “more deep consultation”, which did not materialise. I think we can all be glad that the chaos of that Government is in the past. However, we are yet to see the introduction of thorough and robust copyright laws relating to artificial intelligence, which is fundamental to the success of the UK’s world-leading creative industries. I hope this Government will act upon that today by accepting the new clauses.

I wish to reiterate the importance of these new clauses in ensuring that AI models are bound by existing copyright law, increasing data and identity transparency for crawlers and models, and empowering creators to take legal action against developers who fail to comply. The creative industry, like all sectors, will have to adapt to accommodate AI, but the industry is capable of that and is already making progress. Creatives have largely accepted that AI-generated content will have its place in the market, and they are already using AI to enhance their work by driving efficiencies and extending their reach to new markets. However, a solid regulatory framework, which could be created with the addition of these amendments, is essential to protect their rights and ensure that they can take part in value creation and retain control over their work. My colleagues and I believe that existing copyright law should be enforced to protect the UK’s creative industries, which are a world-leading British export.

17:22
It was recently revealed that nearly 200,000 YouTube videos, including material created by globally recognised British musicians, news channels and artists, had been scraped into a dataset used to train AI models. Content from over 40,000 creatives has been found in that dataset, yet consent was not sought from a single impacted creator to use their copyrighted works. This is not a unique case. I asked 200 creatives about their experiences with AI and copyright. I heard repeatedly of negative experiences, including one individual who had 600 images taken to train AI models without their knowledge. It is clear that AI offers a fantastic opportunity for our economy. However, it must supplement and grow industries, rather than replace them wholesale.

Alison Hume (Scarborough and Whitby) (Lab)

I am delighted to be called to speak on Report of the Data (Use and Access) Bill. I draw Members’ attention to my membership of the Writers’ Guild of Great Britain. Before I entered this place, I worked as a freelance screenwriter, creating dramas for adults and children. I might add that children are the hardest audience to please—it used to be that we had five minutes to hook them, but now it is more like five seconds. Speed is the subject of my contribution today.

I warmly welcome the Minister’s engagement on how best to protect our peerless creative industries. In that spirit, I am pleased to see new clauses 16 and 17 and the commitment to addressing the fundamental issue of transparency. At the moment, AI companies do not have to tell anyone what they are stealing from the internet, from whom they are stealing and why they are stealing it. Although I appreciate the Government’s position that they want more time, I worry that in the gap between this Bill becoming law and a new Bill that addresses transparency and copyright coming forward, everything that can be scraped will be scraped. Twelve months is a long time, and plenty of time for AI companies to continue crawling over original copyrighted material without a care in the world. For some parts of the creative industries, 12 months will be 12 months too long. Necessity is the mother of invention, and without a legal instruction for AI companies to reveal what they are using free of charge, there is surely no incentive for the AI industry to come up with the solutions to make it simple for original creators and collecting societies to assert their rights.

New clauses 2 to 6 include calls for the operators of web crawlers and AI models to legally disclose what they are doing right now. Although I understand why the Government may not support the new clauses, will the Minister at least commit to placing a clear power to regulate in the Bill? The creative industries are nervous, spooked by the previously stated preference for an opt-out model, and such a move would calm nerves and indicate that the Government understand the pace at which the situation is developing and recognise the need for action.

Recently, here in Westminster, Björn from ABBA spoke in favour of clear transparency. Perhaps the saddest ABBA song is “The Winner Takes It All”, inspired by break-ups in the band between the As and the Bs. We must ensure that this is not a divorce of two industries that leads to the creative partner being left with the equivalent of the coffee table and the dog. The tech industry needs us more than we need it, so it should be honest, tell us what it is doing behind our backs and pay up. When all is said and done, this Government need to send a message now that we have the backs of our creative industries and that legal protections are our absolute priority.

Siân Berry (Brighton Pavilion) (Green)

I rise to speak to new clause 15, but I also want to associate myself with the many right hon. and hon. Members who have spoken up for our creative industries. Our most talented and creative minds have not been getting fair representation from the Government up to now, and this has been a very interesting, well informed and, hopefully, influential debate today. New clause 15 is about privacy, safety and providing a dedicated complaints procedure for individuals including victims of modern slavery, domestic abuse, gender-based violence and for others at risk of serious harm if their personal data is mishandled.

This is not a theoretical question. Last November, The Independent reported on Lola, a domestic abuse victim whose home address was leaked to her ex-partner by a company that obtains restraining orders. She said that she was left fearing for her life. As the Open Rights Group has laid out in its briefing, the Information Commissioner’s Office is not functioning as it should be in cases such as this. I have many examples—including how Charnwood district council sent details of the new address of an abused woman directly to her abuser at her former address, so that her abuser knew where she lived—yet people placed at risk in this way currently have no means of challenging the Information Commissioner’s Office if it fails to take the right action, which happens too often. New clause 15 simply proposes dedicated procedures to support vulnerable people making complaints and a right to appeal to the Information Tribunal, a route currently available to large tech firms but not to the people harmed by their practices. I hope that Ministers will take these proposals up.

On other amendments, I fully back the Liberal Democrats on new clauses 2 to 6, which I am signed up to. I personally will abstain from voting on the Liberal Democrats’ new clause 1 and on the Conservatives’ new clause 19. Although I am minded to increase the age of digital consent from 13, given the wider implications of harmful content and of data that can be collected and used to do harm, my discussions locally with parents and young people in Brighton Pavilion have led me to want to include both groups properly in any decision on what that new age should be, since it would cut people off from social media. We must have rapid and real processes of deliberation on this issue as soon as possible that are not just consultative but collaborative.

Finally, new clause 21 is of serious concern to my constituents, and I agree with them and TransActual that it would constitute a gross violation of privacy rights by creating a mass outing of trans people. Subsection (1)(d) of this new clause even goes so far as to seek to revert historical changes made to someone’s gender marker. I urge the Government to reject this and to act further to protect trans rights more broadly.

Alex Sobel (Leeds Central and Headingley) (Lab/Co-op)

I rise to speak to my new clause 14 and amendment 10. Furthermore, I would like to make note of my steadfast opposition to new clause 21, which does not simply change data collection. It proposes to mark and track individuals based on “sex at birth”, regardless of their lived reality, legal recognition or consent. No one—not a Government, not a public authority, not a politician—has the right to define who another person is; only the individual can do that. This is a fundamental principle of dignity and respect that transcends political views and legal debates. We must reject new clause 21.

Moving on to my new clause 14, it is widely accepted that AI has already ingested everything on the internet, whether it be music, films or books, yet there is no legal requirement on these companies to disclose what they have used, making it difficult for musicians and authors to enforce their rights and, crucially, to be paid for their work. So I urge the Minister to give a commitment to legislating for transparency to protect the creative industries.

I note the Government’s new clauses 16 and 17 as a starting point, but we both know that we want to see a thriving licensing market between content creators and AI developers. A transparency commitment today would enable that licensing market, as creators would be in a position to enforce their rights and demand fair pay. There would be certainty for AI developers, removing the risk of litigation in the future. Without transparency, there is no incentive for AI firms to reach agreements with creators, and billionaire-owned tech firms will continue to rip off musicians, filmmakers and authors.

Chris Hinchliff (North East Hertfordshire) (Lab)

Does my hon. Friend agree that new technology should be a tool to improve lives, not just a mechanism for funnelling more wealth and power into the hands of already super-rich corporations? Does he agree that the Bill would benefit from going even further in providing greater transparency?

Alex Sobel

My new clause 14 would do that, so I hope the Government are taking note.

This debate is not just about economic rights. Last week I learned about the Holocaust survivor Renee Salt, whose book “A Mother’s Promise” was ripped off by AI, with similarly named books appearing online days after the original was released. There can be no starker contrast than Renee sharing her most traumatic experiences for the benefit of others, and a computer algorithm stealing from a Holocaust survivor to profit from her suffering. We must stand up for the human creativity that helps us to process the world we live in, or the world will become a much darker place.

I tabled amendment 10, which relates to safe data transfer, in order to confront a glaring weakness in our current data protection regime through the continued transfer of UK user data to jurisdictions that cannot and do not provide basic legal protections or enforceable rights. The need for the amendment is not theoretical. Under current rules, companies often rely on a set of contracts—international data transfer agreements—as proof that data transfers will be adequately protected. However, that assumption is increasingly proving to be false.

The Irish Data Protection Commission fined TikTok €530 million after an in-depth inquiry into its transfers of European Economic Area user data to China. The Irish authorities found that TikTok had failed to adequately assess whether Chinese law provided a level of protection “essentially equivalent” to that guaranteed under GDPR— the General Data Protection Regulation. The ruling was possible because there are no credible legal remedies in China. Laws such as the national intelligence law, the cyber-security law and the anti-terrorism law compel organisations to provide access to data without judicial oversight or meaningful recourse for individuals. China is unable to provide a level of protection “essentially equivalent” to that guaranteed in the Data Protection Act 2018 and in this Bill.

Contracts alone do not protect users when the legal system of the receiving country is incompatible with fundamental rights. This amendment introduces a clear rule: where there is no meaningful enforcement of data rights, no independent judiciary, no administrative remedy or no legal path to challenge unlawful access, such countries will be deemed unsafe for UK data transfers. The Bill must address this critical blind spot. Contracts alone cannot ensure user rights in jurisdictions that offer no legal safeguards. This amendment provides a principled, legally sound and urgently needed response to a real-world threat. I hope that the Minister, given his background, will take these issues seriously and meet me to look further at how we can close this loophole.

Damian Hinds (East Hampshire) (Con)

Our nominal minimum age for social media usage in this country comes from a well-meaning piece of American legislation originally passed in 1998. The age did not have to be 13. Back in 1998 it was going to be 16, but it was changed to 13. With the birth of GDPR, the age did not have to be 13: the default was 16. Various countries, including Germany, the Netherlands and Ireland, selected 16, but we selected 13. That means that at the age of 13 people can sign up to social media, have their behaviour tracked for the purpose of targeting content and ads, start their own channel, have multiple IDs and make decisions about what details of their private life they share.

Many people believe that, because of brain development, 13 is too young to make some of those decisions reliably, and that there are real downsides, risks and dangers from the combination of social media and the ready availability of a handheld electronic device. For children, there are addictive features, an effect on sleep, the ease of making unwanted contact, rabbit holes to fall down and corrosive content that plays on the insecurity of adolescence.

Objections to raising the age to 16 are normally centred around worries that pro-social applications will be hit and that there will be unintended consequences, such as children not being able to seek help if they are in an abusive family, or to find information about contraception or whatever else they may need to know. Indeed, those were some of the reasons why, back in 1998, the age of 16 became 13, and those reasons came up again here in the debates over GDPR. As such, I worded new clause 12 to demonstrate how we could do it without losing anything, by having very broad categories of exemption. However, even with those exemptions, the Government would still be able to say—I am sure they will, and will say some of the same things about new clause 1 shortly—that new clause 12 is technically inadequate, worded badly and contains the wrong exemptions, and that there would be unintended consequences. New clause 19, though, which was tabled by the official Opposition, is almost impossible to argue against, because it contains the default position that these exemptions will change; under its provision, those changes would be subject to review, which would ensure that all those considerations were taken into account.

17:30
I want to speak briefly to amendment 9, which I will not move, but which deals with age checks. Whatever the nominal age might be, it is irrelevant if nobody actually enforces it. The Online Safety Act 2023 says that social media companies should enforce their own age limits, but that is often done through very simple methods of self-declaration that are easy to circumvent. There is a question about how we interpret the wording of that Act, which says that age checks must be done “consistently”.

Iqbal Mohamed

Does the right hon. Gentleman agree that self-regulation just does not work in many industries? We can look at sewage reporting in the water industry, or at the AI and tech companies, which will use our data and not tell the regulators that they are doing so. There is a real need to strengthen the regulation.

Damian Hinds

The hon. Gentleman tempts me to broaden the debate, which I do not think you would encourage me to do at this late stage, Madam Deputy Speaker. However, he makes a very important point about self-regulation in this sector. The public, parents, and indeed children look to us to make sure we have their best interests at heart.

The Online Safety Act may only say that age minima should be enforced “consistently” rather than well, but I do not think the will of this Parliament was that it would be okay to enforce a minimum age limit consistently badly. What we meant was that if the law says right now that the age minimum is 13, or if it is 16 in the future—or whatever other age it might be—companies should take reasonable steps to enforce it. There is more checking than there used to be, but it is still very limited. The recent 5Rights report on Instagram’s teen accounts said that all its avatars were able to get into social media with only self-reported birth dates and no additional checks. That means that many thousands of children under the nominal age of 13 are on social media, and that there are many more thousands who are just over 13 but who the platform thinks are 15, 16 or 17, or perhaps 18 or 19. That, of course, affects the content that is served to them.

Either Ofcom or the ICO could tighten up the rules on the minimum age, but amendment 9 would require that to happen in order for companies to be compliant with the ICO regulation. The technology does exist, although it is harder to implement at age 13 than at 18—of course, the recent Ofcom changes are all about those under the age of 18—but it is possible, and that technology will develop further. Ultimately, this is about backing parents who have a balance to strike: they want to make sure that their children are fully part of their friendship groups and can access all those opportunities, but also want to protect them from harm. Parents have a reasonable expectation that their children will be protected from wholly inappropriate content.

Caroline Voaden (South Devon) (LD)

I rise to speak to new clauses 1 and 11, and briefly to new clause 2. The Liberal Democrats believe that the Government have missed a trick by not including in this Bill stronger provisions on children’s online safety. It is time for us to start treating the mental health issues arising from social media use and phone addiction as a public health crisis, and to act accordingly.

We know that children as young as nine and 10 are watching hardcore, violent pornography. By the time they are in their teens, it has become so normalised that they think violent sexual acts such as choking are normal—it certainly was not when we were teenagers. Girls are starving themselves to achieve an unrealistic body image because their reality is warped by airbrushed images, and kids who are struggling socially are sucked in by content promoting self-harm and even suicide. One constituent told me, “I set up a TikTok account as a 13-year-old to test the horrors, and half a day later had self-harm content dominating on the feed. I did not search for it; it found me. What kind of hell is this? It is time we gave our children back their childhood.”

New clause 1 would help to address the addictive nature of endless content that reels children in and keeps them hooked. It would raise the minimum age for social media data processing from 13 to 16 right now, meaning that social media companies would not be able to process children’s data for algorithmic purposes. They would still be able to access social media to connect with friends and access relevant services, which is important, but the new clause would retain exceptions for health and educational purposes, so that children who were seeking help could still find it.

We know that there is a correlation between greater social media use among young people since 2012 and worsening mental health outcomes. Teachers tell me regularly that children are struggling to concentrate and stay awake because of lack of sleep. Some are literally addicted to their phones, with 23% of 13-year-old girls in the UK displaying problematic social media use. The evidence is before us. It is time to act now—not in 18 months and not in a couple of years. The addictive nature of the algorithm is pernicious, and as legislators we can do something about it by agreeing to this new clause 1.

It is time to go further. This Bill does not do it, but it is time that we devised legislation to save the next generation of teenagers from the horrors of online harm. Ofcom’s new children’s code provides hope that someone ticking a box to say they are an adult will no longer be enough to allow access to adult sites. That is a good place to start; let us hope it works. If it does not, we need to take quick and robust action to move further with legislation.

Given the nature of the harms that exist online, I also support new clause 11 and strongly urge the Government to support it. No parent should have to go through the agony experienced by Ellen Roome. Losing a child is horrific enough, but being refused access to her son’s social media data to find out why he died was a second unacceptable agony. That must be changed, and all ISPs should be compelled to comply. New clause 11 would make that happen. I heard what the Minister said about coroners, but I strongly believe that legislation is needed, with a requirement to release data or provide access to their children’s account for any parent or guardian of someone under 18 who has died. There is, as far as I can see, no reason not to support this new clause.

Briefly, I echo calls from across the House to support new clause 2 in support of our creatives. Creativity is a uniquely human endeavour. Like others, I have been contacted by many creators who do not want their output stolen by AI companies without consent or permission. It is vital that AI companies comply with copyright legislation, which clearly has to be updated to meet the requirements of the brave new world of tech that we now live in.

Iqbal Mohamed

I rise to confirm my agreement with new clauses 1 and 12, and I associate myself with the speech of the hon. Member for South Devon (Caroline Voaden). I have had several emails on the protection of copyrighted information and revenue streams for artists, including from Yvonne, who contacted me recently. It is essential that the creative arts and intellectual property are protected and that artists are properly compensated if their output is used in AI.

On new clauses 1 and 12, the case for raising the age of consent for data processing from 13 to 16 has been well made across the House, so I will not repeat the points made, but I will say that it is essential that we give our children their childhoods back. They need to be protected from the toxic content to which they are being exposed by social media and online.

New clauses 3 to 6 and new clause 14 would place transparency requirements on AI companies to report on what information and data they have used, from where, and with what permission. That is essential to holding the AI companies to account and to ensuring that content holders and data owners are informed and have adequate channels of redress for misuse of their information.

I am sure that new clause 7 was spoken about while I was out of the Chamber, but let me say now that the right for our citizens to use non-digital verification is key. My mother—who is in her late 60s, bless her—would not have a clue what to do if she did not have family to help her with her benefits claims, doctors’ prescriptions, appointments and so on. We cannot exclude millions of our citizens who may choose not to have smartphones and not to be exposed to toxic content online, or who are simply not tech-literate. I urge the Government to ensure that we do not exclude millions of our citizens. I also strongly support new clause 11, but I will defer to earlier speakers in that regard.

As for new clause 18, many constituents have written to me or spoken to me, expressing concern about sharing their NHS and other private data with third parties such as Palantir. It is essential for this new Government to adopt a posture of supporting ethical, transparent business practices for all suppliers who provide services in our country. We have already heard about the background of Palantir. I do not know how true this is, but some of my constituents believed, or had read, that during the Prime Minister’s first visit to the US, after meeting Donald Trump he visited Palantir’s headquarters, or one of its offices. I urge the Government to protect—

Madam Deputy Speaker (Caroline Nokes)

Order. The hon. Gentleman’s time is up.

David Chadwick (Brecon, Radnor and Cwm Tawe) (LD)

I rise to speak in strong support of new clauses 1 and 2.

New clause 1 seeks to raise the age of consent for social media data processing from 13 to 16. As the father of two young boys, I am deeply concerned about the way in which tech platforms engineer addiction, manipulate attention, and shape childhood in ways that parents and even Governments cannot easily counter. This is not hypothetical; it is the reality that our children are living every day. Children aged 13 to 15 are especially vulnerable. Those social media algorithms do not just show content. They shape beliefs, reinforce insecurities and amplify harm. Whether it is body image filters, content promoting self-harm or endless scrolling, these platforms are designed for engagement, not wellbeing.

The new clause would not ban young people from using social media. It simply says that their data should not be exploited for commercial gain without genuine, informed consent. By raising the age to 16 for these specific practices, we align with international best practice and the United Nations convention on the rights of the child. With clear exemptions for education and health platforms, this is a targeted and proportionate reform that prioritises children’s mental health.

New clause 2 deals with copyright compliance and AI. As we all know, the AI revolution is here, but just as we would not let a factory operate by stealing its raw materials from others, we should not let AI models train on copyrighted work, such as books, music or journalism, without permission or payment. The new clause makes one clear demand: if an AI system operates in the UK, it must respect UK copyright law, regardless of where the servers are based. We are standing up for our creators—for the authors, musicians, film-makers and developers whose work gives AI its power. In Wales alone, the creative industries turned over £1.5 billion in 2023, employing more than 37,000 people. Let us not wait for lawsuits or damage to our industries. The new clause provides legal clarity, defends creators, and affirms that Parliament, not Silicon Valley, writes the rules.

These Liberal Democrat new clauses are principled, practical and long overdue, and I urge all Members to support them.

Madam Deputy Speaker (Caroline Nokes)

I call the shadow Minister.

Dr Spencer

It has been a pleasure to hear the speeches of Members from across the House. I pay tribute to my hon. Friend the Member for Gosport (Dame Caroline Dinenage) and my right hon. Friend the Member for Maldon (Sir John Whittingdale), who spoke with passion about the protection of copyright in AI. I suspect that my right hon. Friend is looking forward to seeing the back of the Bill, and hoping that it does not return in a future iteration. My right hon. Friend the Member for Chingford and Woodford Green (Sir Iain Duncan Smith) spoke of the importance of ensuring that data does not fall victim to hostile states and hostile state actors. My right hon. Friend the Member for East Hampshire (Damian Hinds) spoke with knowledge and authority about this important issue, and the challenges and practicalities involved in ensuring that we get it right for our children.

I will return to the three themes that we have put forward. The Minister has repeatedly given assurances on the application of copyright with regard to AI training, but the Secretary of State created uncertainty by saying in the AI copyright consultation:

“At present, the application of UK copyright law to the training of AI models is disputed.”

When we create that level of uncertainty, we need at least an equal level of clarity to make amends, and that is partly what our new clause 20 calls for: among other things, a formal statement from the Intellectual Property Office or otherwise. I do not see why it is a challenge for the Government to put that forward and deliver.

17:45
Our new clause 19 focuses on the use of inappropriate social media services by under-16s, and on protections for children. For those watching online, I will summarise the three positions that I have heard. It seems that the Government position is, “We don’t want to think about it, so let’s leave things as they are.” The Lib Dem position is, “We don’t want to think about it, but let’s bring in a blanket ban.” Our position is very much, “Think about it, get it into a workable and effective position, and then improve the protections for our children.”

Victoria Collins

I would just like to clarify that we have thought long and hard about this Bill, along with many organisations and charities, to get it right.

Dr Spencer

That is good to hear.

Max Wilkinson

I will try a third time, because we tried earlier. The Conservatives have clearly briefed the press that they are angling for a ban on social media for under-16s—it has been reported in multiple places. Can the shadow Minister confirm whether that is the Conservatives’ position or not?

Dr Spencer

For the fourth time, and as I have said, new clause 19 would effectively create a de facto position whereby there are restrictions on the use of inappropriate social media services by children. It seeks to tackle the challenges of implementation, age verification and the scope of social media. It says that there needs to be work to make sure that we can actually do so and that, when we can, we should move in that direction, unless there is overwhelming evidence that it is not needed, such as with the shaking out of the Online Safety Act.

Finally, I return to new clause 21. Sadly, it has been widely misrepresented. The laws in this area are clear: the Equality Act puts in place obligations in relation to protected characteristics. The Supreme Court says that “sex” means biological sex, and that public authorities must collect data on protected characteristics to meet their duties under the Equality Act. The new clause would put that clear legal obligation into effect, and build in data minimisation principles to preserve privacy. There would be no outing of trans people through the new clause, but where public authorities collect and use sex data, it would need to be biological sex data.

Chris Bryant

As ever, it is good to see you in the Chair, Madam Deputy Speaker. I thank all right hon. and hon. Members who have taken part in the debate. If I do not manage to get to any of the individual issues that have been raised, and to which people want answers, I am afraid that is because of a shortness of time, and I will seek to write to them. I thank the officials who helped to put the Bill together, particularly Simon Weakley—not least because he not only did this Bill, but all the previous versions in the previous Parliament. He deserves a long-service medal, if not something more important.

I will start with the issues around new clauses 1, 11, 12 and 13, and amendment 9. The Government completely share the concern about the vulnerability of young people online, which lots of Members have referred to. However, the age of 13 was set in the Data Protection Act 2018—I remember, because I was a Member at the time. It reflects what was considered at the time to be the right balance between enabling young people to participate online and ensuring that their data is protected. Some change to protecting children online is already in train. As of last month, Ofcom finalised the child safety codes, a key pillar of the Online Safety Act. Guidance published at the same time started a three-month period during which all in-scope services likely to be accessed by children will be required to assess the risk of harm their services pose to them.

From July, the Act will require platforms to implement measures to protect children from harm, and this is the point at which we expect child users to see a tangible, positive difference to their online experiences. I wish it had been possible for all this to happen earlier—I wish the Act had been in a different year—but it is the Act it is. The new provisions include highly effective age checks to prevent children from encountering the most harmful content, and the adjustment of algorithms to reduce exposure to harmful content. Services will face tough enforcement from Ofcom if they fail to comply.

The Act very much sets the foundation for protecting children online. The Government continue to consider further options in pursuit of protecting children online, which is why the Department for Science, Innovation and Technology commissioned a feasibility study to understand how best to investigate the impact of smartphones and social media on children’s wellbeing. This will form an important part of our evidence base.

Damian Hinds

Will the Minister give way?

Chris Bryant

I am going to come to the right hon. Member’s amendment in a moment.

The study is being led by Dr Amy Orben of Cambridge University, and it is supported by scientists from nine of the UK’s premier universities, all with established expertise in this field. The study will report to the Government this month on the existing evidence base, ongoing research and recommendations for future research that will establish any causal links between smartphones, social media and children’s wellbeing. The Government will publish the report along with the planned next steps to improve the evidence base in this area to support policy making. Considering the extra work we are doing, I hope Members will not press their amendments.

Munira Wilson

Will the Minister give way?

Chris Bryant

I am afraid that I will not give way.

On new clause 13, tabled by the hon. Member for Harpenden and Berkhamsted (Victoria Collins), we share the concern that children’s data in education must be safeguarded. We have already committed to instructing the Information Commissioner’s Office to produce a statutory code of practice on the use of children’s data by edtech services once the findings of their audits have been published. We believe that defining the scope of the code in legislation now or imposing a six-month deadline for its publication risks undermining that evidence-led process.

Amendment 9, tabled by the right hon. Member for East Hampshire (Damian Hinds), seeks to ensure that platforms adopt strong age-assurance mechanisms when designing their services under the new children’s higher protection matters duty in clause 81. Of course, we subscribe to that policy aim, but the clause already strengthens UK GDPR by requiring providers of information society services to take account of how children can best be protected and supported when they are designing their processing activities. The ICO’s age-appropriate design code will be updated to provide clear and robust guidance on how services can meet these obligations, including through proportionate risk-based age assurance, where appropriate. I will take the right hon. Member’s intervention if he wants—he asked first—but I am afraid I have to be very careful because I have a lot of questions to answer.

Damian Hinds

Very quickly, I want the Minister to confirm that the Ofcom children’s codes, to which he has referred, are all about the 18 age threshold. They are a very welcome move to filter out wholly inappropriate content that is designed for over-18s and other very harmful content, but they do not do anything for the initial threshold—the age minimum—at age 13.

Chris Bryant

The right hon. Member makes a fair point.

Munira Wilson

Will the Minister give way?

Chris Bryant

I am terribly sorry, but I do need to crack on because I have very little time.

I have not yet mentioned new clause 21 and amendments 39 and 40. Let me start by saying that the Government accept the Supreme Court ruling, but it is paramount that we work through this judgment carefully, with sensitivity and in line with the law. We cannot simply flick a switch; we must work through the impacts of this judgment properly, recognising that this is broader than data used by digital verification services. I reflect the comment made earlier by the shadow Minister, the hon. Member for Runnymede and Weybridge (Dr Spencer), when he said that data accuracy is important.

Nadia Whittome (Nottingham East) (Lab)

I thank my hon. Friend for giving way. Trans people and trans-led groups have been very concerned by new clause 21 tabled by the Opposition. They have rightly described it as an attack on trans people’s rights and their privacy. Can the Minister offer some reassurance that, as well as opposing this amendment today, the Government will not seek to introduce similar legislation via other means in the future?

Chris Bryant

We are opposing the amendment and are not intending to introduce similar legislation.

As I said, data accuracy is important. That is equally true for any data used in a digital verification service. That is why the Government are already engaged in an appropriate and balanced range of work on data standards and data accuracy. We are already developing data standards on the monitoring of diversity information, including sex, via the Data Standards Authority. Following a review, the Office for Statistics Regulation published updated guidance on collecting and reporting data and statistics about sex and gender identity last year, and all Government Departments are now considering how best to address the recommendations of the Sullivan review, which we published. That is the first reason why we will not be supporting this new clause or the amendment today. Simply, we believe the concerns regarding the way in which public authorities process sex and gender data should be considered holistically, taking into account the effects of the Supreme Court ruling and the specific and particular requirements of public authorities. By contrast, the new clause and the amendment would undermine the work the Government are already doing. Giving the Secretary of State a new regulatory role would undermine the existing processes that ensure compliance with the UK’s data protection legislation.

Secondly, the new clause is misplaced because the Bill does not alter the evidence which can be relied upon to prove sex or gender. Indeed, it does not seek to alter any of the content of data used by digital verification services. Instead, the Bill enables people to do digitally what they can presently do physically, and it is for organisations to consider what specific information they need to verify in their particular circumstances. Any inconsistency between what they can do digitally and what they can do physically would obviously sow further division.

Thirdly, the new clause is unnecessary, because it is very unlikely that digital verification services would be used in many, if not all, of the cases specifically raised by or with hon. Members, such as within the NHS to gain access to single-sex wards or for screening or to enter other female-only spaces. We expect digital verification services to be used primarily to prove things such as one’s right to work, or one’s age, address or professional or educational qualifications, which are not matters where sex or gender is relevant at all.

Fourthly, the new clause goes significantly further than the findings of the Supreme Court. Finally, the proposals have the potential to interfere with the right to respect for private and family life under the Human Rights Act by requiring public authorities to record sex as biological sex in all cases, regardless of whether it is justified or proportionate in that given circumstance. In addition, the amendment does not take account of the fact that the Gender Recognition Act 2004 gives those with gender recognition certificates a level of privacy and control over who has access to information about their gender history. As for amendment 39, it would create further uncertainty, as it appears to prevent use of clause 45 in all cases involving sex.

As I have set out, while I understand the reason for tabling these amendments, I fear they would create legal confusion, uncertainty and inconsistency. I also note that they were not part of the previous Government’s version of this Bill, in which this part was, in nearly all respects, identical to ours. Given the narrow scope of the digital verification services measures, the need to consider this area holistically to ensure alignment with existing legislation and upcoming EHRC guidance, and the breadth of work already being carried out, I hope the new clause and amendments will be withdrawn.

There was one other amendment referring to digital verification services: the Liberal Democrats’ new clause 7. I completely share their concerns about digital inclusion, which were also mentioned by the hon. Member for Dewsbury and Batley (Iqbal Mohamed). We have published our own digital inclusion action plan, but such obligations could be particularly challenging for businesses currently operating solely in the digital sphere—for example, online banks. Taking a blanket approach in the way proposed would not be proportionate, so I urge that the amendment be withdrawn.

On scientific research, my hon. Friend the Member for Newcastle upon Tyne Central and West (Chi Onwurah) tabled amendments 37 and 38. Amendment 37 adds further conditions to the definition of scientific research. I understand her concern and we want to prevent misuse. However, the Bill does not expand the meaning of scientific research and already contains safeguards, such as in clause 86. Moreover, the amendment replicates wording from two external documents—including the Frascati document—neither of which was intended to be legally binding or to define scientific research. I am very happy to continue having these conversations with my hon. Friend, but I urge her not to press her amendment.

On access to NHS data, which my hon. Friend the Member for Normanton and Hemsworth (Jon Trickett) raised, let me just answer his direct question about the sale of NHS data. The Secretary of State for Health has said categorically that the NHS is not for sale and that patients’ data is not for sale—end of story. I hope we can put that one to bed.

On ethnicity data, my hon. Friend the Member for Birmingham Edgbaston (Preet Kaur Gill) made valid points that we intend to pursue. Public bodies usually collect ethnicity data in line with the Office for National Statistics’ harmonised standards. The ONS is currently reviewing that and I am sure she will want to feed into that process.

I am afraid that I have not had time to refer again to AI and copyright, but this country is a—

17:59
Debate interrupted (Programme Order, 12 February)
The Deputy Speaker put forthwith the Question already proposed from the Chair (Standing Order No. 83E), That the clause be read a Second time.
Question agreed to.
New clause 16 accordingly read a Second time, and added to the Bill.
The Deputy Speaker then put forthwith the Questions necessary for the disposal of the business to be concluded at that time (Standing Order No. 83E).
New Clause 17
Report on the use of copyright works in the development of AI systems
“(1) The Secretary of State must, before the end of the period of 12 months beginning with the day on which this Act is passed—
(a) prepare and publish a report on the use of copyright works in the development of AI systems, and
(b) lay the report before Parliament.
(2) The report must consider—
(a) the four policy options described in section B.4 of the Copyright and AI Consultation Paper, read with relevant parts of section C of that Paper (policy options about copyright law and the training of artificial intelligence models using copyright works), and
(b) such alternative options as the Secretary of State considers appropriate.
(3) The report must consider, and make proposals in relation to, each of the following—
(a) technical measures and standards (for example, measures and standards concerned with metadata) that may be used to control—
(i) the use of copyright works to develop AI systems, and
(ii) the accessing of copyright works for that purpose (for example, by web crawlers);
(b) the effect of copyright on access to, and use of, data by developers of AI systems (for example, on text and data mining), including the effect on developers who are individuals, micro businesses, small businesses or medium-sized businesses;
(c) the disclosure of information by developers of AI systems about—
(i) their use of copyright works to develop AI systems, and
(ii) how they access copyright works for that purpose (for example, by means of web crawlers);
(d) the granting of licences to developers of AI systems to do acts restricted by copyright, including the granting of licences by and to individuals, micro businesses, small businesses and medium-sized businesses.
(4) In preparing the report, the Secretary of State must consider the likely effect of proposals, in the United Kingdom, on—
(a) copyright owners, and
(b) persons who develop or use AI systems,
including the likely effect on copyright owners, developers and users who are individuals, micro businesses, small businesses or medium-sized businesses.
(5) In preparing the report, the Secretary of State must have regard to, among other things, the Consultation Paper responses.
(6) The Secretary of State may comply with this section by preparing and publishing two or more reports which, taken together, satisfy the requirements in this section.
(7) In this section—
“Consultation Paper responses” means responses to the Copyright and AI Consultation Paper received by the Secretary of State on or before 25 February 2025;
“copyright” means the property right which subsists in accordance with Part 1 of the Copyright, Designs and Patents Act 1988;
“copyright work” has the same meaning as in Part 1 of the Copyright, Designs and Patents Act 1988;
“web crawler” means a computer program that obtains data from websites in accordance with instructions and that can autonomously determine which websites to visit.
(8) Terms used in this section and in section (Economic impact assessment) have the same meaning in this section as they have in that section.”—(Chris Bryant.)
Brought up, and added to the Bill.
New Clause 1
Age of consent for social media data processing
“(1) The UK GDPR is amended as follows.
(2) In Article 8 of the UK GDPR (Conditions applicable to child's consent in relation to information society services), after paragraph 1 insert—
“(1A) References to 13 years old in paragraph 1 shall be read as 16 years old in the case of social networking services processing personal data for the purpose of delivering personalised content, including targeted advertising and algorithmically curated recommendations.
(1B) For the purposes of paragraph 1A “social networking services” means any online service that—
(a) allows users to create profiles and interact publicly or privately with other users, and
(b) facilitates the sharing of user-generated content, including text, images, or videos, with a wider audience.
(1C) Paragraph 1B does not apply to—
(a) educational platforms and learning management systems provided in recognised educational settings, where personal data processing is solely for educational purposes.
(b) health and well-being services, including NHS digital services, mental health support applications, and crisis helplines, where personal data processing is necessary for the provision of care and support.””—(Victoria Collins.)
Brought up.
Question put, That the clause be added to the Bill.
18:01

Division 187

Ayes: 76


Liberal Democrat: 54
Scottish National Party: 6
Independent: 5
Plaid Cymru: 4
Reform UK: 3
Green Party: 2
Social Democratic & Labour Party: 1
Democratic Unionist Party: 1

Noes: 295


Labour: 292
Independent: 3

New Clause 2
Compliance with UK copyright law by operators of web crawlers and general-purpose AI models
“(1) The Secretary of State must by regulations make provision (including any such provision as might be made by Act of Parliament), requiring the operators of web crawlers and general-purpose artificial intelligence (AI) models whose services have links with the United Kingdom within the meaning of section 4(5) of the Online Safety Act 2023 to comply with United Kingdom copyright law, including the Copyright, Designs and Patents Act 1988, regardless of the jurisdiction in which the copyright-relevant acts relating to the pre-training, development and operation of those web crawlers and general-purpose AI models take place.
(2) Provision made under subsection (1) must apply to the entire lifecycle of a general-purpose AI model, including but not limited to—
(a) pre-training and training,
(b) fine tuning,
(c) grounding and retrieval-augmented generation, and
(d) the collection of data for the said purposes.
(3) The Secretary of State must lay before Parliament a draft of the statutory instrument containing regulations under subsection (1) within six months of the day on which this Act is passed and the regulations are subject to the affirmative procedure.”—(Victoria Collins.)
This new clause requires web crawlers and general-purpose AI models with UK links to comply with UK copyright law across all stages of AI development.
Brought up.
Question put, That the clause be added to the Bill.
18:14

Division 188

Ayes: 88


Liberal Democrat: 54
Independent: 8
Scottish National Party: 6
Conservative: 6
Green Party: 4
Plaid Cymru: 4
Reform UK: 3
Traditional Unionist Voice: 1
Social Democratic & Labour Party: 1
Democratic Unionist Party: 1

Noes: 287


Labour: 283
Independent: 2

New Clause 19
Secretary of State’s duty to review the age of consent for data processing under the UK GDPR
“(1) The Secretary of State must, within 12 months of Royal Assent of this Act, have conducted a review and published a report into the operation of Article 8 (Conditions applicable to child’s consent in relation to information society services) of the UK GDPR in relation to the data processed by social media platforms of children under the age of 16.
(2) As part of this review, the Secretary of State must consider—
(a) the desirability of increasing the digital age of consent under the UK GDPR from 13 to 16, taking into account the available evidence in relation to the impact of social media platforms on the educational, social and emotional development of children; and
(b) the viability of increasing the digital age of consent under Article 8 of the UK GDPR in relation to specific social media platforms which are shown by the evidence to be unsuitable for use by children under the age of 16.
(3) Within six months of the publication of the report under subsection (1), the Secretary of State must lay a plan before Parliament for raising the digital age of consent to 16 through amendments to Article 8 of the UK GDPR, unless the review concludes that such changes are unnecessary.”—(Dr Spencer.)
Brought up.
Question put, That the clause be added to the Bill.
18:26

Division 189

Ayes: 160


Conservative: 91
Liberal Democrat: 55
Independent: 5
Reform UK: 3
Green Party: 3
Traditional Unionist Voice: 1
Democratic Unionist Party: 1

Noes: 294


Labour: 287
Independent: 5

New Clause 21
Directions to public authorities on recording of sex data
“(1) The Secretary of State must, within three months of the passage of this Act, issue regulations relating to the code of practice set out in section 49 of this Act which require public authorities to—
(a) collect, process and retain sex data only where it is lawful to do so in accordance with data protection legislation;
(b) request and record sex data accurately, in every circumstance where sex data is collected, in accordance with following category terms and definitions—
(i) ‘Sex’ meaning male or female only based on ‘sex at birth’, ‘natal sex’ or ‘biological sex’ (these terms carrying the same meaning and capable of being used interchangeably); and,
(ii) in addition, where it is lawful to do so in accordance with data protection legislation and the Gender Recognition Act 2004, ‘Acquired Gender’ meaning male or female only, as recorded on a gender recognition certificate issued in accordance with the Gender Recognition Act 2004;
(c) have updated relevant organisation guidance to stipulate that, where sex data is collected, this must be done in accordance with the definitions set out by subsection (1)(b) within three months of these regulations coming into force;
(d) have conducted a review of the accuracy of data held in relation to the sex of data subjects to ensure that the data is accurate in recording sex at birth and, where relevant and collected lawfully, acquired gender as recorded on a gender recognition certificate within 12 months of these regulations coming into force;
(e) have taken every reasonable step to ensure that any data held in relation to the sex and, where relevant and collected lawfully, acquired gender as recorded on a gender recognition certificate of a data subject that is found to be inaccurate is rectified or erased within 18 months of these regulations coming into force; and
(f) have produced and submitted to the Secretary of State a report setting out the findings of its review in relation to the matters set out by subsection (1)(d) and, where relevant, a description of the steps taken to ensure that the data held by the relevant public authority is accurate within the definitions set out by subsection (1)(b) within 18 months of these regulations coming into force.
(2) The Secretary of State may, on receipt of a report in accordance with subsection (1)(f), instruct a public authority to take any further remedial steps within a specified timeframe reasonably necessary to ensure the accuracy of the sex and acquired gender data held by the relevant public authority.
(3) The Secretary of State must, within one month of the passage of this Act, establish and maintain a register of public authorities approved to act as sources of data relating to the attribute of sex for persons providing digital verification services.
(4) The register in subsection (3) must be published on the website of the Office for Digital Identities & Attributes or any successor body.
(5) Until such time as a public authority is added to the register under subsection (3), persons providing digital verification services may only obtain data on the sex of an individual requesting the provision of digital verification services from the record of births held by the General Register Office in accordance with subsection (6).
(6) Information supplied by the General Register Office pursuant to subsection (5) must specify sex as recorded at birth, as well as any subsequent corrections to the register in the field marked ‘Sex’.
(7) The Secretary of State may, from time to time, add public authorities to the register as under subsection (3) only upon being satisfied on the basis of a report issued under subsection (1)(f), or satisfaction of such further steps required by the Secretary of State under subsection (2) that the data held by the relevant public authority in relation to sex and, where relevant, acquired gender as recorded on a gender recognition certificate, as defined in subsection (1)(b), is accurate.”—(Dr Spencer.)
This new clause requires the Secretary of State to issue regulations relating to the code of practice in section 49 requiring public authorities to record sex data in line with these regulations when data are collected. This clause is linked to amendments 39 and 40.
Brought up.
Question put, That the clause be added to the Bill.
18:38

Division 190

Ayes: 97


Conservative: 91
Reform UK: 3
Traditional Unionist Voice: 1
Independent: 1
Democratic Unionist Party: 1

Noes: 363


Labour: 289
Liberal Democrat: 56
Independent: 5
Scottish National Party: 5
Green Party: 4
Plaid Cymru: 4
Social Democratic & Labour Party: 1

Clause 4
Power to make provision in connection with business data
Amendments made: 11, page 6, line 25, after “recipient” insert
“in relation to business data”.
This amendment is consequential on Amendment 12.
Amendment 12, page 6, line 26, after “authority” insert
“to do something with the business data”.
Clause 4(4)(a) refers to a person appointed by a public authority. This amendment specifies that the person must be appointed to do something with the business data in respect of which the public authority is a third party recipient (defined in clause 4(2)).
Amendment 13, page 6, line 30, at end insert—
“(aa) make provision requiring a person who is a third party recipient in relation to business data (whether by virtue of those regulations or other data regulations), and who is appointed by a public authority to do something with the business data, to publish or provide business data as described in paragraph (a)(i) or (ii),”.
This amendment enables regulations under Part 1 to require people who are third party recipients of business data, and who are appointed by a public authority, to publish business data or to provide it to customers of the trader to whom the business data relates or to other persons.
Amendment 14, page 6, line 31, leave out from “or” to “, make” in line 32 and insert
“the appointed person referred to in paragraph (a) or (aa)”.
This amendment enables regulations under Part 1 to make, in relation to a person described in paragraph (aa) of clause 4(4) (see Amendment 13), any provision that they can make in relation to a data holder (other than provision imposing a levy).
Amendment 15, page 6, line 37, after “authority” insert “or appointed person”.
This amendment and Amendment 16 enable regulations under Part 1 to make, in relation to a person (other than a customer) who, in accordance with regulations made under clause 4(4), receives business data from a person appointed by a public authority, any provision that they could make if the person received the business data from a data holder.
Amendment 16, page 6, line 39, leave out “(a)(ii)” and insert
“(a) or (aa), other than a customer described in paragraph (a)(i)”.—(Chris Bryant.)
See the explanatory statement for Amendment 15.
Clause 8
Enforcement of regulations under this Part
Amendments made: 17, page 12, line 5, leave out
“and sections 9 and 10”.
This amendment and Amendment 18 adjust the way in which clauses 9 and 10 are signposted in clause 8(3), to reflect the fact that clauses 9 and 10 make provision about regulations under Part 1, not just regulations under clause 8(1).
Amendment 18, page 12, line 6, at end insert
“(and see sections 9 and 10)”.—(Chris Bryant.)
See the explanatory statement for Amendment 17.
Clause 11
Fees
Amendments made: 19, page 16, line 14, leave out
“for the purpose of meeting expenses”
and insert
“in connection with activities”.
This amendment and Amendments 20 and 21 remove the requirement for the amount of fees provided for by regulations under clause 11 to be linked to the expenses of performing duties imposed, or exercising powers conferred, by or under Part 1 of the Bill.
Amendment 20, page 16, line 25, leave out from beginning to “performing” in line 26 and insert “Those activities are”.
See the explanatory statement for Amendment 19.
Amendment 21, page 16, line 35, leave out
“in respect of which the fee is charged”
and insert
“in connection with which the fee is charged (and for the total amount of fees payable in connection with things to exceed the total cost)”.
See the explanatory statement for Amendment 19.
Amendment 22, page 17, line 14, at end insert—
“(9) The Secretary of State or the Treasury may by regulations make provision about whether a person listed in subsection (2), or a person acting on their behalf, who could require payment in connection with an activity described in subsection (3) otherwise than in reliance on regulations under subsection (1) may do so.
(10) Where duties or powers are imposed or conferred—
(a) on a person in their capacity as a third party recipient by or under regulations made under this Part, other than regulations made in reliance on section 4(4)(a), (aa) or (b), or
(b) on a person in their capacity as a person described in section 4(4)(c) by or under regulations made under this Part,
nothing in this section, or in regulations under subsection (1) or (9), prevents the person, or a person acting on their behalf, from requiring payment in connection with the performance or exercise of those duties or powers, or restricts their ability to do so, where the person could do so otherwise than in reliance on regulations under subsection (1).
(11) Examples of requiring payment otherwise than in reliance on regulations under subsection (1) include doing so in reliance on other legislation or a contract or other arrangement (whenever entered into).”—(Chris Bryant.)
This amendment enables regulations to make clear whether or not powers to charge arising otherwise than under regulations made under Part 1 can be used in connection with activities carried on pursuant to such regulations. It also provides that, where third party recipients have existing powers to charge, the regulations cannot prevent or restrict the exercise of those powers.
Clause 15
The FCA and financial services interfaces: supplementary
Amendments made: 23, page 21, line 26, leave out third “to”.
See the explanatory statement for Amendment 24.
Amendment 24, page 21, line 27, after “subsection,” insert
“or to a person acting on behalf of such a body or person,”.
This amendment and Amendment 23 insert a reference to a person acting on behalf of an interface body or a person listed in clause 15(7) into clause 15(6).
Amendment 25, page 21, line 27, leave out
“for the purpose of meeting expenses”
and insert
“in connection with activities”.
This amendment and Amendments 26 and 27 remove the requirement for the amount of fees provided for by FCA interface rules (defined in clause 14(2)) to be linked to the expenses of performing duties, or exercising powers, arising from regulations under Part 1 of the Bill or FCA interface rules.
Amendment 26, page 21, line 32, leave out subsection (8) and insert—
“(8) Those activities are performing or exercising—
(a) duties or powers imposed or conferred on the interface body or person listed in subsection (7) by FCA interface rules, and
(b) other duties or powers imposed or conferred on that body or person by or under regulations made under this Part.”
See the explanatory statement for Amendment 25. This amendment also makes minor changes for consistency with clauses 11(3) and 14(2).
Amendment 27, page 21, line 40, leave out
“in respect of which the fee is charged”
and insert
“in connection with which the fee is charged (and for the total amount of fees payable in connection with things to exceed the total cost)”.
See the explanatory statement for Amendment 25.
Amendment 28, page 22, line 10, at end insert—
“(da) may require or enable rules to make provision about what must or may be done with amounts paid as fees;”.
This amendment confers express power to enable the FCA to make provision about the treatment of amounts paid as fees to interface bodies and others, for consistency with the similar power in clause 11(1)(b).
Amendment 29, page 22, line 13, at end insert—
“(9A) Regulations under section 14 may enable FCA interface rules to make provision about whether an interface body or a person listed in subsection (7), or a person acting on behalf of such a body or person, who could require payment in connection with an activity described in subsection (8) otherwise than in reliance on FCA interface rules may do so.
(9B) Examples of requiring payment otherwise than in reliance on FCA interface rules include doing so in reliance on other legislation or a contract or other arrangement (whenever entered into).”—(Chris Bryant.)
This amendment enables FCA interface rules (defined in clause 14(2)) to make clear whether or not powers to charge arising otherwise than under FCA interface rules can be used in connection with activities carried on pursuant to such rules or regulations under Part 1.
Clause 21
Regulations under this Part: supplementary
Amendment made: 30, page 26, line 17, after “sections” insert “11(9),”.—(Chris Bryant.)
This amendment provides that the regulation-making power under clause 11(9) (inserted by Amendment 22) is not restricted by clause 21(3).
Clause 23
Related subordinate legislation
Amendment made: 31, page 27, line 26, at end insert—
“(3A) For the purposes of determining whether subordinate legislation contains provision described in clauses 2(1) to (4) or 4(1) to (4), references in those sections to something specified are to be read as including something specified by or under any subordinate legislation.”—(Chris Bryant.)
Clause 23 confers power to make provision in connection with subordinate legislation that is similar to regulations that can be made under Part 1. This amendment provides that, when determining whether that power is available, clauses 2 and 4 should be read as referring to things specified in subordinate legislation, rather than in regulations under Part 1.
Clause 25
Other defined terms
Amendment made: 32, page 29, line 2, at end insert—
“(4) In this Part, references to regulations made under subsection (3) of section 4 or any of sections 5 to 21 (and references which include such regulations) include regulations made under section 4(4)(b) or (c) which make provision that could be made under the other subsection or section.”—(Chris Bryant.)
This amendment provides that references in Part 1 to regulations made under particular provisions of Part 1 include regulations made under clause 4(4)(b) and (c) (which confer power to make provision that could be made in reliance on those other provisions).
Clause 56
National Underground Asset Register: England and Wales
Amendment made: 1, page 57, leave out lines 35 and 36 and insert
“obtain the consent of the Welsh Ministers in relation to any provision which would be within the legislative competence of Senedd Cymru if contained in an Act of the Senedd (ignoring any requirement for the consent of a Minister of the Crown imposed under Schedule 7B to the Government of Wales Act 2006).”—(Chris Bryant.)
This amendment provides that the Secretary of State must obtain the consent of the Welsh Ministers before making regulations under Part 3A of the New Roads and Street Works Act 1991 (inserted by this clause) in relation to any provision that would be within the legislative competence of Senedd Cymru if contained in an Act of the Senedd.
Clause 57
Information in relation to apparatus: England and Wales
Amendments made: 2, page 60, line 2, leave out “consult the Welsh Ministers” and insert
“obtain the consent of the Welsh Ministers in relation to any provision that relates to apparatus in streets in Wales”.
This amendment provides that the Secretary of State must obtain the consent of the Welsh Ministers before making regulations under section 79 of the New Roads and Street Works Act 1991 in relation to any provision that relates to apparatus in streets in Wales.
Amendment 3, page 60, line 25, at end insert—
“(4A) Before making regulations under this section the Secretary of State must obtain the consent of the Welsh Ministers in relation to any provision that relates to apparatus in streets in Wales.”
This amendment provides that the Secretary of State must obtain the consent of the Welsh Ministers before making regulations under section 80 of the New Roads and Street Works Act 1991 in relation to any provision that relates to apparatus in streets in Wales.
Amendment 4, page 60, leave out line 28.
This amendment is consequential on amendment 3.
Amendment 5, page 61, line 17, leave out subsections (8) and (9).—(Chris Bryant.)
This amendment removes provision applying the Street Works (Records) (England) Regulations 2002 to Wales and also removes provision revoking the Street Works (Records) (Wales) Regulations 2005.
Clause 58
National Underground Asset Register: Northern Ireland
Amendment made: 6, page 67, leave out lines 25 to 27 and insert—
“(A1) Before making regulations under this Order the Secretary of State must obtain the consent of the Department for Infrastructure.”—(Chris Bryant.)
This amendment provides that the Secretary of State must obtain the consent of the Department for Infrastructure in Northern Ireland before making regulations under the Street Works (Northern Ireland) Order 1995.
Clause 59
Information in relation to apparatus: Northern Ireland
Amendments made: 7, page 69, leave out lines 35 and 36.
This amendment is consequential on amendment 6.
Amendment 8, page 70, leave out line 25.—(Chris Bryant.)
This amendment is consequential on amendment 6.
Clause 135
Creating, or requesting the creation of, purported intimate image of adult
Amendments made: 33, page 173, line 22, leave out “This section” and insert “Subsection (2)”.
This amendment is consequential on Amendment 34.
Amendment 34, page 173, line 29, at end insert—
“(3) Subsection (4) applies where a person commits an offence under section 66F of the Sexual Offences Act 2003 (requesting the creation of purported intimate image of adult).
(4) A purported intimate image which is connected with the offence, and anything containing it, is to be regarded for the purposes of section 153 (and section 157(3)(b)) as used for the purposes of committing the offence (including where it is committed by aiding, abetting, counselling or procuring).
(5) A purported intimate image is connected with an offence under section 66F of the Sexual Offences Act 2003 if—
(a) it appears to be of a person who was the subject of the request to which the offence relates (whether or not it is what was requested), and
(b) it was in the offender’s possession, or under the offender’s control, as a result of that request.”—(Chris Bryant.)
This amendment provides that deprivation orders can be made under the Sentencing Code in connection with an offence under new section 66F of the Sexual Offences Act 2003 (requesting the creation of purported intimate image of adult).
Schedule 11
Further minor provision about data protection
Amendment made: 35, page 225, line 13, at end insert—
“21A In section 170(2)(a) (unlawful obtaining etc of personal data), after “preventing” insert “, investigating”.
21B (1) Section 171 (re-identification of de-identified personal data) is amended as follows.
(2) In subsection (3)(a), after “preventing” insert “, investigating”.
(3) In subsection (6)(a), after “preventing” insert “, investigating”.”—(Chris Bryant.)
This amendment adds references to investigating crime to existing references in the Data Protection Act 2018 to detecting or preventing crime. (There are similar amendments in paragraphs 23, 26(2) and (3), 27(2) and (3) and 29 of Schedule 11.)
Title
Amendment made: 36, line 18, after “services;” insert
“to make provision about works protected by copyright and the development of artificial intelligence systems;”.—(Chris Bryant.)
This amendment is consequential on NC16 and NC17.
Third Reading
18:50
Peter Kyle Portrait The Secretary of State for Science, Innovation and Technology (Peter Kyle)
- View Speech - Hansard - - - Excerpts

I beg to move, That the Bill be now read the Third time.

The House has worked incredibly hard to get the Bill to where it is today. It is a relief that after so many attempts to get this piece of legislation through, over such a long period of time and multiple Governments, we will finally get it across the line. I put on record my thanks to the people who have got us to where we are, including the Members from across the House who have contributed in sincere and passionate ways during today’s debate on Report, and now on Third Reading.

I also put on record my very sincere thanks to my hon. Friend the Member for Rhondda and Ogmore (Chris Bryant), the Minister responsible. He has seen the Bill through assiduously, persistently, and with passion at all times to make sure that it passes through Parliament and is out there, benefiting the people of Britain. I thank him, and also officials in my Department for Science, Innovation and Technology. There are certain officials who have been working on this Bill since 2022 and who have put their life and soul into it, often seven days a week. Their dedication to getting this piece of legislation through should be recognised by Members right across the House—it certainly is by me. I thank them very much.

I hope the House has noticed that the Government have tabled amendments to improve the Bill until the last moment. By making it an offence to request the creation of deepfake intimate images without consent and empowering the courts to deprive offenders of images and devices containing them, we will ensure consistency in our approach to protecting women and girls from that vile, demeaning form of abuse.

To conclude in the short time I have available, the Bill will make life better for working people right across our country.

Munira Wilson Portrait Munira Wilson
- Hansard - - - Excerpts

Will the Secretary of State give way?

Peter Kyle Portrait Peter Kyle
- Hansard - - - Excerpts

I am afraid that in the time I have, I cannot give way. I want to do the Opposition spokesperson, the hon. Member for Havant (Alan Mak), the courtesy of allowing him to have his say in the remaining couple of minutes.

The Bill will give working people across our country a stronger economy, better public services, and more time to do the things they like with the people they love. I look forward to working with people, including hon. Members from across the House, to resolve as quickly as possible any outstanding issues that may arise after the Bill passes. The version of the Bill that is before us today is its third substantive iteration. It follows two failed attempts by the previous Government, the first of which started back in July 2022. It is time that we got this done; for far too long, our citizens and businesses have paid the price of the failure to deliver data reform, and we cannot expect them to put up with it any longer. Today, we have an opportunity to finally get it right. The Bill that is before us today will remove the brakes that are holding back our country.

18:54
Alan Mak Portrait Alan Mak (Havant) (Con)
- View Speech - Hansard - - - Excerpts

I thank the Secretary of State for allowing time for an Opposition response. I begin by thanking hon. and right hon. Members across the House for their contributions to this Bill over many months, and I thank officials in the Department across several Governments and officials in Parliament. May I thank the entire team on our Benches, but in particular my hon. Friend the Member for Runnymede and Weybridge (Dr Spencer) for his extremely hard work and our senior researcher Sophie Thorley?

The Conservatives left the Government with a data Bill that would have improved Britain’s position as a leading tech-enabled economy and society. However, in Labour’s hands, the Government have delivered only confusion and failure. A wide range of amendments have been tabled to Labour’s Bill, highlighting key issues that required both leadership and agility from the Government, but they have failed on each of those areas. On AI and copyright, they let down our content creators. On sex and gender, they let down women and girls. On social media safety, they let down our children. The last Conservative Government turned Britain into a leading tech power, and our original Bill built on those achievements. Labour’s Bill today takes the country backwards, and our country deserves so much better.

Question put and agreed to.

Bill accordingly read the Third time and passed.