Data (Use and Access) Bill [Lords] Debate
Chris Bryant (Labour - Rhondda and Ogmore)
Commons Chamber

I beg to move, That the clause be read a Second time.
With this it will be convenient to discuss the following:
Government new clause 17—Report on the use of copyright works in the development of AI systems.
New clause 1—Age of consent for social media data processing—
“(1) The UK GDPR is amended as follows.
(2) In Article 8 of the UK GDPR (Conditions applicable to child's consent in relation to information society services), after paragraph 1 insert—
‘(1A) References to 13 years old in paragraph 1 shall be read as 16 years old in the case of social networking services processing personal data for the purpose of delivering personalised content, including targeted advertising and algorithmically curated recommendations.
(1B) For the purposes of paragraph 1A “social networking services” means any online service that—
(a) allows users to create profiles and interact publicly or privately with other users, and
(b) facilitates the sharing of user-generated content, including text, images, or videos, with a wider audience.
(1C) Paragraph 1A does not apply to—
(a) educational platforms and learning management systems provided in recognised educational settings, where personal data processing is solely for educational purposes.
(b) health and well-being services, including NHS digital services, mental health support applications, and crisis helplines, where personal data processing is necessary for the provision of care and support’”.
This new clause would raise the age for processing personal data in the case of social networking services from 13 to 16.
New clause 2—Compliance with UK copyright law by operators of web crawlers and general-purpose AI models—
“(1) The Secretary of State must by regulations make provision (including any such provision as might be made by Act of Parliament), requiring the operators of web crawlers and general-purpose artificial intelligence (AI) models whose services have links with the United Kingdom within the meaning of section 4(5) of the Online Safety Act 2023 to comply with United Kingdom copyright law, including the Copyright, Designs and Patents Act 1988, regardless of the jurisdiction in which the copyright-relevant acts relating to the pre-training, development and operation of those web crawlers and general-purpose AI models take place.
(2) Provision made under subsection (1) must apply to the entire lifecycle of a general-purpose AI model, including but not limited to—
(a) pre-training and training,
(b) fine tuning,
(c) grounding and retrieval-augmented generation, and
(d) the collection of data for the said purposes.
(3) The Secretary of State must lay before Parliament a draft of the statutory instrument containing regulations under subsection (1) within six months of the day on which this Act is passed and the regulations are subject to the affirmative procedure.”
This new clause requires web crawlers and general-purpose AI models with UK links to comply with UK copyright law across all stages of AI development.
New clause 3—Transparency of crawler identity, purpose and segmentation—
“(1) The Secretary of State must by regulations make provision requiring operators of web crawlers and general-purpose artificial intelligence (AI) models whose services have links with the United Kingdom within the meaning of section 4(5) of the Online Safety Act 2023 to disclose information regarding the identity of crawlers used by them or by third parties on their behalf, including but not limited to—
(a) the name of the crawler,
(b) the legal entity responsible for the crawler,
(c) the specific purposes for which each crawler is used,
(d) the legal entities to which operators provide data scraped by the crawlers they operate, and
(e) a single point of contact to enable copyright owners to communicate with them and to lodge complaints about the use of their copyrighted works.
(2) The information disclosed under subsection (1) must be available on an easily accessible platform and updated at the same time as any change.
(3) The Secretary of State must by regulations make provision requiring operators of web crawlers and general-purpose AI models to deploy distinct crawlers for different purposes, including but not limited to—
(a) web indexing for search engine results pages,
(b) general-purpose AI model pre-training, and
(c) retrieval-augmented generation.
(4) The Secretary of State must by regulations make provision requiring operators of web crawlers and general-purpose AI models to ensure that the exclusion of a crawler by a copyright owner does not negatively impact the findability of the copyright owner’s content in a search engine.
(5) The Secretary of State must lay before Parliament a draft of the statutory instrument containing regulations under this section within six months of the day on which this Act is passed and the regulations are subject to the affirmative procedure.”
This new clause requires operators of web crawlers and AI models to disclose their identity, purpose and data-sharing practices, and to use separate crawlers for different functions.
New clause 4—Transparency of copyrighted works scraped—
“(1) The Secretary of State must by regulations make provision requiring operators of web crawlers and general-purpose artificial intelligence (AI) models whose services have links with the United Kingdom within the meaning of section 4(5) of the Online Safety Act 2023 to disclose information regarding text and data used in the pre-training, training and fine-tuning of general purpose AI models, including but not limited to—
(a) the URLs accessed by crawlers deployed by them or by third parties on their behalf or from whom they have obtained text or data,
(b) the text and data used for the pre-training, training and fine-tuning, including the type and provenance of the text and data and the means by which it was obtained, and
(c) information that can be used to identify individual works, and
(d) the timeframe of data collection.
(2) The disclosure of information under subsection (1) must be updated on a monthly basis in such form as the regulations may prescribe and be published in such manner as the regulations may prescribe so as to ensure that it is accessible to copyright owners upon request.
(3) The Secretary of State must lay before Parliament a draft of the statutory instrument containing regulations under subsection (1) within six months of the day on which this Act is passed and the regulations are subject to the affirmative procedure.”
This new clause mandates transparency about the sources and types of data used in AI training, requiring monthly updates accessible to copyright owners.
New clause 5—Enforcement—
“(1) The Secretary of State must by regulations make provision requiring the Information Commission (under section 114 of the Data Protection Act 2018) (‘the Commissioner’) to monitor and secure compliance with the duties by an operator of a web crawler or general-purpose artificial intelligence (AI) model whose service has links with the United Kingdom within the meaning of section 4(5) of the Online Safety Act 2023 (‘a relevant operator’), including but not limited to the following—
(a) the regulations must provide for the Commissioner to have the power by written notice (an ‘information notice’) to require a relevant operator to provide the Commissioner with information that the Commissioner reasonably requires for the purposes of investigating a suspected failure to comply with the duties;
(b) the regulations must provide for the Commissioner to have the power by written notice (an ‘assessment notice’) to require and to permit the Commissioner to carry out an assessment of whether a relevant operator has complied or is complying with the duties and to require a relevant operator to do any of the acts set out in section 146(2) of the Data Protection Act 2018;
(c) the regulations must provide that where the Commissioner is satisfied that a relevant operator has failed, or is failing to comply with the duties, the Commissioner may give the relevant operator a written notice (an ‘enforcement notice’) which requires it—
(i) to take steps specified in the notice, or
(ii) to refrain from taking steps specified in the notice;
(d) the regulations must provide that where the Commissioner is satisfied that a relevant operator has failed or is failing to comply with the duties or has failed to comply with an information notice, an assessment notice or an enforcement notice, the Commissioner may, by written notice (a ‘penalty notice’), require the person to pay to the Commissioner an amount in sterling specified in the notice, the maximum amount of the penalty that may be imposed by a penalty notice being the ‘higher maximum amount’ as defined in section 157 of the Data Protection Act 2018; and
(e) the regulations may provide for the procedure and rights of appeal in relation to the giving of an information notice, an assessment notice, an enforcement notice or a penalty notice.
(2) The regulations must provide that any failure to comply with the duties by a relevant operator shall be directly actionable by any copyright owner who is adversely affected by such failure, and that such copyright owner will be entitled to recover damages for any loss suffered and to injunctive relief.
(3) The regulations must provide that the powers of the Commissioner and the rights of a copyright owner will apply in relation to a relevant operator providing a service from outside the United Kingdom (as well as to such a service provided from within the United Kingdom).
(4) The Secretary of State must lay before Parliament a draft of the statutory instrument containing the regulations under this section within six months of the day on which this Act is passed and the regulations are subject to the affirmative procedure.”
This new clause grants the Information Commissioner enforcement powers to ensure compliance with AI and web crawler transparency rules, including penalties for breaches.
New clause 6—Technical solutions—
“(1) The Secretary of State must conduct a review of the technical solutions that may be adopted by copyright owners and by the operators of web crawlers and general-purpose artificial intelligence (AI) models whose services have links with the United Kingdom within the meaning of section 4(5) of the Online Safety Act 2023 to prevent and to identify the unauthorised scraping or other unauthorised use of copyright owners’ text and data.
(2) Within 18 months of the day on which this Act is passed, the Secretary of State must report on such technical solutions and must issue guidance as to the technical solutions to be adopted and other recommendations for the protection of the interests of copyright owners.”
This new clause requires the Secretary of State to review and report on technical measures to prevent unauthorised data scraping by web crawlers and AI models.
New clause 7—Right to use non-digital verification services—
“(1) This section applies when an organisation—
(a) requires an individual to use a verification service; and
(b) uses a digital verification service for that purpose.
(2) Where it is reasonably practicable for an organisation to offer a non-digital method of verification, the organisation must—
(a) make a non-digital alternative method of verification available to any individual required to use a verification service; and
(b) provide information about digital and non-digital methods of verification to those individuals before verification is required.”
This new clause would create a duty upon organisations to support digital inclusion by offering non-digital verification services where practicable.
New clause 8—Data Vision and Strategy—
“Within six months of Royal Assent of this Act, the Secretary of State must publish a ‘Data Vision and Strategy’ which outlines—
(a) the Government’s data transformation priorities for the next five years; and
(b) steps the Government will take to ensure the digitisation of Government services.”
New clause 9—Departmental Board Appointments—
“(1) Within six months of the day on which this Act is passed—
(a) Government departments;
(b) NHS England; and
(c) NHS trusts
shall appoint to their departmental board or equivalent body at least one of the following—
(i) Chief Information Officer;
(ii) Chief Technology Officer;
(iii) Chief Digital Information Officer;
(iv) Service Transformation Leader; or
(v) equivalent postholder.
(2) The person or persons appointed under subsection (1) shall provide an annual report on the progress of the department or body towards the Government’s Data Vision and Strategy.”
This new clause would require digital leaders to be represented at executive level within Government departments and other bodies.
New clause 10—Data use in Public Service Delivery Review—
“(1) The Secretary of State must, every 12 months, lay before Parliament a ‘Data use in Public Service Delivery Review’.
(2) The Data use in Public Service Delivery Review shall include, but is not limited to, assessment of the steps being taken to—
(a) improve the Government’s use of data in public service delivery over the previous 12 months;
(b) expand the use of data to support increased and improved digital services in public service delivery;
(c) improve expertise and digital talent within Government departments to help expand the use of data for public service delivery; and
(d) facilitate and regulate for better use of data in the delivery of public services.”
This new clause would require an annual assessment by the Secretary of State to examine the steps being taken to facilitate and regulate the use of data in the delivery of public services using digital and online technologies.
New clause 11—Access to a deceased child’s social media data—
“(1) Where a person under 18 years of age has died, a parent or legal guardian (the ‘requestor’) may request from any internet service provider (ISP) the child’s user data from up to 12 months prior to the date of death.
(2) The ISP must provide a copy of the requested data, or direct account access, upon verification of the requestor’s identity and relationship to the deceased person, and no court order shall be required for such disclosure.
(3) ‘User data’ includes all content, communications, or metadata generated by or associated with the deceased person’s online activity, including stored messages and posts, except where the deceased person had explicitly directed otherwise prior to death.
(4) The ISP may refuse or redact specific data only where—
(a) disclosure would unduly infringe the privacy rights of another individual,
(b) the deceased person had explicitly opted out before death,
(c) there is a conflicting court order, or
(d) a serious risk to public safety or national security would result.
(5) In providing data under this section, the ISP must comply with data protection legislation.
(6) This section constitutes a lawful basis for disclosure under Article 6 of the UK GDPR.
(7) The Secretary of State may, by regulations subject to the affirmative resolution procedure—
(a) provide guidance on verifying parent or guardian status,
(b) clarify any additional grounds for refusal, and
(c) prescribe safeguards to protect third-party confidentiality.
(8) For the purposes of this section—
‘internet service provider (ISP)’ includes any provider of social media, messaging, or other online platforms; and
‘data protection legislation’ has the meaning given in section 51 of this Act.”
This new clause would allow parents of a deceased minor to obtain that child’s social media data without a court order, subject to privacy safeguards for third parties.
New clause 12—Raising the minimum age at which users can consent to processing of personal data—
“(1) The UK GDPR is amended in accordance with subsection (2) of this section.
(2) After paragraph 1 of Article 8 of the UK GDPR (Conditions applicable to child’s consent in relation to information society services) insert—
‘(1A) References to “13 years old” and “age of 13 years” in paragraph 1 shall be read as “16 years old” and “age of 16 years” in the case of processing of personal data.
(1B) Paragraph (1A) does not apply to—
(a) platform systems and services operated where the primary purpose of processing of personal data is for the advancement of a charitable purpose as defined in the Charities Act 2011;
(b) publicly owned platform systems and services operated for the primary purpose of law enforcement, child protection, education, or healthcare;
(c) cases in which the Secretary of State determines it is in the best interests of the child for an operator to accept the child’s own consent.’”
This new clause would raise the age for processing personal data from 13 to 16 years old with certain exceptions for charitable purposes and child safety.
New clause 13—Code of practice for the use of children’s educational data—
“(1) Within 6 months of the passage of this Act, the Information Commissioner must prepare a code of practice which contains such guidance as the Information Commissioner considers appropriate on the processing of children’s data in connection with the provision of education.
(2) Guidance under subsection (1) must consider—
(a) all aspects of the provision of education including learning, school management, and safeguarding;
(b) all types of schools and learning settings in the development of guidance;
(c) the use of AI systems in the provision of education;
(d) the impact of profiling and automated decision-making on children’s access to education opportunities;
(e) children’s consent to the way their personal data is generated, collected, processed, stored and shared;
(f) parental consent to the way their children’s personal data is being generated, collected, processed, stored and shared;
(g) the security of children’s data;
(h) the exchange of information for safeguarding purposes.”
This new clause requires the Information Commissioner to produce a code of practice for accessing children’s educational data.
New clause 14—Transparency of business and customer data used in training Artificial Intelligence models—
“(1) The Secretary of State must by regulations make provision requiring operators of general-purpose AI models to disclose upon request information about business data and customer data processed for the purposes of pre-training, training, fine-tuning, and retrieval-augmented generation in an AI model, or any other data input to an AI model.
(2) Business data and customer data must include, but is not limited to, the whole or any substantial part of a literary, dramatic, musical or artistic work, sound recording, film or broadcast included in any text, images and data used for the purposes set out in subsection (1).
(3) Information disclosable under subsection (1) must include but is not limited to:
(i) Digital Object Identifiers and file names;
(ii) Details of how the work was identified, including metadata;
(iii) The source from which it was scraped or otherwise obtained; and
(iv) The URLs accessed by crawlers deployed by operators, or by third parties, to obtain the data.
(4) The owner of rights in any individual work identifiable in information disclosed under subsection (1) must, upon request to the relevant operator, be provided with information as to whether and how the operator has complied with the laws of the United Kingdom in respect of that work.
(5) The Secretary of State must lay before Parliament a draft of the statutory instrument containing regulations under subsection (1) within six months of the day on which this Act is passed and the regulations are subject to the affirmative procedure.”
This new clause would require the Secretary of State to set out transparency provisions requiring generative AI developers to provide information to enable individuals and creative businesses to determine whether their data, works and other subject matter have been used in training datasets.
New clause 15—Complaints procedure for vulnerable individuals—
“(1) The Data Protection Act 2018 is amended in accordance with subsections (2) to (4).
(2) After section 165(3) insert—
‘(3A) For complaints under subsection (2), the Information Commissioner must provide appropriate complaints-handling procedures for—
(a) victims of modern slavery,
(b) victims of domestic abuse,
(c) victims of gender-based violence, or
(d) data subjects otherwise in a position of vulnerability.
(3B) Procedures under subsection (3A) must include—
(a) appropriate support for vulnerable individuals;
(b) provision of specialised officers for sensitive cases;
(c) signposting to support services;
(d) provision of a helpline;
(e) de-escalation protocols.’
(3) After section 166(1)(c) insert—
‘(d) fails to investigate a complaint appropriately or take adequate action to remedy findings of inadequacy.’
(4) After section 166(2)(b), insert—
‘(c) to use formal powers as appropriate to investigate a complaint and to remedy any findings of inadequacy, unless the request from the data subject is manifestly unfounded or excessive.’”
This new clause would require the Information Commission to introduce a statutory complaints procedure for individuals in a position of vulnerability and new grounds of appeal to an Information Tribunal.
New clause 18—Report on the introduction of a public interest test for allowing access to NHS data by third-parties and companies—
“(1) The Secretary of State must within six months of the passing of this Act—
(a) prepare and publish a report examining the need for a specific statutory public interest test to determine and safeguard access to NHS data by third-parties and companies.
(b) within 28 days of a report being laid under subsection (1) the Government must schedule a debate and votable motion on the findings of the report in each House.
(2) The report must consider—
(a) whether and in what situations it would be necessary, proportionate and lawful to share NHS data with third-parties and companies when the interests and risks to both the individual and/or the public are considered.
(b) when it would be in the public interest and in the best interests of patients and the NHS to allow access by third-parties and companies to NHS data in relation to the provision of health care services and for promotion of health.”
This new clause would require the Secretary of State to produce a report on the introduction of a public interest test for allowing access to NHS data by third-parties and companies and then to schedule a debate on it in each House.
New clause 19—Secretary of State’s duty to review the age of consent for data processing under the UK GDPR—
“(1) The Secretary of State must, within 12 months of Royal Assent of this Act, have conducted a review and published a report into the operation of Article 8 (Conditions applicable to child's consent in relation to information society services) of the UK GDPR in relation to the data processed by social media platforms of children under the age of 16.
(2) As part of this review, the Secretary of State must consider—
(a) the desirability of increasing the digital age of consent under the UK GDPR from 13 to 16, taking into account the available evidence in relation to the impact of social media platforms on the educational, social and emotional development of children; and
(b) the viability of increasing the digital age of consent under Article 8 of the UK GDPR in relation to specific social media platforms which are shown by the evidence to be unsuitable for use by children under the age of 16.
(3) Within six months of the publication of the report under subsection (1), the Secretary of State must lay a plan before Parliament for raising the digital age of consent to 16 through amendments to Article 8 GDPR, unless the review concludes that such changes are unnecessary.”
New clause 20—Duties of the Secretary of State in relation to the use by web-crawlers and artificial intelligence models of creative content—
“The Secretary of State must—
(a) by 16 September 2025, issue a statement, by way of a copyright notice issued by the Intellectual Property Office or otherwise, in relation to the application of the Copyright, Designs and Patents Act 1988 to activities conducted by web-crawlers or artificial intelligence models which may infringe the copyright attaching to creative works;
(b) by 16 September 2025, lay before Parliament a report which includes a plan to help ensure proportionate and effective measures for transparency in the use of copyright materials in training, refining, tuning and generative activities in AI;
(c) by 16 September 2025, lay before Parliament a report which includes a plan to reduce barriers to market entry for start-ups and smaller AI enterprises on use of and access to data;
(d) by 1 July 2026, publish a technological standard for a machine-readable digital watermark for the purposes of identifying licensed content and relevant information associated with the licence.”
New clause 21—Directions to public authorities on recording of sex data—
“(1) The Secretary of State must, within three months of the passage of this Act, issue regulations relating to the code of practice set out in section 49 of this Act which require public authorities to—
(a) collect, process and retain sex data only where it is lawful to do so in accordance with data protection legislation;
(b) request and record sex data accurately, in every circumstance where sex data is collected, in accordance with the following category terms and definitions—
(i) ‘Sex’ meaning male or female only based on ‘sex at birth’, ‘natal sex’ or ‘biological sex’ (these terms carrying the same meaning and capable of being used interchangeably); and,
(ii) in addition, where it is lawful to do so in accordance with data protection legislation and the Gender Recognition Act 2004, ‘Acquired Gender’ meaning male or female only, as recorded on a gender recognition certificate issued in accordance with the Gender Recognition Act 2004;
(c) have updated relevant organisation guidance to stipulate that, where sex data is collected, this must be done in accordance with the definitions set out by subsection (1)(b) within three months of these regulations coming into force;
(d) have conducted a review of the accuracy of data held in relation to the sex of data subjects to ensure that the data is accurate in recording sex at birth and, where relevant and collected lawfully, acquired gender as recorded on a gender recognition certificate within 12 months of these regulations coming into force;
(e) have taken every reasonable step to ensure that any data held in relation to the sex and, where relevant and collected lawfully, acquired gender as recorded on a gender recognition certificate of a data subject that is found to be inaccurate is rectified or erased within 18 months of these regulations coming into force; and
(f) have produced and submitted to the Secretary of State a report setting out the findings of its review in relation to the matters set out by subsection (1)(d) and, where relevant, a description of the steps taken to ensure that the data held by the relevant public authority is accurate within the definitions set out by subsection (1)(b) within 18 months of these regulations coming into force.
(2) The Secretary of State may, on receipt of a report in accordance with subsection (1)(f), instruct a public authority to take any further remedial steps within a specified timeframe reasonably necessary to ensure the accuracy of the sex and acquired gender data held by the relevant public authority.
(3) The Secretary of State must, within one month of the passage of this Act, establish and maintain a register of public authorities approved to act as sources of data relating to the attribute of sex for persons providing digital verification services.
(4) The register in subsection (3) must be published on the website of the Office for Digital Identities & Attributes or any successor body.
(5) Until such time as a public authority is added to the register under subsection (3), persons providing digital verification services may only obtain data on the sex of an individual requesting the provision of digital verification services from the record of births held by the General Register Office in accordance with subsection (6).
(6) Information supplied by the General Register Office pursuant to subsection (5) must specify sex as recorded at birth, as well as any subsequent corrections to the register in the field marked ‘Sex’.
(7) The Secretary of State may, from time to time, add public authorities to the register as under subsection (3) only upon being satisfied on the basis of a report issued under subsection (1)(f), or satisfaction of such further steps required by the Secretary of State under subsection (2) that the data held by the relevant public authority in relation to sex and, where relevant, acquired gender as recorded on a gender recognition certificate, as defined in subsection (1)(b), is accurate.”
This new clause requires the Secretary of State to issue regulations relating to the code of practice in section 49 requiring public authorities to record sex data in line with these regulations when data are collected. This clause is linked to amendments 39 and 40.
New clause 22—Recording of ethnicity data for the purposes of public service delivery—
“(1) The Secretary of State must make regulations which make provision for the collection of individual ethnicity data in the process of public service delivery and associated data collection.
(2) The regulations set out by subsection (1) must make provision for ethnic classifications to include Jewish and Sikh categories.
(3) The Secretary of State must lay before both Houses of Parliament a draft of the statutory instrument containing regulations under this section within six months of the day on which this Act is passed which will be subject to the affirmative procedure.”
This new clause requires the Secretary of State to make statutory provision for individual ethnicity data to be collected in the process of public service delivery.
New clause 23—Recording of ethnicity data on the Register of Births and Deaths—
“(1) The Secretary of State must make regulations which make provision for the collection of individual ethnicity data during birth and death registration.
(2) The regulations set out by subsection (1) must make provision for ethnic classifications to include Jewish and Sikh categories.
(3) The Secretary of State must lay before both Houses of Parliament a draft of the statutory instrument containing regulations under this section within six months of the day on which this Act is passed which will be subject to the affirmative procedure.”
This new clause requires the Secretary of State to make statutory provision for individual ethnicity data to be able to be collected during birth and death registration.
Government amendments 11 to 32.
Amendment 39, in clause 45, page 42, line 30, at the beginning insert—
“Save in respect of data relating to sex,”.
This amendment is consequential on NC21.
Amendment 40, page 43, line 15, at end insert—
““gender recognition certificate” means a gender recognition certificate issued in accordance with the Gender Recognition Act 2004.”
This amendment is consequential on NC21.
Government amendments 1 to 8.
Amendment 37, in clause 67, page 75, line 24, at end insert—
“(2A) For the purposes of paragraph 2, ‘scientific research’ means creative and systematic work undertaken in order to increase the stock of knowledge, including knowledge of humankind, culture and society, and to devise new applications of available knowledge.
(2B) To meet the reasonableness test in paragraph 2, the activity being described as scientific research must be conducted according to appropriate ethical, legal and professional frameworks, obligations and standards.”
This amendment incorporates clarifications to help reduce potential misuse of the scientific research exception. The first is a definition of scientific research based on the Frascati Manual. The second is a requirement that research be conducted in line with frameworks and standards in the UKRI Code of Practice for Research.
Amendment 41, in clause 80, page 95, line 19, at end insert—
“3. For the purposes of paragraph 1(a), a human’s involvement is only meaningful if they are a natural person with the necessary competence, authority and capacity to understand, challenge and alter the decision.”
See explanatory statement for Amendment 44.
Amendment 45, page 96, line 2, at end insert—
“5. Consent in accordance with paragraph 2 cannot be given by persons under the age of 18 where—
(a) the automated decision-making is likely to produce legal or similarly significant effects on the child, or
(b) the processing involves the profiling of a child to determine access to essential services, education, or other significant opportunities.
6. The controller shall not be obliged to maintain, acquire or process additional information in order to identify the age of a data subject for the sole purpose of complying with this Regulation.
7. A significant decision may not be taken based solely on automated processing, if the data subject is a child or may be a child unless the provider is satisfied that the decision is in, and compatible with, the best interests of a child, taking into account their rights and development stage, authorised by law to which the controller is subject, and after suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests are made publicly available.
8. Profiling or solely automated processing of children’s data may not occur for the purposes of targeted advertising or behavioural analysis.”
This amendment ensures that automated decision-making cannot take place in circumstances where it would affect a child’s access to significant opportunities or would not be in their best interests, as well as protections against practices such as behavioural analysis.
Amendment 46, page 96, leave out lines 13 to 19 and insert—
“(a) communicate to the data subject before and after the decision is taken the fact that automated decision-making is involved in the decision, the extent of any human involvement, and the availability of safeguards under this Article;
(b) provide the data subject with information about decisions described in paragraph 1 taken in relation to the data subject including meaningful information about the logic involved, the significance and the envisaged consequences of such processing for the data subject, and a personalised explanation for the decision;
(c) enable the data subject to make representations about such decisions;
(d) enable the data subject to obtain human intervention on the part of the controller in relation to such decisions;
(e) enable the data subject to contest such decisions.
3. For the purposes of paragraph 2(b), a personalised explanation must—
(a) be clear, concise and in plain language of the data subject’s choice in a readily available format;
(b) be understandable, and assume limited technical knowledge of algorithmic systems;
(c) address the reasons for the decision and how the decision affects the individual personally, which must include—
(i) the inputs, including any personal data;
(ii) parameters that were likely to have influenced or were decisive to the decision, or a counterfactual of what change would have resulted in a more favourable outcome;
(iii) the sources of parameters and inputs;
(d) be available free of charge and conveniently accessible to the data subject, free of deceptive design patterns.
4. Where the safeguards apply after a decision is made, the controller must give effect to data subject requests as soon as reasonably practicable and within one month of the request.
5. The controller must ensure the safeguards are fully in place and complete a data protection impact assessment under Article 35 before a decision under Article 22A is taken, documenting their implementation of the safeguards in addition to the requirements of that Article.
6. The controller must publish details of their implementation of the safeguards and how data subjects can make use of them.”
This amendment would ensure that data subjects are informed of automated decisions made about them in a timely way, and that that explanation is personalised to enable them to understand why it was made. It also ensures processors are incentivised to put the safeguards in place before commencing automated decision-making.
Amendment 42, page 96, line 23, after “Article 22A(1)(a),” insert
“and subject to Article 22A(3)”.
See explanatory statement for Amendment 44.
Amendment 43, page 97, line 19, at end insert—
“(3) To qualify as meaningful human involvement, the review must be performed by a person with the necessary competence, training, authority to alter the decision and analytical understanding of the data.”
See explanatory statement for Amendment 44.
Amendment 44, page 98, line 31, after “and 50C(3)(c),” insert “and subject to 50A(3)”.
This amendment and Amendments 41, 42 and 43 would make clear that in the context of new Article 22A of the UK GDPR, for human involvement to be considered as meaningful, the review must be carried out by a competent person who is empowered to change the decision in practice.
Amendment 9, in clause 81, page 100, line 7, at end insert—
“Age assurance
1C. Information society services which are likely to be accessed by children must use highly effective age verification or age estimation measures for the purpose of delivering on children’s higher protection matters.”
This amendment requires services which are likely to be accessed by children to use highly effective age verification measures.
Amendment 38, in clause 86, page 103, line 22, at end insert—
“(2A) Where personal data is processed for the purposes of scientific research under section 87(4) of the 2018 Act (‘reuse’), the processor or controller must publish details of the data sources used.
(2B) These details must as a minimum include a description of the scientific research, the provenance and method of acquisition of the personal data being reused, the original lawful basis for processing, the number of data subjects affected, and whether the data subjects have been notified of the reuse.
(2C) The processor or controller must notify the Information Commission when processing data for the purposes of scientific research under section 87(4) of the 2018 Act with the same details.”
This amendment ensures transparency for the use of scientific research exemptions by requiring those reusing personal data to publish details of that reuse and notify the Information Commission of that reuse.
Government amendments 33 and 34.
Amendment 10, in schedule 7, page 201, line 5, at end insert—
“(1B) A third country cannot be considered adequate or capable of providing appropriate safeguards by any authority where there exists no credible means to enforce data subject rights or obtain legal remedy.
(1C) For the purposes of paragraph 1A, the Secretary of State must make a determination as to whether credible means are present in a third country.
(1D) In making a determination regarding credible means, the Secretary of State must have due regard to the view of the Information Commissioner.
(1E) Credible means do not exist where the Secretary of State considers that any of the following are true:
(a) judicial protection of persons whose personal data is transferred to that third country is insufficient;
(b) effective administrative and judicial redress are not present;
(c) effective judicial review mechanisms do not exist; and
(d) there is no statutory right to effective legal remedy for data subjects.”
The amendment would prohibit personal data transfer to countries where data subject rights cannot be adequately upheld and prohibit private entities from using contracts to give the impression that data security exists.
Government amendments 35 and 36.
Earlier I appeared as a Department for Culture, Media and Sport Minister, and now I appear as a Department for Science, Innovation and Technology Minister. I hate to embarrass Members, but they will get two bouts of me today. I will start with the Government amendments, and then once I have heard the arguments from Members advancing other amendments, I will speak to those later in the debate. If I do not cover subjects in this initial speech, I will get back to them later.
The right hon. Gentleman is enticing me. I hope he will be nicer to me than the Chair of the Culture, Media and Sport Committee, the hon. Member for Gosport (Dame Caroline Dinenage), was earlier.
I am sure that the Chair of the Committee and I will always be nice to the Minister. I was only going to say that I have experienced the slight schizophrenia he has referred to in holding roles in the Department for Science, Innovation and Technology and in DCMS at the same time. Although he is appearing as a DSIT Minister this afternoon, can he assure the House that he will not forget his responsibilities as a DCMS Minister for the creative industries?
I model myself in all things on the right hon. Gentleman, apart from the fact that I left the Tory party many years ago, and it is about time that he came over to the Labour Benches.
It is not too late.
No, the right hon. Member for Maldon (Sir John Whittingdale) could come over here; I am not going back over there.
The point I was going to make is that I am fully cognisant of my duties. I think the right hon. Gentleman was referring to the artificial intelligence copyright issues that we will be addressing fairly shortly. I like the fact that I am in both Departments, because it means I can bring the knowledge of both sectors to bear on each other. If we are lucky, and if we work hard at it, I hope that I will be able to persuade him that we can come to a win-win solution. As he knows, this is not easy. When I had my first meeting with him after I was appointed in the post, he said, “This is not an easy area to resolve.” I hope I am not breaking a confidence—but he is smiling.
I have a large number of topics to cover, and I am conscious that many Members will think this is the data Bill, when we will actually be dealing with an awful lot of subjects this afternoon that do not feel as if they have anything to do with the measures in the original version of the Bill brought forward previously by the right hon. Gentleman. I hope that Members will bear with me. I intend to address the Government’s amendments as follows: first, AI and copyright; secondly, deepfakes; thirdly, the national underground assets register; and then smart data and other minor and technical amendments.
I will start with AI and intellectual property. As Members know, it was never the Government’s intention to legislate on that issue at all in this Bill. It is a complex and important issue, which is why we have consulted on a package of measures. That consultation had more than 11,500 responses, which we are still considering. Several hon. Members have said to me, “Will you remove the opt-out clause in the Bill?” I need to make it absolutely clear that no such opt-out clause is in the Bill. We never laid one in the Bill, so there is not an opt-out clause to remove.
As Members will also know, the Lords inserted a set of amendments on AI and copyright, which we removed in Committee. They reappear on the amendment paper today as new clauses 2 to 6, tabled by the hon. Member for Harpenden and Berkhamsted (Victoria Collins). A similar measure has been tabled as new clause 14 by my hon. Friend the Member for Leeds Central and Headingley (Alex Sobel).
We oppose all these new clauses for several reasons. First, they pre-empt the results of the consultation. It must surely be better to legislate on this complex subject in the round rather than piecemeal. The amendments are also unworkable. New clause 5, for instance, would make the Information Commissioner the regulator of transparency requirements, but the Information Commissioner’s Office has neither the skills nor the resources to perform that function. Obviously, transparency requirements without an effective enforcement mechanism are worse than useless, which means the other clauses on transparency are also unworkable in this context. The new clauses also fail to address some of the most important questions in this area. They effectively legislate piecemeal rather than in the round. Whenever Parliament has done that in the past, it has rued the day, and I think the same is true today.
Does the Minister not understand the urgency? Generative AI is ingesting our whole creative catalogue as we speak. We need something in place now. We cannot wait a year for reports or three years for legislation; we need action now. Does he not understand that something needs to be brought forward here today? These amendments offer that.
I do not think the amendments do offer that, because I do not think they work. We need to legislate in the round, as I say, and not piecemeal. I point out to the hon. Member that there is something of a two-edged sword here. I have been repeatedly told—and I understand the point—that there is no legal uncertainty as to the copyright status of works that are being scraped. At the same time, people are saying they want legislative change. Those two things cannot be true at the same time. I am determined to get us to a better place on this, as I will perhaps explain in a couple of moments.
I think there is an intention to push new clause 2 to a vote later, which I urge hon. Members not to do, although I do not always get my way. New clause 2 basically says that people should comply with the law. I mean, it is a simple fact: people should comply with the law. We cannot legislate to tell people that they should comply with the law; the law is the law. If none of these amendments is passed today, the law will remain as it is today and copyright law in the UK will be robust and clear.
For the absolute avoidance of doubt, some people have talked to me about text and data mining exceptions, which, as Members will know, exist, for instance, in the European Union. There is a text and data mining exception already in UK law. It was introduced in 2014 via a statutory instrument, which added section 29A to the Copyright, Designs and Patents Act 1988. However, it is an exception for the sole purpose of non-commercial research. I think that that is absolutely clear in law, and I do not think it needs any clarifying.
I understand the point that the Minister is making about existing copyright law, but, as he has said, the Government opened a consultation that has, for many of our constituents who work in the creative industries, prefigured a substantial change in copyright when it comes to AI. Does he see the merit that many of us see in making it clear that the principles behind copyright from which our creative constituents should be able to benefit, and which should protect their own works, are what is at stake here? Having said that the existing law stands, will he at least make a commitment that that is what the Government want as well? I think he can understand why people are concerned, and the source of the concerns that have merited these amendments.
I completely understand and, in large measure, share those concerns. We wanted to ensure, in this fast-changing world, that the creative industries in the United Kingdom could be remunerated for the work they had produced. We are not in the business of giving away other people’s work to third parties for nothing: that would be to sell our birthright for a mess of pottage, to use a term from an old translation of the Bible, and we are determined not to do it. As my hon. Friend—and several other Members—will have heard me say many times before, we would only proceed with the package of measures included in the consultation if we believed that we were advancing the cause of the creative industries in the UK, rather than putting them in danger or legal peril.
I think that some of the things I will say in a moment will be of assistance. We want to reach a point at which it is easier for the creative industries—whether they are large businesses with deep pockets and able to use lawyers, or very small individual photographers or painters—to assert and protect their rights, and to say, if they wish, “No, you cannot scrape my material for the purpose of large language model learning, unless you remunerate me.” That remuneration might happen via a collective licensing scheme, or it might happen individually. Either way, we want to get to more licensing rather than less. As, again, I have said several times at this Dispatch Box, we have looked at what has happened in the European Union and what is happening in the United States of America, and we believe that although the EU said that its package was designed to deliver more licensing, it has not led to more licensing or to more remuneration of the creative industries, and we want to avoid that pitfall.
As I have said, I take the concerns of the creative industries seriously, both as a DSIT Minister and as a DCMS Minister; of course I do. I agree—we, the Government, agree—that transparency is key. We want to see more licensing of content. We believe that the UK is a creative content superpower, and we want UK AI companies to flourish on the basis of high-quality data. I have spoken to a fair number of publishing companies, in particular UK companies such as Taylor & Francis, a largely academic publisher. As Members will know, the UK is the largest exporter of books in the world. Those companies are deliberately trying to get all their material licensed to AI companies, for two reasons: first, they want to be remunerated for the work that they have provided, and secondly, just as importantly, they want AI to come up with good answers. If you put dirty water into a pipe, dirty water will come out at the other end, and if you put good data into AI, good answers will come out of AI. That is an important part of why we want to ensure that we have strong AI based on high-quality data, and much of that is premium content from our creative industries.
We also agree that the Government must keep an open mind, and must take full account of the economic evidence. That is why we have tabled new clauses 16 and 17, which set out binding commitments to assess the impact of any and all proposals and to consider and report on the key areas raised in debate. That includes any and all of the options that were involved in the consultation that we published after the amendments were tabled in the House of Lords. As the Government take forward the commitments made by these amendments, they will consider all potential policy options. I must emphasise that the Government have not prejudged the outcome of the consultation, and take the need to consider and reflect on the best approach for all parties very seriously.
Members will, I am sure, have read new clause 17; it requires the Government to report on four matters. First, there is the issue of technical solutions that would enable copyright owners to control whether their copyright works could be used to develop AI.
Will the Minister give way?
Will the hon. Lady just let me finish this paragraph, because it might read better in Hansard? Actually, I have now added that bit, so it is ruined, and I might as well give way to her.
The question of technical solutions is very important, but my challenge is this. I have spoken to representatives of some of the big tech companies who are pushing for that, and who are saying that it is hard for them to do it at scale but creatives can do it. Why can the tech companies not be leading on an opt-in system for creatives? Let me hand that back to the Minister.
I should point out that the hon. Lady, as the spokesperson for the Liberal Democrat party, will be speaking very shortly.
I know, but she is wonderful, so we will let her—or you will let her, Madam Deputy Speaker.
This is a really important point. Surely it cannot be impossible for us to find a technical solution. People who can develop AI—and they are now developing AI on their laptops, especially following DeepSeek; they do not need massive computers—should be able to develop a very simple system, as I have said before, whereby all creatives who are copyright owners are able to assert their rights, very simply, across all platforms, without any great exertion. That is what I want to achieve.
The hon. Lady was quite right to raise that question, so what are we going to do next? We say in new clause 17 that we will report in 12 months’ time. If we were to report in 12 months’ time that we had done absolutely nothing, I think that everyone would rightly rant and rave at us. It is our intention that the Secretary of State for Science, Innovation and Technology and the Secretary of State for Culture, Media and Sport will together co-ordinate a special taskforce specifically to consider how we can facilitate, incentivise and enable the development of these technical solutions. I suspect that, if we can get there, opt-out will look remarkably like opt-in.
The second matter on which new clause 17 requires us to report is access to data for AI developers to train AI systems in the UK, the third is transparency, and the fourth relates to measures to facilitate the licensing of copyright works for AI training. The publication will be required within 12 months of Royal Assent, and will of course be laid before Parliament. New clause 16 supplements these reports with a full economic impact assessment that will go further than previous assessments, and will present an analysis of the economic impact of a range of policy options available in this context, supported by the additional evidence that the Government have received in response to their consultation. The reporting requirements are important: they mean that we will have to engage with each of these issues apace and in depth, and we will do that. We are determined to find and incentivise technical solutions that support our objectives, and I believe that if we do that we can be a world leader. As I said earlier, the two Secretaries of State will convene working groups to tackle each of these issues.
I have heard people say that we are legislating to water down copyright, but that is simply not true. If Members support the Government’s position today, the UK’s copyright law will remain precisely as robust tomorrow as it is today. For activities in the UK, people will, in law, only be able to use copyright material if they are permitted and licensed to do so or if a copyright exception allows it, such as the existing copyright exceptions for education, public libraries and non-commercial work.
It was a pleasure to serve on the Bill Committee. May I take up the point about timelines in the new clause? The Minister has said that the reports must be made before the end of a period of 12 months, but, as other Members have said, there is a great deal of concern about what may happen. Does he expect this to take a year, or might it be possible to work faster so that more reassurance can be given? I accept that there will need to be further consultation, and examination of the responses.
Obviously, a series of different things will happen. We will have to respond to the consultation at some point, and I guess that the Culture, Media and Sport Committee will want to respond as well. In the meantime, we will be running a working group. I am very happy to keep the House updated on how that work progresses, but I do not want to commit to producing something within 12 months without being absolutely certain that I can do so. If new clause 17 is carried today, it will be a requirement by law that we produce a response within 12 months.
I fully get the point about urgency. As the right hon. Member for Maldon knows well, this issue has been hanging around for a considerable period of time. We in the UK have perhaps been a little slow, but I want to make sure that we get it right, rather than legislate piecemeal.
I apologise if I have missed this, but has the Minister outlined when the Government will respond to the consultation?
No, I have not—my hon. Friend has not missed anything. Obviously, we want to respond as soon as possible, but we have 11,500 consultation responses to consider.
Some issues have hardly been referred to in the public debate on this matter. One issue that Equity is understandably pursuing, and that we referred to in the consultation, is about personality rights, which exist in some states in the United States of America. That is quite complicated to legislate for, which is one of the reasons we have consulted on it.
We have also consulted on the question—again, nobody has referred to this in the public debate—of whether a work that is generated by AI has any copyright attached to it. If so, who owns that copyright? It is slightly moot in British law. One could argue that British copyright law has always presumed that copyright applies only where a work is the expression of an individual, so it does not apply to AI-generated material, but there are other elements. Section 9(3) of the Copyright, Designs and Patents Act 1988 says that machine-generated material can have copyright attached to it, which is one of the other issues that we want to address.
As I said earlier, one of the issues to which nobody has yet come up with an answer is how we will provide proper enforcement of whatever transparency requirements we propose. I am conscious that in discussions I have had with our European counterparts, including my Spanish counterpart and members of the European Commission, there has been some concern about precisely what they will do by virtue of transparency. This issue is made more complicated by the advent of DeepSeek—for a whole series of different reasons, which I am happy to explain at some other point—but we need to end up with a transparency system that is both effective and proportionate. Simply dumping a list of millions and millions of URLs that have been visited on the internet is neither effective nor proportionate, so we will have to come up with something.
Does the Minister envisage that any model of enforcement around transparency will be compulsory and not a voluntary system?
By its nature, enforcement would have to be compulsory, but we are running ahead of ourselves, because nobody has actually come up with a system that has an enforcement mechanism. Who would do it? What body would do it? How would that body be resourced? That is one of the things that we need to look into, and it is one of the elements of the consultation.
I will move on to another subject: the issue of purported intimate images. Government amendment 34 deals with the creation of intimate images or deepfakes. Earlier in the Bill’s passage, my colleague Lord Ponsonby added a new offence of creating purported intimate images without consent or reasonable belief in consent, and I am sure all hon. Members agree that this is a really important addition. In Committee, we introduced the offence of requesting the creation of purported intimate images without consent or reasonable belief in consent, as hon. Members who were on the Public Bill Committee with me will know. It seems axiomatic that the courts should have the power to deprive offenders of the image and anything containing it that relates or is connected to the offence. This is already the case for the creating offence, which was introduced in the House of Lords. Government amendment 34 amends the sentencing code to achieve that for the requesting offence. It ensures that the existing regime of court powers to deprive offenders of property also applies to images and devices containing the image that relate to the requesting offence.
We have tabled a series of amendments to clauses 56 to 59 to reflect our discussions with the devolved Governments on the national underground asset register. The amendments will require the Secretary of State to obtain the consent of Welsh Ministers and the Department for Infrastructure in Northern Ireland, rather than merely consult them, before making regulations in relation to the provisions. Co-operation with the devolved Governments has been consistent and constructive throughout the Bill’s passage. We have secured legislative consent from Scotland, and the Senedd in Wales voted in favour of granting the Bill legislative consent only yesterday. We regret that for procedural reasons, the process with Northern Ireland has not yet reached the stage of legislative consent. We are, however, working constructively with the Department of Finance to ensure that we can make progress as quickly as possible. We continue to work closely with the Northern Ireland Executive to secure legislative consent, and to ensure that citizens and businesses of Northern Ireland feel the full benefits of the Bill.
Before I finish, I turn to our amendments to help ensure that smart data schemes can function optimally, and that part 1 of the Bill is as clear as possible. Amendments to fee charging under clauses 11 and 15 follow extensive stakeholder engagement, and will maximise the commercial viability of smart data systems by enabling regulations to make tailored provision on fee charging within each smart data scheme. For example, amendments 19 to 21 enable the fees charged to exceed expenses where appropriate. This is necessary to fulfil the commitment in the national payments vision to establish a long-term regulatory framework for open banking. Outside smart data, Government amendment 35
“adds references to investigating crime to existing references in the Data Protection Act 2018 to detecting or preventing crime”,
which will bring these references into line with other parts of the legislation.
It is a privilege to respond to this debate on behalf of His Majesty’s official Opposition, and to speak to the new clauses and amendments. This is an ambitious piece of legislation, which will enable us to harness data—the currency of our digital age—and use it in a way that drives the economy and enhances the delivery of public services. Since its inception under the Conservatives in the last Parliament, the Bill has also become the platform for tackling some of the most pressing social and technological issues of our time. Many of these are reflected in the amendments to the Bill, which are the subject of debate today.
I start with new clause 20. How do we regulate the interaction of AI models with creative works? I pay tribute to the work of many Members on both sides of this House, and Members of the other place, who have passionately raised creatives’ concerns and the risks posed to their livelihoods by AI models. Conservative Members are clear that this is not a zero-sum game. Our fantastic creative and tech industries have the potential to turbocharge economic growth, and the last Government rightly supported them. The creative and technology sectors need and deserve certainty, which provides the foundation for investment and growth. New clause 20 would achieve certainty by requiring the Government to publish a series of plans on the transparency of AI models’ use of copyrighted works, on removing market barriers for smaller AI market entrants, and on digital watermarking, together with—most important of all—a clear restatement of the application of copyright law to AI-modelling activities.
I cannot help but have a sense of déjà vu in relation to Government new clause 17: we are glad that the Government have taken up several of the actions we called for in Committee, but once again they have chosen PR over effective policy. Amid all the spin, the Government have in effect announced a plan to respond to their own consultation—how innovative!
What is starkly missing from the Government new clauses is a commitment to make it clear that copyright law applies to the use of creative content by AI models, which is the primary concern raised with me by industry representatives. The Government have created uncertainty about the application of copyright law to AI modelling through their ham-fisted consultation. So I offer the Minister another opportunity: will he formally confirm the application of copyright law to protect the use of creative works by AI, and will he provide legal certainty and send a strong signal to our creative industries that they will not be asked to pay the price for AI growth?
I thank the Minister for making that statement at the Dispatch Box. As he knows, we need to have that formally, in writing, as a statement from the Government to make it absolutely clear, given that the consultation has muddied the waters.
I am sorry, but I said that in my speech, and I have said it several times in several debates previously.
I would therefore be grateful if the Minister said why there remains uncertainty among creatives about the application of copyright in this area. Is that not why we need to move this forward?
I now turn to Government amendment 34 and others. I congratulate my noble Friend Baroness Owen on the tremendous work she has done in ensuring that clauses criminalising the creation of and request for sexually explicit deepfake images have made it into the Bill. I also thank the Government for the constructive approach they are now taking in this area.
I should have said earlier that, as the shadow Minister knows, in Committee we changed the clause on “soliciting” to one on “requesting” such an image, because in certain circumstances soliciting may require the exchange of money. That is why we now have the requesting offence.
I thank the Minister for his clarification and reiteration of that point, and again for his work with colleagues to take forward the issue, on which I think we are in unison across the House.
New clause 21 is on directions to public authorities on recording of sex data. One does not need to be a doctor to know that data accuracy is critical, particularly when it comes to health, research or the provision of tailored services based on protected characteristics such as sex or age. The accuracy of data must be at the heart of this Bill, and nowhere has this been more high-profile or important than in the debate over the collection and use of sex and gender data. I thank the charity Sex Matters and the noble Lords Arbuthnot and Lucas for the work they have done to highlight the need for accurate data and its relevance for the digital verification system proposed in the Bill.
I have been very clear on this, and it is important in such a complex area to look at the detail and nuance of the challenges around—(Interruption.) Well, it is very easy to create a new clause where we click our fingers and say, “Let’s make this more illegal; let’s bring in x, y or z restriction.” As a responsible Opposition, we are looking at the detail and complexities around implementing something like this. [Interruption.] I have been asked a few questions and the hon. Member for Cheltenham (Max Wilkinson) might want to listen to the rationale of our approach.
One question is how to define social media. Direct messaging services such as WhatsApp and platforms such as YouTube fall within the scope of social media. There are obviously social media platforms that I think all of us are particularly concerned about, including Snapchat and TikTok, but by changing the age of digital consent we do not want to end up capturing lower-risk social media platforms that we recognise are clearly necessary or beneficial, such as education technology or health technology platforms. And that is before we start looking at whether age verification can work, particularly in the 13-to-16 age group.
Sorry, I am getting a bit lost. Does the Minister think, and does the Conservative party think, that the digital age of consent should rise from 13 to 16 or not?
I rise to support new clauses 2 to 5 in the name of the hon. Member for Harpenden and Berkhamsted (Victoria Collins); to pay tribute to Baroness Kidron, who has driven forward these amendments in the other place; and to speak in favour of new clause 20 in the name of the official Opposition.
I am beginning to sound a bit like a broken record on this matter, but our creative industries are such a phenomenal UK success story. They are our economic superpower and are worth more than automotive, aerospace and life sciences added together, comprising almost 10% of UK registered businesses and creating nearly 2.5 million jobs. More than that, our creative industries have so much intrinsic value; they underpin our culture and our sense of community. Intellectual property showcases our nation around the world and supports our tourism sector. As a form of soft power, there is simply nothing like it—yet these social and economic benefits are all being put at risk by the suggested wholesale transfer of copyright to AI companies.
The choice presented to us always seems, wittingly or unwittingly, to pit our innovative AI sector against our world-class creative industries and, indeed, our media sector. It is worth noting that news media is often overlooked in these debates, but newspapers, magazines and news websites license content both in print and online. In turn, that helps to support high-quality and independent journalism, which is so vital to underpinning our democratic life. That is essential considering recent news that the global average press freedom score has fallen to an all-time low.
I want to push back against the false choice that we always seem to be presented with that, somehow, our creative industries are Luddites and are not in favour of AI. I have seen time and again how our creators have been characterised by big tech and its lobbyists as somehow resistant to technological progress, which is of course nonsensical.
I want to knock on the head the idea that any Government Minister thinks that the creative industries are Luddites. As I said in the debate in Westminster Hall—I know that the hon. Lady was not able to be there—many creative industries use all sorts of technical innovations every single day of the week. They are not Luddites at all; they are the greatest innovators in the country.
I thank the Minister for that reassurance. I did take part in a Westminster Hall debate on this matter a couple of weeks ago, but one of his colleagues was responding. I made the same point then. Quite often in the media or more generally, AI seems to be pitted against our creative industries, which should not be the case, because we know that our creative industries embrace technology more than virtually any other sector. They want to use AI responsibly. They do not want to be replaced by it. The question before us is how lawmakers can ensure that AI is used ethically without this large-scale theft of IP. We are today discussing amendments that go somewhere towards providing an answer to that question.
I agree with my right hon. Friend: that is the peculiarity. The Minister knows only too well about the nature of what goes on in countries such as China. Chinese companies are frankly scared stiff of cutting across what their Government tell them they have to do, because what happens is quite brutal.
We have to figure out how we protect data from ill use by bad regimes. I use China as an example because it is simply the most powerful of those bad regimes, but many others do not observe data protection in the way that we would assume under contract law. For example, BGI’s harnessing of the data it has gleaned from covid tests, and its dominance in the pregnancy test market, is staggering. It has been officially allowed to take 15% of the data, but it has taken considerably more, and that is just one area.
Genomics is a huge and vital area right now, because it will dominate everything in our lives, and it populates AI with an ability to describe and recreate the whole essence of individuals, so this is not a casual or small matter. We talk about AI being used in the creative industries—I have a vested interest, because my son is in the creative industries and would support what has been said by many others about protecting them—but this area goes a whole quantum leap in advance of that. In the future, we may not even know who we are talking to—what the nature of that person is, or what their vital statistics are.
This amendment is not about one country; it is about providing a yardstick against which all third countries should be measured. If we are to maintain the UK’s standing as a nation that upholds privacy, the rule of law, democracy and accountability, we must not allow data to be transferred to regimes that fundamentally do not share those values. It is high time that we did this, and I am glad to see the Minister nodding. I hope therefore that he might look again at the amendment. Out of old involvement in an organisation that he knows I am still part of, he might think to himself that maybe this is worth doing or finding some way through.
I do not resile from my views just because I have become a Minister, just as the right hon. Member did not when he became a Minister. He makes an important set of points. I do think, however, that they are already met by the changes in the schedule to article 45B, which is not an exhaustive list of things that the Secretary of State may consider. The points he refers to are certainly things that the Secretary of State could—and should, I would argue—consider.
I am grateful to the Minister, and I hope that that might find its way on to the face of the Bill with a little more description, but I understand that and I acknowledge that he does as well.
I welcome the opportunity to speak in support of the Bill and to address some of the amendments proposed, particularly Government new clauses 16 and 17.
New clause 17 is entitled “Report on the use of copyright works in the development of AI systems”. I am pleased to note, in subsection (3)(b), that the report will consider
“the effect of copyright on access to, and use of, data by developers of AI systems (for example, on text and data mining)”.
I also note that “developers” are specifically broken down into
“individuals, micro businesses, small businesses or medium-sized businesses”.
It is right to provide for that level of granularity. Similarly, I note that the report will
“consider, and make proposals in relation to… the disclosure of information by developers of AI systems about”
their use of copyright data to develop AI systems and “how they access” that copyrighted data,
“for example, by means of web crawlers”.
I am pleased to see discussions of licensing included in the report, and an exploration, again in granular detail, of the impact of a licensing system on all levels of developers. However, I would have liked to see an equal level of granularity for copyright owners to understand the effects of proposals outlined in subsection (3). Subsection (4) states that
“In preparing the report, the Secretary of State must consider the likely effect of proposals, in the United Kingdom, on… copyright owners”
as well as developers and users of AI systems. Although I note that new subsection (4) refers to individuals, microbusinesses and so on, I feel that there is a little vagueness as to whether this level of granularity is afforded to copyright owners as well.
That is not intentional. It is exactly the same level of granularity that we will go into in our reporting.
Well, I will just throw the rest of my speech away, then. I shall persevere. Will the report explore the effects of the proposed solutions and the resulting protections on individual creators?
There seem to be an awful lot of David Attenborough TikTok videos, but it is not him. I wonder whether this measure will apply to personality rights, and about the definition of a “small rights owner”. I will just squeeze that in.
Personally, I am in favour of doing something about personality rights, but it is one of the things that is in the consultation, to which we will respond. It is one of the things for which we will need to legislate in the round.
Perfect.
I ask the Secretary of State what reassurances can be given that smaller creatives, including microbusinesses and small creative businesses, will be considered in the report so that they can have confidence that the systems finally applied will work for them, particularly when we consider an individual’s early career—think of Ed Sheeran strumming away in his bedroom in his pre-fame days—and how they can protect their copyrighted works against the global tech giants.
New clause 16 addresses the economic impact on both copyright owners and AI developers, and I want to switch from talking about copyright owners to trying to defend the AI industry. If we do not get the controls right, we risk the medium and long-term success of the AI industry. If we do not get a fair solution for the creative and AI industries, we risk a reduction in the quantity, and potentially in the quality, of human-created data and an increase in AI-generated creative data.
I will briefly segue, because we are developing a lot of AI-created content that might be subject to copyright. A report recently pointed out that 18% of Spotify content is now AI-generated. People might remember the big hoo-ha when an AI-generated image won a photographic competition; it caused a lot of disturbance, but a great deal of creative skill was involved in how its creator developed and produced that image. No, it was not a photograph, but it is in a category of its own. I feel that is also creative content and copyrighted data, so there is a grey area.
If we start to generate more and more AI-created data and less and less high-quality human-generated data, because of the challenges to the creative industry, there is a danger that AI models will start scraping and training on AI-generated data, potentially leading to a reductive spiral into mediocrity, with some even suggesting that this could result in model collapse. On new clauses 16 and 17, I encourage the House to consider the impact of not employing proposals such as licensing and protecting the generation of new human-created content, given the risks posed to AI models and developers in the long term.
I will briefly comment on amendments 37 and 38, tabled by my hon. Friend the Member for Newcastle upon Tyne Central and West (Chi Onwurah). She ably outlined the reasons for the amendments, so I will not go into a lot of detail, but I want to point out that getting the definitions correct will prevent a loophole whereby AI companies can misuse personal data by claiming that their commercial development is scientific research. The amendments would provide transparency on the use of data by researchers in order to maintain confidence in this country’s ethical, legal and professional high standards in academic research. I hope the Minister will give careful consideration to the points I have raised.
I am now going to give my Whip, my hon. Friend the Member for Cardiff North (Anna McMorrin), a heart attack, because I am going to refer to amendments 41 to 46 to clause 80 on article 22 of the UK GDPR, which is close to my heart. She is not to worry, though; I read those amendments with great interest and I understand the back-up they would provide. Although I am a newbie MP, as I read them—in my understanding, given the little work I did in my previous job with a regulator—I felt that they were more like secondary legislation. They could be considered for the future, particularly amendment 46, which includes some very welcome additions. However, when it comes to primary legislation, I feel that the Bill works better as it stands.
As many Members will be aware, my constituent Ellen Roome knows only too well the tragedies that can take place as a result of social media. I am pleased that Ellen joins us in the Gallery to hear this debate in her pursuit of Jools’ law.
In 2022, Ellen came home to find her son Jools not breathing. He had tragically lost his life, aged just 14. In the following months, Ellen battled the social media giants—and she is still battling them—to try to access his social media data, as she sought answers about what had happened leading up to his death. I am grateful to the shadow Minister, the hon. Member for Runnymede and Weybridge (Dr Spencer), for raising this in his speech. In her search for answers, Ellen found herself blocked by social media giants that placed process ahead of compassion. The police had no reason to suspect a crime, so they did not see any reason to undertake a full investigation into Jools’ social media. The inquest did not require a thorough analysis of Jools’ online accounts. None of the social media companies would grant Ellen access to Jools’ browsing data, and a court order was needed to access the digital data, which required eye-watering legal fees.
The legal system is unequipped to tackle the complexities of social media. In the past, when a loved one died, their family would be able to find such things in their possession—perhaps in children’s diaries, in school books or in cupboards. However, now that so much of our lives is spent online, personal data is kept by the social media giants. New clause 11 in my name would change that, although I understand that there are technical and legal difficulties.
The Minister and the Secretary of State met Ellen and me this morning, along with the hon. Member for Darlington (Lola McEvoy), and we are grateful for the time they gave us. My new clause will not go to a vote today, but we will keep pushing because Ellen and other parents like her should not have to go through this to search for answers when a child has died. I understand that there are provisions in the Bill that will be steps forward, but we will keep pushing and we will hold the Government’s and all future Governments’ feet to the fire until we get a result.
It was great to meet this morning, although I am sorry it was so late and so close to Report stage; I wish it had been earlier. We were serious in the meeting this morning: we will do everything we possibly can to make sure that coroners understand both their powers and their duties in this regard, and how they should be operating with families and the prosecuting authorities as well if necessary. We will also do everything we can to ensure that the technology companies embrace the point that they need to look after the families of those who have lost young loved ones.
I thank the Minister for his intervention. He is absolutely right. There are clear issues of process here. There are differential approaches across the country—different coroners taking different approaches and different police forces taking different approaches. The words of Ministers have weight and I hope that coroners and police forces are taking note of what needs to happen in the future so that there are proper investigations into the deaths of children who may have suffered misadventure as a result of social media.
On related matters, new clause 1 would gain the support of parents like Ellen up and down this country. We need to move further and faster on this issue of social media and online safety—as this Government promised on various other things—and I am pleased that my party has a very clear position on it.
I will now turn to the issue of copyright protections. I held a roundtable with creatives in Cheltenham, which is home to many tech businesses and AI companies. The creative industries in my town are also extremely strong, and I hear a lot of concern about the need to protect copyright for our creators. The industry, which is worth £124 billion or more every year, remains concerned about the Government’s approach. The effects of these issues on our culture should not be understated.
We would be far poorer both culturally and financially if our creatives were unable to make a living from their artistic talents. I believe there is still a risk of the creative industry being undermined if the Government remove protections to the benefit of AI developers. I trust that Ministers are listening, and I know that they have been listening over the many debates we have had on this issue. If they were to remove those protections, they would tip the scales in favour of AI companies at the cost of the creative industry. When we ask AI companies and people in tech where the jobs are going to come from, the answers are just not there.
The amendments tabled by my hon. Friend the Member for Harpenden and Berkhamsted (Victoria Collins) would reinstate copyright protections at all levels of AI development and reinforce the law as it currently stands. It is only fair that when creative work is used for AI development, the creator is properly compensated. The Government have made positive noises on this issue in multiple debates over the last few months. That is a positive sign, and I think that in all parts of this House we have developed a consensus on where things need to move—but creatives remain uneasy about the implications for their work and are awaiting firm action.
Ministers may wish to tackle this issue with future action, and I understand that it might not be dealt with today, but our amendments would enable that to happen. They also have an opportunity today: nothing would send a stronger signal than Government support and support from Members from across the House for my hon. Friend’s amendments, and I implore all Members to back them.
For the fourth time, and as I have said, new clause 19 would create a de facto position whereby there are restrictions on the use of inappropriate social media services by children. It seeks to tackle the challenges of implementation, age verification and the scope of social media. It says that work is needed to make sure that we can actually impose such restrictions and that, when we can, we should move in that direction, unless overwhelming evidence emerges that it is not needed—for example, as the Online Safety Act shakes out.
Finally, I return to new clause 21. Sadly, it has been widely misrepresented. The laws in this area are clear: the Equality Act puts in place obligations in relation to protected characteristics. The Supreme Court says that “sex” means biological sex, and that public authorities must collect data on protected characteristics to meet their duties under the Equality Act. The new clause would put that clear legal obligation into effect, and build in data minimisation principles to preserve privacy. There would be no outing of trans people through the new clause, but where public authorities collect and use sex data, it would need to be biological sex data.
As ever, it is good to see you in the Chair, Madam Deputy Speaker. I thank all right hon. and hon. Members who have taken part in the debate. If I do not manage to get to any of the individual issues that have been raised, and to which people want answers, I am afraid that is because of a shortness of time, and I will seek to write to them. I thank the officials who helped to put the Bill together, particularly Simon Weakley—not least because he not only did this Bill, but all the previous versions in the previous Parliament. He deserves a long-service medal, if not something more important.
I will start with the issues around new clauses 1, 11, 12 and 13, and amendment 9. The Government completely share the concern about the vulnerability of young people online, which lots of Members have referred to. However, the age of 13 was set in the Data Protection Act 2018—I remember, because I was a Member at the time. It reflects what was considered at the time to be the right balance between enabling young people to participate online and ensuring that their data is protected. Some change to protecting children online is already in train. As of last month, Ofcom finalised the child safety codes, a key pillar of the Online Safety Act. Guidance published at the same time started a three-month period during which all in-scope services likely to be accessed by children will be required to assess the risk of harm their services pose to them.
From July, the Act will require platforms to implement measures to protect children from harm, and this is the point at which we expect child users to see a tangible, positive difference to their online experiences. I wish it had been possible for all this to happen earlier—I wish the Act had been in a different year—but it is the Act it is. The new provisions include highly effective age checks to prevent children encountering the most harmful content, and the adjustment of algorithms to reduce exposure to harmful content. Services will face tough enforcement from Ofcom if they fail to comply.
The Act very much sets the foundation for protecting children online. The Government continue to consider further options in pursuit of protecting children online, which is why the Department for Science, Innovation and Technology commissioned a feasibility study to understand how best to investigate the impact of smartphones and social media on children’s wellbeing. This will form an important part of our evidence base.
I am going to come to the right hon. Member’s amendment in a moment.
The study is being led by Dr Amy Orben of Cambridge University, and it is supported by scientists from nine of the UK’s premier universities, all with established expertise in this field. The study will report to the Government this month on the existing evidence base, ongoing research and recommendations for future research that will establish any causal links between smartphones, social media and children’s wellbeing. The Government will publish the report along with the planned next steps to improve the evidence base in this area to support policy making. Considering the extra work we are doing, I hope Members will not press their amendments.
I am afraid that I will not give way.
On new clause 13, tabled by the hon. Member for Harpenden and Berkhamsted (Victoria Collins), we share the concern that children’s data in education must be safeguarded. We have already committed to instructing the Information Commissioner’s Office to produce a statutory code of practice on the use of children’s data by edtech services once the findings of its audits have been published. We believe that defining the scope of the code in legislation now, or imposing a six-month deadline for its publication, risks undermining that evidence-led process.
Amendment 9, tabled by the right hon. Member for East Hampshire (Damian Hinds), seeks to ensure that platforms adopt strong age-assurance mechanisms when designing their services under the new children’s higher protection matters duty in clause 81. Of course, we subscribe to that policy aim, but the clause already strengthens UK GDPR by requiring providers of information society services to take account of how children can best be protected and supported when they are designing their processing activities. The ICO’s age-appropriate design code will be updated to provide clear and robust guidance on how services can meet these obligations, including through proportionate risk-based age assurance, where appropriate. I will take the right hon. Member’s intervention if he wants—he asked first—but I am afraid I have to be very careful because I have a lot of questions to answer.
Very quickly, I want the Minister to confirm that the Ofcom children’s codes, to which he has referred, are all about the 18 age threshold. They are a very welcome move to filter out wholly inappropriate content that is designed for over-18s and other very harmful content, but they do not do anything for the initial threshold—the age minimum—at age 13.
I am terribly sorry, but I do need to crack on because I have very little time.
I have not yet mentioned new clause 21 and amendments 39 and 40. Let me start by saying that the Government accept the Supreme Court ruling, but it is paramount that we work through this judgment carefully, with sensitivity and in line with the law. We cannot simply flick a switch; we must work through the impacts of this judgment properly, recognising that this is broader than data used by digital verification services. I reflect the comment made earlier by the shadow Minister, the hon. Member for Runnymede and Weybridge (Dr Spencer), when he said that data accuracy is important.
I thank my hon. Friend for giving way. Trans people and trans-led groups have been very concerned by new clause 21 tabled by the Opposition. They have rightly described it as an attack on trans people’s rights and their privacy. Can the Minister offer some reassurance that, as well as opposing this amendment today, the Government will not seek to introduce similar legislation via other means in the future?
We are opposing the amendment and are not intending to introduce similar legislation.
As I said, data accuracy is important. That is equally true for any data used in a digital verification service. That is why the Government are already engaged in an appropriate and balanced range of work on data standards and data accuracy. We are already developing data standards on the monitoring of diversity information, including sex, via the Data Standards Authority. Following a review, the Office for Statistics Regulation published updated guidance on collecting and reporting data and statistics about sex and gender identity last year, and all Government Departments are now considering how best to address the recommendations of the Sullivan review, which we published. That is the first reason why we will not be supporting this new clause or the amendment today. Simply, we believe the concerns regarding the way in which public authorities process sex and gender data should be considered holistically, taking into account the effects of the Supreme Court ruling and the specific and particular requirements of public authorities. By contrast, the new clause and the amendment would undermine the work the Government are already doing. Giving the Secretary of State a new regulatory role would undermine the existing processes that ensure compliance with the UK’s data protection framework.
Secondly, the new clause is misplaced because the Bill does not alter the evidence which can be relied upon to prove sex or gender. Indeed, it does not seek to alter any of the content of data used by digital verification services. Instead, the Bill enables people to do digitally what they can presently do physically, and it is for organisations to consider what specific information they need to verify in their particular circumstances. Any inconsistency between what they can do digitally and what they can do physically would obviously sow further division.
Thirdly, the new clause is unnecessary, because it is very unlikely that digital verification services would be used in many, if not all, of the cases specifically raised by or with hon. Members, such as within the NHS to gain access to single-sex wards or for screening or to enter other female-only spaces. We expect digital verification services to be used primarily to prove things such as one’s right to work, or one’s age, address or professional or educational qualifications, which are not matters where sex or gender is relevant at all.
Fourthly, the new clause goes significantly further than the findings of the Supreme Court. Finally, the proposals have the potential to interfere with the right to respect for private and family life under the Human Rights Act by requiring public authorities to record sex as biological sex in all cases regardless of whether it is justified or proportionate in that given circumstance. In addition, the amendment does not take account of the fact that the Gender Recognition Act 2004 gives those with gender recognition certificates a level of privacy and control over who has access to information about their gender history. As for amendment 39, it will create further uncertainty as it appears to prevent use of clause 45 in all cases involving sex.
As I have set out, while I understand the reason for tabling these amendments, I fear they would create legal confusion, uncertainty and inconsistency. I also note that they were not part of the previous Government’s version of this Bill, in which in nearly all respects this part of the Bill was identical to ours. Given the narrow scope of digital verification service measures, the need to consider this area holistically to ensure alignment with existing legislation, and upcoming EHRC guidance and the breadth of work already being carried out, I hope the new clause and amendments will be withdrawn.
There was one other amendment referring to digital verification services: the Liberal Democrats’ new clause 7. I completely share their concerns about digital inclusion, which were also mentioned by the hon. Member for Dewsbury and Batley (Iqbal Mohamed). We have published our own digital inclusion action plan, but such obligations could be particularly challenging for businesses currently operating solely in the digital sphere—for example, online banks. Taking a blanket approach in the way proposed would not be proportionate, so I urge that the amendment be withdrawn.
On scientific research, my hon. Friend the Member for Newcastle upon Tyne Central and West (Chi Onwurah) tabled amendments 37 and 38. Amendment 37 adds further conditions to the definition of scientific research. I understand her concern, and we want to prevent misuse. However, the Bill does not expand the meaning of scientific research and already contains safeguards, such as in clause 86. Moreover, the amendment replicates wording from two external documents—including the Frascati Manual—neither of which was intended to be legally binding or to define scientific research. I am very happy to continue having these conversations with my hon. Friend, but I urge her not to press the amendment.
On access to NHS data, which my hon. Friend the Member for Normanton and Hemsworth (Jon Trickett) raised, let me just answer his direct question about the sale of NHS data. The Secretary of State for Health has said categorically that the NHS is not for sale and that patients’ data is not for sale—end of story. I hope we can put that one to bed.
On ethnicity data, my hon. Friend the Member for Birmingham Edgbaston (Preet Kaur Gill) made valid points that we intend to pursue. Public bodies usually collect ethnicity data in line with the Office for National Statistics’ harmonised standards. The ONS is currently reviewing that and I am sure she will want to feed into that process.
I am afraid that I have not had time to refer again to AI and copyright, but this country is a—