Data (Use and Access) Bill [Lords] Debate
I beg to move, That the clause be read a Second time.
With this it will be convenient to discuss the following:
Government new clause 17—Report on the use of copyright works in the development of AI systems.
New clause 1—Age of consent for social media data processing—
“(1) The UK GDPR is amended as follows.
(2) In Article 8 of the UK GDPR (Conditions applicable to child’s consent in relation to information society services), after paragraph 1 insert—
‘(1A) References to 13 years old in paragraph 1 shall be read as 16 years old in the case of social networking services processing personal data for the purpose of delivering personalised content, including targeted advertising and algorithmically curated recommendations.
(1B) For the purposes of paragraph 1A “social networking services” means any online service that—
(a) allows users to create profiles and interact publicly or privately with other users, and
(b) facilitates the sharing of user-generated content, including text, images, or videos, with a wider audience.
(1C) Paragraph 1B does not apply to—
(a) educational platforms and learning management systems provided in recognised educational settings, where personal data processing is solely for educational purposes.
(b) health and well-being services, including NHS digital services, mental health support applications, and crisis helplines, where personal data processing is necessary for the provision of care and support’”.
This new clause would raise the age for processing personal data in the case of social networking services from 13 to 16.
New clause 2—Compliance with UK copyright law by operators of web crawlers and general-purpose AI models—
“(1) The Secretary of State must by regulations make provision (including any such provision as might be made by Act of Parliament), requiring the operators of web crawlers and general-purpose artificial intelligence (AI) models whose services have links with the United Kingdom within the meaning of section 4(5) of the Online Safety Act 2023 to comply with United Kingdom copyright law, including the Copyright, Designs and Patents Act 1988, regardless of the jurisdiction in which the copyright-relevant acts relating to the pre-training, development and operation of those web crawlers and general-purpose AI models take place.
(2) Provision made under subsection (1) must apply to the entire lifecycle of a general-purpose AI model, including but not limited to—
(a) pre-training and training,
(b) fine tuning,
(c) grounding and retrieval-augmented generation, and
(d) the collection of data for the said purposes.
(3) The Secretary of State must lay before Parliament a draft of the statutory instrument containing regulations under subsection (1) within six months of the day on which this Act is passed and the regulations are subject to the affirmative procedure.”
This new clause requires web crawlers and general-purpose AI models with UK links to comply with UK copyright law across all stages of AI development.
New clause 3—Transparency of crawler identity, purpose and segmentation—
“(1) The Secretary of State must by regulations make provision requiring operators of web crawlers and general-purpose artificial intelligence (AI) models whose services have links with the United Kingdom within the meaning of section 4(5) of the Online Safety Act 2023 to disclose information regarding the identity of crawlers used by them or by third parties on their behalf, including but not limited to—
(a) the name of the crawler,
(b) the legal entity responsible for the crawler,
(c) the specific purposes for which each crawler is used,
(d) the legal entities to which operators provide data scraped by the crawlers they operate, and
(e) a single point of contact to enable copyright owners to communicate with them and to lodge complaints about the use of their copyrighted works.
(2) The information disclosed under subsection (1) must be available on an easily accessible platform and updated at the same time as any change.
(3) The Secretary of State must by regulations make provision requiring operators of web crawlers and general-purpose AI models to deploy distinct crawlers for different purposes, including but not limited to—
(a) web indexing for search engine results pages,
(b) general-purpose AI model pre-training, and
(c) retrieval-augmented generation.
(4) The Secretary of State must by regulations make provision requiring operators of web crawlers and general-purpose AI models to ensure that the exclusion of a crawler by a copyright owner does not negatively impact the findability of the copyright owner’s content in a search engine.
(5) The Secretary of State must lay before Parliament a draft of the statutory instrument containing regulations under this section within six months of the day on which this Act is passed and the regulations are subject to the affirmative procedure.”
This new clause requires operators of web crawlers and AI models to disclose their identity, purpose, data-sharing practices, and use separate crawlers for different functions.
New clause 4—Transparency of copyrighted works scraped—
“(1) The Secretary of State must by regulations make provision requiring operators of web crawlers and general-purpose artificial intelligence (AI) models whose services have links with the United Kingdom within the meaning of section 4(5) of the Online Safety Act 2023 to disclose information regarding text and data used in the pre-training, training and fine-tuning of general purpose AI models, including but not limited to—
(a) the URLs accessed by crawlers deployed by them or by third parties on their behalf or from whom they have obtained text or data,
(b) the text and data used for the pre-training, training and fine-tuning, including the type and provenance of the text and data and the means by which it was obtained, and
(c) information that can be used to identify individual works, and
(d) the timeframe of data collection.
(2) The disclosure of information under subsection (1) must be updated on a monthly basis in such form as the regulations may prescribe and be published in such manner as the regulations may prescribe so as to ensure that it is accessible to copyright owners upon request.
(3) The Secretary of State must lay before Parliament a draft of the statutory instrument containing regulations under subsection (1) within six months of the day on which this Act is passed and the regulations are subject to the affirmative procedure.”
This new clause mandates transparency about the sources and types of data used in AI training, requiring monthly updates accessible to copyright owners.
New clause 5—Enforcement—
“(1) The Secretary of State must by regulations make provision requiring the Information Commission (under section 114 of the Data Protection Act 2018) (‘the Commissioner’) to monitor and secure compliance with the duties by an operator of a web crawler or general-purpose artificial intelligence (AI) model whose service has links with the United Kingdom within the meaning of section 4(5) of the Online Safety Act 2023 (‘a relevant operator’), including but not limited to the following—
(a) the regulations must provide for the Commissioner to have the power by written notice (an ‘information notice’) to require a relevant operator to provide the Commissioner with information that the Commissioner reasonably requires for the purposes of investigating a suspected failure to comply with the duties;
(b) the regulations must provide for the Commissioner to have the power by written notice (an ‘assessment notice’) to require and to permit the Commissioner to carry out an assessment of whether a relevant operator has complied or is complying with the duties and to require a relevant operator to do any of the acts set out in section 146(2) of the Data Protection Act 2018;
(c) the regulations must provide that where the Commissioner is satisfied that a relevant operator has failed, or is failing to comply with the duties, the Commissioner may give the relevant operator a written notice (an ‘enforcement notice’) which requires it—
(i) to take steps specified in the notice, or
(ii) to refrain from taking steps specified in the notice;
(d) the regulations must provide that where the Commissioner is satisfied that a relevant operator has failed or is failing to comply with the duties or has failed to comply with an information notice, an assessment notice or an enforcement notice, the Commissioner may, by written notice (a ‘penalty notice’), require the person to pay to the Commissioner an amount in sterling specified in the notice, the maximum amount of the penalty that may be imposed by a penalty notice being the ‘higher maximum amount’ as defined in section 157 of the Data Protection Act 2018; and
(e) the regulations may provide for the procedure and rights of appeal in relation to the giving of an information notice, an assessment notice, an enforcement notice or a penalty notice.
(2) The regulations must provide that any failure to comply with the duties by a relevant operator shall be directly actionable by any copyright owner who is adversely affected by such failure, and that such copyright owner will be entitled to recover damages for any loss suffered and to injunctive relief.
(3) The regulations must provide that the powers of the Commissioner and the rights of a copyright owner will apply in relation to a relevant operator providing a service from outside the United Kingdom (as well as such a service provided from within the United Kingdom).
(4) The Secretary of State must lay before Parliament a draft of the statutory instrument containing the regulations under this section within six months of the day on which this Act is passed and the regulations are subject to the affirmative procedure.”
This new clause grants the Information Commissioner enforcement powers to ensure compliance with AI and web crawler transparency rules, including penalties for breaches.
New clause 6—Technical solutions—
“(1) The Secretary of State must conduct a review of the technical solutions that may be adopted by copyright owners and by the operators of web crawlers and general-purpose artificial intelligence (AI) models whose services have links with the United Kingdom within the meaning of section 4(5) of the Online Safety Act 2023 to prevent and to identify the unauthorised scraping or other unauthorised use of copyright owners’ text and data.
(2) Within 18 months of the day on which this Act is passed, the Secretary of State must report on such technical solutions and must issue guidance as to the technical solutions to be adopted and other recommendations for the protection of the interests of copyright owners.”
This new clause requires the Secretary of State to review and report on technical measures to prevent unauthorised data scraping by web crawlers and AI models.
New clause 7—Right to use non-digital verification services—
“(1) This section applies when an organisation—
(a) requires an individual to use a verification service; and
(b) uses a digital verification service for that purpose.
(2) Where it is reasonably practicable for an organisation to offer a non-digital method of verification, the organisation must—
(a) make a non-digital alternative method of verification available to any individual required to use a verification service; and
(b) provide information about digital and non-digital methods of verification to those individuals before verification is required.”
This new clause would create a duty upon organisations to support digital inclusion by offering non-digital verification services where practicable.
New clause 8—Data Vision and Strategy—
“Within six months of Royal Assent of this Act, the Secretary of State must publish a ‘Data Vision and Strategy’ which outlines—
(a) the Government’s data transformation priorities for the next five years; and
(b) steps the Government will take to ensure the digitisation of Government services.”
New clause 9—Departmental Board Appointments—
“(1) Within six months of the day on which this Act is passed—
(a) Government departments;
(b) NHS England; and
(c) NHS trusts
shall appoint to their departmental board or equivalent body at least one of the following—
(i) Chief Information Officer;
(ii) Chief Technology Officer;
(iii) Chief Digital Information Officer;
(iv) Service Transformation Leader; or
(v) equivalent postholder.
(2) The person or persons appointed under subsection (1) shall provide an annual report on the progress of the department or body towards the Government’s Data Vision and Strategy.”
This new clause would require digital leaders to be represented at executive level within Government departments and other bodies.
New clause 10—Data use in Public Service Delivery Review—
“(1) The Secretary of State must, every 12 months, lay before Parliament a ‘Data use in Public Service Delivery Review’.
(2) The Data use in Public Service Delivery Review shall include, but is not limited to, an assessment of the steps being taken to—
(a) improve the Government’s use of data in public service delivery over the previous 12 months;
(b) expand the use of data to support increased and improved digital services in public service delivery;
(c) improve expertise and digital talent within Government departments to help expand the use of data for public service delivery; and
(d) facilitate and regulate for better use of data in the delivery of public services.”
This new clause would require an annual assessment by the Secretary of State to examine the steps being taken to facilitate and regulate the use of data in the delivery of public services using digital and online technologies.
New clause 11—Access to a deceased child’s social media data—
“(1) Where a person under 18 years of age is deceased, a parent or legal guardian (the ‘requestor’) may request from any internet service provider (ISP) the child’s user data from up to 12 months prior to the date of death.
(2) The ISP must provide a copy of the requested data, or direct account access, upon verification of the requestor’s identity and relationship to the deceased person, and no court order shall be required for such disclosure.
(3) ‘User data’ includes all content, communications, or metadata generated by or associated with the deceased person’s online activity, including stored messages and posts, except where the deceased person had explicitly directed otherwise prior to death.
(4) The ISP may refuse or redact specific data only where—
(a) disclosure would unduly infringe the privacy rights of another individual,
(b) the deceased person had explicitly opted out before death,
(c) there is a conflicting court order, or
(d) a serious risk to public safety or national security would result.
(5) In providing data under this section, the ISP must comply with data protection legislation.
(6) This section constitutes a lawful basis for disclosure under Article 6 of the UK GDPR.
(7) The Secretary of State may, by regulations subject to the affirmative resolution procedure—
(a) provide guidance on verifying parent or guardian status,
(b) clarify any additional grounds for refusal, and
(c) prescribe safeguards to protect third-party confidentiality.
(8) For the purposes of this section—
‘internet service provider (ISP)’ includes any provider of social media, messaging, or other online platforms; and
‘data protection legislation’ has the meaning given in section 51 of this Act.”
This new clause would allow parents of a deceased minor to obtain that child’s social media data without a court order, subject to privacy safeguards for third parties.
New clause 12—Raising the minimum age at which users can consent to processing of personal data—
“(1) The UK GDPR is amended in accordance with subsection (2) of this section.
(2) After paragraph 1 of Article 8 of the UK GDPR (Conditions applicable to child’s consent in relation to information society services) insert—
‘(1A) References to “13 years old” and “age of 13 years” in paragraph 1 shall be read as “16 years old” and “age of 16 years” in the case of processing of personal data.
(1B) Paragraph (1A) does not apply to—
(a) platform systems and services operated where the primary purpose of processing of personal data is for the advancement of a charitable purpose as defined in the Charities Act 2011;
(b) publicly owned platform systems and services operated for the primary purpose of law enforcement, child protection, education, or healthcare;
(c) cases in which the Secretary of State determines it is in the best interests of the child for an operator to accept the child’s own consent.’”
This new clause would raise the age for processing personal data from 13 to 16 years old with certain exceptions for charitable purposes and child safety.
New clause 13—Code of practice for the use of children’s educational data—
“(1) Within 6 months of the passage of this Act, the Information Commissioner must prepare a code of practice which contains such guidance as the Information Commissioner considers appropriate on the processing of children’s data in connection with the provision of education.
(2) Guidance under subsection (1) must consider—
(a) all aspects of the provision of education including learning, school management, and safeguarding;
(b) all types of schools and learning settings in the development of guidance;
(c) the use of AI systems in the provision of education;
(d) the impact of profiling and automated decision-making on children’s access to education opportunities;
(e) children’s consent to the way their personal data is generated, collected, processed, stored and shared;
(f) parental consent to the way their children’s personal data is being generated, collected, processed, stored and shared;
(g) the security of children’s data;
(h) the exchange of information for safeguarding purposes.”
This new clause requires the Information Commissioner to produce a code of practice on the use of children’s educational data.
New clause 14—Transparency of business and customer data used in training Artificial Intelligence models—
“(1) The Secretary of State must by regulations make provision requiring operators of general-purpose AI models to disclose upon request information about business data and customer data processed for the purposes of pre-training, training, fine-tuning, and retrieval-augmented generation in an AI model, or any other data input to an AI model.
(2) Business data and customer data must include, but is not limited to, the whole or any substantial part of a literary, dramatic, musical or artistic work, sound recording, film or broadcast included in any text, images and data used for the purposes set out in subsection (1).
(3) Information disclosable under subsection (1) must include but is not limited to:
(i) Digital Object Identifiers and file names;
(ii) Details of how the work was identified, including metadata;
(iii) The source from which it was scraped or otherwise obtained; and
(iv) The URLs accessed by crawlers deployed by operators, or by third parties, to obtain the data.
(4) The owner of rights in any individual work identifiable in information disclosed under subsection (1) must be provided, upon request to the relevant operator, with information as to whether and how they have complied with the laws of the United Kingdom in respect of that work.
(5) The Secretary of State must lay before Parliament a draft of the statutory instrument containing regulations under subsection (1) within six months of the day on which this Act is passed and the regulations are subject to the affirmative procedure.”
This new clause would require the Secretary of State to set out transparency provisions requiring generative AI developers to provide information to enable individuals and creative businesses to determine whether their data, works and other subject matter have been used in training datasets.
New clause 15—Complaints procedure for vulnerable individuals—
“(1) The Data Protection Act 2018 is amended in accordance with subsections (2) to (4).
(2) After section 165(3) insert—
‘(3A) For complaints under subsection (2), the Information Commissioner must provide appropriate complaints-handling procedures for—
(a) victims of modern slavery,
(b) victims of domestic abuse,
(c) victims of gender-based violence, or
(d) data subjects otherwise in a position of vulnerability.
(3B) Procedures under subsection (3A) must include—
(a) appropriate support for vulnerable individuals;
(b) provision of specialised officers for sensitive cases;
(c) signposting to support services;
(d) provision of a helpline;
(e) de-escalation protocols.’
(3) After section 166(1)(c) insert—
‘(d) fails to investigate a complaint appropriately or take adequate action to remedy findings of inadequacy.’
(4) After section 166(2)(b), insert—
‘(c) to use formal powers as appropriate to investigate a complaint and to remedy any findings of inadequacy, unless the request from the data subject is manifestly unfounded or excessive.’”
This new clause would require the Information Commission to introduce a statutory complaints procedure for individuals in a position of vulnerability and new grounds of appeal to an Information Tribunal.
New clause 18—Report on the introduction of a public interest test for allowing access to NHS data by third-parties and companies—
“(1) The Secretary of State must within six months of the passing of this Act—
(a) prepare and publish a report examining the need for a specific statutory public interest test to determine and safeguard access to NHS data by third-parties and companies.
(b) within 28 days of a report being laid under subsection (1), the Government must schedule a debate and votable motion on the findings of the report in each House.
(2) The report must consider—
(a) whether and in what situations it would be necessary, proportionate and lawful to share NHS data with third-parties and companies when the interests and risks to both the individual and/or the public are considered.
(b) when it would be in the public interest and in the best interests of patients and the NHS to allow access by third-parties and companies to NHS data in relation to the provision of health care services and for promotion of health.”
This new clause would require the Secretary of State to produce a report on the introduction of a public interest test for allowing access to NHS data by third-parties and companies and then to schedule a debate on it in each House.
New clause 19—Secretary of State’s duty to review the age of consent for data processing under the UK GDPR—
“(1) The Secretary of State must, within 12 months of Royal Assent of this Act, have conducted a review and published a report into the operation of Article 8 (Conditions applicable to child's consent in relation to information society services) of the UK GDPR in relation to the data processed by social media platforms of children under the age of 16.
(2) As part of this review, the Secretary of State must consider—
(a) the desirability of increasing the digital age of consent under the UK GDPR from 13 to 16, taking into account the available evidence in relation to the impact of social media platforms on the educational, social and emotional development of children; and
(b) the viability of increasing the digital age of consent under Article 8 of the UK GDPR in relation to specific social media platforms which are shown by the evidence to be unsuitable for use by children under the age of 16.
(3) Within six months of the publication of the report under subsection (1), the Secretary of State must lay a plan before Parliament for raising the digital age of consent to 16 through amendments to Article 8 GDPR, unless the review concludes that such changes are unnecessary.”
New clause 20—Duties of the Secretary of State in relation to the use by web-crawlers and artificial intelligence models of creative content—
“The Secretary of State must—
(a) by 16 September 2025, issue a statement, by way of a copyright notice issued by the Intellectual Property Office or otherwise, in relation to the application of the Copyright, Designs and Patents Act 1988 to activities conducted by web-crawlers or artificial intelligence models which may infringe the copyright attaching to creative works;
(b) by 16 September 2025, lay before Parliament a report which includes a plan to help ensure proportionate and effective measures for transparency in the use of copyright materials in training, refining, tuning and generative activities in AI;
(c) by 16 September 2025, lay before Parliament a report which includes a plan to reduce barriers to market entry for start-ups and smaller AI enterprises on use of and access to data;
(d) by 1 July 2026, publish a technological standard for a machine-readable digital watermark for the purposes of identifying licensed content and relevant information associated with the licence.”
New clause 21—Directions to public authorities on recording of sex data—
“(1) The Secretary of State must, within three months of the passage of this Act, issue regulations relating to the code of practice set out in section 49 of this Act which require public authorities to—
(a) collect, process and retain sex data only where it is lawful to do so in accordance with data protection legislation;
(b) request and record sex data accurately, in every circumstance where sex data is collected, in accordance with following category terms and definitions—
(i) ‘Sex’ meaning male or female only based on ‘sex at birth’, ‘natal sex’ or ‘biological sex’ (these terms carrying the same meaning and capable of being used interchangeably); and,
(ii) in addition, where it is lawful to do so in accordance with data protection legislation and the Gender Recognition Act 2004, ‘Acquired Gender’ meaning male or female only, as recorded on a gender recognition certificate issued in accordance with the Gender Recognition Act 2004;
(c) have updated relevant organisation guidance to stipulate that, where sex data is collected, this must be done in accordance with the definitions set out by subsection (1)(b) within three months of these regulations coming into force;
(d) have conducted a review of the accuracy of data held in relation to the sex of data subjects to ensure that the data is accurate in recording sex at birth and, where relevant and collected lawfully, acquired gender as recorded on a gender recognition certificate within 12 months of these regulations coming into force;
(e) have taken every reasonable step to ensure that any data held in relation to the sex and, where relevant and collected lawfully, acquired gender as recorded on a gender recognition certificate of a data subject that is found to be inaccurate is rectified or erased within 18 months of these regulations coming into force; and
(f) have produced and submitted to the Secretary of State a report setting out the findings of its review in relation to the matters set out by subsection (1)(d) and, where relevant, a description of the steps taken to ensure that the data held by the relevant public authority is accurate within the definitions set out by subsection (1)(b) within 18 months of these regulations coming into force.
(2) The Secretary of State may, on receipt of a report in accordance with subsection (1)(f), instruct a public authority to take any further remedial steps within a specified timeframe reasonably necessary to ensure the accuracy of the sex and acquired gender data held by the relevant public authority.
(3) The Secretary of State must, within one month of the passage of this Act, establish and maintain a register of public authorities approved to act as sources of data relating to the attribute of sex for persons providing digital verification services.
(4) The register in subsection (3) must be published on the website of the Office for Digital Identities & Attributes or any successor body.
(5) Until such time as a public authority is added to the register under subsection (3), persons providing digital verification services may only obtain data on the sex of an individual requesting the provision of digital verification services from the record of births held by the General Register Office in accordance with subsection (6).
(6) Information supplied by the General Register Office pursuant to subsection (5) must specify sex as recorded at birth, as well as any subsequent corrections to the register in the field marked ‘Sex’.
(7) The Secretary of State may, from time to time, add public authorities to the register as under subsection (3) only upon being satisfied on the basis of a report issued under subsection (1)(f), or satisfaction of such further steps required by the Secretary of State under subsection (2) that the data held by the relevant public authority in relation to sex and, where relevant, acquired gender as recorded on a gender recognition certificate, as defined in subsection (1)(b), is accurate.”
This new clause requires the Secretary of State to issue regulations relating to the code of practice in section 49 requiring public authorities to record sex data in line with these regulations when data are collected. This clause is linked to amendments 39 and 40.
New clause 22—Recording of ethnicity data for the purposes of public service delivery—
“(1) The Secretary of State must make regulations which make provision for the collection of individual ethnicity data in the process of public service delivery and associated data collection.
(2) The regulations set out by subsection (1) must make provision for ethnic classifications to include Jewish and Sikh categories.
(3) The Secretary of State must lay before both Houses of Parliament a draft of the statutory instrument containing regulations under this section within six months of the day on which this Act is passed which will be subject to the affirmative procedure.”
This new clause requires the Secretary of State to make statutory provision for individual ethnicity data to be collected in the process of public service delivery.
New clause 23—Recording of ethnicity data on the Register of Births and Deaths—
“(1) The Secretary of State must make regulations which make provision for the collection of individual ethnicity data during birth and death registration.
(2) The regulations set out by subsection (1) must make provision for ethnic classifications to include Jewish and Sikh categories.
(3) The Secretary of State must lay before both Houses of Parliament a draft of the statutory instrument containing regulations under this section within six months of the day on which this Act is passed which will be subject to the affirmative procedure.”
This new clause requires the Secretary of State to make statutory provision for individual ethnicity data to be able to be collected during birth and death registration.
Government amendments 11 to 32.
Amendment 39, in clause 45, page 42, line 30, at the beginning insert—
“Save in respect of data relating to sex,”.
This amendment is consequential on NC21.
Amendment 40, page 43, line 15, at end insert—
““gender recognition certificate” means a gender recognition certificate issued in accordance with the Gender Recognition Act 2004.”
This amendment is consequential on NC21.
Government amendments 1 to 8.
Amendment 37, in clause 67, page 75, line 24, at end insert—
“(2A) For the purposes of paragraph 2, ‘scientific research’ means creative and systematic work undertaken in order to increase the stock of knowledge, including knowledge of humankind, culture and society, and to devise new applications of available knowledge.
(2B) To meet the reasonableness test in paragraph 2, the activity being described as scientific research must be conducted according to appropriate ethical, legal and professional frameworks, obligations and standards.”
This amendment incorporates clarifications to help reduce potential misuse of the scientific research exception. The first is a definition of scientific research based on the Frascati Manual. The second is a requirement that research be conducted in line with frameworks and standards in the UKRI Code of Practice for Research.
Amendment 41, in clause 80, page 95, line 19, at end insert—
“3. For the purposes of paragraph 1(a), a human’s involvement is only meaningful if they are a natural person with the necessary competence, authority and capacity to understand, challenge and alter the decision.”
See explanatory statement for Amendment 44.
Amendment 45, page 96, line 2, at end insert—
“5. Consent in accordance with paragraph 2 cannot be given by persons under the age of 18 where—
(a) the automated decision-making is likely to produce legal or similarly significant effects on the child, or
(b) the processing involves the profiling of a child to determine access to essential services, education, or other significant opportunities.
6. The controller shall not be obliged to maintain, acquire or process additional information in order to identify the age of a data subject for the sole purpose of complying with this Regulation.
7. A significant decision may not be taken based solely on automated processing, if the data subject is a child or may be a child unless the provider is satisfied that the decision is in, and compatible with, the best interests of a child, taking into account their rights and development stage, authorised by law to which the controller is subject, and after suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests are made publicly available.
8. Profiling or solely automated processing of children’s data may not occur for the purposes of targeted advertising or behavioural analysis.”
This amendment ensures that automated decision-making cannot take place in circumstances where it would affect a child’s access to significant opportunities or would not be in their best interests, as well as protections against practices such as behavioural analysis.
Amendment 46, page 96, leave out lines 13 to 19 and insert—
“(a) communicate to the data subject before and after the decision is taken the fact that automated decision-making is involved in the decision, the extent of any human involvement, and the availability of safeguards under this Article;
(b) provide the data subject with information about decisions described in paragraph 1 taken in relation to the data subject including meaningful information about the logic involved, the significance and the envisaged consequences of such processing for the data subject, and a personalised explanation for the decision;
(c) enable the data subject to make representations about such decisions;
(d) enable the data subject to obtain human intervention on the part of the controller in relation to such decisions;
(e) enable the data subject to contest such decisions.
3. For the purposes of paragraph 2(b), a personalised explanation must—
(a) be clear, concise and in plain language of the data subject’s choice in a readily available format;
(b) be understandable, and assume limited technical knowledge of algorithmic systems;
(c) address the reasons for the decision and how the decision affects the individual personally, which must include—
(i) the inputs, including any personal data;
(ii) parameters that were likely to have influenced or were decisive to the decision, or a counterfactual of what change would have resulted in a more favourable outcome;
(iii) the sources of parameters and inputs;
(d) be available free of charge and conveniently accessible to the data subject, free of deceptive design patterns.
4. Where the safeguards apply after a decision is made, the controller must give effect to data subject requests as soon as reasonably practicable and within one month of the request.
5. The controller must ensure the safeguards are fully in place and complete a data protection impact assessment under Article 35 before a decision under Article 22A is taken, documenting their implementation of the safeguards in addition to the requirements of that Article.
6. The controller must publish details of their implementation of the safeguards and how data subjects can make use of them.”
This amendment would ensure that data subjects are informed of automated decisions made about them in a timely way, and that that explanation is personalised to enable them to understand why it was made. It also ensures processors are incentivised to put the safeguards in place before commencing automated decision-making.
Amendment 42, page 96, line 23, after “Article 22A(1)(a),” insert
“and subject to Article 22A(3)”.
See explanatory statement for Amendment 44.
Amendment 43, page 97, line 19, at end insert—
“(3) To qualify as meaningful human involvement, the review must be performed by a person with the necessary competence, training, authority to alter the decision and analytical understanding of the data.”
See explanatory statement for Amendment 44.
Amendment 44, page 98, line 31, after “and 50C(3)(c),” insert “and subject to 50A(3)”.
This amendment and Amendments 41, 42 and 43 would make clear that in the context of new Article 22A of the UK GDPR, for human involvement to be considered as meaningful, the review must be carried out by a competent person who is empowered to change the decision in practice.
Amendment 9, in clause 81, page 100, line 7, at end insert—
“Age assurance
1C. Information society services which are likely to be accessed by children must use highly effective age verification or age estimation measures for the purpose of delivering on children’s higher protection matters.”
This amendment requires services which are likely to be accessed by children to use highly effective age verification measures.
Amendment 38, in clause 86, page 103, line 22, at end insert—
“(2A) Where personal data is processed for the purposes of scientific research under section 87(4) of the 2018 Act (‘reuse’), the processor or controller must publish details of the data sources used.
(2B) These details must as a minimum include a description of the scientific research, the provenance and method of acquisition of the personal data being reused, the original lawful basis for processing, the number of data subjects affected, and whether the data subjects have been notified of the reuse.
(2C) The processor or controller must notify the Information Commission when processing data for the purposes of scientific research under section 87(4) of the 2018 Act with the same details.”
This amendment ensures transparency for the use of scientific research exemptions by requiring those reusing personal data to publish details of that reuse and notify the Information Commission of that reuse.
Government amendments 33 and 34.
Amendment 10, in schedule 7, page 201, line 5, at end insert—
“(1B) A third country cannot be considered adequate or capable of providing appropriate safeguards by any authority where there exists no credible means to enforce data subject rights or obtain legal remedy.
(1C) For the purposes of paragraph 1A, the Secretary of State must make a determination as to whether credible means are present in a third country.
(1D) In making a determination regarding credible means, the Secretary of State must have due regard to the view of the Information Commissioner.
(1E) Credible means do not exist where the Secretary of State considers that any of the following are true:
(a) judicial protection of persons whose personal data is transferred to that third country is insufficient;
(b) effective administrative and judicial redress are not present;
(c) effective judicial review mechanisms do not exist; and
(d) there is no statutory right to effective legal remedy for data subjects.”
The amendment would prohibit personal data transfer to countries where data subject rights cannot be adequately upheld and prohibit private entities from using contracts to give the impression that data security exists.
Government amendments 35 and 36.
Earlier I appeared as a Department for Culture, Media and Sport Minister, and now I appear as a Department for Science, Innovation and Technology Minister. I hate to embarrass Members, but they will get two bouts of me today. I will start with the Government amendments, and then once I have heard the arguments from Members advancing other amendments, I will speak to those later in the debate. If I do not cover subjects in this initial speech, I will get back to them later.
The question of technical solutions is very important, but my challenge is this. I have spoken to representatives of some of the big tech companies who are pushing for that, and who are saying that it is hard for them to do it at scale but creatives can do it. Why can the tech companies not be leading on an opt-in system for creatives? Let me hand that back to the Minister.
I should point out that the hon. Lady, as the spokesperson for the Liberal Democrat party, will be speaking very shortly.
I know, but she is wonderful, so we will let her—or you will let her, Madam Deputy Speaker.
This is a really important point. Surely it cannot be impossible for us to find a technical solution. People who can develop AI—and they are now developing AI on their laptops, especially following DeepSeek; they do not need massive computers—should be able to develop a very simple system, as I have said before, whereby all creatives who are copyright owners are able to assert their rights, very simply, across all platforms, without any great exertion. That is what I want to achieve.
The hon. Lady was quite right to raise that question, so what are we going to do next? We say in new clause 17 that we will report in 12 months’ time. If we were to report in 12 months’ time that we had done absolutely nothing, I think that everyone would rightly rant and rave at us. It is our intention that the Secretary of State for Science, Innovation and Technology and the Secretary of State for Culture, Media and Sport will together co-ordinate a special taskforce specifically to consider how we can facilitate, incentivise and enable the development of these technical solutions. I suspect that, if we can get there, opt-out will look remarkably like opt-in.
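By way of illustration only, the kind of machine-readable rights reservation discussed here might resemble the following minimal sketch, in which an AI-training crawler checks a site’s robots.txt before scraping. Everything in it is an assumption made for illustration—the crawler name is hypothetical, and robots.txt is merely one possible vehicle for such a reservation, not a standard the Government or the taskforce have endorsed:

```python
# Minimal sketch, assuming a robots.txt-style reservation honoured by a
# hypothetical AI-training crawler. This is NOT an established standard:
# the "ExampleAITrainingBot" agent name and the use of robots.txt as the
# rights-reservation vehicle are illustrative assumptions only.
from urllib import robotparser
from urllib.parse import urlparse

AI_TRAINING_AGENT = "ExampleAITrainingBot"  # hypothetical crawler name

def may_use_for_training(url: str) -> bool:
    """Return True only if the site's robots.txt does not reserve rights
    against the named AI-training crawler for this URL."""
    parts = urlparse(url)
    parser = robotparser.RobotFileParser()
    parser.set_url(f"{parts.scheme}://{parts.netloc}/robots.txt")
    try:
        parser.read()  # network fetch; treat any failure as "rights reserved"
    except OSError:
        return False
    return parser.can_fetch(AI_TRAINING_AGENT, url)

if __name__ == "__main__":
    # A site whose robots.txt disallows ExampleAITrainingBot would return False,
    # so a compliant crawler would skip it during training-data collection.
    print(may_use_for_training("https://example.com/article"))
```

Under a scheme like this, a creator opting out need only publish one line per crawler, which is why a well-designed opt-out can come to look very like opt-in: absent an affirmative signal of permission (or the lack of a reservation), the compliant crawler simply does not take the work.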
The second matter on which new clause 17 requires us to report is access to data for AI developers to train AI systems in the UK, the third is transparency, and the fourth relates to measures to facilitate the licensing of copyright works for AI training. The publication will be required within 12 months of Royal Assent, and will of course be laid before Parliament. New clause 16 supplements these reports with a full economic impact assessment that will go further than previous assessments, and will present an analysis of the economic impact of a range of policy options available in this context, supported by the additional evidence that the Government have received in response to their consultation. The reporting requirements are important: they mean that we will have to engage with each of these issues apace and in depth, and we will do that. We are determined to find and incentivise technical solutions that support our objectives, and I believe that if we do that we can be a world leader. As I said earlier, the two Secretaries of State will convene working groups to tackle each of these issues.
I have heard people say that we are legislating to water down copyright, but that is simply not true. If Members support the Government’s position today, the UK’s copyright law will remain precisely as robust tomorrow as it is today. For activities in the UK, people will, in law, only be able to use copyright material if they are permitted and licensed to do so or if a copyright exception allows it, such as the existing copyright exceptions for education, public libraries and non-commercial work.
By its nature, enforcement would have to be compulsory, but we are running ahead of ourselves, because nobody has actually come up with a system that has an enforcement mechanism. Who would do it? What body would do it? How would that body be resourced? That is one of the things that we need to look into, and it is one of the elements of the consultation.
I will move on to another subject: the issue of purported intimate images. Government amendment 34 deals with the creation of intimate images or deepfakes. Earlier in the Bill’s passage, my colleague Lord Ponsonby added a new offence of creating purported intimate images without consent or reasonable belief in consent, and I am sure all hon. Members agree that this is a really important addition. In Committee, we introduced the offence of requesting the creation of purported intimate images without consent or reasonable belief in consent, as hon. Members who were on the Public Bill Committee with me will know. It seems axiomatic that the courts should have the power to deprive offenders of the image and anything containing it that relates or is connected to the offence. This is already the case for the creating offence, which was introduced in the House of Lords. Government amendment 34 amends the sentencing code to achieve that for the requesting offence. It ensures that the existing regime of court powers to deprive offenders of property also applies to images and devices containing the image that relate to the requesting offence.
We have tabled a series of amendments to clauses 56 to 59 to reflect our discussions with the devolved Governments on the national underground asset register. The amendments will require the Secretary of State to obtain the consent of Welsh Ministers and the Department for Infrastructure in Northern Ireland, rather than merely consult them, before making regulations in relation to the provisions. Co-operation with the devolved Governments has been consistent and constructive throughout the Bill’s passage. We have secured legislative consent from Scotland, and the Senedd in Wales voted in favour of granting the Bill legislative consent only yesterday. We regret that, for procedural reasons, the process with Northern Ireland has not yet reached the stage of legislative consent. We are, however, working constructively with the Department of Finance to ensure that we can make progress as quickly as possible. We continue to work closely with the Northern Ireland Executive to secure legislative consent, and to ensure that citizens and businesses of Northern Ireland feel the full benefits of the Bill.
Before I finish, I turn to our amendments to help ensure that smart data schemes can function optimally, and that part 1 of the Bill is as clear as possible. Amendments to fee charging under clauses 11 and 15 follow extensive stakeholder engagement, and will maximise the commercial viability of smart data systems by enabling regulations to make tailored provision on fee charging within each smart data scheme. For example, amendments 19 to 21 enable the fees charged to exceed expenses where appropriate. This is necessary to fulfil the commitment in the national payments vision to establish a long-term regulatory framework for open banking. Outside smart data, Government amendment 35
“adds references to investigating crime to existing references in the Data Protection Act 2018 to detecting or preventing crime”,
which will bring these references into line with other parts of the legislation.
It is a privilege to respond to this debate on behalf of His Majesty’s official Opposition, and to speak to the new clauses and amendments. This is an ambitious piece of legislation, which will enable us to harness data—the currency of our digital age—and use it in a way that drives the economy and enhances the delivery of public services. Since its inception under the Conservatives in the last Parliament, the Bill has also become the platform for tackling some of the most pressing social and technological issues of our time. Many of these are reflected in the amendments to the Bill, which are the subject of debate today.
I start with new clause 20. How do we regulate the interaction of AI models with creative works? I pay tribute to the work of many Members on both sides of this House, and Members of the other place, who have passionately raised creatives’ concerns and the risks posed to their livelihoods by AI models. Conservative Members are clear that this is not a zero-sum game. Our fantastic creative and tech industries have the potential to turbocharge economic growth, and the last Government rightly supported them. The creative and technology sectors need and deserve certainty, which provides the foundation for investment and growth. New clause 20 would achieve certainty by requiring the Government to publish a series of plans on the transparency of AI models’ use of copyrighted works, on removing market barriers for smaller AI market entrants and on digital watermarking, and, most important of all, to issue a clear restatement of the application of copyright law to AI-modelling activities.
I cannot help but have a sense of déjà vu in relation to Government new clause 17: we are glad that the Government have taken up several of the actions we called for in Committee, but once again they have chosen PR over effective policy. Amid all the spin, the Government have in effect announced a plan to respond to their own consultation—how innovative!
What is starkly missing from the Government new clauses is a commitment to make it clear that copyright law applies to the use of creative content by AI models, which is the primary concern raised with me by industry representatives. The Government have created uncertainty about the application of copyright law to AI modelling through their ham-fisted consultation. So I offer the Minister another opportunity: will he formally confirm the application of copyright law to protect the use of creative works by AI, and will he provide legal certainty and send a strong signal to our creative industries that they will not be asked to pay the price for AI growth?
Order. I point out to Mr Bryant that Dr Ben Spencer is the shadow Minister.
I think that was wishful thinking by the Minister in this debate.
Our new clause says that we need to look at the desirability of raising the digital age of consent for data processing from 13 to 16, particularly in terms of its impact on issues such as the social and educational development of children, but also at the viability of doing so in the light of the fallout and shaking out of the Online Safety Act and with regard to age verification services. Should there be no evidence demonstrating that it is unnecessary, we would then raise the digital age of consent from 13 to 16. It might be the case that, over the next six months, the shaking out of the Online Safety Act demonstrates that this intervention is not necessary. Perhaps concerns around particular high-risk social media platforms will change as technology evolves. We are saying that the Government should do the work with a view to raising the age in 18 months unless there is evidence to prove the contrary. [Interruption.] I have made this crystal clear, and if the Minister would choose to look at the new clause, rather than chuckling away in the corner, he might see the strategy we are proposing.
I say again that the position is that, following a careful look at the evidence regarding the desirability and viability of doing so—taking into account findings regarding the impact and implementation of the Online Safety Act and age verification, and how one defines social media, particularly high-risk platforms—unless there is direct evidence to show that raising the age from 13 to 16 is unnecessary, which there may be, we should raise it from 13 to 16. If that has not provided clarity, the hon. Gentleman is very welcome to intervene on me again and I will try and explain it a third time, but I think Members have got a grasp now.
This new clause will also tackle some of the concerns at the heart of the campaign for Jools’ law, and I pay tribute to Ellen Roome for her work in this area. I am very sympathetic to the tragic circumstances leading to this campaign and welcome the additional powers granted to coroners in the Bill, but I know that they do not fully address Ellen Roome’s concerns. The Government need to explain how they can be sure that data will be retained in the context of these tragedies, so that a coroner will be able to make sure, even if there are delays, that it can be accessed. If the Minister could provide an answer to that in his winding-up speech, and detail any further work in the area, that would be welcome.
On parental access to children’s data more broadly, there are difficult challenges in terms of article 8 rights on privacy and transparency, especially for children aged 16 to 17 as they approach adulthood. Our new clause addresses some of these concerns and would also put in place the groundwork to, de facto, raise the digital age of consent for inappropriate social media to 16 within 18 months, rendering the request for parental access to young teenage accounts obsolete.
I urge colleagues across the House to support all our amendments today as a balanced, proportionate and effective response to a generational challenge. The Bill and the votes today are an opportunity for our Parliament, often referred to as the conscience of our country, to make clear our position on some of the most pressing social and technological issues of our time.
I call the Chair of the Science, Innovation and Technology Committee.
I would like to thank colleagues in the other place and in this House who have worked so hard to improve the Bill. By modernising data infrastructure and governance, this Bill seeks to unlock the secure, efficient use of data while promoting innovation across sectors. As a tech evangelist, as well as the Chair of the Science, Innovation and Technology Committee, I welcome it, and I am pleased to see colleagues from the Select Committee, my hon. Friend the Member for Stoke-on-Trent South (Dr Gardner) and the right hon. Member for North West Hampshire (Kit Malthouse), here for this debate.
Having spent many unhappy hours when working for Ofcom trying to find out where British Telecom’s ducts were actually buried, I offer a very personal welcome to the national underground asset register, and I thank the Minister for his work on this Bill as well as for his opening comments.
I agree with the Minister that there is much to welcome in this Bill, but much of the Second Reading debate was consumed by discussion on AI and copyright. I know many Members intend to speak on that today, so I will just briefly set out my view.
The problem with the Government’s proposals on AI and copyright is that they give all the power to the tech platforms, which—let us be frank—have a great deal of power already, as well as trillions of dollars in stock market capitalisation and a determination to return value to their shareholders. What they do not have is an incentive to design appropriate technology for transparency and rights reservation if they believe that in its absence they will have free access to our fantastic creators’ ingenuity. It is essential that the Minister convinces them that if they do not deliver this technology—I agree with him that it is highly possible to do so—then he will impose it.
Perhaps the Minister could announce an open competition, with a supplier contract as the prize, for whichever innovative company designs something. The Science, Innovation and Technology Committee, sitting with the Culture, Media and Sport Committee, heard from small companies that can do just that. The tech giants might not like it, but I often say that the opposite of regulation is not no regulation—it is bad regulation. If the tech platforms do not lead, they will be obliged to follow because the House will not allow the copyright of our fantastic creators to be put at risk. The Minister knows that I think him extremely charismatic and always have done, but I do not believe that “Chris from DSIT” can prevail against the combined forces of Björn from Abba and Paul from The Beatles.
The prospects for human advancement opened by using data for scientific research are immense. As a world-leading science powerhouse, the UK must take advantage of them. That is why, despite being a strong advocate of personal data rights, I welcome the Bill’s proposals to allow the reuse of data without consent for the purposes of scientific research. I am concerned, however, that the exemption is too broad and that it will be taken advantage of by data-hungry tech companies using the exemption even if they are not truly advancing the cause of scientific progress but simply, as with copyright, training their AI models.
Huge amounts of data are already collected by platforms, whether direct messages on Instagram or material gathered via web-scraping of any website that contains an individual’s personal data, such as published records or people’s public LinkedIn pages. We know it can be misused because it has been, most recently with Meta’s controversial decision to use Instagram-user data to train AI models, triggering an Information Commissioner’s Office response because of the difficulty users encountered in objecting to it. Then there is the risk of data collected via tracking cookies or the profiling of browsing behaviour, which companies such as Meta use to fingerprint people’s devices and track their browsing habits. Could the data used to create ads also be freely reusable under this exemption? The US tech firm Palantir has the contract for the NHS federated data platform. Amnesty International has already raised concerns about the potential for patients’ data being mishandled. Does the Bill mean that our health data could be reused by Palantir for what it calls research purposes?
I thank my hon. Friend for that intervention. The Minister referred to that briefly, describing it, in relation to AI, as a pipeline where bad data in would mean bad data out. My hon. Friend knows that the definition of sex and gender has been controversial and contested. The Supreme Court brought some clarity and it is important that data collection reflects consistency and clarity. If we have bad data definitions, we will undoubtedly have bad consequences. As I said, it is important that we have consistency and definition when it comes to the collection of data for these purposes, and I look forward to hearing how that will be achieved.
I also want to speak briefly in support of clause 125, which introduces rules allowing researchers to access data from online services for online safety research. The Science, Innovation and Technology Committee’s inquiry into social media algorithms and misinformation heard considerable evidence on the role of algorithms in pushing misinformation generally, and particularly to children. I very much welcome this clause, which will increase transparency, but could the Minister clarify that it will fully cover the recommender algorithms used by social media platforms, which drive new content to users?
My constituents often feel that advances in technology are done to them rather than with them and for their benefit. Critically, our constituents need to feel that they have agency over the way data impacts their lives. Rather than feeling empowered by digital innovation, too many feel the opposite: disempowered, undermined, dehumanised, tracked and even attacked. Delivering the improvements promised by the Bill must therefore go hand in hand with respecting the rights of citizens to control and manage their data, and with ensuring that the benefits of innovation and scientific research flow to them.
Thank you for calling me, Madam Deputy Speaker, and for your patience regarding my earlier intervention. I am very passionate about all elements of the Bill.
On Second Reading, I said:
“Data is the new gold”—[Official Report, 12 February 2025; Vol. 762, c. 302.]
—a gold that could be harnessed to have a profound impact on people’s daily lives, and I stand by that. With exponential advances in innovation almost daily, this has never been truer, so we must get this right.
I rise today to speak to the amendments and new clauses tabled in my name specifically, and to address two urgent challenges: protecting children in our digital world and safeguarding the rights of our creative industry in the age of artificial intelligence. The Bill before us represents a rare opportunity to shape how technology serves people, which I firmly believe is good for both society and business. However, I stand here with mixed emotions: pride in the cross-party work we have accomplished, including with the other place; hope for the progress we can still achieve; but also disappointment that we must fight so hard for protections that should be self-evident.
New clause 1 seeks to raise the age of consent for social media data processing from 13 to 16 years old. We Liberal Democrats are very clear where we stand on this. Young minds were not designed to withstand the psychological assault of today’s social media algorithms. By raising the age at which children can consent to have their data processed by social media services, we can take an important first step towards tackling those algorithms at source. This is a common-sense measure, bringing us in line with many of our European neighbours.
The evidence before us is compelling and demands our attention. When I recently carried out a safer screens tour of schools across Harpenden and Berkhamsted to hear exactly what young people think about the issue, I heard that they are trapped in cycles of harmful content that they never sought out. Students spoke of brain rot and described algorithms that pushed them towards extreme content, despite their efforts to block it.
The evidence is not just anecdotal; it is overwhelming. Child mental health referrals have increased by 477% in just eight years, with nearly half of teenagers with problematic smartphone use reporting anxiety. One in four children aged 12 to 17 have received unwanted sexual images. We know that 82% of parents support Government intervention in this area, while a Liberal Democrat poll showed that seven in 10 people say the Government are not doing enough to protect children online.