Online Harm: Child Protection Debate
Danny Chambers (Liberal Democrat - Winchester), debate with the Department for Science, Innovation & Technology
Commons Chamber
Dr Danny Chambers (Winchester) (LD)
This week is Eating Disorders Awareness Week, so I would like to pay tribute to the amazing staff at Leigh House in Winchester, an in-patient unit that cares for people with eating disorders in Winchester and the surrounding area. Eating disorders are among the most serious mental health conditions people can suffer from, and among the most frustrating to treat and care for. They existed before social media, but social media is certainly making things more difficult. The body images that young people—teenagers and younger—are exposed to, and the normalisation of AI-altered images that are impossible to attain but which are presented as normal and aspirational, are hugely unhealthy.
We know that AI chatbots, which are often integrated into social media, are giving people mental health support and advice. I am really concerned about reports that patients with eating disorders are managing to get advice on how better to lose weight or even gain access to weight-loss drugs, which would make their condition much worse. I bring that up because there is a specific problem with AI chatbots. Some research shows that children do not recognise that a chatbot, which is often presented within social media as a companion, friend or cartoon character, does not have feelings, is not a person and does not care for their health and wellbeing.
It is very possible that, with the right regulation, AI and AI chatbots could be part of extending mental healthcare to people in the community at some point in the future. At the moment, the technology is dangerous and unregulated, and people accessing it are not even aware that it is giving them information that is potentially harmful to their health. I do not want this to fall through the cracks of regulation. Whatever we come forward with, whether it is about social media specifically or broadcasting licences, it should be based on principles. Banning or regulating specific social media platforms or chatbots will be very unhelpful because of the speed at which these things are developing. It is a bit like whack-a-mole: once we regulate one, another pops up.
I draw everyone’s attention to the GUARD Act—Guidelines for User Age-verification and Responsible Dialogue—which was passed in the US last year. Very unusually, it had cross-party support despite the very fractious politics in the US at the moment. It regulates AI chatbots by requiring them to remind users regularly that they are not human or qualified to give medical advice, and it ensures that chatbots are not allowed to provide sexual content or engage in sexual or grooming-type discussions, and that they do not lead users to believe they are speaking with therapists. I hope that we can focus our minds—especially during Eating Disorders Awareness Week—on the potential danger of young people being given what they believe to be medical advice by chatbots, which may be presented to them as friends or cartoon characters. That advice could be hugely harmful to their health.
I urge the House and the Government to move with extreme speed to address the problem. Two or three years ago, most of the general public had not really heard of ChatGPT. Now, we hear that around 50% of professionals use it regularly, over one in five people use it daily, and one in three adults have already turned to chatbots for mental health advice or emotional support. That is a huge and sudden change: these tools are rapidly becoming part of our culture and daily lives. We must ensure that we do not look back on this as we did with smoking. We knew for years the damage that smoking was doing to people, but action was not taken, evidence was obfuscated and lawmakers were lobbied. The delay meant that people died needlessly. We must get ahead of this and take action as quickly as possible.