Online Abuse: Protection for Children Debate
Baroness Manzoor (Conservative - Life peer)
Lords Chamber

In this case, Ofcom can do only what legislators ask it to do or provide for it to do. It is limited in that. As noble Lords will know, Ofcom has a clear remit to implement the Online Safety Act. I know that we have discussed this several times before, but I think that, as we roll out the illegal content codes and the children's safety code, they will make a profound difference to what children can see. I am confident that Ofcom has the resources and wherewithal to make that step change, which we all know is necessary.
My Lords, I declare an interest in that I am an ex-trustee of the NSPCC. One of the answers that the Minister gave concerned algorithms. What experience and expertise does Ofcom have to ensure that those algorithms capture the vast majority of harm that is put on the internet and on social media, given that whoever develops the algorithms holds the key to this?
My Lords, the noble Baroness is absolutely right. Algorithms are a real challenge, and we know some of the damage that can be done by them if they do not operate effectively. When Ofcom published its child safety codes on 24 April, it set out 40 measures that companies are expected to take to comply with the child safety duties. These measures include age-assurance technology, changing algorithms to filter out harmful content and adopting mechanisms so that parents and children can easily report harmful content. Addressing algorithms is part of the children's code. Over time, Ofcom will be able to report on how successful it has been in requiring that of platforms.