Social Media: Artificial Intelligence

(asked on 17th January 2023)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, if her Department will take steps to help prevent the coding of social media algorithms which create and increase racially prejudiced stereotypes.


Answered by
Paul Scully
This question was answered on 24th January 2023

Under the Online Safety Bill, all platforms will need to undertake risk assessments for illegal content and content that is harmful to children. This will ensure they understand the risks associated with their services, including in relation to their algorithms. They will then need to put in place proportionate systems and processes to mitigate these risks.

Platforms will need to put in place systems and processes to prevent their users from encountering priority illegal content. This includes offences relating to racial hatred. Platforms that are likely to be accessed by children will also need to fulfil these duties in relation to harmful content and activity, including online abuse and harassment.

Where content does not meet the criminal threshold, Category 1 platforms will be required to give all adult users tools offering greater control over the content they see, where such content is likely to be encountered on the service. These tools will apply specifically to content that is abusive, or that incites hate, on the basis of race or religion. If users choose to use these tools, the tools will either reduce the likelihood of their encountering such content or alert them to its nature.
