Social Media: Mental Health

(asked on 5th December 2022)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, what steps her Department is taking to regulate social media algorithms to reduce user exposure to (a) self-harm and (b) suicide-related content.


Answered by
Paul Scully
This question was answered on 8th December 2022

Under the Online Safety Bill, all platforms will need to undertake risk assessments for illegal content and content that is harmful to children. This will ensure they understand the risks associated with their services, including in relation to their algorithms. They will then need to put in place proportionate systems and processes to mitigate these risks.

Platforms that are likely to be accessed by children will need to fulfil these duties in relation to harmful content and activity, including legal self-harm and suicide content. Assisting suicide has also been designated as a priority offence in the Bill, so all platforms will be required to take proactive steps to tackle this type of illegal content. The government will also bring forward a new self-harm offence. Companies will therefore need to remove communications that intentionally encourage or assist self-harm.

The largest platforms will also have a duty to offer all adult users tools to reduce their exposure to certain kinds of legal content. On 29 November the government announced its intention for these tools to apply to legal self-harm and suicide content. These tools could include the option of switching off algorithmically recommended content.
