Video Recordings: Internet

(asked on 17th March 2021)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, what assessment his Department has made of the effectiveness of age ratings linked to parental filters as a means of preventing children’s exposure to inappropriate user-generated content on (a) YouTube and (b) other such sites.


Answered by Caroline Dinenage
This question was answered on 23rd March 2021

Protecting children is at the heart of our online harms agenda and wider government priorities. Where sites host user-generated content or facilitate online user interaction, such as video and image sharing, commenting and live streaming, that content will be subject to the new duty of care. Under our online harms proposals, companies likely to be accessed by children will be required to assess the risks that material on their service poses to children of different ages and put in place age-appropriate protective measures. Ofcom will set out the steps companies can take to protect children, so there will be a consistent approach across platforms.

The video sharing platform regime, for which Ofcom is the regulator, came into force on 1 November 2020. UK-established video sharing platforms must now take appropriate measures to protect the public, including minors, from illegal and harmful material. To comply with the regime, video sharing platforms may adopt age assurance measures alongside other measures such as age ratings and parental controls. Platforms must take freedom of expression into account and should consider which measures are most appropriate and proportionate when introducing them.

We will continue to engage with industry to encourage platforms to use age ratings, and will keep the evidence for legislation in this area under review.
