


Written Question
Internet: Racial Discrimination
Monday 19th July 2021

Asked by: Chris Elmore (Labour - Ogmore)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, whether racism that falls short of the standard of a racial hatred offence will be covered by the Online Safety Bill as a priority harm.

Answered by Caroline Dinenage

Racism online is completely unacceptable and has no place in an open and tolerant society. All companies whose services are likely to be used by children will have to protect them from racist content that falls short of the criminal threshold. Companies providing high-risk, high-reach services, such as the main social media services, will also need to address legal content of this type that is harmful to adults. Racist abuse falls within the definition of harmful content that companies must address.

The government will set out priority harms for both children and adults in secondary legislation following consultation with Ofcom. Racist abuse that does not meet the threshold of a criminal offence will likely be a priority harm.


Written Question
Internet: Safety
Wednesday 26th May 2021

Asked by: Chris Elmore (Labour - Ogmore)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, what plans he has to include provisions to ensure cross-platform co-operation in combating online harms in the Online Safety Bill announced in the Queen's Speech 2021.

Answered by Caroline Dinenage

The Online Safety Bill will address harmful content shared across multiple services in several ways. Ofcom will have a duty to publish a risk assessment identifying risks to individuals on regulated services. This will cover risks associated with the cross-platform nature of harms.

Companies will need to assess whether these harms are likely to appear on their services and mitigate the risks of them doing so. Ofcom will set out details on how this can be achieved in codes of practice. Where appropriate, these will include measures to address cross-platform harms and could include cooperation between platforms.

Ofcom will also undertake research and horizon-scanning to spot any cross-platform emerging issues, backed up by robust information gathering powers. It will have a role in sharing best practice on mitigation amongst service providers. This will drive improvements in the ways service providers identify and tackle these issues.

In addition, the super-complaints process will enable organisations to submit evidence of systemic issues that are causing harm to certain groups across multiple services, which Ofcom will review.


Written Question
Internet: Safety
Wednesday 26th May 2021

Asked by: Chris Elmore (Labour - Ogmore)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, what plans he has to include provisions that tackle harmful content shared across multiple platforms in the Online Safety Bill announced in the Queen's Speech 2021.

Answered by Caroline Dinenage

The Online Safety Bill will address harmful content shared across multiple services in several ways. Ofcom will have a duty to publish a risk assessment identifying risks to individuals on regulated services. This will cover risks associated with the cross-platform nature of harms.

Companies will need to assess whether these harms are likely to appear on their services and mitigate the risks of them doing so. Ofcom will set out details on how this can be achieved in codes of practice. Where appropriate, these will include measures to address cross-platform harms and could include cooperation between platforms.

Ofcom will also undertake research and horizon-scanning to spot any cross-platform emerging issues, backed up by robust information gathering powers. It will have a role in sharing best practice on mitigation amongst service providers. This will drive improvements in the ways service providers identify and tackle these issues.

In addition, the super-complaints process will enable organisations to submit evidence of systemic issues that are causing harm to certain groups across multiple services, which Ofcom will review.


Written Question
Internet: Safety
Wednesday 26th May 2021

Asked by: Chris Elmore (Labour - Ogmore)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, what steps he is taking to support Ofcom to prepare for its role as the independent online safety regulator as announced in the Queen's Speech 2021.

Answered by Caroline Dinenage

The new Online Safety regulatory remit will entail a significant expansion of Ofcom’s existing responsibilities. We are working closely with Ofcom to ensure it is prepared for its new role, and to ensure the legislation is effectively implemented. This includes work to ensure it has the resources, skills and capabilities it needs to prepare to take on its new functions.


Written Question
Internet: Safety
Wednesday 26th May 2021

Asked by: Chris Elmore (Labour - Ogmore)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, if he will publish the timetable for the Online Safety Bill announced in the Queen's Speech 2021 including for (a) pre-legislative scrutiny, (b) the date on which relevant businesses will be obliged to report their risk assessments to Ofcom and (c) post-legislative scrutiny to assess whether the regime is working.

Answered by Caroline Dinenage

The Online Safety Bill will be subject to pre-legislative scrutiny in this session. It is for Parliament to determine when the Bill will be scrutinised, but I hope that the process will be able to start shortly now that the draft Bill has been published. This is a priority for my Department and for the Home Office; however, the timetable for introduction is dependent on the wider parliamentary timetable.

The Online Safety Bill will place a duty on Ofcom to carry out a risk assessment of the sector and, as soon as is reasonably practicable, to issue guidance to companies about risk assessments. Companies will then have three months to carry out their risk assessments, unless they agree a longer timetable with Ofcom.

In order to assess the effectiveness of the regulatory framework, the Online Safety Bill provides for a review to be undertaken by the Secretary of State, to be published and laid before Parliament, between 2 and 5 years after the duties on services are commenced.


Written Question


Monday 17th May 2021

Asked by: Chris Elmore (Labour - Ogmore)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, what discussions he has had with the Home Secretary on the protection of children’s data online.

Answered by John Whittingdale

The Government is committed to making sure that we have high data protection standards and that people of all ages are confident that their personal data will be protected and used in an appropriate way.

All organisations in the UK that process personal data have to comply with the UK General Data Protection Regulation (UK GDPR) and the Data Protection Act 2018 (DPA). Any use of children’s data must be lawful, fair and transparent. Children should be given clear information about how their data will be used, and they have the same rights as adults to access their data, request rectification, object to its processing, or have it erased. Organisations offering online services directly to children must seek parental consent to process the personal data of children under the age of 13.

The DPA requires the Information Commissioner, the independent data protection regulator, to publish an Age Appropriate Design Code. The Code sets out standards of age-appropriate design that companies will need to implement to ensure their services appropriately safeguard children’s personal data and process children’s personal data fairly. The Code came into force in September 2020 with a 12-month transition period for industry. It will play a key role in delivering protections for children ahead of and alongside the government’s new online safety regulatory framework. Organisations will need to conform by 2nd September 2021.

The ICO has committed to providing a package of support to organisations during the transition period to support conformance to the Code, with all guidance contained in a Children’s Code Hub on the ICO’s website at https://ico.org.uk/for-organisations/childrens-code-hub/. The ICO is ensuring it engages with experts, children and parents when developing guidance, and has recently launched a Children’s Advisory Panel to support the implementation of the Code.

The ICO has also advertised for transparency champions to submit privacy information designs so that children can easily understand how, when and why services use their data.

Discussions about data protection and online safety are held regularly across government.


Written Question


Monday 17th May 2021

Asked by: Chris Elmore (Labour - Ogmore)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, what steps he is taking to ensure the protection of children’s personal data online.

Answered by John Whittingdale

The Government is committed to making sure that we have high data protection standards and that people of all ages are confident that their personal data will be protected and used in an appropriate way.

All organisations in the UK that process personal data have to comply with the UK General Data Protection Regulation (UK GDPR) and the Data Protection Act 2018 (DPA). Any use of children’s data must be lawful, fair and transparent. Children should be given clear information about how their data will be used, and they have the same rights as adults to access their data, request rectification, object to its processing, or have it erased. Organisations offering online services directly to children must seek parental consent to process the personal data of children under the age of 13.

The DPA requires the Information Commissioner, the independent data protection regulator, to publish an Age Appropriate Design Code. The Code sets out standards of age-appropriate design that companies will need to implement to ensure their services appropriately safeguard children’s personal data and process children’s personal data fairly. The Code came into force in September 2020 with a 12-month transition period for industry. It will play a key role in delivering protections for children ahead of and alongside the government’s new online safety regulatory framework. Organisations will need to conform by 2nd September 2021.

The ICO has committed to providing a package of support to organisations during the transition period to support conformance to the Code, with all guidance contained in a Children’s Code Hub on the ICO’s website at https://ico.org.uk/for-organisations/childrens-code-hub/. The ICO is ensuring it engages with experts, children and parents when developing guidance, and has recently launched a Children’s Advisory Panel to support the implementation of the Code.

The ICO has also advertised for transparency champions to submit privacy information designs so that children can easily understand how, when and why services use their data.

Discussions about data protection and online safety are held regularly across government.


Written Question
Video on Demand: Age Ratings
Monday 19th April 2021

Asked by: Chris Elmore (Labour - Ogmore)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, what assessment he has made of the barriers preventing (a) Disney+ and (b) other platforms from using the BBFC's best practice guidelines.

Answered by Caroline Dinenage

The Government has great trust in the best practice age ratings of the British Board of Film Classification (BBFC), the designated body for age classification of film content, and continues to support the adoption of BBFC ratings for content on video on demand platforms.

While adoption of the BBFC’s age ratings by such platforms is currently voluntary, we welcome their usage and were particularly pleased to see Netflix announce on 1 December 2020 that they have become the first platform to achieve complete coverage of their content under the BBFC’s ratings.

The Government has not made any specific assessment regarding parents’ expectations of video-on-demand platforms’ content being classified in line with the BBFC's standards, or the barriers that platforms face to adopting the ratings. We note, however, that the BBFC regularly consults with the public and publishes its research online. The Government continues to engage with platforms to adopt the BBFC’s ratings across all of their content, and will keep the evidence for legislation in this area under review.


Written Question
Video on Demand: Age Ratings
Monday 19th April 2021

Asked by: Chris Elmore (Labour - Ogmore)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, if his Department will take steps to support the adoption of BBFC age-rating standards for content on video on demand platforms.

Answered by Caroline Dinenage

The Government has great trust in the best practice age ratings of the British Board of Film Classification (BBFC), the designated body for age classification of film content, and continues to support the adoption of BBFC ratings for content on video on demand platforms.

While adoption of the BBFC’s age ratings by such platforms is currently voluntary, we welcome their usage and were particularly pleased to see Netflix announce on 1 December 2020 that they have become the first platform to achieve complete coverage of their content under the BBFC’s ratings.

The Government has not made any specific assessment regarding parents’ expectations of video-on-demand platforms’ content being classified in line with the BBFC's standards, or the barriers that platforms face to adopting the ratings. We note, however, that the BBFC regularly consults with the public and publishes its research online. The Government continues to engage with platforms to adopt the BBFC’s ratings across all of their content, and will keep the evidence for legislation in this area under review.


Written Question
Video on Demand: Age Ratings
Monday 19th April 2021

Asked by: Chris Elmore (Labour - Ogmore)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, whether his Department has made an assessment of the level of expectation among parents that VOD platforms should ensure that their film and TV content is age-rated in line with the BBFC's standards for content released in cinemas and on DVD.

Answered by Caroline Dinenage

The Government has great trust in the best practice age ratings of the British Board of Film Classification (BBFC), the designated body for age classification of film content, and continues to support the adoption of BBFC ratings for content on video on demand platforms.

While adoption of the BBFC’s age ratings by such platforms is currently voluntary, we welcome their usage and were particularly pleased to see Netflix announce on 1 December 2020 that they have become the first platform to achieve complete coverage of their content under the BBFC’s ratings.

The Government has not made any specific assessment regarding parents’ expectations of video-on-demand platforms’ content being classified in line with the BBFC's standards, or the barriers that platforms face to adopting the ratings. We note, however, that the BBFC regularly consults with the public and publishes its research online. The Government continues to engage with platforms to adopt the BBFC’s ratings across all of their content, and will keep the evidence for legislation in this area under review.