Written Question
Social Media: Abuse
Monday 6th June 2022

Asked by: Catherine McKinnell (Labour - Newcastle upon Tyne North)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, if she will make it her policy to (a) issue fines, or (b) otherwise sanction social media platforms that fail to prevent users previously banned from a platform for abusive behaviour from creating new accounts on the platform; and if she will make a statement.

Answered by Chris Philp - Minister of State (Home Office)

The Online Safety Bill will require social media platforms to tackle illegal content, including illegal abuse. The largest, highest-risk platforms will also need to set clear terms of service for such content and ensure they are properly enforced. Ofcom will set out, in codes of practice, the steps that companies can take to fulfil their duties, and these could include measures such as preventing banned users from creating new accounts.

Ofcom will be able to sanction companies that fail to adequately fulfil their new duties under the Bill. It will have a range of enforcement powers available to it, including powers to issue substantial fines, to require operators to take steps to remedy breaches and/or come into compliance with their duties, and to apply to the court for business disruption measures (including blocking) where appropriate.


Written Question
Internet: Hate Crime
Monday 6th June 2022

Asked by: Catherine McKinnell (Labour - Newcastle upon Tyne North)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, what assessment she has made of the adequacy of the protections provided by Schedule 7 of the Online Safety Bill against online hate speech specifically targeted at (a) women and (b) disabled people; and if she will make a statement.

Answered by Chris Philp - Minister of State (Home Office)

The Online Safety Bill contains robust protections for women, girls and disabled people online, who face disproportionate volumes of abuse.

All services in scope will need to put in place proportionate systems and processes to minimise the risk of priority illegal content appearing on their service, and to swiftly remove it when it does appear. This will protect all online users, including from content which could constitute hate speech under the Public Order Act 1986. This includes behaviour that is threatening and abusive and results in harassment, alarm or distress. The list of priority offences includes a number of offences which disproportionately affect women and girls, such as revenge pornography and cyberstalking, which companies must tackle.

Beyond the priority offences, all services will need to ensure that they quickly take down other illegal content directed at women, girls and disabled people once it has been reported or they otherwise become aware of its presence. Women, girls and disabled users will also be able to report abuse and should expect to receive an appropriate response from the platform. When performing its duties, Ofcom will also have a duty to consider the vulnerability of users whose circumstances appear to put them in need of special protection.

If major platforms do not meet their own standards for keeping people safe, they could face investigation and enforcement action.


Speech in Westminster Hall - Mon 28 Feb 2022
Online Abuse

View all Catherine McKinnell (Lab - Newcastle upon Tyne North) contributions to the debate on: Online Abuse

Speech in Westminster Hall - Tue 01 Feb 2022
Hadrian’s Wall: Newcastle’s West End

View all Catherine McKinnell (Lab - Newcastle upon Tyne North) contributions to the debate on: Hadrian’s Wall: Newcastle’s West End

Speech in Commons Chamber - Mon 17 Jan 2022
BBC Funding

View all Catherine McKinnell (Lab - Newcastle upon Tyne North) contributions to the debate on: BBC Funding

Written Question
Internet: Safety
Wednesday 14th July 2021

Asked by: Catherine McKinnell (Labour - Newcastle upon Tyne North)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, if he will publish (a) the social media companies, (b) other online platforms and (c) experts on online harms that his Department consulted with in the process of formulating the guidance entitled 'Online Safety Guidance if you own or manage an online platform', published on 29 June 2021.

Answered by Caroline Dinenage

The voluntary, non-statutory guidance entitled ‘Online Safety Guidance if you own or manage an online platform’ published by the Government is targeted at small and medium-sized enterprises (SMEs) and at start-up organisations, specifically those that are likely to be in scope of future Online Safety legislation. The guidance was developed in consultation with relevant SMEs and start-ups to understand their needs and to frame the guidance in the most user-friendly and effective way. This included multiple rounds of user research and feedback on how information should be presented.

We are unable to publicly name individual companies that took part in user testing, due to the approach agreed with these organisations when they took part in the research. DCMS also worked with relevant industry bodies who represent hundreds of SME and start-up organisations to draw up the content for the guidance. This includes Tech Nation, the Coalition for a Digital Economy (Coadec), the Federation of Small Businesses, the Independent Game Developers' Association (Tiga) and the Association for UK Interactive Entertainment (UKIE), as well as subject matter experts in government and civil society.


Speech in Commons Chamber - Wed 24 Mar 2021
Online Anonymity and Anonymous Abuse

View all Catherine McKinnell (Lab - Newcastle upon Tyne North) contributions to the debate on: Online Anonymity and Anonymous Abuse

Written Question
Video Recordings: Internet
Monday 15th March 2021

Asked by: Catherine McKinnell (Labour - Newcastle upon Tyne North)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, what steps he is taking to ensure that platforms carrying user-generated video content engage with the British Board of Film Classification (BBFC) on its best practice age labelling guidelines.

Answered by Caroline Dinenage

The BBFC offers unparalleled expertise in content classification. Ministers and officials will continue to engage with a wide range of stakeholders, including the BBFC, on the video sharing platform regime and the upcoming Online Safety Bill.

The video sharing platform regime, for which Ofcom is the regulator, came into force on 1 November 2020. UK-established video sharing platforms must now take appropriate measures to protect the public, including minors, from illegal and harmful material. Video sharing platforms are not currently mandated to adopt BBFC ratings, nor is it expected that they will be mandated to do so under Ofcom’s regulatory regime for video sharing platforms.

The Government recognises age ratings as an important tool for audience protection; however, they are most effective when used in conjunction with other protection tools, such as age assurance and parental controls. Video sharing platforms encompass a broad range of services, so it is important that the regime is flexible enough to allow tailored approaches. Platforms should consider which measures are most appropriate and proportionate to introduce on their services.

The BBFC is engaging with both Ofcom and online platforms to share its expertise on emerging technologies and the applicability of content ratings. The Government will also continue to engage with the BBFC, Ofcom and industry to encourage platforms to adopt appropriate content labelling and other age assurance measures in relation to the upcoming Online Safety Bill.


Written Question
British Board of Film Classification
Thursday 11th March 2021

Asked by: Catherine McKinnell (Labour - Newcastle upon Tyne North)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, if he will consult with the British Board of Film Classification on the development of the Online Safety Bill in relation to (a) content standards and (b) the regulation of pornographic content.

Answered by Caroline Dinenage

Ministers and officials will continue to engage with a wide range of stakeholders on the development of the Online Safety Bill, including the British Board of Film Classification (BBFC). The BBFC offers unparalleled expertise in content classification, including pornographic content.

The video sharing platform regime, for which Ofcom is the regulator, came into force on 1 November 2020. UK-established video sharing platforms must now take appropriate measures to protect the public, including minors, from illegal and harmful material. Ofcom and the BBFC have a strong collaborative relationship when working on audience protection issues, and the BBFC is engaging actively with both Ofcom and video sharing platforms to share its expertise on emerging technologies and the applicability of content ratings.

Over the past year, the Government has also been working with the BBFC and industry to drive the voluntary adoption of the BBFC's age rating symbols by video-on-demand platforms. We will continue to engage with industry to encourage platforms to use BBFC age ratings, and will keep the evidence for legislation in this area under review.