
Written Question
Arts: Copyright
Thursday 11th January 2024

Asked by: Kirsten Oswald (Scottish National Party - East Renfrewshire)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Culture, Media and Sport, whether she has made an assessment of the potential merits of bringing forward legislative proposals to prevent the unauthorised use of creative content by AI.

Answered by Julia Lopez - Minister of State (Department for Science, Innovation and Technology)

The UK has world-leading protections for copyright and intellectual property, and we are committed to maintaining them. We want rights holders to be assured that AI firms will use their content appropriately and lawfully. The Intellectual Property Office has been working with rights holders and AI firms to clarify the relationship between AI and copyrighted works. An update on this work will be published in due course.


Written Question
Internet: Abuse
Wednesday 5th January 2022

Asked by: Kirsten Oswald (Scottish National Party - East Renfrewshire)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, whether she has made a recent assessment of the economic impact over the course of a person's life associated with online abuse for (a) women and (b) men.

Answered by Chris Philp - Minister of State (Home Office)

Assessing the prevalence and economic impact of online abuse is difficult, even more so in the context of a single individual over their lifetime. While data on online abuse is limited, the government did assess the economic and social cost of a number of online harms in its impact assessment published in May 2021 to support the draft Online Safety Bill. The full methodology used to quantify the economic cost of online harms can be found on page 70 of the impact assessment. The Government is currently working on a final stage impact assessment for the Online Safety Bill which will provide updated estimates of the economic cost of online harms.

The Government is committed to addressing data limitations in this area. This year, we have partnered with the Alan Turing Institute to launch an Online Harms Observatory with a particular focus on online hate. It will provide real-time insights into the scope, prevalence and dynamics of harmful online content using a mix of large-scale data analysis, AI and survey data. In addition, the government and Ofcom are continuing to conduct research looking at the prevalence and impact of online harms. The prevalence and impact of online abuse in a variety of contexts will be a key focus.

Online abuse can have significant and wide-ranging impacts on victims. This is unacceptable, and under the Online Safety Bill companies in scope will need to protect users from illegal abuse. Major platforms will also need to address manifestations of online abuse which may be legal but are still harmful to adults. Priority categories of legal but harmful content for adults will be set out in secondary legislation and these are likely to include some forms of online abuse.


Written Question
Social Media: Abuse
Tuesday 21st December 2021

Asked by: Kirsten Oswald (Scottish National Party - East Renfrewshire)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, what steps she is taking to ensure that social media companies do not rely solely on artificial intelligence or algorithm systems to protect users from abuse taking place on their platforms.

Answered by Chris Philp - Minister of State (Home Office)

Under the draft Online Safety Bill, social media companies will have new duties to protect their users from harmful content such as online abuse. Ofcom, as the independent regulator, will recommend proportionate systems and processes, including for content moderation, that social media companies should put in place to fulfil these duties. We anticipate that Ofcom will recommend a combination of human moderation and other systems, depending on what is effective and proportionate for in-scope services.


Written Question
Social Media: Abuse
Tuesday 21st December 2021

Asked by: Kirsten Oswald (Scottish National Party - East Renfrewshire)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, what steps she is taking to ensure that social media companies (a) invest in human moderation to support reporting and content moderation systems and (b) provide training and support to enable staff to respond effectively to abuse taking place on their platforms.

Answered by Chris Philp - Minister of State (Home Office)

Under the draft Online Safety Bill, social media companies will have new duties to protect their users from harmful content such as online abuse. Ofcom, as the independent regulator, will recommend proportionate systems and processes, including for content moderation, that social media companies should put in place to fulfil these duties. We anticipate that Ofcom will recommend a combination of human moderation and other systems, depending on what is effective and proportionate for in-scope services.


Written Question
Internet: Safety
Tuesday 21st December 2021

Asked by: Kirsten Oswald (Scottish National Party - East Renfrewshire)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, what steps she is taking to (a) ensure that providers of online platforms consider how their products can be used to perpetrate online abuse and (b) place a responsibility on those providers to embed safety by design in their technology.

Answered by Chris Philp - Minister of State (Home Office)

A safety by design approach will be crucial for compliance with future online safety legislation. The Online Safety Bill will help ensure companies effectively manage the risk that their services present; this includes how they may be used to perpetrate online abuse. All companies in scope will be required to complete a risk assessment. Companies will need to understand the risks that their design choices present to users and put in place appropriate mitigation actions. The Government is also taking action now, in advance of the legislation coming into force, to ensure that online platforms are designed to be safe for users. We published voluntary safety by design guidance in June this year, which sets out best practice platform design for user safety.


Written Question
Internet: Females
Tuesday 21st December 2021

Asked by: Kirsten Oswald (Scottish National Party - East Renfrewshire)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, with reference to the Unsocial Spaces report, published by Refuge in October 2021, what assessment she has made of the implications for her policies of the finding by Refuge that one in three women in the UK have suffered online abuse.

Answered by Chris Philp - Minister of State (Home Office)

Online abuse is unacceptable and we are committed to protecting women’s safety.

Under the draft Online Safety Bill, companies in scope will need to minimise and remove illegal content including criminal online abuse targeted at women.

Major platforms will also need to address legal but harmful content for adults. These companies will have to set out clearly what legal content is acceptable on their platforms and enforce their terms and conditions consistently and transparently.

If platforms fail in their duties under the Bill, they will face tough enforcement action including fines of up to 10% of global annual qualifying turnover.

The government also asked the Law Commission to review existing legislation on harmful online communications. The Law Commission has published its final report putting forward recommendations for reform, including several new offences. The government is considering the Law Commission’s recommendations and will set out its position in due course.


Written Question
Internet: Females
Tuesday 21st December 2021

Asked by: Kirsten Oswald (Scottish National Party - East Renfrewshire)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, what steps she is taking to ensure that statutory regulation of online platforms explicitly reflects the harms and impact of online abuse and other forms of online violence against women and girls.

Answered by Chris Philp - Minister of State (Home Office)

Under the draft Online Safety Bill, companies in scope will need to minimise and remove illegal content including criminal online abuse targeted at women. They will also have to protect children, including young girls, from harmful or inappropriate content.

Major platforms will also need to address legal but harmful content for adults. Priority categories of legal but harmful content for adults will be set out in secondary legislation and these are likely to include forms of online abuse that disproportionately affect women and girls, like misogynistic abuse. These companies will have to set out clearly what legal content is acceptable on their platforms and enforce their terms and conditions consistently and transparently.

Ofcom will have a suite of enforcement powers available to use against companies who fail their duties. These powers include fines for companies of up to £18 million or 10% of qualifying annual global turnover, and business disruption measures.


Written Question
Internet: Females
Tuesday 21st December 2021

Asked by: Kirsten Oswald (Scottish National Party - East Renfrewshire)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, what steps she is taking to ensure that providers of online platforms are legally obliged to prioritise the prevention and investigation of abuse against women occurring on their platforms.

Answered by Chris Philp - Minister of State (Home Office)

Under the draft Online Safety Bill, companies in scope will need to protect users, including women, from illegal abuse. Services will need to have effective systems in place to minimise and remove illegal content.

Major platforms will also need to address legal but harmful content for adults. Priority categories of legal but harmful content for adults will be set out in secondary legislation and these are likely to include some forms of online abuse.

Ofcom will have a suite of enforcement powers available to use against companies who fail their duties. These powers include fines for companies of up to £18 million or 10% of qualifying annual global turnover, and business disruption measures.

The draft Bill has been subject to pre-legislative scrutiny by a Joint Committee. The Joint Committee reported with their recommendations on 14 December. We are considering the Committee’s recommendations and are committed to introducing the Bill as soon as possible after that.


Written Question
Social Media: Abuse
Tuesday 21st December 2021

Asked by: Kirsten Oswald (Scottish National Party - East Renfrewshire)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, what steps she is taking to ensure that social media companies (a) invest in raising awareness of online abuse and (b) routinely develop and promote safety guidance for users.

Answered by Chris Philp - Minister of State (Home Office)

In July 2021, the government published its Online Media Literacy Strategy, which sets out plans for improving media literacy capabilities across the UK. The Strategy supports the empowerment of users with the key skills and knowledge they need to make informed and safe choices online. Whilst working to support the media literacy of all users, the Strategy has an amplified focus on users who are vulnerable online, such as those who experience disproportionate levels of online abuse.

The Strategy explores the role of online platforms, including social media companies, in supporting the media literacy of users. This includes calling on platforms to invest more in educational initiatives, explore the role that platform design choices can play in promoting media literacy to users, and improve transparency about data related to platform media literacy activity.


Written Question
Internet: Regulation
Tuesday 21st December 2021

Asked by: Kirsten Oswald (Scottish National Party - East Renfrewshire)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Digital, Culture, Media and Sport, what steps she is taking to (a) ensure that providers of online platforms are regulated by a robust, independent regulator and (b) require those providers to (i) monitor and (ii) report on abuse taking place on their platforms.

Answered by Chris Philp - Minister of State (Home Office)

The Online Safety Bill will entail a significant expansion of Ofcom’s existing responsibilities. We are working closely with Ofcom to ensure it is prepared for its new role, and to ensure the legislation is effectively implemented. This includes work to ensure it has the resources, skills and capabilities it needs to prepare to take on its new functions. Ofcom is already regulating UK-established video sharing platforms following the passage of the Audiovisual Media Services Regulations 2020, which came into effect in November 2020. This experience will help prepare Ofcom for its online safety regulatory role.

Under the new Bill, companies in scope will need to minimise and remove illegal content, including illegal abuse. They will also have to protect children from harmful or inappropriate content. The largest and highest risk companies will also be required to publish annual transparency reports about the steps they are taking to tackle online harms.

If platforms fail in their duties under the Bill, they will face tough enforcement action including fines of up to 10% of annual global qualifying revenue, or £18m, whichever is the greater.