Introduce Jay’s Law: specific legislation obliging platforms to remove organised misinformation that targets grieving families after tragic deaths; require platforms to act on speculative, malicious content likely to cause serious harm; and work with Ofcom on enforcement and sanctions.
Families who have suffered tragic deaths are being subjected to organised misinformation and malicious speculation online. This targets grieving families at their most vulnerable, causing severe additional trauma, distress, and emotional pain. Jay’s Law would place clear legal duties on platforms, backed by Ofcom enforcement and sanctions, to ensure all speculative, organised, and malicious content is swiftly removed, protecting grieving families’ privacy, dignity, and safety.
Wednesday 25th March 2026
The government recognises how devastating this content can be. The Online Safety Act places duties on platforms to protect users from illegal mis- and disinformation.
The government would like to thank those who have signed the petition on this important issue.
The government recognises the devastating impact abuse and misinformation can have on an individual, especially during the loss of a loved one. The government will continue to engage with platforms on this issue, discussing their actions to combat illegal content. The Online Safety Act 2023 (OSA) introduces duties on platforms to tackle illegal content and protect users from harm. In March 2025, Ofcom’s illegal harms codes of practice came into effect, requiring platforms to implement robust measures to reduce the likelihood of users engaging in illegal activity, including harassment and abuse.
The OSA also introduced false and threatening communications offences.
Companies will therefore need to mitigate the risk arising from the use of anonymous profiles to facilitate illegal activity on their services. In addition, the OSA states that Category 1 service providers must offer all adult users of the service the option to verify their identity. This will allow users of a platform to filter out non-verified users.
Beyond duties on platforms to handle users’ complaints, individuals can also submit complaints to Ofcom where they think a provider is failing to comply with its duties. Complaints regarding illegal online content can be made directly to Ofcom. While Ofcom cannot respond to or investigate individual complaints, this action helps it assess whether regulated services are doing enough to protect their users, and whether Ofcom should take any action. These complaints are an essential part of Ofcom’s horizon-scanning, research, supervision and enforcement activity. They guide Ofcom in deciding where to focus its attention.
Ofcom uses the powers Parliament has made available to it. These include the power to issue fines of up to 10% of a company’s qualifying worldwide revenue. In the most serious cases, Ofcom can also apply for a court order to impose business disruption measures, stopping UK users from accessing the site.
Separately, Ofcom must produce a report assessing the measures that providers of user-to-user and search services have taken, or have in use, to enable users and others to report content and make complaints to those providers.
Ofcom’s approach and timeline for implementation can be accessed here: https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/roadmap-to-regulation/.
Department for Science, Innovation and Technology