Social media platforms and search engines are now under a legal duty to prevent and remove harmful illegal content
The UK Online Safety Act 2023 (‘OSA’) represents one of the most significant developments in UK digital regulation to date. Aimed at enhancing online safety, particularly for children, it imposes clear legal duties on social media platforms and search services to prevent harm, including the swift removal of illegal content such as child sexual abuse material.
The first set of enforceable duties under the OSA came into force on 17 March 2025. These relate to ‘illegal harms’ and require user-to-user services and search engines to conduct risk assessments and implement safety measures to protect users from encountering illegal content.
Who do the illegal harms duties apply to?
The illegal harms duties apply to in-scope services, which broadly means user-to-user services and search engines. These include:
- Social media platforms (e.g. Instagram, Facebook, Snapchat and X (formerly known as Twitter));
- Messaging and online dating applications (e.g. WhatsApp and Facebook Messenger);
- Video-sharing platforms (e.g. YouTube and TikTok);
- Online forums and community sites (e.g. Reddit and Quora);
- Gaming services with user chat functions (e.g. Twitch and PlayStation Network);
- Marketplaces that host user-generated content (e.g. TikTok and Etsy);
- Search engines (e.g. Google, Bing and Yahoo!); and
- Pornography services (including those that publish or display pornographic content, e.g. OnlyFans).
Notably, although a UK Act, the OSA has extra-territorial reach: in-scope services based outside the UK must also comply if they have a significant number of UK users and/or the UK is a target market.
What are the illegal harms duties?
From 17 March 2025, in-scope service providers are required to take steps to address and combat criminal content, with the Office of Communications (‘Ofcom’, the online safety regulator) assessing compliance. Ofcom has advised in-scope service providers of their obligations, which flow from the ‘suitable and sufficient’ illegal harms risk assessment each provider is required to complete. In particular, in-scope services must:
- Identify and assess the risks of illegal content;
- Implement proportionate safety measures to prevent and mitigate those risks;
- Swiftly remove illegal content as soon as they become aware of it, allow users to easily report illegal content, and operate a complaints procedure;
- Explain how they will do this in their terms of service or a publicly available statement; and
- Follow Ofcom’s Codes of Practice or implement equivalent alternative measures (and be able to provide evidence as to their effectiveness).
The duties cover a variety of illegal content, such as hate speech, fraud, CSAM (child sexual abuse material) and terrorism.
Priority area for Ofcom targeted action
Ofcom has launched further enforcement programmes to address priority areas identified for targeted action. The dissemination of CSAM by offenders is one such area and is subject to ongoing enforcement.
Ofcom’s enforcement programme assesses the measures being taken by file-sharing and file-storage services (e.g. Dropbox, Google Drive and iCloud) to meet their Illegal Content Duties in respect of image-based CSAM that presents a risk of harm to UK users. Evidence has shown these services are particularly at risk of being used to share CSAM.
In Ofcom’s Statement accompanying the Codes of Practice, the regulator explained that CSAM is widespread on many in-scope services and that human content moderation cannot identify and remove CSAM at sufficient speed and scale. The Codes of Practice recommend the use of AI-driven moderation tools to assess whether content is CSAM and to remove it immediately if identified as such. Perceptual hash-matching is a process which assesses image similarity and is effective at detecting previously identified illegal or violative content. Ofcom recommends the implementation of perceptual hash-matching by in-scope services whose Illegal Content Risk Assessment identifies them as at high risk of hosting image-based CSAM.
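To illustrate the underlying idea only (this is not Ofcom’s recommended tooling or any particular vendor’s product), the short Python sketch below computes a simple ‘difference hash’ with the Pillow imaging library and compares it against a set of known hashes by Hamming distance. Real deployments use dedicated perceptual-hashing systems (such as Microsoft’s PhotoDNA or Meta’s PDQ) and curated hash lists maintained by bodies like the Internet Watch Foundation; the function names, threshold and hash list below are purely illustrative assumptions.

```python
from PIL import Image  # Pillow imaging library (>= 9.1 for Image.Resampling)


def dhash(image_path: str, hash_size: int = 8) -> int:
    """Compute a simple 'difference hash': shrink the image, greyscale it,
    and record whether each pixel is brighter than its right-hand neighbour."""
    img = Image.open(image_path).convert("L").resize(
        (hash_size + 1, hash_size), Image.Resampling.LANCZOS
    )
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


def is_likely_match(image_path: str, known_hashes: set[int], threshold: int = 5) -> bool:
    """Flag an image whose hash sits within `threshold` bits of any previously
    identified hash (illustrative threshold; real systems tune this carefully)."""
    h = dhash(image_path)
    return any(hamming_distance(h, k) <= threshold for k in known_hashes)


# Hypothetical hash list: in practice this comes from a trusted hash-sharing
# database of previously identified illegal images, not local computation.
known_hashes: set[int] = set()
```

The key property of perceptual hashing is that near-duplicate images (resized, re-compressed or lightly edited copies) produce hashes within a small Hamming distance of one another, which is why matching uses a distance threshold rather than exact equality.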
Notable recent Ofcom investigations and fines
Age assurance and age verification measures in the adult sector
Since 17 January 2025, Ofcom has been contacting in-scope service providers that display pornographic material to inform them of their OSA obligations and to request confirmation of the age assurance and age verification measures they are implementing to achieve compliance.
On 27 March 2025, Ofcom ‘fined the provider of OnlyFans, Fenix International Limited, £1.05 million for failing to accurately respond to formal requests for information about its age assurance measures on the platform’. Fenix had originally told the regulator that it had set the ‘challenge age’ for its facial age estimation technology at 23 years old, but on 4 January 2024 Fenix learnt from its technology provider that the challenge age for OnlyFans was in fact set at 20 years old. Fenix only informed Ofcom of this error on 22 January 2024, after realising it had been set to 20 years old since 1 November 2021. Fenix raised the challenge age to 23 years old on 16 January 2025 but then lowered it again to 21 years old on 19 January 2025. Ofcom subsequently launched an investigation into whether Fenix had failed to comply with its duty to provide accurate and complete information, and found that Fenix had breached its duty.
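By way of illustration only (a generic sketch, not Fenix’s or any provider’s actual system), a ‘challenge age’ acts as a buffer above the legal age threshold: because facial age estimation has a margin of error, users whose estimated age falls below the challenge age are asked to complete stronger verification, such as an ID check. The Python sketch below shows the basic logic; the threshold value and function name are illustrative assumptions.

```python
# Illustrative only: how a 'challenge age' buffer is typically applied to a
# facial age estimation result. Not Fenix's or any vendor's actual logic.
CHALLENGE_AGE = 23  # assumed buffer above 18; a lower setting (e.g. 20) narrows that buffer


def requires_further_verification(estimated_age: float) -> bool:
    """Escalate to stronger checks (e.g. ID verification) when the
    face-estimated age falls below the challenge age."""
    return estimated_age < CHALLENGE_AGE


# Example: an under-18 user mis-estimated at 21 is still challenged when the
# threshold is 23, but would not be if the threshold were set at 20.
print(requires_further_verification(21.0))  # True with CHALLENGE_AGE = 23
```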
Priority illegal content
Priority illegal content within the illegal harms duties includes content which constitutes the offence of intentionally encouraging or assisting the suicide (or attempted suicide) of another individual. Ofcom launched an investigation in April this year into an online suicide discussion forum to assess the service’s compliance with its illegal harms duties under the Act. The investigation follows the service’s failure to adequately respond to Ofcom’s request for a record of its illegal content risk assessment, despite receiving an information notice under the Risk Assessment Enforcement Programme and subsequent correspondence with the regulator. The forum has not been named, but Ofcom has stated it believes it is linked to as many as 50 deaths in the UK.
Ofcom’s enforcement powers under the UK Online Safety Act 2023
Ofcom has made it clear that it will open investigations into services, where necessary. The regulator holds significant enforcement powers under the OSA, which include issuing the following:
- Information notices;
- Provisional or final enforcement notices; and
- Fines of up to £18 million or 10% of the in-scope service’s global turnover, whichever is greater.
We are at the outset of regulatory change arising from this new Act, and it is likely Ofcom will introduce additional enforcement programmes for priority areas in the near future. In-scope platforms should therefore be prepared to evidence to Ofcom how they have addressed the risks, and met the obligations, that the OSA places on them. They must treat online safety as an ongoing regulatory priority, rather than a mere checklist exercise. A failure to comply or engage could result in significant financial, legal and reputational consequences, with certain infringements of the OSA potentially leading to criminal liability and business disruption measures. Strict application and enforcement of the OSA should help improve the prevention of harmful online content, and the legislation marks a decisive step towards creating a safer and more accountable online environment.
The Hamlins Reputation and Privacy team is widely recognised as an advisor of choice for both individuals and businesses seeking advice in relation to reputation management, including complaints in defamation and privacy law, and handles cases involving print and digital media.
If you would like to find out more about how Hamlins can help you, please get in touch.