As the UK's online safety legislation is finally brought into force, Megan Curzon outlines the key duties on in-scope services, Ofcom's regulatory priorities, its approach to enforcement and what lies ahead for the summer of 2025.
Over 18 months have passed since the Online Safety Act (OSA) was given Royal Assent. Hailed as a landmark piece of legislation, the Act has significantly transformed the UK's online safety landscape. However, the detail of the obligations on service providers took time to define, and the practical impact of the regime is only now starting to take shape. For companies caught by the OSA, it is time to focus on compliance. As the duties and obligations come into focus, so too do the powers of Ofcom, as regulator, to enforce the regime. So, what are the latest developments with the OSA, and what is yet to come?
What are the key duties?
The OSA establishes legal duties of care for all in-scope online services, in particular user-to-user services (that is, platforms which allow users to share content and interact online) and search services. Ofcom has clarified that, in certain circumstances, generative AI chatbot tools and platforms will also be caught by the regime. These services must comply with mandatory duties, including obligations to conduct illegal content risk assessments; to mitigate the presence of illegal content; to protect children (where the service is likely to be accessed by them); and to enable effective user reporting.
Beyond these core duties, additional obligations apply to 'categorised' services, that is, those which meet specified thresholds and are designated Category 1, 2A or 2B. 'Category 1' services – broadly the largest platforms – face the most stringent duties, including in relation to user empowerment features, additional terms of service duties and protections for news publisher and journalistic content. There are also additional obligations under the OSA for internet services which provide pornographic content.
The thresholds for each of the three categorisation levels have now been set by secondary legislation (known as the 'Categorisation Regulations'), and Ofcom is expected to publish a list of categorised services during summer 2025.
How is the OSA being implemented?
Ofcom has played a central role in shaping and implementing the OSA, having been tasked with drafting the codes of practice which clarify how services can meet their duties. The codes are non-binding recommendations, but they nonetheless provide a 'safe harbour': providers who implement all applicable measures will be deemed to have complied with their duties under the Act. The starting point for providers considering their duties is a risk assessment, the outcome of which informs how services can effectively safeguard their users.
Roadmap to implementation - where are we now?
Ofcom's timeline for implementing the OSA has shifted in places since the regime began to take effect in October 2023 – perhaps understandably, given its mammoth scale. Despite this, the legislation is now being implemented at pace, with Ofcom adopting a phased approach that prioritises what it calls the "key pillars" of illegal harms and safeguarding children.
Illegal harms
The starting gun was fired on 16 December 2024 with the publication of Ofcom's illegal harms statement, which included risk assessment guidance and draft (since finalised) illegal content codes of practice. All services in scope were required to complete a "suitable and sufficient" illegal harms risk assessment by 16 March 2025. The illegal harms duties came into force on 17 March 2025, at which point platforms were required to start taking steps to tackle illegal content on their services.
Age assurance measures
In January 2025, age assurance guidance for publishers of pornographic content was published. Services that display or publish their own pornographic content (known as Part 5 providers) were required, from 17 January 2025, to implement highly effective age assurance (HEAA) to prevent access by children.
Children's safety
Ofcom also published the children's access assessments guidance in January 2025, pursuant to which all in-scope services were required to carry out a children's access assessment by 16 April 2025. If a platform is assessed as likely to be accessed by children, then additional children's safety duties are engaged, and a children's risk assessment is required. These risk assessments must be completed by 24 July 2025. Draft Protection of Children Codes of Practice were published on 24 April 2025. Subject to the codes completing the Parliamentary process, the children's safety duties are expected to come into force from 25 July 2025.
Enforcement
With key duties now (or shortly to be) in force, a significant question looms: how will Ofcom wield its enforcement powers? Ofcom has serious 'muscle' to enforce the Act and to sanction services, including the ability to impose fines of up to £18 million or 10% of "qualifying worldwide revenue" (whichever is the greater), and to block services in cases of serious non-compliance.
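To put those headline figures in context, the cap works as a 'greater of' calculation: 10% of qualifying worldwide revenue applies only where it exceeds the £18 million floor. The short sketch below is purely illustrative; the revenue figures are invented, and any actual penalty is set by Ofcom at its discretion up to this cap.

```python
# Illustrative only: the maximum OSA fine is the greater of
# £18 million or 10% of qualifying worldwide revenue (QWR).
FLOOR_GBP = 18_000_000

def max_penalty(qwr_gbp: float) -> float:
    """Return the cap on a fine for a given QWR (hypothetical figures)."""
    return max(FLOOR_GBP, 0.10 * qwr_gbp)

print(max_penalty(500_000_000))  # QWR of £500m -> cap of £50m
print(max_penalty(100_000_000))  # QWR of £100m -> 10% is £10m, so the £18m floor applies
```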
Ofcom's enforcement director has been clear that platforms which fail to introduce the relevant protections "can expect to face the full force of our enforcement action", and if early signs are anything to go by, the watchdog isn't pulling its punches. Its activities since mid-March, when the first obligations – relating to illegal harms risk assessments – came into force, have been robust. What is clear is that services falling within the scope of the regime should be prepared to show how they are complying with it. Key points for service providers to note include:
- Risk-based approach to harm: Small platforms are not out of the picture. Ofcom has made clear that part of its enforcement focus has been to drive compliance among what it calls "small but risky" services. If platforms present an increased risk of illegal harm, they will be within Ofcom's line of sight.
- Early engagement is key: Ofcom has the power to compel services to provide information via 'information notices'. It is not just compliance with the substantive duties that platforms need to be mindful of; failure to provide an adequate response to an information notice can result in enforcement action (and indeed it already has).
- Children's safety is a cornerstone: Preventing significant and serious harm to children is a key priority for Ofcom, and we have seen a particular focus on mitigating the risk of child sexual abuse material (CSAM) disseminated on platforms. Services likely to be accessed by children should therefore assume they will face heightened scrutiny and prepare accordingly for the upcoming deadlines.
Ofcom has been notably quick to monitor compliance with duties after they take effect, and we outline below some of the enforcement activity seen to date.
Safeguarding children and other illegal content duties
Upon the commencement of the illegal content duties, Ofcom announced an 'enforcement programme' assessing the measures taken by providers of file-sharing and file-storage services that present a particular risk of harm from image-based CSAM. On 10 June, Ofcom announced investigations into seven file-sharing or file-storage services in relation to failures to respond to statutory information requests and to comply with illegal content duties concerning CSAM [6].
As for illegal content duties more widely, Ofcom has recently opened several investigations into service providers including Kick and discussion board 4chan (note that the investigation into Kick has since been closed). An unnamed suicide discussion forum is also being investigated.
Age assurance in the adult content sector
'Part 5' platforms (that is, platforms displaying or publishing their own pornographic content) have been required to implement HEAA since January this year. Ofcom has reportedly written to hundreds of service providers to inform them of their obligations under the OSA and to request confirmation of the actions they are taking to ensure compliance.
An update in May this year confirmed that approximately 40 Part 5 services had responded to Ofcom's request and had implemented HEAA measures such as facial age estimation and credit card payment walls. In other cases, platforms had geo-blocked the UK.
As for platforms that did not comply, Ofcom has opened three investigations regarding a failure to implement HEAA. It is notable that one of these investigations (Score Internet Group) has now been closed, without any formal enforcement action. In a public statement, Ofcom noted that Score had taken steps to implement HEAA and so "the conduct that led to the opening of the investigation has ceased...we have therefore closed it without making any findings as to Score Internet's compliance with its duties".
A similar statement has been published in relation to Ofcom's investigation into Kick, which concerned the written record of its illegal content risk assessment (ICRA): "The investigation into Kick was originally opened after Ofcom received no response to an information notice requesting the written record of the ICRA. Following further correspondence, Kick provided the written record of the ICRA to Ofcom. As such, Ofcom does not consider it appropriate to continue our investigation. We have therefore closed it without making any findings as to Kick's compliance with its duties."
This may be a reassuring indication that, whilst Ofcom has been quick to ask companies to show they are complying with the OSA, it is taking a pragmatic approach and giving services the opportunity to rectify issues before imposing enforcement penalties.
Penalties guidance
Ofcom's firm approach to enforcement is underscored by the recent publication of its penalties guidance policy statement, which addressed the implementation of its fees and penalties regime and, in particular, its decision on the definition of "qualifying worldwide revenue" (QWR). QWR is used to calculate the maximum penalty for a breach of duties under the regime, and also determines which providers are required to pay online safety fees to fund Ofcom's operating costs.
What is on the horizon?
In addition to the duties already being enforced by Ofcom, further obligations are set to come into force in the coming weeks:
- 24 July 2025 – deadline to complete a children's risk assessment. Ofcom has said that it expects certain services to disclose their risk assessments by 7 August 2025.
- 25 July 2025 – Protection of Children Codes of Practice expected to come into force.
- 'Summer' 2025 – register of categorised services to be published.
In the background of the categorisation process is a judicial review challenge to the Categorisation Regulations brought by Wikimedia. The challenge does not relate to categorisation as a principle or to Category 1 duties per se, but rather to the legislation imposing Category 1 duties on Wikimedia. Wikimedia's concern is that the OSA obligations, which include requirements for user verification and content filtering, would "undermine the privacy and safety" of its volunteer users who review and edit content on the platform.
We don't yet know how this challenge might impact the categorisation timeline. Ofcom has said that it will monitor the proceedings closely, provide an update once it has more information on the potential impact of the challenge, and publish the register "as soon as possible thereafter".
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.